Microsoft’s Windows Recall AI feature, a tool designed to enhance productivity by constantly capturing user activity, continues to draw scrutiny over privacy concerns.
Recent evaluations reveal persistent failures in Recall’s ability to block sensitive data, even as Microsoft now broadens its availability to Intel and AMD Copilot+ devices. These shortcomings cast a shadow on the feature’s promise of securely archiving user activity while highlighting the complexities of deploying AI-powered tools.
Windows Recall Privacy Filters Are Failing in Common Scenarios
Recall’s sensitive information filter is meant to prevent screenshots containing private data—such as Social Security numbers, passwords, and credit card details—from being saved.
However, testing by Tom’s Hardware found critical weaknesses. Avram Piltch described his experience: “When I entered a credit card number and a random username / password into a Windows Notepad window, Recall captured it, despite the fact that I had text such as ‘Capital One Visa’ right next to the numbers.”
In another test, Piltch input sensitive information into a PDF loan application viewed in Microsoft Edge. Recall captured this data as well, despite its intended safeguards. A custom HTML page created to mimic a payment form further demonstrated Recall’s inability to recognize sensitive contexts.
Despite clearly labeled fields for credit card information, Recall captured the data without restriction. As Piltch noted, “What my experiment proves is that it’s pretty much impossible for Microsoft’s AI filter to identify every situation where sensitive information is on screen.”
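Piltch’s point is easiest to see with a toy example. The sketch below is purely illustrative and not Microsoft’s code: it uses a simple Luhn-checked pattern match, the kind of rule-based heuristic such filters typically rely on. A structured value like a card number is at least detectable by a rule, but an arbitrary username or password has no structure a filter can latch onto, however the rules are tuned.

```python
import re

def luhn_valid(number: str) -> bool:
    """Standard Luhn checksum used to validate card-like digit strings."""
    digits = [int(d) for d in number if d.isdigit()]
    checksum = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:          # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        checksum += d
    return checksum % 10 == 0

# Hypothetical rule: anything that looks like a 13-19 digit card number.
CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){13,19}\b")

def contains_card_number(text: str) -> bool:
    return any(luhn_valid(m.group()) for m in CARD_PATTERN.finditer(text))

# A labeled card number is easy for even a naive rule to flag...
print(contains_card_number("Capital One Visa 4111 1111 1111 1111"))  # True
# ...but an arbitrary username/password pair has nothing to latch onto.
print(contains_card_number("login: jdoe  password: hunter2"))        # False
```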
While Recall succeeded in avoiding screenshots of credit card fields on major e-commerce sites like Pimoroni and Adafruit, these isolated successes contrast sharply with its broader failures in other common use cases.
Microsoft Acknowledges Limitations
In response to Piltch’s findings, Microsoft emphasized the evolving nature of Recall’s sensitive data filtering, pointing to its blog post about the current preview of the feature:
“We’ve updated Recall to detect sensitive information like credit card details, passwords, and personal identification numbers. When detected, Recall won’t save or store those snapshots. We’ll continue to improve this functionality, and if you find sensitive information that should be filtered out, for your context, language, or geography, please let us know through Feedback Hub.”
This iterative development process relies heavily on user feedback to refine the feature. However, privacy advocates question whether such an approach is adequate for addressing the potential risks posed by inconsistent filtering in sensitive scenarios.
A Feature Defined by Delays and Challenges
Recall’s path to release has been anything but smooth. Unveiled at Microsoft’s Build conference in May 2024, the feature was positioned as a transformative productivity tool. By capturing periodic screenshots of user activity, Recall allows users to search their digital history using natural language queries such as “budget presentation” or “email draft.”
Initially scheduled for release in June 2024, Recall’s rollout faced repeated delays due to privacy and security concerns. Early iterations of the feature lacked encryption, prompting criticism from privacy advocates and users alike.
These concerns led to significant architectural overhauls, culminating in a preview release for Copilot+ PCs in November. Senior Product Manager Brandon LeBlanc described the delays as necessary to deliver a “secure and trusted experience.”
Expanded Rollout and New Capabilities
In the first week of December, Microsoft expanded Recall’s availability to Intel and AMD-powered Copilot+ devices, marking an important milestone in the feature’s development. Previously exclusive to Snapdragon devices, Recall now supports a wider range of hardware.
Its core functionality—capturing and indexing screenshots of open applications, documents, and websites—remains central to its appeal. Users can revisit their digital activity via a timeline interface and interact with archived content through the “Click to Do” feature, enabling tasks like copying text, saving images, or generating summaries.
The expansion is part of a broader push to integrate AI into the Windows ecosystem. Complementary tools such as “Image Creator” in Microsoft Photos, which generates visuals based on text prompts, and “Cocreator” in Paint, which enables AI-assisted artwork creation, further illustrate Microsoft’s commitment to embedding AI across its productivity suite.
Enhanced Security Measures Fall Short
To address privacy concerns, Microsoft implemented multiple technical safeguards in the current version of Recall. Screenshots are encrypted using device-bound keys stored in the Trusted Platform Module (TPM), so they can only be decrypted on the machine that captured them.
Virtualization-Based Security (VBS) isolates the snapshots in protected memory regions, shielding them from malware. Additionally, users can pause or delete screenshots and exclude specific apps or websites from being captured.
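To make the device-bound encryption claim concrete, here is a conceptual sketch, not Microsoft’s actual code, of sealing a snapshot with a key held only on the device. It uses the widely available Python `cryptography` package; in Recall itself the key material is protected by the TPM and handled in VBS-isolated memory rather than in ordinary application code.

```python
# Conceptual sketch only: encrypting a snapshot with a key that never leaves
# the device. This is an illustration, not Recall's actual implementation.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_snapshot(snapshot: bytes, device_key: bytes) -> bytes:
    """Seal a snapshot so it is unreadable without the device-held key."""
    nonce = os.urandom(12)                      # unique nonce per snapshot
    ciphertext = AESGCM(device_key).encrypt(nonce, snapshot, None)
    return nonce + ciphertext                   # store nonce with the data

def decrypt_snapshot(blob: bytes, device_key: bytes) -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(device_key).decrypt(nonce, ciphertext, None)

# In Recall the key would be sealed by the TPM and released only to the
# authorized user; generating it in memory here is purely for illustration.
device_key = AESGCM.generate_key(bit_length=256)
blob = encrypt_snapshot(b"raw screenshot bytes...", device_key)
assert decrypt_snapshot(blob, device_key) == b"raw screenshot bytes..."
```

The design choice matters because an attacker who copies the encrypted snapshot store to another machine gets nothing usable without the device-bound key, which is precisely the gap the early, unencrypted iterations of the feature left open.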
Despite these advancements, Recall’s inconsistent performance in filtering sensitive information undermines its reliability. The AI’s inability to discern sensitive data across diverse contexts suggests a gap between technical safeguards and real-world functionality.
Balancing AI Innovation with User Trust
Recall exemplifies the challenges of integrating AI into everyday productivity tools while ensuring robust privacy protections. While its ability to archive and retrieve past activities offers clear benefits, the ongoing issues with sensitive data filtering highlight the need for a more comprehensive approach to user trust and security.
Until these issues are resolved, we advise users to exercise caution when using Recall, particularly when handling private data.