Microsoft’s Recall feature is still a threat to privacy despite recent tweaks
In May 2024, Microsoft suffered what’s been described as “one of its worst PR disasters to date” when it rolled out the AI-powered Recall feature in the Windows 11 preview. Microsoft compared the feature, which took snapshots of a user’s screen at regular intervals, to a kind of ‘photographic memory.’ That memory would last about three months: since all processing and storage happens locally on the device, Microsoft said Recall would use around 25 GB of storage to hold roughly three months’ worth of snapshots. Users and experts alike were not impressed, however, immediately sounding alarm bells and dubbing the feature a “privacy nightmare”.
The unmitigated disaster
Originally, Recall was enabled by default and had multiple privacy and security flaws. One of the most glaring was that it stored everything the user was doing on screen (including “disappearing” messages from apps such as Signal and WhatsApp, as well as text it extracted from images) in a plain-text database that any app with sufficient access to the system could read with ease. There were few options to control what kind of information it collected, and no automatic filtering of sensitive data such as credit card numbers.
Microsoft trying to fix the unfixable
After the scathing feedback, Microsoft removed Recall from Windows 11 previews and started working on tweaks to make the feature more privacy-friendly while keeping it useful. A year has passed since then, and Microsoft is finally reintroducing a reworked and refurbished Recall to the Release Preview channel of Windows 11.
A lot has changed for the better. First and foremost, Recall is no longer turned on by default: users now have to actively choose to enable it, and then opt in a second time during setup (though this may change in the future, as Recall is still in preview). Windows Hello authentication is now required to even use the feature. However, once it’s set up, access only requires a Windows Hello PIN, which is arguably too easy a fallback.
Another key improvement is that Recall's data is now encrypted at rest, meaning even if someone gains access to your PC’s files, they can’t read the database without also breaking the encryption.
Microsoft has also introduced automated content filtering, designed to prevent sensitive data like credit card numbers, bank details, or ID documents from being saved in the first place (though researchers have already found cases where credit card information slipped through the cracks of the automated filter). Users now have more granular control: they can exclude specific apps or websites, limit how long snapshots are stored, and even fully uninstall Recall if they don’t want it on their system at all.
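To see why automated filtering tends to leak, here’s a minimal, hypothetical sketch of a naive regex-based redactor — not Microsoft’s actual implementation, just an illustration of how a card number written in an unexpected format can slip past pattern matching:

```python
import re

# Hypothetical filter: matches 16 digits grouped by spaces or hyphens.
# This is NOT Recall's real logic -- only a demonstration of why
# pattern-based redaction misses edge cases.
CARD_RE = re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b")

def redact(text: str) -> str:
    """Replace anything that looks like a card number with a placeholder."""
    return CARD_RE.sub("[REDACTED]", text)

print(redact("Card: 4111 1111 1111 1111"))  # caught by the filter
print(redact("Card: 4111-1111-1111-1111"))  # caught by the filter
print(redact("Card: 4111.1111.1111.1111"))  # slips through: dots not handled
```

Any separator the pattern’s author didn’t anticipate — dots, line breaks, a number split across two screenshots — defeats the filter, which is exactly the kind of gap researchers reported in Recall.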
If you want to get into the thick of it, we highly recommend this in-depth article by Ars Technica’s Andrew Cunningham, which breaks down exactly what Microsoft has and hasn’t fixed, and a deep dive by Kevin Beaumont, which explores the remaining risks and why Recall still demands a high level of trust.
Why Recall is still a problem
There are still a number of unresolved issues — like the fact that after the initial setup, you can open Recall without biometrics, just by signing in with your Windows Hello PIN. Another issue is that sensitive data, like banking details, doesn’t always get filtered out. And surprisingly, apps you’d expect to be excluded by default, such as Signal or video conferencing apps, are not. After discovering this, Signal took matters into its own hands and disabled screenshots of messages in its Windows app. Disappearing messages — whether sent via Signal, Telegram, or WhatsApp — are also captured by default. And if Recall is enabled on the PC of someone you’re talking to, your ‘secret’ messages with them will be saved on their PC too.
Even brushing all of this aside, there’s one more fundamental issue at stake.
As Cunningham aptly put it, Recall demands “an extraordinary level of trust that Microsoft hasn't earned.” Microsoft has never been known for strong privacy protections; on the contrary, it has recently drawn a lot of negative attention for pushing product ads onto users’ screens, which does not exactly scream ‘privacy.’
It will not come as a shocker if, over the next year or two, Recall finds its way onto a wider range of devices. Given the amount of negative feedback the feature initially received, we probably shouldn’t expect it to go back to being enabled by default — but we also shouldn’t forget that it was that very feedback that rolled back the privacy violations of Recall’s first iteration. We as a community should stay vigilant and make sure Microsoft’s new feature stays within reasonable bounds.
What AdGuard is doing about Recall’s privacy risks
First off, Recall is only available on next-generation Microsoft Copilot+ PCs. So if your device came out before mid-2024, you're not getting it. You can check out the full list of compatible PCs here.
That said, while not many users qualify for Recall right now, that’s going to change: newer, more advanced PCs will eventually become the norm. That’s why we at AdGuard decided to take action and introduce a setting in our Windows desktop app that blocks Recall. You can follow our work on this feature on GitHub. You might see the new setting included as early as the next AdGuard for Windows release.