Microsoft is already backing down on its most controversial AI feature

The new Surface Pro on a table.
Luke Larsen / Digital Trends

Even before Copilot+ PCs have made it to store shelves, Microsoft is already making changes to its Recall feature. Recall is at the center of Copilot+, taking snapshots of everything you do on your PC and using a local AI model to sift through that information. In response to backlash, Microsoft is making changes to how Recall works, as announced through a Windows blog post.

For starters, Recall is now opt-in instead of opt-out. Previously, Recall was set to be enabled by default on Copilot+ laptops, but Microsoft will now show a screen during the setup process that explains what Recall does. If you skip past that screen, Recall stays turned off.

Microsoft is also requiring Windows Hello to use Recall now. You’ll need to authenticate with either your face or fingerprint to use Recall, and Microsoft says that “proof of presence” through Windows Hello is required to see the snapshots that Recall has saved. That’s a pretty massive change; previously, leaving your Copilot+ PC open while you stepped away could have opened the door to a variety of privacy and security issues.

Finally, Microsoft says it’s using “just in time” decryption for your Recall database, as well as encrypting the search index. That means your snapshots remain encrypted until you authenticate with Windows Hello, and are decrypted only at that point.
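To make the “just in time” flow concrete, here is a minimal toy sketch of the idea: snapshots are written encrypted, and decryption happens only after a proof-of-presence check. This is purely illustrative — the class, method names, and the throwaway XOR-keystream “cipher” are all invented for this example and bear no relation to Microsoft’s actual implementation, which uses real authenticated encryption tied to Windows Hello.

```python
import hashlib


def _keystream(key: bytes, length: int) -> bytes:
    # Toy keystream built from SHA-256 (illustration only, NOT a real cipher).
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]


def _xor(data: bytes, key: bytes) -> bytes:
    # XOR against the keystream; applying it twice restores the original.
    return bytes(a ^ b for a, b in zip(data, _keystream(key, len(data))))


class RecallStore:
    """Snapshots held encrypted at rest; decrypted just-in-time after auth."""

    def __init__(self, user_secret: bytes):
        self._key = hashlib.sha256(user_secret).digest()
        self._snapshots = []          # ciphertext only, never plaintext
        self._authenticated = False

    def add_snapshot(self, plaintext: bytes) -> None:
        # Snapshots are encrypted the moment they are captured.
        self._snapshots.append(_xor(plaintext, self._key))

    def authenticate(self, presented_secret: bytes) -> bool:
        # Stand-in for the Windows Hello "proof of presence" check.
        self._authenticated = (
            hashlib.sha256(presented_secret).digest() == self._key
        )
        return self._authenticated

    def read_snapshots(self) -> list:
        if not self._authenticated:
            raise PermissionError("proof of presence required")
        # Just-in-time decryption: plaintext exists only for this call.
        return [_xor(c, self._key) for c in self._snapshots]
```

The point of the design is that a thief who copies the database off the disk gets only ciphertext; reading anything meaningful requires passing the presence check first.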

Recall caused a frenzy in the PC community due to its photographic memory, tracing back everything you do on your PC from web searches to private messages. The AI processing for this data happens on the device — it never gets sent to a data center — but there are still clear privacy and security issues with that setup. The changes here should help make Recall a bit more secure.

The most powerful change by far, however, is leaving Recall off by default. It seems that Microsoft eventually wants this feature as part of the wider Windows ecosystem, which could have meant unsuspecting users feeding Recall data without realizing it. It’s a surprising move for Microsoft, which traditionally enables its services by default in Windows. That speaks to how intense the backlash against Recall really was.

Jacob Roach
Lead Reporter, PC Hardware
Jacob Roach is the lead reporter for PC hardware at Digital Trends. In addition to covering the latest PC components, from…
A dangerous new jailbreak for AI chatbots was just discovered

Microsoft has released more details about a troubling new generative AI jailbreak technique it has discovered, called "Skeleton Key." Using this prompt injection method, malicious users can effectively bypass a chatbot's safety guardrails, the security features that keep ChatGPT from going full Tay.

Skeleton Key is an example of a prompt injection or prompt engineering attack. It's a multi-turn strategy designed to essentially convince an AI model to ignore its ingrained safety guardrails, "[causing] the system to violate its operators’ policies, make decisions unduly influenced by a user, or execute malicious instructions," Mark Russinovich, CTO of Microsoft Azure, wrote in the announcement.
