
Microsoft is backpedaling on Recall, but it’s for the best

Microsoft's CEO introducing Copilot+.
Luke Larsen / Digital Trends

Four days. We’re just four days away from Microsoft releasing the first wave of Copilot+ PCs, which have been available for preorder for almost a month, and Microsoft has decided to delay Recall, the marquee feature of the new devices. The AI-powered photographic memory feature has been mired in controversy since its introduction, with some going so far as to call it a “PR nightmare.”

Although the delay completely undermines Copilot+, it’s ultimately the right move for Microsoft. On the PR front, Microsoft has been here before with rushed AI features. It’s hard to forget the ripple that Bing Chat caused last year when it told me it wanted to be human, and if we saw anything on that level out of Recall, it would have been even worse. Delaying Recall is the right call, but it comes only after the feature set off a frenzy across the PC industry.

It’s not surprising that the birth of the AI PC drew a lot of attention. Copilot+ comes with a handful of AI-driven features, though nearly all of them are already available either through the cloud or locally in Windows. The exception is Recall. It’s the defining feature of Copilot+, giving your PC a photographic memory that can do everything from searching through your images to pulling up your DMs. It’s the type of AI assistant Microsoft has always envisioned, built with your personal context in mind.

That’s great on paper, but security researchers quickly found that it wasn’t as secure as Microsoft claimed. In fact, one security researcher said that someone could steal everything Recall has ever recorded with two lines of code. That’s not to mention the privacy implications of Recall, which forced Microsoft to switch the feature from opt-out to opt-in last week.
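To picture how low that bar is: if, as those researchers reported, Recall keeps its index in an unencrypted SQLite database inside the user’s own profile folder, reading it takes nothing more exotic than opening the file. The sketch below is purely illustrative, and the path and table name are assumptions rather than Microsoft’s documented schema, but it shows why “two lines of code” was barely an exaggeration.

    import sqlite3
    from pathlib import Path

    # Illustrative only: assumes Recall stores its index as a plain, unencrypted
    # SQLite file under the current user's profile. The path and table name here
    # are hypothetical stand-ins, not a documented schema.
    db_path = Path.home() / "AppData" / "Local" / "CoreAIPlatform.00" / "UKP" / "ukg.db"

    conn = sqlite3.connect(str(db_path))
    # A plain file read by the same user account -- no elevated privileges,
    # no decryption step.
    for row in conn.execute("SELECT * FROM WindowCapture LIMIT 10;"):
        print(row)
    conn.close()

Nothing in that snippet is sophisticated, and that was exactly the point the researchers were making: any malware running as the logged-in user could hoover up the lot.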

The Surface Laptop running local AI models.
Luke Larsen / Digital Trends

Outside of the issues with Recall, which researchers discovered well before the feature was even officially released, Microsoft has had a warping effect on the larger PC industry. Copilot+, and more specifically Recall, prompted AMD to release its Ryzen AI 300 processors well ahead of schedule. And it forced Intel to go deep on its Lunar Lake CPUs, which are set to arrive later this year.

Even with the old guard of CPU makers rushing to get Copilot+ chips ready, they won’t have access to the software when those chips launch. Instead, Microsoft is focusing Copilot+ exclusively on the Snapdragon X Elite, despite some early warning signs of poor performance out of the chips. With Recall now delayed, it seems like all three chip vendors are on a more level playing field, at the very least.

Microsoft is releasing Recall to Windows Insiders with a Copilot+ PC, and it says it plans “to make Recall (preview) available for all Copilot+ PCs coming soon.” We’re already far removed from the original pitch of Recall, though. It’s now limited not only to an extremely exclusive set of devices, but to those devices whose owners have opted into the Windows Insider program. And even then, it’s still in “preview” status, as Microsoft notes.

Microsoft is backpedaling hard. It’s the right decision, as there are clearly some issues with Recall that go beyond a little privacy scare. Still, I can’t help but feel Microsoft sold us a bill of goods. The company is in the driver’s seat when it comes to Recall and Copilot+, and its rush to announce and release the feature has backfired.

Jacob Roach
Lead Reporter, PC Hardware