
Privacy advocates protest Microsoft’s backup of user encryption keys

An individual using a laptop showing the Microsoft OneDrive logo. Pablo Calvog/Shutterstock
In a climate of government spying and the ever-present threat of malicious hackers, encryption is a hot topic. Politicians don’t like it, privacy campaigners insist it’s a must, and end users are left wondering whom to trust. Microsoft aimed to make that decision easy with Windows 10 by encrypting certain content, such as corporate apps, emails, and other sensitive data, by default.

You might know about this if you tuned in to some of Microsoft’s pre-release PR for the new operating system, but what you probably didn’t know is that Microsoft also creates a backup of your recovery key (in case you lose the password needed to decrypt the data) and stores it remotely on its own servers.

This may well be a feature put in place to protect those who might not otherwise store their decryption key safely. As ransomware victims can no doubt attest, little is worse than having your data encrypted and unrecoverable. However, some have suggested that the backup is a security risk in itself, and Microsoft hasn’t been very forthcoming about the practice.

With a remotely stored decryption key, there is always the danger of someone hacking the server where it’s stored or intercepting it in transit between your system and Microsoft’s servers. And as The Intercept points out, Microsoft has been compelled to hand over data on its customers to the NSA and other intelligence agencies in the past. If it stores customer decryption keys, it seems entirely possible that it could be forced to hand those over to the authorities, too.

You can delete the backup key that Microsoft holds. To do so, simply log in to your Microsoft account on the OneDrive recovery key page, where you’ll see all of the keys Microsoft stores for you. Deleting them takes just a few clicks.
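Deleting the cloud copy doesn’t change the recovery key already on your PC, so a cautious follow-up is to rotate that key so whatever copy Microsoft held becomes useless. The sketch below shows one way to do that by wrapping the standard manage-bde command-line tool; it assumes a BitLocker-encrypted C: drive and an elevated (administrator) prompt, and the rotate-after-delete approach is this example’s suggestion, not a step Microsoft documents for this situation.

# Sketch: rotate the BitLocker recovery password on C: so any previously
# backed-up copy (including one uploaded to a Microsoft account) stops working.
# Assumes Windows with manage-bde available, run from an elevated prompt.
import subprocess

DRIVE = "C:"  # assumption: the encrypted system drive

def run(args):
    """Run a command and return its stdout, raising if it fails."""
    result = subprocess.run(args, capture_output=True, text=True, check=True)
    return result.stdout

# 1. List the current key protectors, including the numerical recovery password.
print(run(["manage-bde", "-protectors", "-get", DRIVE]))

# 2. Remove the existing recovery password protector(s)...
run(["manage-bde", "-protectors", "-delete", DRIVE, "-Type", "RecoveryPassword"])

# 3. ...and add a freshly generated one that has never left this machine.
print(run(["manage-bde", "-protectors", "-add", DRIVE, "-RecoveryPassword"]))
# Write down the new 48-digit recovery password and keep it somewhere safe, offline.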

Privacy advocates still aren’t satisfied with this solution, though, because there’s no way to verify that the key has been completely deleted; it may in fact still exist, just out of the user’s reach. That may sound a bit paranoid, but Microsoft didn’t disclose that it was storing the key in the first place, so trust is understandably an issue.

Jon Martindale