
Windows 10 is downloading even if you don’t want it

Windows 10 is being downloaded to users’ computers, even if they haven’t reserved a copy of the new operating system.

Microsoft has confirmed that for anyone with automatic updates turned on, some of the files needed for the Windows 10 upgrade are being downloaded to their computer in advance, in case they later decide to upgrade.

A user contacted The Inquirer after discovering a hidden folder called $Windows.~BT that, by his estimate, took up between 3.5GB and 6GB. He expressed concern that files which were supposed to be optional were being downloaded to people’s computers, often without their knowledge.

If you have automatic updates turned off on your computer, you won’t be affected by this.

“For individuals who have chosen to receive automatic updates through Windows Update, we help upgradable devices get ready for Windows 10 by downloading the files they’ll need if they decide to upgrade,” said Microsoft in a statement that responded to reports of these downloads. “When the upgrade is ready, the customer will be prompted to install Windows 10 on the device.”

Some users as far back as July reported similar files being downloaded to their computers, reports Ars Technica. To remove the files, users will need to uninstall the KB3035583 update before deleting the actual folder. Novice users who aren’t interested in Windows 10 will likely not be impressed.
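The cleanup described above can be sketched as an elevated Command Prompt session. The KB number and folder name come from the reports; the takeown/icacls steps are an assumption about how to claim the protected system folder before deleting it, since removing it directly will typically fail with an access-denied error:

```shell
:: Run all of this from an elevated (Administrator) Command Prompt.

:: 1) Uninstall the update that pre-fetches the Windows 10 installer files
wusa /uninstall /kb:3035583 /norestart

:: 2) After rebooting, take ownership of the hidden download folder
::    and grant Administrators full control (the folder is protected)
takeown /F C:\$Windows.~BT /R /A /D Y
icacls C:\$Windows.~BT /grant Administrators:F /T

:: 3) Remove the folder and reclaim the disk space
rd /s /q C:\$Windows.~BT
```

Note that hiding the KB3035583 update in Windows Update afterwards is also advisable, or it may simply be reinstalled on the next update cycle.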

Microsoft has been pushing the upgrades to Windows 10 pretty hard, offering a free update for users in a bid to get everyone on to the same operating system. These unsolicited downloads may be taking zealous marketing a step too far, however.

These reports also highlight a downside of automatic updates: they can install large, unwanted files on your system without your explicit consent.

Jonathan Keane
Former Digital Trends Contributor
Jonathan is a freelance technology journalist living in Dublin, Ireland. He's previously written for publications and sites…