
Microsoft accidentally installs Windows 10 on some systems

It’s no secret that Microsoft is keen to encourage as many users as possible to upgrade to Windows 10 — the fact that the OS is currently being offered as a free upgrade should be evidence enough of that. However, it seems that there’s a problem with Windows Update that is putting a little too much pressure on some users who haven’t yet made the jump.

Windows 10 is an optional update, but the Windows Update tool has reportedly been downloading the package automatically, without consent from the user. Given that it’s a sizeable download, those who have no intention of upgrading at the moment are understandably aggrieved by this practice.

Moreover, things seem to have gone a step further. Ars Technica reports that some users claim their systems are not just downloading the necessary files, but automatically firing up the installer as well.

The installer still requires the user to manually start the upgrade process, but the tickbox that selects the optional update is now checked by default. If you’re planning to stick with Windows 7 or Windows 8, it’s wise to take a careful look at what Windows Update is offering the next time you receive a prompt.

While the change certainly seems to fall in line with Microsoft’s plans to have Windows 10 installed on as many systems as possible, the company maintains that it was made in error. The update is not intended to be selected by default, Microsoft says, and a fix is imminent.

Some users might still want to preserve an earlier version of Windows, but there is certainly no shortage of people willing to try out Windows 10. Last month, Microsoft announced that the OS had been installed on more than 100 million devices worldwide, less than two months after it was officially launched.

Brad Jones
Former Digital Trends Contributor
Brad is an English-born writer currently splitting his time between Edinburgh and Pennsylvania. You can find him on Twitter…