Microsoft breaks down Windows 10’s EULA

If there were any concerns that Windows 10 would bring a whole new set of terms and conditions with an updated End-User License Agreement (EULA), we needn’t have worried: it’s about the same as it was with previous generations of the OS. There are a couple of tweaks that paint it as more of a software-as-a-service (SaaS) product, but those have more to do with the delivery method of the upgrade than anything else.

Transfer rights with the new version of Windows will be exactly the same as in the last few. If you have a legitimate license for Windows 10, you can install it on a second PC as long as you uninstall it from the original one. The one exception is Germany, where, as ZDNet points out, a court ruling allows users to transfer OEM software too.

Activation will be present once again, and as in the past it will happen automatically on OEM systems. One new wrinkle: upgrading a pirated copy of Windows through official Microsoft sources does not grant you a valid license. You end up with a genuine version of the software, but it remains unlicensed.

If you don’t like the upgrade, however, you can always downgrade. Much like with Windows 8.1, those unhappy with Windows 10 can revert to Windows 7 Professional or Windows 8.1 Pro. It’s worth bearing in mind, though, that given the age of Windows 7, support will not last forever, so Microsoft will likely pull the ability to downgrade to that particular OS within a few years.

Updates on Windows 10 will be mandatory and automatic, with no ability to choose which are installed and which aren’t. However, business and enterprise users can opt to receive security fixes only.

Also, on any version of the OS that comes with certain Office tools preinstalled, those tools are licensed for personal use only. Anyone who wants to use them commercially will have to pay for an Office 365 subscription.

Jon Martindale