
Dell announces end of the road for Windows XP

With Windows 7 doing so well, it was only a matter of time before major OEMs stopped selling XP machines. Dell has officially announced its end-of-life plans on its Direct2Dell blog.

Per Microsoft rules, OEMs can’t ship computers with XP Pro or Home after Oct. 22, only about six weeks away. In preparation for that deadline, Dell will stop offering XP as an option on its computers at the end of this month. Its Microsoft Windows 7 page claims the last day to place an order is Oct. 1, but the date is subject to change. Regardless, if you are thinking of staying off the Windows 7 train and sticking with XP a little longer, you don’t have a lot of time left to place that order. So, get cracking!

For existing XP users, there’s no rush to upgrade to Windows 7 if you are happy where you are. Dell will continue to offer Windows XP driver support until December 2012. As we reported earlier, Microsoft has decided to extend XP downgrade rights to 2020. This means you can still hand-install Windows XP after buying a brand-new computer. Dell will supply a copy of the media upon request.

There are some exceptions to the rule, just to make things a little confusing. Qualified customers who order computers using the Dell Custom Factory Integration service will still have the option to get PCs with XP installed after the October deadline. The Custom Image requirements apply only to new Windows XP Professional, XP Home, and XP Tablet products.

If you have XP programs you just can’t bear to give up, and you can’t get to Dell in time to order a new computer, the Direct2Dell blog helpfully suggests trying Windows XP Mode to keep those applications chugging along. XP Mode is available on Windows 7 Professional and Ultimate.

Fahmida Y. Rashid
Former Digital Trends Contributor