
Newly leaked Windows 9 screenshots make multiple references to Cortana

Winfuture.de has published a new batch of Windows 9 technical preview screenshots which strongly suggest that Cortana, the virtual voice assistant currently exclusive to Windows Phone 8.1, will be coming to Windows 9 once it’s released.

In one of the images, you’ll see that a large batch of file names contain the word “Cortana.” They’re stored in a folder whose name also includes the phrase “Microsoft.Cortana.”

Related: Here is everything we know about Windows 9

This folder contains sub-folders that also reference Bing, Microsoft’s search engine. This implies that Bing will be baked into the desktop version of Cortana somehow, much like the way Cortana on Windows Phone relies on Bing to perform search-related tasks.

Most of the new screenshots also feature a Remind Me app, which may work in conjunction with Cortana. Remind Me is a feature that’s already available in Cortana on Windows Phone 8.1.

Based on a translation of Winfuture’s report, which was originally written in German, most of the features found in the Windows Phone 8.1 version of Cortana should be available in the Windows 9 version as well.

Related: Video of Windows 9 Notifications Center in action leaks

Rumors of Cortana’s arrival on the desktop have circulated for a good while now, so this development isn’t exactly shocking.

Though these screenshots don’t paint a clear picture of how Cortana will work in Windows 9, we expect a video of someone using it in this technical preview to emerge soon. Similar videos showing the new Start menu and multiple desktops in action have leaked in just the past few days.

It’s worth noting that these images come from a technical preview build of Windows 9, so the features shown in them may not make it into the final release.

Konrad Krawczyk
Former Digital Trends Contributor