
Microsoft Releases Singularity Research OS

Microsoft is certainly best known for its Windows family of operating systems, but the company's research and development group has been steadily working on a non-Windows operating system written from scratch with an eye toward dependability and minimal dependencies on other technologies. Dubbed Singularity, the OS is now available as a research development kit, free for academic, non-commercial use under Microsoft's Research License (not an open source license).

Under development for several years, the idea behind Singularity is to take the heavy lifting out of isolating application processes, data objects, and runtime environments to create more reliable systems and applications. Built using managed code, Singularity aims to guarantee the isolation and security of software processes and (in theory) offer a greater degree of reliability without dependencies on the various subsystems that underlie a typical operating system like Windows. It can also create these isolated processes with very little overhead cost to the operating system, unlike many of today's solutions, which rely on hardware mechanisms (like memory segmentation) to isolate processes from one another.
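The idea is easiest to see in miniature. The sketch below is not Sing# and does not use Singularity's actual APIs; it is a plain, hypothetical C# illustration of the underlying discipline, assuming two "processes" that share no mutable memory and interact only by passing immutable messages over a channel.

using System;
using System.Threading.Channels;   // standard .NET library, used here only for illustration
using System.Threading.Tasks;

// Illustrative sketch only: real Singularity code is written in Sing# and
// communicates over statically checked channel contracts. This plain C#
// models the basic idea: the two "processes" share no mutable state and
// interact solely by exchanging immutable messages over a channel.
class SoftwareIsolationSketch
{
    // An immutable message; once sent, the sender keeps no handle that
    // could let it reach into the receiver's state.
    record Request(string Payload);

    static async Task Main()
    {
        var channel = Channel.CreateUnbounded<Request>();

        // "Process" A: produces messages and hands them off through the channel.
        var producer = Task.Run(async () =>
        {
            for (int i = 0; i < 3; i++)
                await channel.Writer.WriteAsync(new Request($"message {i}"));
            channel.Writer.Complete();
        });

        // "Process" B: consumes messages; it never touches A's memory directly.
        await foreach (var msg in channel.Reader.ReadAllAsync())
            Console.WriteLine($"received: {msg.Payload}");

        await producer;
    }
}

The hardware-based alternative the article alludes to relies on per-process page tables and protection boundaries enforced by the processor; Singularity's bet is that type-safe managed code plus this style of message passing can provide comparable isolation with far less overhead.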

“Singularity is not the next Windows,” Microsoft research VP Rich Rashid said in a statement. “Think of it like a concept car. It is a prototype operating system designed from the ground up to test-drive a new paradigm for how operating systems and applications interact with one another.”

Right now, Singularity consists of a working kernel developed as managed code using Microsoft's C# programming language and a new derivative called Sing#, along with a compiler and runtime environment called Bartok.

Microsoft currently has no plans to offer a commercial version of Singularity, but concepts from Singularity may well migrate to other Microsoft technologies, including embedded systems or distributed computing architectures.

Geoff Duncan