Move over, Hologram Tupac: Microsoft’s working on 3D holographic avatars for Skype

Skype is basically an unofficial verb for video chatting. It's one of the most popular services offering free video and voice calls, allowing for easy over-the-air conference meetings, international catch-ups, or long-distance romancing. But it appears the flat screen of most computers and smart televisions is simply not enough for Microsoft. The company is reportedly working on a fully interactive, 3D hologram telepresence so users can video chat with lifelike avatars of their contacts.

Microsoft has posted a new job listing for a software development engineer to help on the project, which it calls a “realistic body double.” The listing describes the telepresence functionality as “one that gives the remote worker a true seat at the table, the ability to look around the room, turn to a colleague and have a side conversation.” A holographic body double that interacts with other virtual friends? Suddenly, Hologram Tupac seems so last year. Literally.

The job listing hints at continued development of Microsoft’s Viewport research project. Unveiled around this time last year, Viewport uses 3D reconstruction, infrared dot patterns, color cameras, and projectors to create an “immersive” teleconferencing system. With this much technology involved, it’s possible that if this holographic avatar system ever becomes a real product, consumers may have to shell out extra bucks to ensure their computers, cameras, and televisions are compatible. Seems like a lot of work for a slightly gimmicky feature, unless Microsoft can prove to us otherwise.

Natt Garun
Former Digital Trends Contributor