
Microsoft’s new keyboard is aimed at the living room, and claims to survive liquid spills


Not all wireless keyboards are built with the living room in mind. Many are too bulky, and handling a large keyboard for long stretches can be uncomfortable, especially when people are sitting on either side of you. These are some of the ills that Microsoft attempts to address with its new "All-in-One Media Keyboard."

Microsoft's All-in-One is a wireless keyboard with a built-in touchpad, and it can be used with HID-compliant smart TVs and game consoles, including Xbox devices. It features customizable hotkeys, which let users configure certain buttons to perform specific actions based on how they use their gear. Microsoft claims the keyboard is liquid- and drop-resistant, "so it can withstand the bumps, drops or spills of everyday life." The keyboard comes bundled with a wireless USB dongle, which is required to use it. Plug the dongle into the device you want to pair the keyboard with, and it starts working in about a second.

The All-in-One is compatible with Windows 7 through 8.1, as well as Windows RT 8 and 8.1. For non-Windows users, the keyboard also supports Mac OS X 10.7 through 10.9, along with Android 4.0.3, 4.1.2, and 4.2. We look forward to putting the All-in-One Media Keyboard through its paces, which will include liquid spill tests.

Microsoft’s All-in-One Media Keyboard will be available later this month for $39.95.

What do you think? Sound off in the comments below.

Mike Epstein
Former Digital Trends Contributor
Michael is a New York-based tech and culture reporter, and a graduate of Northwestern University’s Medill School of…