Logitech Cordless Desktop Express Review

Quote from the review:

“I came through at the end with mixed feelings about this setup. While the keyboard and mouse both appealed to me on an aesthetic level, with their nice styling and limited use of desktop real estate, they had a few flaws that can’t be ignored. For the keyboard, the new layout may take some time to get used to, but of somewhat more importance was the slightly sluggish response I noted at times. For the mouse, the reduced refresh rate can be a minor annoyance as well. While I can’t recommend this set to anyone who absolutely *must* have the most out of their keyboard and mouse (for them, Logitech has other options available), I can recommend it to someone who just wants a plain-and-simple cordless setup that’s easy to use and doesn’t want to be hassled by extra, unnecessary features.”

Read the full review

Ian Bell
I work with the best people in the world and get paid to play with gadgets. What's not to like?
A dangerous new jailbreak for AI chatbots was just discovered
Microsoft has released more details about a troubling new generative AI jailbreak technique it has discovered, called "Skeleton Key." Using this prompt injection method, malicious users can effectively bypass a chatbot's safety guardrails, the security features that keep ChatGPT from going full Tay.

Skeleton Key is an example of a prompt injection, or prompt engineering, attack. It's a multi-turn strategy designed to gradually convince an AI model to ignore its ingrained safety guardrails, "[causing] the system to violate its operators’ policies, make decisions unduly influenced by a user, or execute malicious instructions," Mark Russinovich, CTO of Microsoft Azure, wrote in the announcement.

Read more