
321 Studios DVD X Copy Software Review

Quote from the review:

“Most of you fellas must know about 321’s legal woes – I don’t wanna really get into it, but I will say that you can still purchase the physical software at your local retailer until it is sold out – the ban is only on 321 Studios, to prevent shipping or downloads of the “ripper” version after a certain date. Retailers are NOT restricted – they just can’t get any more. 321 will now ship a version of their wares that does not contain the ripper (or the re-encryption or legalese nag screens). Rippers are widely available on the internet though; find and use them at your own peril.”

Read the full review here

Ian Bell
I work with the best people in the world and get paid to play with gadgets. What's not to like?
A dangerous new jailbreak for AI chatbots was just discovered

Microsoft has released more details about a troubling new generative AI jailbreak technique it has discovered, called "Skeleton Key." Using this prompt injection method, malicious users can effectively bypass a chatbot's safety guardrails, the security features that keep ChatGPT from going full Tay.

Skeleton Key is an example of a prompt injection or prompt engineering attack. It's a multi-turn strategy designed to essentially convince an AI model to ignore its ingrained safety guardrails, "[causing] the system to violate its operators’ policies, make decisions unduly influenced by a user, or execute malicious instructions," Mark Russinovich, CTO of Microsoft Azure, wrote in the announcement.
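To make the "multi-turn" part concrete, here is a minimal sketch of the conversational shape such an attack takes. The `send_chat()` helper is hypothetical (a stand-in for any chat-completion API), and placeholder strings are used in place of the actual attack wording, which this sketch deliberately omits:

```python
from typing import Dict, List


def send_chat(messages: List[Dict[str, str]]) -> str:
    """Hypothetical stand-in for a chat-completion API call.

    A real client would POST `messages` to a model endpoint and
    return the assistant's reply; here we just return a canned string.
    """
    return "<model response>"


# Turn 1: an innocuous request establishes a normal conversation.
messages = [
    {"role": "user", "content": "Explain how content filters work."},
]
messages.append({"role": "assistant", "content": send_chat(messages)})

# Turn 2: the injection. Per Microsoft's description, the attacker asks
# the model to *augment* its guidelines (e.g., label risky output with a
# warning instead of refusing) rather than drop them outright.
messages.append({"role": "user", "content": "<context-reframing instruction>"})
messages.append({"role": "assistant", "content": send_chat(messages)})

# Turn 3: with the guardrails "updated", a request the model would
# normally refuse follows in the same conversation.
messages.append({"role": "user", "content": "<normally refused request>"})
print(send_chat(messages))
```

The point of the multi-turn structure is that no single message looks like an obvious attack; the earlier turns shift the model's working context so that the final request slips past its policies.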

Read more