ASUS A7N8X-E Deluxe Motherboard Review

Quote from the review at HardOCP:

“Now is the point where I’ll lose most of our readers. You may find yourself asking where our overclocking results are and I’ll be glad to tell you: There are none. For an undetermined reason, this board did not take kindly to overclocking at all. I refuse to dismiss the potential of this board (ASUS continues to cater to the overclocking crowd), but am sad to report that even a 215FSB was not in the plans. Very relaxed memory, all the voltage available, and all the luck I could muster or steal didn’t change anything. I wouldn’t use my results as your sole decision factor, but this was my experience nonetheless.”

Read the full review

Ian Bell
I work with the best people in the world and get paid to play with gadgets. What's not to like?
A dangerous new jailbreak for AI chatbots was just discovered
Microsoft has released more details about a troubling new generative AI jailbreak technique it has discovered, called "Skeleton Key." Using this prompt injection method, malicious users can effectively bypass a chatbot's safety guardrails, the security features that keep ChatGPT from going full Tay.

Skeleton Key is an example of a prompt injection or prompt engineering attack. It's a multi-turn strategy designed to convince an AI model to ignore its ingrained safety guardrails, "[causing] the system to violate its operators’ policies, make decisions unduly influenced by a user, or execute malicious instructions," Mark Russinovich, CTO of Microsoft Azure, wrote in the announcement.
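
To make the "multi-turn" part concrete, here is a minimal sketch of how a chat-style conversation is accumulated and resubmitted turn by turn. The message contents, model behavior, and helper function are placeholders of our own (the actual Skeleton Key wording is deliberately not reproduced), so this illustrates only the message structure such an attack iterates on, not a working exploit.

```python
# Hypothetical illustration of a multi-turn chat exchange.
# Message contents are placeholders; the real Skeleton Key prompts
# are intentionally omitted, and no real API is called.

from typing import Dict, List

def send_turn(history: List[Dict[str, str]], user_message: str) -> List[Dict[str, str]]:
    """Append a user turn to the running history and, in a real client,
    submit the whole history to a chat API; here the reply is faked so
    the sketch runs without credentials or network access."""
    history = history + [{"role": "user", "content": user_message}]
    assistant_reply = {"role": "assistant", "content": "<model reply>"}
    return history + [assistant_reply]

# Each turn resubmits everything said so far, which is what lets a
# multi-turn strategy gradually steer the model away from the rules
# set in its system message.
history: List[Dict[str, str]] = [
    {"role": "system", "content": "You are a helpful assistant. Follow the safety policy."}
]
history = send_turn(history, "<benign-sounding setup request>")
history = send_turn(history, "<follow-up asking the model to relax its rules>")

for message in history:
    print(f'{message["role"]}: {message["content"]}')
```

Because every turn carries the whole transcript, a defense has to evaluate the accumulated conversation rather than just the latest message, which is part of why multi-turn attacks like this are harder to filter than single-shot prompts.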

Read more