
ATI To Show Off 512MB Video Card

In addition to offering gamers three days of top-notch gaming, the event will include an overclocking demonstration and a PC modding event. Attendees will get the opportunity to preview the world's first 512MB gaming card, and one lucky gamer will even get to take one home.

“Participating in events like this gives us the opportunity to find out what gamers want and need for competitive gaming,” said Rich Heye, Vice President and GM, Desktop Discrete Products, ATI. “And to say thanks for supporting ATI, we are giving attendees at this event an opportunity to check out the world’s first 512MB gaming cards before we send them out to our software partners for development of next-generation games.”

In addition, world champion overclockers will test their mettle – and their silicon – to set new world records. Top overclockers Eric Kronies, Charles Wirth and Sami Makinen will also take a break from overclocking their own systems to answer attendees’ system-tweaking questions and offer tips for overclocking PCs for ultimate performance.

Attendees will vote on whose PC at the event is most in need of an overhaul. The lucky winner will have his or her PC rebuilt by modding experts using the latest technology from ATI and its partners.

And as is the norm at ATI-sponsored LAN parties, participants can expect lots of giveaways and prizes, including high-end desktop and notebook PCs powered by ATI hardware. Other sponsors of the event include ABIT, AMD, ArenaNet, ATARI, Corsair, Gigabyte, Kingston, LG, NCSoft, OCZ, Running with Scissors, Sapphire and UBISOFT.

For more information or to register, please visit ATI at www.ati.com/gitg or the Texas Gaming Festival at www.txgf.com.

Ian Bell
I work with the best people in the world and get paid to play with gadgets. What's not to like?