
New Google Toolbar Upsets People

According to News.com:

“When Web surfers install the toolbar in their Microsoft Internet Explorer Web browser and click the AutoLink button, Web pages with street addresses suddenly sprout links to Google’s map service by default. Book publishers’ ISBN numbers trigger links to Amazon.com, potentially luring shoppers away from competing booksellers such as BarnesandNoble.com. Vehicle ID numbers spawn links to Carfax.com, while package tracking numbers connect automatically to shippers’ Web sites.”

A similar in-text advertising model, named “IntelliTXT,” is already running on several websites (including Designtechnica). While ingenious in theory, the new Google toolbar bypasses the webmaster’s control of hyperlinking and puts it in the reader’s hands, with or without the site owner’s consent by the sounds of it.
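To make the mechanics concrete, here is a minimal sketch of how this kind of client-side auto-linking could work. It is not Google’s actual toolbar code (the ISBN pattern, the Amazon search URL, and the function name are all illustrative assumptions), but it shows the core trick: rewriting a page’s text nodes into links the page’s author never put there.

```typescript
// Hypothetical sketch of AutoLink-style rewriting, NOT Google's code.
// Technique: walk the page's text nodes, find a recognizable pattern,
// and replace each match with a link chosen by the tool, not the site.

const ISBN10 = /\b\d{9}[\dX]\b/g; // crude 10-digit ISBN matcher (assumed)

function autoLinkIsbns(root: Node): void {
  // Collect text nodes first so DOM edits don't disturb the walker.
  const walker = document.createTreeWalker(root, NodeFilter.SHOW_TEXT);
  const textNodes: Text[] = [];
  for (let n = walker.nextNode(); n; n = walker.nextNode()) {
    textNodes.push(n as Text);
  }

  for (const textNode of textNodes) {
    const value = textNode.nodeValue ?? "";
    const matches = [...value.matchAll(ISBN10)];
    if (matches.length === 0) continue;

    // Rebuild the node as plain text interleaved with injected <a> links.
    const fragment = document.createDocumentFragment();
    let cursor = 0;
    for (const m of matches) {
      const start = m.index ?? 0;
      fragment.append(value.slice(cursor, start));
      const link = document.createElement("a");
      link.href = `https://www.amazon.com/s?k=${m[0]}`; // destination picked by the tool
      link.textContent = m[0];
      fragment.append(link);
      cursor = start + m[0].length;
    }
    fragment.append(value.slice(cursor));
    textNode.replaceWith(fragment);
  }
}

// Usage: rewrite the whole page once it has loaded.
// autoLinkIsbns(document.body);
```

A real implementation would also skip text that already sits inside links or scripts; the sketch ignores those details to keep the idea visible.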

Read more at News.com

Read more at eWeek

Found via HardOCP

Ian Bell