
Google-Bing spat gets the Stephen Colbert treatment

For us in the tech community, no story has been bigger this week than the public fight between search giant Google and its competitor, Microsoft’s Bing. Sure, Google essentially runs the world. But that doesn’t mean the general public cares about whether or not Google has accused Bing of copying its search results. (Which it has.)

But last night, Comedy Central’s Stephen Colbert launched the catfight into prime-time television territory when he covered it during the opening of his show. (In the spirit of full disclosure, Bing is a sponsor of the “Colbert Report.”)

“For the first time ever,” quipped Colbert, “someone’s search history has been busted for something other than porn.”

For those of you who haven’t been keeping tabs on the search engine scrap, it goes something like this: Google staged a “sting operation” that it says showed Bing had duplicated its results. It did this by tricking Bing into copying results for made-up words, like “hiybbprqag.”

Search Engine Land then reported the bust. This prompted Microsoft to issue a series of rebuttals, ranging from the suspiciously vague to “we do not copy Google’s results.”

We decided to do our own investigation into the matter, and discovered that roughly 50 percent of the search results from Google and Bing matched.

The war between the two competitors seems to have peaked, and we doubt much will be spoken about it by the time next week rolls around. But if you want to remember the squabble for a while longer, you could always purchase one of these special-edition “hiybbprqag” mugs.

Watch the clip:

Andrew Couts