Microsoft Wants Yahoo to Use Bing Globally


Microsoft CEO Steve Ballmer has suggested that the Redmond software giant would like to extend its nascent search partnership with Yahoo to a global scale, having Yahoo tap into Microsoft’s Bing service to handle search queries from all over the world, not just in the U.S. and Europe. At a news conference in Tokyo (part of which is available on YouTube), Ballmer said the company would like to extend the partnership outside the United States, but “We will have to wait and see until we actually are able to get approval and consummate our partnership with Yahoo inside the U.S., and perhaps there will be news on that some other day.”

The current agreement—which the two companies are still trying to work out in its entirety—would need to be approved by regulators in both the United States and Europe; however, despite Microsoft’s antagonistic history with antitrust agencies, few expect the deal will see major difficulties given that Google is by far the dominant player in the Internet search market. In some cases—such as South Korea and Japan—Microsoft and Yahoo would have to obtain separate regulatory approval before Yahoo could begin funneling search traffic to Bing.

Under the deal, Yahoo would shunt its search queries to Bing in exchange for 88 percent of revenue from all ad sales during the first five years. Yahoo would also have the right to sell ads on some Microsoft sites.

Geoff Duncan
Former Digital Trends Contributor