Microsoft Undecided On Yahoo Bid

Microsoft’s board of directors met on Monday to discuss its takeover bid for Internet giant Yahoo and, according to a Wall Street Journal report, couldn’t make up its mind.

This weekend marked the expiration of Steve Ballmer’s three-week deadline for Yahoo to reach a deal with Microsoft or face a hostile takeover attempt at a lower price. On receiving the threat, Yahoo basically said it was open to the idea of Microsoft taking over the company, but needed to see a better offer than Microsoft’s $31 per share.

Now that the deadline has come and gone without Yahoo agreeing to a Microsoft takeover, Microsoft has to decide how it wants to proceed. The company essentially has four options. It can raise its offer for Yahoo (which it has repeatedly indicated it won’t do), or try one of two hostile takeover tactics: nominating its own directors for the Yahoo board (who, if elected, would favor a Microsoft buyout), or taking its offer directly to shareholders in hopes of swaying enough of them to go along with the takeover.

The fourth option would be to withdraw the buyout offer and go home to Redmond.

Although Microsoft hasn’t commented on its plans, the Wall Street Journal reports the company has considered raising its bid to as much as $33 per share; that still falls short of the $35 to $37 per share Yahoo’s major shareholders seem to believe the company is worth. Microsoft CEO Steve Ballmer, visibly frustrated by the difficulty and delay in making the deal happen, has said Microsoft may walk away entirely, though most industry watchers view that as more of a negotiating stance than a real option, since Microsoft has been actively looking at acquiring Yahoo since at least 2006.

Geoff Duncan