
A ‘healthy’ PC means using Bing, according to Microsoft

PC Manager suggesting to use Bing as default search engine.
Judy Sanhz / Digital Trends

Microsoft hasn’t been shy about pushing first-party services and apps in Windows, but this time, it’s getting a little ridiculous. As reported by Windows Latest, the Microsoft application PC Manager claims you can “fix” your computer simply by making Bing your default search engine.

The suggestion appears when you use the Edge browser with, for example, Google set as the default search engine. After you run a health check, one of the suggested changes is to set Bing as your default search engine. If that’s how you want to go, there’s a button to make it happen.

If you take its suggestion and set Bing as your default search engine, you’ll get a message letting you know you’re all set and that your PC is using the recommended settings to be secure, fast, and efficient. But if you decide to ignore the recommendation, you’ll notice that your computer continues to work as usual, just as mine did.

Some may question Microsoft’s recommendation, especially since the default search engine has nothing to do with PC performance or security. It also feels particularly problematic given how PC Manager positions itself: as a tool to “safeguard your PC in a quiet and reliable way.” The app gives no reason why Bing would be safer or faster to use than Google, making the change feel like nothing more than an attempt to trick people into switching.

Of course, Microsoft isn’t the only company that does this. If you use Google Search in the Microsoft Edge browser, a suggestion from Google to switch to Chrome may not come as much of a surprise.

This recent example from Microsoft feels particularly egregious, though, and hopefully it’s something Microsoft will address in the future.

Judy Sanhz
Judy Sanhz is a Digital Trends computing writer covering all computing news. Loves all operating systems and devices.