
Bing’s Facebook integration to expand

Bing has made no secret of its feelings about Facebook integration – it’s all for it. And now Microsoft’s search engine will extend the collaboration to include even more URLs. Bing introduced “Liked Results” earlier this year, which showed users links their Facebook friends approved of. Now Bing will enhance the application to include even more search queries.

According to a blog post from Bing social team executive Lawrence Kim, “as people spend more time online and integrate their offline and online worlds, they will want their friends’ social activity and their social data to help them in making better decisions.” Kim also explains that “If your friends have publicly liked or shared any of the algorithmic search results shown on Bing, we will now surface them right below the result.”

The search engine has been making strides to become more social, largely in an attempt to lure prospective users away from Google. It announced a partnership with Twitter earlier this year and has steadily increased its Facebook integration, which originally included only status updates in its results but has since added Likes and mobile check-ins.

Of course, Bing isn’t alone. Earlier this week (days before Bing’s own announcement), Google revealed it would be giving priority to your own social media contacts in its search results. And it does Bing one better (or worse, it’s too early to tell – but there is such a thing as over-social-saturation) by putting Twitter, Quora, Flickr, and YouTube results into the mix.

One thing Bing can lord over Google is its Facebook data. Google and Facebook have a well-documented ongoing spat, and Google won’t willingly give Facebook its information until Facebook does the same. We’re not holding our breath for that partnership anytime soon.

Molly McHugh
Former Digital Trends Contributor