Microsoft removes Bing Image Widget after Getty Images files lawsuit

Bing Image Widget goes up, Getty Images files suit, Bing Image Widget is taken down.

It looks like that’s essentially what happened after Getty Images, a widely used stock photography agency with a vast library of images, filed a lawsuit in the U.S. District Court for the Southern District of New York alleging that Microsoft’s Bing Image Widget served photos copyrighted and/or owned by Getty, PCWorld reports.

Related: Bing adds over 50 new cities to Streetside library

Since then, Microsoft has taken down the Bing Image Widget. If you visit the page where the Bing Image Widget was originally hosted, all you’ll find now is a nearly blank page that reads “we have temporarily removed the beta.”

Related: Windows 8.1 with Bing may mean cheaper PCs, tablets for you

“Rather than draw from a licensed collection of images, Defendant gathers these images by crawling as much of the Internet as it can, copying and indexing every image it finds, without regard to the copyright status of the images and without permission from copyright owners like Plaintiff,” Getty Images’ legal complaint says.

In the interim, the report states that Microsoft is working with Getty Images to resolve the issues surrounding the Bing Image Widget. It will be interesting to see if and when the Bing Image Widget returns.

Konrad Krawczyk
Former Digital Trends Contributor
Konrad covers desktops, laptops, tablets, sports tech and subjects in between for Digital Trends. Prior to joining DT, he…