
Yahoo to retain search data for 18 months


Back in 2008, Yahoo broke with most of the Internet search industry by announcing it would reduce the amount of time it retained users' Internet search data to a mere 90 days. Now the company is backpedaling: it is extending its retention period for search data to 18 months. The new retention period puts the company in line with search giant Google, which never backed down from its long-standing 18-month retention policy.

Yahoo plans to begin extending the retention period this summer, and says it will notify customers before doing so. The retained search data will include things like users' IP addresses and cookies, which means the data can in many cases be linked to individual devices or people. After 18 months, Yahoo will retain most of the data but anonymize it so it cannot be linked to individuals. Yahoo's data retention plans may also extend beyond search data: the company has indicated it is considering retaining other forms of user information as well.

Yahoo defends the move as a necessary step to keep its services competitive with those offered by the likes of Microsoft, social networking services like Twitter and Facebook, as well as Internet titan Google. Yahoo leverages aggregate user information to provide customized and personalized services tailored to users' interests, usage patterns, and other data points. Some industry watchers speculate Yahoo may also be increasing its data retention period to comply with potential international regulations, as some countries mull requirements that service providers retain user data for extended periods in case law enforcement needs access to it.

However, consumer and privacy advocates are bemoaning Yahoo's increased data retention, and the move comes as consumers grow increasingly concerned about how much information Internet companies store about them, and who exactly has access to that data. Privacy advocates argue that Internet firms' commercial interest in mining user data, and in selling access to it or services built on it, is rapidly eroding consumer privacy. Some watchdog groups are calling for government regulations limiting how long firms can retain personally identifiable data.

Geoff Duncan
Former Digital Trends Contributor