Google moves to take on Groupon with beta launch of Offers

Google moved to build its own daily deals service earlier this year following an unsuccessful bid to acquire one of the market leaders in the space, Groupon. Word of Google Offers first slipped out in January, and now the search giant has kicked off the first phase of a gradual beta rollout for the service.

For now, all you can do is sign up for the service at http://www.google.com/offers. The sign-up page notes that users can look forward to deals of 50 percent off or more. The beta will launch first in Portland, Ore., and soon after move on to New York City and San Francisco. Signing up for the beta is simple: select your location from a pull-down menu (NYC is divided into “downtown,” “midtown” and “uptown” sections, and San Francisco is joined by a separate Oakland / East Bay option) and click Subscribe.

Google’s Chris Messina revealed the beta launch in a tweet yesterday, later clarifying in a follow-up that only sign-ups are open right now, with offers to come “later,” first in Portland, then New York, then San Francisco. A video released by Google talks up the new initiative, promising deals on a range of products and services, much like its competitors do.

All of this echoes what first surfaced in January, when a leaked fact sheet described Google Offers as “a new product to help potential customers and clientele find great deals in their area through a daily email,” which is pretty much what sites like Groupon and LivingSocial already do.

Adam Rosenberg
Former Digital Trends Contributor