
Best Buy’s Buy Back program gets a boost from Justin Bieber and Ozzy Osbourne in Super Bowl ad

The consumer technology retail game is a funny thing. Like automobiles, virtually any piece of tech you purchase becomes a depreciating asset from the moment the cashier prints your receipt. Time wears devices down, of course, but the onward march of technological development is also a concern. How many of you have skipped buying the latest iDevice, knowing that in roughly 12 months you’ll see another one, very similar to the previous release but with a wider range of bells and whistles? In acknowledgment of this fact, leading tech retailer Best Buy last month launched a Buy Back program for used devices purchased from the store.

The retailer fired off a round of e-mails to its Reward Zone customers this week, revealing plans to spread the word on the new service during this Sunday’s Super Bowl. A commercial will air during the third quarter offering details about the program, with pop star Justin Bieber and Black Sabbath frontman-turned-reality-TV-star Ozzy Osbourne on hand for reasons that are completely unknown. Maybe they just like Best Buy an awful lot… or the appearance fees the retailer paid them.

The program covers televisions, desktops, notebooks, netbooks, and mobile devices (both phones and tablets). The catch is that you actually have to purchase the Buy Back option along with your product. TVs are covered for four years from the purchase date, and everything else is covered for two years. The amount you receive for used items is based on how much time has elapsed since the date of purchase.

Sell your item back to the store within six months and you’ll get 50 percent of the purchase price back. The minimum amount, 10 percent, applies only to TVs purchased between two and four years ago. Expect more details to emerge when Best Buy brings out the Bieber-Osbourne ad during Sunday’s game. If you want more info on the program now, check the company’s official website.
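
Since the payout is tiered by the item’s age, the stated rules boil down to a simple lookup. Here’s a minimal Python sketch using only the two tiers confirmed above; the function name is my own, and the intermediate tiers Best Buy hasn’t detailed here are left unresolved rather than guessed at:

    from typing import Optional

    def buy_back_percentage(months_since_purchase: int, is_tv: bool) -> Optional[float]:
        # Coverage window: four years for TVs, two years for everything else.
        coverage_months = 48 if is_tv else 24
        if months_since_purchase > coverage_months:
            return None  # past the coverage window; the item no longer qualifies
        if months_since_purchase <= 6:
            return 0.50  # sold back within six months: 50 percent of the purchase price
        if is_tv and months_since_purchase >= 24:
            return 0.10  # the stated minimum: TVs bought two to four years ago
        return None  # the intermediate tiers aren't spelled out in the announcement

Under these assumptions, buy_back_percentage(3, is_tv=False) returns 0.5, while a three-year-old TV comes back at the 10 percent floor.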

Adam Rosenberg