
Pandora announces gift subscriptions


Internet radio service Pandora has announced a new way to expose new listeners to its service—and rake in some cash at the same time. Pandora users can now purchase gift subscriptions for others for one year of streaming music service, and have the gifts sent right away or delivered via email up to a year in the future…although, in that case, buyers had better hope the recipient doesn’t change their email address.

[Update 14-Oct: Unfortunately, Pandora says they had to pull the ability to delay gift subscriptions at the last minute: gift subscriptions can only be purchased for immediate delivery.]

“It’s the perfect gift for any occasion,” said Pandora founder and chief strategy officer Tim Westergren, in a statement. “Now, you can give someone a year’s worth of great personalized music for a very low price. We anticipate the new Pandora gifting to be popular not only during the upcoming holiday season, but for all occasions where people are searching for that perfect, personalized gift to give.”

The gift subscriptions cost $36—the same as a regular one-year Pandora One subscription. Buyers can have the gift subscription sent immediately via email, or create a printable version of the gift card to send via postal mail or include with some other item. Recipients get the full complement of Pandora One services: personalized music streaming based on individual likes and dislikes, high-quality audio, and access from any device that supports Pandora (including PCs, many phones, set-top boxes, streaming media players, and even TVs and Blu-ray players). Best of all, there's no limit on how much music subscribers can stream, and there are no ads.

Pandora offers up to 40 hours a month of free, ad-supported listening. Users who hit that limit can pay $0.99 to finish out the month, or pay $36 for a year of unlimited, ad-free listening.

Geoff Duncan
Former Digital Trends Contributor