Now that’s funny: Pandora gets into comedy

Pandora is best known for offering customized music streams tailored to individual users via its Music Genome Project technology: listeners rate the music they hear, and Pandora rummages through its collection for tracks exhibiting similar traits and serves them up. Now, Pandora is expanding the idea into comedy, launching individualized comedy stations that serve up humorous material based on listeners' tastes and preferences.
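The matching idea behind the Music Genome Project can be pictured as attribute-based similarity: score each track on a set of traits, then rank the rest of the catalog by how closely it resembles a seed track. A minimal sketch, with entirely hypothetical tracks and traits (Pandora's actual attributes and algorithm are proprietary):

```python
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two trait vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# Hypothetical traits: (tempo, distortion, vocal harmony), scored 0.0-1.0.
catalog = {
    "Track A": (0.9, 0.8, 0.1),
    "Track B": (0.85, 0.75, 0.2),
    "Track C": (0.2, 0.1, 0.9),
}

def recommend(seed, catalog, n=2):
    """Rank the other tracks by trait similarity to the seed track."""
    ranked = sorted(
        (t for t in catalog if t != seed),
        key=lambda t: cosine(catalog[seed], catalog[t]),
        reverse=True,
    )
    return ranked[:n]

print(recommend("Track A", catalog))  # Track B ranks above Track C
```

The same machinery carries over to comedy: swap musical traits for attributes of a routine, and the ranking step is unchanged.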

“Pandora is about creating a great, personalized radio experience,” said Pandora founder and chief strategy officer Tim Westergren in a statement. “Comedy is a natural part of that experience, and it’s something our listeners have been asking us to deliver for a while. We are delighted to now be able to give people a mix of familiar and new comic material that they’ll love to listen to.”

Pandora users can now tap into comedy the same way they set up music stations: they choose favorite artists or genres, then give thumbs-up or thumbs-down ratings to shape the station's content. The more ratings users give, the more precisely Pandora can customize the station.
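The feedback loop described above can be sketched as a listener profile that each rating nudges toward a liked item and away from a disliked one. This is a hypothetical illustration of ratings-driven personalization in general, not Pandora's actual method:

```python
def update_profile(profile, item_traits, liked, rate=0.2):
    """Nudge the listener's trait-preference vector toward a liked
    item, or away from a disliked one, by a small learning rate."""
    sign = 1 if liked else -1
    return tuple(
        p + sign * rate * (t - p) for p, t in zip(profile, item_traits)
    )

profile = (0.5, 0.5, 0.5)  # neutral starting point over three traits
profile = update_profile(profile, (0.9, 0.8, 0.1), liked=True)   # thumbs up
profile = update_profile(profile, (0.2, 0.1, 0.9), liked=False)  # thumbs down
print(profile)
```

With each rating the profile drifts toward traits the listener approves of, which is why more ratings yield a more precisely tuned station.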

Pandora is preparing to launch an IPO that could bring the firm over $1 billion from eager investors; however, the company has recently come under fire for revealing significant information about its users to advertisers and other partners.

Geoff Duncan
Former Digital Trends Contributor