
Spotify adds lyrics to desktop app so you can annoy the hell out of everyone nearby

In a move that could cause untold friction in households around the world, Spotify has announced it’s adding a new ‘lyrics’ button to the desktop version of its music streaming service.

The new feature comes courtesy of Musixmatch – owner of the largest lyrics catalog in the world – so your ability to sing along to your favorite tunes and annoy the hell out of anyone in the same room is now only a click away.

In the coming weeks you’ll find the button at the bottom of the desktop app – hit it with any track (er, you can forget instrumentals, obviously) and away you go. You can also search and browse lyrics from popular tracks using the Explore feature.

[Video: Introducing Musixmatch Lyrics Button on Spotify desktop]

It sounds like a neat integration into Spotify, making it easier for music fans not only to indulge in some karaoke fun, but also to catch the words of artists who tend to garble, mumble and slur rather than sing.

Other improvements rolled out Thursday include an enhanced Friend Feed, which makes it easier to discover what music your buddies are getting off on, and a new ‘daily viral charts’ feature serving up the most shared tracks around the world and in your region. Additionally, all charts now feature indicators highlighting new music and showing how tracks are performing day by day.

We assume the new lyrics button won’t be coming to mobile, as breaking into song while you’re out and about listening on headphones is only going to end horribly (though Musixmatch’s own apps already let you do this).

[Source: Spotify]

Trevor Mogg
Contributing Editor