
Chinese Site to Steer Users to Legit Music Source

A music service that plans to offer free song downloads said Monday that China's largest search engine will send users to the service in a deal that could cut online music piracy.

The free service, Qtrax, has licensing deals with all the major recording companies and their publishing units. The company plans to fund its royalty payments to artists and the music industry through advertising.

Qtrax launches Thursday in Australia and New Zealand, which amounts to a world debut after several aborted launches and a 90-day U.S. preview in April.

The service is scheduled to begin in China on Dec. 17. Under the deal announced Monday, users of some entertainment and music pages operated by China’s leading search engine, Baidu.com, will see a button allowing a download from Qtrax when they search for a song or artist that is in Qtrax’s catalog. If users follow through, an advertisement will appear on the music-playing software that transfers the song.

While the deal does not cover searches from Baidu's main page, Qtrax CEO Allan Klepfisz said he was told by Baidu that the offshoot sites generate millions of daily visits, "which is exciting enough for us."

Qtrax also has plans to launch in Singapore, Malaysia, the Philippines, Indonesia, Hong Kong and Taiwan this year, then the U.K. in February and the U.S. in the first quarter of 2010, Klepfisz said.

It remains to be seen how well its business model will work. Qtrax must pay royalties each time a song is played, but it gets revenue only on advertisements shown when a user is downloading a song or transferring it to a portable media player. The company has resisted putting audio ads in the song files themselves.

Qtrax, which is based in New York, is a subsidiary of Brilliant Technologies Corp.

Dena Cassella
Former Digital Trends Contributor