DJ slapped on the wrist for uploading an unreleased Beyonce album to the Internet


Loads of content shows up on torrent sites before it's supposed to see the light of day: games, operating systems, music, movies and more. Clearly, you have to be an insider at one of the firms creating that content to get your hands on it early and upload it to the 'net. However, if you're an insider and you get caught doing just that, you're going to have some 'splainin' to do, as Ricky Ricardo of I Love Lucy fame might say.

Or you might have some payin’ to do.

A record producer and DJ based in Sweden got hold of a Beyonce album before its release date and uploaded it, effectively leaking the entire album to the public. The man initially faced a fine of $233,000 and a jail sentence of up to two years. In the end, however, he received a much lighter punishment.

The price? A $1,200 fine. No jail time, no nothin’. 

Not only did the DJ upload the album, he used his neighbor's Wi-Fi network to do it, an effort to cover his tracks and leave a patsy to take the fall in case the authorities ever came calling. The legal case was spearheaded by Sony Music Entertainment, though considering how little the company got for its efforts, if anti-piracy cases continue to end this way, it could dampen labels' appetite for pursuing legal action against those who illegally upload content to the Internet before its official release date.


Konrad Krawczyk
Former Digital Trends Contributor