
Viacom and YouTube Start Dishing Dirt in Copyright Dispute


Back in 2007, media giant Viacom filed a $1 billion copyright infringement suit against YouTube and parent company Google, claiming more than 160,000 clips of copyrighted content from Viacom properties—things like The Daily Show—were distributed without permission via YouTube, and that those clips were viewed over 1.5 billion times, all the while generating revenue for Google and YouTube. Now, previously confidential filings in the case are being made public as both companies seek to have a judge decide the case rather than turn it over to a more unpredictable jury…and the filings reveal unpleasantness on both sides of the case.

For its part, Viacom maintains its assertions that YouTube’s founders knew video content was being illegally uploaded to YouTube and failed to stop it; further, Viacom alleges that YouTube employees broke the law by knowingly posting copyrighted content themselves. “YouTube was intentionally built on infringement and there are countless internal YouTube communications demonstrating that YouTube’s founders and its employees intended to profit from that infringement,” Viacom said in a statement. Included with the filings disclosed by Viacom—available in PDF format from Viacom’s YouTube litigation page—is an email from co-founder Steve Chen noting “one of the co-founders is blatantly stealing content from other sites and trying to get everyone to see it.”

For its part, Google isn’t taking the accusations lying down: it claims that Viacom “continuously and secretly” uploaded its own content to YouTube, hiring at least 18 different marketing agencies to upload “roughed up” videos of Viacom content to YouTube while deliberately obscuring the fact that the content was coming from Viacom itself. According to Google, even Viacom couldn’t keep track of its clandestine uploads, and on “countless occasions” demanded YouTube remove clips Viacom itself had uploaded, only to “sheepishly” ask for their reinstatement later. Google also claims Viacom “routinely left up clips from shows that had been uploaded to YouTube by ordinary users” as a means of promotion.

The filings also reveal that, prior to YouTube’s acquisition by Google, Viacom itself had considered buying the video site, in part as a move to keep it from being snapped up by social networking juggernaut MySpace.

Since being acquired by Google, YouTube has worked on content identification technologies designed to enable rights holders to block or monetize videos uploaded to the site without permission. YouTube says more than 1,000 media companies are now participating in its Content ID program.

The case is widely seen as a major test of the Digital Millennium Copyright Act (DMCA), which makes technologies that deliberately circumvent antipiracy measures illegal, while at the same time limiting the liability of service providers for copyright infringements perpetrated by their users. YouTube believes the DMCA protects it from liability under Viacom’s copyright infringement claims.

Geoff Duncan
Former Digital Trends Contributor