In 2000, the Secure Digital Music Initiative (SDMI) held a public contest to test new watermarking technology for digital audio. Among the participants was a team led by Edward Felten, a graduate of Caltech and the University of Washington who had served as an expert witness for the government in the 1998 case of United States v. Microsoft.
Within three weeks, Felten and his team had managed to remove the watermark from the stipulated audio sample, satisfying the automated judging system implemented by the SDMI. This could have earned them a cash prize, had they not waived their right to the reward in order to dodge a binding confidentiality agreement.
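The SDMI schemes themselves were never published in full, so what follows is only an illustrative sketch: a deliberately naive watermark hidden in the least significant bit of each audio sample, written in Python with numpy and far cruder than anything SDMI fielded. It exists purely to show why information hidden in audio can be fragile; noise too quiet to hear is enough to erase this one.

import numpy as np

rng = np.random.default_rng(42)

def embed_lsb(samples, bits):
    # Hide watermark bits in the least significant bit of 16-bit samples.
    marked = samples.copy()
    marked[: len(bits)] = (marked[: len(bits)] & ~1) | bits
    return marked

def extract_lsb(samples, n_bits):
    # Read the hidden bits back out of the least significant bits.
    return samples[:n_bits] & 1

# One second of stand-in "audio": a 440 Hz tone at 44.1 kHz, 16-bit.
t = np.arange(44_100) / 44_100
audio = (0.5 * np.iinfo(np.int16).max * np.sin(2 * np.pi * 440 * t)).astype(np.int16)

watermark = rng.integers(0, 2, size=128).astype(np.int16)
marked = embed_lsb(audio, watermark)
assert np.array_equal(extract_lsb(marked, 128), watermark)  # the mark reads back cleanly

# "Attack": add a couple of LSBs' worth of random noise, inaudible against a
# full-scale 16-bit signal but enough to scramble the hidden bits.
noise = np.round(rng.normal(0, 2, size=marked.shape)).astype(np.int16)
attacked = marked + noise
hits = np.mean(extract_lsb(attacked, 128) == watermark)
print(f"watermark bits recovered after the attack: {hits:.0%}")  # roughly chance level

Real watermarking schemes are considerably more robust than this toy; probing exactly how robust was the whole point of the contest.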
Upon presenting their work to SDMI officials, the team was told that its entry was invalid; the officials cited a contest rule stipulating that sound quality couldn’t take a hit. It was a setback, but not an out-and-out defeat. Happy with their work, the team set about developing the research into a scientific paper to be presented at the 2001 Information Hiding Workshop.
Weeks before the event, Felten received a letter from SDMI Foundation Secretary Matthew Oppenheim, which stated that sharing the team’s findings could “subject your research team to enforcement actions under the DMCA.” Yes: under the DMCA, simply publishing how a protection measure was defeated can invite legal action.
But what influence does copyright law have over a research paper on audio watermarks? As Felten and many other academics have found out, a lot more than it should.
DMCA Takedown
Jason Hong is an associate professor in the School of Computer Science at Carnegie Mellon University. While he’s never run afoul of copyright law in his own research, he’s well-versed in the problems some of his colleagues have faced. “There have been other researchers who have had far worse experiences than me,” he told us, before illustrating his point with the aforementioned case of Edward Felten.
It’s worth noting at this point that Felten wasn’t and isn’t some maverick computer science researcher liable to go rogue with his findings. By all accounts, he’s rather trustworthy. The White House named him Deputy U.S. Chief Technology Officer last year.
Even if you put in the most trivial protection method, it’s a copyright violation to circumvent that.
Despite being developed as a means of helping content owners protect their libraries in the Internet age, the DMCA affects security researchers because their work commonly involves reverse-engineering and bypassing protection systems that they did not create. Circumventing such measures is something the DMCA explicitly prohibits.
“There’s a provision that says if there’s any kind of technological protection method, security researchers can’t bypass that,” Jason told us. “Even if you put in the most trivial protection method, it’s a copyright violation to circumvent that, unless you get permission beforehand from the copyright owner.”
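To see just how low that bar can sit, picture a hypothetical player that “protects” its files by XOR-ing every byte with one fixed key. This is an invented scheme for illustration, not any real product’s, but undoing it takes a single line of code, and circumventing even something this flimsy without permission can still fall within the provision Jason describes.

# A hypothetical, deliberately flimsy "technological protection measure":
# every byte of the content is XOR'd with one fixed key baked into the player.
KEY = 0x5A

def protect(plaintext: bytes) -> bytes:
    # "Lock" the content by XOR-ing each byte with the key.
    return bytes(b ^ KEY for b in plaintext)

def circumvent(ciphertext: bytes) -> bytes:
    # Undoing it is the same one-line operation; XOR is its own inverse.
    return bytes(b ^ KEY for b in ciphertext)

song = b"pretend this is licensed audio data"
locked = protect(song)
assert circumvent(locked) == song  # trivially bypassed, yet still a "protection measure"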
Sometimes, that permission is relatively easy to come by, just another annoyance on the road to a research project. In other situations, companies have good reason to keep technology shrouded in secrecy. But some organizations abuse the legislation, using it as a shield to keep researchers silent.
“Remember the whole case with Volkswagen?” Jason asked. “How they had changed their software? Previously, it would have been very difficult for people to do that kind of research, because you had to have permission from the car manufacturer to inspect all their software. In all likelihood, those kind of behaviors could have been found earlier if people had access to it.”
Indeed, a round of exemptions proposed a month after the Volkswagen emissions scandal hit headlines did set in motion a plan to give researchers more latitude to study in-car technology. However, the way the exemption process plays out is yet more evidence that the DMCA can’t keep pace with the ever-increasing needs of security research.
Let’s Make a Deal
The cars on our roads are mobile computers, and that makes them vulnerable to new exploits. The exemption proposed in 2015 is set to go into effect later this year, and will give security researchers more straightforward access to the software they need to inspect.
“Almost all the car manufacturers were against this,” Jason told us. “And to some extent, the statements they were making do make sense.”
Automakers are concerned that loosening the restrictions too much might encourage someone to make changes to the software running in their vehicle. That could be as innocuous as removing a volume limiter on the stereo, but it could also put the driver, or the people around them, in harm’s way.
“You don’t want people just making arbitrary changes to the software in a car,” Jason continued. “But at the same time, [restrictions] also make it harder for security researchers to do their job, and to ensure people’s safety.” It’s a question of whether secrecy is a good enough substitute for extensive security research.
Deirdre Mulligan is an associate professor in the School of Information at Berkeley, a co-founder of the Center for Democracy & Technology, and the first director of the Samuelson Law, Technology & Public Policy Clinic. Simply put, she’s no stranger to the areas where technology, law, and the public interest collide.
Deirdre also referred to the case of Edward Felten, using it as an example of how the government has tried to reassure researchers that it understands their plight. “The Department of Justice wrote them a letter and said ‘we’re not going to shoot you; this is not the sort of thing that we would go after, this is academic research, this is a publication.’”
In an ideal world, that would be enough to allow work to continue without researchers looking over their shoulders just to check there are no copyright lawyers lurking. In reality, there are forces that make the idea of “good faith” research seem naive.
Getting a PhD is Hard Enough
Between the exemptions and the implication that cases made against legitimate research won’t be pursued, researchers should, in theory, be able to go about their work without fear of legal action under the DMCA.
In reality, they’re working in a gray area of the law. Sure, they might be acting in the public interest, but what if their research leads them to tinker with a system that an unscrupulous corporation would rather keep closed? Depending on the product, a security firm or academic institution might find itself in a legal battle it doesn’t have the funds to contest.
What Congress did was say, ‘OK, we don’t feel like taking the time to figure this all out now.
“There are so many areas that you could conduct research in that don’t raise any risk,” Deirdre explained. “If you’re advising your new PhD student or your new post-doc, if there’s an area where they might end up drawing a lot of ire and getting people in the department pissed off at them because, you know, some company’s now really angry at the university, or somebody files a lawsuit — getting a PhD is hard enough. You don’t want to create additional headaches for people.”
Jason added that individuals might find their decision-making process changed, purely because their end product might be unpublishable. “Because security researchers know that this law exists, there’s probably a lot of things that they wouldn’t do.” Work is of little use in academic circles if it can’t be published.
From that perspective, it’s easy to understand the frustration felt by security researchers like Jason, and advocates like Deirdre. “What happens when you adopt a law that doesn’t take into account the multiple values that it might impact?” she asked. “The law might not be fit for purpose.”
“What Congress did instead was say, ‘OK, we don’t feel like taking the time to figure this all out now, and we realize that things may change and we may need different sorts of exceptions to the anti-circumvention rule over time — we’re going to create this triennial rule-making procedure, so that people can come and make their case.’”
“So, on the one hand, you can say that it was good that they did that. And, on the other hand, the process is one that has only recently become a little bit easier. But generally it’s a process that requires a pretty high burden of proof.”
Public Interest and the Public Interest
Researchers and advocates are well aware of the difficulties caused by the DMCA. Yet we don’t often see such topics make headlines. “It’s not something that’s on the average person’s radar,” said Jason.
Not all of us will dabble in security research, but we would all benefit from that work being done. More to the point, we’ll all suffer if we don’t enable institutions and individuals to carry out this type of study.
Deirdre reeled off a list of federal agencies working to push more people into the field of security research, efforts that are being matched by private entities. “At Berkeley, where I work, we got a multi-million dollar grant from the Hewlett Foundation that’s trying to improve the state of cybersecurity by growing the discipline,” she added. “They’re trying to do field development because we need more activity in this area.”
As encouraging as it is to see time and energy being dedicated to kindling the next generation of security researchers, it seems at odds with the obstacles standing in the way of the work itself. Most people outside the research community don’t have a reason to campaign against the way the DMCA and other legislation are being implemented.
“I have worked on encryption policy since the mid 90s, and I can tell you it has not been a sexy hot topic among my friends and family,” Deirdre told us. “Today, I can walk into a room and they’re like, ‘oh! you probably know a lot about this Apple v FBI thing.'”
The Sony Pictures email hack, the dump of Ashley Madison member information, search engines that prey on unsecured webcams, the shocking video of a Jeep being ‘killed’ remotely — we’re inundated with evidence that security research should be a top priority. Yet laws like the DMCA force security researchers to work under constant legal threat.
“Researchers, many of whom are funded by the National Science Foundation, and who are doing work to make our nation more secure, shouldn’t be required to place them and their institutions at risk of a lawsuit — that’s just not reasonable,” said Deirdre. “My hope is that we can create, not just one-off exceptions that allow research under the DMCA for things that get a three-year exemption, but a broader set of limitations on existing laws.”
“I think on balance, most people would think that human safety, security, and privacy research is probably more important to protect,” Deirdre concluded. “But a more narrowly written law could allow us to have both.”