Derek Bambauer is spending his last hours in Cambridge – literally – giving a final presentation at the Berkman Center. The lunch ends at 1:30pm, and he's off to Detroit at 2pm to start a new career teaching intellectual property law at Wayne State's law school. For his parting shot, he previews a paper he's writing with Phil Malone on how the law currently limits – undesirably – software security research. He's clearer, at this stage of his research, about the problem than about its potential solutions.
Testing software and internet infrastructure for bugs and weaknesses – finding potentially dangerous exploits – is, Derek suggests, exciting, intellectually engaging, and important to cyber-security as a whole. (Simson Garfinkel challenges this assertion later in the discussion…) But it's legally risky to get involved with this sort of testing – you open yourself to civil lawsuits with large damage awards, and to possible criminal charges that could include prison time. Derek argues that this regulatory climate is hampering computer security research.
For a case study, Derek looks at Mike Lynn, a researcher for Internet Security Systems. Lynn found what was described as "the holy grail" of internet security bugs: a bug in Cisco's Internetwork Operating System (IOS) which allowed attackers to remotely compromise Cisco routers, which have a reputation for being impregnable. Lynn alerted Cisco, which issued a patch… but Cisco wasn't strongly pushing adoption, and Lynn believed they were dragging their heels so as not to damage their reputation for security.
So Lynn decided to present his results at the Black Hat conference in Las Vegas in the summer of 2005, on behalf of ISS, his employer. Cisco put strong pressure on ISS not to let Lynn make the presentation – eventually, Lynn resigned from ISS and made the presentation anyway. In the aftermath, Cisco threatened to sue Lynn, claiming his PowerPoint presentation violated copyright by reproducing snippets of copyrighted code. They further claimed that this information was a trade secret. (The copyright argument is likely entirely bogus, Derek thinks – this is a classic fair use scenario.)
Jennifer Granick acted as Lynn’s lawyer and negotiated a settlement – Lynn wouldn’t release the specific exploit code, and Cisco would drop the suit. In the grand scheme of things, it’s a “happy” outcome… though Lynn did lose his job and had his life radically transformed. Derek suggests that “you don’t need to win a case to be successful, you just need to create a chilling effect.”
A second story adds a layer of complexity to the situation: Snosoft, a team of security researchers, was trying to get Hewlett-Packard to purchase its services. The team discovered a buffer overflow in HP's Tru64 UNIX, and another researcher published the bug to the Bugtraq mailing list, along with code to use the exploit. HP responded with its full wrath, threatening criminal extortion charges. Snosoft found itself in an unusual situation – did HP want to prevent publication of this information to protect its reputation? Or did it want to benefit from Snosoft's discovery without compensating Snosoft for the work?
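For readers who haven't run into the term: a buffer overflow happens when a program writes more data into a fixed-size buffer than the buffer can hold, clobbering adjacent memory. Here's a minimal sketch in C of the classic pattern – purely illustrative, not the actual HP bug, whose details didn't come up in the talk:

```c
#include <stdio.h>
#include <string.h>

/* Illustrative only -- a textbook stack buffer overflow, not the HP bug.
 * strcpy() copies its input without checking the destination's size. */
void greet(const char *name) {
    char buf[16];              /* fixed-size buffer on the stack */
    strcpy(buf, name);         /* no bounds check: input longer than 15
                                  characters overwrites adjacent stack
                                  memory, including the saved return
                                  address */
    printf("Hello, %s\n", buf);
}

int main(int argc, char **argv) {
    if (argc > 1)
        greet(argv[1]);        /* attacker-controlled input */
    return 0;
}
```

Feed this program an argument longer than the buffer and it corrupts its own stack; a carefully crafted input can redirect execution to attacker-supplied code. That's why publishing working exploit code, as happened on Bugtraq, is so much more contentious than merely describing a bug.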
In general terms, security researchers are multiply vulnerable. They can run afoul of the DMCA, the Computer Fraud and Abuse Act, intellectual property laws surrounding copyright, patents, trademarks and trade secrets, and contract law, if reverse engineering violates an End User License Agreement. In some cases, experimenting with systems could expose researchers to liability under trespass or extortion laws. Derek argues that the safe harbors meant to protect this sort of exploration are insufficient – they're narrow and untested in the courts.
And power is strongly on the side of software vendors – you're breaking their stuff, and most judges will conclude that they've got a right to protect access to their property. As a result, it's virtually impossible to get third-party insurance as a software tester: there's major legal risk without mitigating devices like insurance.
Derek acknowledges that there are debates within the security community about the details of intrusion testing. When do you let a company know you think you've found a vulnerability? When can you publish this information – 30 or 45 days after warning the company? If Sony is installing rootkits on people's machines, do you owe Sony anything before revealing that it's distributing malware?
The fear Derek is trying to tackle is that security testing moves entirely underground – firms find weaknesses and sell them to the Eastern European mafia rather than reporting and publishing them. To prevent this, he explores some possibilities: making it harder for EULAs to override fair use and prohibit reverse engineering; shifting the burden on safe harbor provisions so that software companies must prove you fall outside the safe harbor; and potentially creating a trade association that allows a group of researchers to cooperate and insure their activities against liability.
Much of the interesting pushback on Derek’s presentation came from Simson Garfinkel, a security researcher and world-class skeptic. He points out “some of the people who call themselves security researchers are involved with extortion” – do we want to be encouraging people to find key vulnerabilities in software when some of them are explicitly doing so as a way of threatening and extorting companies?
Instead of trying to protect people creating exploits, Simson believes we should look closely at the fact that most software licenses shield software companies from any and all liability. If Cisco could be sued over documented limitations and failures in its software, it would likely have a very different attitude toward independent software testing and would work closely with anyone who disclosed a bug to get it patched and limit liability.
Simson's other interesting idea involves patenting exploits – if you've figured out a novel way to break software, patent it so other software testing firms need to license it from you. He admits that this certainly doesn't stop the bad guys from using your techniques, but it can give folks in this industry a revenue stream other than extortion.
It will be interesting to see where Derek goes with this – it's not clear that the problems he sees look as serious to a critic like Simson. On the other hand, I think he's made the case that there are instances where independent software testing is desirable, which means wrestling with these issues is likely to be worthwhile. I hope he'll bring us up to date the next time he comes to visit from Detroit. Bonne chance, Derek!