This past weekend, with support from the Ford Foundation, EFF and the MIT Media Lab's Center for Civic Media held a two-day conference on the Freedom to Innovate. The first day featured experts on cyberlaw, activists, and students who'd experienced legal challenges to their freedom to innovate. Sunday's sessions included a brainstorm led by Cory Doctorow on imagining a world without DRM, and an EFF-led workshop on student activism around technology issues.
I was MC for the meeting on Saturday, and have only partial notes. I hope to post some impressions from these other sessions once I have more time to digest, but I’ll begin by posting my notes from opening talks by Jonathan Zittrain and Star Simpson.
I asked Jonathan Zittrain to give an opening keynote on the Freedom to Innovate because he’s one of the world’s leading thinkers about technical, legal and normative barriers to innovation. His book, “The Future of the Internet – And How to Stop It”, introduces the idea of generativity, the capacity of a system to enable users to invent and create new technologies.
JZ’s talk was titled “Freedom to Innovate, Beyond the Trenches”, and began with the technologies of, and before, his childhood: computers built from kits, PCs that you could take apart and reassemble, and operating systems that – whether or not they were free software – were rewritable and modifiable. (Waxing lyrical about MS-DOS, JZ notes that the blinking cursor was “an invitation to create: you could rewrite MS-DOS in MS-DOS.”) The PC and MS-DOS were “generative”, in JZ’s language – they don’t have a fixed set of uses, but are expandable and extendable to solve new problems. (To illustrate the expandability of PC hardware, JZ shows off the PC EZ bake oven… which might also function as a helpful heatsink.)
Jonathan Zittrain, and a PC EZ Bake oven
There are three freedoms that characterize this moment in tech history, Zittrain tells us. People are free to create new technologies. They’re free to adapt existing technologies to new purposes, to “tinker around the edges”. And they’re free to join and contribute to communities of like-minded actors. He explains that the next step after building your Heathkit H8 PC was to join a group of hobbyists who’d figured out how to program the machines – learning from others through apprenticeship was core to this moment in tech history.
When Stephen King published “Riding the Bullet” in 2000 – “a story so bad he couldn’t bring himself to publish it in print” – JZ argues that he ushered in a new era of technological creativity. The story was the first widely available commercial e-book, using digital rights management technology, and despite its low price ($2.50, and distributed free by Amazon and Barnes and Noble), folks at MIT hacked the copy protection to see if they could. “I see those MIT hackers as the leading drop on the crest of the wave of content, from people tinkering in the ham radio world to tinkering in the world of commerce.”
As more media went digital, this tinkering went mainstream. Audiograbber was a piece of PC software that let users “rip” audio from CDs using a CD-ROM drive and make copies. For the audio industry, this was a step too far, a way in which tinkering escaped the hacker community and entered mainstream practice.
The music industry’s responses to copying CDs added a new freedom to the freedoms to create, to tinker and to connect with a community: the freedom to liberate. If content was tied up in a bad DRM system, you should be free to find a way to liberate it from those constraints.
Prior to CD ripping, the music industry looked for ways to deal with the “digital threat”. The Audio Home Recording Act – created to govern DAT tapes – sought to ensure that even if copies of digital materials could be made, copies could not be made of those copies. And when copies were made, users would be charged through a fee on blank media, paid into a fund that would help artists who might be harmed by this new technology. As JZ explained the intricacies of the AHRA, he noted, “If you’re already getting sleepy, that’s the point.” These agreements weren’t trying to protect user rights, or involve users in any way – they were negotiated between big parties with opposing interests – content creators and technology manufacturers – and were about dividing the spoils. When these existing actors encountered the PC, they looked for ways to “make the PC safe for the CD”, to turn the PC into something as simple as an appliance, like a CD player. Audiograbber turned this equation on its head and demonstrated that users would find ways to liberate their content and use it in other contexts.
As the audio industry sought to cope with audio ripping and the rise of devices like the Rio MP3 player, they began to engage in behavior that resembled hacking. People who purchased certain Sony CDs – The Invisible Invasion, Suspicious Activity?, Healthy in Paranoid Times – found that these CDs had autoexec files that installed rootkits on their PCs. Sony evidently wanted to monitor all actions these users were taking, tracking what content they were playing and trying to determine the origins of all the files on their systems.
People were widely outraged by Sony’s actions, suggesting that ripping of CDs by an individual felt like less of a transgression than systemic hacking by a corporation. Sony’s transgressions suggest another right we might support under the freedom to innovate: the freedom to audit, to understand what the systems we use are doing to our computers and with our information. “We need the ability to look at it and to say that something isn’t right.”
Five aspects of the Freedom to Innovate
- Freedom to create new technologies
- Freedom to tinker with existing technologies
- Freedom to connect with communities of interest
- Freedom to liberate content for additional uses
- Freedom to audit existing systems
These rights – to create, to modify, to join communities, to liberate and to audit technologies – are all deeply complicated by DMCA 1201, a section of the Digital Millennium Copyright Act which shifts responsibility around the freedom to tinker with existing systems. Previously, if you altered a technology, your legal liability came from infringing a copyright by distributing cracked material. But under section 1201, simply circumventing copy-protection mechanisms is enough to face prosecution or liability. This shift puts legitimate security researchers at risk: Ed Felten – now Deputy U.S. Chief Technology Officer – took the Secure Digital Music Initiative up on its challenge to remove watermarks from its sound recordings, and the SDMI ended up threatening Felten with prosecution under section 1201.
The only ways around 1201, Zittrain tells us, are exemptions, like an explicit exemption that allows librarians to defeat copy protection so they can make the decision as to whether they want to acquire a copy of a work. “This has probably never been invoked,” Zittrain speculates. “It’s basically there to let librarians feel a little better about the law.”
“Why should this zone be one of cat and mouse?” asks Zittrain. The industry releases something and hopes the community won’t hack it. The community creates something new and wonders whether they’re going to be prosecuted over it. “There ought to be a way to have fair use without hacking to get it,” Zittrain argues. “And the best you’re ever going to get with litigating under 1201 is that you’ll get permission to hack into something like Facebook for a specific set of good reasons… now good luck hacking in!”
“Why shouldn’t the cat and mouse make peace? Why shouldn’t Facebook be required to make accessible data for certain types of research so we can understand what’s going on in the world?”
The recent discovery that Volkswagen had taught their cars to lie about emissions raises questions about the dangers of this cat and mouse game. But there’s a tension as well – we want to get into the circuit boards, review the code and figure out what the VW software is and isn’t doing. But at the same time, we live in a society that is extremely paranoid about security (as we learned with Ahmed Mohamed’s clock) – will we want to drive our cars after hacking into them to review their emissions?
(Zittrain suggests that there may be some technologies where DRM is desirable to prohibit tinkering, like with CT scanners. Cory Doctorow, in the audience, argues that for that argument to hold, DRM would need to work, which it never does, and needs to be auditable because there’s no security through obscurity.)
As we head towards the Internet of Things, we’re going to fight over models for how objects talk to the internet. Will the internet of the Internet of Things be the real internet, where anything can talk to anything, and it’s up to the thing to figure out if it wants to listen? Or will it be a closed, corporate net where objects only talk to their vendors? We’ll end up resolving this against a backdrop of legal liability, a world in which things sometimes go feral. Who’s responsible when your Philips tunable bulb is reprogrammed to burn down your house? Amazon recently announced their platform for the Internet of Things, a framework that fills a genuine need: the ability to constrain what can talk to what. But Amazon is going to charge for this privilege, raising questions about whether we want to hand this responsibility to commercial entities.
When we think about the generative, blinking cursor, Zittrain tells us, MIT and other academic institutions created this environment and this paradigm. And universities have a huge role to play in defending and promoting the freedom to tinker and the freedom to innovate. “I fear that this mission has been forgotten, and that people like Peter Thiel, who are encouraging people to innovate outside the university, are helping this be forgotten.” We don’t want these institutions to be oracular, to predict the future of the devices we can use and how we interact with them. But we do want them to be “productively non-neutral”. We need universities to be opinionated about the freedom to innovate and the freedom to create the future.