Chris Conley leads one of the most difficult research projects we’ve undertaken at the Berkman Center – the surveillance study of the Open Net Initiative. Over the past five years, the good folks at ONI have gotten very smart about how the internet is filtered in nations around the world. What’s much less clear is how the internet is monitored, and what governments, law enforcement agencies, corporations and others are able to track about our behavior online.
One problem: if surveillance is performed competently, it should be undetectable. Second problem: it’s often to someone’s advantage to claim that surveillance is taking place, even when it’s not, because the claim alone can change behavior. (Think about “dummy” cameras mounted on your house as part of a fake “security system”. If they’re convincing enough, perhaps they don’t actually need to work.) Conley mentions my comments about the panopticon effect of surveillance in a recent Newsweek article – I assert that Zimbabwe isn’t able to monitor the Internet effectively… but by stating that they will, they’ve forced a large number of users to remove sensitive information and conversations from the Internet.
In attempting to understand and explain surveillance to academics, activists and the general public, Conley would prefer to study what’s actually happening. Unfortunately, that data’s pretty uncommon. We know about situations where surveillance is discovered by the target, and about cases where information is leaked or publicly released, but such cases are quite rare.
Instead, in many cases, we do better to study capabilities. What tools are available that individuals or governments could use to monitor networks? What tools can be used to scan a hard drive over the network? What are the capabilities and vulnerabilities of tools like GMail, Google Docs and Facebook? How is the network laid out, and what does that imply about the technological constraints on monitoring?
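To make “capabilities” concrete, here’s a minimal sketch (my own illustration, not a tool from the ONI project) of passive network monitoring in Python, assuming the scapy library is installed and the script has packet-capture privileges: a handful of lines is enough to log every unencrypted DNS lookup visible on a network segment, which is exactly the kind of low-cost capability this approach tries to catalogue.

```python
# Illustrative sketch only: a passive DNS logger built on the scapy library.
# Assumes scapy is installed and the script runs with packet-capture
# privileges (e.g. as root). This is not a tool from the ONI study; it just
# shows how little code basic network monitoring requires.
from scapy.all import sniff, DNS, DNSQR, IP

def log_dns_query(packet):
    """Print the source address and hostname of each DNS query we see."""
    if packet.haslayer(IP) and packet.haslayer(DNSQR) and packet[DNS].qr == 0:
        hostname = packet[DNSQR].qname.decode(errors="replace").rstrip(".")
        print(f"{packet[IP].src} looked up {hostname}")

# Capture UDP port 53 traffic indefinitely, passing each packet to the logger.
sniff(filter="udp port 53", prn=log_dns_query, store=False)
```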
It’s been widely reported – though only very thinly disclosed – that there’s widespread domestic surveillance taking place in the US with the intention of monitoring suspected terrorists. And CALEA – the Communications Assistance for Law Enforcement Act – provides a legal framework to enable wiretapping of traditional, mobile and VoIP phones in the US. These facts have implications for privacy, civil liberties and human rights.
Conley wishes that those doing surveillance would consider more carefully the possibility that transparency is sometimes in their best interest. Unfortunately, that’s very much counter to the ethos of the surveillance community. He quotes Ed Giorgio, a security consultant to the NSA, who says, “We have a saying in this business: privacy and security are a zero-sum game.” The fear is that revealing surveillance would allow potential targets to avoid detection and monitoring. But he believes there are cases where transparency might make surveillance more effective for the surveillers.
Transparency can raise awareness of surveillance – which might be to the advantage of a program designed to alter the behavior of people under surveillance. He notes that the effects of transparency depend on the purpose of the surveillance. Facebook, for instance, surveils your activity constantly to report it to your friends, but users strongly disliked it when Facebook began surveilling purchasing behavior on other sites via its Beacon program. He also suggests that transparency might need to be very specific to achieve a desired end. The RIAA can claim to surveil filesharing networks, but most users believe they won’t be caught. If the RIAA advertised that it could detect 5% of all illegal filesharing, that might give users an incentive to stop sharing. But if it announced that it could detect only BitTorrent sharing, that would likely drive people to alternative tools.
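A quick back-of-the-envelope calculation shows why the advertised detection rate matters (the numbers below are my own hypothetical figures, not anything cited by Conley or the RIAA): a user deciding whether to share files is, implicitly, weighing an expected penalty of detection probability times settlement cost against the value of sharing.

```python
# Hypothetical deterrence arithmetic; the detection rates and settlement
# amount below are invented for illustration, not figures from the talk.
def expected_penalty(detection_rate, settlement):
    """Expected cost per act of filesharing, given a chance of being caught."""
    return detection_rate * settlement

# If users believe detection is vanishingly rare, the expected cost is noise.
print(expected_penalty(0.0001, 3000))  # $0.30 per act: easy to ignore
# An advertised 5% detection rate makes the same act look far more expensive.
print(expected_penalty(0.05, 3000))    # $150.00 per act: a real deterrent
```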
If Conley could influence those implementing surveillance, he’d suggest that the following types of disclosure might benefit the people performing surveillance – this is information with “limited negative effect and substantial benefits to disclosure”:
– the mere existence of surveillance programs
– the purpose of the program
– the scope of the program – is it targeting everyone, or just pre-selected targets?
– third-party cooperation that is nominally voluntary
There’s a complex set of legal issues that arise around surveillance in a digital environment. For one thing, there are many more channels for surveillance – services like OnStar, an in-vehicle tracking system, can be turned on for law enforcement purposes. What are the legal rights of an OnStar user? What sort of US Fourth Amendment (privacy) restrictions apply to data you’ve stored online? As the project goes forward, analysis of these legal questions needs to complement research on what we know about surveillance and what might and might not be possible.