Rebecca’s got an excellent post up today explaining a critical distinction in Internet censorship in China – the difference between internal and external censorship. In external censorship, the “great firewall” (neither a firewall nor, actually, all that great) blocks access to certain sites hosted around the world to users in China. Human Rights Watch‘s website, for instance, is hosted in the US. Chinese censors can’t do anything about the content on those servers except to prevent it from being accessed from Chinese networks. This form of external censorship is pretty common – our friends at the Open Net Initiative have documented it at one point or another in roughly two dozen countries.
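To make the external model concrete, here's a minimal sketch of the kind of probe a researcher inside a filtered network might run: resolve a hostname, then try to connect, and note which step fails. This is an illustration of the general technique, not ONI's actual methodology; real testing has to carefully rule out ordinary network failures before calling anything censorship.

```python
import socket

def probe(hostname, port=80, timeout=5):
    """Report whether a site resolves and accepts connections from this network."""
    try:
        ip = socket.gethostbyname(hostname)
    except socket.gaierror:
        return "DNS lookup failed (consistent with DNS tampering)"
    try:
        # Attempt a plain TCP connection to the resolved address.
        with socket.create_connection((ip, port), timeout=timeout):
            return f"reachable at {ip}"
    except OSError:
        return f"resolved to {ip}, but the connection failed (consistent with IP blocking)"

print(probe("www.hrw.org"))
```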
But China practices another kind of censorship as well. When bloggers in China write on Chinese-hosted blog services, authorities can exercise much more control over that content. As Rebecca explains, Chinese content-hosting companies maintain teams that monitor all content created and block sensitive content from being published. The CEOs of those Chinese blog companies meet with the Internet Information Administrative Bureau on a weekly basis to make sure the most sensitive content is blocked. The reason internet data centers have been shut down prior to the 17th Party Congress, Rebecca explains, is that the companies hosted at those centers don’t control content as tightly as Sina, Sohu and other big Chinese Web 2.0 companies. (China uses external censorship to block access to most non-Chinese blog companies, making it difficult for Chinese users to avoid internal censorship by posting on WordPress.com, for instance.)
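Much of that first-pass monitoring is reportedly automated keyword matching, with the human review teams behind it. Here's a toy sketch of the pre-publication screening step; the blocklist entries are invented, since the real lists are secret:

```python
# Toy illustration of server-side ("internal") censorship: the host
# screens each post against a keyword blocklist before publishing it.
BLOCKLIST = {"banned-term", "sensitive-phrase"}  # hypothetical entries

def review_post(text):
    """Return (approved, reason) for a submitted post."""
    lowered = text.lower()
    for term in BLOCKLIST:
        if term in lowered:
            return False, f"held for review: contains {term!r}"
    return True, "published"

print(review_post("An ordinary post about cute cats."))  # (True, 'published')
```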
While internal and external is a good way to think about censorship in a Chinese context, there’s another way of parsing the situation that makes more sense in the international realm: social versus political versus generative filtering. The folks at ONI have done a good job of documenting the first distinction, social versus political. In social filtering, governments mandate the blocking of content that challenges social norms and mores. Saudi Arabia, for instance, blocks pornographic web content on the grounds that such content is un-Islamic – this type of blocking is fairly common in the Middle East, especially in the more conservative countries. But it happens in Europe as well: France and Germany have moved to block some types of Nazi content and imagery.
Political filtering occurs when a country blocks content that’s opposed to the current government, either the sites of internal opposition movements or international critics like human rights groups. This sort of filtering, in my opinion, is a good deal more sinister than social filtering. While I would prefer a world where no content is filtered, it’s possible to accept that a country would choose to prevent its citizens from viewing online pornography; it’s much harder to accept that a legitimate government would restrict its citizenry from hearing the voices of opposing political parties or external critics. One of the dangers of social filtering, some ONI researchers have suggested, is that it invites governments who’ve implemented it to expand their reach and begin political filtering as well.
There’s a third front developing in filtering: blocking access to generative tools. I’m using Jonathan Zittrain’s term “generative” to refer to technologies that are easy to use, accessible to a broad audience, and can be turned to multiple purposes. In this sense, many Web 2.0 technologies are highly generative – they’re designed for easy use by a wide range of users, and can be used for purposes that range from banal to profoundly political. (See Cute Cat Theory for more thoughts on this.)
Censoring these generative technologies is undeniably tempting for governments. When Thailand discovers that videos offensive to the Thai king are appearing on YouTube, the temptation to block the site is an obvious one. But blocking the site is even more dangerous than blocking the individual expressions a government finds offensive. Blocking generative tools means blocking the possibility of speech, not just speech found to contravene local laws and norms. When Thailand blocks YouTube, it’s not just blocking some videos that many Thai people would find offensive; it’s also blocking millions of videos that are innocuous, or that criticize or comment on issues in Thailand in interesting, important and meaningful ways.
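The bluntness of the approach is easy to see in code. Here's a hypothetical sketch contrasting the two kinds of rules: a hostname-level block takes out every video at once, while a URL-level block takes out only the items on a list. The URLs below are invented for illustration.

```python
from urllib.parse import urlparse

BLOCKED_HOSTS = {"www.youtube.com"}                     # blocks the whole tool
BLOCKED_URLS = {"http://www.youtube.com/watch?v=xxxx"}  # blocks one (invented) video

def is_blocked(url):
    """Apply a hostname-level rule and a URL-level rule."""
    return urlparse(url).hostname in BLOCKED_HOSTS or url in BLOCKED_URLS

# The hostname rule sweeps up an innocuous video along with the offending one:
print(is_blocked("http://www.youtube.com/watch?v=entirely-innocuous"))  # True
```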
Open Net Initiative is documenting the filtering of generative tools in a general “internet tools” category that includes the blocking of voice over IP software, tools used to circumvent firewalls, and search and translation services (which can sometimes be used to circumvent firewalls themselves). I think there’s a need to recognize that filtering these sorts of tools – which China has raised to an art form – is a special threat to what’s most important about the internet: the fact that anyone with access to the net can create as well as consume content, expressing themselves as well as hearing voices from the rest of the world.
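Translation and search services end up in that category because they fetch pages on a user's behalf, which can make them inadvertent proxies. Here's a hypothetical sketch of the trick; the translator endpoint and its query parameters are invented, and this shows the shape of the technique rather than a working circumvention tool:

```python
from urllib.parse import quote

def via_translator(blocked_url, translator="http://translate.example.com/translate"):
    """Build a request asking a (hypothetical) translation service to fetch a page for us."""
    return f"{translator}?sl=auto&tl=en&u={quote(blocked_url, safe='')}"

print(via_translator("http://www.hrw.org/"))
```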
Blocking these tools altogether is clumsy and hard to sustain. The very fact that these tools have so many banal uses means that a block on Flickr will piss off people who want to share family or vacation photos as well as those using images for political protest. But China’s in a near-unique position here. The Chinese Web 2.0 industry is huge and well developed (see this montage of Chinese Web 2.0 logos, a response to a similar montage of US and European Web 2.0 companies). As a result, it’s possible to shut off access to non-Chinese Web 2.0 sites and permit access to domestic ones, relying on corporate censorship to prevent certain types of expression from taking place.
Social filtering in many countries relies on tools originally developed for the US market – the United Arab Emirates, for instance, accomplishes its social filtering with Smartfilter, a product built to prevent students from accessing pornography on school computers. It will be instructive to see whether China’s method of filtering generative tools is echoed by other governments that want to maintain the appearance of an open, generative web while controlling some types of expression.