Open Source Censorship?

A NewsForge article on the ONI raises the issue of open source censorship, something that we have discussed in the past. Internet filtering technologies are plagued by two inherent flaws: under-blocking (content that should be blocked remains accessible) and over-blocking (content that should not be blocked is inaccessible); this has been pointed out time and time again. Most filtering systems use a block list method in which administrators configure the filtering software to block categories of pre-selected URLs. In the case of proprietary filtering technologies, these block lists are kept secret. Efforts to legally obtain the contents of these secret lists have failed because the lists are the intellectual property of the censorware vendors. When using proprietary filtering technology, neither users nor administrators know exactly what is and is not blocked. Furthermore, countries or administrators may be blamed when URLs are unintentionally blocked because the censorware vendor has mis-classified them. Unlike proprietary filtering technology, open source filtering software (DansGuardian, SquidGuard) can be configured to use open block lists. These open block lists can be scrutinized, so users and administrators are fully informed as to what exactly is being blocked. When applied at the national level, this means that censorship can be implemented in an open and transparent manner by using open source censorware. But is this good enough?
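To make the block-list model concrete, here is a minimal sketch in Python of how a filter might check a requested URL against an open, plain-text list of banned domains. The file name, domains and list format are assumptions for illustration; DansGuardian and SquidGuard each use their own list formats and far more machinery.

    from urllib.parse import urlparse

    def load_blocklist(path):
        # Read an open, plain-text block list: one domain per line, '#' starts a comment.
        # (File name and format are illustrative, not any vendor's actual format.)
        domains = set()
        with open(path) as f:
            for line in f:
                line = line.strip()
                if line and not line.startswith("#"):
                    domains.add(line.lower())
        return domains

    def is_blocked(url, blocked_domains):
        # A URL is blocked if its host is a listed domain or a subdomain of one.
        host = (urlparse(url).hostname or "").lower()
        return any(host == d or host.endswith("." + d) for d in blocked_domains)

    blocked = load_blocklist("banned-domains.txt")  # hypothetical open block list
    print(is_blocked("http://www.example.org/page", blocked))

Because the list is just a readable text file, anyone can inspect it and see exactly which domains trigger a block, which is precisely the transparency that secret proprietary lists prevent.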

I have found that the open source filtering product DansGuardian is being used in Myanmar (Burma). In this case, the authorities have customized the block list to include dissident and human rights web sites, such as freeburma.org. Thus, despite the use of an open product, there is still no transparency or accountability, as the government has not made the modified list publicly available. The end result is that what exactly is blocked remains hidden from users, and researchers must still resort to trial and error, however automated, to identify blocked sites. Administrators, however, have the advantage of knowing exactly what is blocked. It is thus less a technological question and more a political one: transparency and accountability are a matter of policy, not purely of technology. Myanmar is governed by a repressive military regime; using open source will not rectify that.
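As a rough illustration of that automated trial and error, the Python sketch below probes candidate URLs and guesses from the response whether they are filtered. The marker strings are assumptions for the example, not ONI methodology or any vendor's actual block page, and real measurement work is considerably more careful than this.

    import socket
    import urllib.error
    import urllib.request

    def probe(url, timeout=10):
        # Fetch a URL and crudely classify the outcome. This is the kind of
        # guesswork researchers are forced into when block lists are secret.
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                body = resp.read(4096).decode("utf-8", "replace").lower()
                # Some filters serve an explicit denial page; these marker
                # strings are assumptions, not any product's actual wording.
                if "access denied" in body or "blocked" in body:
                    return "likely filtered (block page)"
                return "accessible"
        except urllib.error.HTTPError as e:
            return "HTTP error %d" % e.code
        except (urllib.error.URLError, socket.timeout):
            return "unreachable (filtering, or just a network failure?)"

    for url in ["http://freeburma.org", "http://www.example.org"]:
        print(url, "->", probe(url))

Even when such a probe flags a site, it cannot distinguish deliberate filtering from an ordinary outage, which is exactly why secret lists leave users and researchers guessing while administrators know for certain.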

In Thailand, the authorities have begun issuing lists of URLs to ISPs. The ISPs are then responsible for implementing the technical means of filtering to block access to these specific URLs. The block list is open, and meetings have been held to determine what web sites should be blocked, although the censors apparently had trouble agreeing on what to block. Despite this open and transparent process, at least one unofficial block has been identified. The promise of transparency does not necessarily mean accountability. In this case, the web site in question highlights a corruption scandal. Once a national filtering system is in place, governments may be tempted to use it as a tool of political censorship. There needs to be strong civil society involvement and the capacity to hold the government accountable. There needs to be a system in place with strong disincentives for the censors to overstep their authority. I'm not so sure this is the case in Thailand, although I'd be glad to be proven wrong.

Another case concerns Internet filtering in schools and libraries in the USA. Proprietary censorware vendors, such as N2H2 (BESS), Secure Computing (SmartFilter) and Websense, have been capitalizing on the Children's Internet Protection Act (CIPA), which requires K-12 schools and public libraries to implement Internet filtering in order to receive federal funding. Schools and libraries have been installing these technologies, yet they have no way of knowing exactly what they are blocking. This is a classic case of what Ron Deibert describes as "ceding to commercial entities the responsibility of placing limitations on freedom of speech through tools that are sheltered from close public scrutiny because of intellectual property protections."

Would open source censorship make a difference? The technology on its own would not. However, this is a case where open source technologies and open block lists, if implemented, could be scrutinized by school and library officials, parents could be consulted, and an open process for adding and removing blocks could be established. That is, a policy process would need to be put in place in addition to the technology. I don't know if even that would be enough.
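As a sketch of what such an open process might look like in practice, the short Python script below compares a current block list against a proposed revision, so that officials and parents could review every addition and removal before it takes effect. The file names and list format here are hypothetical.

    def read_list(path):
        # One entry per line, '#' starts a comment; format is hypothetical.
        with open(path) as f:
            return {line.strip().lower() for line in f
                    if line.strip() and not line.strip().startswith("#")}

    def audit(current_path, proposed_path):
        # Report what a proposed revision would add and remove.
        current, proposed = read_list(current_path), read_list(proposed_path)
        return sorted(proposed - current), sorted(current - proposed)

    added, removed = audit("blocklist-current.txt", "blocklist-proposed.txt")
    print("Would be added:", added)
    print("Would be removed:", removed)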

Using open source censorware does not solve the issue of filtering and censorship, but it can expose it. If accompanied by strong civil society opposition, it can raise issues of free speech and expression in a public way at a time when secret proprietary lists are in place and continue to be deployed worldwide. Censorship works because of secrecy; in fact, it is predicated on it. It has to be clear that open source censorship does not, on its own, create a situation that facilitates transparency and accountability. Rather, it should be seen as a step towards removing censorship, raising awareness by dragging these technologies and issues into the realm of public debate.
