Regardless of the initial reason for implementing Internet filtering, there is increasing pressure to expand its use once the filtering infrastructure is in place.
Norway has been filtering child abuse websites since October 2004. The system operates similarly to others (Canada, the UK, Sweden): people can report illegal websites, the websites are reviewed, and if they are deemed illegal they are added to the block list.
These programs, especially since they are restricted to web-based filtering, do not impact a person determined to view child abuse images on the Internet. At best, they prevent casual or inadvertent access to designated websites. Filtering systems can be easily circumvented by those actively seeking and disseminating child pornography. Moreover, web-based child pornography is only the tip of the iceberg; much of it is traded over P2P networks or within private online groups.
Moreover, none of these filtering systems contribute to prosecution. The programs that have been implemented lead neither to the arrest and prosecution of individuals within these countries who access child pornography, nor to action against the people who create or traffic these images outside the country.
Norway’s system differs slightly in that it is a partnership between the police and ISPs, whereas the similar programs in Canada and the UK are private partnerships between NGOs and ISPs (although the government was involved in bringing those partnerships together).
In Norway, Telenor and KRIPOS, the Norwegian National Criminal Investigation Service, have introduced a filtering system to block child abuse websites on the Internet. Telenor is responsible for the technical solutions and KRIPOS provides updated lists of web sites that distribute such material. The cooperation stemmed from the efforts of the Norwegian Minister of Justice who brought the two together.
In communication with the authorities involved, I was told that block lists are produced by the police through a combination of browsing known newsgroups where such websites are promoted and tips received via a police-operated tip line. Domain names (and very few IP addresses) that contain child abuse images are added to the filter. The filtered websites are reviewed at least once a month.
Telenor does not store any data or logs on users that trigger the filter. When users trigger the filter they are presented with a webpage that indicates that the site is blocked because it contains illegal content. It also contains links to Norwegian law as well as an email address users can write to if they believe the content has been blocked in error. Also, the filter does not affect file sharing services or email.
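As a rough illustration, a domain-based filter of this kind amounts to a lookup that either passes a request through or redirects it to a block page. The sketch below is my own, hypothetical and greatly simplified; the domain names and block-page URL are invented, and it is not Telenor's actual implementation:

```python
# Hypothetical sketch of a domain-based blocklist filter.
# NOT Telenor's implementation; all names below are invented.

BLOCKLIST = {"blocked-site.invalid", "other-blocked.invalid"}
# The block page explains why the site is blocked, links to the relevant
# law, and gives a contact address for disputing an erroneous block.
BLOCK_PAGE = "https://blockpage.example/stopped.html"

def normalize(host: str) -> str:
    """Lowercase and strip a trailing dot so lookups are consistent."""
    return host.lower().rstrip(".")

def resolve(host: str) -> str:
    """Return the block page for listed domains, otherwise pass through."""
    if normalize(host) in BLOCKLIST:
        # Nothing is logged about the user who triggered the filter,
        # matching the policy described above.
        return BLOCK_PAGE
    return f"https://{host}/"  # normal resolution (placeholder)

print(resolve("Blocked-Site.invalid."))  # redirected to the block page
print(resolve("example.org"))            # passes through untouched
```

Because the lookup operates on whole domain names, it cannot distinguish individual pages or images on a site, and it never sees traffic on file-sharing services or email at all, which is consistent with the limitations described above.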
I was also told that any websites related to payment for access to child abuse websites are added to the block list. The stated goal is to disrupt the ability of such organizations to sell and distribute child pornography to a Norwegian client base. (I am not sure exactly what these sites are; I assume they don’t mean PayPal.)
Now, three years later, a new filtering proposal is being floated in Norway. (The original article is only available in Norwegian.) Via the translation provided by luni.net, the proposal calls for expanding the system to block:
* Foreign gambling sites (preserving the very lucrative Government owned monopoly on gambling)
* P2P sites offering illegal downloads such as MP3s, TV shows and movies
* Sites desecrating the Flag or Coat of Arms of a foreign nation
* Sites promoting hatred towards public authorities, racism and hate speech
* Sites offering pornography that may cause offence
I am not sure whether this proposal will actually be implemented; however, it clearly indicates that once a filtering system is in place there will be pressure to expand its use beyond the original stated goal.
Recently, ISPs in Canada announced that they would begin filtering child abuse websites, and in response to public pressure (and here) and the work of Michael Geist, documentation has been posted that clarifies some key concerns such as the appeals process, continuing review, and potential overblocking.
While many questions have been answered some key ones remain, such as:
* What is actually blocked: IPs, domains, URLs, or URLs to specific images? The FAQ says they collect URLs, but does not say how ISPs are choosing to block.
* Is there any effort made to contact equivalent organizations to Cybertip (or law enforcement) in the country in which the offending content is hosted? The FAQ says “Reports deemed potentially illegal are forwarded to the appropriate law enforcement jurisdiction” but it is unclear if that means Canada or the world.
* Have the ISPs sought/received permission from the CRTC to block sites? Do they need permission from the CRTC? The FAQ says that “ISPs’ AUPs and Terms of Service permit this action” but it does not address CRTC authorization.
* Is it possible to have a test research site without any illegal content added to the blocklist for research purposes?
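The first question above matters because the blocking granularity determines how much legal content gets caught alongside the illegal content. A hypothetical sketch (all hosts, IPs, and URLs invented) of the three levels:

```python
# Hypothetical illustration of blocking granularity.
# All hosts, IPs, and URLs are invented for the example.
from urllib.parse import urlparse

IP_BLOCKLIST = {"192.0.2.10"}       # coarsest: hits every site on a shared host
DOMAIN_BLOCKLIST = {"bad.example"}  # hits every page on the domain
URL_BLOCKLIST = {"http://bad.example/only-this-page"}  # most precise

# Two unrelated sites sharing one IP address, as on a shared web host.
HOST_TO_IP = {"bad.example": "192.0.2.10", "innocent.example": "192.0.2.10"}

def blocked(url: str) -> dict:
    """Report which blocking granularity would catch this URL."""
    p = urlparse(url)
    return {
        "by_ip": HOST_TO_IP.get(p.hostname) in IP_BLOCKLIST,
        "by_domain": p.hostname in DOMAIN_BLOCKLIST,
        "by_url": url in URL_BLOCKLIST,
    }

# IP-level blocking also catches an unrelated site on the same shared host:
print(blocked("http://innocent.example/home"))
# Domain-level blocking catches every page on the domain, not just the
# specific offending URL:
print(blocked("http://bad.example/legal-page"))
```

In this sketch, IP blocking takes down the innocent co-hosted site, domain blocking takes down all of bad.example, and only URL blocking targets the single offending page, which is why an FAQ that says "we collect URLs" but not how ISPs block leaves the overblocking question open.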
In terms of mission creep, Michael Geist warned:
Canadian law would currently prohibit extending the block list to other forms of content, however, ISPs and the Internet community must be vigilant to ensure that fears of a “slippery slope” do not come to fruition.
I think the current Norway case is indicative of the direction that filtering takes. It often begins with a serious issue that demands concrete action but then becomes a blunt instrument to pursue other content areas. We do need to be vigilant.
Since filtering programs neither facilitate the removal of child abuse images nor the prosecution of those who create and traffic in them, they only conceal the problem. While there may never be an easy answer, I believe that this serious issue demands a solution that is not predicated on blocking “accidental access”, especially since the risk of mission creep is so substantial.
The issue of child abuse images on the Internet is a significant international challenge. However, this challenge should not deter us from fighting the trafficking of child abuse images. We have to make a serious effort to coordinate our efforts nationally and internationally within an institutional framework that has the expertise and authority to identify and remove child pornography and prosecute those who are creating, distributing, buying and accessing it.
I don’t have a good enough answer to this serious and important issue, but I know that filtering is not the solution.