Posts tagged “Protecting Children”

Wikipedia, Cleanfeed & Filtering

The IWF classified a Wikipedia page as containing a pornographic image of a child. As a result, UK ISPs that participate in the Cleanfeed program are now blocking access to the Wikipedia page of the band the Scorpions because of a controversial album cover that is potentially child pornography and thus illegal under UK law. The IWF states:

A Wikipedia web page, was reported through the IWF’s online reporting mechanism in December 2008. As with all child sexual abuse reports received by our Hotline analysts, the image was assessed according to the UK Sentencing Guidelines Council (page 109). The content was considered to be a potentially illegal indecent image of a child under the age of 18, but hosted outside the UK. The IWF does not issue takedown notices to ISPs or hosting companies outside the UK, but we did advise one of our partner Hotlines abroad and our law enforcement partner agency of our assessment. The specific URL (individual webpage) was then added to the list provided to ISPs and other companies in the online sector to protect their customers from inadvertent exposure to a potentially illegal indecent image of a child.

But why didn’t they just block access to the specific URL of the offending image? Instead, they blocked the entire page, whose text (and other images) is completely legal. There is no technical reason why they cannot block URLs to specific offending images in exactly the same way as they can block a specific Wikipedia page rather than the entire Wikipedia site.

The IWF collects URLs that are potentially illegal for containing child pornography and sends them to participating ISPs in the UK as part of the Cleanfeed program. The ISPs then block access to these URLs. These URLs may be shared with other agencies through the INHOPE network and possibly with commercial filtering companies as well. Canada has a similar Cleanfeed program in which Cybertip.ca collects the potentially illegal URLs and sends them to Canadian ISPs, which then block access to them. One of the main reasons why Cleanfeed has been successful and replicated in other countries is that it was supposed to elegantly avoid the pitfall of overblocking, the key objection that was consistently raised by civil libertarians and others with respect to filtering. This is why filtering at the URL level is so important: one offending page can be blocked while the rest of the site remains available.

One of the questions I’ve often raised (in the Canadian context) concerns what precisely is blocked. We know that Cleanfeed systems can block at the URL level, so why block access to the web page containing the offending image and not the URL of the offending image itself? There is no technical reason for not doing so. If the IWF had added the URL of the specific offending image embedded in the Scorpions Wikipedia page, the text of the article, which is perfectly legal, would still be available along with all the other legal images. Only the one offending image would have been blocked.
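The distinction is easy to see in a sketch. Assuming a hypothetical exact-match URL blocklist (all URLs below are illustrative placeholders, not real blocklist entries), blocking only the image URL leaves the article itself reachable:

```python
# Hypothetical sketch of URL-level filtering: an exact-match blocklist
# can target a single embedded image without touching the page that
# contains it. URLs are invented for illustration.

BLOCKLIST = {
    # Only the image URL is listed, not the article that embeds it.
    "http://upload.example.org/images/offending_cover.jpg",
}

def is_blocked(url: str) -> bool:
    """Return True if this exact URL appears on the blocklist."""
    return url in BLOCKLIST

# The article page is still reachable; only the embedded image is not.
print(is_blocked("http://en.example.org/wiki/Some_Article"))               # False
print(is_blocked("http://upload.example.org/images/offending_cover.jpg"))  # True
```

The same mechanism that lets Cleanfeed block one Wikipedia page instead of all of Wikipedia could equally block one image URL instead of the whole page.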

For a system that was designed not to overblock, I find it hard to understand why they don’t block only the specific offending images. If an entire website were devoted to showing images of child abuse it would be understandable, but Wikipedia?

ISP Filtering

After reading this great enumeration of various efforts to block accidental access to images of child sexual abuse, I updated my collection of blockpages to include those from Sweden, Switzerland and Denmark.

This document notes many of the unintended consequences of filtering, especially overblocking, and it challenges the wisdom of making the blocking look like an error, as opposed to presenting the user with a blockpage:

Providing such a notice seems far more likely to achieve the intended objective of discouraging access to material that is illegal to possess, and raising public awareness of the fact that such a law exists, than merely providing a ‘page not found’ notice.
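The two responses the document contrasts can be sketched minimally. The status codes and wording below are my own illustrative choices, not anything prescribed by the document or by any filtering system:

```python
# Sketch: two ways a filter might answer a request for a blocked URL.
# A fake "page not found" hides the fact of blocking; an explicit
# blockpage tells the user what happened and why. Purely illustrative.

def fake_not_found() -> tuple[int, str]:
    # Indistinguishable from an ordinary missing page; the user learns
    # nothing and may simply assume the site is broken.
    return 404, "Not Found"

def explicit_blockpage() -> tuple[int, str]:
    # Discourages access and raises awareness that the law exists,
    # as the document argues.
    return 403, (
        "Access to this page has been blocked because it appears on a "
        "list of material that is illegal to possess under national law. "
        "If you believe this is an error, contact the administering body."
    )

status, body = explicit_blockpage()
print(status)  # 403
```

The Scandinavian blockpages mentioned above follow the second pattern: they name the blocking body and, in some cases, provide a contact address for appeals.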

In the context of Sweden it also discusses threats to block the BitTorrent tracker The Pirate Bay by adding it to the child pornography blocklist. Mission creep is always present.

I’ve also updated it with the blockpage that users in Denmark see when they try to access The Pirate Bay.

Filtering to curb copyright violation is reportedly gaining ground in Europe:

To recap, the Commission saw great merit in an anti-piracy system where Internet Service Providers (“ISPs”) would voluntarily agree to monitor their users and report the infringers to the industry reps or to the authorities, as well as possibly cut off their internet connection. From what we have heard from our sources at the Commission, a lot of the feedback they have currently received has been very supportive of the idea of filtering and monitoring. This has now emboldened some officials to push forward with plans to implement such voluntary EU-wide proposals, although nothing has yet been firmly decided. EU law clearly states that ISPs have no obligation to monitor and filter content, but the carrot they get from participating is that they are less likely to be sued by IFPI and others.

This is something that the Copyright Lobby has been slowly moving toward here in Canada.

Finland Filtering

Finland’s filtering system, put in place to block access to images of child abuse (child pornography), is blocking sites that do not meet this criterion. In addition to blocking an anti-censorship activism site, the filtering seems to be significantly overblocking. EFFi reports:

The censorship supposedly applies only to foreign web sites that are used to distribute child pornographic images and the block list indeed reportedly contains such sites. However, many of the censored sites are apparently legal pornographic sites. Most of the censored sites are located in the United States or in the EU countries which have strict legislation against child pornography. Many of the censored US sites contain the 18 U.S.C. 2257 notice. Many of the blocked sites are link farms, without actual independent image content. The block list reportedly contains disproportionately many gay sites. The censorship however extends not only to the adult sites.

An interesting issue brought up in this case concerns links. The website of the anti-censorship activist Matti Nikki was censored after he published the blocklist as hyperlinks:

Previously the list of censored sites on Nikki’s site had just the names of the sites, not links. To enter a censored site one had to copy the site name to the address bar of the browser. The site was censored after Nikki had made the names of the sites clickable links (after which there was no need to manually copy the site names to the address bar of the browser). According to the police FAQ (in Finnish) the block list includes sites with “a working link to a site containing child pornography”. There is however no apparent legal basis for the distinction between not censoring a site with a written site name of an alleged child pornographic site, and censoring a site with the corresponding clickable link.

This is an interesting case as it shows how the lack of transparency and accountability can lead to practices that impinge on freedom of expression despite the intended goal of protecting children.

(More screen shots of block pages at

Child Protection Online

The Privacy Commission’s blog has an interesting post about the protection of children online. The context is privacy rather than the usual implementation of filtering technologies.

* There are increasingly deep levels of intimacy between marketers and children – there’s a thin line between content and commerce
* All the major children’s playsites comply with data protection laws – in fact they all market themselves as champions of children’s privacy
* In these children’s sites, the pervasive market research invades privacy – seamless surveillance – colonizing their play – constraining the identities available to them – recasting things like citizenship, friendship, autonomy, choice and control within the framework of the marketplace

It is interesting because none of these sites would be blocked by filtering software (ostensibly implemented to block pornography, etc.) because they are kids’ sites. It demonstrates not only that throwing technology at a social problem will not “fix” it, but also the need for parents and children to communicate and educate themselves about Internet safety.

Mission Creep

Mission Creep:

Regardless of the initial reason for implementing Internet filtering, there is increasing pressure to expand its use once the filtering infrastructure is in place.

Norway has been filtering child abuse websites since October 2004. The system operates similarly to others (Canada, UK, Sweden): people can report illegal websites, the websites are reviewed, and if they are deemed to be illegal they are added to the block list.

These programs, especially since they are restricted to web-based filtering, do not impact a person determined to view child abuse images on the Internet. At best, they prevent casual or inadvertent access to designated websites. Filtering systems can be easily circumvented by those actively seeking and disseminating child pornography. Moreover, web-based child pornography is only the tip of the iceberg. Much of it moves by P2P or private groups online.

Also, none of these filtering systems contributes to the prosecution of those trafficking in child pornography. The filtering programs that have been implemented lead neither to the arrest and prosecution of individuals within these countries who access child pornography nor to that of the people outside the country who create or traffic in these images.

Norway’s system differs slightly in that it is a partnership between the police and the ISPs, whereas similar programs in Canada and the UK are private partnerships between NGOs and ISPs (although the government was involved in bringing the partnership together).

In Norway, Telenor and KRIPOS, the Norwegian National Criminal Investigation Service, have introduced a filtering system to block child abuse websites on the Internet. Telenor is responsible for the technical solutions and KRIPOS provides updated lists of web sites that distribute such material. The cooperation stemmed from the efforts of the Norwegian Minister of Justice who brought the two together.

In communication with the authorities involved, I was told that block lists are produced by the police through a combination of browsing known newsgroups where such web sites are promoted and a police-operated tip line. Domain names (and very few IP addresses) that contain child abuse images are added to the filter. The filtered websites are reviewed at least once a month.

Telenor does not store any data or logs on users that trigger the filter. When users trigger the filter they are presented with a webpage that indicates that the site is blocked because it contains illegal content. It also contains links to Norwegian law as well as an email address users can write to if they believe the content has been blocked in error. Also, the filter does not affect file sharing services or email.

I was also told that any websites related to payment for access to child abuse websites are added to the block list. The stated goal is to disrupt the ability of such organizations to sell and distribute child pornography to a Norwegian client base. (I am not sure what exactly these sites are; I assume they don’t mean PayPal.)

Now, three years later, a new filtering proposal is being floated in Norway (the original article is only available in Norwegian). According to the translation provided, the proposal calls for expanding the system to block sites that are:

* Foreign gambling sites (preserving the very lucrative Government owned monopoly on gambling)
* P2P sites offering illegal downloads such as MP3s, TV shows and movies
* Sites desecrating the Flag or Coat of Arms of a foreign nation
* Sites promoting hatred towards public authorities, racism and hate speech
* Sites offering pornography that may cause offence

I am not sure if this proposal will actually be implemented, however, it clearly indicates that once the filtering system is in place there will be pressure to expand its use beyond the original stated goal.

Recently, ISPs in Canada announced that they would begin filtering child abuse websites, and in response to public pressure and the work of Michael Geist, documentation has been posted that clarifies some key concerns such as the appeals process, continuing review and potential overblocking.

While many questions have been answered some key ones remain, such as:

* What is actually blocked: IPs, domains, URLs, or URLs to specific images? The FAQ says they collect URLs, but does not say how ISPs are choosing to block.

* Is there any effort made to contact equivalent organizations to Cybertip (or law enforcement) in the country in which the offending content is hosted? The FAQ says “Reports deemed potentially illegal are forwarded to the appropriate law enforcement jurisdiction” but it is unclear if that means Canada or the world.

* Have the ISPs sought/received permission from the CRTC to block sites? Do they need permission from the CRTC? The FAQ says that “ISPs’ AUPs and Terms of Service permit this action” but it does not address CRTC authorization.

and …

Is it possible to have a test research site without any illegal content added to the blocklist for research purposes?

In terms of mission creep, Michael Geist warned:

Canadian law would currently prohibit extending the block list to other forms of content, however, ISPs and the Internet community must be vigilant to ensure that fears of a “slippery slope” do not come to fruition.

I think the current Norway case is indicative of the direction that filtering takes. It often begins with a serious issue that demands concrete action but then becomes a blunt instrument to pursue other content areas. We do need to be vigilant.

Since filtering programs do not facilitate the removal of child abuse images and do not facilitate the prosecution of those who create and traffic in them, they only conceal the problem. While there may never be an easy answer, I believe that this serious issue demands a solution that is not predicated on blocking “accidental access”, especially since the risk of mission creep is so substantial.

The issue of child abuse images on the Internet is a significant international challenge. However, this challenge should not deter us from fighting the trafficking of child abuse images. We have to make a serious effort to coordinate our efforts nationally and internationally within an institutional framework that has the expertise and authority to identify and remove child pornography and prosecute those who are creating, distributing, buying and accessing it.

I don’t have a good enough answer to this serious and important issue, but I know that filtering is not the solution.

CleanFeed Canada

There is now an FAQ on CleanFeed Canada as well as information on the appeal process.

ISP’s vs. the State

The recent CRTC/Warman case has generated much discussion. One of the issues brought forward is that this case sought to have the CRTC authorize ISPs to voluntarily block sites, something that distinguishes it from government-mandated censorship. While true, it raises several issues.

Firstly, it is not necessarily a positive thing. ISPs have engaged in misguided vigilantism in the past, blocking content the ISP feels it is entitled to block (and doing so in a manner that blocked nearly 800 unrelated sites in order to block just one site). ISPs are not equipped or qualified to make judgements on content and will always default to the lowest common denominator, which has serious repercussions for freedom of speech and expression. Putting “the issue in the hands of ISPs” is not necessarily a better option than state censorship.

Secondly, as the case of China illustrates, the government will often authorize certain topics to be censored leaving the decision on what is specifically blocked to private firms. As the N.Y. Times reported:

American Internet firms typically arrive in China expecting the government to hand them an official blacklist of sites and words they must censor. They quickly discover that no master list exists. Instead, the government simply insists the firms interpret the vague regulations themselves.

Domestic Chinese blog providers have told me how they create lists of what terms to filter themselves. Governments delegate responsibility for censoring to ISPs and ICPs (content providers). Often, ISPs will become very conservative, blocking more than they are required to block. In Iran, the ISP ParsOnline is known for blocking more content than the government requires.

Thirdly, the filtering systems in use to block child pornography also suffer from serious problems. In addition to the fact that filtering does not contribute to the removal of child pornography from the Internet and does not contribute to the arrest and prosecution of criminals who traffic in this material, it is also easily circumvented. At best, it prevents casual or inadvertent access to designated websites. (Websites are not necessarily the most common way this material is transmitted; P2P systems, for example, are unaffected by this type of filtering.) Also, there is often no review process (once on the list, always on the list) or appeals process (if content is removed or a mistake made). And although the technology is getting better, there is often significant overblocking: blocking content that was never intended to be blocked in the first place.

For example, a Pennsylvania state law that required ISPs to block access to web sites suspected of hosting child pornography was found to violate the First Amendment by the Federal District Court in Philadelphia. The ruling details the process, both legal and technical, that led to the blocking of 1.5 million legitimate websites while trying to block access to approximately 400 websites suspected of containing child abuse images.
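The Pennsylvania ruling illustrates why blocking at the IP level overblocks so badly: with virtual hosting, many unrelated sites share a single IP address, so null-routing one address takes them all down. A toy sketch (hostnames and addresses are invented for illustration):

```python
# Toy illustration of IP-level overblocking under virtual hosting:
# one IP address serves many unrelated domains, so blocking the
# address blocks every one of them. Names/addresses are made up.

HOSTING = {
    "203.0.113.7": [
        "suspect-site.example",     # the one intended target
        "family-photos.example",    # unrelated sites on the same host
        "small-business.example",
        "school-club.example",
    ],
}

def blocked_by_ip(ip: str) -> list[str]:
    """Every domain sharing the blocked IP becomes unreachable."""
    return HOSTING.get(ip, [])

def blocked_by_url(url: str) -> list[str]:
    """URL-level blocking touches only the one offending resource."""
    return [url]

print(len(blocked_by_ip("203.0.113.7")))  # 4 domains blocked for 1 target
print(blocked_by_url("http://suspect-site.example/bad-page"))
```

Scale this up to large shared-hosting providers and you get ratios like the one in the ruling: on the order of 1.5 million legitimate sites taken down in pursuit of roughly 400 suspect ones.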

The protection of children, the elimination of child pornography and combatting hate speech are extremely important concerns. Actions taken to fight the production and dissemination of child pornography and hate speech should be designed to be effective while not impinging on the free speech rights of citizens or placing the responsibility to determine offending content on Internet Service Providers (ISPs). To combat child pornography on the Internet, government, law enforcement, the ISP/web-hosting industry and civil society organizations must come together to work collectively on this serious problem.