COPA and Filtering



The Child Online Protection Act (COPA) received a blow from the U.S. Supreme Court, which ruled that COPA is likely unconstitutional. COPA is a law that purports to protect minors from viewing sexually explicit materials on the Internet by requiring those who post sexually explicit content for “commercial purposes” to restrict access to it, for example by demanding a credit card or using some other technological measure so that minors cannot gain access. Failure to do so could result in a $50,000 fine and six months in prison.

The Court decision argues that “blocking and filtering software is a less restrictive alternative,” that COPA can be circumvented by minors who have access to credit cards, and that COPA does not apply to foreign pornographic content. Therefore, even if COPA were to take effect, U.S.-based porn outfits could just move their operations overseas. It also concludes that COPA would only apply to Web content, whereas filtering technology, which can block email, chat, and other forms of Internet communication, is more effective. Finally, it suggests that filtering technology is less restrictive because it can “impose selective restrictions on speech at the receiving end, not universal restrictions at the source” and its use can be encouraged “by giving strong incentives to schools and libraries”. But is the use of filtering technologies a good substitute for bad legislation?

Text from the decision itself shows the problem with filtering. First, there is the suggestion that ALL pornography can be effectively filtered.

First, the record demonstrates that a filter can prevent minors from seeing all pornography, not just pornography posted to the Web from America.

Then comes an admission that filtering may not be as effective as just asserted: some pornographic content would still be accessible, and some non-pornographic content would in fact be blocked.

Although filtering software is not a perfect solution because it may block some materials not harmful to minors and fail to catch some that are, the Government has not satisfied its burden to introduce specific evidence proving that filters are less effective.

Filtering technologies’ inherent problems of over-blocking and under-blocking are raised, but not really acknowledged as a fundamental flaw. Instead, filtering is merely presented as more effective than requiring some form of age verification. But the content question seems far more important than the technical one. The decision asserts that filtering does not criminalize particular content.

Promoting filter use does not condemn as criminal any category of speech, and so the potential chilling effect is eliminated, or at least much diminished.

Moreover, it gives parents the ability to monitor and control what content their children have access to.

COPA presumes that parents lack the ability, not the will, to monitor what their children see. By enacting programs to promote use of filtering software, Congress could give parents that ability without subjecting protected speech to severe penalties.

But here we can see how the slippery slope of pornography can be used to further restrict additional categories of speech. Most filtering software blocks a lot more than pornography, and the lists of blocked web sites are secret. In reality, when you use filtering technologies you don’t know what you are blocking access to. You are, in effect, delegating the choice of what content you or your children should or should not have access to a company that will not reveal what it actually blocks.

Besides the misclassification of non-pornographic sites, filtering software generally has various categories of “sexual” content, covering sex education (teen pregnancy, abortion, condom use, and so on) as well as “bikini” sites. But it also has categories for gambling, sports, news, and a host of other content. So this quickly becomes about more than just pornography; it becomes a question of acceptable content. Which content categories will be turned on in schools and libraries: ‘Extreme’, ‘Nudity’, ‘Provocative Attire’, ‘Profanity’, ‘Sexual Materials’, ‘Tasteless/Gross’, or just ‘Pornography’?
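To make the category question concrete, here is a minimal, purely hypothetical sketch of how a category-based filter decides what to block. None of the hostnames, category names, or functions come from any real product; they are assumptions for illustration only. The point is that the administrator only toggles categories, while the actual site classifications stay opaque, which is exactly where over-blocking and under-blocking creep in.

```python
# Hypothetical sketch of a category-based content filter.
# Hostnames and categories below are invented for illustration.

# Vendor-supplied classification: the deploying school or library never
# sees this mapping, only the category toggles further down.
VENDOR_BLOCKLIST = {
    "example-adult-site.com": "Pornography",
    "example-swimwear-shop.com": "Provocative Attire",
    "example-sex-ed.org": "Sexual Materials",  # sex education swept in with porn
    "example-poker-site.com": "Gambling",
}

# The only knob the administrator gets: which categories to block.
BLOCKED_CATEGORIES = {"Pornography", "Sexual Materials", "Tasteless/Gross"}


def is_blocked(hostname: str) -> bool:
    """Return True if the vendor's (opaque) classification of this host
    falls into a category the administrator has switched on."""
    category = VENDOR_BLOCKLIST.get(hostname)
    return category in BLOCKED_CATEGORIES


if __name__ == "__main__":
    for host in ("example-adult-site.com", "example-sex-ed.org",
                 "example-swimwear-shop.com", "unlisted-site.net"):
        print(host, "->", "blocked" if is_blocked(host) else "allowed")
```

In this toy setup the sex-education site is blocked simply because the vendor filed it under “Sexual Materials”, while a brand-new, unlisted site slips through entirely; neither outcome is visible to the person who merely picked the categories.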

So has one problem just simply been replaced with another?

One comment.

  1. “So has one problem just simply been replaced with another?”

    This has driven a debate among civil-libertarians for, oh, about the last ten and a half years :-).
