Filtering in Libraries



The ACLU has released a report on Internet censorship in the state’s public libraries. The study focuses on libraries in Rhode Island, which, like libraries across the U.S., installed filtering software to comply with the Children’s Internet Protection Act (CIPA). Rather than individually purchasing and maintaining filtering software, the libraries receive filtering services as part of the Cooperating Libraries Automated Network (CLAN). CLAN uses Websense to implement filtering and, by default, filters the categories “sex,” “adult content,” and “nudity.” However, individual libraries can add further categories; some have added the “Gambling”, “Games”, “Illegal” and “Chat” categories as well. The ACLU concludes that even the minimum default level of filtering “exceeds what federal law requires”.
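To make the arrangement concrete, here is a minimal sketch of category-based filtering as described: a shared default set of blocked categories, with per-library additions layered on top. The category names come from the report, but the branch name, the lookup function, and the overall structure are illustrative assumptions, not Websense’s actual configuration or API.

```python
# Hypothetical illustration of shared-default filtering with per-library additions.
# Not Websense's real configuration or API; the lookup is a stand-in.

DEFAULT_BLOCKED = {"sex", "adult content", "nudity"}

# Categories individual libraries have chosen to block in addition to the default.
LIBRARY_ADDITIONS = {
    "example-branch": {"gambling", "games", "illegal", "chat"},  # hypothetical branch
}

def categorize(url: str) -> set[str]:
    """Stand-in for the vendor's URL-to-category lookup (normally a vendor database)."""
    return set()

def is_blocked(url: str, library: str) -> bool:
    """A URL is blocked if any of its categories appear in the library's blocklist."""
    blocked = DEFAULT_BLOCKED | LIBRARY_ADDITIONS.get(library, set())
    return bool(categorize(url) & blocked)
```

The point of the sketch is simply that the default already blocks broad content categories, and each library can only widen, never narrow, the blocklist under this arrangement.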

What I found particularly interesting was not the technical aspects of filtering (the overblocking and underblocking inherent in filtering technology) but the social aspects of its implementation: more specifically, the social controls used to deny people their constitutional right to request that content blocked beyond the scope of the law be made accessible. In short, the chilling effects.

A recent visit to the Providence Library by the author of this report raised concerns in this regard. There, a librarian responded to a deactivation request, for a blocked Google search on nudism, with questions about subject matter, judgmental comments, and ultimately a refusal to disable the filter for viewing of what she wrongly characterized as “pornography.”

One of the key points the ACLU makes is that “the U.S. Supreme Court declared use of blocking software to be constitutional, but only on the condition that it be deactivated for any lawful adult user who asks.” Despite the law, the social enforcement of overblocking creates a situation in which people will not ask for blocking to be removed, and even if they do, the request can still be refused.
