Is Filtering the Right Solution for Child Protection on the Internet in Public Libraries?
By Sylvestre Ngoma
M.S. in Internet Strategy Management
The advent of the Internet and its vast array of resources in public libraries in the United States has brought about changes in the way these public institutions serve their communities. As a vast network offering varied and diversified materials, the Internet exposes public libraries to the unexpected and enormous challenge of accommodating this tool as an integral part of their resources without altering their traditional missions or their patrons' rights.
Not surprisingly, in a country where children's physical and psychological well-being is fully guaranteed by a myriad of child protection laws, the hottest question in these public places is how to shield children from inappropriate materials on the Internet. Indeed, children's safety online has become a national debate, preoccupying decision-makers, educators, librarians, parents, civil liberties associations, software companies, and politicians. The debate centers on whether to filter children's access to the Internet or to leave it unrestricted.
Some libraries have embraced filtering policies, seeking to regulate Internet content so as to shield children from patently offensive and inappropriate materials. Others have complied with the recommendation of the American Library Association, which advises against filtering mechanisms.
In either case, there have been lawsuits: some against public libraries that restrict children's access to the Internet for blocking out constitutionally-protected speech and others against public libraries that provide unfiltered access for not protecting children from inappropriate materials. What is the solution?
This paper suggests a different approach to handling these very important issues. It seems to me that to understand this approach, some basic questions need to be addressed, such as:
Is the view of the online world as perilous, overpopulated by child molesters and predators, or flooded with horrifying material accurate? Are American children insufficiently prepared to navigate the dangers of the Internet safely in public libraries? Are they more vulnerable than children in the rest of the world? What solution would take into account everybody's concerns and interests without compromising First Amendment guarantees of free speech for children? What is the role of public libraries in their communities, and how does that role shape their use of the Internet?
Once these are understood, perhaps a new way of looking at the issue could be considered.
2.1. Public Libraries As Diversity Centers Par Excellence
The central goal of a public library is to provide an opportunity for all, including minorities, to inform and educate themselves. Public libraries are, by definition, places of inclusion, not exclusion; places in which different kinds of materials and resources are stored for public use.
Just like the Internet that the United States Supreme Court found to "constitute a vast platform from which to address a world-wide audience of millions of readers, viewers, researchers, and buyers" (1) and, thereby, should merit full constitutional protection, public libraries should be "open to all segments of the American population on an equitable basis and should provide resources for the interest, information, and enlightenment of all people of the community that they serve."(2) They should serve as public forums par excellence and should, therefore, cooperate with all persons or groups who need their services while resisting abridgement of free expression and access to ideas.
Most importantly for our topic, public libraries' charge to children should be to stimulate children's interest in and appreciation for learning and reading. The introduction of the Internet in public libraries is consistent with this central goal. Libraries should enable all individuals to receive speech from the entire world and speak to the entire world, especially since public libraries are the only places where some people can do this. Public libraries should, indeed, be centers of culture and literacy.
Interestingly, the Internet is really becoming an essential tool in many public libraries. According to the American Library Association, of nearly 9,000 public libraries in America, 60.4 percent offer Internet access to the public, up from 27.8 percent in 1996. (3)
It is important to look at some of the provisions of the Library Bill of Rights and their interpretations, since this is the document that guides librarians in their policy-making. Article III states that "libraries should challenge censorship in the fulfillment of their responsibility to provide information and enlightenment." It follows that by allowing the use of what some have referred to as "censorware", that is, blocking filters that censor free speech on the Internet, libraries violate this article. Blocking filters exist to block out information and ideas deemed inappropriate, much of which is not only protected by the First Amendment but can also be enlightening and useful to children in many ways.
The ACLU calls this filtering procedure in public libraries "censorship in a box"(4). Another article worth pointing out here is Article V, which states that a person's right to use a library "should not be denied or abridged because of origin, age, background, or views". "The right to use a library includes free access to, and unrestricted use of, all the services, materials, and facilities the library has to offer. Every restriction on access to, and use of, library resources, based solely on the chronological age, educational level, or legal emancipation of users violates this article."(5)
As pointed out by the National Telecommunications and Information Administration of the U.S. Department of Commerce in 1995, "public libraries can play a vital role in assuring that the advanced information services are universally available to all segments of the American population on an equitable basis"(6). Its vision is perhaps what libraries should strive for: "just as libraries traditionally made available the marvels and imagination of the human mind to all, libraries of the future are planning to allow everyone to participate in the electronic renaissance".
As the ACLU puts it, "without free and unfettered access to the Internet, this exciting new medium could become, for many Americans, little more than a souped-up, G-rated television network".
3. Overview of Internet Filters in Public Libraries
3.1. What Internet Filters do
Before embarking on the task of discussing filtering policies in public libraries and spelling out the important First Amendment and legal issues involved, it would be beneficial to summarize what blocking software products do, what their limitations are, and what kind of speech they block. Access control software products, commonly known as cybersitters, cybercops, blocking filters, or simply filters, provide a mechanism to block users from accessing sites considered inappropriate.
These filters are used in different ways: "blocking every terminal in the library (and requiring adults to identify themselves to bypass the filter); restricting all children unless they have parental permission; not restricting children unless their parents have objections on file; offering restricted terminals for families who wish to use one, with no library enforcement of any kind. Other options include: no filters, with librarians monitoring Internet use; no filters, but children cannot use the Internet without parental permission; and a 'tap-on-the-shoulder' policy."(7) Another option is to use software such as Triple Exposure, which keeps track of where its users (in this case children) have been on the Internet. This is thought to be another alternative to Internet filtering.
Filters operate by restricting access to Internet content based on an internal database of the product; on a database maintained externally to the product; on the source of the information; or on ratings assigned to sites by a third party. Their two basic functions, then, are rating and filtering. As for rating, value judgments are used to categorize web sites based on their content. These ratings can use simple allowed/disallowed distinctions, as found in programs like Cybersitter or NetNanny, or they can have many values, as seen in rating systems based on the Platform for Internet Content Selection (PICS).
As for filtering, with each request for information, the filtering software examines the resource that the user has requested. If the request is on the "not allowed" list, or if it does not carry the proper PICS rating, the filtering software tells the user that access has been denied, and the browser does not display the contents of the web site. The first content filters were stand-alone systems consisting of mechanisms for determining which sites should be blocked, along with the software to do the filtering, all provided by a single vendor. The other type of content filter is protocol-based. These systems consist of software that uses established standards for communicating rating information across the Internet. Unlike stand-alone systems, protocol-based systems simply know how to find this information on the Internet and how to interpret it.
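The decision flow described above can be sketched in a few lines of code. This is a minimal illustrative sketch only, not the implementation of any real product: the host names, the rating scale, and the threshold are all hypothetical, standing in for a vendor-maintained blocklist and PICS-style labels.

```python
# Toy model of a content filter's access decision. All names and values
# below are hypothetical examples, not taken from any actual filter.

NOT_ALLOWED = {"example-blocked-site.com"}    # vendor-style "not allowed" list
RATINGS = {"example-rated-site.com": 3}       # PICS-style labels (0 = mildest)
MAX_ALLOWED_RATING = 1                        # threshold set by library policy

def access_decision(host: str) -> str:
    """Return 'denied' or 'allowed' for a requested host."""
    if host in NOT_ALLOWED:
        return "denied"                       # match against the internal list
    rating = RATINGS.get(host)
    if rating is not None and rating > MAX_ALLOWED_RATING:
        return "denied"                       # third-party rating exceeds policy
    return "allowed"                          # unknown or acceptable host

print(access_decision("example-blocked-site.com"))  # denied
print(access_decision("example-rated-site.com"))    # denied
print(access_decision("example-open-site.org"))     # allowed
```

Note that a host absent from both the list and the ratings database is simply allowed, which is one reason such systems can never be exhaustive.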
While Internet filters block a great deal of material deemed "inappropriate" and "harmful" to children, they are imperfect at best. It is, in fact, well documented that filtering software is over-inclusive. Even the most sophisticated filters cannot make the kind of complex legal judgments that only humans can, for they lack the legal and moral capacities needed for that purpose. Their technical configuration cannot help but produce results unexpected by their programmers, owing to the multidimensional semantic and morphophonemic composition of the languages involved. As a result, most filtering software products, such as X-Stop, CyberCop, CyberSurf, CyberWatch, Cyberpatrol, Cybersitter 97, Surfcontrol, and NetNanny, have been reported to block sites on breast cancer, AIDS, women's rights, animal rights, and gay and lesbian issues, along with many other useful sites, while some graphic sexual images get through the blocks.
Writing about filtering software, Robert Gelman (1998: 32) supports this point: "this is an imperfect solution, however, because none of these programs can promise a hundred percent accuracy in its filtering criteria. This lack of perfection has caused some to call these programs 'censorware'."
The ACLU calls the filtering policy in libraries "censorship in a box". Further support comes from a study by IPS, which tested filter programs and concluded that they are lacking in three important areas:
- They fail to filter all pornography and other unacceptable material.
- They block access to useful, non-pornographic resources.
- They are easily disabled, allowing a user to bypass them for unfiltered Internet use.(8)
The key point is that blocking software cannot discriminate between obscene materials and information about sexual topics. That is, it blocks access to a great deal of informative and useful material that it was never designed to block.
Most blocking software prevents access to sites based on criteria provided by the vendor. To conduct site-based blocking, a vendor establishes criteria to identify specified categories of speech. Blocked categories may include "hate speech", "criminal activity", "sexually explicit speech", "religious speech", "adult" speech, violent speech, and even sports and entertainment. Using such a list of criteria, software vendors compile and maintain lists of "unacceptable" sites. Some vendors employ individuals who browse the Internet for sites to block; others use automated searching tools to identify which sites to block. These methods may be used in combination.
Typical examples of blocked words and letters include "XXX" sites; "breast", which blocks sites and discussion groups about breast cancer; and the consecutive letters "S", "E", "X", which block sites containing the words "sexton", "Essex", and "Mars exploration", among many others. Some software blocks categories of expression along blatantly ideological lines, such as information about feminism or gay and lesbian issues.
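The over-blocking just described falls straight out of naive substring matching. The following sketch (illustrative only; the banned-word list is a hypothetical example, not drawn from any actual product) shows how matching "sex" or "breast" against text with spacing and punctuation stripped, as happens with URLs and domain names, catches entirely innocuous pages:

```python
# Naive keyword filter of the kind described above. The banned list is a
# hypothetical illustration, not any vendor's real criteria.

BANNED_SUBSTRINGS = ["xxx", "sex", "breast"]

def is_blocked(text: str) -> bool:
    """Block if any banned substring appears in the letters-only text."""
    # Collapse to lowercase letters, as a filter matching URLs would see them.
    collapsed = "".join(ch for ch in text.lower() if ch.isalpha())
    return any(word in collapsed for word in BANNED_SUBSTRINGS)

# Every one of these innocuous titles is blocked by the substring match:
for title in ["Breast cancer support group",
              "Visiting Essex, England",
              "Mars exploration updates",
              "Sermons by the parish sexton"]:
    print(title, "->", "BLOCKED" if is_blocked(title) else "allowed")
```

"Mars exploration" is caught because the collapsed form "marsexploration" contains the letters s-e-x, exactly the failure mode reported of real filters.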
Yet most web sites offering opposing views on these issues are not blocked. Ken Soohoo (9), founder and chief technical officer of PlanetWeb Inc., said of the company's filtering package PlanetView: "we have a gay filter with three levels of intensity. We also have filters for politics, religion, sex, hate speech and advertising". Most of these categories are protected by the First Amendment.
Over-inclusive blocking violates the First Amendment rights of youth and children to access constitutionally protected materials. It is well documented that filtering software is over-inclusive, blocking not only sites that may have sexual content, strong language, or unconventional ideas considered harmful or offensive, but also sites having no controversial content whatsoever.
Filters are known to have blocked web pages of the Religious Society of Friends (Quakers), the American Association of University Women, the banned books page at Carnegie Mellon University, the AIDS Quilt site, the fileroom Project Censorship database, even the conservative Heritage Foundation.
Several legal issues come into play when public libraries consider using blocking software to protect children from materials labeled "inappropriate" or "harmful". What is the legal definition of "inappropriate"? Are all "inappropriate" materials constitutionally unprotected by the First Amendment? How legal is it to block materials that are constitutionally protected by the First Amendment? How do the First Amendment rights of children mesh with public libraries' Internet filtering policies? And what is the constitutionality of a library use policy that regulates Internet content, thus blocking speech entitled to absolute constitutional protection under all circumstances?
According to the Supreme Court (10), materials "harmful to minors" include "descriptions or representations of nudity, sexual conduct, or sexual excitement that appeal to the prurient, shameful, sexual, or morbid interest of minors, are patently offensive to prevailing standards in the adult community as a whole with respect to what is suitable material for minors, and lack serious literary, political, artistic, or scientific value for minors."
However, what blocking software vendors term inappropriate goes further than the constitutional definition to include child pornography, "adult" speech, obscenity, unpopular speech, sexual acts, satanic cults, drugs and drug culture, sexually explicit speech, criminal activity, violent speech, some types of religious speech, and even sports and entertainment.
Aside from child pornography, which has been deemed outside First Amendment protection and which is prohibited by federal law, most of these "inappropriate" materials are protected by the First Amendment.
The reason why the Communications Decency Act of 1996, an attempt to outlaw "indecent" communications online, was struck down by the U.S. Supreme Court was that the so-called "indecent" speech was protected by the First Amendment which states that, "Congress shall make no law…abridging the freedom of speech or of the press."
The Internet School Filtering Act came under wide criticism for the same reason: the term "inappropriate" is no different from "indecent". This is why many associations, including the ACLU, view the ISFA and COPA as CDA II.
The Child Online Protection Act (H.R. 3783), which outlaws material deemed "harmful to minors", is no different. Although it is narrower than its predecessor, the Communications Decency Act (CDA) of 1996, it still raises major constitutional problems. According to CDT (11), there are three major problems:
- It imposes serious burdens on constitutionally protected speech, including materials such as the recently released Starr report, movies, and television programs, when disseminated through popular web sites such as CNN, Yahoo, or MS-NBC.
- It fails to effectively serve the government's interest in protecting children, since it will not effectively prevent children from seeing inappropriate material originating outside the U.S., nor will it cover material available through Internet resources other than the world wide web, such as chat rooms or email.
- It does not represent the least restrictive means of regulating speech, given the Supreme Court's finding that blocking and filtering software gives parents the ability to screen out undesirable content without burdening them.
In a letter to the Newburgh Board of Education, the ACLU wrote: "the constitution protects dirty words, racial epithets and sexually explicit speech, even though that speech may be offensive to some"(12). "The First Amendment prohibits librarians from directly censoring protected speech in the library, just as it prevents indirect censorship through blocking software" (ACLU).
The category "indecent" is too broad, too burdensome, and covers too much speech. Although Congress has tried to narrow the class of speech it regulates (to cover only speech "harmful to minors") and has narrowed the range of speakers covered (only commercial providers on the web), it has kept the same mechanisms for verifying that an adult is an adult. How do you determine who is an adult and who is not? The core of the Child Online Protection Act is a national prohibition on information considered "harmful to minors."
The definition of "harmful to minors" refers to "community standards." In the context of the global Internet, the community-standards definition is impermissibly vague. The U.S. Supreme Court has never approved a single, national obscenity standard, nor has it approved a "harmful to minors" statute based on a national, as opposed to local, standard. As the Court stated in the landmark obscenity decision Miller v. California, there cannot be "fixed, uniform national standards of precisely what appeals to the 'prurient interest' or is 'patently offensive.' These are essentially questions of fact, and our Nation is simply too big and too diverse for this Court to reasonably expect that such standards could be articulated for all 50 states." "Community standards" are indeed difficult to nail down.
Besides, knowing which materials are actually "inappropriate" or "harmful" to minors, when those terms are defined so broadly, can be very hard. Blocking software vendors, as well as library authorities, thus expose themselves to liability by allowing filtering policies in public libraries.
On what constitutional ground can public libraries or the U.S. Congress justify the use of blocking software to restrict access to these materials? As can be seen, all these attempts by Congress to protect children online suffer from constitutional infirmities: should they apply only to communications on the web? Should only "commercial" entities be concerned?
Even more importantly, they violate the First Amendment to the U.S. Constitution, "the single most important source of rights for electronic communications in the United States", as Lance (1995: 1) puts it. There does not seem to be an objective set of standards for judging what is inappropriate or harmful and what is not.
Many public libraries that offer Internet access have adopted filtering policies, restricting children solely to computers with filtering software. The argument heard thus far is that children, whose minds and values are still developing, need to be protected. Unlike adults, who are deemed to have acquired the maturity needed to participate fully in a democratic society and whose rights to speak and receive speech are entitled to full First Amendment protection, children have traditionally been afforded less First Amendment protection, particularly within the context of public high schools.
But if it is agreed that all First Amendment rights are "indivisible", then why should there be a borderline between children's rights and adults' rights in cyberspace? Furthermore, the Library Bill of Rights grants equal status to all patrons regardless of age. As a professional code, it expresses the role of the public library in society as one of open access for all patrons of all ages. How, then, do we justify denying children the First Amendment rights they deserve? It seems to me that filtering policies in public libraries not only alter the fundamental function of these institutions but also work against children's intellectual freedom rights.
My sense is children should not surrender their First Amendment rights online. There is a big constitutional conflict between children's rights and filtering policies.
As noted above, everything done thus far to protect children online has met with technical problems and/or legal and First Amendment issues. The literature reveals that the blocking filters now in use never closely match unprotected speech, which means that filtering policies, whether minimal or optimal, are inconsistent with children's First Amendment rights and with free speech.
Nor are blocking filters as effective as they are expected to be. Bernard Robin, Elissa Keeler and Robert Miller, in their Educator's Guide To The Web, hold that "We can't stress enough that access control software isn't a perfect fence, and some students will be resourceful enough to find ways to get around it."
One justification for filters, according to them (1997: 161), is that they are meant to serve as a "protective fence, much like the fence around the playground or schoolyard, defining the area kids can freely explore". My view, however, is that children should be pushed towards more autonomy, more responsibility, and more self-control. Children should be given a real and legitimate sense of power over how they can be safe online. They need to be armed with the capacity to make critical judgments. None of this happens through the blocking software products currently in use. In other words, give children the opportunity to be their own filters.
An important part of giving children the power to make their own decisions would be to invite their input into these policies or into the software's design. Not to do so revives the once-widespread ideas that (a) "children are to be seen and not heard" and (b) "children are empty vessels which need to be filled." Yet we often fall back on these traditional, authoritarian values because, in this case, we perhaps assume that all children are inclined, by nature, to explore inappropriate materials.
If parents, librarians, decision-makers, and others are too controlling (which is what is happening now), children will keep missing out on some basic human values they need for their growth. If filters keep making choices for children, choices which can sometimes be very misleading, we as a society will contribute to slowing down children's growth and personal development. On the other hand, if children are empowered by their educators, their parents, and the community, and their attention is directed towards what is useful for them and how they can use public resources to develop themselves, we will have fewer worries and concerns about inappropriate materials.
No police force, no filter, no parent, and no librarian is more powerful than a child's own self-control, autonomy, and responsibility. To state it even more strongly: anything less than children's empowerment is ultimately a disservice to our children. Robert Frost once wrote, in "Mending Wall": "I'd ask to know what I was walling in or walling out". My sense is that we are busy trying to "wall out", so far inappropriately.
My question is: what are we "walling in" in our children? Are we helping them build internal "walls" so that they can take care of themselves, or are we saying that there is nothing else we can do? If we "wall in" successfully, the "walling out" will ultimately take care of itself.
The Internet is too vast, and no blocking software can ever protect children without unintended damage elsewhere. The key question should not be "how can we protect children online?" It should rather be "how can we empower our children so they can be wise consumers in cyberspace and thereby able to protect themselves?"
A thirteen-year-old girl at Brattleboro Union High School wrote a paper on censorship a few days ago. Some of her thoughts are very striking: "a child who knows or has heard about the beauty of freedom will become upset at her/his parents and become rebellious and begin to lie to them if he/she experiences censorship"; "I bet if you are a parent who censors your kids you think you are doing the best thing for them, you are bringing them up right; you are not"; "I think censorship is horrible, give your kid some freedom of choice, speech and expression, everything our country stands for. After all, we will be running it (this country) someday; wouldn't you want us to know how freedom works?"
That says it all. Classrooms and homes should not be moved into public libraries, nor should public libraries be substitutes for those institutions. That the government conditions libraries' subsidies on the endorsement of filtering systems is a seriously erroneous move that does nothing to assist the process of children's empowerment. Children's empowerment is the pivotal tool for children's protection online. Filters only give a false sense of security, and mandatory filtering in public libraries is tantamount to an obstruction of public service.
There are several options available for public libraries now, none of which seems ideal; each brings enormous challenges. Many civil liberties associations, including the ALA and the ACLU, have adopted the "open access without filters" option because they believe that the use of filtering software by libraries to block access to constitutionally protected speech violates the Library Bill of Rights, and they encourage people to educate themselves about how to best use the Internet.
This option may, according to some, run the risk of inadvertently exposing children to obscene, harmful, indecent, or offensive material. Another option is to place filters on one or more terminals in the children's room only, with no filtering software in the adults' room. Yet another is to place filters on all terminals used by children under 18, which again raises the question of First Amendment violation.
I would recommend that schools, libraries, governing bodies, and parents teach children how to best use the Internet. This can be done through teach-in sessions, workshops, lectures, demonstrations, good examples of Internet use, and other kinds of formal or informal training, heightening their awareness of potential dangers online (dehumanization, physical molestation, harassment, exposure to inappropriate material, and so forth) and sharpening their critical thinking.
Children need to be armed with the knowledge and skills to make informed choices rather than surrendering their destiny to some software or software vendor. Access restriction is not effective. The Internet is a new tool in public libraries as well as in schools, and it makes sense for the community to take responsibility for educating children about how to use it usefully and safely. Its management by public libraries, which were presumably unprepared to incorporate it into their resources, calls for an educational approach rather than a policing one.
© Congo Vision