The Hamilton Spectator
Monday, February 16, 1998

Software solution may not help Burlington library

by David Akin, dakin@ham.southam.ca

Like hundreds of libraries and school boards across North America, the Burlington Public Library is caught in an ugly information-access dilemma with no clear solutions.

Parents, angry that their children have been accidentally exposed to offensive, often pornographic, images on the Internet while at a public library, are demanding tighter controls on the kind of content that gets displayed on the public computer terminals that can be accessed by children.

Others worry that in taking steps to prevent children from viewing certain kinds of information, librarians and administrators will become censors and prevent the dissemination of ideas and discussions they believe are part of the core values of a democratic society.

The Burlington Public Library will examine the issue again Thursday night, although it's unlikely new proposals or actions will be approved at that time.

One of the things the board will examine, though, is whether some kind of technical solution is available that can automatically block a child's access to offensive Internet sites, however "offensive" may be defined.

Some libraries have tried special screens over their computer monitors that make it difficult for anyone but the user to see the display, but these don't restrict access to Internet sites.

The consensus in the computer science and information technology industries is that there is no silver bullet that can automatically block Internet resources containing pornography, racism, or violence.

"The filtering tools that exist today, such as NetNanny, cyberPatrol, and so on, are still very clumsy and crude", McMaster University computer science professor David Jones said in a presentation to the Burlington Public Library late last month.

Jones is a co-founder of Electronic Frontier Canada (EFC), an offshoot of the U.S.-based Electronic Frontier Foundation. Both groups work to protect freedom of expression in cyberspace.

Jones' group has created a section of its web site (www.efc.ca) that contains newspaper clippings and other information about the library situation.

A recent addition to the EFC web site is a demonstration of the fallibility of one service, CleanNet, that claims to block objectionable material. The EFC page (detour.efc.ca) defeats the service by providing a detour around the CleanNet block.


Technical solutions to the thorny issue of controlling children's Internet access usually take one of three approaches.

Blocking software uses a database of prohibited Internet resources: the Internet addresses of sites deemed objectionable by a parent or administrator. Each time a user tries to reach a particular resource on the Internet, the blocking software compares the requested address against every address in the database.

If the blocking software finds a match, it takes some kind of action, usually informing the user that access is denied.
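
In essence, the software performs a simple lookup before fetching anything. The following sketch, in Python, illustrates the idea with a hypothetical two-site blocklist; the site names and messages here are invented for illustration, not drawn from any actual product's database.

    from urllib.parse import urlparse

    # Hypothetical blocklist; a real product ships a far larger database
    # of addresses and updates it regularly.
    BLOCKED_HOSTS = {"objectionable.example.com", "blocked.example.net"}

    def is_blocked(url):
        """Return True if the requested URL's host appears in the blocklist."""
        host = urlparse(url).hostname or ""
        return host.lower() in BLOCKED_HOSTS

    def handle_request(url):
        # On a match, take some action, usually telling the user access is denied.
        if is_blocked(url):
            return "Access denied."
        return "Fetching " + url

    print(handle_request("http://objectionable.example.com/page"))  # Access denied.
    print(handle_request("http://www.efc.ca/"))                     # Fetching ...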

The problem with this approach is that new sites not yet in the database can still be accessed, and keeping the database current demands heavy maintenance. In some cases, users must pay a monthly fee for regular database updates, and even then there is no guarantee that all offensive material will be blocked.

Filtering software takes a slightly different approach.

Filtering software also has a database but, rather than Internet addresses, it contains rules or algorithms that instruct the computer to take some action when certain conditions are met. One rule, for instance, might instruct the computer to deny access to any site containing a particular word or series of words.

The problem with filters is that they are frequently unable to discern the subtleties of a particular kind of information request. For instance, one filter rule may examine all incoming text for the string of letters that forms the word "breast". Sites on breast cancer and sites containing chicken breast recipes would both be caught by this filter.
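
The over-blocking is easy to demonstrate. This Python sketch applies exactly such a rule, using a hypothetical one-word banned list rather than any vendor's actual rules, and wrongly rejects both kinds of legitimate page:

    # Hypothetical banned-word list; real filters ship far longer lists.
    BANNED_WORDS = ["breast"]

    def passes_filter(page_text):
        """Allow a page only if its text contains no banned word."""
        text = page_text.lower()
        return not any(word in text for word in BANNED_WORDS)

    # Both of these legitimate pages are wrongly blocked:
    print(passes_filter("Early detection of breast cancer saves lives."))      # False
    print(passes_filter("Grill the chicken breast for six minutes per side.")) # False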

The Burlington Public Library tried some filtering software last year, but discontinued its use because the software also filtered the library's online catalogue, in some cases blocking access to information about the library's own collection.

For filters to work effectively, an administrator must build rules from Boolean logic operators such as "and", "or", and "not". Critics of filtering software say that even the most sophisticated rules are not infallible. Adult human beings have enough trouble describing to each other what kind of broad rules should apply to children when they surf the Internet. It seems unlikely, then, that what humans cannot articulate to each other could be accurately and properly described to a machine.
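
To make this concrete, consider a rule that might read, in effect, "block pages containing 'breast' unless they also mention 'cancer' or 'recipe'". The Python sketch below expresses one hypothetical version of such a rule; note that it still misclassifies any legitimate page that happens to use neither exempting word:

    def should_block(page_text):
        """Block pages matching: "breast" and not ("cancer" or "recipe")."""
        text = page_text.lower()
        return "breast" in text and not ("cancer" in text or "recipe" in text)

    print(should_block("Breast cancer screening guidelines"))  # False: allowed
    print(should_block("Chicken breast recipe with lemon"))    # False: allowed
    print(should_block("Advice on breast-feeding a newborn"))  # True: wrongly blocked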

Monitoring software is a third approach.

Monitoring software records all the web sites, e-mail addresses, and newsgroups a particular user accesses over an Internet connection.

An administrator or parent can later review a user log to determine where that user went on the Internet and what kind of information was accessed.
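
The record-keeping involved is straightforward, as this Python sketch shows; the log format, file name, and user identifier are illustrative inventions, not any particular product's design.

    import datetime

    def log_access(logfile, user, resource):
        """Append a timestamped record of which user requested which resource."""
        stamp = datetime.datetime.now().isoformat(timespec="seconds")
        with open(logfile, "a", encoding="utf-8") as f:
            f.write(stamp + "\t" + user + "\t" + resource + "\n")

    # Every request is recorded; an administrator or parent reviews the file later.
    log_access("access.log", "terminal-3", "http://www.efc.ca/")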

If a child knows, for instance, that his or her steps on the Internet can be later retraced, the child may use the Internet more responsibly.

But some worry that such software could also be used to identify adults who are looking for information on, for instance, spousal abuse or unpopular political opinions.

"Society as a whole has had to learn to live with all sorts of deviant behaviour", says Wendy Schick, chief librarian at the Burlington Public Library. "I'm sure that, as a society, we will find ways of dealing with this as well."


Copyright © 1998 by The Hamilton Spectator. All Rights Reserved. Reprinted with permission.