Most people online cherish and support the freedom to control their information environment, to evaluate sources of information, to block spam and obnoxious intrusions. Reader moderation (and even higher-order filtering systems) represent the first meaningful efforts to control the epidemic hostility, spamming and chaos that overwhelm public spaces online. This self-policing in media is a radical and powerful idea -- but it isn't that simple. It also permits people to eliminate opposing points of view, promoting a new kind of fragmentation.
Those who moderate comments on Slashdot, Kuro5hin and other community-based weblogs may downgrade content they don't find worthwhile in a genuine effort to express their thoughts as readers and participants -- a freedom no newspaper reader or television viewer has. One person's new freedom is another's censorship, though. Congress has required, for instance, that schools and libraries that want to take advantage of lucrative E-Rate funding for their networking projects employ content-filtering software. The basic mechanism is the same (content is chosen before it reaches the viewer), but the motivations are very different. As various methods and reasons for content filtering spread, they bring with them some dark clouds.
In republic.com, University of Chicago law professor Cass Sunstein argues that through its filtering and moderating systems, the Internet may be Balkanizing speech and thought, and thus weakening democracy, by eliminating the public spaces that traditionally offered common ground. Sunstein asserts that the age of mass media is ending, that radically de-centralized and intensely individualistic forms of information are not only emerging but becoming dominant. But he believes that certain elements remain essential for a well-functioning system of free expression, and that filtering and moderation software may endanger them.
People living in democracies, Sunstein maintains, should be exposed to ideas they might not have chosen themselves. Unplanned, spontaneous, unanticipated encounters are central to democratic life, though they "often involve topics and points of view that people have not sought out and perhaps find quite irritating." They are important, nonetheless, he says, partly because they protect against fragmentation and extremism, a predictable outcome when like-minded people communicate only with one another.
Sunstein also cites the impact of collaborative filtering programs like those used by Amazon and other sites, which collect information on past use and preferences and allow people to pre-select from a menu of subjects and books they are likely to enjoy or agree with. Clearly this is a customer service, but it's also a way of filtering out ideas and subjects people don't want to hear. Browsers in a store are nearly guaranteed to come across unanticipated or new ideas; the users of collaborative filtering systems will see far fewer.
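The logic Sunstein objects to is simple enough to sketch. A minimal, hypothetical version of collaborative filtering (the data and function names here are illustrative, not any real site's system) recommends only what like-minded users have already chosen:

```python
# Minimal sketch of collaborative filtering: recommend items that
# users with overlapping histories also chose. Everything here is
# a hypothetical illustration, not Amazon's actual algorithm.
from collections import Counter

def recommend(user, histories, top_n=3):
    """Rank unseen items by how often they appear in the histories
    of users who share at least one item with `user`."""
    seen = histories[user]
    scores = Counter()
    for other, items in histories.items():
        if other == user or not (seen & items):
            continue  # only count like-minded users
        for item in items - seen:
            scores[item] += 1
    return [item for item, _ in scores.most_common(top_n)]

histories = {
    "alice": {"book_a", "book_b"},
    "bob":   {"book_a", "book_c"},
    "carol": {"book_b", "book_c", "book_d"},
}
print(recommend("alice", histories))  # ['book_c', 'book_d']
```

Note what never appears in the output: anything read only by people who share nothing with "alice". That structural blind spot, not any single recommendation, is Sunstein's worry.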
Sunstein believes that citizens should have a range of common experiences. Without them, any heterogeneous society will have a much tougher time addressing social problems. People may even find it hard to understand one another. "Common experiences, emphatically including the common experiences made possible by the media, provide a form of social glue," he notes.
Sunstein's imagined -- but very plausible -- world of innumerable, diverse editions of the "Daily Me" is the furthest thing from a utopian dream; it will, he claims, create serious problems. Sunstein offers several possibilities for reform. He suggests "must-carry" rules in the form of links imposed on the most popular websites, designed to produce exposure to substantive questions. He also advocates "must-carry" rules, again in the form of links, for the most highly partisan websites, designed to ensure that viewers learn about opposing views.
These interventions into Net content are provocative, but a bit of a shocker coming from a constitutional scholar. Should sites really be forced by law to carry viewpoints that are abhorrent to them, to mimic the press's deadly habit of balancing every single point of view with an opposite one, creating eternal arguments and stalemates that turn civic discussions into WWF matches? In a democratic culture, isn't polarization as much a choice as consensus?
Such requirements, he argues, aren't rooted in nostalgia or reactionary love for the past. Nor is Sunstein taking a position for or against technology or its value. He wrote the book, he says, in an effort to explain what makes freedom of expression successful -- a question little considered online, or even in the United States Congress, which routinely enacts censorious, anti-democratic laws in the name of patriotism and morality.
But hardly anyone in high-tech, contemporary America engages in face-to-face, participatory democracy in their town parks and streets. If they do this anywhere, in 2001, they do it online. The Net is the new public space; does that mean it needs those same constitutional protections, and are Netizens obliged to keep at least some of this space open and unfiltered?
Sunstein doesn't fall into the obvious trap of romanticizing the era, blessedly over, when three TV networks controlled much of the news and offered Americans bland, incomplete mirrors of the world. But he has a point when he says that for all their flaws, TV broadcasts had vast audiences and had the quality of a genuinely common experience. One of the central accomplishments of the American Revolution was the crafting of a political process that peacefully absorbed different points of view. It has worked astonishingly well, longer than almost any previous democratic political system.
In the last 30 years, though, the networks have lost about a third of their audience, or 39 million viewers. The most highly rated show on any current network has fewer viewers than the fifteenth highest-rated show of the 1970s. Sunstein doesn't suggest that all our new choices -- the Net, Web, cable -- are bad. "My only claim is that a common set of frameworks and experiences is valuable for a heterogeneous society, and that a system with limitless options, making for diverse choices, will compromise some important social values. ... if we believe that a set of common experiences promotes active citizenship and mutual self-understanding, we will be concerned by any developments that greatly reduce those experiences."
People who care about the Internet ought to be concerned. The tech nation may be a collection of brilliant, creative, outspoken people, but it practically defines political disconnection. The legislative system that nominally represents Net users passes laws, from the Communications Decency Act to the Digital Millennium Copyright Act to the Children's Internet Protection Act, that directly impinge upon our freedom of expression. But there is little organized response, or even much awareness.
The truth is that people who increasingly turn to filtering programs (including ready-made portal sites) become accustomed to censoring ideas they think they may not like. But they can't ever really be sure, since they have no idea what they're not seeing, or how the person or ideas they are blocking might have evolved.
Just ask Jeffrey Pollock. When Pollock ran for Congress last year, he posted campaign information and position papers on a campaign website. Among other things, he declared his support for federally mandated use of Net filtering programs to block porn in schools and public libraries. He was amazed to learn that his own site was blocked by CyberPatrol.
If there is a flaw in Sunstein's arguments, it is that the information winnowing he decries has become more and more necessary due to the sheer volume of data beamed at individual users. In a sense, the moderation advocates are correct when they say they are preserving people's freedom to think and make information choices.
The volume of hostility and junk communications coming off the Net and Web is now staggering, itself a threat to a democratic culture. Moderating systems can also identify leaders and spokesmen, and make it easier to find intelligent or responsive comments. They take some power away from the hostile and disruptive. And they have quickly become valued communication tools: "I personally love the moderation and meta moderation system," e-mailed one advocate of this site's tiered approach to moderation, "self-policing while at the same time adding a degree of competition and ego-feeding."
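The tiered approach that advocate describes can be sketched in a few lines. This is a hypothetical simplification of threshold-based moderation in the style of Slashdot's system, not its actual implementation; the score range and function names are assumptions:

```python
# Minimal sketch of threshold-based comment moderation: each moderation
# vote nudges a comment's score within a fixed range, and readers choose
# the minimum score they are willing to see. Hypothetical illustration,
# not Slashdot's real code or score rules.
def moderate(score, adjustment, lo=-1, hi=5):
    """Apply one moderation vote, clamped to the system's score range."""
    return max(lo, min(hi, score + adjustment))

def visible(comments, threshold):
    """Return only comments at or above the reader's chosen threshold."""
    return [text for text, score in comments if score >= threshold]

comments = [("insightful reply", 4), ("flamebait", -1), ("ordinary post", 1)]
print(visible(comments, threshold=1))  # ['insightful reply', 'ordinary post']
```

The same two functions embody both sides of the argument: they strip power from the hostile and disruptive, and they guarantee that a reader browsing at a high threshold never sees what the moderators scored down.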
Sunstein offers no meaningful solutions for dealing with flamers, or professional lobbyists who flood people with spam.
His argument also seems to presuppose that common spaces won't evolve on the Net without help. But just why not? Wouldn't a democratic model hold that eventually, when enough people want such a space, they will create and participate in it? And if they don't want such a space, isn't that also their choice?
Perhaps these spaces won't be like the old TV networks, but they could conceivably be big and open enough to host the civic functions that streets and parks used to serve. After all, television networks themselves act as a giant filter, as does much of Big Media. They picked a handful of stories -- fires, celebrity gossip -- and presented them as a picture of the world. They were inadequate and incomplete, and people abandoned them in droves the first chance they got.
"If an individual freely chooses to join a service that moderates or filters some source of information according to criteria that are fully disclosed to the joining individual, even if those criteria are the 'whim of the moderator,' then the viewer has expressed his inalienable right to listen only to what he wants," writes Shawn McMahon (himself a moderator) in an e-mail to me. "Nothing," McMahon adds, "could be more democratic."
He has a strong point. Don't people have the right to choose the information they want?
But that doesn't make Sunstein's questions any less valid, or his book less significant and compelling.
Look for another viewpoint on this book in an upcoming reader-submitted book review.