L. Jean Camp Donna M. Riley
Carnegie Mellon University
Department of Engineering & Public Policy
Pittsburgh, PA 15213
Much of the debate about the Internet has focused on the classification of various services into traditional media types, with particular focus on Usenet, bulletin boards, and the World Wide Web. We argue that there is no consensus on the appropriate media classification because no one media type accurately characterizes all aspects of Internet communications. The technology of the Internet has confused legal theorists into attempting to apply rules set for machine-assisted communication even though the machine no longer defines the characteristics of the communication. We will illustrate this first by showing how Web and Usenet services fit equally well, and equally badly, into each media type.
We argue that the dominance of the media types framework in the debate over electronic forums and services has in some cases created threats both to speech and to minimal responsible controls. A more appropriate analog than different media is different spaces. Using events at Carnegie Mellon University, Santa Rosa Junior College, and the University of Michigan, we will show in more detail how the attempts to fit media types to Internet services have created incentives which neither create protected spaces nor encourage open debate. We then use these cases to show how the proposal to treat virtual spaces as their physical counterparts would better serve both the institutions and the individuals.
The opinions presented in this paper are the opinions of the authors. This work does not represent the opinions of the Department of Engineering and Public Policy, Carnegie Mellon University, the Department of Energy, the United States Government, Sandia National Laboratories, its parent corporation or its sister laboratories.
Determining First Amendment rights, institutional responsibility, and institutional perspective using the model of four media is not feasible in the new electronic media. Media classifications are already failing for many open networks: cellular technology, for example, is undeniably a broadcast technology, yet it fits (appropriately) under the regulatory rubric of wired telephony. Defining First Amendment rights in electronic forums is critical to preserving them (Pool 1983).
There has been debate over the question of whether network services fall under the rubric of broadcast, publisher, distributor, common carrier, or some yet undetermined media type (Beall 1987, Becer 1989, Di Lello 1992, Sassan 1992, Berman 1995, Krattenmaker and Powe 1995). Every media classification has arguments for and against it. The efforts to find a single media classification, or to invent one, are doomed to failure because the provision of a bboard can be done as a publisher, a distributor, a broadcaster, or a common carrier. The differences between bboards or newsgroups are much too subtle to be contained in a four-way or even five-way media classification. The differences are sufficiently subtle that rather than a single media type or single space (Naughton 1992), the only appropriate model for cyberspace is that of multiple varied physical spaces. There is no such thing as cyberspace, but there are many cyberspaces.
Electronic spaces are as varied as physical spaces. Previously the responsibilities of the producers and distributors of speech were determined by the medium of transmission: printed paper, over the air broadcast, or closed private circuit (as in telephony). However, the classification of speech by media type is inadequate on the Internet. Instead, system configuration, tradition, applications in use, and purpose determine the types of spaces that are created on a common medium.
We will show how the media model is ill-suited to the Internet by illustrating how different actions on Usenet news and the Web constitute broadcasting, distribution, publishing, and common carriage. We will further describe how the bboards at Carnegie Mellon University all serve different purposes which are analogous to different spaces: bedrooms, classrooms, club meetings, departmental conference rooms, and public spaces. Finally we will consider three cases where the space analogy may have worked, but the media analogy did not.
The differences in traditional media types are based on the inherent technological capabilities to control content and distribution. With Internet services, the classification depends on the configuration and use of the service, not on inherent technological capabilities. In this section we will show how Usenet news and Web services do and do not fit into traditional media types.
Broadcasters are under particularly stringent regulation for two reasons: the natural scarcity of bandwidth and the lack of filtering mechanisms for viewers.
Broadcasters have an inherent capability to control what they broadcast. The facilities for distributing broadcast information are concentrated, and the person or institution that controls those facilities can exercise absolute control over the distribution. Because of the nature of broadcasting, the decision to broadcast one stream of information prohibits anyone else in the service area from using the same spectrum to broadcast any information. Thus broadcasters both select what they will broadcast and prevent others from broadcasting different information.
With broadcast transmissions, there is no way to prevent someone in the broadcast area from receiving the information. Thus broadcasters have no control over distribution.
Usenet news posts and html documents are like broadcast in that they enable the transmission of information to large numbers of people. Once on Usenet or on the Web, the information may be downloaded by any of millions of readers, placing it beyond the reach even of cancelbots. Because of the ease of secondary distribution of electronic information, once information has been sent over the network or placed in a publicly accessible forum it is nearly impossible to prevent its widespread distribution.
Usenet news posts and html documents are like broadcast because the information comes directly into the home, and it may be impossible to retrieve or destroy the information after it has been transmitted. According to one advocate of the Communications Decency Act, it was exactly these broadcast characteristics that enabled the regulation to pass. Dee Jepsen of Enough is Enough described the need for the Communications Decency Act as follows: "hard-core, child pornography and 'indecent' material, which is harmful to minors, are being transmitted over the Internet directly into our homes" (Jepsen 1995).
Like broadcast, accessing much of the information in Usenet groups and most information on the Web requires little technical expertise. On the Web one need only know how to use a mouse to find questionable material.
Providers of Web and Usenet services are unlike broadcast systems in that they are not constrained by a natural scarcity of bandwidth. Maintaining or providing access to an information service does not prevent others from doing so. Also unlike broadcasters, there is a level of technical sophistication required to obtain some types of information (such as downloading images) available on Usenet.
Providers of Usenet services are unlike broadcasters in that the information is organized and can be searched at a high level. Newsgroups are in named hierarchies, which users can choose to explore or not; the sudden appearance of explicit images is thus extremely unlikely. The Usenet hierarchy is completely unenforceable, but is self-sustaining through the practices of publicly ignoring and privately flaming inappropriate posters, as well as the judicious use of cancelbots.
With the Web, badly named links and misdirection can confront a user with unwanted information, or allow a user to stumble unintentionally onto offensive imagery.
Usenet service providers may arguably provide one broadcasting capability: the creation of groups. There is some information seen by every user of Usenet: the names of the groups. The creation of Usenet discussion groups is initiated by a create group command. Organizations where users can issue create group commands at will serve as broadcasters because the information goes into every feed, and the user need not actively choose to investigate to encounter the name. However, it is notable that the names of Usenet groups are necessarily text, which makes the use of group creation abilities to broadcast images arguably impossible. Thus the sudden appearance of explicit images is not enabled by the creation of groups.
One example of a group name is "alt.Barney.die.die.die." Certainly there is nothing wrong with poking fun at a children's show. However, the simple fact that knowledge of this group was extremely common, although few would search for "Barney" or "die", illustrates the broadcast characteristics of newsgroup titles. Note that groups have respected the desires of others in keeping names modest: for example, the discussion group for gay men is soc.motss (for "members of the same sex"), and virulent racists tend to congregate in misc.activism.militia.
The attempt to regulate the Internet as a broadcast medium through the Communications Decency Act has failed: such regulation has been found to be unconstitutional, and thus is not an option (Stets 1996, American Library Association Inc. v United States).
Publishers are assumed to have full knowledge of the information they present -- as newspapers have editorial control. Publishers are given unique freedom to expose private facts to public scrutiny for the public good (Ginger 1975, New York Times v United States, 1971).
That a provider of information services can be found to be a publisher is illustrated by the case of Prodigy. However, Prodigy was found to be a publisher not of Usenet, but of its proprietary newsgroups and possibly the posts of its subscribers. Prodigy filtered messages and rejected those found unacceptable for the family environment that Prodigy targets as its market.
Providers of information services are like publishers in that the primary goal is to give readers, electronic or physical, access to selected information.
All system operators are like publishers in that they cannot reasonably be held liable when information they produce is incorrect and leads to bad decision making by the receiver (Daniel v Dow Jones). Providers of Usenet services are also similar to publishers in that they have no control over secondary dissemination of the materials they provide.
Providers of Usenet and Web servers are like publishers in that they can control the initial distribution of information. Operators of private subscriber systems can choose to restrict their customers to employees or to those who can provide proof of age; firewalls, IP address checks, and password protection are available for Web pages.
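Such control over initial distribution is mechanically simple. The following hypothetical sketch in Python (the network addresses and subscriber list are invented for illustration, not drawn from any actual system) shows how a server might restrict a page to known addresses or to authenticated subscribers:

```python
# Hypothetical sketch: controlling the initial distribution of a Web page.
# The allowed network and subscriber credentials below are invented examples.
from ipaddress import ip_address, ip_network

ALLOWED_NETWORKS = [ip_network("192.0.2.0/24")]  # e.g., an institution's addresses
SUBSCRIBERS = {"alice": "correct-horse"}          # password-protected accounts

def may_read(client_ip, user=None, password=None):
    """Grant access if the request comes from an allowed network,
    or carries valid subscriber credentials."""
    if any(ip_address(client_ip) in net for net in ALLOWED_NETWORKS):
        return True
    return user is not None and SUBSCRIBERS.get(user) == password

print(may_read("192.0.2.17"))                             # inside the network
print(may_read("203.0.113.5", "alice", "correct-horse"))  # valid subscriber
print(may_read("203.0.113.5", "bob", "guess"))            # rejected
```

The point of the sketch is only that the publisher-like control exists at the point of initial distribution; it says nothing about secondary redistribution once a reader has the page.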
Treating Internet service providers (ISPs) as publishers would result in the prohibition of vast amounts of information, as suggested by Stratton Oakmont, Inc. v. Prodigy Servs. Such a regulatory regime would require that Internet service providers search all posts and approve all pages. This has implications not only for speech rights, but also for privacy rights.
Providers of information services are like distributors in that they have the ability to filter at the highest level of content. That is, information service providers can select Usenet groups or domains where access is allowed, while prohibiting others. Like publishers, system operators have been found not to be liable when information they provide is incorrect and leads to bad decision making by the recipient (Daniel v Dow Jones 1987).
Providers of Internet services are like distributors in that they cannot have knowledge of all information that is provided through their service. Only high level filtering is possible for many providers of Internet services. Filtering such as provided by Prodigy requires large investments both in filtering software, and in labor since each flagged message must be evaluated by a human.
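The cost structure described above is visible even in a toy version of such filtering: software can only flag candidate messages, and every flag still demands human labor. A hypothetical sketch in Python follows (the word list and messages are invented for illustration; they echo the AOL example discussed later in this section):

```python
# Sketch of Prodigy-style filtering: software flags messages containing
# suspect words; every flagged message must then be read by a human.
SUSPECT_WORDS = {"breast", "hooter"}  # invented example list

def needs_human_review(message):
    """Flag a message if it contains any suspect word."""
    words = set(message.lower().split())
    return bool(words & SUSPECT_WORDS)

messages = [
    "Meeting notes for the garden club",
    "Support group for breast cancer survivors",
]
review_queue = [m for m in messages if needs_human_review(m)]
print(len(review_queue))  # one message flagged for human evaluation
```

Note that the flagged message here is a support-group announcement, not obscenity: the sketch reproduces in miniature the over-breadth that word-based filtering imposes.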
Like distributors, providers of Internet services may have reason to know some content. For example, any ISP would know that the content of any group named *.pedophilia would be unacceptable, and in fact requests to create such groups are denied.
Providers of Internet services are unlike distributors in that a limited ability to distinguish users means a limited ability to control access. The sheer volume of information which flows through a Usenet feed, and the amount of information requested by users' clients from various Web servers makes detailed filtering extremely difficult.
CompuServe is a distributor because it provides access to information "published" or written by the subscribers, while CompuServe itself does not provide content (Cubby v CompuServe Inc., 1991). CompuServe does not edit the content of specific posts and forums, but rather chooses which forums are allowed. This is analogous to a bookstore, which selects specific books but cannot reasonably know what is in every one.
However, regulating Usenet and Web services as secondary publishers creates difficulties. High-level decisions on content mean broad prohibitions. Prohibited content may be particularly valuable for reasons of health. For example, a broad prohibition of obscene words forced breast cancer survivors on AOL to create a chat group for "hooter" cancer survivors, and prevented women from listing themselves as breast cancer survivors in their .profiles (roughly equivalent to .plan files). Arguably, electronic discussions of sex, such as take place on alt.sex.wizards and alt.sexual.abuse.recovery, are particularly valuable on the Internet, since there is no threat of violence and there is potential for anonymous or pseudonymous discussion.
Providers of Usenet and Web servers also arguably have characteristics of common carriers. Providers of Internet services provide routing for others' packets with nondiscrimination. Providers of Usenet feeds provide the entire feed, with the recipient responsible for filtering.
Usenet news posts and html documents are like telephone conversations in that they are so numerous that knowledge of all content is not possible.
However, Internet provision of Usenet and Web services is not common carriage in that there is no requirement to provide nondiscriminatory services to individual users. ISPs can remove users at will, unlike telephone companies. Internet providers can choose whether or not to act when a subscriber is sending or receiving harassing or offensive email, and an ISP can choose to block certain domains or sites.
The difficulty with regulating the Internet and Internet service providers as though Usenet news and Web services were common carriage is that a common carrier has neither the need nor the right to act. If Internet services are common carriage, then an ISP could not remove Internet access privileges for abuse of other individuals, or for behavior commonly defined as abuse of the Internet itself, such as spamming.
The law is not coming to the Internet; it has always been there. It has never been the case that communicating a death threat to a Federal official was legal if it was communicated through email. That the President is only recently on the Internet means that this law is only recently applied -- it does not mean that death threats were ever legal.
Recent arrivals to the Internet, or newbies, often see the Internet as a place of chaos -- in fact it is quite orderly. There are rules of the road on the information highway. The fundamental common law is, "You spam, we flame." Spams and cancelpoodles always draw responses. Complaints to the employers and ISPs of offenders are common, as is simple flaming.
Where an item is posted or how it is linked defines the acceptability of information on the Internet. Hate posts, which may seem unacceptable anywhere, are sometimes flamed for being in the wrong place. This is substantially different from objecting to the speaker's right to speech -- it is an objection that the reader's right to ignore such speech in daily life is being violated.
There are vastly different levels of civility, and vastly different signal-to-noise ratios, in different areas of the Internet. Cross-posting does create confusion and blurs such lines. However, some things are certain. For example, alt.peeves will remain forever less civil than rec.crafts.textiles.needlework.
This range of acceptability suggests a recognition: that there is no cyberspace, but many cyberspaces. The attempt to fit all ISPs, or all Usenet groups, or all Web pages into a single category will fail just as attempts to determine universal laws for speech will fail. In the next section we consider physical spaces, and their electronic analogs.
One reason denizens of the Internet call it cyberspace is that the various attributes applicable to defining different physical spaces can also be used to define different virtual spaces. Note that the physical space of a speech has been a defining characteristic of its acceptability; you can neither falsely yell "Fire!" in a crowded theater nor expound upon philosophies of racial superiority at work. Yet there are places for both outrageous exclamations and offensive arguments in the physical world. We argue that the same is true in the virtual world.
The organization and language of users on the Internet suggest an option: parallels of physical space. Consider how the description of files and services currently reflects an awareness of how different services offer different types of space: cyberspace, home page, chat room, mail box, dungeon, and home directory to name a few.
The argument that the same constraints apply to electronic newsgroups as to physical bboards reflects a recognition that there are different spaces; what governs the appropriate use of a physical bulletin board is in no small degree its location. The bulletin board in the grocery store varies significantly from the bulletin board in the office.
Why are cyberspaces more like spaces than media? First, the media model assumes that the technology determines content control. Yet multiple physical spaces are built using the same fundamental construction technologies and principles; similarly, electronic spaces with very different characteristics can be set up using the same electronic tools.
Secondly, media types assume control exists at a single location. For telephony the location is at the end user; for broadcasters it is at the originator. With electronic services there can be multiple levels of control. An institution may limit access to an internal Web page, or allow it to be open. In either case the author of the Web page may choose to provide access to a select group, or to many.
Thirdly, there is limited assurance of speech protection in the various media types. Obscenity can be outlawed by the states, for example; yet raiding a home or a private poetry reading would be extreme. Because the equivalent spaces on the Internet are not so closed, such a raid or prohibition is more likely to happen. Authors have advocated protecting cyberspace as a whole as an electronic soapbox (Berman, 1996). The Internet offers a unique opportunity for speech -- particularly the speech of the despised or threatened. Individuals can interact with those who support violence against groups to which the individual belongs without fear of personal reprisal. For example, few of the liberals who argue on the militia group would go to a militia meeting to confront the armed self-proclaimed patriots physically. Physical threat is far less effective at silencing speech when the space is virtual and the participants dispersed. Yet it is precisely those dialogues that would be limited by a threatening environment that are most likely to be subject to limitation -- areas for hate speech and sexual speech provide forums for debates which would otherwise be unlikely to occur. Recognizing the value in these debates implies protecting them from closure, and assuring providers and participants free rein to advocate the most outrageous stands without fear of liability. It also requires protection of speech from other users who would silence opponents through cancels.
Fourth, the space analogy offers an opportunity to declare that there are special cases where control must be asserted, without requiring complete control. Control may be required, for example, when an ISP is being used to stage a malicious attack on other systems, or is sending out a collection of perfectly legal but speech-destroying cancelbots. Under media types, control exercised in an extreme situation is evidence of the ability to control, and therefore of broad liability.
Finally, the space analogy has the advantage that it is intuitive. The social contract that may appear to fail in a workspace may be failing simply because the space is not understood to be a workplace by all parties. Such an understanding would bring to the electronic forum a far better internalized understanding among users than the application of media types.
The chart below shows how different discussion groups at Carnegie Mellon University can correspond to different spaces; if forced into the media type paradigm, these would need to be in different media types.
|Local Group |Media Type     |Space         |Knowledge                                                    |
|Club Groups |Common Carrier |Club meetings |No knowledge; may be unable to maintain an electronic presence|
|Graffiti    |Common Carrier |Public Forums |No knowledge; no universal access; no action on threats       |

Other local groups may offer information unsuitable for children (for example, on local sexual assault events or policies), with attendant liability for obscenity.
In the bedroom, an individual has the right to speak and act in even the most offensive ways, ways that would not be acceptable outside the home. Should a group gather in a bedroom, there would be no action against speech absent criminal intent.
Law enforcement must have reason to even enter a person's bedroom. There must be some evidence of a crime, or a clear and present danger. Lewd, obscene and pornographic speech find protection in the home.
Private distribution lists created for the consensual discussion of adult topics may be an example of a bedroom operating on the Internet. Dial-up pornography BBSs may be a second example. A user's personal email can serve as a bedroom.
Not only do citizens have the right to speak freely in their bedrooms; they also have the right to bar ideas from their bedrooms. Just as technical mechanisms have developed to filter material from our desktops and mailboxes (Spertus 1996), legal mechanisms have developed to protect our bedrooms. Speech that has only political implications on the street corner, such as the advocacy of violence against a select group, has much different implications in terms of politics and present danger when reiterated in the bedroom of a member of the targeted group.
Bedrooms can provide more freedom of speech than telephones, particularly for sexual speech. As gender issues are part of today's most hotly contested political debates, the determination to prohibit passionate speech as obscenity can strictly limit these debates. Network owners have been allowed to prohibit constitutionally protected speech and have refused to enter contracts with those who present undesirable viewpoints. Declaring that some areas are legitimately the proper space for undesirable opinions and obscene speech, and that retribution for such speech made in the appropriate forums is prohibited, has the potential to provide greater freedom of speech than the current regulation of telephony network owners (Barron, 1993).
In the workspace, a spirit of respect for others is an important aspect of maintaining a professional environment. Depending on the work being done there, other values of how colleagues should treat one another will prevail; in a corporate environment, there may be expectations of more formal speech. In a university, a spirit of inquiry and openness may dominate.
In a university setting, workspaces may include departmental newsgroups, and some student groups that maintain a professional environment or that students consider to be preparation for their careers, such as a newspaper or business organization.
Speech is controlled in a workspace. Speech that would be dangerous because of implications of physical violence in a bedroom can create the danger of economic deprivation if repeated in workspaces, as recognized by the EEOC.
In the classroom, there should be a spirit of inquiry that enables the broaching of any subject and the expression of any viewpoint. There should also be a spirit of respect for others and a code of behavior that prohibits personal attacks and intimidation. An environment that is free from harassment and intimidation yet that accords individuals due respect is most conducive to the uninhibited sharing of ideas.
Course discussion newsgroups at universities may function as a classroom space. Clarifications of homework, queries and comments between students, professors, and teaching assistants sometimes result in terse or angry debates. The advocacy of ethnic cleansing, racist remarks, or personal insults each remain as inappropriate or appropriate as in the classroom -- depending on the topic at hand. In the classroom the instructor and other students work together to maintain the appropriate learning atmosphere by responding to inappropriate behavior.
Just as one would be unlikely to come to class in bedclothes, some speech appropriate in the bedroom is inappropriate in a classroom. Conversely, while one never need listen to offensive speech in the bedroom, one must listen to ideas in the classroom. The purpose of being in a classroom is to encounter ideas. Unlike in the workplace, the exchange of passionate ideas is an ideal in the classroom.
In the town square, free expression is paramount. It is a space designed for the exchange of ideas political and personal. Personal attacks, outrageous arguments, hate speech, and the like are welcome here. Direct criminal threats and slander are the bounds of speech here; propriety is not at issue.
On the Internet, town squares may include groups that wish to create dialogue on various issues, such as political organizations. The entire alt hierarchy is a town n-cube, far surpassing the space and time limitations imposed on a physical square.
It is clear that a given entity on a college campus could serve more than one purpose. For example, an African-American students' organization could be involved with political dialogue, offering support to African- American students on campus, and the actual business of running the organization. It might choose to have virtual analogs that imitate business meetings, discussion groups, and open political forums.
Note that discussion in a town square can actually be more limited than the discussion in a bedroom. Just as public nudity is prohibited, some arguments and actions protected in the bedroom can be prohibited in the town square. Thus the ability to create spaces in addition to the information agora implies the ability to provide a level of freedom of debate and discussion unmatched in any single medium.
We will consider three examples in which the existence of media analogies as the dominant paradigm for regulating electronic communications created incentives (in the form of potential liability) for poor policy decisions in the university environment. Others have argued that the current regulatory regime allows limitations on speech, but they have focused on case law rather than the effects of the shadows of liability (Hammond 1995). We choose examples in the academic environment not only because it is this environment with which we are most familiar, but also because universities have the creation and sustenance of dialogue as a primary purpose. Examples from business or government domains would be more complicated by questions of misuse of resources.
In the first case, Carnegie Mellon implemented a broad policy censoring all Usenet newsgroups in the alt.sex.* hierarchy (Camp and Riley 1995; Steinberg 1995). Though the stated purpose of the policy was to bring Carnegie Mellon in compliance with Pennsylvania obscenity law, what resulted was the denial of access to all conversations about sex and sexuality, and the allowance of some legally obscene material (e.g. child pornography) to remain available to the CMU community.
In the second case, two female students sued Santa Rosa Junior College after having been the subject of sexually explicit rants posted to an all-male class discussion board (Dorgan 1994). The entire situation could have been prevented had the professor, students, and university been clear from the start about the purpose of the discussion board. Instead, what resulted was a temporary job termination, involvement of the Department of Education's Office of Civil Rights, and an embarrassing settlement for the college.
Third is the much-hyped Jake Baker case, in which a student at the University of Michigan posted an abduction, rape, and murder fantasy involving a female student in one of his classes. He then sent a second email saying, "It's not enough anymore to think about it, I have to do it" (MacKinnon 1995). While most of the hype surrounds the federal case in which Baker was arrested for transmitting a threat across state lines, we will concern ourselves with the response from the University of Michigan itself.
In November 1994, Carnegie Mellon announced that it would remove all newsgroups in the alt.sex hierarchy, as well as selected groups in the alt.binaries.pictures.erotica hierarchy, and a few others (Camp and Riley 1995). The newsgroups were selected for removal by title, not by substance. This makes clear that CMU was acting as a distributor, not as a publisher, since it avoided knowledge of the content of Usenet newsgroups. Hence, groups such as alt.sex.fat, which is a support and discussion group for issues of sexuality and being overweight, and alt.sex.NOT, which is a discussion group about abstinence, were banned. Similarly, valuable information about sexual health was banned through the removal of groups like alt.sex.safe and alt.sex.wizards.
The policy's stated goal was to comply with state and federal law. CMU determined that it was a distributor of Usenet information and thus had to act where the University could not deny knowledge of obscenity. Thus CMU cut a broad swath through the Usenet feeds. Yet CMU let remain groups that regularly contain child pornography, such as alt.binaries.pictures.girls. By trying to minimize perceived liability the administrators at Carnegie Mellon both limited important sexual discourse and failed to remove material which is uniquely prohibited.
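Mechanically, title-based removal amounts to matching group names against a list of prefixes, which is exactly why it is both over- and under-inclusive. A hypothetical sketch in Python (the banned-prefix list is an abbreviation of the policy described above, and the sample feed is invented for illustration):

```python
# Sketch of title-based (distributor-style) filtering: groups are removed
# by name alone, with no knowledge of their actual content.
BANNED_PREFIXES = ("alt.sex.", "alt.binaries.pictures.erotica")

def carried(group):
    """A group is carried unless its name matches a banned prefix."""
    return not group.startswith(BANNED_PREFIXES)

feed = [
    "alt.sex.wizards",                 # sexual-health discussion: removed
    "alt.sex.not",                     # abstinence discussion: removed
    "alt.binaries.pictures.girls",     # may carry child pornography: kept
    "rec.crafts.textiles.needlework",  # kept
]
print([g for g in feed if carried(g)])
```

The sketch shows in miniature the outcome described above: support and health groups vanish because their titles match, while a group whose title does not match survives regardless of what it contains.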
However, had CMU selected for content and removed only the child pornography postings, it would be in a position of monitoring. Monitoring is expensive technically -- it involves processor time and coding costs. It is also hazardous in that such monitoring would have created the potential for CMU to be a publisher; like Prodigy, the University would have created an infrastructure theoretically capable of monitoring every post.
As long as the media types framework applies, institutions like CMU cannot take the necessary steps to comply with child pornography laws without making themselves vulnerable to a publisher's standard of libel.
Consider a more obscure event at CMU. The Women's Center was bombarded by hate messages in the spring of 1993. The flame war that ensued was so vigorous that readers who subscribed to the board to receive announcements of business meetings found it too cumbersome to wade through the drivel to the business messages. Furthermore, some women felt like their board should be a safe space where discussion could focus on women's experiences, not on the wittiest response to aspiring Limbaughs. Other women felt it was important to engage the trolls and expose them to feminist ideas.
They resolved the issue with the traditional feminist "both-and" solution; they created a discussion board and an announcement board so people could choose whether to enter the town square dialogue with the anti-feminists or simply read workspace announcements. The women's center account was given the authority to move posts from one board to the other, as appropriate. In this way, women maintained a safe space and protected free speech at Carnegie Mellon.
There was not a single standard applied to both newsgroups. The University had the same level of control in both the assocs.womens-center and assocs.womens-center.discussion groups. These two newsgroups are of the same media type yet one serves as a protected space for the expression of opinions by an underrepresented group and the other serves as a completely protected forum for even the most offensive speech. The application of media types to all groups at Carnegie Mellon would prevent such distinctions.
When Roger Karraker taught a journalism class at Santa Rosa Junior College (SRJC), he maintained an electronic conferencing system. The professor used the electronic conferencing system to motivate his students to have wide ranging discussions unlimited by classroom constraints, and to expand the learning experience by familiarizing his students with the electronic medium. At the students' request, he established single-sex spaces for the continuation of classroom discussions.
When the student newspaper published a swimwear ad featuring a woman in a thong bikini, one female student picketed the paper to protest the objectification of women. After another woman defended her, they both became targets of sexually derogatory comments on the all-male bulletin board. One of the men told the women about the comments, and they complained to Karraker who immediately shut down the single-sex boards. However, one student was not satisfied with this response and complained to the administration, who put Karraker on paid leave. The other students retaliated against the loss of the professor by attacking the woman who had made the original complaint. She dropped out of college, and threatened a sex discrimination suit (Dorgan 1995, Lewin 1995).
The Office of Civil Rights (OCR) in the department of education demanded that the college have an across-the-board policy on all its electronic services against sexually explicit comments that could create a hostile educational environment. With respect to Usenet, this would have resulted in the termination of the alt.sex hierarchy in addition to the removal of much of the soc and talk hierarchies. In order to meet this condition, the University would have had to eliminate many distribution lists as well. Arguably, the OCR treated SRJC as a publisher because this particular case originated in the journalism class which published the school newspaper. Certainly the journalism class did have newsgroups that were published by the class as journalistic endeavors. Because one newsgroup at Santa Rosa was a publication, all newsgroups must be publications under the media rubric.
Consider the answer of the more subtle question: what is the correct analogy to the physical world? Was it a locker room? A group therapy session? Surely in these contexts the speech is to be protected. The group therapy analogy is probably most appropriate considering that students were required to abide by a confidentiality rule regarding the contents of the discussions. Indeed in such a situation, personal attacks are to be expected and the privacy of the participants is paramount.
If one considers the forum to be an extension of the classroom, as Karraker first characterized it, professional limitations may be placed on speech. Vulgar personal attacks are prohibited in a classroom, and Karraker as the course instructor would have a personal and possibly legal responsibility to intervene.
Regardless of the classification, choosing an analogy for that particular electronic space would not have defined all electronic spaces associated with the journalism class. The attempt of the OCR to fit the entire college computing system into that of a publisher is a misguided solution that ignores the nuances of cyberspace, and thus resulted in threats to on-line speech.
The professor's choice to create an specified therapy session in a classroom created an environment that resulted in the professor's suspension, loss of access to some on-line speech for journalism students, threats to many unrelated electronic forums on campus, and one young woman's abandonment of academic goals because she perceived it as an extension of the classroom. The lack of a clear conceptual model created great harm when there was no intention to do so. Because all the speech was electronically assisted communication did not make all the spaces equivalent. Here again, the lack of consideration of the difference in cyberspaces resulted in the loss of access to on-line free speech at Santa Rosa Junior College.
Jake Baker posted a set of detailed steps to kidnap, rape, torture, and murder another student. Jake Baker named that student and identified her living area in alt.sex.stories. What is the University's responsibility under different media types?
First, if Michigan is a common carrier, their responsibility is essentially to do nothing. By definition it does not have any knowledge of the information posted to alt.sex.stories and thus has no obligation to intervene. Michigan could have maintained its contention of common carriage by refusing to act.
If Michigan is a publisher or a distributor, it must choose to act. In this case the options included deleting the message, or investigating Jake Baker's intent, or both. However, acting on that message implies broad future control of information and future liability in cases of libel, copyright, and other infringements that may not pose the same dangers implied by this particular case.
Consider instead Michigan's responsibility if the issue is one of cyberspaces. If alt.sex.stories could be considered a bedroom or dorm room, then the University could intervene on the basis that a direct physical threat was made and not face future liability. Quite simply, entering one dorm room to prevent a crime does not imply that the University has knowledge of all actions in all dorm rooms.
An appropriate analogy might be an erotica reading night at a campus cafe. People who attend the readings do so knowingly and expect to hear erotic stories of an explicit nature. The University can choose as part of its mission to provide students with the opportunity to explore erotic literature, and to support its students' writing experiments by providing such a forum. However, upon hearing a specific plan of violence directed at a specific individual, it is reasonable that the administration would act to prevent any clear danger. In choosing to act in a single case to prevent physical harm, the University would not commit to prohibiting erotic story hours, or be considered a distributor of all such papers written on campus.
In short, the media type classification encourages irresponsibility by potentially categorizing an organization based on one action in a particular case and then applying the standards of the media type to all forums.
The classification of the Internet into a media type, or the creation of a new media type is tempting for many reasons. First, the media model for electronically assisted communication has served well for decades.. The model of four media has remained solid through the early days of the wireless revolution, and several court cases have seen fit to apply a specific media type to specific provider.
Space as the determinant has advantages and disadvantages. The advantage is that control over speech may be exercised in some forums, without hindering the free flow of information on the Internet as a whole. Users can define and declare the limits of their own spaces. It would enable a broad but nuanced policy where control results in liability, without blanket restrictions directed at all providers. Treating cyberspace as cyberspaces also assists in users' right to address and access a public forum, as well as users' right to be left alone in their own electronic spaces.
The disadvantage is that there is no graceful unified theory as now exists with media types. There is no overarching theory of speech in spaces, rather a quilt of court decisions and regulations. Like all frameworks, regulation of electronic spaces as space could be manipulated.
As in physical spaces there exist common carriers, publishers, distributors and broadcasters in cyberspace. A single person can use all of these services --by sending email through a simple gateway, by creating a Web page, by setting up a reflector, and by spamming. Yet cyberspace has more modes and models than can be contained in these four media types. The regulation of cyberspace as a multi-dimensional world of spaces is far more complex than the search for the appropriate media type. Yet the virtual world requires such complexity.
cancelbot - a program that removes spams
cancelpoodle - a program that removes a single message because of content
spam - the sending of many identical messages separately to a large number of newsgroups
.plan file - on UNIX systems, a user's plan file displays personal information that the user chooses to make available to other users who access the file
Usenet newsgroups - one of the 10,000 topic areas on Usenet, a distributed bulletin board system
home page - one's personal billboard on the World Wide Web
BBS - electronic bulletin board system - a message database where people can log in and leave messages for others in topic groups
bboard - electronic bulletin board, in this piece refers in general to topic groups in a BBS or a Usenet newsgroup, or specifically to CMU's campus- wide BBS - a physical bboard is the traditional cork and push-pin memo board
flame - an email message intended to insult and provoke, often directed with hostility toward an individual
firewall - widely used security product that limits the flow of information between a specific domain and the wider network
chat room - private bboard on America On Line
dungeon - short for multi-user dungeon or MUD - set of real-time chat rooms that have the feel and structure of an adventure game
IP address - Internet Protocol address, which provides electronic location information
American Library Association Inc, et. al. vs. United States, et al., Civil,
No. 96-1458, 3rd Circuit U.S. Court of Appeals, 1996Beall R., 1987,
"Developing a coherent approach to regulating bboards", Computer Law
Journal, Vol. 7, Fall, 499-516.
Barron, 1993, "The telco, the common carrier model and the first amendment -the Dail-A-Porn precedent," Rutgers Computer anmd Technoloy Law Journal, Vol. 19, 371, 385-391.
Becker L.E., 1989, "Liability of computer bulletin board operators," Connecticut Law Review, Vol. 22, 203-238.
Berman, 1995, "Abundance and user control: renewing the democratic heart of the first amendment in the age of interactive media," Yale Law Journal, Vol. 104, 1619-1637.
Camp, L.J. and Riley, D.M., 1995, "Women, Children, Animals and the Like: Protecting an Unwilling Electronic Populace," Proceedings of the Fifth Conference on Computers, Freedom and Privacy, March 28-31; Burlingame, CA; pp 120-139
Cubby v Compuserve Inc., 1991, 776 F. Supp. 135 (S.D.N.Y. 1991).
Daniel v Dow Jones, 1987, 520 NYS2d 334, 338.
Di Lello E.V., 1992, "Functional equivalency and its application to freedom of speech on computer bulletin boards," Columbia Journal of Law and Social Problems Vol. 26, 199-247.
Dorgan, M. "Free Speech in Cyberspace: Santa Rosa case tests government limits on computer bulletin boards," San Jose Mercury News, September 17, 1994.
Hammon, A., 1981, "Regulating the multi-media chimera: electronic speech rights in the United States", Rutgers Computer anmd Technoloy Law Journal , Vol. 21, 1-87
Krattenmaker and Powe, 1995, "Converging First Amendment Principles for Converging Communications Media," Yale Law Journal, Vol. 104, 1719-1744.
Jepsen, D., 1995 , Testimony for Senate Judiciary Hearing on Cyberporn, July 24.
Ginger, A. F., 1975, Pentagon Papers Case Collection: Annotated Procedural Guide and Index, Oceana Publications, Dobbs Ferry, New York.
Lewin, T. "College Settles Harassment Charges Stemming from Computer Conferences," New York Times, September 21, 1994.
Lewis, P. H., 1995, "After Apology From Prodigy, Firm Drops Suit," N.Y. Times,, Oct. 25, at D1.
MacKinnon, K., 1996, Transcript of Michigan Telecommunications and Technology Law Review Conference "Policing the Internet: Jake Baker and Beyond", http://www.umich.edu/~mttlr/archives/bakerconf/ October.
Naughton E.J., 1992, "Is cyberspace a public forum? Computer bulletin boards, free speech and state action," Georgetown Law Journal, Vol. 81, 409-441.
New York Times Co. v. United States 403 US 713 (1971)
Pool, I.S., 1983, Technologies of Freedom, Harvard University Press, Cambridge, MA.
Sassan A. J., 1992, "Comparing apples to oranges: the need for a new media classification," Software Law Journal, Vol. 5, 821-844.
Spertus, E., 1996, per "Social and Technical Means for Fighting On-Line Harassment", Virtue and Virtuality: Gender,Law, and Cyberspace, April 20-26, MIT, Cambridge, MA.
Steinberg, E., 1995, "Living on the Slippery Slope," Law in the Misinformation Age: The First Amendment, Privacy and Electronic Networks, Duke Law School, January 26-28, Durham, NC.
Stets, D., 1996, "Judge Rejects the Internet obscenity law," Philadelphia Inquirer, Thursday, June 13, 1:1.
Stratton Oakmont, Inc. v. Prodigy Servs. Co, N.Y. Sup. Ct. May 25, 1995.
Travis, M.L., 1991, "Dueling forums," Cincinnati Law Review, Vol. 60, 757-795.