Computers and Society. September 1998, p. 17-19. Originally written as a position piece for the DIMACS workshop on Design for Values: Ethical, Social and Political Dimensions of Information Technology, February 28 - March 1, 1998, Princeton, NJ.

Bias and Responsibility in 'Neutral' Social Protocols

Lorrie Faith Cranor [1]
AT&T Labs-Research
lorrie@research.att.com

The Platform for Internet Content Selection (PICS) [2] and the Platform for Privacy Preferences Project (P3P) [3] are both technologies developed with the express purpose of offering technical solutions to social problems of concern to policy makers. While the designers of these platforms may have thought about them in much broader terms [4], in each case a single policy issue -- protecting children from harmful online materials and protecting online privacy, respectively -- was primarily responsible for driving the technology's development [5]. Despite these explicit social goals, in both cases efforts were made to develop platforms capable of supporting a multitude of diverse policies. Thus, these technologies have been referred to as 'policy neutral' or 'value neutral'. While these terms certainly reflect the fact that no specific policies were built into these technologies, I argue 1) that the platforms are nonetheless inherently biased and 2) that the platform designers may have a responsibility to prevent, or at least warn against, the use of the platforms to implement dangerous or unethical policies.

PICS and P3P are both projects developed under the Technology and Society Domain of the World Wide Web Consortium [6]. PICS is an Internet metadata format that was developed primarily to help parents choose appropriate content for their children. P3P, which is still under development, is being designed to allow Web sites to express their privacy practices and users to express their privacy preferences in such a way that a mutually acceptable agreement can be reached rapidly through an automatable procedure.
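
To make the idea of an automatable agreement procedure concrete, here is a minimal sketch, in Python, of how a user agent might compare a site's declared practices against a user's preferences. The data elements, purpose names, and matching rules are my own invention for illustration; they do not reflect the actual P3P vocabulary or protocol, which is still being designed.

    # Hypothetical sketch of the kind of automated matching P3P envisions.
    # The data elements, purpose names, and matching rules below are
    # invented for illustration; they are not drawn from the P3P specification.

    SITE_PRACTICES = {
        "email": {"purpose": "order fulfillment", "shared_with_third_parties": False},
        "clickstream": {"purpose": "site improvement", "shared_with_third_parties": True},
    }

    USER_PREFERENCES = {
        # Release a data element only if it is used for an acceptable purpose
        # and is not shared with third parties.
        "email": {"acceptable_purposes": {"order fulfillment"}, "allow_third_party_sharing": False},
        "clickstream": {"acceptable_purposes": {"site improvement"}, "allow_third_party_sharing": False},
    }

    def acceptable(practices, preferences):
        """Return True if every declared practice satisfies the user's preferences."""
        for element, practice in practices.items():
            pref = preferences.get(element)
            if pref is None:
                return False  # no stated preference: treat the practice as unacceptable
            if practice["purpose"] not in pref["acceptable_purposes"]:
                return False
            if practice["shared_with_third_parties"] and not pref["allow_third_party_sharing"]:
                return False
        return True

    # The clickstream practice involves third-party sharing the user rejects,
    # so no agreement is reached and a user agent would withhold the data.
    print(acceptable(SITE_PRACTICES, USER_PREFERENCES))  # False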

Inherent Bias

The PICS specification documents make no recommendations about what types of online content are appropriate or inappropriate for children, nor do they even restrict PICS ratings to the question of appropriateness for children. Instead, PICS facilitates the creation of rating systems that specify the criteria with which content should be rated. Thus, PICS equally supports rating systems that classify content based on age appropriateness, entertainment value, veracity, or any other set of criteria. But PICS rating systems do not implement policies; they merely allow for the expression of information that can be used to determine whether a particular piece of content is consistent with a policy. It is up to software such as Web browsers to provide tools for implementing particular policies. For example, a rating system may be used to classify content as 'interesting' or 'boring', but there is nothing inherent in the rating system that allows or disallows access to material on this basis. Individuals might configure their Web browsers to filter out all boring materials, while employers might configure software on their corporate Internet gateways to filter out all interesting materials that might distract their employees from their jobs. Thus policy decisions are completely separated from the PICS specifications, and mostly separated from PICS rating systems [7]. Nonetheless, PICS is not completely without policy bias.
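
The following minimal sketch, in Python, illustrates this separation. The 'entertainment' rating scale and the two policies are invented for illustration and are not part of any actual PICS rating system; the point is simply that the same label supports opposite policies, each implemented in the consuming software.

    # Illustrative only: a made-up 'entertainment' rating on a 0-4 scale,
    # not an actual PICS rating system. The label carries information;
    # the policy lives entirely in the software that consumes the label.

    label = {"url": "http://www.example.com/page.html", "entertainment": 4}

    def home_policy(label):
        """An individual might block boring material."""
        return label["entertainment"] >= 2   # allow only reasonably interesting pages

    def workplace_policy(label):
        """An employer might block the same material precisely because it is interesting."""
        return label["entertainment"] < 2    # allow only dull, work-safe pages

    print(home_policy(label), workplace_policy(label))  # True False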

The W3C receives more suggestions for new projects than the consortium has resources to manage. And for every official submission to W3C there are probably dozens of other ideas that never even make it to official submission status. Thus a decision on the part of the consortium to launch a project, and decisions by consortium members to devote resources to a project, indicate that, at a particular point in time, that project is considered more important than all of the other possible projects in that area to which resources could have gone. At the time PICS was launched, people had a variety of ideas for online applications of metadata. But the only one compelling enough to elicit a commitment of resources was one that could be used to help parents choose appropriate content for their children. Thus, the creation of the PICS standards could be construed as an endorsement of the need for parents to have automated tools to help them choose content for their children. The ACLU and others have argued that such tools are unnecessary and even dangerous in that they could open the door to government censorship [8]. If the W3C and its members felt that there was no need to help parents choose content for their children, or that other solutions were preferable to automated tools, presumably they would not have chosen to develop PICS at all.

While the PICS designers did an admirable job of designing PICS as a general Internet metadata platform that can indeed be used for applications completely unrelated to the originally stated purpose of PICS, the PICS design and its subsequent implementations were likely biased by that purpose. To the PICS designers' credit, it is not obvious how PICS might have been designed differently had it been created for a different purpose, and speculation on design details is beyond the scope of this essay. My point here is merely to raise the possibility that this design -- and perhaps all technology designs -- may be biased in this way.

That PICS implementations have been biased by the original PICS purpose is an easier point to demonstrate. Despite the flexibility of the PICS specifications and many interesting ideas about applications for PICS, we have yet to see an actual PICS implementation that was not designed primarily as a tool for parents to choose appropriate content for their children. Perhaps the best-known PICS implementation, the Content Advisor component of the Microsoft Internet Explorer Web browser, comes with the RSACi rating system [9] pre-installed. RSACi has four categories designed primarily for use by parents to select content for their children: sex, violence, nudity, and adult language. Aspects of the Content Advisor user interface are well suited to an RSACi-like rating system but ill suited to other types of rating systems. For example, the user interface includes a slider that can be used to set acceptable levels for each rating category. Ratings displayed to the left of the point where the user has set the slider are acceptable; ratings displayed to the right of that point are unacceptable, and content bearing those ratings is blocked. For rating systems like RSACi that assign high scores to content that is most likely to be inappropriate, this user interface is adequate. But for rating systems that assign high scores to content that is high quality or very entertaining, this user interface is useless except for people who wish to avoid high-quality or entertaining content. It is also useless for applications in which a typical policy might look for content that falls within a particular range in the middle of a spectrum (see the sketch below). Furthermore, we have yet to see commercial PICS applications that pursue some of the most interesting uses of metadata: seeking out content of a specific nature rather than blocking it.
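
The following sketch makes the limitation concrete by contrasting the single-threshold test a slider can express with a range test it cannot; the rating scale and numbers are invented for illustration.

    # The rating scale and numbers here are invented for illustration.

    def slider_policy(rating, threshold):
        """What a slider-style interface expresses: block anything rated above
        the threshold. Sensible when higher scores mean more objectionable."""
        return rating <= threshold

    def range_policy(rating, low, high):
        """A policy a single slider cannot express: accept only ratings that
        fall within a particular band in the middle of the scale."""
        return low <= rating <= high

    # With a quality scale where higher is better, the slider semantics are
    # backwards: a low threshold blocks exactly the material the user wants.
    print(slider_policy(4, threshold=2))   # False: top-quality content is blocked
    print(range_policy(3, low=2, high=4))  # True: the middle-band policy admits it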

P3P has inherent biases as well. Once again, the decision to develop P3P is essentially an endorsement of the need to provide individuals with automated tools to help them protect their privacy online. The fact that P3P is being designed to support multiple policies is an acknowledgement that there is neither a universally accepted right to privacy nor a universally accepted limit on the amount of personal information that an individual may choose to disclose (perhaps in exchange for a benefit) [10].

Responsibilities of Technology Designers

Designers of technologies intended to address social needs may have a responsibility to publicly acknowledge the inherent bias in the technologies they design, and sometimes to design in further bias or publicly caution against using the technologies to implement dangerous or unethical policies. The designers of PICS and P3P have done this to some extent, but in both cases, I believe, they should do more.

Even while pointing out the many uses for PICS, its designers have been up front about its original stated purpose. However, while acknowledging the possibility that PICS could be used for undesirable ends, the designers chose not to build mechanisms into the design to prevent this [11], and they have done little to steer PICS implementations away from undesirable uses or towards desirable ones. Until PICS was criticized by First Amendment advocates, few efforts were made even to determine what constituted desirable and undesirable uses of PICS, or to offer guidelines for its ethical use [12].

The P3P designers have spent a great deal of time grappling with design elements that increase the flexibility of the design but also introduce new opportunities for abuse. Early in the P3P design process, the designers decided to follow the PICS model and develop P3P as a flexible system that could be used to implement a diversity of policies rather than as a system that simply enshrined one particular policy in software. Thus P3P permits the creation of policies that not only do nothing to help people protect their privacy but may actually make it easier for them to disclose their personal information. Furthermore, as with PICS, the P3P specifications will not dictate anything about how user interfaces must be implemented; a program will be considered P3P-compliant if it communicates with other P3P-compliant software using the standard P3P protocol. Nonetheless, the user interfaces that are developed to give individuals the ability to configure and change policies will be critical factors in P3P's success or failure as a privacy-enhancing tool.

One might assume that P3P user interface standards are unnecessary because it will be in software developers' best interests to give consumers the kinds of user interfaces they want -- that is, we can let the market decide. But it remains to be seen whether markets for P3P software will develop in which consumers can make meaningful choices. And, especially if companies continue to give their Web browsing software away for free, the server market may exert more of an influence over P3P software development than the consumer market. Thus, user interface design may reflect the desires of Web sites that collect information more than the desires of users. For example, Web sites might want information collection to be as seamless as possible. This might result in the development of user interfaces that automatically reveal information to Web sites that have data practices consistent with the user's policy, without notifying the user or giving the user the opportunity to consent to that release. Users, on the other hand, might prefer a user interface that gives them the option of reviewing and consenting to all data releases.
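
The difference between these two approaches comes down to a small piece of user-agent logic, sketched below with hypothetical names; nothing here is prescribed by the P3P specification.

    # Hypothetical user-agent logic; none of these names come from the
    # P3P specification.

    def release_data(element, practices_match, prompt_before_release, ask_user):
        """Release a data element only if the site's practices match the user's
        policy and, when the user has asked to review releases, only after
        explicit consent."""
        if not practices_match:
            return False
        if prompt_before_release:
            return ask_user("Release " + element + " to this site?")
        return True  # seamless release: the user is never notified

    # A site-friendly implementation might hard-wire prompt_before_release=False;
    # a user-friendly one would expose it as a preference.
    print(release_data("email", True, True, lambda question: False))  # user declines: False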

It is important for the P3P designers to expose the potential for the technology to be co-opted in ways that undermine the goals of P3P, and to offer advice and positive examples for implementing P3P in a privacy-friendly way. The P3P designers have begun to draft an "Implementation Guide" [13] that will include guidelines for software implementers, Web sites, and users, as well as a set of guiding principles for the implementation and use of P3P.

In some sense, we are navigating uncharted territory when developing information technologies like PICS and P3P that are designed to address a social problem without hard-coding specific policies. It is not entirely clear how the designers of these technologies can best carry out their ethical responsibilities.

Notes

  1. My comments here reflect my own observations and opinions, and do not necessarily reflect the policies or opinions of the members of the PICS and P3P working groups, W3C, or my employer. PICS and P3P are ongoing efforts, and thus my comments should not be construed as reflecting the final status of any effort. My role in the PICS effort has been mostly one of an observer, reporter, and sometimes evangelist. On the other hand, my role in the P3P effort has been substantial; I have actively participated in all phases of the project to date.
  2. http://www.w3.org/PICS/
  3. http://www.w3.org/P3P/
  4. The PICS homepage [2] explains PICS broadly as "an infrastructure for associating labels (metadata) with Internet content" and lists a variety of potential PICS applications.
  5. The PICS homepage [2] states that PICS "was originally designed to help parents and teachers control what children access on the Internet." The PICS effort was driven largely as a response to the Communications Decency Act in the United States and to similar legislation and proposed legislation abroad. The P3P homepage [3] describes the goal of P3P as developing "an interoperable way of expressing privacy practices and preferences by Web sites and users respectively." While this goal had been discussed for some time by PICS designers, the looming threat of legislation restricting online data collection in the US was probably most responsible for P3P work moving ahead.
  6. http://www.w3.org/TandS/
  7. The choice of what criteria to build into PICS rating systems limits the types of policy decisions that can be made based on these systems; thus, policy decisions are not completely separate from the underlying rating system. For example, if the only rating systems available to policy makers rate content on the basis of age appropriateness and the inclusion of sex, violence, and vulgar language, then they cannot implement policies that allow or restrict content on the basis of entertainment value or Web site privacy policies.
  8. See the ACLU's white paper, Fahrenheit 451.2: Is Cyberspace Burning? How Rating and Blocking Proposals May Torch Free Speech on the Internet, available online at http://www.aclu.org/issues/cyber/burning.html
  9. http://www.rsac.org/
  10. This policy decision was criticized at the June 1997 Federal Trade Commission Public Workshop on Consumer Information (see transcripts at http://www.ftc.gov/bcp/privacy2/index.html and my notes at http://lorrie.cranor.org/pubs/ftc_97_notes.html) by privacy advocates who argued that privacy should be treated as a basic human right and that consumers should not be given the option to disclose personal information in exchange for benefits. The notion of privacy as a basic human right is central to the European Union's privacy directive.
  11. Paul Resnick's PICS, Censorship, & Intellectual Freedom FAQ, available online at http://www.si.umich.edu/~presnick/pics/intfree/faq.htm, describes ways that PICS might be used by governments to censor their people. It answers the question, "Could W3C have controlled the uses of PICS by licensing the technology?" by saying, "Licensing such a technology was not considered to be a feasible option during the time of the CDA. Not only would it have undercut the 'neutrality' and appeal of the technology, the W3C then would have had to be in the position of determining who should and should not use it; this is not a role the W3C is competent to play."
  12. See the Electronic Frontier Foundation's Policy on Public Interest Principles for Online Filtration, Ratings and Labelling Systems, available online at http://www.eff.org/pub/Net_info/Tools/Ratings_filters/eff_filter.principles. In June 1998 the PICS designers issued a Statement on the Intent and Use of PICS: Using PICS Well, see http://www.w3.org/TR/NOTE-PICS-Statement.
  13. The Guiding Principles section of the P3P Implementation Guide is now online at http://www.w3.org/TR/1998/NOTE-P3P10-principles