The Platform for Internet Content Selection (PICS) [2] and the Platform for Privacy Preferences Project (P3P) [3] are both technologies developed with the express purpose of offering technical solutions to social problems of concern to policy makers. While the designers of these platforms may have thought about them in much broader terms [4], in each case a single policy issue -- protecting children from harmful online materials and protecting online privacy, respectively -- was primarily responsible for driving the technology's development [5]. Despite the explicit social goals of these technologies, in both cases efforts were made to develop platforms capable of supporting a multitude of diverse policies. Thus, these technologies have been referred to as 'policy neutral' or 'value neutral'. While these terms certainly reflect the fact that no specific policies were built into these technologies, I argue that 1) the platforms are nonetheless inherently biased, and 2) the platform designers may have a responsibility to prevent, or at least warn against, the use of the platforms to implement dangerous or unethical policies.
PICS and P3P are both projects developed under the Technology and Society Domain of the World Wide Web Consortium [6]. PICS is an Internet metadata format that was developed primarily to help parents choose appropriate content for their children. P3P, which is still under development, is being designed to allow Web sites to express their privacy practices and users to express their privacy preferences in such a way that a mutually acceptable agreement can be reached rapidly through an automatable procedure.
The W3C receives more suggestions for new projects than the consortium has resources to manage. And for every official submission to the W3C there are probably dozens of other ideas that will never even reach official submission status. Thus a decision by the consortium to launch a project, and decisions by consortium members to devote resources to it, indicate that at a particular point in time that project was judged more important than the other possible projects in that area that could have received those resources. At the time PICS was launched, people had a variety of ideas for online applications of metadata. But the only one compelling enough to elicit a commitment of resources was one that could be used to help parents choose appropriate content for their children. Thus, the creation of the PICS standards could be construed as an endorsement of the need for parents to have automated tools to help them choose content for their children. The ACLU and others have argued that such tools are unnecessary and even dangerous in that they could open the door to government censorship [8]. If the W3C and its members felt that there was no need to help parents choose content for their children, or that other solutions were preferable to automated tools, presumably they would not have chosen to develop PICS at all.
While the PICS designers did an admirable job of designing PICS as a general Internet metadata platform that can indeed be used for applications completely unrelated to the originally stated purpose of PICS, the PICS design and subsequent implementations were likely biased by this purpose. To the PICS designers' credit, it is not obvious how PICS might have been designed differently had it been created for a different purpose, and speculation on design details is beyond the scope of this essay. My point here is merely to raise the possibility that this design -- and perhaps all technology designs -- may be biased in this way.
That PICS implementations have been biased by the original PICS purpose is an easier point to demonstrate. Despite the flexibility of the PICS specifications and many interesting ideas about applications for PICS, we have yet to see an actual PICS implementation that was not designed primarily as a tool for parents to choose appropriate content for their children. Perhaps the best-known PICS implementation, the Content Advisor component of the Microsoft Internet Explorer Web browser, comes with the RSACi rating system [9] pre-installed. RSACi has four categories designed primarily for use by parents to select content for their children: sex, violence, nudity, and adult language. Aspects of the Content Advisor user interface are well suited to an RSACi-like rating system but ill suited to other types of rating systems. For example, the user interface includes a slider that can be used to set acceptable levels for each rating category. Ratings that fall to the left of the point where the user has set the slider are acceptable; ratings that fall to the right of that point are unacceptable, and content carrying such ratings is blocked. For rating systems like RSACi that assign high scores to content that is most likely to be inappropriate, this user interface is adequate. But for rating systems that assign high scores to content that is high quality or very entertaining, this user interface is useless except for people who wish to avoid high-quality or entertaining content. It is also useless for applications in which a typical policy might look for content that falls within a particular range in the middle of a spectrum. Furthermore, we have yet to see commercial PICS applications of some of the most interesting uses of metadata: those that seek out content of a specific nature rather than blocking it.
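To make this limitation concrete, consider the following sketch (in Python; the category names, score scales, and function names are hypothetical illustrations, not part of any PICS specification or product). A slider-style interface can express only a per-category ceiling, whereas some policies need to accept only scores that fall within a range:

```python
# Hypothetical ratings for one document, keyed by category name.
# PICS labels carry numeric values per category; these names and scales are illustrative.
rating = {"violence": 2, "quality": 4}


def slider_policy(rating, ceilings):
    """Slider-style policy: block anything whose score exceeds a per-category ceiling.

    This works when high scores mean "more objectionable" (as in RSACi), but it
    cannot express "show me only high-quality content" without inverting the scale.
    """
    return all(rating.get(cat, 0) <= ceiling for cat, ceiling in ceilings.items())


def range_policy(rating, ranges):
    """Range-based policy: accept only scores inside a per-category interval.

    A single slider cannot express this kind of "middle of the spectrum" preference.
    """
    return all(low <= rating.get(cat, 0) <= high for cat, (low, high) in ranges.items())


# A parent blocking violence rated above level 2:
print(slider_policy(rating, {"violence": 2}))      # True: content is acceptable

# A user seeking moderately sophisticated material, neither trivial nor extreme:
print(range_policy(rating, {"quality": (3, 4)}))   # True: score falls within the range
```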
P3P has inherent biases as well. Once again, the decision to develop P3P is essentially an endorsement of the need to provide individuals with automated tools to help them protect their privacy online. The fact that P3P is being designed to support multiple policies is an acknowledgement that there is neither a universally accepted right to privacy nor a universally accepted limit on the amount of personal information that an individual may choose to disclose (perhaps in exchange for a benefit) [10].
Even while pointing out the many uses for PICS, its designers have been up front about its original stated purpose. However, while acknowledging the possibility that PICS could be used for undesirable ends, the designers chose not to build mechanisms into the design to prevent this [11], and they have done little to steer PICS implementations away from undesirable uses or toward desirable ones. Until PICS was criticized by First Amendment advocates, few efforts were made even to determine what constituted desirable and undesirable uses of PICS, or to offer guidelines for its ethical use [12].
The P3P designers have spent considerable time grappling with design elements that increase the flexibility of the design but also introduce new opportunities for abuse. Early in the P3P design process, the designers decided to follow the PICS model and develop P3P as a flexible system that could be used to implement a diversity of policies rather than as a system that simply enshrined one particular policy in software. Thus P3P permits the creation of policies that not only do nothing to help individuals protect their privacy but may actually make it easier for them to disclose their personal information. Furthermore, as with PICS, the P3P specifications will not dictate anything about how user interfaces must be implemented; a program will be considered P3P-compliant if it communicates with other P3P-compliant software using the standard P3P protocol. Nonetheless, the user interfaces developed to let individuals configure and change policies will be critical to P3P's success or failure as a privacy-enhancing tool.
One might assume that P3P user interface standards are unnecessary because it will be in software developers' best interests to give consumers the kinds of user interfaces they want -- that is, we can let the market decide. But it remains to be seen whether markets for P3P software will develop in which consumers can make meaningful choices. And, especially if companies continue to give their Web browsing software away for free, the server market may exert more of an influence over P3P software development than the consumer market. Thus, user interface design may reflect the desires of Web sites that collect information more than the desires of users. For example, Web sites might want information collection to be as seamless as possible. This might result in the development of user interfaces that automatically reveal information to Web sites that have data practices consistent with the user's policy, without notifying the user or giving the user the opportunity to consent to that release. Users, on the other hand, might prefer a user interface that gives them the option of reviewing and consenting to all data releases.
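The difference is easy to see in a minimal sketch (Python; the policy fields, preference structure, and function names are hypothetical simplifications and do not correspond to the actual P3P vocabulary). The preference-matching logic is identical in both designs; what differs is whether the implementer chooses to ask the user before releasing data:

```python
# Hypothetical, simplified stand-ins for a site's declared practices and a user's preferences.
site_policy = {"purpose": "order-fulfillment", "third_party_sharing": False}
user_prefs = {"allowed_purposes": {"order-fulfillment"}, "allow_third_party_sharing": False}


def practices_acceptable(policy, prefs):
    """Return True if the site's declared practices fall within the user's preferences."""
    purpose_ok = policy["purpose"] in prefs["allowed_purposes"]
    sharing_ok = (not policy["third_party_sharing"]) or prefs["allow_third_party_sharing"]
    return purpose_ok and sharing_ok


def release_data(policy, prefs, prompt_user):
    """Decide whether to release data once the site's practices have been checked.

    prompt_user=False models a "seamless" interface that reveals data silently
    whenever the practices match; prompt_user=True models an interface that
    always gives the user a chance to review and consent to the release.
    """
    if not practices_acceptable(policy, prefs):
        return False
    if prompt_user:
        answer = input("This site's practices match your preferences. Release data? [y/n] ")
        return answer.strip().lower() == "y"
    return True


# A server-oriented implementation might ship with prompt_user=False by default;
# a user-oriented one might default to prompt_user=True.
print(release_data(site_policy, user_prefs, prompt_user=False))  # True: released silently
```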
It is important for the P3P designers to expose the potential for the technology to be used in ways that undermine the goals of P3P, and to offer advice and positive examples for implementing P3P in a privacy-friendly way. The P3P designers have begun to draft an "Implementation Guide" [13] that will include guidelines for software implementers, Web sites, and users, as well as a set of guiding principles for the implementation and use of P3P.
In some sense, we are navigating through uncharted territory when developing information technologies like PICS and P3P that are designed to address a social problem without hard-coding in specific policies. It is not entirely clear what the best means are for designers of these technologies to carry out their ethical responsibilities.