Having personally spent a good part of seven years working on the P3P 1.0 specification, I can’t help but perk up my ears whenever I hear P3P mentioned. I still believe that P3P was, and still is, a really good idea. In hindsight, there are all sorts of technical details that should have been worked out differently, but the key ideas remain as compelling today as they were when first discussed in the mid 1990s. Indeed, with increasing frequency I have discussions with people who are trying to invent a new privacy solution that actually looks an awful lot like P3P.
P3P: CP="This is not a P3P policy! See http://www.google.com/support/accounts/bin/answer.py?hl=en&answer=15165 for more info."
Google’s approach is both clever and (with apologies to Magritte) surreal. The website transmits the code that means, “I am about to send you a P3P compact policy.” And yet the content of the policy says “This is not a P3P policy!” Thus, to IE this is a P3P policy, and yet to a human reader it is not. As P3P is computer-readable code, not designed for human readers, I argue that it is a P3P policy, and a deceptive one at that. The issue got a flurry of media attention last February, and then was quickly forgotten. The United States Federal Trade Commission and any of the 50 state attorneys general (or even a privacy commissioner in one of the many countries that now have privacy commissioners to enforce privacy laws) could go after Google or one of the thousands of other websites that have posted deceptive P3P policies. However, to date, no regulators have announced that they are investigating any website for a deceptive P3P policy. For their part, a number of companies and industry groups have said that circumventing IE’s privacy controls is an acceptable thing to do because they consider the P3P standard to be dead (even though Microsoft still makes active use of it in the latest version of its browser and W3C has not retired it).
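The distinction between a header that announces a compact policy and a value that actually is one can be made concrete. The following sketch checks whether a CP string is composed of compact-policy tokens; the token list here is a small illustrative subset of the P3P 1.0 vocabulary (the full grammar also allows suffixed forms such as IVAi), so this is a demonstration rather than a conformant validator.

```python
# A small illustrative subset of P3P 1.0 compact-policy tokens.
# The real vocabulary is larger and includes suffixed variants.
P3P_TOKENS = {
    "CAO", "DSP", "COR", "CUR", "ADM", "DEV", "TAI", "PSA", "PSD",
    "OUR", "IND", "NOI", "NID", "NON", "STA", "UNI", "COM", "NAV",
    "INT", "DEM", "CNT", "PRE", "LOC", "OTP", "SAM", "OTR",
}

def looks_like_compact_policy(cp: str) -> bool:
    """Return True if every whitespace-separated token in the CP
    string is a recognized compact-policy token."""
    tokens = cp.split()
    return bool(tokens) and all(t.upper() in P3P_TOKENS for t in tokens)

# A genuine compact policy is a run of tokens; Google's header value
# is English prose, so a parser rejects it even though IE accepts
# the header's mere presence.
print(looks_like_compact_policy("NOI DSP COR NID"))            # True
print(looks_like_compact_policy("This is not a P3P policy!"))  # False
```

The point the sketch illustrates is that IE’s cookie filtering keyed off the presence and token content of the CP header, not off any human-meaningful reading of it, which is what made the “this is not a policy” workaround effective.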
The problem with self-regulatory privacy standards seems to be that the industry considers them entirely optional, and no regulator has yet stepped in to say otherwise. Perhaps because no regulators have challenged those who contend that circumventing P3P is acceptable, some companies have already announced that they are going to bypass the Do Not Track controls in IE because they do not like Microsoft’s approach to default settings (see also my blog post about why I think the industry’s position on ignoring DNT in IE is wrong).
Necessary But Not Sufficient: Standardized Mechanisms for Privacy Notice and Choice
For several decades, “notice and choice” have been key principles of information privacy protection. Conceptions of privacy that involve the notion of individual control require a mechanism for individuals to understand where and under what conditions their personal information may flow and to exercise control over that flow. Thus, the various sets of fair information practice principles and the privacy laws based on these principles include requirements for providing notice about data practices and allowing individuals to exercise control over those practices. Privacy policies and opt-out mechanisms have become the predominant tools of notice and choice. However, a consensus has emerged that privacy policies are poor mechanisms for communicating with individuals about privacy. With growing recognition that website privacy policies are failing consumers, numerous suggestions are emerging for technical mechanisms that would provide privacy notices in machine-readable form, allowing web browsers, mobile devices, and other tools to act on them automatically and distill them into simple icons for end users. Other proposals are focused on allowing users to signal to websites, through their web browsers, that they do not wish to be tracked. These proposals may at first seem like fresh ideas that allow us to move beyond impenetrable privacy policies as the primary mechanisms of notice and choice. However, in many ways, the conversations around these new proposals are reminiscent of those that took place in the 1990s that led to the development of the Platform for Privacy Preferences (“P3P”) standard and several privacy seal programs.
In this paper I first review the idea behind notice and choice and user empowerment as privacy protection mechanisms. I then review lessons from the development and deployment of P3P as well as other efforts to empower users to protect their privacy. I begin with a brief introduction to P3P, and then discuss the privacy taxonomy associated with P3P. Next I discuss the notion of privacy nutrition labels and privacy icons and describe our demonstration of how P3P policies can be used to generate privacy nutrition labels automatically. I also discuss studies that examined the impact of salient privacy information on user behavior. I then turn to the problem of P3P policy adoption and enforcement, followed by problems with recent self-regulatory programs and privacy tools in the online behavioral advertising space. Finally, I argue that while standardized notice mechanisms may be necessary to move beyond impenetrable privacy policies, to date they have failed users, and they will continue to fail users unless they are accompanied by usable mechanisms for exercising meaningful choice and appropriate means of enforcement.