Helen Nissenbaum Shapes Administration's Thinking on Privacy
In a sign that the Obama Administration was taking consumer privacy seriously, the White House yesterday unveiled a blueprint for a "Privacy Bill of Rights" to protect individuals in a networked environment. The document's title, "We Can't Wait," reflects the sense in Washington that a set of guidelines cannot come soon enough.
We spoke with Professor Helen Nissenbaum of MCC about her scholarship on privacy and how it has contributed to the Administration's thinking on the subject.
Q: What is the Consumer Privacy Bill of Rights?
The Consumer Privacy Bill of Rights, unveiled yesterday, is a set of seven privacy principles, developed by the Obama Administration, that articulates clear expectations about how companies handle the collection and use of consumer information. The Privacy Bill of Rights incorporates a core of traditional principles, but it also includes new elements, particularly ones addressing growing concerns over unregulated and problematic practices enabled by online and mobile communications media. Recent problems include iPhone and Android apps that upload a phone’s entire contact list to app companies’ servers, surreptitious collection of personal information during transactions both online and off, and frequent alterations to company privacy policies.
The Administration's Consumer Privacy Bill of Rights will inform multi-stakeholder deliberation with the aim of producing detailed sets of sector-based best practices. What happens next will be controversial. A model favored by companies whose business involves capturing and using personal information is voluntary adoption of these best practices, enforceable by the consumer protection arm of the federal government (the Federal Trade Commission). Many privacy advocates and advocacy NGOs oppose voluntary adoption and instead favor the passage of legislation in which the Bill of Rights is embedded.
Q: Can you explain the role that you played in shaping this new privacy plan?
Last year, with two postdoctoral research fellows (Kenneth Farrall and Finn Brunton), I submitted a public comment in response to the Administration's request for comments on an earlier position paper; see: http://www.ntia.doc.gov/files/ntia/comments/101214614-0614-01/attachments/NissenbaumIPTFComments.pdf
In our comments, we referenced the theory of privacy as contextual integrity, which I advanced in my book, Privacy in Context: Technology, Policy, and the Integrity of Social Life (Stanford University Press, 2010). According to this theory, at the heart of privacy is the expectation that personal information will flow appropriately, which, in turn, is determined by the social context, the type of information, who is receiving it, and the constraints under which it is shared. Many of the companies that the Privacy Bill of Rights addresses are using information technologies and digital media in ways that have radically disrupted expected information flows. These flows have become so complex that the companies themselves are hardly able to understand them, let alone all of us directly affected by these practices.
The Consumer Privacy Bill of Rights cites my book, as well as our public comment from last year and includes Respect for Context as Principle Three.
I have argued that transparency alone will not safeguard consumer privacy, and I have urged policymakers to support substantive constraints on the flow of personal information, both online and off.
Q: Why should consumers care about online privacy rights?
I want to be clear that my work is not limited to thinking about privacy online; it addresses what I call privacy in a networked world. The online component of this network vastly magnifies the capacity to collect, use, and distribute information, and hence magnifies the privacy problem.
Many of the services we use, and the advertisements we see online, make heavy use of personal information. Facebook, all of Google’s services, Amazon, and many others draw on personal information, whether provided voluntarily or captured surreptitiously. These companies argue that since many of their services are free (that is, charge no dollars and cents), users should not begrudge them the use of information as an alternative currency: who we are, what we like, what we do, where we go, what we say, and so on.
The trouble is that this exchange of information for service is usually implicit and open-ended. In a brick-and-mortar shopping mall, there is no database recording when we arrive, what we purchase, what we look at, which stores we enter, how long we spend in the bathroom, and so forth. When we order something online, we may understand why a shipping address is needed. What we do not realize is that the company is using the data for other purposes: targeting advertising, charging us more for certain products because it has determined we would be willing to pay the higher price, and selling this information to companies whose business is building massive dossiers.
Imagine a company that is willing to tell us it charges for goods and services but unwilling to say how much. If information is the currency of a digital world, then this is the equivalent of the bargain we are being asked to accept.
There are many reasons to care about privacy: individuals can be harmed by inappropriate collection and distribution of information; our freedom and autonomy may be abridged; we may suffer unfair discrimination; and many social institutions, as fundamental as democracy, may be threatened if norms of privacy are not respected.
Q: How will this new plan affect the everyday online lives of consumers?
The Consumer Privacy Bill of Rights expresses a baseline of expectations for the collection and use of personal information online and off. Online, it will encourage ongoing efforts to provide a "Do Not Track" setting in web browsers to curb marketing efforts, similar to the existing "Do Not Call" list for telephone marketing.
The Bill of Rights underscores the rights consumers have to:
1) control what personal information is collected and how it is used;
2) easily understandable information about privacy practices;
3) expect that companies will respect the context in which personal information was provided and only use that information in ways that are consistent with this context;
4) secure and responsible handling of their personal information;
5) access and correct personal information;
6) narrowly focused collection of personal information that is retained only as long as it is needed; and
7) enforcement of these principles if a company should claim to abide by them but does not actually do so in practice.
Adding “respect for context” to the other historically significant principles provides an additional foothold for progress toward robust privacy protection for individuals and society.