By Bruce Umbaugh
Monday, July 29, 2002
DRAFT -- under review -- Please do not quote, cite, or distribute this version.
In this essay, I mean to bring together divergent strands in contemporary social thought: the ethics of care, and concern about privacy and technology. I maintain that any of us concerned to defend privacy, and any of us inclined to view moral matters from a care perspective, ought, in deciding what to do with particular technologies, to take into account our roles in establishing technological standards.
I want to argue--or perhaps only assert--that valuing privacy is ultimately grounded in care.
Most discourse about privacy misses this. In particular, the view that dominates discussions of privacy and technology is highly individualistic in orientation. Discussion of ethical issues having to do with the Net generally focuses strongly on rights, and gives little attention to why, or for what purpose, freedom of expression or privacy might be valuable. The prevailing discourse about privacy and technology is severely limited by this perspective.
What is privacy, if not some simple right or complex of rights? I believe that Jeffrey H. Reiman gets it approximately right in "Privacy, Intimacy, and Personhood":
Privacy is a social ritual by means of which an individual's moral title to his [sic] existence is conferred. Privacy is an essential part of the complex social practice by means of which the social group recognizes--and communicates to the individual--that his existence is his own. And this is a precondition of personhood. . . . .
. . . . [P]rivacy is necessary to the creation of selves out of human beings, since a self is at least in part a human being who regards his existence--his thoughts, his body, his actions--as his own.
On this account, privacy helps make persons out of prepersonal infants, privacy confirms and demonstrates respect for the personhood of already developed persons, and privacy is a crucial ingredient for further personal development.
Hence, privacy--moreover, institutions of privacy very like those we have now--is crucial to human flourishing.
Privacy and Care
What would be wrong with alternative institutions of privacy? Consider the examples Sissela Bok offers in Secrets. Imagine societies in which
- you and I can keep nothing secret, but others--perhaps a deity--can do so. (Neal Stephenson, in "Challenging the Assumptions," offered the analogy of the all-seeing eye atop the pyramid.)
- you and I can pierce all secrets and everyone else is transparent. (Think of the Ring of Gyges [Plato, Republic 359d-360c]--or the first example, with us at the eye.)
- no one can keep secrets. (Stephenson offers the model of the aquarium.)
- everyone can keep secrets at will. (The dark cloud.)
Bok asserts that each of these is inferior to our own social and metaphysical order. I agree. To illustrate how so, consider an example of a small thing: a surprise birthday party.
In three of the sorts of worlds Bok (and Stephenson) sketch, the alternative institutions of privacy entail that no one could give me a surprise birthday party. Among other things, a surprise birthday party requires someone knowing that it is my birthday, sharing that information and planning with others, all the while keeping the planning (if not the fact of my impending birthday) hidden from me. It requires some secrecy that is missing in that world. (In the first of those worlds, others could give one of us, but only one of us, a surprise party, I suppose.)
Moreover, in order to have the institution of birthday parties--in order for giving me a party to be as meaningful as it is in this world--knowledge of my date of birth has to be something that I share with some intimates but not with just everyone. I regulate "closeness" with others, in part, by sharing different sorts of information about myself, major and minor: what I did over the weekend, what happened in class, how I felt about it, my aspirations for my career or for my children, what I have been reading, my birth date, and so on. I treat some people as intimates, some as close friends, some as mere friends, some as coworkers, some as acquaintances, and so on. Depending on what sort of relationship I mean to have with them, I disclose more or less about various aspects of me. Acquaintances do not need to know my birth date, my students do not need to know my income, whereas my accountant needs to know both (though not, perhaps, what I've been reading). Those closest to me know more of what I think has to do with being me. Privacy, according to Reiman, is the complex social ritual or cluster of institutions by which others recognize our selves as our own, and it is achieved in part by granting control not only over our actual physical bodies but also over extensions of our selves, such as information about me, my diary, a computer file, a record about me in a database, and so on. [See Dennett's "The Origins of Selves."]
Something else is required for me to have a surprise birthday party. It is not enough for someone to have the special knowledge of my date of birth, to have the capacity to plan in secret with others, and so on. Someone also has to care about my birthday, expect others to care, expect that I would be touched by whatever expression of affection is shown through the giving of a surprise birthday party. (We hope that I wouldn't just be humiliated for others' entertainment.) Those different forms of caring are necessary for giving meaningful surprise birthday parties. Reiman argues that those different forms of care are only possible within an institution of respect for privacy much like the institutions we in fact have. Institutions of privacy make possible expressions of care such as surprise birthday parties or greetings, as well as the keeping of secrets.
If this is in the ballpark of being right, we need privacy for small things like throwing me a surprise birthday party, as well as, writ large, for things like forming personal plans, directing one's own life, making commitments to others, and generally for human flourishing. This account of privacy has a big, metaphysical aspect--if there is no privacy, then we have no selves--and also a smaller, many-faceted aspect concerning relationships--things like trust, meaningful expressions of care, reciprocity, and more depend on privacy as well. Both ways, I believe, institutions of privacy grounded in care are ingredients of human flourishing. The more individualist orientation that dominates the privacy debates is diminished by its neglect of the role of privacy in relationships.
Privacy and Technology
It will shock no one to read that technology (or, to be more precise, particular implementations of forms of technology) can help or hinder the cause of privacy. Consider, for example, that communications technologies can be used for surveillance that intrudes on persons' privacy, or to encrypt communications to enhance individual privacy.
Technologies that exhibit what are called "network effects" are of particular interest on this score. The classic illustration is the FAX machine. If I have the only FAX machine in the world, it isn't of much use and isn't very valuable. If you also have one (and it interoperates with mine), then mine is considerably more valuable. Indeed, the value of those first FAX machines increases dramatically as more people have them, since the pool of people with whom we can communicate using them gets so much bigger. Technologies that exhibit network effects become more valuable as more people adopt them. [In truth, the values may be transformed in a variety of ways by widespread adoption. For example, FAX machines become targets for mass advertising only after adoption passes some threshold level. The transformation of value is decidedly nonlinear, just as the value of things may be complex and things have value for a variety of reasons.]
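The growth in value that the FAX example describes can be sketched numerically. The following is a toy model only, built on an idealizing assumption not in the essay--that each pair of interoperating machines contributes one fixed unit of value--whereas the essay itself stresses that the real transformation of value is nonlinear and more complex:

```python
# Toy model of a network effect. The one-unit-per-pair assumption is an
# idealization introduced here for illustration, not the essay's claim.

def network_value(n: int) -> int:
    """Total value of a network of n interoperating machines,
    counting one unit per pair that can communicate."""
    return n * (n - 1) // 2

def value_of_joining(n: int) -> int:
    """Extra value created for everyone when an (n+1)th machine joins."""
    return network_value(n + 1) - network_value(n)

# One machine alone is worthless; each newcomer adds more value than
# the last did, which is why adoption can snowball toward a standard.
print(network_value(1))      # 0
print(network_value(2))      # 1
print(value_of_joining(10))  # 10 -- the 11th machine adds ten units
```

Even in this crude sketch, the incentive to join grows with every adopter, which is the mechanism behind the "lock-in" discussed next.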
In many instances, because of network effects, at some point a "standard" locks in, and it is difficult for a new entrant to gain a foothold. Arguably, this has happened not only with FAX machines, but with English spelling, QWERTY keyboards, Web browsers, and so on.
Significantly, if we construe "technology" broadly [as Joseph Pitt suggests in Thinking About Technology, e.g.], it includes not only "stuff" like computers and cameras, but also business practices and the like. Construed broadly, the consequences of choices about things we don't usually think of as technology come in for consideration under this analysis: responding to cold-call marketing pitches or to spam e-mail, freely giving personal information to marketers, and so on may amount to acceding to a practice as standard and to be expected.
Privacy, Technology, and Care
We all understand the theory that buying a thing gives its makers and marketers incentives to make more of them or charge a higher price. When it comes to technologies, the economics are importantly different. Choices of individual consumers have consequences beyond rewarding and encouraging manufacturers, for any domain that exhibits network effects.
My acquiring a FAX machine, say, gives a reason to have one to those who want to communicate with me. In the same way, my beginning to use e-mail gives some other people a reason to use e-mail, my using the World Wide Web gives some other people a reason to build Web sites, my building a Web site gives some other people a reason to begin to use the Web, and so on.
It is impossible, of course, just to do one of these things. To use e-mail, one must use some particular program or programs as an e-mail client, must have a provider of an Internet connection to transmit and receive messages, and so on. With e-mail, much is standardized, and so for the most part the particular technology I use does not give others incentive to use one particular e-mail client rather than another, say. [More recently, the rise of e-mail that uses Hypertext Markup Language (HTML) in the message body, perhaps, qualifies this claim.]
Matters are different when it comes to the World Wide Web and the "browser wars." If Microsoft Internet Explorer version 5.4, Netscape Communicator 4.7, and the Opera Web browser version 6.0 display pages differently (and they do), what will (or should) designers do? Should a designer create multiple versions of a Web site, and enable browser checking to ensure that the minority using something other than MSIE sees the site displayed properly? Or design for the most common set of features?
On one Web site that I maintain, 95% of visits in a recent month used some version of Microsoft Internet Explorer. Although only one-third of visits used version 6 or greater, that was still more than any other competitor. If I design for that browser, many visitors will not see the pages as I planned, but I can use the newest technological tricks, such as cascading style sheets. If others do likewise, then those characteristics become the standard. This is true whatever the characteristics may be, and without regard to whether they were among the reasons users and designers adopted the browser.
Although it may seem trivial to focus on the Web browser, it is important to note that for information and networked computing devices, including Web browsers, the characteristics that lock in as standards may well include characteristics with privacy implications. Examples include:
- support for encryption, of what strength, in what circumstances, revealing what information about use and the user,
- sharing of medical information beyond what's needed to complete an insurance transaction,
- baby monitors and security cameras that broadcast pictures and sound throughout the neighborhood.
Some technologists are acutely aware that design consequences can reach far. We find movements such as the "Viewable With Any Browser" campaign, which advocates designing Web pages that degrade gracefully so as to be accessible no matter what browser one uses. At events such as the Workshop on Freedom and Privacy by Design, the Workshop on Fair Use by Design, and the Workshop on Privacy Enhancing Technologies, policy makers, programmers, and other professionals working in the area have investigated the social implications of design choices.
Another example is John Gilmore's FreeS/WAN project. Gilmore's aim is to secure a significant proportion of Internet traffic against passive wiretapping, using inexpensive hardware and freely available software.
The idea is to deploy PC-based boxes that will sit between your local area network and the Internet (near your firewall or router) which opportunistically encrypt your Internet packets. Whenever you talk to a machine (like a Web site) that doesn't support encryption, your traffic goes out "in the clear" as usual. Whenever you connect to a machine that does support this kind of encryption, this box automatically encrypts all your packets, and decrypts the ones that come in. In effect, each packet gets put into an "envelope" on one side of the net, and removed from the envelope when it reaches its destination.
. . .
As each person installs one for their own use, it becomes more valuable for their neighbors to install one too, because there's one more person to use it with. The software automatically notices each newly installed box, and doesn't require a network administrator to reconfigure it. Instead of "virtual private networks" we have a "REAL private network"; we add privacy to the real network instead of layering a manually-maintained virtual network on top of an insecure Internet. ["FreeS/WAN: Securing the Internet against Wiretapping"]
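The opportunistic behavior Gilmore describes can be sketched in a few lines. This is a hypothetical illustration only: the function and the registry of encryption-capable peers are invented names, and the real FreeS/WAN discovered peers and negotiated keys through DNS records and IPsec, not anything like a Python lookup table.

```python
# Hypothetical sketch of opportunistic encryption, illustrating the
# logic of the FreeS/WAN passage. All names here are invented; the
# real system used IPsec, with peers discovered via DNS records.

ENCRYPTION_CAPABLE = {"192.0.2.7", "198.51.100.4"}  # peers known to support it

def send_packet(dest_ip: str, payload: bytes) -> tuple:
    """Wrap the packet in an 'envelope' when the destination is known
    to support encryption; otherwise send it in the clear, as before."""
    if dest_ip in ENCRYPTION_CAPABLE:
        # Stand-in for real encryption (IPsec ESP in FreeS/WAN).
        return "encrypted", b"ENVELOPE(" + payload + b")"
    return "cleartext", payload
```

Traffic to a capable peer is protected automatically, while traffic to everyone else still flows unchanged; so each new adopter makes the network more private without anyone having to reconfigure anything, which is exactly the network effect the quoted passage invokes.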
Many of the efforts of these technologists are clearly motivated by political and other principled commitments. For the most part, the rhetoric of the policy advocates and technological design professionals is one of "rights" rather than "care." But whether they know it or not, as the first part of this essay explained, such work makes sense only within a framework in which people care about interpersonal relationships and the well-being of others.
It seems to me, therefore, that anyone who values privacy, and who acts to protect or defend privacy, should take into account the concerns that a care perspective would raise, since that perspective is so deeply implicated in the evaluation and analysis of matters involving privacy. All those who care about privacy should think seriously about how care fits into a model for right action.
So, if you, to pick someone not entirely at random, value privacy, or act to defend privacy, then you should perhaps be motivated at least in part by care. It would make sense for you to adopt a care perspective in deliberating about privacy.
Because of network effects it is likely that you play a role in establishing standards even if you do not design technological devices, advocate for public policies regarding technology, or participate in the deliberations of bodies that adopt formal standards. If you are motivated by care, then the role you play in establishing standards should be a consideration in your choices whether to adopt a technology.
Phil Zimmermann, author of the PGP (or Pretty Good Privacy) encryption program, wrote a short piece called "Why Do You Need PGP?" In it, he addressed the question of adopting strong encryption even if you are a law-abiding citizen with nothing to hide:
Perhaps you think your e-mail is legitimate enough that encryption is unwarranted. If you really are a law-abiding citizen with nothing to hide, then why don't you always send your paper mail on postcards? Why not submit to drug testing on demand? Why require a warrant for police searches of your house? Are you trying to hide something? You must be a subversive or a drug dealer if you hide your mail inside envelopes. Or maybe a paranoid nut. Do law-abiding citizens have any need to encrypt their e-mail?
What if everyone believed that law-abiding citizens should use postcards for their mail? If some brave soul tried to assert his privacy by using an envelope for his mail, it would draw suspicion. Perhaps the authorities would open the mail to see what he's hiding. Fortunately, we don't live in that kind of world, because everyone protects most of their mail with an envelope. There's safety in numbers. Analogously, it would be nice if everyone routinely used encryption for all their e-mail, innocent or not, so that no one drew suspicion by asserting their e-mail privacy with encryption. Think of it as a form of solidarity.
If the analysis of privacy, technology, and care I've sketched here is approximately right, then the choices we make, each of us, each day, in adopting given forms of technology, offer the opportunity to exhibit care and stand in solidarity with others, or to fail to do so. Individual choice "means more" in this framework because it reaches farther. My bad decisions, analyzed individualistically, amount to "too bad for me" (plus, perhaps, some social costs of cleaning up my mess). But attending to these social aspects of technological choices means seeing that in some of my choices I am acting not just for myself but "for all humankind"--not in the manner of the philosophers' categorical imperative, but in the manner of the economists' network effects.
When we make these choices, as we do daily, we stand in for others, effectively making choices for them, and our ethics for deciding what we ought to do should be a fiduciary ethics.
[I thank Kate Parsons, as well as the participants in the Society for Philosophy in the Contemporary World session on "Fiduciary Ethics," at the meeting of the Central Division of the American Philosophical Association, April 2002, for their comments.]
Agre, Philip E. and Marc Rotenberg, eds. Technology and Privacy: the new landscape. Cambridge: The MIT Press, 1998.
Alderman, Ellen and Caroline Kennedy. The Right to Privacy. New York: Vintage, 1997.
Bok, Sissela. Secrets: on the ethics of concealment and revelation. New York: Vintage, 1983.
Bartow, Ann. "Electrifying Copyright Norms and Making Cyberspace More Like a Book." Proceedings of the Workshop on Fair Use By Design. San Francisco (2002).
Burk, Dan L. and Julie E. Cohen. "Fair Use Infrastructure for Copyright Management Systems." Proceedings of the Workshop on Fair Use By Design. San Francisco (2002).
Burstein, Cari D. "Viewable With Any Browser." Electronic document at http://www.anybrowser.org/campaign/ (2000).
Chaum, David. "A New Paradigm for Individuals in the Information Age." Proceedings from IEEE 5th Symposium on Security and Privacy. April 29-May 2, Oakland, California: 99-103 (1984).
Cranor, Lorrie Faith. "The Role of Privacy Advocates and Data Protection Authorities in the Design and Deployment of the Platform for Privacy Preferences." Proceedings of the Twelfth Conference on Computers, Freedom, and Privacy. San Francisco (2002).
Dennett, Daniel C. "The Origins of Selves," Cogito, 2 (1989), 163-173.
Etzioni, Amitai. The Limits of Privacy. New York: Basic Books, 1999.
Foner, Lenny. "Introduction to the Workshop on Freedom and Privacy by Design." Proceedings of the Tenth Conference on Computers, Freedom, & Privacy. Toronto (2000).
Garfinkel, Simson. Database Nation: the death of privacy in the 21st century. Sebastopol, California: O'Reilly, 2000.
Gilmore, John. "FreeS/WAN: Securing the Internet against Wiretapping." Circulated widely in electronic form, 1999, with an earlier version dating to circa 1995. Revised version online at http://www.freeswan.org/history.html (2002).
Godwin, Mike. Cyber Rights: defending free speech in the digital age. New York: Times Books, 1998.
Pitt, Joseph. Thinking About Technology: foundations of the philosophy of technology. New York: Seven Bridges Press, 1999.
Plato, The Republic and Other Works. B. Jowett, tr. Garden City, New York: Dolphin/Doubleday: 1960.
Proceedings of the Workshop on Privacy Enhancing Technologies (PET2002). Springer Lecture Notes in Computer Science (Forthcoming).
Reiman, Jeffrey H. "Privacy, Intimacy, and Personhood," Philosophy and Public Affairs 6 (1): 26-44 (1976).
Stephenson, Neal. "Challenging the Assumptions." Speech given at The Tenth Conference on Computers, Freedom, and Privacy, Toronto, April, 2000.
Zimmermann, Philip. "Why Do You Need PGP?" In The Official PGP User's Guide. Cambridge: The MIT Press, 1995: 5-7.