Prisoner's Dilemma

Grade schools in the US ignore philosophy as a subject. A few high schools give it brief mention (and then only to cover historically important people). Even in many colleges it remains an elective. The result is that many important subjects in philosophy are unknown to the general public, despite the fact that they are simple and can have a great influence on our everyday lives.

I've mentioned concepts like confirmation bias and the sunk cost fallacy before. These are common mistakes all people make in reasoning that can be avoided if we learn about them. These have aspects of psychology as well as philosophy. A more purely philosophical concept everyone should understand is the Prisoner's Dilemma. A typical example goes like this:

Two suspects are arrested for a robbery. Each is questioned separately by police and told this: Our evidence against the two of you for the robbery is thin, but we can give each of you a year in jail on a lesser weapons charge. If you confess and squeal on your buddy, he'll get five years and we'll let you walk. But if you both squeal, you each get three years.

                    A keeps silent            A confesses
B keeps silent      A: 1 year,  B: 1 year     A: free,    B: 5 years
B confesses         A: 5 years, B: free       A: 3 years, B: 3 years

Each suspect reasons like this: I can't talk to my buddy, and I have no control over what he does. If he clams up, I get a year if I do as well, but I go free if I confess. If he squeals, then I get five years if I stay silent and three if I confess. In both cases, I'm better off confessing. Both suspects reason this way, so both confess, and each gets three years in prison. But if they had both remained silent, they both would have gotten only a year. So the essence of the Prisoner's Dilemma is this: reasoning separately, both parties doing what is clearly in their best interest end up with a result that is worse than what they would have gotten if they had cooperated.
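To make the reasoning concrete, here is a minimal Python sketch (my own illustration, not part of the original argument) that checks each suspect's best reply using the years-in-jail payoffs from the table above; lower numbers are better.

```python
# payoff[(a_choice, b_choice)] = (years in jail for A, years in jail for B)
payoff = {
    ("silent", "silent"):   (1, 1),
    ("confess", "silent"):  (0, 5),
    ("silent", "confess"):  (5, 0),
    ("confess", "confess"): (3, 3),
}

def best_reply_for_A(b_choice):
    """Return A's better choice, given what B does (fewer years is better)."""
    return min(("silent", "confess"), key=lambda a: payoff[(a, b_choice)][0])

for b in ("silent", "confess"):
    print(f"If B chooses {b!r}, A should choose {best_reply_for_A(b)!r}")

# Both lines print "confess": confessing dominates, yet mutual confession
# (3, 3) is worse for both suspects than mutual silence (1, 1).
```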

Many situations in life mirror this. Take doping in sports, for example. Whether or not your opponent is doping is out of your control. If he is, you must dope to compete. If he isn't, doping won't lessen your chance of winning, so individually you are always better off doping. But if everyone in the sport is doping, the results will be roughly the same as if no one is doping, so as a group, it would be better if everyone didn't. Other situations like the tragedy of the commons can be modeled this way.

The way out of these dilemmas is to find some means to encourage or force cooperation. For example, a criminal gang might have a prior agreement—or strong social taboo—against snitching. Sports regulators might have strong rules against doping and do regular testing. It has even been suggested that one of the primary reasons people create governments is to have a third party to resolve such dilemmas between citizens.

Like any simplified mathematical model of complex human interaction, there is a danger of applying it to situations that don't quite match. In a recent episode of the Philosophy Bites podcast, Jeff McMahan suggests modeling aspects of the gun control debate as a prisoner's dilemma: for example, the interaction between a burglar and homeowner. Each reasons that if the opponent is armed, he is certainly safer being armed himself, and if his opponent is unarmed, being armed doesn't hurt, so it is better to be armed in each case. But collectively, they are safer if both are unarmed.

I don't think this particular argument holds water, even ignoring all the other aspects of a very complicated issue. First, the “payoffs” (that's mathematical jargon for the relative value of the results to each of the participants) are not symmetrical. In the “both unarmed” condition, the winner of the interaction is likely to be whoever is bigger, stronger, or the more experienced fighter—probably the criminal. The “both armed” condition raises the stakes and the danger for both, but it also equalizes them, so it is likely to benefit the homeowner relative to the burglar. Second, it assumes that both the criminal and the homeowner place equal value on their own safety. This is a psychological question. Perhaps the burglar is a sociopath who values violence for its own sake. It also makes the assumption that the “both unarmed” condition is something that's possible to achieve in real life.

Another way out of the prisoner's dilemma is available when situations are repeated more than once: reciprocity. When we have multiple interactions with people, we can gain a reputation for being cooperative, making others more likely to cooperate with us. Robert Axelrod's classic experiments along these lines are explored in his book The Evolution of Cooperation. Reciprocity can also explain how our moral sense (and things like our ability to recognize faces) can evolve from the essentially selfish process of natural selection.
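As a rough illustration of how reciprocity changes the game, here is a small simulation in the spirit of Axelrod's tournaments (a toy version of my own, not his actual code): tit for tat paired against an always-defect strategy and against itself, using the conventional per-round scores.

```python
# Per-round points: 3 each for mutual cooperation, 1 each for mutual
# defection, 5 for defecting against a cooperator (who gets 0).
SCORES = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(opponent_history):
    # Cooperate first, then copy whatever the opponent did last round.
    return "C" if not opponent_history else opponent_history[-1]

def always_defect(opponent_history):
    return "D"

def play(strategy1, strategy2, rounds=100):
    seen_by_1, seen_by_2 = [], []        # each player's record of the other
    total1 = total2 = 0
    for _ in range(rounds):
        m1, m2 = strategy1(seen_by_1), strategy2(seen_by_2)
        s1, s2 = SCORES[(m1, m2)]
        total1, total2 = total1 + s1, total2 + s2
        seen_by_1.append(m2)
        seen_by_2.append(m1)
    return total1, total2

print(play(tit_for_tat, always_defect))  # (99, 104): defection gains little
print(play(tit_for_tat, tit_for_tat))    # (300, 300): cooperators do far better
```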

This concept is simple, and recognizing it in real life situations can make such a difference in life that it should be taught to everyone in grade school.

Predictably Irrational

Dan Ariely's Predictably Irrational should be required reading in all English-speaking high schools. Though he is an academic of impeccable credentials (including a 2008 Ig Nobel prize), he is also an entertaining writer—a rare combination. The book details some of his important and cutting-edge research in the emerging field of behavioral economics, but its writing is accessible, clear, funny, and effective.

His examples resonate with the ordinary choices we all make in life: buying magazines, dating, vacationing. He shows us the mistakes we all make, but not in a way that is condescending or cynical. Indeed, his intent is clearly to show us how we can avoid making those mistakes even while he shows us how universal they are. His advice is not that of a college professor or a parent, but more like a best friend telling you “Wow, I just did something really stupid—don't do that.”

Indeed, its very lucidity might be a risk: you might be tempted to think “Well, of course, how obvious” after he explains some aspect of human behavior, and not realize that his discoveries were not obvious, and that they are backed up by solid experimental evidence, not just platitudes.

You can get a taste of his style from Youtube, but the details in the book are worth the time spent. While it is likely that academics will continue to cite the groundbreaking 1974 Kahneman and Tversky paper as the founding work of the field, Ariely's book is likely to be the one most discussed by the rest of us, and it will serve you well to be familiar with it when related subjects come up in conversation.

Show me the mouse

I once attended a scientific conference where several of the speakers were doing research into longevity. Each had a promising area of research. We have learned a lot about aging in recent years and know many of the biochemical changes that take place. There are drugs and other interventions (like calorie restriction) that show promise in slowing, stopping, or even reversing some of those changes. The speakers explained their work and why it had promise, then invited questions (as is standard practice in scientific conferences).

The first question for every speaker was usually the same: Where's your 5-year-old mouse? Mice are commonly used in medical research for many reasons. They are easy to breed and keep, their biochemistry is reasonably similar to humans (and most other mammals), and their life cycle is short and fast. Testing longevity drugs on humans would take decades. Mice only live a year or two. So if someone discovered a drug that could significantly extend human lifetimes, it is likely that it would be tested on mice first. If a drug really was the breakthrough we hope for, pictures of 5-year-old mice would be on news shows and websites everywhere.

[Photo: a laboratory mouse]

None of the researchers was able to show a 5-year-old mouse. Some did have good results with lower animals like flies, some had mice that were measurably healthier in their later months than controls, but no one had the holy grail. But this is not a story of failure: research continues, new things are being tried, and new things are being learned and shared at conferences. My point is that science is successful precisely because everyone knows what the hard questions are and can't duck them.

Contrast science to, say, advertising. A commercial for vitamins during this year's Super Bowl touted their benefits by saying “Centrum Silver was part of the recently published landmark study evaluating the long-term benefits of multivitamins.” This statement still appears on their website, verbatim. They can say these things safely knowing that no one will ask the obvious question—so what were the results of the study? Even the website doesn't link to the study, for good reason. The study showed no long-term benefits from multivitamins. But advertisers aren't scientists. They can give their audience carefully crafted, misleading—but totally true—statements while ducking the obvious questions.

Many people think science is about studying lots of facts discovered by people many years ago. That's certainly part of it, but far more important than yesterday's answers is learning what the right questions are.

Advocates for a product or a cause can make a very eloquent case, even if they're wrong. This is because they don't have to face hard questions. A book, a movie, or a TV documentary can all make you believe nonsense because you can't talk back. And if an idea supports our ideology, or benefits us, we are more likely to believe it without questioning, even when we can ask questions. Good scientists know this, and are trained to be most suspicious of things they would like to believe, like the idea that they could live longer.

This habit of being overly credulous or optimistic about things we would like to be true is called confirmation bias. It's another habit of bad poker players that we can take advantage of. They want to call, so they convince themselves that their opponent is bluffing. They want to fold, so they convince themselves that their opponent has the nuts.

Be skeptical. Especially of yourself, and what you want to be true. Don't ever forget to ask yourself the tough questions. Even if you're telling me what I want to hear, I'm going to tell you to shut up and show me the 5-year-old mouse.

Sunk costs

There is a fable about a woman who makes a pot of tea and mistakenly puts salt in it instead of sugar. She thinks, “Well, maybe if I put in extra cream it will be better”, but it still tastes awful. Then she thinks “Pepper is a very strong flavor, maybe it will mask the salt”, so she adds pepper. But she doesn't like the peppery tea at all. “Perhaps some aromatic herbs would rescue it”, she thinks, and adds herbs. But this too just makes the tea worse, as do the other things she tries. It is only when she finally asks a neighbor for advice that the neighbor says “Why not just throw it out and make a fresh pot?”

There are many mistakes in reasoning people make regularly. One of the most dangerous is called the sunk cost fallacy. This is our unwillingness to abandon things we have invested our time, effort, money, and ego into even after we realize they will be of no use to us in the future. We don't want to leave a job or a relationship we've been in for years even if it's a bad one. We'd rather go out of our way to find justifications for staying because we're invested, even if it's likely that a new one would be better. We hang on to a stock or a house that's losing money, even if selling for a loss to invest in something better would make financial sense. We cling to cherished beliefs in the face of evidence against them.

At the poker table, this happens when a player has invested a significant amount of money in a hand, and believes that this should affect his decision to keep it or fold it later in the hand. Every decision you make in a poker hand should be made based on what you may win or lose after that decision. The amount of money in the pot certainly affects that. How much of that money you put there should not. Once you've put money in the pot, it's not yours anymore—it's already gone. It should have no more effect on your future decisions than the money you spent on dinner last Friday. If you know you can't win the pot now, your most profitable play is to fold.
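To see why the sunk money drops out, here is a hedged sketch with invented numbers: the expected value of a call depends only on the size of the pot, the price of the call, and your chance of winning. Your own past contribution never appears in the formula.

```python
def call_ev(pot, cost_to_call, win_probability):
    """Expected profit of calling, measured from this decision forward."""
    return win_probability * pot - (1 - win_probability) * cost_to_call

# Two players face the same $100 pot, the same $20 call, and the same 10%
# chance of winning. One put $5 into that pot earlier, the other put $60.
# The calculation is identical for both:
print(round(call_ev(pot=100, cost_to_call=20, win_probability=0.10), 2))  # -8.0

# A negative expected value means folding is the better play, no matter how
# much of the pot used to be yours.
```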

But many people don't. They're invested, “pot stuck”, and they hang on. This is good for me—I can bet more with my best hands against such people knowing they will call more than they should. Often, they will even say out loud, “I knew you had me, but I had to call.” I love hearing that.

If you've ever heard yourself say something like “I've put too much in this pot not to call”, maybe you should think twice. And maybe you should take that same second look at your job, your relationship, your investments, and even your religious and political beliefs. Are they really serving you, or are you just hanging on because you're invested?

Transparency and privacy

A note for context: I presented a much-abbreviated version of this talk at an Extropian conference in San Jose on June 17, 2001 (with David Brin in the audience, just to make it a little more nerve-wracking). I have reworked it a bit to better suit a general audience. The references to Hal Varian, Harvey Newstrom, Tadd Hogg, and Mark Miller refer to earlier talks in that conference. I also updated it a bit. This was before Facebook, before the iPhone, before Youtube. I think those things have strengthened my position.

In The Transparent Society, David Brin envisions a world of universal surveillance by small ubiquitous cameras, recorders, and other electronic monitoring and profiling techniques that deprive citizens of much of the privacy we enjoy today. But far from being the Orwellian nightmare a knee-jerk libertarian reaction might suppose, Brin hails this as a positive step in the evolution of our culture and looks forward to it as a world of unprecedented freedom. Though we disagree about other political issues, I happen to agree with him on this one.

This is a minority opinion, even among those with whom I share much political and ethical common ground. The Electronic Frontier Foundation, for example, which I heartily endorse for their great work supporting free speech and the rights of consumers on the Internet, also advocates for privacy protections well beyond what I would consider necessary or reasonable.

Definitions of Privacy

The word “privacy” is used to mean different things, so I think it is important to begin by outlining those. There are at least three significant meanings I have found:

  1. The ability (technical and legal) to choose for yourself with whom you will associate and how. We live in individual “private” houses with roommates and family of our choosing; we form work groups and social groups of our own choosing; we do not in general want others to impose themselves upon our lives in ways that we can't control.
  2. The ability (technical and legal) to keep secrets. We want the ability to have information that we don't reveal to others, or that we reveal selectively. Likewise, we want to be able to give and keep confidences with others.
  3. The ability to prevent others from distributing information about you.

The first of these is pretty uncontroversial. It is generally accepted in our culture that we have a right to choose our own associates, and we have technologies to accomplish it: walls and fences, caller ID and blocking, transportation for meeting our chosen friends in chosen places, and online tools like spam filters. The law recognizes invasions of this kind of privacy as trespassing, harassment, stalking, and other crimes.

The second of these is also widely supported, at least to the extent that most people believe one should have the right and power to keep secrets. I agree, and in projects such as Freenet I have fought to create that technological power. These technologies make it possible, for example, for political dissidents to exchange information freely and anonymously, or to publish criticism of powerful institutions without fear of reprisal. The ability to keep secrets is not absolute in our laws: an American court may compel testimony on pain of imprisonment, with narrow exceptions carved out for lawyers, doctors, and clergy. I would argue that all of us should have this right as well. I will describe here many reasons one might choose not to keep secrets that today are routinely kept, but I nonetheless support everyone's right to keep them.

It is the last of these that is problematic, and I will argue against it in general. I do not believe we have a right to control what another person—or group of people, such as a corporation—does with information they have been willingly given. Once we have given someone information, and it leaves our brain through our voice or our fingers on a keyboard, it is no longer ours, even if it is information about us. Controlling such information requires us to impose upon the liberty of others, and allows others to impose on our liberty.

Information enables progress in many ways, and it is not possible for any person to imagine the ways in which information might be used when it can be used freely. Scientific research and business, in particular, thrive on information. The more these institutions know about us, the more they are empowered to create things that benefit us. Of course, as with any technology, it also empowers them to create things that harm us. But it is a mistake to use that fear to justify restricting technology, business, or the free flow of information. Any restrictions we impose today are less likely to protect us than they are to hamper future good uses that we could not have imagined.

Even if we assume (which I do not) that preventing others from distributing information about us is a benefit to us, enshrining that into law becomes an economic entitlement that distorts the marketplace. It forces business to adopt methods and models that assume such information is “owned” by someone else, and therefore requires them to pay more for it than they would without the laws. This will make some business models and research proposals impractical that might otherwise have been a great benefit to us. Credit reporting is a classic example. We may not like the fact that businesses share information about our purchases and our finances, but without that sharing of information, the economic benefits of credit would not be available to us, or would be significantly more expensive. In short, if this kind of privacy is something we want, then we should have to pay for it ourselves, and develop technologies to make it cheaper.

Hal Varian brought up a wonderful example. When Blockbuster Video first opened, their business model was to purchase videotapes of movies from the producer and rent them to the consumer. It generally worked, but the high cost of the tapes (required because the producer got nothing but the up-front cost and had no control over rentals) forced them to buy very few of them, and popular movies were constantly unavailable at the time when they were most in demand—clearly a bad situation for consumers and sellers. Blockbuster's solution was to install a nationwide network of cash registers to track exactly when each tape was rented, with this information made available to the movie producers. This enabled them to change to a different contract with the producers, whereby instead of buying a small number of tapes at a high price, they bought large numbers cheaply with the agreement that they would forward part of the rental fee to the producer. Blockbuster could now guarantee that popular movies would be available, more people rented them when they were in demand, pleasing the consumers, and both Blockbuster and the producers made more money. This win-win situation was only made possible by the fact that information about when each tape was rented—information that some people might want to restrict in the name of privacy—was made freely available. [Update: Yes, I realize this example is dated. Today I'd probably talk about how services like Facebook, Google, and many others remain free because they use information about you to target advertising, which also makes the advertisements less annoying.]

Motivations for privacy

Perceived liberty

We all live with powerful and intrusive governments that interfere in the lives of their subjects in many ways. Even in the United States, arguably one of the freest societies in history, most states will still put you in jail for smoking marijuana, hiring a prostitute, gambling, and many other peaceful activities that have no legitimate reason to be crimes. Most states still criminalize certain kinds of sex between consenting adults.

One way to minimize this intrusion into our lives is to jealously guard privacy, to ensure that the government doesn't know what you're smoking or who you're sleeping with. They can't prosecute you if they can't find you. But this is a band-aid approach. It doesn't solve the real problem that the government can haul you away for something you have a basic human right to do, and in fact it may even hamper efforts to solve the real problem by allowing people to tolerate bad laws as long as they have the privacy to avoid them. [Update: as of 2013, only Colorado explicitly allows recreational marijuana use, though of course it continues in private everywhere.]

We also use privacy to avoid discrimination by individuals. There are still many employers who would prefer not to hire gays or ethnic minorities or others they don't like. We may not want family or friends to know about some of our activities they might disapprove of. Here too privacy is a band-aid that masks the underlying problem—that many people dislike or disapprove of others for irrational reasons.

The ability to keep secrets does have some very pragmatic uses that are harder to argue against, and I do applaud those who create technological solutions for them. These include major uses such as national security (though this excuse is much overused) and confidential confession, and minor ones like the ability to play multiplayer games over the Internet without the possibility of cheating. [Update: This technology existed back then and was well known, yet there is not a single online poker site that protects its users in this way.] Because these uses do exist, and because of the necessary short-term use of minimizing government intrusions, I support efforts at technological solutions like cryptography that enable keeping of secrets, even though I believe these methods are often used in ways that may not support our goals.

Security

My third sense of privacy—the desire to prevent others from spreading information about us—is usually seen as a means to physical security. We believe that if we make it harder for evildoers to find us or learn about us, they will be less able to harm us. Typical examples are laws that forbid companies to give out personal information about you (such as your Social Security number and address) without your permission. They are often hailed as a way to curb identity theft.

This sense of security is unjustified. Computer networking engineers call this security through obscurity, an epithet used to describe lackadaisical measures that fail to properly secure a computer system. “Don't worry,” an IT manager says, “we know the server can be remotely accessed, but no one knows it's here, so we're OK”. “Yes, the users can run the disk formatting program, but most of them don't know how, so it's not a problem.” As every security consultant knows, those two systems will be broken into and erased in short order. The proper way to secure a computer network—and a human being—is to make sure that systems are in place that make it impossible even for someone with detailed information to take advantage. The solution to identity theft is not to make it harder for the criminal to get basic information about you that is revealed in the ordinary affairs of business; the solution is to make sure that someone who does get the information still can't use it to defraud you. My bank account number is secure in this way. I could give you the number and you wouldn't be able to impersonate me or withdraw my money unless you could pass further security checks like forging my signature or guessing the PIN on my ATM card.
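Here is a minimal sketch of that distinction, with made-up account data: the account number is treated as public, so obscurity contributes nothing, and security comes entirely from a check that the information alone cannot pass (in this sketch, a salted hash of a PIN only the owner knows).

```python
import hashlib, hmac, os

def hash_pin(pin, salt):
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 100_000)

salt = os.urandom(16)
accounts = {
    # Hypothetical account: the number is not a secret, the PIN hash is.
    "1234-5678": {"balance": 500, "salt": salt, "pin_hash": hash_pin("4921", salt)},
}

def withdraw(account_number, pin, amount):
    acct = accounts[account_number]          # knowing the number gets you here...
    candidate = hash_pin(pin, acct["salt"])
    if not hmac.compare_digest(candidate, acct["pin_hash"]):
        raise PermissionError("PIN check failed")   # ...but no further
    if amount > acct["balance"]:
        raise ValueError("insufficient funds")
    acct["balance"] -= amount
    return acct["balance"]

print(withdraw("1234-5678", "4921", 100))    # 400: the owner succeeds
# withdraw("1234-5678", "0000", 100)         # raises: the number alone is useless
```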

New cryptography-based technological measures such as those mentioned by Tadd Hogg, like zero-knowledge systems and capability security, are far superior to legal systems for accomplishing some of these goals. Credit card purchases over the net are well-secured by cryptography, and can be further protected by using single-use credit card numbers. With these systems it will be possible to give a merchant the ability to send you a package without actually revealing your physical address to him. [Update: This idea never caught on. Amazon knows my real address.] Many mail order and Internet sales in Europe already use a system whereby a merchant is given the ability to withdraw a certain amount from your credit card account without your having to reveal your card number. A lender can be given the ability to verify your employment history and income without revealing the details. Again, having laws that cover up these problems may actually delay implementation of the good technological solutions.
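As a sketch of the capability idea, here is an invented interface (not any real payment system's API) in which the cardholder mints a single-use token bound to a spending cap, and a merchant's charge is approved without the card number ever being disclosed.

```python
import secrets

class CardAccount:
    def __init__(self, card_number, balance):
        self._card_number = card_number      # never leaves this object
        self._balance = balance
        self._tokens = {}                    # token -> remaining allowance

    def issue_token(self, max_amount):
        """Mint a single-use capability good for at most max_amount."""
        token = secrets.token_urlsafe(16)
        self._tokens[token] = max_amount
        return token

    def redeem(self, token, amount):
        """Called by the payment network on the merchant's behalf."""
        allowance = self._tokens.pop(token, None)    # single use: pop it
        if allowance is None or amount > allowance or amount > self._balance:
            raise PermissionError("charge refused")
        self._balance -= amount
        return "charge approved"

account = CardAccount("4111 1111 1111 1111", balance=1000)
token = account.issue_token(max_amount=50)
print(account.redeem(token, 42))   # approved; the merchant saw only the token
# account.redeem(token, 42)        # a second use fails: the capability is spent
```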

Legal protections have two flaws: they generally don't physically prevent security breaches but only punish offenders after the fact, and they only apply to humans. More and more, the information that we think is private is being collected and shared by computer software that may grow beyond the control and oversight of even those who create it. Electrons are constrained only by the laws of physics, not the laws of your local legislature. Hiding your information from other humans may not be enough.

Benefits of transparency

Before I describe the benefits of a more transparent society, let me first say that I fully endorse the sentiments of Benjamin Franklin: “They that give up essential liberty to obtain a little temporary safety...deserve neither safety nor liberty.” I do not propose that we give up the liberties implied by the first two definitions of privacy above—the liberty to choose our associates and the liberty to keep secrets. But there are good reasons why we might choose not to exercise some of those liberties in many situations, and the third use of the word privacy is not a liberty for ourselves at all. It is a desire to restrict the liberty of others to share information in ways we don't approve of.

I have deliberately avoided using the word “freedom” here because it too is notoriously ambiguous and is used to mean both the liberty to choose and act for oneself (e.g., “freedom of speech”), and also to mean the absence of some condition we find undesirable (e.g., “freedom from want”). It is the former that Franklin is talking about and that I am talking about when I say that my third definition of privacy is not an example of it. We may well have reasons to desire that others not give personal information about us to third parties, but if we do, it is our job to figure out how not to give them the information in the first place rather than shackling them with restrictions on what they can do with it after we have given it. Otherwise we are sacrificing their liberty for our safety, which I believe Franklin's admonition applies to as well.

There are benefits to choosing to reveal over choosing to conceal. The above-mentioned benefit of enabling new forms of commerce is one of these. Security and crime prevention is another. Examples include day care centers with web cameras where parents can view their children at any time during the day to ensure their well-being; cameras in public places to deter crime (the effectiveness of which is still debatable); web sites like beenverified.com where chatroom acquaintances can get background information on the people they meet in cyberspace before risking real-life encounters; and shopping sites like Amazon where information about my book purchases and those of others allows them to recommend things I might like but haven't found for myself. I would answer Eliezer Yudkowsky's objection that your children may not want to be in such a day care center by pointing out that the reason they are in day care in the first place is that they are not yet competent to take responsibility for their own lives and safety. My mother understands the dynamics involved and insists that if she ever has to face going to an elder care center, she wants one with webcams that I and my sister can monitor.

I may play low-stakes poker at home among friends in private, but when I want to play for serious money, I do what most serious poker players do and head for a public casino. A casino typically rakes 25-50 cents per hand, per player, for the privilege of using their tables. Why is it worth that cost? Because they have trained dealers, floormen to watch the game and arbitrate disputes, and cameras in the ceiling to catch cheaters. They go to great lengths to ensure that the customer gets a fair game, because they know that's what keeps the customer coming back.

[Photo: Reagan at the Brandenburg Gate, 1987.]

The impact of the video camera on democracy and freedom cannot be overemphasized. Ronald Reagan didn't bring down the Berlin Wall, Sony did. Germany reunited because the East could no longer hide the prosperity and freedom of the West from its citizens, who became increasingly resentful. Oppressed people around the world bear witness to their conditions with video. Video of Tiananmen Square moved the world, and news went out over the Internet even after China closed all the official media. We know about the LAPD because someone filmed Rodney King's beating. It is no accident that the framers of the Constitution protected the right to a public trial, and that we reacted to President Nixon by passing the Freedom of Information Act. Sunlight is the best disinfectant. [Update: Of course I never could have predicted Youtube. Or Twitter and the Arab Spring.]

Certainly there are times when more information can seem to be a problem. Presidents and other public figures have a hard time hiding aspects of their personal lives. While some might lament this (especially the politicians), I think the nation as a whole is better for it. The American people demonstrated in the 1996 election that they are capable of choosing a candidate in spite of personal peccadilloes, and while we might argue with the choice they made, it does demonstrate that they are sophisticated enough to see through tabloid headlines and make decisions for themselves. In the 1930s, President Roosevelt was able to hide from the American people the fact that polio had confined him to a wheelchair, which he would certainly not be able to do today. Is it an accident that cultural attitudes about the disabled have improved only in recent years? We as a culture have grown up. We now know more about the personal lives of others, and this has led to understanding and tolerance.

Scientific research lives and dies on information. A great example is the Hoffman LaRoche/DeCODE Genetics study in Iceland. Iceland's population is a geneticist's dream: it is almost entirely descended from a small group of explorers, and there has been almost no immigration, so it is genetically isolated. Icelanders also have a passion for genealogy, and keep detailed records going back to the time of the Vikings. Because of this, DeCODE proposed to get genetic samples and detailed medical records and family histories from nearly everyone in the country, which they could then use to isolate genes for the genetic diseases that often plague Icelanders (as well as others). The country held a public vote on the proposal, and approved a plan to allow DeCODE to collect the data and use it for this research—and even sell it to others for their research—for 12 years. While this proposal has been criticized for being deceptively marketed to the Icelanders and for overly benefiting a few private corporations, the value of the data for research is rarely questioned, and it is reaping benefits. In a more open society less concerned with privacy and secrecy, not only would the data be available to everyone, but the benefits of the research would be as well.

Information is the fuel of the social machine, and trust is the lubricant. As Mark Miller has shown us, trust is the essential ingredient that enables business. It enables personal relationships. And yes, it enables betrayal. Hiding information about ourselves may shield us from betrayal, but it also limits our interactions. Greater exchange of information by all sides can both protect us from betrayal and facilitate our interactions. In short, don't fear the camera pointed at you; rejoice in it, and make sure you know who is controlling it by having your own cameras too.

Law and custom

Current privacy law in the United States is odd, and often depends on the circular definition of “reasonable expectation”. You have a right to privacy if you reasonably expect that you do, which basically means that a judge somewhere has ruled that it is reasonable to expect it. For example, you have an expectation of privacy in land-line telephone conversations, but not cellular phones. Whether one party has the right to record the other without consent varies from state to state, but the police need a warrant to tap your land line. Anachronisms of law make it illegal to audio record people in many situations where it would be legal to videotape them. Voyeur cams, whereby men secretly record the view looking up women's skirts, are surprisingly legal in many places. The Supreme Court has issued dozens of confusing rulings: police can be issued no-knock warrants on the say-so of an anonymous informant, but they can't use a thermal imaging device to look for pot plants in your home.

The upshot is that law generally follows custom, but it follows at a distance. Social and technological progress moves much faster than legislatures, so it is unlikely that the law will ever fully comport with the culture's sense of what should be private and what shouldn't be. That's to be expected when the culture itself is a dynamic thing. It is important to realize that if too many of our less rational desires for privacy are encoded into law, they are very difficult to remove even after we discover conflicts between privacy and the free exchange of information, because people will come to expect them, and the law will recognize them, for a long time even after the problems are discovered.

In the absence of effective law, voluntary restraint works remarkably well. People generally do respect what they perceive to be the privacy of others, with the exception, perhaps, of a few paparazzi. Voyeur cams are very rare despite being legal, and many web hosting services refuse to give them connectivity. Major newspapers have a long-standing policy of not outing gay celebrities like Jodie Foster and Rosie O'Donnell, even though their secrets are hardly news to anyone who's paying attention, because they know these celebrities still personally value having those secrets kept in some markets, and they agree to keep them. [Update: O'Donnell has since come out, and married. Foster is, well, an even more open secret.]

Another social custom that has arisen in the Internet age is for consumers to protest sites that ask for more information than the consumer thinks they need. They protest by simply ignoring those sites in favor of others, and by the expedient of lying, helped by sites like bugmenot.com. Orwell predicted this response as well when he talked about “jamming”, or flooding the watchers with more useless information than they could handle. I imagine other creative techniques will evolve as well. [Update: Sites like Facebook and Google are constantly increasing their privacy controls not because of legislation but because users demand it.]

The technological future

Harvey Newstrom did a marvelous job showing us just how much companies want to find information about you and the vast array of tricks they have for doing it. It's really quite remarkable considering the lengths to which people go to try to prevent it. When I was working on the PNG image file format, we had to include a section on security issues. Most of us thought this was rather silly since it's just a method of representing images as bits, but the standardization process required it, so it's there. Every Internet standards document has that section, and all are carefully reviewed by many people before becoming a standard, so security issues on the Internet are heavily scrutinized. Despite that, people can and do find breaches and companies do get personal information.

The ability of the Internet to collect and correlate personal information will get better. Companies already cooperate to compile shared profiles of you as a consumer. Databases already exchange information without detailed human direction. As the Internet gets faster and bigger, it will collect more information from more sources and correlate them faster.
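A toy example with invented data shows how little human direction such correlation needs: two independently collected datasets, joined on a shared identifier, yield a combined profile that neither source held on its own.

```python
# Hypothetical records from two unrelated collectors, keyed by email address.
purchases = {"alice@example.com": ["running shoes", "protein powder"]}
locations = {"alice@example.com": ["gym on 5th Ave", "airport, June 3"]}

profiles = {}
for dataset in (purchases, locations):
    for key, records in dataset.items():
        profiles.setdefault(key, []).extend(records)   # merge on the shared key

print(profiles["alice@example.com"])
# ['running shoes', 'protein powder', 'gym on 5th Ave', 'airport, June 3']
```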

But that's only the tip of the iceberg compared to what will be possible in the near future. Cameras and other recording devices are getting smaller and cheaper every year. Today we carry them in our pockets and mount them in teddy bears. Soon they will be flying about like birds, then buzzing about like insects, and then floating in the air like dust. They will be everywhere, observing everyone, all the time, sending their information to the Internet to be collected and correlated with other information sources.

The wearable PC you use to surf the net while going about your daily life will enable you to get information wherever you are, but it will also tell the net about you. It will have a GPS receiver so you can get maps to help you get where you're going, but that will also tell the net where you are (and even how fast you're moving—yes, there have been speeders caught by their own GPS devices). These computers will be embedded in the walls of your house, the instrument panel of your car, even your clothing. [Update: No, I didn't predict that this portable computer would also be your phone, and that it would also save all your texts and voicemails.]

Trying to stem this flood of information and technology with legal protections and cultural norms is ineffective now, and will become absurd sooner than most people think. A far better approach is to recognize that these systems are coming and to create cultural attitudes and legal systems for dealing with them, rather than pretending to prevent them.

First, let us understand who wants the information and why. Companies want to sell us things. This is not particularly sinister as motives go, and public indignation at corporate abuse is a very effective means of curbing it. Companies are so concerned about appearing to violate social norms of privacy that they hire security consultants with contracts that forbid them to reveal what they find. Public opinion is a powerful force; just ask any tobacco company. The more serious threat is government. They want to spy on us to catch us doing things they disapprove of. Sometimes that leads them to catch real criminals, but far too often it lets them harass innocent citizens. But even here, citizens turning the lights and cameras back at the government has done far more to stem these abuses than privacy protection. The freedom of supporters of more open marijuana laws to speak out has led to more relaxed laws in many states, over the loud objections of politicians who are behind the times. Let us strive to ensure that we are free to do those things we now want to hide, rather than continuing to hide them.

An ethical argument

In another essay of mine, Moral realism, I explain how accurate knowledge of the world is essential regardless of your goals or values. No matter where you're going, you can't get there unless your map matches the territory.

I argue, therefore, that the more correct information is known about the world and about me, the more likely it is that other people's actions toward me will be rational. This is borne out by many experiments that show exposure to foreign ideas tends to increase tolerance and reduce bigotry. The more we know about others, the better enabled we are to act for their benefit. The more others know about us, the better enabled they are to act in ours.

Of course, if someone's goal is my destruction, then knowing more about me will also aid him in achieving that goal. But despite what the Calvinists and others might believe, I don't think enough people really want that for it to be a problem. The vast majority of people merely want a good life for themselves, and those few who do lust for power and destruction can be kept in check by making sure we know enough about them to protect ourselves. At any rate, I believe that misinformation is a greater risk than evil motives, because evil motives are self-limiting and not advantageous in the long run. It is those who get along in society that prosper—if they have enough good information to do so.

Open source software

As a final example of the benefits of transparency, I'd like to talk about open source software. It is well known in that community that keeping software secret is a very good way to ensure that it has bugs and security holes. Only software that can be examined by everyone can be regarded as safe. But beyond that, open source gives us the ability to repair and replace modules that we wouldn't have with proprietary systems, and opens up new opportunities.

Software is complex. I doubt many people—even computer scientists—could name every software module between their PC's silicon chips and, say, a Java applet. There are many layers: microcode in the chips, hardware controllers with onboard ROM, system BIOS software, bootstrap loaders, the operating system kernel, device drivers, internet protocols, operating system APIs, shared libraries, language runtimes, a system shell, browser software, a Java virtual machine, Java APIs, the applet software itself, and several interactions among those various levels. Some of these are proprietary closed systems, and some are more open (or can be). Because each of these things can have bugs or can behave in ways we don't like, it is important that we understand how they work and be able to repair or replace them to improve performance, security, or other features. This can be extraordinarily difficult with closed systems. Vernor Vinge's science fiction work A Deepness in the Sky describes a character whose job is “programmer/archaeologist”, which is not at all far-fetched.

The importance of being able to change these modules can be shown by what happens when they are too proprietary. System BIOS software, for example, is relatively simple, but its function is important. Without being able to duplicate and update it, manufacturers would not have been able to make the PC clones that opened up the PC market to everyone. The way they did this was to spend thousands of dollars doing something that could have been done with a few minutes in an EPROM burner if it weren't for copyright law; they made what is called a “clean room” implementation. They hired two groups of programmers, one to write the duplicate BIOS and one to reverse-engineer the function of the existing one, but they made sure that the first team got no information about the actual code from the second team so that they could not be accused of copying it. Likewise, hundreds of programmers have spent years creating Linux to replace proprietary operating systems like Microsoft Windows, because they can't fix or improve upon closed systems.

Open source software improves the situation considerably, allowing us to look, for example, at the actual source code of the operating system, various libraries, and application code. There are many free and open Java virtual machines, for example, and most of them are superior to Sun's original in some way, which has helped to spread the use of the language considerably. Most web servers use Linux because it is fast and secure and easily updated. Despite the fears of those who still support the proprietary software sales model, open source can be big business, too. Red Hat makes good money despite the fact that every software product they sell can be downloaded over the net for free and fully examined. Proprietary software still has the upper hand in the marketplace so far, but the trend is definitely toward more open systems, and for many good reasons.

Conclusion

Just as Napster [Update: OK, maybe BitTorrent] and Freenet show the futility of trying to maintain business models and social systems based on the ownership of information, the coming technologies of surveillance and data collection make it clear that social systems based on scarcity of information won't work. Rather than trying to put our finger in the dike to prevent the flood, let us build a hydroelectric plant to benefit from it, and use it to shine a brighter light on the dark forces we now hide from.