Greg Kochanski
My own brother has asked me "Why are you disturbed by surveillance if you have nothing to hide?" The implication is that we honest people have nothing much to fear from anyone, so if you're honest, why worry about it?
I suppose he's thinking of a perfect world where we all agree on morality, the law is in perfect agreement with justice, and everyone follows the law to the letter, including the police. I suppose that if such a world existed, I wouldn't worry much about privacy. After all, I'd be nearly perfect, too.
But, in the real world, I do worry. There are disagreements about morality and legality, and law doesn't quite match up with justice. So, even someone as pure as snow could get into trouble in the real world, because there are criminals; and good people who do criminal acts for one confused reason or another; and people who do things now that are currently legal but which society will eventually decide are troublesome and illegal; and people who operate in the grey zones.
Any human system is faulty to some degree, and society is no exception. So, we can reasonably ask, "How do we design society so that small problems don't cause big trouble?"
There are many examples of engineered systems that are designed to minimize the damage if something goes wrong. Ask yourself why airplane windows are small. The answer is that a large window would cause unacceptable damage if it broke. (Overly rapid decompression and passengers getting sucked out; that kind of thing.)
Or, look at the electrical wiring leading from the substation to your house. There are a few necessary transformers, but beyond the bare minimum, there is little more than copper wire and circuit breakers. Not much fancy electronics. No extra complexity. The reason is simply that the risks of fire and failure are significant, and every extraneous component adds to those risks.
Cars are another example. Much of the design of the front half of a car is controlled by the possibility of a crash. In case of a crash, the car needs to crumple to absorb the impact without breaching the passenger compartment.
So, in all these cases and many more, we design our systems to survive things going wrong around them. The idea is called "robustness". The more robust your design is, the more likely that it (or its passengers) will survive some failure, crash, or what have you.
The obvious analogy is that we want to design our society so that it continues working well, even if individual parts of it misbehave. Governments misbehave in varying degrees all the time. Historically, misbehaviour has been relatively modest in the US, but it's always been present. Bribery and corruption are part of it, but misbehaviour also includes well-intentioned violations of the law and of social expectations. It also includes accidental losses of data that could be picked up by unsavoury people. Corporations misbehave in various ways too, as do individuals. Certainly, one tries to eliminate such misbehaviour, but the other thing to do is design your society so that small amounts of misbehaviour do not cause too much damage.
A society that's not robust against small problems is like a car without a seat belt or air bag.
Privacy helps make society robust against a whole variety of misbehaviours by simply keeping valuable information out of the hands of people who may mis-use it. Specifically, if I hold information private, I keep it out of everyone else's hands. Thus, the number of people who can mis-use my data is 1 (if I'm extremely private), instead of 6 billion (if I publicize everything).
No one wants complete privacy, of course. There is a lot to be gained by opening up some information to some people. I put a certain amount of information up on the web because of that. However, being an (almost) engineer, I keep the rest reasonably private because I want my life to be robust against various kinds of individual, corporate, and government misbehaviour.
The important thing to understand is that I do not need to have any particular misbehaviours in mind. I know that society is far more creative than I am, and will think of the most amazing things to do with my data. I won't even begin to imagine most of those things until it's too late. So, privacy is a pre-emptive strike against the not-yet-imagined annoyances and threats of the next decade.
Similarly, the engineer who designs the next airplane you will ride doesn't try to consider every possible thing that can go wrong, because that can't be done. He or she will certainly consider the more important known threats, but any good engineer will always try to leave a little margin of safety for the totally weird, unexpected things that have never happened before. The good engineer knows that they're going to happen, even without knowing in detail what they will be.
So, that's my engineering view of privacy. It's a rational response to the knowledge that people will be doing stuff with your data. Most of what they do may be legal, and some of what they do may be welcome, but you expect that some will be undesirable, harmful, and/or illegal. A desire for privacy is simply a desire to limit the possible damage that can be done to you. Or, more broadly, it is a desire to limit the influence that these unknown people will have on your life.
So, privacy is analogous to building a ship with watertight compartments. It's analogous to putting fire doors in a building. It's analogous to having an army (you only need an army when something goes wrong). It's analogous to a "double insulated" power tool. That second layer of insulation is unnecessary if everything works correctly; however, we've learned that things go wrong, and when they do, you really want that extra safety margin. Likewise with society.
Last Modified Mon Aug 11 05:22:10 2008