Part 2 in an Ongoing Series
Data is the oil of the modern economy. That analogy, although it has become a cliché, seems even more apt when you dig deeper. A century ago, we were building our economy around the suddenly ubiquitous flow of oil. We didn’t assign much weight to the externalities in our rush to take advantage of all the benefits that this new resource provided, though some even then warned of environmental concerns. Now, as society begins to feel the effects of climate change and seeks ways to reduce our reliance on petroleum, we might look back at those early days and wonder if different policies might have led to better results.1
Part 1 of this series addressed the principle that privacy is a balance, and examined the fallacies embedded in the arguments “If you have nothing to hide, you have nothing to fear” and “All our data is already out there.” This installment looks at another principle and a few more fallacies.
Second Principle: Context Matters
Jim is a high school math teacher. He is well respected in his community and tries to set a good example for his students. Twenty years ago, when he was in his early twenties, Jim totaled his car and was convicted of driving under the influence. It was a wake-up call, and he hasn’t touched a drop of alcohol since. He is ashamed of what happened and never talks about the accident.
Bob and Janice seem like the perfect couple. They have two children and a third on the way and are very active in their community. Lately, they have been arguing a lot and have started going to a marriage counselor. They don’t want anyone in their social circle to know.
After going away to college in another state, Jana realizes that she’s attracted to women. Although she is out about her sexual orientation at school, she has not yet come out to her parents. She’s nervous about doing so because they have expressed disapproval of same-sex relationships.
Each of these people is trying to manage their image by controlling who has access to certain pieces of information about them. They are not being shady or dishonest or scurrilous – they are simply being human. However, the information that they are trying to be discreet about is also in some ways “public.”
Jim’s DUI is a matter of public record, available to anyone willing to delve into the local court’s archived criminal docket. Bob and Janice walk into their therapist’s office in broad daylight, and anyone who happens to be walking down the street can see them do so. Jana’s sexuality is known to everyone who knows her at college, in a different state from her parents.
While all these facts are knowable to anyone who is motivated enough to search for them, they are also obscure. That is to say, most people don’t know them and probably never will. It is possible for each of these people to manage their image the same way people have done throughout history: by acting discreetly and being careful with whom they share personal information. There’s always the chance that their secrets will be discovered, but they are relying on the unlikelihood of that happening to control how certain people see them.
While the notion of obscurity has not been thoroughly developed in jurisprudence, in 1989 the Supreme Court recognized the concept of “practical obscurity,” holding that a person can retain a privacy interest in limiting the broader circulation of information even after that information has been publicly revealed. Of course, this case was decided before the internet supercharged the potential for obscure information to be cheaply and widely broadcast, and before technology threatened to eliminate the concept of obscurity altogether. Websites now delve into public records and broadcast facts that might be embarrassing or that the people involved would prefer to keep private. Smartphones may track our every movement, and that information can be sold to anyone. Data aggregators pull information from every corner of the internet and combine it with purchase records, web browsing history, magazine subscriptions, and any other information they can find to compile dossiers about you that you cannot control, and then they sell those dossiers for profit.
When challenged, the individuals or companies that implement these technologies argue that they are merely collecting public information, that consumers voluntarily reveal their information for anyone to see, or that there is nothing problematic about gathering innocuous pieces of information about a person’s life. But there is a meaningful difference between a neighbor inadvertently stumbling upon a person’s secret and a business whose sole purpose is, for example, to scrutinize a person using artificial intelligence and web tracking. Everyone should get to decide what part of their personal information is innocuous and what is sensitive. An accidental disclosure is very different in kind and magnitude from millions of intentional disclosures that arise from a profit motive. Context matters.
Fallacy Three: Terrorists! Serial Killers! Disease!
Few rights are absolute. In the United States, freedom of speech and freedom of assembly have exceptions. Even the right to continue to live can be abridged in certain circumstances. The only absolute right is freedom of thought, though even that might someday be infringed upon with the advance of technology.
Privacy is no exception. It is almost always invoked when it seems to compete with some other compelling need. National security, public safety, fraud prevention, exposing corruption, and tracking diseases can all be legitimate reasons to violate someone’s privacy. The problem is that those espousing the overriding principle often present it as absolute and give short shrift to individuals’ right to privacy. How dare you insist on privacy, they might argue, if giving it up allows the police to capture a murderer?
But willingness to risk that a murderer may escape in some circumstances has been the policy of the United States since its founding. Police cannot search a home without a warrant and, to obtain that warrant, they must convince a judge that there is probable cause to believe that a search will yield evidence. The judicial sign-off exists as a check on the government’s ability to invade constitutionally protected privacy. If the police evade this protection, the evidence they obtain will be excluded from consideration, even if that means the bad guy goes free. So yes, the U.S. Supreme Court says, there are circumstances in which even capturing a murderer is less important than an individual’s privacy.
From a policy perspective, the issue is that the party that wants to invade your privacy honestly believes their reason is important enough to do so, and the decision is often made with no one in the room to argue privacy’s case. Without laws setting out clear lines of when privacy rights can and cannot be infringed, the anti-privacy argument will almost always prevail.
Fallacy Four: But It’s So Convenient!
This argument has the virtue of being true. We trade privacy for convenience all the time. Allowing companies to build profiles about us helps us quickly find the information we want, the products we need, even the movies that interest us. Allowing them to track our location helps us get from point A to point B, stay in touch with friends, and get emergency help when our car breaks down.
Allowing companies to know us renders frictionless such transactions as logging in, buying things, and acquiring credit. Having all the apps we use be part of the same ecosystem, overseen by the same companies, saves us time. Allowing companies to insert themselves into our households lets us pretend we’re on the starship Enterprise, able to speak commands to faceless robots who fulfill our needs, while our living spaces adjust themselves to our desires better than we’d ever be able to do for ourselves.
But swapping privacy for convenience is a false tradeoff. If businesses collected only as much information as necessary to fulfill the promise of convenience, if the data were used just for these purposes, and if the data were sufficiently protected, everything would be fine.
Those are, of course, three big ifs. In fact, businesses do not collect only the information they need to provide the service, do not limit their usage of this data, and do not do enough to protect the data. Absent regulation, what incentive do they have to do so? The money to be made from misusing our information is too great, and the state and the market have consistently failed to punish bad actors.
Fallacy Five: There is Nothing Wrong with Targeted Advertising
Surveillance advocates often talk about targeted advertising, which consumers experience as “ads that follow you around the internet.” Such ads can be unsettling, and they are the most visible face of data collection. However, I have frequently been told that “creepy is not a harm.” Indeed, many people like targeted ads because they see advertising that actually interests them. Finally, such ads are what allow many services to be provided for free; they are the basis for the business model of the internet. While there are arguments that targeted ads are inherently problematic, they may be one of the least bad things about the surveillance economy.
That’s precisely why surveillance advocates often shift the conversation to this red herring. They distract people from concerns that the same companies and practices that enable targeted ads can also enable far more dangerous privacy erosions, including discrimination, harassment, psychological manipulation, authoritarianism, and other insidious problems that privacy advocates truly want to address.
As with the argument for convenience, if data collection were limited to targeted advertising, we wouldn’t be so concerned. But targeted ads are just the mildly unsettling mask that hides the far more disturbing face of the surveillance beast.
As described above, arguments against privacy protections and limits on the use of personal information are often compelling. They may play on fear, or they may emphasize convenience. In many cases, they also assume that the status quo surveillance economy is the superior economic model and that anything that threatens the profitability of that model is inherently suspect. It is important to remember that though some are reaping enormous financial benefits from this model, everyday people are deeply unsettled by it and want more done to protect their privacy.
1. The analogy is even more cogent when you consider that a century ago there was a surge in antitrust enforcement to rein in the power of the early oil barons – innovators who were among the first to recognize the value of this newly monetizable resource and who went on to develop massive monopolies.