The following is an excerpt from Bruce Schneier's Data and Goliath.
One of the common defenses of mass surveillance is that it’s being done by algorithms and not people, so it doesn’t compromise our privacy. That’s just plain wrong.
The distinction between human and computer surveillance is politically important. Ever since [Edward] Snowden provided reporters with a trove of top-secret documents, we’ve learned about all sorts of NSA word games. The word “collect” has a very special definition, according to the Department of Defense. It doesn’t mean collect; it means that a person looks at, or analyzes, the data. In 2013, Director of National Intelligence James Clapper likened the NSA’s trove of accumulated data to a library. All those books are stored on the shelves, but very few are actually read. “So the task for us in the interest of preserving security and preserving civil liberties and privacy is to be as precise as we possibly can be when we go in that library and look for the books that we need to open up and actually read.”
Think of that friend of yours who has thousands of books in his house. According to this ridiculous definition, the only books he can claim to have collected are the ones he’s read.
This is why Clapper asserts he didn’t lie in a Senate hearing when he replied “no” to the question “Does the NSA collect any type of data at all on millions or hundreds of millions of Americans?” From the military’s perspective, it’s not surveillance until a human being looks at the data, even if algorithms developed and implemented by defense personnel or contractors have analyzed it many times over.
This isn’t the first time we’ve heard this argument. It was central to Google’s defense of its context-sensitive advertising in the early days of Gmail. Google’s computers examine each individual e-mail and insert a content-related advertisement in the footer. But no human reads those Gmail messages, only a computer. As one Google executive told me privately in the early days of Gmail, “Worrying about a computer reading your e-mail is like worrying about your dog seeing you naked.”
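To make concrete what it means for a computer, rather than a person, to "read" your e-mail, here is a minimal sketch of keyword-based contextual ad matching. Everything in it, including the ad table and the select_ad function, is a hypothetical illustration of the general technique, not Google's actual system, which is far more sophisticated:

```python
# Illustrative sketch only: a program "reads" an e-mail by scanning its
# text for keywords and picks a related ad. The keyword-to-ad table and
# the matching logic are invented for this example.

AD_INVENTORY = {
    "vacation": "Cheap flights to Hawaii!",
    "mortgage": "Refinance today at low rates.",
    "camera": "20% off digital cameras this week.",
}

def select_ad(email_body):
    """Return the first ad whose keyword appears in the message, or None."""
    text = email_body.lower()
    for keyword, ad in AD_INVENTORY.items():
        if keyword in text:
            return ad
    return None

# The message mentions both "vacation" and "camera"; the first keyword
# in the table wins, so the travel ad is inserted in the footer.
print(select_ad("Planning our vacation next month, any camera tips?"))
```

Even in this toy version, the point of the analogy is visible: no human looks at the message, yet the program has extracted meaning from it and acted on that meaning.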
But it’s not, and the dog example demonstrates why. When you’re watched by a dog, you’re not overly concerned, for three reasons. The dog can’t understand or process what he’s seeing in the same way another person can. The dog won’t remember or base future decisions on what he’s seeing in the same way another person can. And the dog isn’t able to tell anyone—not a person or another dog—what he’s seeing.
When you’re watched by a computer, none of that applies. The computer is processing what it sees, and basing actions on it. You might be told that the computer isn’t saving the data, but you have no assurance that that’s true. You might be told that the computer won’t alert a person if it perceives something of interest, but you can’t know whether that’s true. You have no way of confirming that no person will perceive whatever decision the computer makes, and that you won’t be judged or discriminated against on the basis of what the computer sees.
Moreover, when a computer stores your data, there’s always a risk of exposure. Privacy policies could change tomorrow, permitting new use of old data without your express consent. Some hacker or criminal could break in and steal your data. The organization that has your data could use it in some new and public way, or sell it to another organization. The FBI could serve a National Security Letter on the data owner. On the other hand, there isn’t a court in the world that can get a description of you naked from your dog.
The primary difference between a computer and a dog is that the computer communicates with other people and the dog does not—at least, not well enough to matter. Computer algorithms are written by people, and their output is used by people. And when we think of computer algorithms surveilling us or analyzing our personal data, we need to think about the people behind those algorithms. Whether or not anyone actually looks at our data, the very facts that (1) they could, and (2) they guide the algorithms that do, make it surveillance.
You know this is true. If you believed what Clapper said, then you wouldn’t object to a camera in your bedroom—as long as there were rules governing when the police could look at the footage. You wouldn’t object to being compelled to wear a government-issued listening device 24/7, as long as your bureaucratic monitors followed those same rules. If you do object, it’s because you realize that the privacy harm comes from the automatic collection and algorithmic analysis, regardless of whether or not a person is directly involved in the process.
Excerpted from Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World by Bruce Schneier. Copyright © 2015 by Bruce Schneier. With permission of the publisher, W. W. Norton & Company, Inc. All rights reserved.