
It’s in the US Constitution—not explicitly, but it’s implied in the Fourth, Fifth,
and Ninth Amendments. It’s part of the 2000 Charter of Fundamental Rights of the European
Union. In 2013, the UN General Assembly approved a resolution titled “The right to
privacy in the digital age,” affirming that our fundamental right to privacy applies
online as well as offline, and that mass surveillance undermines this right.

The principles are enshrined in both national and international law. We need to start
following them. Privacy is not a luxury that we can only afford in
times of safety. Instead, it’s a value to be preserved. It’s essential for liberty,
autonomy, and human dignity. We must understand that privacy is not something to be
traded away in some fearful attempt to guarantee security, but something to maintain
and protect in order to have real security.

Charter of Fundamental Rights of the European Union (2000)

ARTICLE 7:
Respect for private and family life. Everyone has the right to respect for his or
her private and family life, home and communications.

ARTICLE 8:
Protection of personal data. 1. Everyone has the right to the protection of personal
data concerning him or her. 2. Such data must be processed fairly for specified purposes
and on the basis of the consent of the person concerned or some other legitimate basis
laid down by law. Everyone has the right of access to data which has been collected
concerning him or her, and the right to have it rectified. 3. Compliance with these
rules shall be subject to control by an independent authority.

None of this will happen without a change of attitude. In the end, we’ll get the privacy
we as a society demand and not a bit more.

DON’T WAIT

The longer we wait to make changes, the harder it will become. On the corporate side,
ubiquitous tracking and personalized advertising are already the norm, and companies
have strong lobbying presences to stymie any attempt to change that. California’s
Do Not Track law is an example of that: it started out as a good idea, but was completely
defanged by the time it passed. The natural trajectory of surveillance technology
(remember the declining cost from Chapter 2?) and the establishment of a new status
quo will make changes much harder to achieve in the future, especially in the US.

It’s not just corporations that oppose change. Politicians like the same data that
commercial advertisers do: demographic data, information about individual consumer
preferences, political and religious beliefs. They use it in their campaigns, and
won’t want to give it up. Convincing them to forgo access to data that helps them
with targeted messaging and get-out-the-vote campaigns will be tough. And once someone
figures out how to use this data to make elections permanently unfair, as gerrymandering
does, change will be even harder.

At a conference last year, I overheard someone saying, “Google Analytics is the crack
cocaine of Internet surveillance. Once you see it, you never want to give it up.”
More generally, data becomes its own justification. The longer we wait, the more people
and organizations become used to having broad access to our data and the more they
will fight to maintain that access.

We’re at a unique time to make the sorts of changes I recommend in this book.

The industries that collect and resell our data are powerful, but they’re still relatively
new. As they mature and become more entrenched, it will be much harder to make major
changes in how they do business.

Snowden has forced government surveillance into the spotlight by revealing the actions
of the NSA and GCHQ. This has produced some singular tensions amongst the world’s
countries, between Germany and the US in particular. And in the US, this political
tension cuts across the traditional partisan divide. There’s an opportunity for real
change here. As Chicago mayor and former Obama chief of staff Rahm Emanuel said, “You
never want a serious crisis to go to waste.” That’s true here, even if most people
don’t realize that this is a crisis.

We’re at a unique moment in the relationship between the US and the EU. Both sides
want to harmonize rules about this sort of thing, to make cross-border commerce easier.
Things could go one of two ways: Europe could become more permissive to match the
US, or the US could become more restrictive to match Europe. Once those rules get
equalized, they’ll be harder to change.

We’re also in a unique time in the development of the information age. The Internet
is being integrated into everything. We have temporary visibility of these systems.
We can make changes now that will stand for decades.

Even so, we need to be prepared to do this all over again.

Technology constantly changes, making new things possible and old laws obsolete. There’s
no reason to think that the security, privacy, and anonymity technologies that protect
us against today’s threats will work against tomorrow’s.

Political realities also change, and laws change with them. It’s folly to believe
that any set of solutions we establish today will hold forever, or even for half a
century. A look back at recent US history suggests that gross abuses of executive
power—and the resulting big public scandals and mitigating reforms—have occurred every
25 to 30 years. Consider the Palmer Raids in the 1920s, McCarthyism in the 1950s,
abuses by the NSA and the FBI in the 1970s, and the ongoing post-9/11 abuses in the
2000s.

If you think about it, 30 years is about the career length of a professional civil
servant. My guess is that’s not a coincidence. What seems to be going on is that we
reform, we remember for a while why we implemented those
reforms, and then eventually enough people change over in government that we begin
to forget. And then we roll back the reforms and have to do the whole thing over again
in a new technological context.

The next set of abuses is for another generation to reckon with, though. This one
is ours.

I worry that we’re not up to it: that this is generational, and that it will be up
to the next generation to make the social shifts necessary to enact policy changes
that protect our fundamental privacy. I worry that the generation currently in charge
is too scared of the terrorists, and too quick to give corporations anything they
want. I hope I’m wrong.

THE BIG DATA TRADE-OFF

Most of this book is about the misuse and abuse of our personal data, but the truth
is, this data also offers incredible value for society.

Our data has enormous value when we put it all together. Our movement records help
with urban planning. Our financial records enable the police to detect and prevent
fraud and money laundering. Our posts and tweets help researchers understand how we
tick as a society. There are all sorts of creative and interesting uses for personal
data, uses that give birth to new knowledge and make all of our lives better.

Our data is also valuable to each of us individually, to conceal or disclose as we
want. And there’s the rub. Using data pits group interest against self-interest, the
core tension humanity has been struggling with since we came into existence.

Remember the bargains I talked about in the Introduction. The government offers us
this deal: if you let us have all of your data, we can protect you from crime and
terrorism. It’s a rip-off. It doesn’t work. And it overemphasizes group security at
the expense of individual security.

The bargain Google offers us is similar, and it’s similarly out of balance: if you
let us have all of your data and give up your privacy, we will show you advertisements
you want to see—and we’ll throw in free web search, e-mail, and all sorts of other
services. Companies like Google and Facebook can only make that bargain when enough
of us give up our privacy. The group can only benefit if enough individuals acquiesce.

Not all bargains pitting group interest against individual interest are such
raw deals. The medical community is about to make a similar bargain with us: let us
have all your health data, and we will use it to revolutionize healthcare and improve
the lives of everyone. In this case, I think they have it right. I don’t think anyone
can comprehend how much humanity will benefit from putting all of our health data
in a single database and letting researchers access it. Certainly this data is incredibly
personal, and is bound to find its way into unintended hands and be used for unintended
purposes. But in this particular example, it seems obvious to me that the communal
use of the data should take precedence. Others disagree.

Here’s another case that got the balance between group and individual interests right.
Social media researcher Reynol Junco analyzes the study habits of his students. Many
textbooks are online, and the textbook websites collect an enormous amount of data
about how—and how often—students interact with the course material. Junco augments
that information with surveillance of his students’ other computer activities. This
is incredibly invasive research, but its duration is limited and he is gaining new
understanding about how both good and bad students study—and has developed interventions
aimed at improving how students learn. Did the group benefit of this study outweigh
the individual privacy interest of the subjects who took part in it?

Junco’s subjects consented to being monitored, and his research was approved by a
university ethics board—but what about experiments performed by corporations? The
dating site OkCupid has been experimenting on its users for years, selectively showing
or hiding photos, inflating or deflating compatibility measures, to see how such changes
affect people’s behavior on the site. You can argue that we’ve learned from this experimentation,
but it’s hard to justify manipulating people in this way without their knowledge or
permission.

Again and again, it’s the same tension: group value versus individual value. There’s
value in our collective data for evaluating the efficacy of social programs. There’s
value in our collective data for market research. There’s value in it for improving
government services. There’s value in studying social trends, and predicting future
ones. We have to weigh each of these benefits against the risks of the surveillance
that enables them.

The big question is this: how do we design systems that make use of our
data collectively to benefit society as a whole, while at the same time protecting
people individually? Or, to use a term from game theory, how do we find a “Nash equilibrium”
for data collection: a balance that creates an optimal overall outcome, even while
forgoing optimization of any single facet?
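
Since the term comes from game theory, it may help to make it precise. What follows is a brief sketch in standard textbook notation; the symbols are illustrative and do not appear in the original text.

For a game with players $i = 1, \dots, n$, where each player $i$ chooses a strategy from a set $S_i$ and receives a payoff $u_i$, a strategy profile $(s_1^{*}, \dots, s_n^{*})$ is a Nash equilibrium if

\[
u_i(s_i^{*}, s_{-i}^{*}) \;\ge\; u_i(s_i, s_{-i}^{*})
\quad \text{for every player } i \text{ and every alternative } s_i \in S_i,
\]

where $s_{-i}^{*}$ denotes the equilibrium strategies of everyone other than $i$. In words: no single participant can do better by changing strategy unilaterally while everyone else stays put. Read this way, the question asks for a data-collection arrangement stable enough that neither individuals nor institutions have an incentive to defect from it, even though no single interest is fully optimized.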

This is it: this is the fundamental issue of the information age. We can solve it,
but it will require careful thinking about the specific issues and moral analysis
of how the different solutions affect our core values.

I’ve met hardened privacy advocates who nonetheless think it should be a crime not
to put your medical data into a society-wide database. I’ve met people who are perfectly
fine with permitting the most intimate surveillance by corporations, but want governments
never to be able to touch that data. I’ve met people who are fine with government
surveillance, but are against anything that has a profit motive attached to it. And
I’ve met lots of people who are fine with any of the above.

As individuals and as a society, we are constantly trying to balance our different
values. We never get it completely right. What’s important is that we deliberately
engage in the process. Too often the balancing is done for us by governments and corporations
with their own agendas.

Whatever our politics, we need to get involved. We don’t want the FBI and NSA to secretly
decide what levels of government surveillance are the default on our cell phones;
we want Congress to decide matters like these in an open and public debate. We don’t
want the governments of China and Russia to decide what censorship capabilities are
built into the Internet; we want an international standards body to make those decisions.
We don’t want Facebook to decide the extent of privacy we enjoy amongst our friends;
we want to decide for ourselves. All of these decisions are bigger and more important
than any one organization. They need to be made by institutions that are greater, more
representative, and more inclusive. We want the public to be able to have open debates about
these things, and “we the people” to be able to hold decision makers accountable.

I often turn to a statement by Rev. Martin Luther King Jr.: “The arc of the moral universe
is long, but it bends toward justice.” I am long-term optimistic, even if I remain short-term
pessimistic. I think we will overcome our fears, learn how to value our privacy, and
put rules in place to reap the benefits of big data while securing ourselves from
some of the risks. Right now, we’re seeing
the beginnings of a very powerful worldwide movement to recognize privacy as a fundamental
human right, not just in the abstract sense we see in so many public pronouncements,
but in a meaningful and enforceable way. The EU is leading the charge, but others
will follow. The process will take years, possibly decades, but I believe that in
half a century people will look at the data practices of today the same way we now
view archaic business practices like tenant farming, child labor, and company stores.
They’ll look immoral. The start of this movement, more than anything else, will be
Edward Snowden’s legacy.

I started this book by talking about data as exhaust: something we all produce as
we go about our information-age business. I think I can take that analogy one step
further. Data is the pollution problem of the information age, and protecting privacy
is the environmental challenge. Almost all computers produce personal information.
It stays around, festering. How we deal with it—how we contain it and how we dispose
of it—is central to the health of our information economy. Just as we look back today
at the early decades of the industrial age and wonder how our ancestors could have
ignored pollution in their rush to build an industrial world, our grandchildren will
look back at us during these early decades of the information age and judge us on
how we addressed the challenge of data collection and misuse.
