Hello, friends,
Since the Jan. 6 Capitol riot, Facebook has been announcing a flurry of changes to appease lawmakers who accuse the company of allowing the spread of lies that fueled the attack, as well as generally being a haven for misinformation.
Last week, Facebook said it would remove false claims about vaccines. This week it said it would test a method of removing political content from users’ news feeds. The company also responded this week to our revelation that it had not removed political group recommendations—despite promising Congress that it would do so—with a mea culpa citing “technical issues.”
And yet, many argue that none of Facebook’s changes deal with the platform’s fundamental problem: Its business is built on extracting our data and giving advertisers the opportunity to use that data to manipulate us.
To understand the arguments about the dangers of the tech giant’s business model, I spoke and corresponded with Shoshana Zuboff, the author of the landmark book “The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power.” She is the Charles Edward Wilson Professor Emerita at Harvard Business School and a former faculty associate at the Berkman Klein Center for Internet and Society at Harvard Law School.
Her recent op-ed in The New York Times, “The Coup We Are Not Talking About,” argues that the business of big surveillance threatens democracy.
Her written responses to my questions are below.
Angwin: Let’s start with the basics. You developed the term “surveillance capitalism,” which you define as turning human experience into “a free raw material for translation into behavioral data.” I was hoping you could explain that in layperson’s terms. Most people would probably have experienced this in terms of ads following them around the internet. But of course, it’s much more than that.
Zuboff: I think it’s useful for people to understand that there is a long-term pattern of how capitalism has evolved by taking things that exist outside of the market dynamic and turning them into something that can be sold: a commodity.
Industrial capitalism largely claimed nature for the market dynamic. So, trees became sources of raw material and mountainsides became sources of raw material and rivers became sources of energy and so forth. So nature was transformed into land and real estate—things that could be sold.
Fast-forward to the early 21st century, when we are moving through a new structural transformation, from an industrial civilization to the early stages of an information civilization. The young internet companies had great ideas and many good services, but they hadn’t figured out how to make money. They needed that new commodity breakthrough, which occurred at Google around 2000 and 2001.
That’s when they discovered that the data left over from people’s search and browsing activities contained highly predictive signals, from which they could compute the probabilities of which ads a person was likely to click. These data had been considered waste material, called “data exhaust.” It was surplus data, more than what was needed for product improvements. They soon discovered that they could also hunt and capture predictive personal information from all over the web, without anyone even knowing, and use it to infer things that people had not intended to disclose.
With these new caches of personal information, they invented the predicted click-through rate, a computational product that forecasts human behavior—admittedly in a narrow realm, but a prediction of your behavior nonetheless. They discovered that this could revolutionize advertising, and it did.
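To make that mechanism concrete: a predicted click-through rate is, at bottom, a probability estimate computed from behavioral signals. The sketch below, in Python with scikit-learn, is purely illustrative; every feature, number, and label is hypothetical, and real ad systems train on billions of examples and far richer signals.

```python
# Toy illustration of click-through prediction from behavioral "exhaust."
# All features and data are hypothetical.
from sklearn.linear_model import LogisticRegression

# Each row is one (user, ad) pair: [related searches, related pages visited,
# seconds dwelled on similar ads]. Labels: 1 = clicked, 0 = ignored.
X = [
    [5, 12, 30],
    [0, 1, 0],
    [3, 8, 22],
    [1, 0, 2],
]
y = [1, 0, 1, 0]

model = LogisticRegression().fit(X, y)

# The "prediction product": the probability that a new user clicks this ad.
new_user = [[4, 10, 25]]
print(model.predict_proba(new_user)[0][1])
```

The point of the toy is that the thing being sold is not the ad itself but the probability attached to a person.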
So now, if we just zoom out from this a little bit, what have we learned? We’ve learned that as we live our lives, having our experiences, we’re leaving far more data behind than we have any idea about, and those data are being systematically extracted at scale.
Surveillance capitalism rests on the discovery that behavioral data could be lifted from our lives and claimed as a company’s private property for manufacturing and sales. The data flow through supply pipes to new computational factories called AI, where they are fabricated into predictions and sold to a new kind of marketplace that trades in these predictions of what people will do. I call these “human futures markets,” just like the markets that trade pork-belly futures, oil futures, or wheat futures.
So this is how surveillance capitalism works. It illegitimately dips into our private experience, lifts out behavioral data, claims those data as its private property, and then takes them to its factories to create products it can sell.
Angwin: You have described the results of surveillance capitalism as an “epistemic coup” that produces “epistemic inequality.” Can you define what you mean by that?
Zuboff: Epistemic means having to do with knowledge and knowing. It is how we know, it’s the work of knowing, the action of knowing. The epistemic coup is a revolutionary takeover of what is known, how it is known, and who can know it. Epistemic inequality is the growing abyss between what I can know and what can be known about me.
Surveillance capitalism has declared its right to know us in ways that we never agreed to, unilaterally changing the conditions of our lives without our participation or consent.
There’s no gun involved. There’s no violence. There’s no threat of murder. There’s no dragging us to the gulag or the camp.
This is a new form of power that I call “instrumentarian” power. It works its will through digital architectures to create huge concentrations of knowledge about us. And then that knowledge morphs into power as it is used to feed an array of targeting mechanisms that tune, manipulate, shape, and ultimately modify behavior in ways that align with others’ commercial objectives.
And, as we have learned since Cambridge Analytica, politicians, oligarchs, or really any actor with enough knowledge, capability, and money can simply lease this entire operation and pivot it a few degrees from commercial to political behavior.
Angwin: This is what I believe you call “epistemic chaos,” where, if I understand it correctly, “our shared reality is splintered by algorithmic amplification and microtargeting of corrupt information.” Is this where we are right now?
Zuboff: I’ll just briefly define the stages of the coup. So first there is this usurping of epistemic rights.
You know, we talk a lot about the right to free speech. For most of human history, there was not a right to free speech because it wasn’t necessary. If your vocal cords worked and your physiology was intact, you could open your mouth and you could speak. You didn’t need a right to speak.
But with the development of civilization and the development of political complexity, there comes a moment where the conveying of ideas in speech can be politically contested and people have to fight to be able to say what they think. So, in the 18th century there’s finally an understanding that a formal right is required.
That’s how legal rights are established. At some point in the evolution of society, things that we consider to be elemental and natural come under attack, and they can no longer be taken for granted. That’s why the first stage of the coup is marked by the fact that they are taking rights from us that we thought were elemental and inalienable.
Just a few weeks ago, Amazon put out a press release announcing that its Rekognition facial recognition system can now recognize fear. Fear joins the seven other emotions that Rekognition could already recognize.
So, why is fear a big deal? Well, they’re beyond just wanting our faces so that they can ID us. They want the micro-expressions, formed by the face’s many small muscles, that predict emotions; and emotions predict behavior. Emotions are highly sought-after predictive data.
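For readers who want to see what this looks like in practice, these emotion estimates are returned directly by Rekognition’s face-analysis API. Here is a minimal sketch using boto3; the image file is hypothetical, and AWS credentials are assumed to be configured.

```python
# Minimal sketch: reading emotion predictions from Amazon Rekognition.
# "face.jpg" is a hypothetical image; AWS credentials must be configured.
import boto3

rekognition = boto3.client("rekognition")

with open("face.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request emotion estimates, not just face geometry
    )

for face in response["FaceDetails"]:
    for emotion in face["Emotions"]:
        # Emotion types include HAPPY, SAD, ANGRY, CALM ... and now FEAR.
        print(emotion["Type"], round(emotion["Confidence"], 1))
```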
Now my response to this is, “You have no right to my face. I didn’t give you my face and you certainly have no right to my fear. Only I get to know my fear. Maybe I share it with my husband or maybe I share it with my best friend, but I’m certainly not sharing it with you, Amazon.”
But now my fear is Amazon’s private property. They have no right to this material. It’s illegitimate. It should be illegal. This is the first stage of the coup, usurping our epistemic rights.
So now in Stage Two, this becomes an extreme new form of social inequality that is measured by the difference between what I can know and what can be known about me. That’s what I call epistemic inequality.
Stage Three is epistemic chaos, as a range of targeting mechanisms is built from all that knowledge. The system moves beyond monitoring and tracking and knowing us. Targeting is designed to get us to do something or think something or buy something or believe something or join something that we probably otherwise would not have done. This is how knowledge translates into power.
These machine operations are engineered to maximize engagement because that leads to more opportunities for data extraction. These systems are indifferent to content. They don’t care if they are amplifying material that asks you to burn down the Capitol or material that shows you how to build a doll’s house. As long as it engages you and delivers more data, it’s considered a success. Epistemic chaos is the inevitable result, because corrupt information rises to the top of these targeting machines. The crazy inflammatory stuff draws people to it and produces more revenue.
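As a thumbnail illustration of that indifference, here is a hypothetical toy ranker in Python: it orders feed items purely by predicted engagement, and nothing in it ever inspects what the content says. It is a sketch of the incentive structure being described, not any platform’s actual code.

```python
# Toy feed ranker: sorts items by predicted engagement alone.
# Note what is absent: nothing here examines what the content says.
from dataclasses import dataclass

@dataclass
class Item:
    content: str                  # never consulted by the ranker
    predicted_engagement: float   # clicks/shares a model expects

def rank_feed(items: list[Item]) -> list[Item]:
    # Inflammatory or benign, the only criterion is expected engagement.
    return sorted(items, key=lambda i: i.predicted_engagement, reverse=True)

feed = rank_feed([
    Item("how to build a doll's house", 0.12),
    Item("inflammatory conspiracy post", 0.87),
])
print([i.content for i in feed])  # the inflammatory post ranks first
```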
Angwin: You have argued that this chaos could lead to the end of democracy. Can you explain how that happens?
Zuboff: That is correct. I call this epistemic dominance.
Think about what happened in the 2016 Trump campaign: it was able to depress Black voter turnout using a strategy of online manipulation. We know the deterrence strategy was effective.
Citizens of a democratic society, citizens of the oldest democracy on earth, were persuaded to give up their most fundamentally democratic right, which is the right to vote. No one came to their house with a gun and said, you’re not allowed to go to the polls. No one threatened them with the gulag or the camp. No one threatened murder or violence. It was all done in ways that were designed to evade their awareness.
All of this works subliminally as it’s coming through the digital instrumentation. You have no idea it’s happening. Therefore, you can’t agree to it because you don’t know what is happening and you can’t fight it because you don’t know what’s happening. And yet look at how it inscribes its effect.
This is not science fiction. This is the weakening of democratic institutions with a new kind of power. Democratic governance is replaced by computational governance.
Angwin: So what needs to be done to prevent this dystopian scenario from playing out?
Zuboff: The first thing is that the digital has to live in democracy’s house.
This whole digital domain, which is now essentially owned and operated by surveillance capitalism, must come under the democratic rule of law and the governance of democratic institutions. Democracy has been literally asleep for two decades and that has to change—and it is changing.
I’ve been doing this work for a long time. Julia, most of the time I’ve been feeling like, you know, I think there’s a standard nightmare where you’re screaming and no sound comes out. I’ve had that feeling for many years of my career.
But now there’s engagement in a way that there hasn’t been before. Things are changing among lawmakers and citizens, even in America.
So, what does it mean to come under the rule of law? I compare where we are now to the very early 20th century, when we had no workers’ rights and no consumer rights.
We didn’t have systems like unemployment insurance or Social Security, or the institutions that oversaw the stock market and banking, made sure that food and drugs were safe to ingest, and ensured that workers were employed under safe conditions and paid decent wages.
Hundreds of administrative institutions were invented in the 1920s and 1930s to give us the rights, laws, and oversight we needed to create an industrial century in which democracy could not only survive but thrive.
So when we look at surveillance capitalism, we need to look at its data extraction, its operations, and its financial incentives. Data extraction has spread to every corner of daily life: your car, your kitchen, your Spotify account, your smoke detector. And if your child’s school uses Google Classroom, the extraction reaches your child’s personal information. Each of these situations is a supply-chain interface now.
As democratic societies we need to assert that this unilateral extraction of sovereign human experience is illegitimate. Data collection should be tied to fundamental rights.
Once we make a significant dent in extraction, we’re automatically tackling these pernicious targeting capabilities. It must be the individual who decides what data are shared, how, and for what purpose. I might want to share data for research on a disease, or to improve living conditions in my community. These are things that I, as a sovereign individual, can now decide.
Then we can intervene in the human futures markets and eliminate the financial incentives that drive the whole system. Let’s say that human futures markets are illegal because they have predictably disruptive consequences for people, for society, and for democracy.
That’s not a radical idea. We made many other markets illegal. We made slave markets illegal, even when whole economies were based on those markets. We say no markets in humans, no markets in babies, no markets in human organs. And we can say that here too.
Finally, citizens and lawmakers have to come together in new alliances and new forms of collective action. Ultimately, all the liberal democracies must work together on the prospect of a democratic digital future, which means new alliances with the EU and other democratic nations. The European political leadership is now at the head of the queue in reasserting democratic control over the digital future; it has been working toward this, and it is key to its vision and policy proposals.
These are not problems that can be solved on a country-by-country basis. The liberal democracies have to move together to assert a vision of a democratic digital future, just as China has asserted its own vision of a digital century that advances its form of authoritarian government. This will be a critical contribution to rebuilding our broken democracy.
Thanks, as always, for reading. I’m taking a vacation next week but will return to your inboxes on Feb. 27.
Best,
Julia Angwin
Editor-in-Chief
The Markup