Hi,
I’m Aaron Sankin, a reporter here at The Markup. A few months ago, I published an interview with a pair of academics who had written a book meant to help teenage internet users navigate an online ecosystem designed to transform them into anxious, unhappy posting machines.
Their advice was simple, yet revelatory: When something online gets you hot and bothered, instead of immediately raging right back, stop to take a breath and really think about the best way to respond, or if a response is even worth it. A healthy relationship with the internet requires a mindfulness practice anchoring you against social media’s endless red tide of rage-inducing slop.
When I read Sasha Issenberg’s new book The Lie Detectives, a fascinating portrait of the people helping political campaigns navigate the waters of online disinformation, I couldn’t help noticing the similarities. It isn’t just that political candidates aren’t, at times, too far off from impulsive, hormonal teenagers (although, if the flag pin fits). It’s that the same set of incentives that pushes teens into digital flame wars they’ll later regret also operates on politicians hunting for votes.
The key, in both cases, is to stop, take a breath, and survey the landscape before accidentally setting off an even bigger firestorm.
What makes The Lie Detectives so interesting is that, as Issenberg’s (also very good) 2012 book The Victory Lab detailed, at the beginning of the last decade it seemed like campaigns were using big data and reams of academic research to winnow politics down to an increasingly precise science. But that’s all been turned upside down by the wave of conspiracy theories and “fake news” that’s defined American politics since 2016. Issenberg’s narrative centers on the people who weren’t surprised by 2016 because they’d seen similar dynamics play out in other elections across the globe.
I spoke with Issenberg about the globalization of online disinfo, playing politics in the context of social media’s landscape of no context, and how to recognize when the best decision is just to ignore the crazy guys screaming about you online.
The following conversation has been edited for length and clarity.
Sankin: The Lie Detectives is narratively anchored around a woman named Jiore Craig, who’s become a go-to consultant for political campaigns dealing with disinformation. I’m curious: What about her story did you find interesting?
Issenberg: There are two parts of her story. One is how she ends up in this unique, novel role in American politics. And then two, what she does when she’s there.
She ends up at this moment, right after the 2016 election, where everybody on the American left is thrashing about to make sense of Trump’s victory. One line of inquiry focuses on all the things happening on the internet over the course of that election year that had never really been properly understood at the time.
These leading figures on the American left, in the Democratic Party apparatus, in labor unions, started trying to figure out how campaigns should navigate a world of bots and trolls. The term “fake news” was just emerging, in its original, pure meaning. There was nobody in American politics who had much experience with any of this.
Eventually, all these figures find their way to Jiore Craig, who was 25 years old, grew up in Illinois, and had never worked on a domestic political campaign before. She had worked for a consulting firm in Washington that specialized in, among other things, overseas consulting. She’d gone to Moldova to help a parliamentary official get on Facebook and then worked in elections in Gabon and then was in the Philippines and Panama.
She’d watched the 2016 American election from abroad. But all these things that seemed like new dynamics to people running political campaigns here were not really new to politics: foreign interference, the persistence of rumor, conspiracy theories, the rise of partisan and ideological media.
All these are things often taken for granted as fundamental aspects of how politics operates in large parts of the world. She was really the only person in Washington who had the proper context from which to develop a playbook for this era of viral disinformation.
What she ends up being able to do is advise clients—up to the Biden presidential campaign—on how to bring a proper sense of proportion to basically the problem of somebody lying about you on the internet. Without her, few people would have had the ability to see that in proportion.
We think of them as internet problems in the U.S. because we first encountered them in an election in which the novel context for encountering them was social media—Facebook in particular. But the fact is, you don’t have to read them that way. What Jiore was able to train American political communicators to realize is that you can get distracted by the novelty of the internet and lose sight of the underlying dynamic. To try to interrogate why people are being receptive to this and how you address that without making the problem worse.
Sankin: Reading The Victory Lab when it was released in 2012, I got the sense that the next decade or so of politics would be defined by control. Campaigns were incorporating all of this data, going into the political science literature, and coming out with an idea of how to do this pretty scientifically.
Maybe one of the disconnects that happened in 2016 is the realization that there’s really no control. The thing they were dealing with was this wildfire of informational chaos everywhere. It shattered that whole paradigm.
Issenberg: The central dynamic that drew me back to this world, and that turned into The Lie Detectives, was how those people were reckoning with the newly decentralized communications landscape as it exists now. The asymmetry that comes from your opposition not necessarily being your opponent: not somebody under the same legal and normative constraints that you are as a campaign, somebody who doesn’t have to raise money under very restrictive conditions, who doesn’t have to disclose basically everything they spend, who doesn’t have to afford the world a fair bit of visibility into how and where they communicate.
And now, whether it’s an anonymous person in their basement, or a foreign intelligence service, or somebody who’s just trying to make money off of clicks online, the campaign or the party operating under all those constraints is having to contend with an opposition that isn’t, and thus can’t be held accountable for its speech within the political system.
If your opponent lies about you, or if your opponent says something mean but true about you, the typical playbook was to try to hold them accountable before voters. You tell voters that they’re a liar. You tell voters they play dirty. You try to get newspaper editorial pages to scold them for it. There was at least the expectation that an accountability mechanism was available to you as a candidate. That’s gone now.
To some extent, that’s the meta-narrative of my book. It’s about the people who really believed they had the ability to control and shape public opinion realizing that that’s no longer the case.
Political consultants, by their very nature, can put themselves in the shoes of their opponent and guess what a given move is intended to do and what it portends. It’s like playing chess. You can put yourself in the other person’s shoes if you know that they’re also trying to get to 50 plus one on the same election day you are. If they’re trying to game programmatic ad marketplaces or create domestic disorder for other geopolitical aims, it’s very hard to anticipate why they’re doing what they’re doing.
Early on, post-2016, there was a lot of attention paid to attribution. Who was behind this? That was, at times, useful for going to the platforms—especially if there was coordinated inauthentic activity, or whatever the term they use for bots is, showing who was behind it could help get stuff taken down.
Fundamentally, what people in campaigns have learned is that the determination about when to respond shouldn’t have much to do with who the speaker is; it should have to do with your sense of whether it hits your vulnerabilities. I think too much attention early on was focused on whether it’s the Russians or supporters of my opponent. Ultimately, I think that’s a distraction from the kind of short-term tactical calculations that campaigners have to be making about the merits of a response.
Sankin: What’s the framework for responding that Jiore, and folks like her, have developed to help campaigns through this?
Issenberg: There’s a certain amount of good data that you need to make these decisions, but fundamentally it’s a pretty straightforward cost-benefit analysis. The costs of responding to a negative claim about you online, whether it’s true or false, are significant. That includes the Streisand Effect problem, a uniquely digital thing, in which you can draw attention to something that isn’t getting a lot of attention just by speaking out about it.
There’s cognitive science research that points to the fact that if you don’t go about debunking a claim in the right way, you can end up reinforcing it in the minds of people who hear it. The nature of social media platforms is that they reward engagement. Trying to fact-check, or respond to, or dunk on a lie can often end up just helping to spread it.
Campaigns have only a limited capacity to talk about a certain number of things every day. If you spend all day responding to what other people are talking about, you never get the opportunity to communicate proactively on the issues and topics that you care about.
If you are a public figure, or public institution of any kind, a celebrity, a company, a university, a sports team, people are gonna be lying about you and the issues you care about on the internet every day. Ninety-nine percent of the time responding to them will only make your problem worse. The challenge is figuring out what the one percent is and formulating a proportional response.
There was this natural bias towards action that Jiore was up against. It’s partly ingrained in political consultants who are paid to tell candidates what to do. I think communications consultants do not feel like their value comes from telling people not to talk.
But then I think it was also reinforced by a particular mentality of the Trump-era American left, which was this real feeling of “we need to do something.” As this got caught up in conversations about Russia, about the future of our democracy, about the threat of the pandemic, the weightier the subject matter, the more the natural instinct of people on the front lines was to say, “well, we need to respond.”
Jiore helped formulate these decision-making matrices that were very simple, which were basically: Is this reaching a lot of people? And is it likely to change their views about you or the issue you care about?
You need good data to be able to intelligently answer those questions. But once you do, the logic of it is clear: only if stuff is scoring high on both those measures do you even need to think about whether to respond. Conditioning politicians and campaign decision-makers to confidently sit on their hands was really difficult. It was difficult as a matter of culture; it was not a challenging technology problem.
Sankin: You framed this issue around deciding which one percent of internet flame wars is worth getting involved in. That’s a universal problem, not just for political campaigns. Any person or institution that exists online is going to be hit by this. And the overwhelming majority of folks who are gonna be hit with it, and are going to be thinking about making this decision, are not institutional actors. They’re just people.
At the same time, a campaign may be an institution, but it’s also the representation of a person. There was a section in your book that detailed a gathering of officials talking about the disinformation they’d experienced, and they got really emotional. It was personal to them. These were lies that were spread about them, about their families, about their loved ones, about the thing they care about the most in the world.
The hard part, on a personal level, for someone trying to be rational about this is the same regardless of whether they’re a candidate for national office or just a random person who sees someone in a local community Facebook group lying about what they’re doing with their front lawn. It still feels personal and, because of that, it feels like you have to respond.
Issenberg: I think it’s all a question of proportion.
We had a reasonable sense of proportion for the pre-digital versions of this, which is one crazy guy at the end of the bar spouting off. That maybe merits an eye roll, but you’re not going to engage. If one crazy person at the end of the bar is criticizing you or your spouse, you certainly wouldn’t go on the local radio station to respond. That would seem like a wildly disproportionate and probably counterproductive thing to do.
If 500 people are at the local park holding signs saying that about you or your spouse, then you start to think, “okay, well, maybe it’s time to go on the radio station or buy a newspaper ad.”
The first thing Jiore does when she gets hired by one of these campaigns is what she calls a “landscape analysis,” mapping the networks around a given topic or subject area. Which accounts are linked? How many followers do they have? What subjects do they traffic in? What is the standard number of follows or retweets or shares their content gets? To create a baseline set of expectations for understanding what things look like on a normal day. Is this one crazy guy at the end of the bar or are these 20 crazy guys at the end of the bar having a crazy guys at the bar meeting?
She gets paid good money to develop that analysis and provide it to clients. That’s the basis for being able to say, “No, this is not abnormal. These people are always talking about chemtrails. This is just the way the internet works.”
When you drop into a conversation because it’s newly of interest to you, you don’t have those cues. So, is ten shares on this a lot? Is a hundred shares on this a lot? Proportion is really the thing campaigns struggle with.
Sankin: That makes me think about how, for like half a century, we had a bunch of big media outlets handling most communication, one-way. Campaigns needed to know what the big outlets were and then what the local outlets were. Those were the communication vectors.
That landscape was a lot more stable. It makes sense that when you have information pathways that are not obvious and change fluidly, you need a different level of expertise. When a campaign sees someone screaming about them online, it’s harder to immediately identify whether that guy matters or not. It’s a shift away from what was, historically speaking, a pretty unique set of circumstances that created that media stability.
Issenberg: The United States having a centralized media that aspired to a certain type of neutrality was a really distinctive phenomenon, globally. All of our political professionals were raised in this environment and took it for granted.
If you worked in American politics, you didn’t have to think about word of mouth as a meaningful vector for communications. Now, you do. The difficulty is, if you think of it as an internet problem and not a word of mouth problem, you get distracted by the novelty of it.
The thing about Jiore is she was looking at these countries where they had encountered many of these dynamics before the internet. So when those dynamics moved online, it was just a natural extension of the way politics already worked, just with some slightly new communications tools. Not a paradigm shift in what constitutes politics.
Don’t forget to vote,
Aaron Sankin
Investigative Reporter
The Markup