Hello, friends,
Over the past two years, Facebook has been quietly building its own version of a Supreme Court—the Oversight Board—and this week the board finally announced its first rulings.
Facebook’s creation of an oversight board is an unprecedented legal experiment on a global scale. The board is essentially an appeals court that users can go to if they feel that their content has been removed unfairly from Facebook or Instagram. Although board members are chosen and paid by a trust operated by Facebook, they will in theory maintain independence in their decision-making.
In its first rulings this week, the board made a statement, overturning Facebook’s decisions in four of the five cases. One ruling required Facebook to restore a post that the user said was meant as a critique of the infamous Nazi propaganda minister Joseph Goebbels but that Facebook had deleted under its policy of removing content that could be seen as lionizing “dangerous individuals.”
In the next few weeks, the board will begin to deliberate on a high-stakes case: whether to keep former president Donald Trump banned from Facebook’s platforms. Facebook suspended Trump’s account after his supporters staged an attack on lawmakers on Capitol Hill on Jan. 6. Last week, Facebook asked the board to examine its decision to indefinitely suspend Trump.
To understand the unprecedented role of this new world court, I spoke to Kate Klonick, an assistant professor of law at St. John’s University School of Law, who spent 18 months following the internal team at Facebook tasked with setting up the oversight board so she could document that process. In July, Klonick published a paper in the Yale Law Journal examining the legal and societal implications of what she calls a novel “public-private partnership to govern online speech.”
The interview is below, edited for brevity.
Angwin: Let’s start with this week’s rulings from the oversight board. All but one of them restored posts that Facebook had deleted. So in some sense this board feels like the opposite of the Right to Be Forgotten, which allows people to have search results about themselves removed from Google. After all, who is going to appeal if content stays up on Facebook?
Klonick: It absolutely does have a bias toward restoring content. Content that stays up is not even appealable to the board. The only way that content staying up could be an issue is if Facebook asks for a ruling on it.
There has been a trend in the last few years to remove more content, and the removal—especially during the pandemic—is increasingly automated. So there are a lot of false positives.
People are so frustrated. I get messages every day. People are saying, “I got banned and I don’t know why.” It’s not even about speech. It’s more like the right to assembly when, during this time of COVID, these platforms are literally our only way of socializing.
The case they decided about Goebbels is a great example. Facebook used an automated tool to take down anyone who quotes a Nazi leader. Facebook just doesn’t care at this point about the 99 dolphins in the “incitement to violence” net; they care about the one shark. That’s a little bit of what’s happening here, and I think the oversight board is telling us that this Goebbels post was one of those 99 dolphins.
Angwin: Critics say that Facebook shouldn’t police itself—and that the board is not independent enough. As you have written, it is clearly a bid to stave off regulation*, at a time when regulators are seeking to break up Facebook. How independent do you think the board can be? In your paper, you say that users might be expecting “democratic accountability,” but “a more realistic outcome is participatory empowerment.”
Klonick: I think that they have gotten a lot of very fancy and very impressive people to stake their reputation on the development of this institution, and they are not going to turn into patsies for Facebook. I think that is huge because you can write all the documents in the world, but at the end of the day it’s humans all the way down. You have to have people who will insist on their independence.
The bigger question is: even if the oversight board is exactly what it says it is, what happens if it still doesn’t create a better set of rules than Facebook? The simplistic takeaway is that these are genuinely impossible decisions, and what may happen over time is that this creates a more public dialogue. At least it’s not a secret process happening inside these companies. There will be a period of civics and public education that comes out of this, and maybe eventually people will have an understanding of what the tradeoffs are, but I think that’s just going to take time.
Angwin: As you know, there is a group that has set itself up as the Real Facebook Oversight Board, arguing that the board’s powers are too limited and that it doesn’t scratch the surface of Facebook’s larger problems. What do you think about the scope of Facebook’s Oversight Board? Is it too narrow?
Klonick: Let’s go back to the Goebbels ruling. In the U.S., it’s very clear that if you quote a Nazi, then, unless you are in one small crazy circle of people, it will probably be seen as criticism of Nazis. But it’s often impossible for the machines to discern that.
There’s no one global rule that will work perfectly to take down Nazi quotes. Just like there’s no one global rule about taking down a picture of a naked girl running down the street after being sprayed by napalm. [Referring to Facebook’s removal of Nick Ut’s iconic “The Terror of War” photo, which depicts a prepubescent girl running naked down the road following a napalm attack on her village during the Vietnam War. Facebook’s removal caused an international furor and resulted in the photo being reinstated on the site.] For one group, that picture brought them into the horror and harm of a foreign war; for another group of people, it might be a horrific triggering of a harm. One person’s insight into an atrocity is another person’s reliving of it.
The thing that I think is productive is that the board’s rulings will rely more on international human rights law and the Santa Clara Principles than on Facebook’s own community standards, which were just a bunch of rules people made ad hoc.
You’ve written about this, Julia; you know this. The whole system has been about just putting out fires. We’re honestly lucky the rules ended up as well as they did. But now we have a moment where people deservedly want an explanation of what the rules are and how they broke them. That seems to me not easy, but ultimately a scalable concept: tell people specifically what rule they broke and apply that rule to the facts of their post.
Angwin: One of the big inequities in content moderation that you have discussed a lot is the fact that celebrities and public figures are often able to go directly to people they know at Facebook to dispute Facebook’s decisions about their content. Does this board help level the playing field for the average user?
Klonick: Not as much as I think it should. I don’t understand why people don’t talk about this more.
I do this show every day at five with Ben Wittes, and the day after Jan. 6, he promoted our show with a description about the insurrection that included a QAnon hashtag. And Facebook took down our page and his page. But because he and I have contacts at Facebook and he has 400,000-plus followers on Twitter, we were back up within half an hour. This is such an inexact and inequitable system of justice.
I’d like to eventually see that kind of inequity in the process addressed. Right now, we simply don’t know where that will go. The board has only taken six cases plus the Trump suspension. Those cases deal with really hard issues and have created a record of interest groups weighing in. But only four of those first seven cases came from users; the rest were referred by Facebook. I think this board might not end up being able to address that equity issue. And maybe it shouldn’t, because those are, in part, massive issues of scale.
Angwin: What happens if a board ruling flouts the law in a particular country?
Klonick: The board is not allowed to take a case in which its ruling would contradict local law.
Angwin: You have envisioned scenarios where other companies set up similar boards or even take their cases to this board, making it a global courtroom unlike any other. What would it mean for global freedom of expression to have a privately run and funded world court system?
Klonick: I think that’s the most mind-blowing thing about this.
Imagine a world in which a bunch of other boards exist and this is how we adjudicate speech and it takes hold and people talk about the oversight board—“Did you hear that Oversight Board decision?”—in the same way that they talk about the Supreme Court or their governments.
It is a totally crazy thing to have an essentially private world court governing public rights on an independent private corporate platform. I truly think this might be a new beginning, a tipping point into a new permanent structure of power for these tech companies. It is a power we’ve never exactly seen before, in which these transnational companies control the communication infrastructure around the globe. They are as powerful as countries, or more powerful. I think that’s the reality that most people haven’t yet truly wrapped their minds around.
*“Regulation” originally said “self-regulation” and has been corrected here.
As always, thanks for reading.
Best,
Julia Angwin
Editor-in-Chief
The Markup