Hello, friends,
This week, dozens of media outlets have published investigations providing fresh details about how Facebook repeatedly overlooked internal research and concerns about harms generated by its platform. This new effort, dubbed The Facebook Papers, builds on and expands The Wall Street Journal’s Facebook Files series, using documents provided by whistleblower Frances Haugen.
(If you’re having a hard time keeping up with The Facebook Papers articles, I stumbled on this extremely helpful Google doc where someone is compiling all of them.)
As the tsunami of stories flooded screens this week—culminating in an end-of-week announcement by Facebook CEO Mark Zuckerberg that the company was changing its name to Meta—I began thinking about how The Markup brings a unique perspective to the Facebook coverage.
Consider, for instance, just one document in the massive leak and how Facebook handled it: the internal Facebook report on “Carol’s Journey” (well reported by Brandy Zadrozny at NBC News), which showed how Facebook’s recommendation algorithms pushed conspiracy theories to a fictitious new user named Carol within two days of her joining the platform.
It wasn’t news that Facebook’s recommendation algorithms, particularly the ones that suggest groups to join, were a pathway to radicalization. Critics warned back in 2018, when Facebook changed its algorithm to boost group recommendations, that polarizing content would flourish.
Frederic Filloux, editor of Monday Note, wrote at the time, “This vision could backfire terribly: an increase in the weight of ‘groups’ means reinforcement of Facebook’s worst features — cognitive bubbles — where users are kept in silos fueled by a torrent of fake news and extremism.”
Around the same time, security researcher Renee DiResta told BuzzFeed, “The groups recommendation engine is a conspiracy correlation index.”
But what Haugen’s documents—with their internal studies like Carol’s Journey—show is that researchers inside the company were aware of and worried about the radicalization potential of the group recommendation engine. Yet Facebook’s official response to NBC’s reporting was a bit of a shrug: It was “a study of a hypothetical user,” the company told NBC, adding that it had since made many changes to curtail group recommendations, including no longer recommending political groups.
That’s where The Markup comes in. As a newsroom that uses the scientific method as our guiding light, we don’t take Facebook officials or critics at their word about whether anything on the social networking platform is fixed or broken. We persistently monitor Facebook’s platform to see what is actually happening in real users’ news feeds.
And what we have found is that, over the past year, Facebook’s group recommendation system was doing exactly what internal and external researchers had warned about. Since December, our data shows that Facebook has repeatedly pushed partisan political groups to its users—even after promising Congress, in the run-up to the tumultuous presidential election, that it would stop the practice.
In January, The Markup’s Leon Yin and Alfred Ng reported that Facebook was recommending political groups—and that partisan political groups were being most heavily pushed to the Trump voters on our panel. In February, Facebook told Congress that “technical issues” were to blame for its failure to curb political group recommendations.
According to documents provided by Haugen to Congress and shared with The Markup, the day that our article was published, a Facebook team started investigating the “leakage,” and the problem was sent to the highest level to be “reviewed by Mark.”
The documents reveal that the problem was caused in part by a technical hiccup: The group recommendations served to users were cached on servers and on users’ devices and, in some cases, updated only every 24 to 48 hours. The lag resulted in users receiving recommendations for groups that had recently been designated political, according to the logs.
At the same time, the documents showed that Facebook was defining groups as political by looking at the last seven days’ worth of content in a given group. That meant that if a group went seven days without posting political content, it would come off the blacklist—leaving the list constantly “churning” as groups cycled on and off.
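To make those two mechanics concrete, here is a minimal sketch in Python of how a stale recommendation cache and a seven-day labeling window can interact. Everything in it is an assumption for illustration: the class names, the 24-hour TTL, and the data shapes are mine, not Facebook’s actual code. It simply shows how groups that were recently designated political can “leak” through cached recommendations while the political label itself churns.

```python
# Illustrative sketch only: hypothetical names and numbers, not Facebook's systems.
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Dict, List

CACHE_TTL = timedelta(hours=24)        # recommendations refreshed only every 24-48 hours
POLITICAL_WINDOW = timedelta(days=7)   # "political" label based on the last 7 days of posts


@dataclass
class Post:
    created_at: datetime
    is_political: bool


@dataclass
class Group:
    name: str
    recent_posts: List[Post]


@dataclass
class CacheEntry:
    groups: List[Group]
    generated_at: datetime


def is_political(group: Group, now: datetime) -> bool:
    # A group counts as political only if it posted political content in the
    # past seven days, so the label churns as groups cycle on and off the list.
    return any(
        p.is_political and now - p.created_at <= POLITICAL_WINDOW
        for p in group.recent_posts
    )


def recommend(user_id: str, groups: List[Group],
              cache: Dict[str, CacheEntry], now: datetime) -> List[Group]:
    # Serve cached recommendations while they are inside the TTL, even if a
    # cached group has since been designated political -- the "leakage"
    # described in the documents.
    entry = cache.get(user_id)
    if entry and now - entry.generated_at <= CACHE_TTL:
        return entry.groups
    fresh = [g for g in groups if not is_political(g, now)]
    cache[user_id] = CacheEntry(groups=fresh, generated_at=now)
    return fresh
```

Under those assumptions, a group that turns political an hour after a user’s recommendations are cached keeps being served for up to a day, and a group that merely goes quiet for a week drops off the blacklist entirely.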
Despite these internal conclusions, six months after our original reporting, Facebook’s algorithm was still pushing political groups.
Responding to our June report, Facebook said it would investigate why its automated system failed to identify the political groups we found but argued that they were shown to only a tiny fraction of users. “Even if every group they flagged should not have been recommended, it would represent just 0.2% of the total groups recommended to ‘Citizen Browser’ panelists,” Facebook spokesperson Kevin McAlister told us.
And that is really at the crux of the question posed by these leaks: How many people are harmed when Facebook amplifies toxic content or fails to follow through on its promises? And are we as a society willing to bear the consequences of that error rate?
The Facebook Papers provide evidence that executives inside Facebook knew they were causing harm. The Markup’s Citizen Browser panel provides a window into how widespread that harm was.
I think of our approach as similar to public health reporting. We are out on the virtual streets trying to determine the scope of the misinformation pandemic surging across Facebook’s platform. Meanwhile, The Facebook Papers are reporting from inside the control room where Facebook’s overwhelmed staff tries to spot and put out the fires raging across its platforms.
Both views are valuable. And we are proud to be able to use our technical expertise to bring accountability to Facebook’s platform in a way that has never been done before.
As always, thanks for reading.
Best,
Julia Angwin
Editor-in-Chief
The Markup