2020 in Review

Facebook Had an Anxiety-Producing 2020

Despite financial success, the company was plagued by content moderation struggles, internal dissent, and regulatory threats

Niall Carson/PA Images via Getty Images and Getty Images

In a July hearing before the House Judiciary antitrust subcommittee, Facebook CEO Mark Zuckerberg defended some of his company’s more controversial practices. 

“Congressman, I think we have a responsibility to, uh, limit the spread of content that’s going to be harmful for people. Also, I’d like to add that I do not believe that we have any incentive to have this content on our service. People don’t like it, business—” Zuckerberg began.

But subcommittee chairperson Rep. David Cicilline (D-RI), who would later spearhead the committee’s damning antitrust report on Facebook and other tech companies, quickly cut him off. 

“Mr. Zuckerberg, with all due respect, it is often the most engaging,” Cicilline said of the unwanted speech. “It brings the most likes, or it brings the most activity, which of course, produces great profit. So, you do have an incentive: The more engagement there is, the more money you make on advertising.”

Advertising accounted for 90 percent of Facebook’s 2019 profits. Through Sept. 30, 2020, Facebook had taken in nearly $58 billion in revenue for the year, nearly $57 billion of it from advertising alone, leaving it on track to surpass 2019’s more than $70 billion in total revenue. In the third fiscal quarter alone, the company posted a 22 percent year-over-year rise in total revenue, a jump many attribute to people spending more time on social media during the coronavirus pandemic.

And so went 2020 for the social media giant: big profits, along with big questions about how it runs its business. Whether it was concerns about how it regulates hate speech and misinformation or federal lawsuits over how it approaches competition, Facebook faced challenges from nearly every direction: competitors, elected officials from both sides of the aisle, hate speech watchdogs, and state and federal prosecutors. 

A Colossus of Misinformation

From QAnon to coronavirus hoaxes to Holocaust denial, the largest social media platform in the world took unprecedented heat over its struggle to properly regulate toxic discourse and misinformation.

As recently as April, according to its own ad portal, Facebook was letting advertisers target ads at more than 78 million people it had categorized as interested in “pseudoscience,” a grouping that included conspiracy theories about 5G, COVID-19, and chemtrails. Facebook eliminated the pseudoscience category after The Markup reached out for comment.

In October, Facebook announced it would ban QAnon theories from the site, but just days afterward, The Markup found an advertisement on the platform (which was subsequently removed) linking to a Facebook page featuring popular QAnon videos. 

Also in October, Facebook announced it would no longer permit any content that “denies and distorts the Holocaust,” but The Markup found numerous active Facebook pages belonging to well-known Holocaust denial groups. Some of those pages were also subsequently taken down.

But worries about the 2020 elections loomed largest. 

In anticipation of vitriolic campaigning and a contested presidential result, Facebook cut off political advertising weeks before the election and quietly stopped “recommending” political groups. Instagram also removed the “recent” tab from hashtag pages for U.S. users in the final days before Nov. 3 to slow the real-time spread of potential election misinformation, then restored it in early December. Facebook has since lifted the political ad ban, but only in Georgia, ahead of that state’s Senate runoff races. The company did not immediately respond to a question about whether it has resumed suggesting political groups.

In the end, the platform avoided a major November disaster, though lawmakers told Zuckerberg that the company had not done enough to squelch misinformation throughout the long campaign season. An October company report noted that Facebook had removed three networks for “coordinated inauthentic behavior,” two of which targeted U.S. politics and the U.S. election.

Zuckerberg and his wife, Priscilla Chan, also personally donated $400 million to nonprofits that provided local election offices with staff, ballot-scanning machines, protective gear, and rental space in the wake of severe funding shortages.

Employee Dissent Grew

Facebook has faced internal unrest before, but nothing like what it saw in 2020. 

Facebook execs have been stuck in the middle of a tug of war between users who demand unfettered speech and others who demand that the company clamp down on hate speech and misinformation—and that latter group includes a lot of employees and some advertisers.

In June, hundreds of Facebook employees staged a walkout to draw attention to the platform’s light handling of President Donald Trump’s posts, leading to decreased employee satisfaction and the firing of one Seattle-based engineer.

In September, a leaked memo written by a fired Facebook employee, Sophie Zhang, alleged widespread misuse of the platform by politicians and political groups around the world looking to make political gains. Another internal memo called upon Facebook’s leadership to address an alleged pattern of favoritism toward hard-line Hindu politicians in India. And despite Facebook’s election preparations (however belated), some Facebook employees found its attempts to stop hateful speech insufficient and have quit.

A former Facebook elections ads integrity lead, Yaël Eisenstat, told BuzzFeed that the company’s content moderation policies were often inconsistently adjusted to respond to political or public relations factors. “They put political considerations over enforcing their policies to the letter of the law,” she said. 

Moderators on the platform, meanwhile, who are often contractors, have called on the company to require safer work conditions during the pandemic. 

Employee discontent has also simmered in company meetings. In August, for instance, following a shooting in Kenosha, Wis., employees questioned Zuckerberg over the company’s handling of militia and conspiracy pages on its platform.  

In leaked internal comments and audio, Facebook employees also recently criticized the company’s rising tension with Apple over Apple’s plan to have iPhone users explicitly opt in to being tracked across apps. Some employees saw Facebook’s publicity campaign against Apple’s policy change, which Facebook claims will hurt small businesses that use trackers to target potential customers, as self-serving. “It feels like we are trying to justify doing a bad thing by hiding behind people with a sympathetic message,” one engineer wrote in an online company forum.
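For a sense of what that opt-in involves, here is a minimal sketch, in Swift, of the system prompt an iPhone app would have to trigger under Apple’s AppTrackingTransparency framework before tracking a user across apps; the function name and log messages below are illustrative, not drawn from any Facebook code, and a real app would also need an NSUserTrackingUsageDescription entry in its Info.plist.

    import AppTrackingTransparency
    import AdSupport

    // A minimal sketch (not Facebook's code): ask the user whether this app
    // may track them across other companies' apps and websites.
    func askPermissionToTrack() {
        ATTrackingManager.requestTrackingAuthorization { status in
            switch status {
            case .authorized:
                // The user opted in: the advertising identifier (IDFA) used
                // for cross-app ad targeting is available.
                let idfa = ASIdentifierManager.shared().advertisingIdentifier
                print("Tracking allowed; IDFA: \(idfa)")
            case .denied, .restricted, .notDetermined:
                // The user opted out (or never decided): the IDFA comes back
                // zeroed out and cross-app tracking is off the table.
                print("Tracking not allowed")
            @unknown default:
                print("Unrecognized authorization status")
            }
        }
    }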

Regulators Woke Up in 2020

In the waning weeks of 2020, Facebook was hit with two separate antitrust lawsuits. 

The first, filed by the Federal Trade Commission in the U.S. District Court for the District of Columbia, seeks, among other remedies, to undo the allegedly anticompetitive acquisitions of Instagram and WhatsApp.

“Facebook fell back on the philosophy that ‘it is better to buy than compete,’ ” the FTC complaint alleged. 

Another suit, filed in the same court by the attorneys general of 46 states, plus Guam and Washington, D.C., similarly seeks to stop alleged anticompetitive behavior and break the company up.

“Facebook’s monopoly gives it significant control over how users engage with their closest connections,” the states’ complaint read, “and what content users see when they do…. Users of personal social networking services have suffered and continue to suffer a variety of harms as a consequence of Facebook’s illegal conduct, including degraded quality of users’ experiences, less choice in personal social networks, suppressed innovation, and reduced investment in potentially competing services.”

Facebook has called the suits “revisionist history” and “not how the antitrust laws are supposed to work.”

An antitrust suit filed against Google by 10 state attorneys general in the U.S. District Court for the Eastern District of Texas alleges that Facebook illegally colluded with Google on advertising and that they also agreed to help each other in the event of antitrust scrutiny.

Meanwhile, European regulators sent a letter to Facebook ordering it to stop sending European user data to the United States out of concern about how the U.S. government might use that data. Although Facebook threatened to leave the Continent altogether over the order, it has not done so. In the meantime, it has also been forced to disable some Messenger and Instagram features for European users in order to comply with other EU privacy regulations.

Existential Threat

A potential legislative threat looms for the company as well: calls to eliminate Section 230 of the Communications Decency Act, which tech companies argue gives them broad protection from liability for harmful third-party content posted on their platforms, escalated in 2020.

President-elect Joe Biden and other Democrats have called for the repeal of Section 230 protections, and despite data showing that conservative news articles are favored on Facebook, President Trump, Sen. Ted Cruz (R-TX), Sen. Josh Hawley (R-MO), and other Republicans have also attacked the legal protection in tweets, hearings, an executive order, and proposed legislation. Sen. Mitch McConnell (R-KY) is now attempting to tie the approval of $2,000 stimulus checks to a repeal of Section 230.

Facebook and other platforms have resisted such proposals.

“[T]his would penalize companies that choose to allow controversial speech and encourage platforms to censor anything that might offend anyone,” Liz Bourgeois, a Facebook spokesperson, said at the time of Trump’s executive order, which ended up having little legal authority.

If Section 230 were somehow repealed, experts say, Facebook and other internet platforms might have to screen any and all uploaded content to avoid potential liability for harmful speech, much as a newspaper does. That would massively change their current business models and potentially dent their popularity and profits.

Most observers don’t expect a repeal to move forward anytime soon, but Zuckerberg, unlike other tech titans, has expressed support for reevaluating the law. Another bipartisan proposal, which Zuckerberg himself seemed amenable to in a recent earnings call, would require transparency and standard processes around how and what content platforms remove. “I think a system like that, that basically requires companies to meet certain thresholds or show improvement, basically aligns incentives in the right way,” he said.
