
The Breakdown

What Does Facebook Mean When It Says It Supports “Internet Regulations”?

Since companies typically try to avoid regulation, Facebook’s support for it is weird. Or is it? There’s more to the company’s proposal than meets the eye

A winking cartoon bill sits on steps, wearing a sash with the Facebook logo on it. Illustration: Gabriel Hongsdusit

Erin Simpson, associate director for technology policy at Washington, D.C.–based think tank The Center for American Progress, had a message for Facebook. Last year, she co-wrote a pair of reports advising major social media platforms, like Facebook, on how to better address the twin crises of election delegitimization and health misinformation.

But Facebook has also been pushing a message at her. 

For more than a year, Simpson had been noticing ads pop up with the same phrasing: Facebook “supports updated internet regulations,” the ads always said. 

The ads appeared in podcasts and on TV. A few months ago, when she went to the grocery store to make a copy of a key, Simpson was startled to see that even the key copying machine was flashing the message. 

“These ads follow me everywhere,” Simpson said.

“How is it here? I’m at Safeway and it’s on the key copying machine. It’s taunting me.”

For years, Congress has debated how to best regulate the tech giants that hold an increasing amount of control over human communication. But no comprehensive legislation dictating their behavior has been passed in a generation. According to a slew of ads like the one Simpson saw, along with public statements, blog posts, op-eds, and congressional testimony from CEO Mark Zuckerberg, Facebook would very much like to change that. The company, which spent $2.26 billion on advertising last year, has maintained a consistent public message: Facebook is not the one slowing down internet regulation in this country, and the company would certainly welcome reasonable curbs on its power.

Key copying enthusiasts and members of Congress aren’t the only targets for this message. Facebook has peppered its own platform with a PR campaign aimed at users. 

The Markup has seen the ads turn up repeatedly on the Facebook feeds of our Citizen Browser panel, a group of some 1,800 users who automatically share their news feed data with us. Of the 1,000 most common ads appearing on our panelists’ feeds between Nov. 1, 2020, and Aug. 27 of this year, just one ad was from Facebook itself: “It’s been 25 years since comprehensive internet regulations were passed,” the text of the ad says. “It’s time for an update.”

Targeting information collected by Citizen Browser showed the ads were aimed at users interested in “politics” and, where additional ideological targeting was available, largely at “people in the USA who are likely to engage with liberal political content.”

But why is Facebook pushing for “regulations” that would curb its own power, and what exactly are the regulations it supports?

The ads themselves are not explicit, but proposals and pieces of proposals laid out on Facebook’s regulations hub, which is linked in the ads, as well as in op-eds and congressional testimony from Zuckerberg, offer some insight into what Facebook wants. Facebook also provided additional materials directly when The Markup reached out for information.

As long as everyone is focused on user content … we are not talking about advertising. We are not talking about the money.

Nathalie Maréchal, New America Foundation

In short, Facebook’s proposals would largely turn content moderation systems the company has previously put in place into legal requirements, potentially forcing competitors to do work Facebook has already done. While Facebook does, in some cases, ask for guidance on how certain thorny questions should be handled, most of its proposals seem unlikely to fundamentally change how the company itself does business.

For skeptics of Facebook’s efforts, like Nathalie Maréchal, who works on platform accountability at the New America Foundation’s Ranking Digital Rights project, the biggest problem with Facebook’s proposals is what they don’t mention—the nonpolitical advertising that is the core of Facebook’s business.

“The bigger picture here is that as long as everyone is focused on user content and all of its discontents, we are not talking about advertising. We are not talking about the money,” Maréchal explained. “It’s a PR blitz to get lawmakers and other policy-adjacent people to believe that Facebook is not standing in the way of change.”


Why Is Facebook Saying It Wants Regulation?

Surveys show Americans are broadly supportive of imposing new regulations on social media platforms. For example, a recent Morning Consult poll found that more than three-quarters of Democrats and more than half of Republicans would back legislation holding platforms accountable for spreading COVID-19 misinformation. 

Lawmakers around the country have taken notice of that sentiment. Thirty-one state-level bills regarding digital privacy were introduced this year. Lawmakers in 18 states have also introduced bills this year that would penalize social media platforms for taking down political content, most of them coming after former president Donald Trump was suspended from the major platforms for provoking violence in the wake of the 2020 election.

There’s also been action at the federal level, with legislation like Sens. Amy Klobuchar and Ben Ray Luján’s Health Misinformation Act or Sens. Brian Schatz and John Thune’s Platform Accountability and Consumer Transparency (PACT) Act. Last year, the House Antitrust Subcommittee issued a blistering report calling out Facebook for anticompetitive practices, and the Biden administration has pursued ongoing investigations into the company’s conduct.

Regulation, it seems, is in the air. Facebook, experts say, has a strong incentive to get out in front of it while also steering the conversation about new rules to the areas where it’s comfortable seeing tighter government controls and away from areas where it isn’t.

“Facebook is just reading the tea leaves,” Simpson said. “And they’re trying to put themselves in a position where they think they would have some credibility in the legislative conversation by saying that they’re interested in policy—not that they’re reflexively against it.” 

Facebook is just reading the tea leaves.

Erin Simpson, The Center for American Progress

When The Markup reached out to Facebook asking for details on its regulatory proposals, spokesperson Rachel Holland replied with a statement highlighting how Facebook developed its moderation processes. 

“Facebook regularly works with external stakeholders when developing our Community Standards to balance competing interests, such as freedom of expression, safety, and privacy,” Holland said. “Depending on the topic, we will continue to work with subject matter experts, civil society organizations, and law enforcement, among others, to develop robust content policies that keep communities safe and apply to a global user base.” 

Facebook isn’t the first technology company to publicly push for greater regulation. Microsoft, for example, has been a major supporter of tighter regulations on the use of facial recognition technology. A law passed in the company’s home state of Washington last year regulating facial recognition technology was sponsored by a lawmaker who literally worked for Microsoft.

Facebook spent nearly $20 million lobbying the U.S. government last year, making it the sixth-highest spender of any organization. However, Sen. Mark Warner, who has introduced several bills about regulating big tech platforms, said Facebook hasn’t been doing a lot of the things savvy corporations do when they want their favored legislation passed. 

“There’s no doubt that Facebook, like other industry giants, knows exactly what it takes to get a bill passed and signed into law. Unfortunately, despite a pervasive ad campaign designed to tout its support for updated internet regulations, Facebook’s proactive engagement has been limited and it has continued to sponsor a number of groups focused on undermining regulatory proposals,” Warner told The Markup. “As we continue to try to bring internet law into the 21st century, I invite Facebook to engage meaningfully with my office on this complex issue.”

Facebook’s campaign is more than political; it’s public relations, said Matt Navarra, a U.K.-based social media consultant who produces a newsletter and podcast about social platforms. Facebook’s decision to go directly to the public with its legislative message, in addition to its lobbying, he said, is part of a larger shift in brand marketing, largely enabled by social platforms like Facebook.

“It wasn’t so many decades ago that if a company had a big issue it was trying to influence, it would just stick to the usual routes of lobbying ministers and making their products seem super amazing,” Navarra explained. “Now it’s easy to build a community around a cause and activate people. We’ve seen a shift in how people communicate and engage with each other because of those platforms, and that’s led to a shift in why companies want to reach consumers with these messages.”


What Exactly Is Facebook Proposing?

Facebook’s ad campaign centers on the Communications Decency Act. The 1996 law includes a provision, Section 230, that generally shields websites from liability for content posted by their users and provides the legal basis platforms use to justify their content moderation policies. Section 230 is widely considered the guiding legal principle of the modern internet; since many people across the political spectrum have qualms about the modern internet, the provision has become something of a flashpoint.

Facebook’s ads, which often at least obliquely reference Section 230, wisely don’t mention many of the specific reasons people might be upset enough with Facebook to desire new rules governing its conduct. Instead, the ads focus on the law’s age. The quarter-century since its passage is an eternity in internet years. 

In testimony before Congress in March, Zuckerberg proposed that the law should be updated to require tech platforms to have certain content moderation policies and systems in place to catch “unlawful content and activity on their platforms.” If those systems exist, platforms can continue to enjoy all the protections—meaning essentially zero liability for anything that’s posted on their platforms—of Section 230. 

He went on to ask Congress to pass legislation to “bring more transparency, accountability, and oversight to the processes by which companies enforce their rules about content that is harmful but legal.” 

While Zuckerberg’s testimony doesn’t go into detail, the proposals in Facebook’s public-facing policy documents, conveniently, mostly happen to be things the company has already put in place following public pressure. Some are things already required of the company by laws like the European Union’s General Data Protection Regulation (GDPR) or the California Consumer Privacy Act (CCPA).

These processes, outlined here, include:

  • Allowing users to report problem content and having a content moderation program in place—both of which Facebook already does.
  • Having some form of external body evaluate content moderation standards and also provide oversight on enforcement decisions. Facebook already has an oversight board.
  • Publishing content moderation standards, which Facebook began doing after those standards were leaked to The Guardian.
  • Providing transparency into how content moderation standards are set, which Facebook has done since 2018 by publishing the minutes of the internal meetings where its content policy is set (although those minutes were eventually replaced by glossy, less informative slide presentations).
  • Giving users input on content rules, which Facebook at least ostensibly does through its “Stakeholder Engagement” process.
  • Allowing affected users to appeal content moderation decisions, a process Facebook implemented in 2018.
  • Requiring responses to all user-initiated content reports; Facebook users can already track the status of their reports on the platform.
  • Notifying users when their content is removed, which Facebook does.
  • Publicly disclosing aggregate content enforcement numbers, which Facebook has been doing quarterly for years, with companies potentially facing more oversight or even sanctions if the prevalence of offending content on their platforms hits a certain threshold.

The documents in Facebook’s regulation hub also spell out a handful of things Facebook doesn’t want to see in new rules. 

The company says it is against requiring platform moderators to pre-approve user content before it can be posted, short time frames for taking down content, the criminalization of misinformation, and any kind of specific product design mandates. Facebook also says that, ideally, any content moderation requirements should scale up with size, giving smaller platforms more leeway than larger ones. 

Facebook singled out existing proposals for regulating political ads to combat foreign interference—the DETER Act and the Honest Ads Act—as ones it supports, although, as Quartz reported, Facebook simultaneously claimed to back the Honest Ads Act in public while lobbying against its passage in private.

The company favors passing federal privacy legislation modeled very closely on privacy rules already on the books in Europe. “I believe it would be good for the internet if more countries adopted regulation such as GDPR as a common framework,” Zuckerberg wrote in a 2019 Washington Post op-ed.

In that same op-ed, he supported requiring platforms to have data portability, which is the idea that, as Zuckerberg wrote, “if you share data with one service, you should be able to move it to another.”

An element of data portability is mandated as part of the GDPR and the CCPA, and in the op-ed, Zuckerberg highlighted Facebook’s support for the Data Transfer Project, an open-source initiative creating a common framework for sharing user data among online service providers. Facebook has a tool that lets users port things like their photos and posts but doesn’t currently offer the ability to move users’ friends lists to other services, which some have called for as a way to more meaningfully increase competition.

In addition to requiring some measure of data portability, Facebook said it hopes the government provides guidance on some of the privacy issues surrounding moving data across platforms, particularly when moving one person’s data also involves transferring information about other users. For example, what happens if someone ports their friends list from a social network where people can go by pseudonyms to one where everyone uses their real name? That could expose people’s real identities without their consent.

Daphne Keller, director of the Program on Platform Regulation at Stanford’s Center for Internet and Society, said this desire for clarity is likely born out of regulatory inconsistencies surrounding data portability under E.U. law that the company wouldn’t want to see replicated under U.S. law. “They wanted people to stop asking them for incompatible things, for someone with authority to tell them what the real answer is,” Keller said. “They don’t care what it is. They just want someone to decide so they’re not in an impossible position.”


Could Passing Facebook’s Proposals Actually Be Harmful? 

Offering already implemented policy changes, said Maréchal of Ranking Digital Rights, lets Facebook crowd out tougher legislation, like a proposed bill that would require transparency in targeted digital advertising. The bill, which has not yet come up for a vote, popped up in May in the wake of Facebook’s cutting off access to NYU researchers studying ad targeting on the platform. Many of Facebook’s now favored policies, Maréchal noted, were pushed by public interest groups over the course of a decade before Facebook finally implemented them. “They’re conceding on a point that they’ve already lost,” she said. 

And new regulations might very well hurt competitors more than they hurt Facebook. 

Even just dragging GDPR across the pond from Europe, as Facebook suggests, would trigger significant compliance costs for U.S. companies. While major players with big European footprints, like Facebook and Google, have GDPR compliance all squared away, Keller notes that many other U.S. firms would almost certainly have to invest substantial resources to survive an investigation by European regulators.

Scaling these requirements up as companies grow, as Facebook suggests, is a common feature in this type of legislation. 

Not every large social media service desires or aspires to be like Facebook….

Aaron Mackey, Electronic Frontier Foundation

However, Aaron Mackey, a senior staff attorney at the Electronic Frontier Foundation who has written critically about Facebook’s proposals, worries that setting up a regulatory framework where the end goal is to be like Facebook directly encourages an internet where everything works like Facebook.

“Not every large social media service desires or aspires to be like Facebook or even aspires to monetize their users in a certain way. That’s a problem too, this idea that the law should be modeled off of one individual company and the way they have developed their business model,” Mackey said. “There are a lot of for-profit or community-led or individual-led user generated services and social networks that don’t have that model and they don’t aspire to be like Facebook.”

Facebook’s proposals haven’t seemed to gain much traction on Capitol Hill. After Zuckerberg put forward these ideas in congressional testimony earlier this year, Politico talked to lawmakers on both sides of the aisle who were deeply skeptical of his intentions. 

“Mark Zuckerberg knows that rolling back section 230 will cement Facebook’s position as the dominant social media company and make it vastly harder for new startups to challenge his cash cow,” Sen. Ron Wyden, a Democrat who was an original author of Section 230, told Politico. 

“Section 230 reform will hit Facebook regardless of what these self-interested Silicon Valley CEOs want,” concurred Republican senator Marsha Blackburn, also to Politico. “Big Tech only wants reform when it bolsters their power at the expense of competitors.”

Additional reporting by Corin Faife
