Hello, friends,
Last week, European regulators fined Meta, the parent company of Facebook and Instagram, 390 million euros for illegally forcing users to accept personalized advertising.
The decision was hailed as a watershed moment for Europe’s comprehensive privacy law, the General Data Protection Regulation, known as GDPR. But in many ways, it was actually a reminder of how poorly the law has performed.
In force since 2018, GDPR is essentially a privacy bill of rights. Amid rising commercial surveillance, it offered EU residents some essential rights over their data, including the right to see their data, to correct it, to delete it, to restrict or object to specific uses, and to prevent the use of data about sensitive topics such as their sex, health, religion, and political identity.
The law did lead to more transparency about how companies were using data and to a steady stream of fines against violators, totaling nearly 2.8 billion euros so far. But many of the rights have been nearly impossible to exercise because legal challenges based on the law have often languished at the Irish Data Protection Commission, which handles most GDPR complaints against Big Tech.
According to the agency’s own statistics, the Irish Data Protection Commission had a backlog of more than 300 outstanding GDPR complaints as of the end of 2021, many dating back to 2018. The case against Meta was filed with the Austrian data protection agency, on the day GDPR went into effect, by the European nonprofit noyb (short for “none of your business”), founded by strategic privacy litigator Max Schrems. Like most GDPR cases, it was referred to Ireland, where Meta’s European headquarters are located.
Noyb’s complaint alleged that Facebook did not give users an opportunity to “freely” give consent, as the law requires, to having their data used for advertising purposes. Meta argued that providing personalized advertising was a “contractual necessity” required by its terms of service. Four years later, the European Data Protection Board disagreed with Meta’s reasoning and overruled an earlier decision by the Irish Data Protection Commission. Meta now has three months to work with Irish regulators to find a lawful basis for its advertising business. Meta has said it will appeal the decision and that it will not change its approach to personalized advertising.
While all these opaque legal issues make their way through the European regulators, nothing has changed for Meta users. So British human rights activist Tanya O’Carroll has taken a different approach. Late last year she filed a lawsuit in a U.K. court against Meta for violating the GDPR, in the hopes that a different jurisdiction would yield different results. (Meta spokesperson Al Toler declined to address O’Carroll’s lawsuit when I asked about it and instead offered boilerplate language about Meta’s commitment to privacy and security.)
O’Carroll is a co-founder and former director of Amnesty Tech, the unit of Amnesty International that aims to disrupt surveillance around the world. She also coordinates a network of organizations called People v. Big Tech, and she is on the board of directors of SumOfUs.
Our conversation, edited for brevity and clarity, is below.
Angwin: How has the GDPR performed since it took effect in 2018?
O’Carroll: There was so much expectation and excitement when we got GDPR passed in Europe. Now the law has been applied for almost five years, and nothing is substantially different, apart from the fact that everyone in Europe has to constantly click on these consent banners and go through what is almost a performance of privacy, which most people find incredibly irritating and which gives privacy a bad name.
It’s a very ambitious and brilliant piece of legislation, but it lacks a robust enforcement structure. The way it works is that the lead regulator is in the country where the business has its European headquarters. Ireland is this tiny country, but it is the regulator for some of the biggest companies on Earth because their European headquarters are there: companies like Google, Facebook, and Apple.
You’ve got a situation where the entire Irish economy depends on foreign direct investment, with a large proportion coming from the tech industry. There is just never going to be political will to properly enforce GDPR in Ireland. It’s leading to a lot of frustration, including from data protection authorities in other European countries, whose citizens’ complaints get kicked to Ireland and then die there.
Angwin: You are trying to unlock the enforcement problem. Can you explain your case and how it’s going to avoid this same issue?
O’Carroll: Put very simply, we’re going around the Irish regulator. We are going to court in the U.K. My case is based on some of the learning we’ve had over the last few years. Specifically, Facebook has consistently argued that it does not need to gather people’s consent for tracking and targeted advertising—despite what GDPR says. It essentially argues, “We don’t need your consent because you signed a contract, and the contract you signed means we are actually beholden to you—we’re contractually obliged to provide you with a personalized advertising experience.”
What I’m doing with my case is basically taking the ball into a different playing field. I am drawing on a different right within GDPR, Article 21.2, which gives us all an absolute right to object to the use of our data for direct marketing. Under this provision, it doesn’t matter what contract I signed—or any other legal basis Facebook can argue—because the right is absolute. It simply allows me to say stop at any time to the profiling and ad targeting.
At the moment, Facebook is not arguing that the right doesn’t exist but that the right shouldn’t apply in this situation. It’s basically saying GDPR doesn’t apply to its whole ad targeting machinery; rather, that an entirely different law applies. Which is an interesting argument because if the European gold standard data protection legislation doesn’t apply to data profiling and ad targeting by one of the biggest tech companies on Earth, then what does it apply to?
It boils down to the question of whether a judge in this case (not a regulator) will rule that data protection rights mean something or not. If it works, it could be the Achilles’ heel that brings the whole thing down. This right to object applies to any company and any practice based on the processing of your personal data for direct marketing. That doesn’t just include targeted ads but also the promotion of things like Facebook pages and groups. It applies across the board, and it’s a very powerful way for us, as individuals and citizens, to take back control over our data online. It is untested, but it’s potentially very exciting.
Angwin: What do you object to with Facebook’s processing of your data?
O’Carroll: This began with me wanting to understand how I’m being profiled. Facebook has made that data available. Anyone can open their settings and see the long list of ways they are being profiled. On my profile, the “interests” included the name of a political party, “homosexuality,” “feminism,” and other things that clearly count as sensitive data under GDPR.
Meta provides a veneer of control over these “interest” categories by letting users untag themselves from individual categories. But I didn’t want to just untag myself from an individual category. I wanted Facebook to delete the most invasive categories altogether, and I wanted to turn off the whole thing. Part of the reason I wanted this is that I had already tried turning off individual categories when I had a child in 2017. I didn’t like being bombarded by all of the new baby stuff, so I untagged “motherhood” as a category. But I found that I was still being bombarded by baby stuff because Facebook had just repopulated it with “parenthood” or “children” or “toddler.”
When I complained, Facebook wrote back basically saying no, and it used the contract argument I described earlier: that untagging me from everything would mean failing to fulfill its contract with me. We went through a couple of rounds of this in letters before it raised the convoluted argument that a different law than GDPR governs its data processing for targeted ads.
What Facebook did do was turn off the categories I’ve mentioned, so people can no longer be tagged with homosexuality, feminism, or political parties. It made a big announcement saying it was turning off sensitive data categories. But that doesn’t solve the problem, because it’s not just the sensitive categories; it’s all of the proxy categories that reveal the exact same thing.
Angwin: What does the European Data Protection Board decision last week mean for your case?
O’Carroll: For too long Meta has gotten away with arguing that targeted ads are the service users contract for, and in early 2022 the Irish regulator even ruled in its favor. But last week, the European Data Protection Board came to a decision overturning Ireland. This is absolutely huge and a testament to years of smart campaigning and litigation by Max Schrems.
The move by the EDPB is very encouraging and may signal a bigger shift in the way that other European member states are going to start pushing Ireland. However, we’ve still got a long way to go. The first problem is that Ireland is still in the middle. Its preliminary decision has been overturned, but it still gets to determine the scope of the changes that will be required. Second, Facebook has said it will appeal. That will kick the can down the road for another few years. Third, Facebook could try to just switch its lawful basis. There’s nothing to stop it from saying, “O.K., we won’t use the contract argument, we’ll use a legitimate interest argument,” which is another route available under GDPR. Facebook ultimately has deep pockets and can instruct a heap of lawyers to help it “comply.” While such “compliance” may be specious, there may be enough plausibility to necessitate another round of complaints and regulatory action. This could take years to finally settle.
GDPR has also taught us there is always a way to game these things. Even if Facebook were forced to introduce a consent option, we could end up with something similar to a cookie banner, meaning Facebook could break down every single thing it does and make it lengthy and irritating for a user to consent to each item each time they log in. Giants like Facebook are not just going to roll over. It will take relentless action on multiple grounds.
Angwin: What is the best-case outcome from your lawsuit?
O’Carroll: For me it’s the ability to say no to your data being used to target you with ads and other kinds of “direct marketing,” such as the promotion of groups and pages, without losing access to the service.
It’s what Apple did last year when it gave people a simple yes or no option on third-party apps tracking them. When it did that, 96% of U.S. users said, “No thanks, I’d rather not be tracked by loads of weird apps doing who knows what with my data.”
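(A quick aside for readers curious about the mechanics: the yes-or-no option O’Carroll mentions is Apple’s App Tracking Transparency prompt. The sketch below shows, in Swift, roughly how an app asks that question and reads the answer; the function name is ours, and a real app would call it from its own launch flow.)

```swift
import AppTrackingTransparency

// Rough illustration only: since iOS 14.5, an app that wants to track a user
// across other companies' apps and websites must ask first through Apple's
// AppTrackingTransparency framework and respect the answer. A real app also
// declares an NSUserTrackingUsageDescription string in its Info.plist.
func askToTrack() {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            // The user tapped "Allow": the app may read the advertising identifier.
            print("Tracking allowed")
        case .denied, .restricted, .notDetermined:
            // The user said no, or the choice isn't available: tracking stays off.
            print("Tracking not allowed")
        @unknown default:
            print("Tracking not allowed")
        }
    }
}
```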
It seems like a very simple thing, but true control across the board is a step toward a very different internet. Companies will have to look for alternatives. They’ll have to look at contextual advertising and other models that will make the whole system much healthier. If Meta is really as innovative and future-oriented as it likes to tell us it is, it should know there is a ticking clock on its broken business model. I won’t hold my breath though.
As always, thanks for reading.
Best,
Julia Angwin
The Markup
(Additional Hello World research by Eve Zelickson.)