Hello, friends,
As we head into U.S. elections, technology issues are appearing on the ballot all over the country. Michigan is considering a constitutional amendment that would prohibit warrantless searches of electronic data by law enforcement. Chicago is considering a referendum that would pave the way for public funding of broadband infrastructure.
Portland, Maine, is considering a referendum to ban the use of facial recognition software by law enforcement. And Akron, Ohio, is taking up an amendment to the city’s charter that would require police to release all bodycam and dashcam footage in situations where officers used force.
California is the battleground for the most controversial issues. Tech companies have spent nearly $200 million campaigning for Proposition 22, which would allow companies such as Uber, Lyft, Postmates, and Instacart to classify their workers as contractors rather than employees, meaning they would not have to provide benefits such as paid vacation.
And one of the most contentious issues on the California ballot is Proposition 24, the California Privacy Rights Act. The act has created strange bedfellows, with both the ACLU and a coalition of advertisers opposing it, while consumer groups such as Consumer Watchdog and Common Sense Media support it. And some privacy rights groups are just sitting out the fight altogether: Both the Electronic Privacy Information Center and the Electronic Frontier Foundation take no position on the measure.
To understand what is going on, I interviewed Ashkan Soltani, a researcher focused on privacy, security, and technology policy who is currently a distinguished fellow at Georgetown Law. Soltani is also an adviser to Alastair Mactaggart, a real estate developer who spearheaded the California Privacy Rights Act (CPRA) and its predecessor law, the California Consumer Privacy Act of 2018 (CCPA).
Full disclosure: Soltani and I go way back. A decade ago, when I was working at The Wall Street Journal, I stumbled on Soltani’s master’s thesis, which found that 88 percent of websites at the time contained tracking code from Google. I called him up and said, “Can you do the same survey for me?” He agreed, and we launched the “What They Know” series of investigations into privacy, which ran from 2010 to 2013.
After working with me at The Wall Street Journal, Soltani moved on to The Washington Post, where he helped with technical analysis of the Snowden documents and was part of the team that won the Pulitzer Prize. From 2014 to 2015, he served as the chief technologist of the Federal Trade Commission.
Soltani has a distinct point of view that does not represent the views of The Markup, which does not take positions on public policy. The interview with Soltani is below, edited for brevity.
Angwin: Let’s start with the basics. How do you define the problem that we call privacy?
Soltani: How much time do you have? [Laughs.] O.K., here is the condensed version. There is a market failure similar to what occurred with the environmental issues of 20 or 30 years ago, where a collective resource is being mined and used, but individuals have very little knowledge of or control over its use. That's the fundamental market failure of privacy: the failure of the market to properly price and account for personal data.
Angwin: So to protect personal data, do you need some baseline privacy rights? I think of the California Consumer Privacy Act of 2018 as the first comprehensive privacy law in the United States. (Many countries already have comprehensive privacy laws.) Tell me about your involvement in the CCPA.
Soltani: Alastair [Mactaggart] had the idea that he wanted to do a privacy law and had done two years of research before we met. He had already submitted a first draft of the initiative to the attorney general, but a number of people flagged technical issues regarding how it would work in practice, so he brought me in to resolve some of the operational issues.
In the initiative process, after you submit the initial draft, you have three weeks to make revisions. I was brought in to make changes within those three weeks and was pretty limited in what I could change. And I think I was traveling in my van at the time. I remember making edits in a national park somewhere.
Then the following year, come summer, I hadn't heard from him, and then suddenly he calls me up to tell me we got the votes and it's going on the ballot. Then again, a few weeks later, he says we have a deal to pass it through the legislature with some minor changes, for example getting rid of the private right of action. Then that became the CCPA, and a lot of people celebrated its passage, including some privacy orgs that now oppose it.
Then, come 2019, he pulled me back in because everyone and their mother was trying to push amendments to gut the law or tweak it in their favor. Some of them made it through the legislature (an insurance exemption and a carve-out for the auto industry, for example).
The most fraught was a tiny amendment that one very large tech company nearly snuck in without any of the advocates being aware. At first glance, it looked like a reasonable change. However, after closer analysis, it became clear that this tweak would actually exempt certain players: those that are simultaneously first parties and third parties.
By the end of 2019, Alastair started realizing that this was going to be his life—full-time in hand-to-hand combat with the biggest tech companies in the world. So he was like, O.K., you know what, I’m going to go back to the ballot and memorialize the law.
Angwin: So the goal of the CPRA is essentially to protect CCPA from being eviscerated by tech lobbyists in Sacramento?
Soltani: Yes. The motivation was to at least set the floor and protect it from being gutted.
The law, if it passes, also adds a bunch of new rights: around algorithmic transparency, data minimization, and sensitive categories of information (location, race, sex, orientation, etc.). It also creates an enforcement agency, essentially a data protection authority, that would be tasked with enforcing the law as well as issuing guidance and amending it. The California attorney general has said on multiple occasions that the office doesn't have the resources to enforce the CCPA, so this new DPA would address that gap.
The other goal was to harmonize it with Europe’s General Data Protection Regulation (GDPR) [which went into effect in 2018].
First off, that makes compliance easier for businesses. More important, after the Schrems II decision [a July 2020 EU court ruling that invalidated the Privacy Shield framework for EU-U.S. data transfers], companies are restricted from transferring data from the EU to the U.S. But with CPRA in place, Europe could decide that California has an "adequate" level of data protection under GDPR. That would allow California companies to operate in the global marketplace and would have huge implications for the national privacy conversation.
Angwin: What happens if CPRA doesn’t pass?
Soltani: The reality is that, if passed, CPRA wouldn't go into effect until 2023. Setting up a data protection agency and issuing regulations won't be instantaneous.
We also know that both presidential candidates, Congress, and industry are heavily pushing for federal privacy legislation. After the Schrems II decision, industry is calling for a federal privacy framework that will enable U.S.-EU data transfers and permit us to participate in the global marketplace.
So it’s my belief that we will likely see a federal privacy law pass within the next two years regardless of who is in the White House.
The key question will be the federal baseline and whether it will preempt California’s. Whatever happens, the battle will move to Washington, D.C. If CPRA passes, the baseline for privacy protections will have been raised significantly. If CPRA fails, it will not only expose the CCPA to subsequent amendments that will weaken it but will also signal to D.C. (and other states) that consumers (at least 40 million Californians) don’t really care about privacy—and that the advocates can’t get it together and will just fight among themselves when given the chance.
Angwin: Opponents and critics say that CPRA is not strong enough, that it is opt-out rather than opt-in, that it doesn’t allow for private right of action, and that it encourages pay-for-privacy schemes. Can you address those criticisms?
Soltani: Let’s start with the opt-in argument. Alastair actually intended the CCPA to be opt-in when he first started. After talking to a ton of experts, however, he ended up structuring it as an opt-out to address First Amendment concerns.
The current opt-out standard requires you, when you visit a website, to click a button saying, “Do not sell my personal information.” And so separately, I helped organize a group of privacy-forward organizations to propose a Global Privacy Control that allows consumers to configure a setting in their browser and then automatically be opted out from every website they visit.
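[Editor's note: Mechanically, the Global Privacy Control is simple: a participating browser or extension sends a "Sec-GPC: 1" HTTP header with each request (and exposes a navigator.globalPrivacyControl flag to scripts). Below is a minimal TypeScript sketch of how a site might honor that signal; the recordOptOut helper is hypothetical, a stand-in for whatever do-not-sell bookkeeping a real site would do.]

```typescript
// Minimal sketch of honoring the Global Privacy Control (GPC) signal.
// Uses Node's built-in http module; recordOptOut is a hypothetical
// placeholder for a site's real do-not-sell record-keeping.
import * as http from "http";

function recordOptOut(userId: string): void {
  // Hypothetical: flag this user as opted out of the sale of their data.
  console.log(`User ${userId} opted out of sale via GPC`);
}

const server = http.createServer((req, res) => {
  // Per the GPC proposal, participating browsers send "Sec-GPC: 1".
  // Node lowercases incoming header names.
  const gpc = req.headers["sec-gpc"];
  if (gpc === "1") {
    recordOptOut("example-user"); // treat as a CCPA/CPRA do-not-sell request
  }
  res.end(gpc === "1" ? "GPC opt-out honored" : "No GPC signal");
});

server.listen(8080);
```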
The other criticism is that the CPRA doesn't have a private right of action for privacy violations. The thinking there was that any initiative with a private right of action would be essentially nuclear for the companies. I don't think we will ever see federal or state legislation that includes one.
The third piece is the pay-for-privacy piece, and to be clear, the CPRA isn't any different from the CCPA on this point. The idea is that advocates want a site to still provide you service even though it can't monetize your information. That would mean that the only news sites and publishers that could stay in business are the ones that have alternative funding, such as from the Koch brothers or Peter Thiel.
In the long run, I think incentivizing businesses to employ alternative, "privacy-friendly" ways to monetize content is a good idea, but we're not there yet. Doing it now would mean that most of the publishers we rely on would have to give their content away for free or shut down, and with the death spiral the news media is already in, I think that would have profoundly negative effects on society.
The CCPA and CPRA say that you can only charge the user the amount that you make from monetization of their data, and that fee can’t be “unjust, unreasonable, coercive, or usurious in nature.” Companies are actually required to calculate the “value of the user’s data” and make those calculations available to the attorney general. Going back to the environmental model, this forces companies to explicitly declare the price of the trees they are cutting down—meaning we can finally begin to value this precious resource.
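[Editor's note: To make that fee cap concrete, here is a hypothetical back-of-the-envelope version of the calculation. The figures and the simple per-user averaging are invented for illustration; the law leaves the actual valuation methodology to businesses and regulators.]

```typescript
// Hypothetical illustration of the CCPA/CPRA "value of the user's data"
// fee cap. All figures are invented; real valuation methods are left to
// the business and, ultimately, to regulators.
const annualDataRevenueUsd = 12_000_000; // revenue attributable to user data
const activeUsers = 4_000_000;

// One simple (hypothetical) valuation: average revenue per user per year.
const valuePerUserUsd = annualDataRevenueUsd / activeUsers; // $3.00/year

// A pay-for-privacy fee may not exceed the value of the data it replaces.
const proposedAnnualFeeUsd = 5.0;
const feeIsPermissible = proposedAnnualFeeUsd <= valuePerUserUsd;

console.log(`Value of data per user: $${valuePerUserUsd.toFixed(2)}/year`);
console.log(`Proposed $${proposedAnnualFeeUsd} fee permissible? ${feeIsPermissible}`); // false
```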
As always, thanks for reading.
Best,
Julia Angwin
Editor-in-Chief
The Markup