Last month, a court in the District of Columbia sentenced a man to five years in prison for, among other charges, publishing and texting sexually explicit images of his former girlfriend. He spent years spamming her picture into group text chats, sending the images to her employer, and texting her directly from fake accounts, pretending to be strangers who had seen the photos online.
His sentencing was a rare moment of accountability for the unfortunately very common crime known as nonconsensual pornography. The rise of cellphone cameras and social media has made it easy for voyeurs to create, trade, and monetize nude and sexual images of people who were either unwittingly filmed or who shared intimate photos with someone who breached their trust.
In the U.S., a 2017 study by the Cyber Civil Rights Initiative found that one in eight social media users who participated in the study had been victimized or threatened with victimization by nonconsensual porn. Around the world, it can be even more prevalent. In South Korea, women took to the streets in 2018 to protest the lack of consequences for a pervasive culture of “digital sex crimes” using spy cameras hidden in public bathrooms and changing rooms.
There has been a growing recognition among lawmakers that more is needed to address these crimes, which are intended to humiliate and harm the personal and professional prospects of victims. Just last week, the British government announced an array of new measures to crack down on intimate image abuse, such as a practice known as “downblousing”—photos taken down a woman’s top without consent.
In her new book, “The Fight for Privacy: Protecting Dignity, Identity and Love in the Digital Age,” Danielle Citron calls for a new civil right to be established protecting intimate privacy. This is my second newsletter interviewing Danielle, who is the leading legal scholar in the emerging field of cyber civil rights. Two years ago, I interviewed her about efforts to reform Section 230 of the Communications Decency Act—a law sometimes referred to as the Magna Carta of the internet.
Citron is the Jefferson Scholars Foundation Schenck Distinguished Professor in Law and Caddell and Chapman Professor of Law at the University of Virginia, where she is the director of the school’s LawTech Center. In 2019, Citron was named a MacArthur Fellow based on her work on cyberstalking and intimate privacy.
Our conversation, edited for brevity and clarity, is below.
Angwin: What is intimate privacy?
Citron: Intimate privacy concerns the boundaries set around our intimate lives. It’s information about and access to our bodies, our health, our innermost thoughts (which we document all day long as we browse, read, search, share, text, and email). It concerns information about our sexual orientation, gender, and our sexual activities. It concerns our closest relationships. Intimate privacy is both descriptive and normative. It’s the privacy that we want and expect, and it’s the privacy—this is the normative part—that we deserve to flourish, to engage as citizens, to love, to figure out who we are, and to enjoy self-respect as well as social respect.
I’m talking about exploitations by individual privacy invaders, like the nonconsensual taking, exploitation, sharing, or manufacture of intimate imagery. I’m also talking about the ways in which companies amass, share, and exploit our intimate data. The apps, phones, and tools we use and the sites we visit share our intimate data with advertisers, marketers, and data brokers, which in turn sell dossiers about us to law enforcement. Individual privacy invaders and corporate spies are handmaidens to governments.
Angwin: Why do we not currently have intimate privacy, and where is the law failing us?
Citron: When it comes to individual privacy invaders, the law doesn’t deal with it in a comprehensive manner. Instead, we tackle specific types of intimate privacy violations and miss others. Thanks to the courageous work of Mary Anne Franks and our team at the Cyber Civil Rights Initiative, 48 states, D.C., and two territories criminalize the disclosure of nonconsensual intimate imagery. Yet upskirt photos and deepfake sex videos often fall outside criminal law’s reach.
When the criminal law does tackle intimate privacy violations, they are often treated as misdemeanors, which are woefully underenforced because we don’t protect against gendered harm and because it is difficult to get officers and prosecutors to spend scarce resources on low-level offenses. It’s also really difficult in the United States for individuals to sue perpetrators, because most lawyers can’t offer their services for free and most perpetrators have too few resources to make suing them worthwhile.
Then there is the problem of Section 230. In 1996, Congress passed Section 230 of the Communications Decency Act, which provides a legal shield for leaving up or taking down offensive material. The goal was to incentivize moderation and to immunize Good Samaritans for trying to tackle abuse online. Under the law’s broad interpretation, sites that encourage, solicit, or keep up intimate privacy violations face no repercussions. There are 9,500 sites whose raison d’être is intimate image abuse. Section 230, as it’s broadly interpreted by the courts, means that those sites that encourage and solicit intimate privacy violations enjoy immunity; they have a legal shield from responsibility.
Angwin: You call for a civil right to intimate privacy. What does that mean?
Citron: Modern civil rights laws protect against invidious discrimination and rightly so. I want us also to conceive of civil rights as both a commitment for all to enjoy and something that provides special protection against discrimination. Because who is most affected and harmed by the sharing of intimate information? Women, non-White people, and LGBTQ+ individuals, many of whom often have more than one vulnerable identity.
Currently, the law woefully underprotects intimate privacy. Treating intimate privacy as a civil right is crucial for expressive and practical reasons. First, as an expressive message, when you say something is a civil right, it means you can’t give it away without a good reason. It means that intimate privacy cannot be traded away for the sake of profits, efficiency, or fun. Once we understand intimate privacy as a civil right, it would mean that any entity or person amassing intimate data or accessing our intimate lives would become a steward or guardian of our intimate data or lives. For instance, when companies gather intimate data about us, rather than being able to exploit it, a civil rights model would say you’re the guardian of that data and you cannot exploit it.
Angwin: What does it mean to be the guardian of someone’s intimate data?
Citron: If you’re the guardian of people’s intimate data, then you can’t collect it unless it is strictly necessary to provide your product or service, or for legitimate government processes. Second, you’ve got to get rid of it as soon as your strictly necessary reason is over. The guardians of our intimate data would have duties of loyalty and care (here, I’m relying on Woody Hartzog and Neil Richards’s scholarship), duties of nonexploitation, duties of nondiscrimination, and duties of confidentiality. It means if you have my intimate data, you can’t sell it, not ever, end of story. Those substantive protections would provide stronger protection than under the EU’s General Data Protection Regulation (GDPR).
Congress needs to step in to secure a civil right to intimate privacy. Aspects of the American Data Protection and Privacy Act (ADPPA) embody this notion. The ADPPA includes duties of data minimization, limits on collection, and limits on the sale of intimate data, and it defines intimate data to include sexual orientation, intimate images, and health. The ADPPA also includes a civil rights commitment to antidiscrimination.
Angwin: In your 2014 book, “Hate Crimes in Cyberspace,” you called for an exception to Section 230 for cyber cesspools. In this book, you’re calling for a Section 230 reform that would limit immunity for sites unless they’ve taken reasonable steps to address the unlawful use of services that create serious harm to others. Can you talk about the evolution of your thinking on Section 230?
Citron: Post-Dobbs, my thinking has once again changed in light of efforts to criminalize speech about reproductive rights and efforts to aid and abet reproductive autonomy, including providing information about reproductive rights. I’ve written about my revised, more modest position in a recent article. Under my current proposal, sites that deliberately solicit, encourage, or keep up intimate privacy violations, cyberstalking, or cyber harassment would not enjoy immunity from liability under Section 230(c)(1). In other words, those Bad Samaritans would be exempted from the legal shield.
Second, and here I am narrowing my call for reasonable steps, platforms should have a duty of care to take reasonable steps to address intimate privacy violations and cyberstalking. Rather than an unguided duty of care, lawmakers should specify the obligations involved, drawing on key lessons from the trust and safety field. I focus on intimate privacy violations, cyberstalking, and cyber harassment because we know that those abuses drive people offline and destroy their ability to work and engage with the world around them. Platforms need to tackle those abuses to earn the immunity.
What drove my decision to revise my proposal was this: if every platform had a duty of care to address all illegal content, then in jurisdictions where abortion, and aiding and abetting abortion, is now illegal, platforms would have duties of care to take down information about reproductive health. I want to make sure that protecting intimate privacy doesn’t come with unintended costs.
Angwin: You talk about how the law is not particularly well set up to give victims what they really want, which is injunctive relief—they want the content removed. How difficult is it to get that now, and what do you propose to fix this?
Citron: We can’t undo the harm victims have experienced, but at the very least we can stop the harm from continuing. One victim described it as an incurable disease, because she would go online and find that her intimate photos had migrated to other sites. It’s like whack-a-mole. Even if she could figure out where all the photos are, she can’t sue the sites, and she can’t force them to take the images down. This is how these sites make money, so they’re not taking anything down.
So what would injunctive relief look like? If we don’t get my broader amendment to Section 230, a narrower amendment could say that you can sue platforms for injunctive relief in cases involving intimate privacy violations and obtain attorneys’ fees from them. Then lawyers would be motivated to take these cases. Injunctive relief actually helps address the disease. It would take time, because you have to find a lawyer and file a lawsuit, and the content could pop up on another site. It’s not a foolproof plan, but I want to take any step that helps victims stem the damage.
As always, thanks for reading.
(Additional Hello World research by Eve Zelickson.)