Hello, friends,
In May, The Markup reporters Annie Gilbertson and Jon Keegan presented Amazon with a list of nearly 100 active product listings on its site for items that the company bans under its categories of drugs, theft, spying, weapons, and other dangerous items.
Our investigation, which we published in June, revealed just how easily third-party sellers—and in some cases, Amazon itself—were able to peddle prohibited goods on Amazon’s site. Amazon removed most of the listings we flagged once we brought them to the company’s attention.
One category of items we found was synthetic peptides—short chains of amino acids that some people use in an attempt to help build muscle and repair injury. Amazon.com bans the sale of injectable drugs and any drugs not approved by the Food and Drug Administration.
Amazon said that the peptides we found were not really drugs.
“We did sell lab chemicals that were clearly marketed as being for research use only and not for human consumption,” Amazon spokesperson Patrick Graham wrote in an email. “Out of an abundance of caution we are restricting them going forward.”
But over the course of the following few months, Annie and Jon found 66 active listings for synthetic peptides on Amazon.com—many with reviews indicating people were using them on themselves. The peptides we found were not on the FDA’s approved drug list, and it is illegal to sell misbranded or unapproved new drugs. Several are classified as doping drugs by the World Anti-Doping Agency. More than a quarter of the listings had been around for at least a year, showing the extent to which Amazon has failed over time to enforce its own rules.
When we brought the listings to Amazon’s attention, the company yet again removed them and said it would provide better enforcement in the future. “We do not sanction customer misuse or abuse of products,” company spokesperson Mary Kate McCarthy said. “However, out of an abundance of caution, we decided to no longer allow these products and have been removing them since, as we have in this case.”
Sound familiar? This game of whack-a-mole will be recognizable to anyone who has been paying attention to how the big tech companies monitor their platforms. It works like this: a journalist or academic finds something sketchy on a tech platform; the platform says sorry and pledges to remove it; weeks or months later, the sketchy item reappears; the cycle repeats.
To understand how we might exit this cycle, I interviewed UCLA professor Safiya Umoja Noble, who has played her own game of whack-a-mole with Google regarding its racist and sexist associations in search results. Noble is an associate professor in the departments of Information Studies and African American Studies and the author of the best-selling book Algorithms of Oppression: How Search Engines Reinforce Racism (NYU Press, 2018).
Eight years ago, Noble wrote an article for Bitch magazine describing how searches for “Black girls” regularly brought up porn sites in top results. Google quietly fixed the problem, though the company didn’t make any official statements about it. Now, a search for “Black girls” returns links to nonprofit groups like Black Girls Code and Black Girls Rock.
But earlier this year, The Markup found that Google was still showing pornographic results for searches for “Black girls” in its ad buying portal. After we contacted Google, the company removed the pornographic results and said its filters hadn’t worked “as intended.” Google spokesperson Suzanne Blackburn wrote in a statement emailed to The Markup, “We’ve removed these terms from the tool and are looking into how we stop this from happening again.”
The interview with Noble is below, edited for brevity.
Angwin: Sometimes it feels like journalists like myself and my colleagues have just become content moderators for the big tech platforms—alerting them to content that breaks their own rules. How do you view this outsourcing of content moderation?
Noble: Journalists and scholars are doing the moral and ethical work for these companies with zero compensation and a complete lack of acknowledgement. In fact, in some cases there is a total denial that the problems we are articulating and pointing to are real. But we know they are responding to our critiques; Google’s opening an AI ethics consulting division two weeks ago is evidence of that.
Angwin: And it’s not easy work. Search results and Facebook posts seem to exist for just a moment in time; they’re ephemeral. How did you start looking into this issue?
Noble: When I first looked at “Black girls” in Google search, it was late 2009. At that time, the top result was hotblackpussy.com, and I remember thinking, “This is wrong. This is terrible.” And then I kept an eye on it over the course of a couple years. By 2011 hotblackpussy.com had gone out of business and sugaryblackpussy.com had replaced it. That’s when I started getting systematic about how I was going to research it.
The challenge then was not only that the results were changing but also that I needed to figure out multiple ways of recognizing whether this was a steady and persistent representation or whether it was specific to my locale. At that time, the majority of my professors were telling me, “That’s not a thing.” They were saying that it was “user error”—which is of course what most people think when they come across something wrong. That’s also by design.
I knew that as my work started getting out there, the results would change. It’s a constantly moving landscape—the earth is shifting under our feet, but that doesn’t mean we are not on the earth.
That was a decade ago. At the time, people’s ears couldn’t hear and process what I was saying when I said these algorithms are discriminatory. People were adamant that search only represented what was out there. The primary response from professors was that there was no way this was happening at the level of code, because code is just math and math can’t discriminate. Fast-forward 10 years, and I find that we have a totally different ear—we understand that programming is a language and languages are subjective, and now we are able to talk about these things with more ease.
Angwin: You have written powerfully about Google’s ability to shape our idea of the truth. But many people still think of Facebook when they think about misinformation.
Noble: People use social media for their news, but they use Google for their facts. It’s interesting to watch the public become more aware of social media manipulation, but many forget that as people are trying to test the veracity of things they find in social media, they are going to Google like it’s the objective truth fact-checker, which couldn’t be further from the truth.
Angwin: You have advocated for a public interest non-commercial search engine that could be administered by librarians. Assuming the funding was there for that, can you describe what you are envisioning?
Noble: Librarianship has a long and storied history of narrating, collecting, and curating history’s winners, the colonizers, the imperial, the people who are invested in framing the world through their own eyes. Having said that, we understand that libraries hold the things they collect to a higher standard, if only because, practically speaking, a brick-and-mortar building can only hold so much, unlike the vastly expansive internet.
I’ve often tried to convince large libraries to think about what their role could be in curating the open web and differentiating knowledge from advertising from propaganda. The challenge is that Silicon Valley has given us the vocabulary for how to think about information—and the prevailing word is “content.” But content flattens the distinctions between propaganda and evidence-based research and many other kinds of knowledge or information.
What we need are counterweights—resistance to that kind of instant-gratification model. I liken it to the slow food versus industrialized fast food models. There is something healthier, better for your community, better for your body and mind when you take a slower approach to learning, knowledge, and information gathering compared with mass-produced content that is cutting every kind of corner.
Angwin: Would breaking up Big Tech improve things?
Noble: Where my work is going now is, I’m really trying to write and research how to shift the paradigm around Big Tech. I’m using two other eras of history—the breakup of Big Cotton, which was predicated on the slave trade, and the era of Big Tobacco. I’m looking at Big Tech through a similar lens.
We have a narrative that every dimension of our economy is propped up by technology, so we could never roll it back. I think of myself as a tech abolitionist working in the tradition of previous generations of abolitionists. I am trying to do culture-making and narrative work that will make these connections, so people don’t have to feel totally dominated by the idea that there could be no other way.
Angwin: What does tech abolition mean?
Noble: I think that the tech sector owes trillions of dollars to publics all around the world for its extractive and harmful business practices. There would certainly have to be a strong element of not only breaking up these large monopolies but also transferring the wealth that has been extracted back to the public.
Angwin: Recently the Duchess of Sussex name-checked your book in an interview with Gloria Steinem. What was that like for you?
Noble: Seeing the duchess and Gloria Steinem talk about what’s happening in society and linking my work to their concerns was deeply touching, because I respect the work of both of them. Scholars want nothing more than for our work to reach broader publics rather than stay sequestered in the academy.
In many ways I relate so much to Meghan Markle’s personal story of being a working-class woman of color from California. I also had a Black parent and a white parent. I also married a prince [laughs heartily] who adores and supports me in my work on gender and racial equity.
In the interview she said it hadn’t dawned on her that this level of misinformation about Black girls was happening, but then she realized that of course it was happening. That is what happens when people come across the feminist research—it validates our own lived experience. There’s really nothing more exciting than when we all link up our advocacy work—including how I feel about your work, Julia—to effect change. It’s an honor to be in the company of other women trying to do their part to make the world better.
As always, thanks for reading.
Best,
Julia Angwin
Editor-in-Chief
The Markup