Hello, friends,
Ten years ago, Eli Pariser coined the term “filter bubble” to describe the “personal ecosystem of information” created by the algorithms of Big Tech. He worried that the rise of algorithmically reinforced world views would fracture our sense of a shared reality.
A decade later, his warnings have more salience than ever. Having just survived an election season in which a significant portion of the U.S. population believed the election had a different outcome from the one the vote counts revealed, our sense of shared reality is more fractured than ever.
There are many reasons for polarization, but the rise of social media and its engagement-focused algorithms is among the top culprits. That’s why we at The Markup released a tool this week, Split Screen, that allows readers to see for themselves how different Facebook news feeds are personalized.
Built by a team led by Sam Morris, our graphics editor, and investigative data journalist Surya Mattu, Split Screen allows readers to examine what Facebook news feeds look like for Trump voters, Biden voters, men, women, millennials, and boomers in our Citizen Browser national panel of users. Of course, Split Screen is not necessarily representative of what any particular Facebook user would see in his or her own feed. It is based on data collected from more than 2,500 participating panelists. As always, we provide an extensive methodology describing how we put it together.
The disparities in political news were stark. Trump voters’ feeds at the time of publication were highlighting a speech by former president Donald Trump and a post by Sean Hannity about “cancel culture.” Meanwhile, Biden voters were shown articles about President Joe Biden and his efforts to distribute the COVID-19 vaccine.
Some of the disparities were also funny. Millennials were most often recommended groups about ’90s nostalgia and ramen noodles, while boomers were offered groups about “The Far Side” cartoon.
To better understand the filter bubble and how it is affecting society, I spoke with Pariser, the author of the 2011 book “The Filter Bubble.” Pariser is an author, activist, and entrepreneur focused on how to make technology and media serve democracy. In 2004, at 23, he became executive director of MoveOn.org, where he helped pioneer the practice of online citizen engagement. In 2006, he co-founded Avaaz, now the world’s largest citizens’ organization. In 2012, he co-founded Upworthy, a media startup that reached hundreds of millions of visitors with civically important content. He currently co-directs New_ Public, a project of the National Conference on Citizenship and of the Center for Media Engagement at Moody College of Communication, the University of Texas at Austin.
The interview is below, edited for brevity.
Angwin: You created the term “filter bubble” 10 years ago. Can you talk about what your insight was at the time?
Pariser: My interest really started with trying to understand how communication was changing, and how the way the information flowed was changing with the rise of platforms like Facebook and Google. I had this moment where I realized, “Oh, they’re all going to be powered by personal data and trying to reflect back what they think we’re most likely to click on or engage with.”
When all of a sudden those are a lot of the primary places that everybody’s getting information, you start to imagine this personal universe of information that’s generated by all these different algorithms, just for you or for who they think you are. And that was the filter bubble.
And so what does that mean for democracy? I was worried that it would be harder and harder to kind of live in a shared information universe or even really have a shared reference point of how far out you were.
It’s like everyone’s always lived in their own information universe in a way, but you can kind of see like, O.K., am I one standard deviation from the mean or five or 10, you know? And I guess part of my concern with personalization is like, you don’t even really have that reference point.
You don’t know how weird your bubble is in reference to anyone else’s because you can’t actually see it.
Angwin: Your work was an inspiration for our Split Screen tool. I’m curious what you think of looking at the filter bubble in that way?
Pariser: It’s really interesting to just be able to get a little bit more experientially into a different kind of media ecosystem. And it does not do a lot to reduce my worries about what this all means for our ability to have a shared conversation.
You’re filtering for things that are the most different. So there’s some content—I was looking at the top, most engaged Facebook photos today, and some of them are cute animals, and everyone loves cute animals—where there is probably cross partisan agreement. That’s not going to show up here necessarily.
But then there is stuff that as a liberal I just never, ever, ever see. I didn’t know a lot about some of the things in the Trump feed. Literally. I never heard about that.
Angwin: There has been some research on the filter bubble, and some of it has said there’s not much of a filter bubble. I’m curious how your thinking has evolved over the past 10 years.
Pariser: I have come to believe that it is super complicated, and general statements about how information moves when you’re talking about billions of people and personalized algorithms are really hard for anyone—even the companies—to get right.
The place where I’ve probably changed my mind the most is that the filter bubble was a content-centric way of thinking about what people need to know in order to understand the world. And I think the more I thought about it, the more I’ve wondered if really it’s about relationships, and the content is kind of a layer that sits on top of the structure of your relationships.
And that means I could probably work hard to have a truly kind of bipartisan group of close friends and connections that would counter the effect of these algorithms on what I ultimately came to believe.
I don’t believe that there are many people who see one ISIS video and are like, All right, off to the front with me. I don’t think that’s how content works. But I do believe that you can fall into a community and find a bunch of meaning in it and start to feel a strong sense of affiliation, and that what community you fall into is in part a function of what’s recommended to you.
It’s the relationships around that content that make it compelling. And that doesn’t let the platforms off the hook, because they are structuring those relationships.
The other place where I’ve shifted what I think a little bit is the premise in “The Filter Bubble” that suggested that if I, as a liberal, watched Fox News more or saw more articles from Fox News, I would be more sympathetic to conservative points of view.
I think that’s actually a bit of a naive or a simplistic view of how this works. In fact, what often happens in practice is that I as a liberal see the most inflammatory thing from Fox News, and it reinforces my low opinion of Fox News and anyone who watches it.
Rather than me coming to understand differently where someone’s coming from, all of my prejudices are reinforced, which happens a lot less if I’m actually sitting down for coffee with someone who’s watching Fox News.
So there is the content and the relationships, and I think there’s a way in which you can put too much focus on the content versus the relationships around them.
Angwin: Does that mean that you think Facebook shouldn’t recommend groups?
Pariser: Well, I’m spending a lot of time thinking about that.
The way that our relationships are structured has a lot to do with the kind of society that we end up being. And we have places where relationships get structured. I think about parks a lot because parks actually do a bunch of work for communities, helping build cohesion and familiarity and a sense of identity in this relatively gentle way.
How do we build digital spaces that are weaving social fabric rather than tearing it apart? How do we build a culture for a pluralistic democracy?
That is what we’re trying to do at New_ Public. We like to think about this in terms of kind of an urban planning metaphor because these problems show up in urban planning as well.
It’s a “How do you get strangers to behave together?” problem. Public life happens on our sidewalks and in parks and libraries and these other public fora, and we just don’t have enough of those in the digital world. In any physical community, you’d never say we’re going to just cram all these functions into the for-profit businesses that are around.
Angwin: So what does a digital public park look like?
Pariser: A good example is Front Porch Forum in Vermont. It’s a business, but it’s not venture-capital backed, and it’s not a growth-oriented business.
It’s a seriously and carefully moderated community discussion forum for every town in Vermont, and apparently about two-thirds of Vermont households are on it. Because they’re not trying to maximize for engagement, every post gets read before it’s sent out, and there’s only one email blast a day with all of these posts.
That has two effects. One is that if you’re being too bratty, your posts get sent back to you. And on the other hand, if you’re getting into a big argument, to sustain it, you have to be willing to go for many days on end, which most people are not actually willing to do.
So the dynamics are really different. It’s not perfect, but it’s a pretty solid way of having community conversations that is really different from a Facebook group.
Angwin: So now, 10 years later, when you reflect on the filter bubble, is it worse than you thought it would be?
Pariser: It’s about as bad as I thought.
The challenge is, we’ve got a whole bunch of different forces that are reinforcing each other, of which social media is one, but not the only one. There are also demographic trends showing that we live closer to people who are like us than ever before, and that we belong to organizations that are more politically polarized.
So the bad news is that to invent our way out of this is going to require whole new kinds of institutions that specifically address these problems.
The good news is that we’ve done that before at times of social fracture and stress in American history, such as when libraries came in as people were starting to get literate at a mass level.
As an eternal optimist, I feel like, O.K., we’ve got this big invention project in front of us and it’s not going to be driven by Elon Musk. It’s going to be driven by a much more public-spirited movement.
As always, thanks for reading.
Best,
Julia Angwin
Editor-in-Chief
The Markup