
Eli Pariser on Filter Bubbles

Eli Pariser offers the idea of a personalized conference. What if we came to an event like Personal Democracy Forum and sorted ourselves by gender, age, political ideology, and hometown? Pretty soon, we’d all be sitting in small rooms, all by ourselves. What if speakers then offered personalized talks, adding explosions for the young male listeners, for instance? “You’d probably like your personal version better… but it would be a bad thing for me to do.” It defeats the point of a conference – we no longer have a common ground of speeches to discuss in the hallways.

Google uses 57 available signals to personalize the web for you, even if you’re not logged in. As a result, the results of a Google search can end up being very different, even when quite similar people are searching. Eli shows us screenshots of a search for “BP” conducted by two young women, both living in the northeastern US. They get very different results – one set focuses on business issues and doesn’t feature a link on the oil spill in the top three, while the other does. One user got 141 million results, while the other got 180 million. Just imagine how different those results could be for very different users.

Facebook also engages in customization, using information about the links you click to tailor the news that appears in your personal feed. Eli tells us he worked hard to add conservatives to his circle of friends and follow them on Facebook – so why wasn’t he getting news and links from them? Facebook saw that he was clicking more links about Lady Gaga and progressive politics, and customized his experience to filter out the conservative links.
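To make the mechanics concrete, here’s a minimal sketch in Python of how click-weighted ranking can quietly narrow a feed. The names and the scoring rule are invented for illustration – Facebook’s actual ranking code isn’t public – but the basic dynamic is the one Eli describes: friends whose links you never click gradually sink out of view, without any story ever being explicitly blocked.

```python
from collections import Counter

def rank_feed(stories, click_log, top_n=2):
    """Rank stories by how often the reader has clicked each friend's
    links in the past. Unclicked friends fall below the visible cutoff."""
    clicks = Counter(click_log)  # friend -> number of past clicks (0 if never)
    ranked = sorted(stories, key=lambda s: clicks[s["friend"]], reverse=True)
    return ranked[:top_n]

# Hypothetical click log: a reader who mostly clicks progressive friends' links.
click_log = ["alice", "alice", "alice", "bob"]
stories = [
    {"friend": "carol", "link": "conservative-op-ed"},
    {"friend": "alice", "link": "lady-gaga-tour"},
    {"friend": "bob", "link": "progressive-politics"},
]
# With top_n=2, carol's story never surfaces, even though she's still a friend.
print(rank_feed(stories, click_log))
```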

Eli terms this phenomenon a “filter bubble” – a special sort of echo chamber. The better our filters get, the less likely we are to be exposed to something novel, unexpected, or uncomfortable. This has always happened – media have always let us choose more familiar and comfortable perspectives and filter out others. But filter bubbles differ from the past in three ways:

– The degree of personalization is higher. You’re no longer just hanging out with the thousands of other readers of The Nation – you’re alone in your bubble.

– They’re invisible. Google doesn’t tell you it’s personalizing your bubble, which means there are big unknown unknowns.

– You don’t choose the filter – it chooses you. You know you’re choosing partisan news when you turn to Fox News or Democracy Now, but it’s increasingly difficult to escape the filter bubble.

We thought the battle on the internet was to defeat the censors – to get the story out of Iran, past the filters and the police. We thought we needed to circumvent the biases of traditional media gatekeepers. But now we’re facing a re-intermediation, this time by algorithms rather than by individuals.

We need filters – there’s more information created in a single year now than was created from the beginning of human history through 2008. But we need to think about the values embedded in these filters. Facebook’s filters have something to do with the statement Eli attributes to Mark Zuckerberg that a squirrel dying in front of your house might be more important to you than people dying in Africa. (I haven’t been able to source this quote – I’ll see Eli later today and ask for a footnote.)

Personalization is a great corporate strategy, but it’s bad for citizens. These filters could lead to the end of public conversation, Cass Sunstein worries – or the end of the public. But humans created these tools, and we can change them. We might add a slider to Facebook that lets us see news that’s more or less homogeneous.
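As a sketch of what that slider might look like – purely hypothetical, since no such control exists, and both the function name and the scoring are invented here – a single “homogeneity” parameter could blend personalized ranking with a deliberate dose of serendipity:

```python
import random

def feed_with_slider(stories, affinity, homogeneity=0.5, size=10):
    """homogeneity=1.0 is a pure filter bubble (top-scored stories only);
    homogeneity=0.0 is a random sample of everything. 'affinity' is any
    function predicting how much the reader will like a given story."""
    ranked = sorted(stories, key=affinity, reverse=True)
    n_personal = round(homogeneity * min(size, len(stories)))
    personal = ranked[:n_personal]
    rest = ranked[n_personal:]
    slots_left = min(size - n_personal, len(rest))
    serendipity = random.sample(rest, slots_left)  # stories the filter would hide
    return personal + serendipity
```

Sliding toward 0 trades predicted relevance for exposure to the unexpected – exactly the value a pure-personalization default discards.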

First, we need to get over the idea that code is neutral – it’s inherently political.

Eli invites us to continue the conversation using the #filterbubble tag. I look forward to connecting him with some of the writing I’ve done over the past two years on homophily.

14 thoughts on “Eli Pariser on Filter Bubbles”

  1. Excellent summary, Ethan. Eli’s talk was the first convincing presentation I’ve seen about the downsides of the personalization bubble. Glad to hear this will be dissected at greater length during Saturday’s PDF unconference.

  2. Pingback: Google Fellow at the Personal Democracy Forum « Can? We? Save? Africa?

  3. Pingback: New Digital Divides: The Personalized “Filter Bubbles” Menacing Democracy | Information Personnes / Persons Information

  4. Pingback: Highlights of Personal Democracy Forum 2010

  5. Pingback: Personalised to death: Filter bubbles « The New Media World

  6. Pingback: Reboot

  7. Pingback: The Big Bad (Filter) Bubble. Really? « Blogging from Binney St

  8. Pingback: Making news content more transparent « Sameer Padania

  9. Pingback: …My heart’s in Accra » In Soviet Russia, Google Researches You!

  10. Ethan makes the point that “the better our filters get, the less likely we are to be exposed to something novel, unexpected, or uncomfortable.” The truth is the exact opposite. :(

    As we stand today, filters do cater to our impulses. But that’s because of a lack of profile data about who we are and what we intend or need to do. For filters to work correctly, we need data flowing in real time between the providers and the receivers of internet content.

    So your point is valid, but only for now. Filters will increasingly give us what we need and be able to challenge us through intelligent prediction.

  11. The Zuckerberg quote reminds me of another quote, from the penny press.

    “A dog fight in New York is more important than revolution in China.” Not sure, but I believe it was Alexander Hamilton who said it.

    It makes sense if Facebook holds the same values the penny press did, in a social media version: everyday, local, human interest, private, etc.

  12. Pingback: …My heart’s in Accra » How diverse is your social network? How diverse should it be?

  13. Pingback: Phrase of the Day: filter bubble | The Big Picture
