Roughly ten years ago, as phones became smartphones and Facebook and Twitter began their rise towards ubiquity, a fundamental social shift took place: the majority of people in the developed world became content creators. The bloggers of the early 2000s were joined by hundreds of millions of people posting videos to YouTube channels, pictures to Instagram, essays to Medium and countless status updates from 140 characters to Facebook wall posts. Before the internet, publishing had been a distinction, with a limited number of people lucky, talented or wealthy enough to share ideas or images with a wide audience. After the rise of social media, publishing became a default, with non-participation the exception.
There’s a problem with this rise in shared self-expression: we’ve all still got a constant and limited amount of attention available. For those creating content, this means the challenge now is not publishing your work, but finding an audience. The problem for those of us in the audience – i.e., all of us – is filtering through the information constantly coming at us.
Before the internet, we relied on newspapers and broadcasters to filter much of our information, choosing curators based on their styles, reputations and biases – did you want a Wall Street Journal or New York Times view of the world? Fox News or NPR? The rise of powerful search engines made it possible to filter information based on our own interests – if you’re interested in sumo wrestling, you can learn whatever Google will show you, even if professional curators don’t see the sport as a priority.
Social media has presented a new problem for filters. The theory behind social media is that we want to pay attention to what our friends and family think is important. In practice, paying attention to everything 500 or 1500 friends are interested in is overwhelming – Robin Dunbar theorizes that there's a hard limit to how many relationships we can cognitively maintain. Twitter solves this problem with a social hack: it's okay to miss posts on your feed because so many are flowing by… though Twitter now tries to catch you up on important posts if you had the temerity to step away from the service for a few hours.
Facebook and other social media platforms solve the problem a different way: the algorithm. Facebook’s news feed usually differs sharply from a list of the most recent items posted by your friends and pages you follow – instead, it’s been personalized using thousands of factors, meaning you’ll see posts Facebook thinks you’ll want to see from hours or days ago, while you’ll miss some recent posts the algorithm thinks won’t interest you. Research from the labs of Christian Sandvig and Karrie Karahalios suggests that even heavy Facebook users aren’t aware that algorithms shape their use of the service, and that many have experienced anxiety about not receiving responses to posts the algorithm suppressed.
Many of the anxieties about Facebook and other social platforms are really anxieties about filtering. The filter bubble, posited by Eli Pariser, is the idea that our natural tendencies towards homophily get amplified by filters designed to give us what we want, not ideas that challenge us, leading to ideological isolation and polarization. Fake news designed to mislead audiences and garner ad views relies on the fact that Facebook’s algorithms have a difficult time determining whether information is true or not, but can easily see whether information is new and popular, sharing information that’s received strong reactions from previous audiences. When Congress demands action on fake news and Kremlin propaganda, they’re requesting another form of filtering, based on who’s creating content and on whether it’s factually accurate.
USERS: Hey, can you get rid of the Nazis?
TWITTER: circle profile pics?
USERS: No. The Nazis.
TWITTER: 280 characters?
USERS: Nazis. Naaaaziiiiis.
— Parker Molloy (@ParkerMolloy) November 10, 2017
Twitter's problems with trolls, bots, extremists and harassment are filtering problems as well. Prominent users like Lindy West have left the service, complaining that Twitter is unwilling to remove serial abusers from the platform, or to give people abused on the service stronger tools to filter out and report abuse. As questions arise about Russian influence on the platform, Twitter may need to aggressively identify and filter out automated accounts used to promote pro-Trump or pro-Kremlin hashtags – the Hamilton68 Project focuses on tracking these accounts and understanding their influence, since Twitter has not yet filtered them out, either by banning them or by allowing audiences to block them from their feeds.
Why don't social media platforms like Facebook and Twitter give users powerful tools to filter their own feeds? Right now, the algorithms control what we see, but we can't control them. As the internet maxim goes, "If you're not paying for something, you're not the customer; you're the product being sold". Both Twitter and Facebook offer powerful filtering tools that allow advertisers to target exactly who they want their ads to reach. You can pay money and advertise to women of color aged 40 to 60 in Seattle, but you can't choose to read perspectives from those women. While we've seen great innovation from projects like BlockTogether, which lets users who experience harassment share Twitter blocklists, we've seen surprisingly little innovation on user-controllable filters from the platforms themselves. And unless we see something like public-service social media platforms, it's unlikely that platforms will give users much more control over what they see.
Algorithmic filters optimize platforms for user retention and engagement, keeping our eyes firmly on the site so that our attention can be sold to advertisers. We thought it was time that we all had a tool that let us filter social media the ways we choose. What if we could choose to challenge ourselves one day, encountering perspectives from outside our normal orbits, and relax another day, filtering for what's funniest and most viral? So we built Gobo.
Gobo is a social media aggregator with filters you control. You can use Gobo to control what’s edited out of your feed, or configure it to include news and points of view from outside your usual orbit. Gobo aims to be completely transparent, showing you why each post was included in your feed and inviting you to explore what was filtered out by your current filter settings.
To use Gobo, you link your Twitter and Facebook accounts to Gobo and choose a set of news publications that most closely resembles the news you follow online. Gobo retrieves recent posts from these social networks and lets you decide which ones you want to see. Want more posts from women? Adjust a slider to set the gender balance of your feed… or just click on the “mute all men” button and listen to the folks who often get shouted down in online dialogs. Want to broaden the perspectives in your feed? Move the politics slider from “my perspective” to “lots of perspectives” and Gobo introduces news stories from sources you might not otherwise find.
How does it work?
Gobo retrieves posts from people you follow on Twitter and Facebook and analyzes them using simple machine learning-based filters. You can set those filters – seriousness, rudeness, virality, gender and brands – to eliminate some posts from your feed. The “politics” slider works differently, “filtering in”, instead of “filtering out” – if you set the slider towards “lots of perspectives”, our “news echo” algorithm will start adding in posts from media outlets that you likely don’t read every day.
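To make the distinction concrete, here's a minimal sketch – not Gobo's actual code, with illustrative function names and scores – of how "filtering out" differs from the politics slider's "filtering in":

```python
# Hypothetical sketch of Gobo-style feed filtering. "Filter out" sliders drop
# posts whose score exceeds a threshold; the politics slider "filters in" by
# adding posts from outside sources. All names and numbers are made up.

def filter_out(posts, slider, key):
    """Keep only posts whose score for `key` is at or below the slider (0.0-1.0)."""
    return [p for p in posts if p["scores"][key] <= slider]

def filter_in(posts, outside_posts, perspectives_slider):
    """Mix in posts from unfamiliar outlets, in proportion to the slider setting."""
    n_extra = int(len(outside_posts) * perspectives_slider)
    return posts + outside_posts[:n_extra]

feed = [
    {"text": "calm take", "scores": {"rudeness": 0.1}},
    {"text": "angry rant", "scores": {"rudeness": 0.9}},
]
news_echo = [{"text": "story from an outlet you rarely read", "scores": {"rudeness": 0.2}}]

# Setting the rudeness slider to 0.5 removes the rant; pushing the politics
# slider to "lots of perspectives" (1.0) adds the outside story.
polite_feed = filter_out(feed, slider=0.5, key="rudeness")
broadened = filter_in(polite_feed, news_echo, perspectives_slider=1.0)
```

The asymmetry is the point: most sliders can only narrow your feed, while the politics slider is the one control that widens it.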
That sounds great! Why isn’t everyone using it?
There are some serious limitations to Gobo at present. It’s slow – we’re generally showing you posts that appeared on Twitter three hours ago. As we refine and scale the tool, we’ll get faster, but right now, Gobo’s a good way to see how algorithms shape your newsfeed, but not a great way to keep up with breaking news.
You’ll also notice that there’s probably a lot less content from Facebook than from Twitter. Facebook allows us to show you posts from public pages, but not from your friends’ individual pages. We’re exploring ways you might be able to feed your whole, unedited Facebook news feed through Gobo, but we’re not there yet.
You may also notice that filters don’t always work the way you’d expect. We’re using off-the-shelf open source machine learning filters – we may end up fine-tuning these over time, but we don’t have the advantage of billions of user sessions to learn from the way Facebook does. It’s also a good reminder that these filters are always probabilistic and inexact – you get to see where our system screws up, unlike with Facebook!
Who built it?
Gobo is a project of the Center for Civic Media at the MIT Media Lab and Comparative Media Studies at MIT. The idea for the project came from conversations between Chelsea Barabas, Neha Narula and Ethan Zuckerman around decentralized web publishing, leading to the report “Back to the Future: The Decentralized Web”. Rahul Bhargava, Jasmin Rubinovitz and Alexis Hope built the tool itself, with Jasmin focusing on the AI filters, Alexis on the product design and Rahul on integration and deployment.
Our work on Gobo and on decentralized publishing was made possible by the Knight Foundation, the founding donors behind our Center and supporters of some of our wackiest and most speculative work. We thank them for their trust and support.
Where’s Gobo going in the future?
We want Gobo to be more inclusive, incorporating content from new, decentralized social networks like Mastodon and Steemit, as well as existing networks like Instagram, YouTube and Tumblr. We really want to find a way to let users filter their Facebook feeds, as bringing transparency to that process was an inspiration for the project. We'd like to integrate RSS feed reading, possibly turning Gobo into a replacement for the late, great Google Reader. And we'd like it to be lots faster. In the long run, we'd love to see Gobo run entirely in the browser so we don't have central control over what content you're seeing – an intermediate step may include allowing people to run local Gobo servers à la Mastodon or Diaspora.
That said, the real goal behind Gobo is to open a conversation about who gets to filter what you see on the web. If we prompt a conversation about why platforms don’t give you more control over what you see, we’d be really happy. If Facebook or another platform incorporated ideas from Gobo in their own design, we’d throw a party. We’d even invite you.
Can I help make Gobo better?
Heck yeah. There are bound to be lots of bugs in this prototype. Beyond that, Gobo is an open source project and we'll be sharing source code on the MIT Media Lab GitHub repository. We've designed the prototype to treat ML filters as modules that can be dropped into our processing queue – we'd love ideas of other text or image analysis modules we can introduce as filters for Gobo.
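If you're thinking about contributing a filter, the modular shape we have in mind looks roughly like this – a hypothetical sketch, not the actual Gobo codebase, with a toy heuristic standing in for a real ML model:

```python
# Illustrative sketch of pluggable filter modules: each module scores a post,
# and the pipeline runs every registered module over incoming posts. Class and
# method names here are assumptions, not Gobo's real interfaces.

class FilterModule:
    """A filter module exposes a name and a score(text) -> float in [0, 1]."""
    name = "base"

    def score(self, post_text):
        raise NotImplementedError

class LengthFilter(FilterModule):
    # Toy stand-in for a trained model: longer posts score as more "serious".
    name = "seriousness"

    def score(self, post_text):
        return min(len(post_text) / 280.0, 1.0)

def run_pipeline(posts, modules):
    """Attach each module's score to every post, keyed by the module's name."""
    for post in posts:
        post["scores"] = {m.name: m.score(post["text"]) for m in modules}
    return posts

scored = run_pipeline([{"text": "a short post"}], [LengthFilter()])
```

A new text or image analysis filter would just subclass the module interface and register itself with the queue, without touching the rest of the pipeline.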
Why the name?
Ever seen a stage production where the lights look like they’re coming through a window, or the leaves of a forest? Those effects are created with gobos, filters cut from sheets of metal and placed in front of a light to shine a particular pattern on a curtain or other surface. We’re theater geeks, and it seemed like the perfect name for a product that lets you experiment with the effects of filters.