Who rules the Internet? These days it’s Upworthy, Eli Pariser’s socially minded aggregator, which fills a gap in viral content where puppies used to sleep. Its power on the Interwebs came quickly: in just two years, Upworthy has come to fill the Facebook feeds of 5.4 million people. (So it’s no surprise that haters gonna headline-hate.)
Upworthy is Pariser’s first major project since he published his 2011 book, The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think, which is also the subject of his TED Talk, given the same year. The filter bubble is the phenomenon whereby an algorithm tracks your clicks in your feeds and searches and, over time, predicts what it thinks you want. In theory, this could be a good thing. In practice, it actively limits the variety of opinions you’re exposed to.
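The dynamic Pariser describes can be sketched in a few lines of toy code. This is purely illustrative; no real platform’s ranking works this simply, and the story data and scoring rule here are invented for the example:

```python
from collections import Counter

def rank_feed(stories, click_history):
    """Toy personalization: score each story by how often the reader
    has clicked its topic before, then sort descending.

    Illustrative sketch only -- not any real platform's algorithm."""
    clicks = Counter(click_history)
    return sorted(stories, key=lambda s: clicks[s["topic"]], reverse=True)

stories = [
    {"title": "Cat does a flip", "topic": "cats"},
    {"title": "Global poverty report", "topic": "poverty"},
    {"title": "Another cat video", "topic": "cats"},
]

# A reader who has only ever clicked cat stories...
feed = rank_feed(stories, click_history=["cats", "cats", "cats"])
# ...sees cat content first, while the poverty story sinks to the bottom.
print([s["title"] for s in feed])
```

The feedback loop is the point: each click raises the score of similar content, so the feed drifts toward what you already click on and away from everything else.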
There is a seeming contradiction between the two projects: the progressive views of Upworthy might actually exacerbate, not pop, the filter bubble. Curious about the relationship between them, I caught up with Pariser over the phone to ask what he’s been up to since he gave his talk, and what Upworthy is doing to make the Internet a better place to hang out. Below is an edited transcript of our conversation.
How does Upworthy address the filter bubble? Isn’t it possible it makes it worse?
Well, in the talk, and even more in the book, there are two pieces: One is the partisan echo chamber challenge, and the other is, do people get exposed to content about topics that are in the public sphere at all — or is it Miley Cyrus and cats all the way down? And it’s that latter piece that I think Upworthy is really aimed at solving. My view is, if people were exposed to many points of view on global poverty, that would be great — but I would rather they be exposed to one than none.
Why do you think people were so upset when you pointed out that their feeds and search results were being filtered?
People still thought that everybody sees the same things through Google and everyone sees all of the posts on Facebook and the Facebook news feed. When you can demonstrate how inaccurate that is, it’s really surprising. It’s sort of like being told that your glasses edit out certain people as you’re walking down the street. Which maybe someday Google Glass will do.
So I think that was a piece of it. Also I think we’re all really wrestling with these questions of control that are raised by the Internet era. On the one hand, Google and Facebook are great tools. On the other hand, they’re serving us up to advertisers, and who really gets to decide where the allegiances ultimately are is kind of one of the big questions of the era. The talk tapped into some very visceral aspects of that question.
Since you gave your talk, have you noticed an increase in social media sites being transparent about their algorithms?
I haven’t. With the exception of DuckDuckGo, a search engine that’s really, really focused on doing non-filter-bubble-esque search — they also run the site dontbubble.us — I don’t think that there’s a whole lot more transparency. I’m disappointed on that front. But I am pleased to see how many people are grappling with these questions: people creating browser extensions that show you how your data is being used, academics researching the effects of these kinds of things, and Facebook itself doing research into the phenomenon and, I think, making some efforts to highlight higher-quality information sources in its news feed.
Is it more important for Facebook to show you all posts, or for Facebook to be more transparent about what it’s hiding? If you were Mark Zuckerberg would you change the terms and conditions, or the algorithm?
I feel like the right answer is to change the terms and conditions to make it more transparent, but I’d be very tempted to mess with the algorithms, because it’s such a fascinating, huge thing. Facebook gets to decide how hundreds of millions or billions of people spend hours and hours every month, just through that algorithm, and media companies rise and fall by the sword of the Facebook algorithm. They’ve got this really incredible power to affect how the future of media looks, because when you control distribution, you can influence creation. You start to create suction for different kinds of content. I might wish that power weren’t so centralized in so few hands, but it’s really interesting to think about how you would use it to create a more democracy-friendly media structure.
Lots of people would say that you’ve unlocked that algorithm.
I’ve been thinking about the Facebook algorithm for a while, but honestly I wouldn’t expect that I know a hundredth of what’s actually going on in there. A fun exercise that I did was I invited a bunch of people to think about: If you were Facebook, how would you think about quality of media?
What’s next for you after Upworthy?
It’s all Upworthy all the time right now.
So, Upworthy is meant to address a dearth of good content on the web. But what if the reality is that most people just like junk?
Well, hopefully what we’ve demonstrated is that that’s just not true. There is a huge audience for content about economic inequality and environmental challenges; Upworthy has demonstrated that, and so, actually, has TED. The problem, empirically, isn’t that there aren’t enough people who really care about this stuff; it’s that most people either don’t believe that they care or aren’t making the media focus on those topics.