Algorithmic catastrophe: How news feeds reprogram your mind and habits
The most powerful editors in the world? Algorithms.
ELI PARISER: A filter bubble is your own personal universe of information that's been generated by algorithms that are trying to guess what you're interested in. And increasingly online we live in these bubbles. They follow us around. They form part of the fabric of most websites that we visit and I think we're starting to see how they're creating some challenges for democracy.
We've always chosen media that conforms to our views and read newspapers or magazines that in some way reflect what we're interested in and who we want to be. But the age of algorithmically mediated media is really different in a couple of ways. One way is that it's not something we know we're choosing. We don't know on what basis, or who, an algorithm thinks we are, and therefore we don't know how it's deciding what to show us or not show us. And it's often that not-showing-us part that's the most important: we don't know what piece of the picture we're missing, because by definition it's out of view. So increasingly, I think, part of what we're seeing online is that it's getting harder and harder even to imagine how someone else might come to the views that they have, or see the world the way they do, because that information is literally not part of what we're seeing or consuming. Another feature of the filter bubble landscape is that it's automatic; it's not something we're choosing. When you pick up a left-wing magazine or a right-wing magazine, you know what the bias is and what to expect.
A deeper problem with algorithms choosing what we see and don't see is that the data they base those decisions on is really not representative of the whole of who we are as human beings. Facebook is basically taking a handful of signals, what we click on and don't click on, maybe how much time we spend with different things, and trying to extract from that some general truth about what we're interested in or what we care about. And that clicking self, the one deciding in fractions of a second whether I'm interested in this article or not, just isn't a very full representation of our whole human self. You can do this experiment: look back at your web history for the last month. Obviously there will be some things there that really gave you a lot of value, that represent your true self or your innermost self. But there's a lot of stuff, you know, I click on cell phone reviews even though I'll always have an iPhone; I'm never not going to have an iPhone. It's just some kind of compulsion that I have. And I don't particularly need or want algorithms amping up my desire to read useless technology reviews.
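The mechanism being described, inferring interest from click behavior, can be illustrated with a toy model. Everything here is hypothetical (the topic names, the numbers, the scoring rule); it is not Facebook's actual system, just a minimal sketch of why a feed ranked purely on past clicks keeps amplifying a compulsive habit:

```python
from collections import Counter

# Hypothetical click log: each entry is the topic of an item the user clicked.
# The compulsive habit (phone reviews) dominates the log, even though the
# user says it's the content they value least.
clicks = ["phone-reviews"] * 8 + ["local-news"] * 2 + ["longform-essay"] * 1

def interest_profile(clicks):
    """Naive inference: relative click frequency per topic."""
    counts = Counter(clicks)
    total = sum(counts.values())
    return {topic: n / total for topic, n in counts.items()}

def rank_feed(candidates, profile):
    """Rank candidate items by inferred interest, highest score first."""
    return sorted(candidates, key=lambda t: profile.get(t, 0.0), reverse=True)

profile = interest_profile(clicks)
feed = rank_feed(["longform-essay", "local-news", "phone-reviews"], profile)
print(feed)  # phone reviews rank first: the compulsion gets amped up
```

The feedback loop follows directly: whatever ranks first gets seen and clicked more, which raises its score further, which is exactly the "amping up" Pariser describes.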
The people who create these algorithms like to say they're neutral: "We don't want to take an editorial point of view." And there's something to that that's important, you know. We don't want Mark Zuckerberg to impose his political views on all of us, and I don't think he is. But it's also kind of a weird dodge, because every time you create a list, and that's essentially all that Facebook or Twitter is, a list that ranks information, you're making value judgments about what goes at the top of the list and what goes at the bottom. There's no such thing as a neutral algorithm: you have to decide, on some basis, that some things are more valuable and more worthy of attention than others. I always find it dangerous when people say there's no editorial viewpoint here, or we're not taking an editorial stand, because every list has some kind of viewpoint about what matters and what doesn't matter. There's no such thing as a neutral list, because if I'm neutral on one criterion I'll often be biased on some other criterion. If I rank people alphabetically, there's no guarantee that will have an equal impact on people of different ethnicities or different races or different genders. What we have to grapple with is that we do have these editors, much more powerful than ever before, shaping what we see and don't see. But they themselves haven't really taken on the responsibility of that editorial judgment.
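The "no neutral list" point can be made concrete. Even the most apparently value-free ordering, an alphabetical sort, systematically advantages some entries over others, and any other single sort key is just a different value judgment. A minimal sketch (the names below are invented for illustration):

```python
# Invented applicant names. The "neutral" alphabetical ranking still has a
# systematic effect: names earlier in the alphabet always surface first.
applicants = ["Zhang Wei", "Aaron Smith", "Okafor Chinedu", "Müller Hans"]

alphabetical = sorted(applicants)  # looks neutral...
print(alphabetical[0])            # ...but the same name gets top placement every time

# Swapping the criterion (here: name length, standing in for clicks,
# recency, engagement, etc.) produces a different winner. The choice of
# sort key is itself the editorial judgment.
by_length = sorted(applicants, key=len)
print(by_length[0])
```

Whichever key you pick, someone lands at the top and someone falls out of view, which is exactly why "we just rank things" is not a way of avoiding a viewpoint.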
- According to a Pew Research Center poll, 45% of U.S. adults get at least some of their news from Facebook, with about half of them using Facebook as their only news outlet.
- Algorithms on social media pick what people read. The worry is that these algorithms create filter bubbles, in which users never have to read anything they disagree with, feeding tribal thinking and confirmation bias.
- The Charles Koch Foundation is committed to understanding what drives intolerance and the best ways to cure it. The foundation supports interdisciplinary research to overcome intolerance, new models for peaceful interactions, and experiments that can heal fractured communities. For more information, visit charleskochfoundation.org/courageous-collaborations.
- The opinions expressed in this video do not necessarily reflect the views of the Charles Koch Foundation, which encourages the expression of diverse viewpoints within a culture of civil discourse and mutual respect.
The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think
SMARTER FASTER trademarks owned by The Big Think, Inc. All rights reserved.