Filter Bubbles

by ninaxu

In his TED Talk “Beware Online Filter Bubbles”, Eli Pariser recalls his youthful hopes for the internet and its potential to strengthen democracy and connect society. TED.com’s description of the video emphasises Pariser’s theory of the “filter bubble”, his metaphor for an internet tailored to each user’s presumed preferences by algorithms that scan the user’s likes, search history, clicked links, and other online activity. Pariser believes this is ultimately harmful for democracy: we are insulated from viewpoints that challenge our own, and meaningful dialogue is prevented.

While I recognise the danger in the filter bubble, I believe the online filter bubble is symptomatic of a different problem. Pariser himself supports this: although he claims that, despite being “progressive politically”, he has “always gone out of [his] way to meet conservatives… [and] likes hearing what they’re thinking about [and] seeing what they’re linking to”, he also notes that Facebook’s algorithms observed he “was clicking more on [his] liberal friends’ links than [his] conservative friends’ links”. Pariser, who considers himself open-minded and interested in exploring opposing viewpoints, is not actually doing so; like many people, he habitually clicks and reads only the links that interest him while skimming past those that do not. Online algorithms do not create the filter bubbles we live in; they make our pre-existing, self-made filter bubbles more impenetrable. We as individuals are perfectly capable of puncturing the filter bubble by actively seeking out opposing viewpoints, clicking links we disagree with, and engaging with those whose politics we do not support. Were we to do so, online algorithms would note the activity and begin to include people, links, and viewpoints that challenge our worldview.
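To make that mechanism concrete, here is a minimal toy model in Python of a click-weighted feed. Everything in it is hypothetical: the ToyFeed class, its single click-count signal, and the two example sources are my own illustration, not Facebook’s actual algorithm. It only shows how habitual clicking narrows what a feed surfaces, and how deliberately clicking the other side can widen it again.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass(frozen=True)
class Link:
    title: str
    source: str  # e.g. the friend or outlet sharing the link

class ToyFeed:
    """A deliberately simplified click-weighted feed.

    Illustrative only; real platforms combine many more signals
    (likes, dwell time, search history) in far more complex ways.
    """

    def __init__(self):
        self.clicks = Counter()

    def record_click(self, link: Link):
        # Every click teaches the feed to favour that source.
        self.clicks[link.source] += 1

    def rank(self, links):
        # Frequently clicked sources rise; rarely clicked ones sink
        # out of sight -- the mechanical core of a filter bubble.
        return sorted(links, key=lambda l: self.clicks[l.source], reverse=True)

feed = ToyFeed()
links = [Link("Tax plan analysis", "liberal_friend"),
         Link("Border policy op-ed", "conservative_friend")]

# Habitually clicking one side buries the other...
for _ in range(5):
    feed.record_click(links[0])
print([l.source for l in feed.rank(links)])  # liberal_friend first

# ...but deliberately clicking opposing links resurfaces them.
for _ in range(6):
    feed.record_click(links[1])
print([l.source for l in feed.rank(links)])  # conservative_friend first
```

Even in this toy version, the bubble is not imposed from outside: the ranking merely amplifies whichever clicking habit the user already has.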

Controlling our filter bubbles is made harder by another criticism Pariser levels against online algorithms: their invisibility. Few people are aware that algorithms curate their online lives; fewer still know how those algorithms work and how to circumvent them. While I believe our filter bubbles are so myopic because we tend to seek out viewpoints that support our own and dismiss those that oppose them, the opacity of online algorithms, and our inability to actively choose what we are shown, remain troubling. Being open about the existence of tools that scan our online lives, and about the methodology they use to choose what information we are presented, would go a long way towards giving us the choice of whether to puncture our filter bubbles. The rest is up to us.
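As a sketch of what such openness might look like, here is a hypothetical “why am I seeing this?” helper. No real platform’s interface is being described; the function name and the single click-count signal simply continue the assumptions of the toy model above.

```python
from collections import Counter

def explain_ranking(clicks: Counter, source: str) -> str:
    """Spell out the signal behind a feed decision -- the kind of
    transparency argued for above. A hypothetical interface."""
    n = clicks[source]
    return (f"Links from '{source}' rank where they do because you have "
            f"clicked them {n} time(s); engage with other sources to "
            f"change what you are shown.")

clicks = Counter({"liberal_friend": 5, "conservative_friend": 0})
print(explain_ranking(clicks, "conservative_friend"))
```

A disclosure this plain would make the bubble visible; acting on it would still be up to the reader.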