Facebook and the Dangers of Binary Thinking

The other week, I read an interesting article by Mat Honan, a professional blogger, on Wired. It was directly relevant to our in-class discussion today of Pariser and how Facebook whittles down our worldview through a feedback system powered by our likes, comments, and demographic information. In summary, Honan liked everything he saw on Facebook for two days, then compared his timeline to what it had looked like before the liking spree. With each like, his ‘filter bubble’ narrowed in one direction or another, toward different points on the political spectrum or, as he put it, “down rabbit holes of special interests until we’re lost in the queen’s garden, cursing everyone above ground” (Honan). This made me wonder what the same mechanism does to people who have just joined Facebook, and how it affects their mental development and growth relative to the site’s supposed ‘goal.’

Facebook, as we discussed in class, is theoretically designed to expand the horizons of our knowledge and to create social connections with people in our own community and, as the logo depicts, across the world. But if someone joins Facebook, enters the mandatory information, adds their friends, and begins to ‘like’ things that tailor the information propagated on their News Feed, how does that affect them down the line? I’m curious how adversely this reinforcement system shapes a person’s ideological, cultural, and personal beliefs compared to someone who doesn’t use Facebook. I think this system creates dangerous binaries within Facebook society: you are either ‘a’ or ‘b,’ and you are visually limited to the information of your self-imposed group, which keeps you from engaging in legitimate discourse or having a fair argument informed by both sides of a topic. This, in turn, I believe, leaves people unable to discuss issues in real life, outside the confines of a laptop screen, because they are uninformed, biased, and illiterate in anything but the beliefs they already held from the start. Instead of connecting individuals, Facebook is creating binaries, lines, and borders that separate them.
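What Pariser calls the filter bubble is, mechanically, just a positive feedback loop: every like raises the relevance of similar content, the next feed is built from those relevance scores, and the new feed is even more likely to get liked. Here is a toy sketch of that loop (purely my own illustration; the topic names, weights, and scoring rule are assumptions for the sake of the example, not anything Facebook has published):

```python
import random
from collections import Counter

# A toy filter-bubble loop (my own illustration, not Facebook's actual ranking
# code). Each "like" multiplies the weight of that story's topic, and the next
# feed is sampled in proportion to those weights, so whatever gets liked early
# keeps getting shown, and liked, more and more often.

TOPICS = ["left politics", "right politics", "cat videos", "local news"]

def build_feed(weights, size=10):
    """Sample a feed of `size` stories, favoring higher-weighted topics."""
    return random.choices(TOPICS, weights=[weights[t] for t in TOPICS], k=size)

def like_everything(weights, feed, boost=1.3):
    """Honan-style behavior: like every story that appears."""
    for topic in feed:
        weights[topic] *= boost
    return weights

weights = {topic: 1.0 for topic in TOPICS}   # start with a balanced feed
for _ in range(20):                          # repeated browsing sessions
    weights = like_everything(weights, build_feed(weights))

# After the liking spree, the feed skews heavily toward the early favorites.
print(Counter(build_feed(weights, size=100)))
```

Run it a few times and the printed counts tilt toward whichever topics happened to get liked most in the first sessions, which is roughly the rich-get-richer narrowing Honan describes.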

Honan, Mat. “I Liked Everything I Saw on Facebook for Two Days. Here’s What It Did to Me.” Wired.com. Conde Nast Digital, 14 Aug. 2014. Web. 11 Sept. 2014. <http://www.wired.com/2014/08/i-liked-everything-i-saw-on-facebook-for-two-days-heres-what-it-did-to-me/>.

Lareau, Jonathan. “Non-Dual Thinking: There Are Things We Don’t Know.” Tiny Buddha. N.p., n.d. Web. 11 Sept. 2014. <http://tinybuddha.com/blog/non-dual-thinking-there-are-things-we-dont-know/>.

 


3 Responses to Facebook and the Dangers of Binary Thinking

  1. sdcook

    Honan’s article is interesting as a thought experiment when we start talking about the totality of one’s exposure being led down a very specific, narrow path. He says that liking something pro-Israel started a slide into a very right-wing sphere of his News Feed. Honan continues to like everything, as the experiment calls for, but eventually reaches a point where his News Feed “[drifts] further and further left.” I believe this might be because when events get reported, there are so many approaches to the subject matter that one cannot find a wholly congruent one and is almost forced to make ‘choices,’ so to speak, about one’s beliefs.

    So look at how, say, the recent story about NATO delivering arms to Ukraine differs between this BBC article and this RT News piece (essentially a Russian propaganda wagon; read the comments for a bit of fun). An event occurs and the reportage and vocabulary may differ (check out the use of Kiev versus Ukraine in the headlines), but nonetheless one is left to seek answers to questions like: Why is NATO delivering weapons? Who else is involved? What does this mean for me?

    And lucky for us, the internet isn’t a vacuum; we are not beholden to a single source (this holds true for our News Feed as well). We are able to search out the truth despite whatever Google algorithms may try to get in our way.

    Of course, I’ve only really approached the question from the relatively simple and demonstrable political/ideological standpoint. The issue may get a tad bit more complicated when we’re considering entire ways of thinking/conceptualizing, but that’s a question for another comment.

    • sdcook

      And I pretty much wrote a blog entry. BUT essentially I was trying to respond to your comment about people becoming “uninformed, biased, and illiterate in anything but the beliefs they already held from the start” and very politely refute its severity.

      • Hi, thanks for the reply! I see your point: we can seek out our own information. I am generalizing a bit, but I think that when people are just browsing Facebook, they often click the links they’re given and don’t always look up information on their own to form their own opinions. In that sense, they restrict their mindset and their opinions on topics and ideologies. I agree, I may have been a bit too severe and general with my viewpoint.
