Algorithms, Identity and You/Me/(Us?)
In his TED talk, Eli Pariser is concerned with how content filter algorithms, used by sites like Facebook and Google, insulate the individual from ideas outside their usual sphere of experience, creating so-called “filter bubbles.” This problem is of particular interest if we apply it to the process of identity construction.
Katja de Vries, in her paper “Identity, profiling algorithms and a world of ambient intelligence,” situates these algorithms within emerging ambient intelligence technologies, which envision a world “in which computing is brought into the world by integrating sensors and microprocessors with everyday objects” and which incorporate “a certain additional social ability” (78). Such technologies can operate as Foucauldian identity apparatuses “that constitute a mediating reality that can give rise to identifications and identities” (79).
De Vries presents three possibilities for this brave new world: one, that identity has “always been constituted by technological memory” (79) and will therefore be little affected; two, that the experiential identity of the self (ipse-identity) will be destroyed; and three, that we will be faced with the uncanny reality of the “uncontrollability of one’s identity in general” (83).
Although we do not yet live in a world where our cars can predict where we want to go (or do we?), the relationship between present-day algorithms, social networks, and identity is a growing concern, one that will only intensify as these ambient technologies emerge. Facebook, for instance, asks us to self-identify attributes like religion or professional skills, but it does so through an apparatus that dictates the language available to us while algorithmically filtering the content we receive, content that may itself form a key component of our identity construction.
In a more overt way, a dating website quiz tells me that I am a hopeless romantic, while BuzzFeed says I am the Disney princess Cinderella (although I have always felt I was a Jasmine). What does it mean to encode these identities? How are they different from traditional cultural scripts?
And while this algorithmic-identity dynamic could be used in an Orwellian sense to create the perfect citizen (China, certainly, has tried to create national filter bubbles in the past), I think it has the potential to do quite the opposite. Vincent Miller traces a shift in new media “where the point of the network was to facilitate an exchange of substantive content… to a situation where the maintenance of a network itself has become the primary focus” (399). This is in the same vein as Pariser’s worry that the filter bubble facilitates only the delivery of “junk,” eliminating the informational content that might feed an idealized self.
The biggest identity crisis might then become whether I am a cat-video person or a dog-video person (the former, actually), rather than a crisis of any real substance. The implications of algorithmic control for identity apparatuses like social networks are far-reaching and deserve our attention.
de Vries, K. (2010). Identity, profiling algorithms and a world of ambient intelligence. Ethics and Information Technology, 12(1), 71-85. doi: http://dx.doi.org/10.1007/s10676-009-9215-9
Miller, V. (2008). New media, networking and phatic culture. Convergence: The International Journal of Research into New Media Technologies, 14(4), 387-400. doi: http://dx.doi.org/10.1177/1354856508094659