Categories: Rip.Mix.Feed.

Tragedy of the Creative Commons


The title is a play on words based on the well-known problem of the “tragedy of the commons”.

Categories: Major Project

Hypermedia and Cybernetics: A Phenomenological Study

As with all other technologies, hypermedia technologies are inseparable from what is referred to in phenomenology as “lifeworlds”. The concept of a lifeworld is in part a development of an analysis of existence put forth by Martin Heidegger. Heidegger explains that our everyday experience is one in which we are concerned with the future and in which we encounter objects as parts of an interconnected complex of equipment related to our projects (Heidegger, 1962, p. 91-122). As such, we invariably encounter specific technologies only within a complex of equipment. Giving the example of a bridge, Heidegger notes that, “It does not just connect banks that are already there. The banks emerge as banks only as the bridge crosses the stream.” (Heidegger, 1993, p. 354). As a consequence of this connection between technologies and lifeworlds, new technologies bring about ecological changes to the lifeworlds, language, and cultural practices with which they are connected (Postman, 1993, p. 18). Hypermedia technologies are no exception.

To examine the kinds of changes brought about by hypermedia technologies it is important to examine the history not only of those technologies themselves but also of the lifeworlds in which they developed. Such a study will reveal that the development of hypermedia technologies involved an unlikely confluence of two subcultures. One of these subcultures belonged to the United States military-industrial-academic complex during World War II and the Cold War, and the other was part of the American counterculture movement of the 1960s.

Many developments in hypermedia can trace their origins back to the work of Norbert Wiener. During World War II, Wiener conducted research for the US military concerning how to aim anti-aircraft guns. The problem was that modern planes moved so fast that it was necessary for anti-aircraft gunners to aim their guns not at where the plane was when they fired the gun but where it would be some time after they fired. Where they needed to aim depended on the speed and course of the plane. In the course of his research into this problem, Wiener decided to treat the gunners and the gun as a single system. This led to his development of a multidisciplinary approach that he called “cybernetics”, which studied self-regulating systems and used the operations of computers as a model for these systems (Turner, 2006, p. 20-21).

This approach was first applied to the development of hypermedia in an article written by one of Norbert Wiener’s former colleagues, Vannevar Bush. Bush had been responsible for instigating and running the National Defense Research Committee (which was later subsumed under the Office of Scientific Research and Development), an organization responsible for government funding of military research by private contractors. Following his experiences in military research, Bush wrote an article in the Atlantic Monthly addressing the question of how scientists would be able to cope with growing specialization and how they would collate an overwhelming amount of research (Bush, 1945). Bush imagined a device, which he later called the “Memex”, in which information such as books, records, and communications would be stored on microfilm. This information would be capable of being projected on screens, and the person who used the Memex would be able to create a complex system of “trails” connecting different parts of the stored information. By connecting documents into a non-hierarchical system of information, the Memex would to some extent embody the principles of cybernetics first imagined by Wiener.

Inspired by Bush’s idea of the Memex, researcher Douglas Engelbart believed that such a device could be used to augment the use of “symbolic structures” and thereby accurately represent and manipulate “conceptual structures” (Engelbart, 1962). This led him and his team at the Augmentation Research Center (ARC) to develop the “oN-Line System” (NLS), an ancestor of the personal computer which included a screen, a QWERTY keyboard, and a mouse. With this system, users could manipulate text and connect elements of text with hyperlinks. While Engelbart envisioned this system as augmenting the intellect of the individual, he conceived of the individual as part of a system, which he referred to as an H-LAM/T system (a trained human with language, artefacts, and methodology) (ibid., p. 11). Drawing upon the ideas of cybernetics, Engelbart saw the NLS itself as a self-regulatory system in which engineers collaborated and, as a consequence, improved the system, a process he called “bootstrapping” (Turner, 2006, p. 108).

The military-industrial-academic complex’s cybernetic research culture also led to the idea of an interconnected network of computers, an idea that would be key to the development of the internet and hypermedia. First formulated by J.C.R. Licklider, it was later executed by Bob Taylor with the creation of ARPANET (named after the Defense Department’s Advanced Research Projects Agency). As an extension of systems such as the NLS, ARPANET was a self-regulating network for collaboration, likewise inspired by the study of cybernetics.

The late 1960s to the early 1980s saw hypermedia’s development transformed from a project within the US military-industrial-academic complex to a vision animating the American counterculture movement. This may seem remarkable for several reasons. Movements related to the budding counterculture in the early 1960s generally adhered to a view that developments in technology, particularly in computer technology, had a dehumanizing effect and threatened the authentic life of the individual. Such movements were also hostile to the US military-industrial-academic complex that had developed computer technologies, generally opposing American foreign policy and especially American military involvement in Vietnam. Computer technologies were seen as part of the power structure of this complex and were again seen as part of an oppressive dehumanizing force (Turner, 2006, p. 28-29).

This negative view of computer technologies more or less continued to hold in the New Left movements largely centred on the East Coast of the United States. However, a contrasting view began to grow in the counterculture movement developing primarily on the West Coast. Unlike the New Left movement, the counterculture became disaffected with traditional methods of social change, such as staging protests and organizing unions. It was thought that these methods still belonged to the traditional systems of power and, if anything, compounded the problems caused by those systems. To effect real change, it was believed, a shift in consciousness was necessary (Turner, 2006, p. 35-36).

Rather than seeing technologies as necessarily dehumanizing, some in the counterculture took the view that technology would be part of the means by which people liberated themselves from stultifying traditions. One major influence on this view was Marshall McLuhan, who argued that electronic media would become an extension of the human nervous system and would result in a new form of tribal social organization that he called the “global village” (McLuhan, 1962). Another influence, perhaps even stronger, was Buckminster Fuller, who took the cybernetic view of the world as an information system and coupled it with the belief that technology could be used by designers to live a life of authentic self-sufficiency (Turner, 2006, p. 55-58).

In the late 1960s, many in the counterculture movement sought to effect the change in consciousness and social organization that they wished to see by forming communes (Turner, 2006, p. 32). These communes would embody the view that it was not through political protest but through the expansion of consciousness and the use of technologies (such as Buckminster Fuller’s geodesic domes) that a true revolution would be brought about. To supply members of these communes and other wayfarers in the counterculture with the tools they needed to make these changes, Stewart Brand developed the Whole Earth Catalog (WEC). The WEC provided lists of books, mechanical devices, and outdoor gear that were available through mail order for low prices. Subscribers were also encouraged to provide information on other items that would be listed in subsequent editions. The WEC was not a commercial catalogue in that it wasn’t possible to order items from the catalogue itself. It was rather a publication that listed various sources of information and technology from a variety of contributors. As Fred Turner argues (2006, p. 72-73), it was seen as a forum by means of which people from various different communities could collaborate.

Like many others in the counterculture movement, Stewart Brand immersed himself in cybernetics literature. Inspired by the connection he saw between cybernetics and the philosophy of Buckminster Fuller, Brand used the WEC to broker connections between ARC and the then flourishing counterculture (Turner, 2006, p. 109-10). In 1985, Stewart Brand and former commune member Larry Brilliant took the further step of uniting the two cultures by bringing the ethos of the WEC online in one of the first virtual communities, the Whole Earth ‘Lectronic Link, or “WELL”. The WELL included bulletin board forums, email, and later web pages, and grew from a source of tools for counterculture communes into a forum for discussion and collaboration of any kind. The design of the WELL was based on communal principles and cybernetic theory. It was intended to be a self-regulating, non-hierarchical system for collaboration. As Turner notes (2005), “Like the Catalog, the WELL became a forum within which geographically dispersed individuals could build a sense of nonhierarchical, collaborative community around their interactions” (p. 491).

This confluence of military-industrial-academic complex technologies and the countercultural communities who put those technologies to use would form the roots of other hypermedia technologies. The ferment of the two cultures in Silicon Valley would result in the further development of the internet—the early dependence on text being supplanted by the use of text, image, and sound, transforming hypertext into full hypermedia. The idea of a self-regulating, non-hierarchical network would moreover result in the creation of the collaborative, social-networking technologies commonly denoted as “Web 2.0”.

This brief survey of the history of hypermedia technologies has shown that the lifeworld in which these technologies developed was one first imagined in the field of cybernetics. It is a lifeworld characterised by non-hierarchical, self-regulating systems and by the project of collaborating and sharing information. First of all, it is characterized by non-hierarchical organizations of individuals. Even though these technologies first developed in the hierarchical system of the military-industrial-academic complex, they grew within a subculture of collaboration among scientists and engineers (Turner, 2006, p. 18). Rather than being strictly regimented, prominent figures in this subculture, including Wiener, Bush, and Engelbart, voiced concern over the possible authoritarian abuse of these technologies (ibid., p. 23-24).

The lifeworld associated with hypermedia is also characterized by the non-hierarchical dissemination of information. Rather than belonging to traditional institutions consisting of authorities who distribute information to others directly, these technologies involve the spread of information across networks. Such information is modified by individuals within the networks through the use of hyperlinks and collaborative software such as wikis.

The structure of hypermedia itself is also arguably non-hierarchical (Bolter, 2001, p. 27-46). Hypertext, and by extension hypermedia, facilitates an organization of information that admits of many different readings. That is, it is possible for the reader to navigate links and follow what Bush called different “trails” of connected information. Printed text generally restricts reading to one trail or at least very few trails, and lends itself to the organization of information in a hierarchical pattern (volumes divided into books, which are divided into chapters, which are divided into paragraphs, et cetera).

It is clear that the advent of hypermedia has been accompanied by changes in hierarchical organizations in lifeworlds and practices. One obvious example would be the damage that has been sustained by newspapers and the music industry. The phenomenological view of technologies as connected to lifeworlds and practices would provide a more sophisticated view of this change than either the technological determinist view that hypermedia itself has brought about changes in society or the instrumentalist view that the technologies are value-neutral and that these changes have been brought about by choice alone (Chandler, 2002). It would rather suggest that hypermedia is connected to practices that largely preclude both the hierarchical dissemination of information and the institutions that are involved in such dissemination. As such, they cannot but threaten institutions such as the music industry and newspapers. As Postman (1993) observes, “When an old technology is assaulted by a new one, institutions are threatened” (p. 18).

Critics of hypermedia technologies, such as Andrew Keen (2007), have generally focussed on this threat to institutions, arguing that such a threat undermines traditions of rational inquiry and the production of quality media. To some degree such criticisms are an extension of a traditional critique of modernity made by authors such as Alan Bloom (1987) and Christopher Lasch (1979). This would suggest that such criticisms are rooted in more perennial issues concerning the place of tradition, culture, and authority in society, and it is not likely that these issues will subside. However, it is also unlikely that there will be a return to a state of affairs before the inception of hypermedia. Even the most strident critics of “Web 2.0” technologies embrace certain aspects of it.

The lifeworld of hypermedia does not necessarily oppose traditional sources of expertise to the extent that the descendants of the fiercely anti-authoritarian counterculture may suggest, though. Advocates of Web 2.0 technologies often appeal to the “wisdom of crowds”, alluding to the work of James Surowiecki (2005). Surowiecki offers the view that, under certain conditions, the aggregation of the choices of independent individuals results in a better decision than one made by a single expert. He is mainly concerned with economic decisions, offering his theory as a defence of free markets. Yet this theory also suggests a general epistemology, one which would contend that the aggregation of the beliefs of many independent individuals will generally be closer to the truth than the view of a single expert. In this sense, it is an epistemology modelled on the cybernetic view of self-regulating systems. If it is correct, knowledge would be the result of a cybernetic network of individuals rather than a hierarchical system in which knowledge is created by experts and filtered down to others.

The main problem with the “wisdom of crowds” epistemology as it stands is that it does not explain the development of knowledge in the sciences and the humanities. Knowledge of this kind doubtless requires collaboration, but in any domain of inquiry this collaboration still requires the individual mastery of methodologies and bodies of knowledge. It is not the result of mere negotiation among people with radically disparate perspectives. These methodologies and bodies of knowledge may change, of course, but a study of the history of sciences and humanities shows that this generally does not occur through the efforts of those who are generally ignorant of those methodologies and bodies of knowledge sharing their opinions and arriving at a consensus.

As a rule, individuals do not take the position of global skeptics, doubting everything that is not self-evident or that does not follow necessarily from what is self-evident. Even if people would like to think that they are skeptics of this sort, to offer reasons for being skeptical about any belief they will need to draw upon a host of other beliefs that they accept as true, and to do so they will tend to rely on sources of information that they consider authoritative (Wittgenstein, 1969). Examples of the “wisdom of crowds” will also be ones in which individuals each draw upon what they consider to be established knowledge, or at least established methods for obtaining knowledge. Consequently, the wisdom of crowds is parasitic upon other forms of wisdom.

Hypermedia technologies and the practices and lifeworld to which they belong do not necessarily commit us to the crude epistemology based on the “wisdom of crowds”. The culture of collaboration among scientists that first characterized the development of these technologies did not preclude the importance of individual expertise. Nor did it oppose all notions of hierarchy. For example, Engelbart (1962) imagined the H-LAM/T system as one in which there are hierarchies of processes, with higher executive processes governing lower ones.

The lifeworlds and practices associated with hypermedia will evidently continue to pose a challenge to traditional sources of knowledge. Educational institutions have remained somewhat unaffected by the hardships faced by the music industry and newspapers due to their connection with other institutions and practices such as accreditation. If this phenomenological study is correct, however, it is difficult to believe that they will remain unaffected as these technologies take deeper root in our lifeworld and our cultural practices. There will continue to be a need for expertise, though, and the challenge will be to develop methods for recognizing expertise, both in the sense of providing standards for accrediting experts and in the sense of providing remuneration for expertise. As this concerns the structure of lifeworlds and practices themselves, it will require a further examination of those lifeworlds and practices and an investigation of ideas and values surrounding the nature of authority and of expertise.

References

Bloom, A. (1987). The closing of the American mind. New York: Simon & Schuster.

Bolter, J. D. (2001). Writing space: Computers, hypertext, and the remediation of print (2nd ed.). New Jersey: Lawrence Erlbaum Associates.

Bush, V. (1945). As we may think. Atlantic Monthly. Retrieved from http://www.theatlantic.com/doc/194507/bush

Chandler, D. (2002). Technological or media determinism. Retrieved from http://www.aber.ac.uk/media/Documents/tecdet/tecdet.html

Engelbart, D. (1962). Augmenting human intellect: A conceptual framework. Menlo Park: Stanford Research Institute.

Heidegger, M. (1993). Basic writings. (D.F. Krell, Ed.). San Francisco: Harper Collins.

—–. (1962). Being and time. (J. Macquarrie & E. Robinson, Trans.). San Francisco: Harper Collins.

Keen, A. (2007). The cult of the amateur: How today’s internet is killing our culture. New York: Doubleday.

Lasch, C. (1979). The culture of narcissism: American life in an age of diminishing expectations. New York: W.W. Norton & Company.

McLuhan, M. (1962). The Gutenberg galaxy. Toronto: University of Toronto Press.

Postman, N. (1993). Technopoly: The surrender of culture to technology. New York: Vintage.

Surowiecki, J. (2005). The wisdom of crowds. Toronto: Anchor.

Turner, F. (2006). From counterculture to cyberculture: Stewart Brand, the Whole Earth Network, and the rise of digital utopianism. Chicago: University of Chicago Press.

—–. (2005). Where the counterculture met the new economy: The WELL and the origins of virtual community. Technology and Culture, 46(3), 485–512.

Wittgenstein, L. (1969). On certainty. New York: Harper.

Categories: Research Paper

William Blake and the Remediation of Print

One might be inclined to view William Blake’s illuminated books as throwbacks to mediaeval illuminated manuscripts. Yet they should rather be understood as “remediating” older media. According to Bolter (2001, p. 23), remediation occurs when a new medium pays homage to an older medium, borrowing and imitating features of it, and yet also stands in opposition to it, attempting to improve on it. In the case of Blake’s illuminated books, one of the older media being remediated was the mediaeval illuminated manuscript, but another medium being remediated was the printed book, which in Blake’s time had already been in use for three centuries.

Blake adopted the way in which the richly illustrated texts of mediaeval illuminated manuscripts combined the iconic and the symbolic so that the former illumined the meaning of the latter, the images revealing the spiritual significance of the scripture. Blake also seized upon an aspect of illuminated manuscripts which would later impress John Ruskin as well (Keep, McLaughlin, & Parmar, 1993-2000)—the way in which they served as vehicles for self-expression. The designs of manuscripts such as the Book of Kells and the Lindisfarne Gospels, for instance, reflected the native artistic styles of Ireland and Northumbria and often depicted the native flora and fauna of those lands as well. Blake also adopted some of the styles and idioms of illustration found in mediaeval illuminated manuscripts, producing images in some cases quite similar to ones found in mediaeval scriptures and bestiaries (Blunt, 1943, p. 199). It seems that he also embraced the idea, embodied in the creation of illuminated manuscripts, that the written word can be something sacred and powerful and that it is therefore something to be adorned with gold and lively colours.

Blake’s illuminated books broke with the medium of mediaeval manuscripts mainly by virtue of that which they adopted from the medium of the printed book. Blake produced his illuminated books first by making copper plates engraved with images and text, deepening these engravings with the help of corrosive chemicals. He then used inks to form impressions of the plates on sheets of paper, often colouring the impressed images further with watercolour paints (Blake, 1967, p. 11-12). His use of the copper plates and inks bore similarities to the use of movable type and ink to create printed books. For many years it was believed that, despite this similarity, Blake developed his illuminated books partly as a reaction against the mass production of books, hearkening back to the methods of mediaeval craftsmen – specifically the artists who produced illuminated manuscripts – who created unique items rather than mass produced articles. Consequently, it was believed that after he produced the copper plates for the illuminated books he created only individual books on commission. This belief, first championed by 19th century writers who claimed William Blake as a predecessor (Symmons, 1995), has recently been overturned, however, by the work of Joseph Viscomi. As a scholar and printer who attempted to physically reproduce the methods that Blake employed to create his illuminated books, Viscomi concluded that Blake mass produced these books in small editions of about ten or more books each (Adams, 1995, p. 444).

The primary way in which the illuminated book was meant to improve on the printed book did not lie in the avoidance of mass production, but rather in the relation between the image and the word. In printed books, engraved images could be included with the text, but as the text had to be formed with movable type the image had to be included as something separate and additional (Bolter, 2001, p. 48). In Blake’s illuminated books, in contrast, the written word belonged to the whole image first engraved on the copper plate and then transferred to paper. It participated in the imaginative power of the perceived image, rather than just retaining a purely conceptual meaning. As with the text of mediaeval illuminated manuscripts, the words in Blake’s illuminated books often merge the iconic and the symbolic (Bigwood, 1991). For example, in plate 22 of Blake’s The Marriage of Heaven and Hell, the description of the devil’s speech trails off into a tangle of diabolical thorns. Furthermore, the words are produced in the same colours used in the images to which they belong, and partake in their significance—light watercolours being used in the first edition of the joyous Songs of Innocence and dark reticulated inks being used in the gloomier Songs of Experience (Fuller, 2003, p. 263). As John Ruskin later observed, this ability to use colour in the text of illuminated books made it a form of writing that uniquely expressed its creator’s imagination (Ruskin, 1888, p. 99).

Like several other artists of his time, Blake was disturbed by the mechanistic and atomistic conception of nature first put forward by the ancient philosopher Democritus and then later revived around the seventeenth and eighteenth centuries by natural philosophers. This was the conception of nature as consisting of atoms in an empty void operating in accordance with mechanistic laws. Blake saw this as connected to the type of rationalism that would impose strict laws of reason on the mind and imprison the divine creative power of the imagination. Like others who opposed the mechanistic and atomistic worldview, Blake was particularly repelled by the mechanistic account of colour offered by Isaac Newton, voicing his objection to “Newton’s particles of light” (Blake, 1988, p. 153). It was thought that such an account treated colour in isolation from the power of the imagination to which it was naturally connected. It was also seen as severing colour from the living spirit of nature—the poet Goethe famously offering a complex alternative theory of colour which saw it as the result of a dynamic interaction of darkness and light.

For Blake, the printing press would at the very least be symbolic of the mechanistic and atomistic view of the world, the words in the printed text no longer partaking in the power of the imagination and the visible image but rather consisting of atoms of movable type and lying separated by voids of empty space. The primacy of the imagination would be better served by the medium of illuminated books, where the image did not only illuminate the conceptual meaning of the word but also subsumed the word and imparted a deeper significance to it. The imagination was of central importance for Blake, who was a professional engraver as well as a poet, and for whom the medium of the image was a more fundamental part of his life and work than the written word (Storch, 1996, p. 458).

The ability to mass produce texts in which the image was primary and the written word secondary would have implications for literacy and education insofar as it could widely disseminate works that encouraged imaginative and perceptual understanding over strictly conceptual thought. While the illuminated book as such never became a widespread medium, some of the principles involved in its remediation of the illuminated manuscript and the printed book survived in the medium of the comic book and the graphic novel, which could also be said to realize some of its implications. These works were also mass produced and also differed from the printed book through the relation between the word and the image. For example, the way in which the symbolic word is made to partake in the imaginative power of the iconic image can be seen in the development of comic books in Britain. Early 20th century British comic books generally consisted of rows of images without words, each image having a block of text below it. When comic books adopted the style that introduced speech bubbles, thought bubbles, and sound effects into the image itself, the words became part of the action.

The illuminated book can also be seen as a precursor of hypertext and its remediation of the printed word, specifically insofar as the image in hypertext is coming to dominate the written word (Bolter, 2001, p. 47). In this regard, hypertext could also be said to be carrying through the implications that illuminated books posed for education and literacy. This is not to say that there are not significant differences between these media, of course. Creators of hypertext may look to the illuminated book for inspiration but leave behind the more laborious aspects of the medium, such as the use of copper plates and corrosive chemicals. This may be seen as both an improvement and a loss. One feature of the illuminated book absent in hypertext is the close connection between the work and the bodily act of creating it. As Carol Bigwood observes (1991, p. 309), reading Blake’s illuminated books is a perceptual experience in which we sense the movements of Blake’s hand and the rigidity of the copper on which the image was first made. So while the illuminated book remediates the printed word it may itself be remediated by hypertext.

References

Adams, H. (1995). Untitled [Review of the book Blake and the idea of the book]. The Journal of Aesthetics and Art Criticism, 53(4), 443-444.

Bigwood, C. (1991). Seeing Blake’s illuminated texts. The Journal of Aesthetics and Art Criticism, 49(4), 307-315.

Blake, W. (1988). Selected writings. London: Penguin.

—–. (1967). Songs of innocence and of experience. Oxford: Oxford University Press. (Original work published 1794).

Blunt, A. (1943). Blake’s pictorial imagination. Journal of the Warburg and Courtauld Institutes, 6, 190-212.

Bolter, J. D. (2001). Writing space: Computers, hypertext, and the remediation of print (2nd ed.). New Jersey: Lawrence Erlbaum Associates.

Fuller, D. (2003). Untitled [Review of the book William Blake. The creation of the songs: From manuscript to illuminated printing]. Review of English Studies, 54(214), 262-264.

Keep, C., McLaughlin, T., & Parmar, R. (1993-2000). John Ruskin, William Morris and the Gothic Revival. The Electronic Labyrinth. Retrieved from http://elab.eserver.org/hfl0236.html

Ruskin, J. (1888). Modern painters (Vol. 3). New York: John Wiley & Sons.

Storch, M. (1996). Untitled [Review of the books Blake and the idea of the book & Blake, ethics, and forgiveness]. Modern Language Review, 91(2), 458-459.

Symmons, S. (1995). Untitled [Review of the book Blake and the idea of the book]. British Journal of Aesthetics, 35(3), 308-309.

Categories: Discussion Reflections Text

Derrida and Writing

In a number of the readings for this course the philosopher Derrida has been mentioned, along with his “graphocentric” view that writing is a more primary type of communication than speech. He is a difficult philosopher to understand, but I’ve studied his thought somewhat in the past and I’d like to try to clarify his ideas about writing as far as I understand them.

The background that Derrida was coming from, and reacting against, was structuralism. According to structuralism, words have their meaning by how they relate to other words in a whole system of language. Proponents of structuralism thus draw a distinction between language (the whole system that gives words their meaning) and speech (the things we actually say). The distinction is discussed by Stephen Fry and Hugh Laurie in this comedy sketch.


A related distinction made by structuralists was that between the signified and the signifier. The signified is the place a word takes in the whole system of language and the signifier is the spoken sound of the word or written mark of the word.

Derrida rejected the idea of a fixed system of language giving meaning to everything written and spoken, and rejected the idea that there is a signified that gives meaning to the signifier. He believed that language should be understood in terms of the signifiers only, which in turn are to be understood as dependent on acts of signifying. These acts of signifying have meaning, he thought, only in relation to all other acts of signifying. With new acts of signifying, these relations could change, and so meanings are never fixed but are open to change, their meaning being constantly “deferred”. His method of “deconstruction” is an attempt to change received meanings and received interpretations, using methods such as reversing the received view about what is important and what is unimportant in a text.

Derrida believed that the notion that speech is primary and writing secondary was based on the mistaken view that, with speech, the meaning of our words is something “present”. According to this view, the person who speaks has mastered the system of language to some extent and is an authority on what he or she means. For instance, when you speak to me I am able to respond to your questions and reply, “No, what I meant was…” The written word, in contrast, is something whose meaning is more elusive, for it depends on what the writer meant when he or she wrote it, and the writer may be absent and might even be dead when we read it.

Although he acknowledged that from a historical point of view speech appeared before writing, Derrida thought that writing revealed the nature of language more fully than speech did, for it reflected the way in which the meanings of what we say are not within our control and are constantly open to revision and reinterpretation.

The clearest introduction to Derrida’s views on writing that I have come across is in Richard Harland’s book Superstructuralism. You can see some of it here.

There's also a movie about Derrida on Google Video, which is not too bad:

http://video.google.com/videoplay?docid=-7347615341871798222
Categories
Commentary 1

Orality and Mythology

In Orality and Literacy, Walter Ong (2002) drew a distinction between cultures characterized by literacy and cultures characterized by "primary orality", the latter consisting of "persons totally unfamiliar with writing" (p. 6). By accepting a form of the Sapir-Whorf hypothesis, the view that a culture's language determines the way in which its members experience the world, Ong also considered these two types of culture to be two types of consciousness, or "modes of thought" (Ibid, p. 6). While Ong attempted to address how literate culture developed from "oral cultures" – i.e. cultures characterized by primary orality (Ibid, p. 31) – the sharp distinction he drew between the two respective types of consciousness involved in these types of culture makes the question of how this development would have been possible particularly troublesome (Dobson, Lamb, & Miller, 2009).

Ong evidently recognized that there can be what might be called "transitional forms" between primary orality and literacy. He noted that oral cultures in the strict sense hardly existed anymore (Ong, 2002, p. 11), suggesting that cultures may be oral to a large degree even when they have been somewhat influenced by literate cultures. Furthermore, he granted that literate cultures may still bear some of the characteristics of the oral cultures from which they developed, possessing what he called "oral residue" (Ibid, p. 40-1). However, given Ong's characterization of the literate and oral modes of thought, it is not clear how it could even be possible for the former to arise out of the latter – although it is clear that they must have done so.

One of the main difficulties lies in Ong's characterization of oral modes of thought as less "abstract" than literate modes. He asserted that all conceptual thought is abstract to some degree, meaning that concepts are capable of referring to many individual objects but are not themselves individual objects (Ibid, p. 49). According to this view, concepts can be abstract to varying degrees depending on how many individual objects they are capable of referring to. The concept "vegetation" is able to refer to all the objects the concept "tree" can and still more, and thus it is a more abstract concept. The oral mode of thought, Ong asserted, utilizes concepts that are less abstract, and this makes it closer to "concrete" individual objects.

This notion of concepts being “abstract” is relatively recent, being developed mainly by the philosopher John Locke (1632-1704). In ancient and mediaeval thought, the distinction between the concept “tree” and this tree or that tree would be described as a distinction between a universal and a particular. Locke’s view that universals are “abstract” ideas was based on the theory that they are formed by the mind’s taking away or “abstracting” that which is common to many particulars (Locke, 1991, p. 147). For example, the concept “red” is formed by noticing many red objects and then “abstracting” the common characteristic of redness from all of the other characteristics the objects possess.

A problem with this theory of abstraction as a general explanation of how concepts are formed was pointed out by Ernst Cassirer (1874-1945). Cassirer noted that the theory first of all claims that it is necessary to possess abstract concepts in order to apprehend the world as consisting of kinds of things, and that without them we would only have what William James – and Ong after him (Ong, 2002, p. 102) – called the “big, blooming, buzzing confusion” of sense perception. The theory also claims that to form an abstract concept in the first place it is necessary to notice a common property shared by a number of particular objects. Yet according to the first claim we couldn’t notice this common property if we didn’t already have an abstract concept. We wouldn’t notice that several objects share the property of redness if we didn’t already have the concept “red” (Cassirer, 1946, p. 24-5).

Cassirer's criticism of abstraction as a theory of concept formation could serve as a particularly valuable corrective to Ong's account of the distinction between orality and literacy. Cassirer himself offered a similar account of two modes of thinking, which he called "mythological" and "discursive". The "mythological" mode resembled Ong's "oral" mode in many ways. Like the oral mode of thought, it was closely linked to the apprehension of objects as they stand in relation to practical activity (Ong, 2002, p. 49; Cassirer, 1946, p. 37-8). Also like the oral mode of thought, it was associated with the notion that words hold magical power, as opposed to the view of words as mere arbitrary signs (Ong, 2002, p. 32-3; Cassirer, 1946, p. 44-5, 61-2).

If Walter Ong's account of orality and literacy could be synthesized with Cassirer's distinction between the mythological and the discursive, it would benefit in that the latter can describe a development from one mode of thought to the other without positing the problematic view that this development involves increasing degrees of abstraction. The development of the mythological mode into the discursive mode is not a move away from a concrete world of perception to an abstract world of conception, but a move from the use of one kind of symbolic form to the use of another. Furthermore, since the mythological mode of thought is already fully symbolic, it is possible to study this mode of thought by studying the symbolism used in mythological cultures. While the stages of development from the mythological to the discursive described by Cassirer (e.g. perceiving objects as possessing "mana", seeing objects as appearances of "momentary gods", polytheistic forms of thinking, and so on) may not be supported by empirical evidence, the kind of analysis offered by his theory of "symbolic forms" makes the type of development in question conceivable and provides us with a program for studying it.

References

Cassirer, E. (1946). Language and Myth. (S.K. Langer, Trans.). New York: Dover. (Original work published 1925).

Dobson, T., Lamb, B., & Miller, J. (2009). Module 2: From Orality to Literacy Critiquing Ong: The Problem with Technological Determinism. Retrieved from https://www.vista.ubc.ca/webct/urw/lc5116011.tp0/cobaltMainFrame.dowebct

Locke, J. (1991). An Essay Concerning Human Understanding. In M. Adler (Ed.), Great Books of the Western World (Vol. 33). Chicago: Encyclopaedia Britannica. (Original work published 1690).

Ong, W. J. (2002). Orality and Literacy. New York: Routledge.

Categories
Technology

Technology as a way of revealing

I noticed that Rich had already posted a passage from Heidegger and “The Question Concerning Technology”, but I would like to discuss another part of it. Early in this essay Heidegger states that “Technology is a way of revealing”. I think that this is important and that the “revealing” Heidegger mentions is closely connected with what he elsewhere calls “regioning”, which he says opens “the clearing of Being”. By this he does not mean a type of conscious thought or unconscious thought but rather something that makes both of these possible to begin with. It is tied up with speaking a language and with dwelling among other people in the world. (He gets rather mystical when he tries to describe it in his later writings.) The view of technology as a way of revealing would suggest that technology is inextricably bound up with the way in which we live, our practices, and our institutions. It would support Neil Postman’s claim that a technology’s function follows from its form and that new technologies threaten institutions. It may be a bit disturbing, though, as we usually like to think of ourselves as rational beings who can represent technology objectively and freely decide how we will use it. As Heidegger himself explains at the end of the essay, though, it is not necessarily a fatalistic picture.

Categories
Text

Text

To elucidate the concept of "text" I decided to upload this clip from the documentary series Testament, presented by historian John Romer. The series, originally aired in 1988, is about the history of the Bible – how it was created and how it has featured in people's lives throughout history. This clip combines a short excerpt from the beginning of episode 3 with one from the end of the same episode. In it, a narrator reads from the rules for how a Jewish temple scroll is to be written. (Notice that it says no part of the scroll is to be written from memory.) John Romer also tells an interesting story that illustrates the idea that a text is authoritative. In an oral tradition there is no authoritative version of a story: a person tells the story from memory and can change it to suit his or her purposes, usually in response to the reactions of the audience. I was reminded of this clip when listening to James O'Donnell's discussion of early Christianity and its dependence on writing.

[Embedded YouTube video]
Categories
Introductions

Thoth

Egypt: Abydos, originally uploaded by Brooklyn Museum.

I've chosen this picture because it contains an image of the Egyptian god Thoth, mentioned in Plato's Phaedrus (although there his name is given as "Theuth"). The ancient Greeks identified this Egyptian god with their own god Hermes; both were associated with writing and with guiding the dead to the underworld. In the case of Hermes, I think this connection between writing and death had to do with his being a god of travelers, as little shrines to Hermes served as markers placed along a path to guide travelers. So, Hermes was a guide for those traveling to the underworld and also a god of writing, written words being regarded as signs and markers along the road. (Just a speculation, though.)

My name is Stuart Edgar and I have just started the MET TBDL program. In addition to taking this course I am also currently taking ETEC 512. I’ve started this course late, just joining this week. For the past eight years I have been teaching philosophy at a university level, at first with the University of North Dakota and the University of Minnesota, and more recently with Athabasca University. I completed my PhD in philosophy from the University of Calgary last year. I’m particularly interested in ETEC 540 as I have always been fascinated by the relation between languages and worldviews.


Creative Commons Attribution 3.0 Unported
This work is licensed under a Creative Commons Attribution 3.0 Unported License.