The Changing Spaces of Reading and Writing

Rip Mix Feed – Delphine’s Touch


Hi everyone,

Though it's late, I am finally posting my RipMixFeed.

I am so glad that I can finally see how SlideShare could work in education, especially in my department. Once we get the hang of it, it should be fairly easy to work with.

It's just the story of an eventful year and a half for me. I hope the narration will be audible.

Delphine

December 2, 2009   No Comments

Making [Re]Connections

This is one of the last courses I will be taking in the program and as the journey draws to a close, this course has opened up new perspectives on text and technology. Throughout the term, I have been travelling (more than I expected) and as I juggled my courses with the travels, I began to pay more attention to how text is used in different contexts and cultures. Ong, Bolter and the module readings were great for passing time on my plane rides – I learned quite a lot!

I enjoyed working on the research assignment where I was able to explore the movement from icon to symbol. It gave me a more in-depth look at the significance of visual images, which Bolter discusses along with hypertext. Often, I am more used to working with text in a constrained space but after this assignment, I began thinking more about how text and technologies work in wider, more open spaces. By the final project, I found myself exploring a more open space where I could be creative – a place that is familiar to me yet a place that has much exploration left to it – the Internet.

Some of the projects and topics that were particularly related to this new insight include:

E-Type: The Visual Language of Typography

A Case for Teaching Visual Literacy – Bev Knutson-Shaw

Language as Cultural Identity: Russification of the Central Asian Languages – Svetlana Gibson

Public Literacy: Broadsides, Posters and the Lithographic Process – Noah Burdett

The Influence of Television and Radio on Education – David Berljawsky

Remediation of the Chinese Language – Carmen Chan

Braille – Ashley Jones

Despite the challenges of following the week-to-week discussions from Vista to Wiki to Blog and to the web in general, I was on track most of the time. I will admit I got confused a couple of times and I was more of a passive participant than an active one. Nevertheless, the course was interesting and insightful and it was great learning from many of my peers. Thank you everyone.

December 1, 2009   1 Comment

Hypermedia and Cybernetics: A Phenomenological Study

As with all other technologies, hypermedia technologies are inseparable from what is referred to in phenomenology as “lifeworlds”. The concept of a lifeworld is in part a development of an analysis of existence put forth by Martin Heidegger. Heidegger explains that our everyday experience is one in which we are concerned with the future and in which we encounter objects as parts of an interconnected complex of equipment related to our projects (Heidegger, 1962, p. 91-122). As such, we invariably encounter specific technologies only within a complex of equipment. Giving the example of a bridge, Heidegger notes that “It does not just connect banks that are already there. The banks emerge as banks only as the bridge crosses the stream” (Heidegger, 1993, p. 354). As a consequence of this connection between technologies and lifeworlds, new technologies bring about ecological changes to the lifeworlds, language, and cultural practices with which they are connected (Postman, 1993, p. 18). Hypermedia technologies are no exception.

To examine the kinds of changes brought about by hypermedia technologies it is important to examine the history not only of those technologies themselves but also of the lifeworlds in which they developed. Such a study will reveal that the development of hypermedia technologies involved an unlikely confluence of two subcultures. One of these subcultures belonged to the United States military-industrial-academic complex during World War II and the Cold War, and the other was part of the American counterculture movement of the 1960s.

Many developments in hypermedia can trace their origins back to the work of Norbert Wiener. During World War II, Wiener conducted research for the US military concerning how to aim anti-aircraft guns. The problem was that modern planes moved so fast that it was necessary for anti-aircraft gunners to aim their guns not at where the plane was when they fired the gun but where it would be some time after they fired. Where they needed to aim depended on the speed and course of the plane. In the course of his research into this problem, Wiener decided to treat the gunners and the gun as a single system. This led to his development of a multidisciplinary approach that he called “cybernetics”, which studied self-regulating systems and used the operations of computers as a model for these systems (Turner, 2006, p. 20-21).
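The prediction problem Wiener studied can be pictured with a toy calculation. This is only an illustrative sketch under the simplifying assumption of a constant-velocity target (Wiener's actual work involved statistical filtering of noisy tracking data); all names and numbers here are hypothetical.

```python
# Illustrative sketch of the anti-aircraft "lead" problem: aim not at where
# the target is, but at where it will be when the shell arrives.

def lead_point(position, velocity, time_of_flight):
    """Predict the target's future position by dead reckoning, assuming it
    holds a constant course and speed for the shell's time of flight."""
    x, y = position
    vx, vy = velocity
    return (x + vx * time_of_flight, y + vy * time_of_flight)

# A target at (1000 m, 500 m) moving at 120 m/s along x, with a 3 s shell
# flight time, must be fired on as if it were 360 m further along its course:
aim_at = lead_point((1000.0, 500.0), (120.0, 0.0), 3.0)
print(aim_at)  # (1360.0, 500.0)
```

Treating the gunner, the gun, and this feedback between observation and prediction as one system is what Wiener's cybernetics generalized.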

This approach was first applied to the development of hypermedia in an article written by one of Norbert Wiener’s former colleagues, Vannevar Bush. Bush had been responsible for instigating and running the National Defense Research Committee (which was later subsumed under the Office of Scientific Research and Development), an organization responsible for government funding of military research by private contractors. Following his experiences in military research, Bush wrote an article in the Atlantic Monthly addressing the question of how scientists would be able to cope with growing specialization and how they would collate an overwhelming amount of research (Bush, 1945). Bush imagined a device, which he called the “Memex”, in which information such as books, records, and communications would be stored on microfilm. This information would be capable of being projected on screens, and the person who used the Memex would be able to create a complex system of “trails” connecting different parts of the stored information. By connecting documents into a non-hierarchical system of information, the Memex would to some extent embody the principles of cybernetics first imagined by Wiener.

Inspired by Bush’s idea of the Memex, researcher Douglas Engelbart believed that such a device could be used to augment the use of “symbolic structures” and thereby accurately represent and manipulate “conceptual structures” (Engelbart, 1962). This led him and his team at the Augmentation Research Center (ARC) to develop the “oN-Line System” (NLS), an ancestor of the personal computer which included a screen, QWERTY keyboard, and a mouse. With this system, users could manipulate text and connect elements of text with hyperlinks. While Engelbart envisioned this system as augmenting the intellect of the individual, he conceived of the individual as part of a system, which he referred to as an H-LAM/T system (a trained human with language, artefacts, and methodology) (ibid., p. 11). Drawing upon the ideas of cybernetics, Engelbart saw the NLS itself as a self-regulatory system in which engineers collaborated and, as a consequence, improved the system, a process he called “bootstrapping” (Turner, 2006, p. 108).

The military-industrial-academic complex’s cybernetic research culture also led to the idea of an interconnected network of computers, a move that would be key in the development of the internet and hypermedia. First formulated by J.C.R. Licklider, this idea was later executed by Bob Taylor with the creation of ARPANET (named after the defence department’s Advanced Research Projects Agency). As an extension of systems such as the NLS, such a network was a self-regulating system for collaboration, also inspired by the study of cybernetics.

The late 1960s to the early 1980s saw hypermedia’s development transformed from a project within the US military-industrial-academic complex to a vision animating the American counterculture movement. This may seem remarkable for several reasons. Movements related to the budding counterculture in the early 1960s generally adhered to a view that developments in technology, particularly in computer technology, had a dehumanizing effect and threatened the authentic life of the individual. Such movements were also hostile to the US military-industrial-academic complex that had developed computer technologies, generally opposing American foreign policy and especially American military involvement in Vietnam. Computer technologies were seen as part of the power structure of this complex and were again seen as part of an oppressive dehumanizing force (Turner, 2006, p. 28-29).

This negative view of computer technologies more or less continued to hold in the New Left movements largely centred on the East Coast of the United States. However, a contrasting view began to grow in the counterculture movement developing primarily in the West Coast. Unlike the New Left movement, the counterculture became disaffected with traditional methods of social change, such as staging protests and organizing unions. It was thought that these methods still belonged to the traditional systems of power and, if anything, compounded the problems caused by those systems. To effect real change, it was believed, a shift in consciousness was necessary (Turner, 2006, p. 35-36).

Rather than seeing technologies as necessarily dehumanizing, some in the counterculture took the view that technology would be part of the means by which people liberated themselves from stultifying traditions. One major influence on this view was Marshall McLuhan, who argued that electronic media would become an extension of the human nervous system and would result in a new form of tribal social organization that he called the “global village” (McLuhan, 1962). Another influence, perhaps even stronger, was Buckminster Fuller, who took the cybernetic view of the world as an information system and coupled it with the belief that technology could be used by designers to live a life of authentic self-sufficiency (Turner, 2006, p. 55-58).

In the late 1960s, many in the counterculture movement sought to effect the change in consciousness and social organization that they wished to see by forming communes (Turner, 2006, p. 32). These communes would embody the view that it was not through political protest but through the expansion of consciousness and the use of technologies (such as Buckminster Fuller’s geodesic domes) that a true revolution would be brought about. To supply members of these communes and other wayfarers in the counterculture with the tools they needed to make these changes, Stewart Brand developed the Whole Earth Catalog (WEC). The WEC provided lists of books, mechanical devices, and outdoor gear that were available through mail order for low prices. Subscribers were also encouraged to provide information on other items that would be listed in subsequent editions. The WEC was not a commercial catalogue in that it wasn’t possible to order items from the catalogue itself. It was rather a publication that listed various sources of information and technology from a variety of contributors. As Fred Turner argues (2006, p. 72-73), it was seen as a forum by means of which people from various different communities could collaborate.

Like many others in the counterculture movement, Stewart Brand immersed himself in cybernetics literature. Inspired by the connection he saw between cybernetics and the philosophy of Buckminster Fuller, Brand used the WEC to broker connections between ARC and the then-flourishing counterculture (Turner, 2006, p. 109-10). In 1985, Stewart Brand and former commune member Larry Brilliant took the further step of uniting the two cultures and placed the WEC online in one of the first virtual communities, the Whole Earth ‘Lectronic Link or “WELL”. The WELL included bulletin board forums, email, and web pages and grew from a source of tools for counterculture communes into a forum for discussion and collaboration of any kind. The design of the WELL was based on communal principles and cybernetic theory. It was intended to be a self-regulating, non-hierarchical system for collaboration. As Turner notes (2005), “Like the Catalog, the WELL became a forum within which geographically dispersed individuals could build a sense of nonhierarchical, collaborative community around their interactions” (p. 491).

This confluence of military-industrial-academic complex technologies and the countercultural communities who put those technologies to use would form the roots of other hypermedia technologies. The ferment of the two cultures in Silicon Valley would result in the further development of the internet—the early dependence on text being supplanted by the use of text, image, and sound, transforming hypertext into full hypermedia. The idea of a self-regulating, non-hierarchical network would moreover result in the creation of the collaborative, social-networking technologies commonly denoted as “Web 2.0”.

This brief survey of the history of hypermedia technologies has shown that the lifeworld in which these technologies developed was one first imagined in the field of cybernetics. It is a lifeworld characterized by non-hierarchical, self-regulating systems, by the non-hierarchical organization of individuals, and by the project of collaborating and sharing information. Even though these technologies first developed in the hierarchical system of the military-industrial-academic complex, they grew within a subculture of collaboration among scientists and engineers (Turner, 2006, p. 18). Rather than being strictly regimented, prominent figures in this subculture – including Wiener, Bush, and Engelbart – voiced concern over the possible authoritarian abuse of these technologies (ibid., p. 23-24).

The lifeworld associated with hypermedia is also characterized by the non-hierarchical dissemination of information. Rather than belonging to traditional institutions consisting of authorities who distribute information to others directly, these technologies involve the spread of information across networks. Such information is modified by individuals within the networks through the use of hyperlinks and collaborative software such as wikis.

The structure of hypermedia itself is also arguably non-hierarchical (Bolter, 2001, p. 27-46). Hypertext, and by extension hypermedia, facilitates an organization of information that admits of many different readings. That is, it is possible for the reader to navigate links and follow what Bush called different “trails” of connected information. Printed text generally restricts reading to one trail or at least very few trails, and lends itself to the organization of information in a hierarchical pattern (volumes divided into books, which are divided into chapters, which are divided into paragraphs, et cetera).
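The contrast between one linear reading order and many "trails" can be pictured as paths through a small link graph. The following is a toy sketch (the node names are hypothetical, not from Bush or Bolter) showing how even a six-node hypertext admits several distinct reading paths:

```python
# A minimal hypertext modelled as a directed graph: each node lists the
# nodes it links to. Print fixes one reading order; this structure admits many.
links = {
    "intro":   ["history", "theory"],
    "history": ["wiener", "bush"],
    "theory":  ["bush"],
    "wiener":  [],
    "bush":    ["memex"],
    "memex":   [],
}

def trails(node, path=()):
    """Enumerate every reading path (Bush's 'trails') starting from a node."""
    path = path + (node,)
    if not links[node]:          # a dead end completes one trail
        return [path]
    result = []
    for target in links[node]:
        result.extend(trails(target, path))
    return result

for t in trails("intro"):
    print(" -> ".join(t))
# Three distinct trails emerge from a single starting page.
```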

It is clear that the advent of hypermedia has been accompanied by changes in hierarchical organizations in lifeworlds and practices. One obvious example would be the damage that has been sustained by newspapers and the music industry. The phenomenological view of technologies as connected to lifeworlds and practices provides a more sophisticated account of this change than either the technological determinist view that hypermedia itself has brought about changes in society or the instrumentalist view that the technologies are value-neutral and that these changes have been brought about by choice alone (Chandler, 2002). It would rather suggest that hypermedia is connected to practices that largely preclude both the hierarchical dissemination of information and the institutions that are involved in such dissemination. As such, these technologies cannot but threaten institutions such as the music industry and newspapers. As Postman (1993) observes, “When an old technology is assaulted by a new one, institutions are threatened” (p. 18).

Critics of hypermedia technologies, such as Andrew Keen (2007), have generally focussed on this threat to institutions, arguing that such a threat undermines traditions of rational inquiry and the production of quality media. To some degree such criticisms are an extension of a traditional critique of modernity made by authors such as Alan Bloom (1987) and Christopher Lasch (1979). This would suggest that such criticisms are rooted in more perennial issues concerning the place of tradition, culture, and authority in society, and it is not likely that these issues will subside. However, it is also unlikely that there will be a return to the state of affairs before the inception of hypermedia. Even the most strident critics of “Web 2.0” technologies embrace certain aspects of it.

The lifeworld of hypermedia does not necessarily oppose traditional sources of expertise to the extent that the descendants of the fiercely anti-authoritarian counterculture may suggest, though. Advocates of Web 2.0 technologies often appeal to the “wisdom of crowds”, alluding to the work of James Surowiecki (2005). Surowiecki offers the view that, under certain conditions, the aggregation of the choices of independent individuals results in a better decision than one made by a single expert. He is mainly concerned with economic decisions, offering his theory as a defence of free markets. Yet this theory also suggests a general epistemology, one which would contend that the aggregation of the beliefs of many independent individuals will generally be closer to the truth than the view of a single expert. In this sense, it is an epistemology modelled on the cybernetic view of self-regulating systems. If it is correct, knowledge would be the result of a cybernetic network of individuals rather than a hierarchical system in which knowledge is created by experts and filtered down to others.
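The statistical intuition behind the "wisdom of crowds" can be shown with a toy simulation. This is a hedged sketch of the general idea, not Surowiecki's own model: it assumes the "certain conditions" he stipulates (independent, unbiased guessers), and the numbers are invented for illustration.

```python
# Toy simulation: many independent, unbiased but noisy estimates of a true
# value. Averaging cancels the individual errors, so the crowd's aggregate
# lands much closer to the truth than a typical individual guess.
import random

random.seed(42)
truth = 100.0
# 1,000 independent guessers, each unbiased but noisy (std dev 20):
guesses = [random.gauss(truth, 20.0) for _ in range(1000)]

crowd_estimate = sum(guesses) / len(guesses)
mean_individual_error = sum(abs(g - truth) for g in guesses) / len(guesses)

print(abs(crowd_estimate - truth))   # small: the noise averages out
print(mean_individual_error)         # much larger for a typical individual
```

Note that the advantage disappears when the guesses are correlated or systematically biased, which is why the epistemology is only as good as the independence conditions behind it.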

The main problem with the “wisdom of crowds” epistemology as it stands is that it does not explain the development of knowledge in the sciences and the humanities. Knowledge of this kind doubtless requires collaboration, but in any domain of inquiry this collaboration still requires the individual mastery of methodologies and bodies of knowledge. It is not the result of mere negotiation among people with radically disparate perspectives. These methodologies and bodies of knowledge may change, of course, but a study of the history of sciences and humanities shows that this generally does not occur through the efforts of those who are generally ignorant of those methodologies and bodies of knowledge sharing their opinions and arriving at a consensus.

As a rule, individuals do not take the position of global skeptics, doubting everything that is not self-evident or that does not follow necessarily from what is self-evident. Even if people would like to think that they are skeptics of this sort, to offer reasons for being skeptical about any belief they will need to draw upon a host of other beliefs that they accept as true, and to do so they will tend to rely on sources of information that they consider authoritative (Wittgenstein, 1969). Examples of the “wisdom of crowds” will also be ones in which individuals each draw upon what they consider to be established knowledge, or at least established methods for obtaining knowledge. Consequently, the wisdom of crowds is parasitic upon other forms of wisdom.

Hypermedia technologies and the practices and lifeworld to which they belong do not necessarily commit us to the crude epistemology based on the “wisdom of crowds”. The culture of collaboration among scientists that first characterized the development of these technologies did not preclude the importance of individual expertise. Nor did it oppose all notions of hierarchy. For example, Engelbart (1962) imagined the H-LAM/T system as one in which there are hierarchies of processes, with higher executive processes governing lower ones.

The lifeworlds and practices associated with hypermedia will evidently continue to pose a challenge to traditional sources of knowledge. Educational institutions have remained somewhat unaffected by the hardships faced by the music industry and newspapers due to their connection with other institutions and practices such as accreditation. If this phenomenological study is correct, however, it is difficult to believe that they will remain unaffected as these technologies take deeper root in our lifeworld and our cultural practices. There will continue to be a need for expertise, though, and the challenge will be to develop methods for recognizing expertise, both in the sense of providing standards for accrediting experts and in the sense of providing remuneration for expertise. As this concerns the structure of lifeworlds and practices themselves, it will require a further examination of those lifeworlds and practices and an investigation of ideas and values surrounding the nature of authority and of expertise.

References

Bloom, A. (1987). The closing of the American mind. New York: Simon & Schuster.

Bolter, J. D. (2001). Writing space: Computers, hypertext, and the remediation of print (2nd ed.). New Jersey: Lawrence Erlbaum Associates.

Bush, V. (1945). As we may think. Atlantic Monthly. Retrieved from http://www.theatlantic.com/doc/194507/bush

Chandler, D. (2002). Technological or media determinism. Retrieved from http://www.aber.ac.uk/media/Documents/tecdet/tecdet.html

Engelbart, D. (1962). Augmenting human intellect: A conceptual framework. Menlo Park: Stanford Research Institute.

Heidegger, M. (1962). Being and time. (J. Macquarrie & E. Robinson, Trans.). San Francisco: Harper Collins.

—–. (1993). Basic writings. (D.F. Krell, Ed.). San Francisco: Harper Collins.

Keen, A. (2007). The cult of the amateur: How today’s internet is killing our culture. New York: Doubleday.

Lasch, C. (1979). The culture of narcissism: American life in an age of diminishing expectations. New York: W.W. Norton & Company.

McLuhan, M. (1962). The Gutenberg galaxy. Toronto: University of Toronto Press.

Postman, N. (1993). Technopoly: The surrender of culture to technology. New York: Vintage.

Surowiecki, J. (2005). The wisdom of crowds. Toronto: Anchor.

Turner, F. (2006). From counterculture to cyberculture: Stewart Brand, the Whole Earth Network, and the rise of digital utopianism. Chicago: University of Chicago Press.

—–. (2005). Where the counterculture met the new economy: The WELL and the origins of virtual community. Technology and Culture, 46(3), 485–512.

Wittgenstein, L. (1969). On certainty. New York: Harper.

November 29, 2009   No Comments

Commentary 3: Web 2.0 and Emergent Multiliteracy Skills


Erin Gillespie

ETEC 540

November 27, 2009

Recently, I attended a literacy skills planning meeting. Various curricular text types were explained, including procedural text, expositional text, descriptive text, persuasive text and narrative text. The literacy committee recommended that students and teachers use the Internet to increase exposure to the curricular text types and the skills inherent in each. Several suggestions involved Web 2.0 applications, such as blogging, collaborative document creation and wiki editing. Interestingly, the committee did not consider Web 2.0 itself as embodying an emergent form of multiliteracy. In digitally advanced nations, members of society read and write in Web 2.0. Why was this form of emergent multiliteracy overlooked by the curriculum designers when we all share the vision of preparing students for the future with “real world” skills? Web 2.0’s emergent multiliteracies are meaningful and deserve a place in curriculum design.

In the article Web 2.0: A New Wave of Innovation for Teaching and Learning?, Bryan Alexander (2006) describes Web 2.0 applications and common practices within the concept. The major qualities of Web 2.0 are considered by Alexander (2006) to be content blocks called “microcontent”, openness, folksonomic metadata and social software. Alexander (2006) describes how services like social bookmarking, blogging and RSS feeds reflect the qualities of Web 2.0. He concludes that Web 2.0’s services, which are emergent and therefore risky, may not be highly regarded in the field of education. In 2008, Alexander extended his argument with Web 2.0 and Emergent Multiliteracies. Concerning the status of Web 2.0 in the field of education, Alexander (2008) offers a more optimistic opinion. He describes the “archival instinct” of the Web and states that many pedagogical possibilities of Web 2.0 are being explored by teachers and students (Alexander, 2008). The implication that Web 2.0’s emergent multiliteracies are increasingly valued in education further strengthens the argument that this genre requires serious consideration by teachers and curriculum design teams.

Alexander (2008) describes Web 2.0 as being composed of social connection, microcontent, social filtering and openness, similar to his theory in 2006. Instructors must understand these qualities to identify pedagogical possibilities of emergent multiliteracies. Social connection is fostered by Web applications that literally connect people based on the variables of interest or personality (Alexander, 2008). Alexander (2008) lists a number of examples, such as blogs, FaceBook and Flickr, to clarify. Microcontent is considered by Alexander (2008) to be small in size and to require a short investment in learning time. Alexander’s (2008) implication is clear in that microcontent makes Web authoring, and publishing, accessible and realistic in terms of time investment for teacher and student. Social filtering is the process of relating information between primary and secondary sources of Web content. Alexander (2008) considers it “the wisdom of the crowd”, and social filtering is evident in folksonomies created through tagging. Finally, Alexander’s (2006, 2008) fourth quality of “openness” for Web 2.0 content refers to any content posted on the Web for a global audience to see and use. Considering the literary text types I must teach this year, none seem as dynamic and exciting as the genre of emergent multiliteracies of Web 2.0.
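The mechanics of a folksonomy are simple enough to sketch in a few lines. This is only an illustrative toy (the users, resources, and tags are hypothetical, not from Alexander): independent users each tag shared resources, and aggregating the tags yields an emergent, bottom-up classification rather than a top-down taxonomy.

```python
# A minimal folksonomy: count how often each tag is applied to each resource.
from collections import Counter

# (user, resource, tag) triples, as a tagging service might record them:
taggings = [
    ("alice", "essay.html", "hypertext"),
    ("bob",   "essay.html", "hypertext"),
    ("carol", "essay.html", "web2.0"),
    ("bob",   "photo.jpg",  "travel"),
    ("carol", "photo.jpg",  "travel"),
]

def folksonomy(taggings):
    """Aggregate individual tagging acts into per-resource tag counts."""
    counts = {}
    for user, resource, tag in taggings:
        counts.setdefault(resource, Counter())[tag] += 1
    return counts

tags = folksonomy(taggings)
# The crowd's most common tag becomes the resource's de facto category:
print(tags["essay.html"].most_common(1))  # [('hypertext', 2)]
```

No single authority assigns the categories; the classification simply falls out of many small, independent acts of tagging, which is what makes it an instance of social filtering.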

To consolidate my teaching style with Web 2.0 multiliteracy, I must always keep in mind pedagogical possibilities related to Alexander’s (2008) four qualities: microcontent, openness, social filtering and social connection. On a practical level, how could a teacher reach the professional satisfaction of exposing students to this emergent text type in a meaningful way? One popular technique when teaching text types is to take advantage of the traditional method of storytelling. I have taught narrative text and descriptive text through storytelling.  A challenging thought is how to use storytelling to teach emergent Web 2.0 multiliteracies! However, the emergent genre of Web 2.0 storytelling, as described by Alexander and Levine (2008), supports Alexander’s (2006, 2008) theory of Web 2.0 multiliteracies and pedagogical needs.  

Alexander and Levine (2008) argue that Web 2.0 has changed the genre of digital storytelling by blending digital storytelling with Alexander’s (2006, 2008) Web 2.0 qualities. Expensive desktop publishing programs are being replaced by free Web 2.0 tools, effectively shifting the pedagogical focus from mastering a tool to telling a story with a tool (Alexander & Levine, 2008). Web 2.0 digital storytelling is considered to be fiction or non-fiction with possibly blurred boundaries, and is broad in scope (Alexander & Levine, 2008). The most significant difference between digital storytelling and Web 2.0 storytelling is the singular, linear flow of the former and the multidirectional flow of the latter (Alexander & Levine, 2008). With the Web 2.0 qualities of social connectedness and openness, stories can go in virtually any direction, well beyond a linear form. Curriculum designers must recognize and design for these emergent qualities before advising teachers to use Web 2.0 tools to support literacy skills.

The emergent multiliteracies of Web 2.0 involve meaningful literacy skills which should be included in curricular design. For example, Alexander and Levine (2008) note that with Web 2.0 storytelling, content redesign is out of the hands of the primary creator. This implies that one challenge will be teaching students about the consequences of openness and social filtering. In other words, an emerging skill embedded in Web 2.0 multiliteracy is Web 2.0 content analysis. The proposition by Alexander and Levine (2008) that there is a Web 2.0 storytelling genre exemplifies the need for continued research and increased pedagogical recognition concerning emergent multiliteracies. The revelation that each emergent Web 2.0 literacy genre may have its own set of multiliteracy skills should make every curriculum designer and practitioner in digitally advanced educational environments sit up and take notice.

References

Alexander, B. (2006). Web 2.0: A new wave of innovation for teaching and learning? EDUCAUSE Review, 41(2), 32-44. Retrieved from http://net.educause.edu/ir/library/pdf/ERM0621.pdf

Alexander, B. (2008). Web 2.0 and emergent multiliteracies. Theory into Practice, 47(2), 150-160. doi: 10.1080/00405840801992371

Alexander, B., & Levine, A. (2008). Web 2.0 storytelling: Emergence of a new genre. EDUCAUSE Review, 43(6), 40-56. Retrieved from http://www.educause.edu/EDUCAUSE+Review/EDUCAUSEReviewMagazineVolume43/Web20StorytellingEmergenceofaN/163262

November 27, 2009   3 Comments

Oldest Bible now in digital form…

In the December 2009 edition of National Geographic, I came across an article by A.R. Williams detailing how the oldest known New Testament is now available online at http://codexsinaiticus.org/en/ . According to Williams, the virtual version lets you see additions that were made and corrections that were overwritten. I tried it out and it is truly realistic. It took scholars and institutions, including the British Library, over four years to digitize! Amazing. The tools at http://codexsinaiticus.org/en/manuscript.aspx really give you the feeling that you are flipping through the ancient codex.

November 27, 2009   No Comments

History of Social Technologies

Given that we are adding our favourite web 2.0/social media experiences, I thought that I’d provide a very brief history of social media:

Social technologies have become a staple part of today’s digital world. Millions of people make social connections online through various websites like Facebook, Twitter and Flickr, to name just a few. Such sites have seen incredible growth in popularity over the last few years, but when did this trend start? What follows is a brief history of social networking on the Internet.

1980s:

CompuServe, which had been around since the 1970s, evolved into a network that allowed members to share files. Discussion forums began to emerge as a result.

Bulletin board systems (BBSs) allowed users to communicate using a modem over telephone lines. Long-distance charges applied, so many bulletin boards were strictly local.

1990s:

AOL (America Online) was in its heyday with member-created communities and searchable profiles.

Classmates.com became very popular as people tried to use the Internet to reconnect.

2000s:

Friendster is launched. This site allowed members to see the connections they knew they had and to discover connections they did not realize they shared with others.

LinkedIn, a networking resource aimed mostly at professionals and businesspeople, is created.

MySpace becomes a huge hit mostly in the US. Its key demographic is the under 30 crowd.

Facebook quickly grows into the world’s largest online social networking site.

Twitter is launched and catches on quickly for those who can’t seem to get enough minutiae. Where and what social networks will evolve into is anybody’s guess.

November 25, 2009   2 Comments

The Age of Real-Time

I had the opportunity to go to the Annual Conference on Distance Teaching and Learning in Madison, Wisconsin this past August. The last keynote speaker, Teemu Arina, discussed how culture and education are changing with emerging technologies. His presentation illustrated how we are moving from linear and sequential environments to those that are nonlinear and serendipitous. Topics of time, space and social media tie into Teemu’s presentation. The video of the presentation is about 45 minutes long, but the themes tie nicely into our course and into many other courses within the MET program.

In the Age of Real-Time: The Complex, Social, and Serendipitous Learning Offered via the Web

November 24, 2009   No Comments

Rip.Mix.Feed Photopeach

Hi everyone,

For my rip.mix.feed assignment, I decided not to re-invent the wheel, but instead to add to an already existing wheel. When I took ETEC565, we were asked to produce a similar project while exploring different web 2.0 tools. We were directed to The Fifty Tools, and I used PhotoPeach to create my story. My wife and I moved to Beijing in the fall of 2007, and we’ve been traveling around Asia whenever we get a break from teaching. The story I’ve made is a very brief synopsis of some of our travels thus far. Since the original posting, I have updated the movie with more travels. You can view the story here. If you’re in China, the soundtrack, U2’s “Where the Streets Have No Name,” will not play because it is hosted on YouTube.

What I enjoy most about these tools is that they are all available online; all a student needs to create a photo story is a computer with Internet access. To make the stories more personal, it would be great if they had access to their own digital pictures. However, if they have no pictures of their own, they can include images found through Internet searches filtered for Creative Commons licenses.

Furthermore, as I teach in an international school where most students speak English as a second, third, or fourth language and come from many different countries, Web 2.0’s “lowered barrier to entry may influence a variety of cultural forms with powerful implications for education, from storytelling to classroom teaching to individual learning” (Alexander, 2006). Creating digital stories about their own culture provides a medium through which English language learners acquire foundational literacies while making sense “of their lives as inclusive of intersecting cultural identities and literacies” (Skinner & Hagood, 2008, p. 29). With their work organized, students can then present it to their classmates for discussion and feedback, build a digital library of age- and content-appropriate material, and share their stories with global communities (Skinner & Hagood, 2008).

John

References

Alexander, Bryan. (2006). “Web 2.0: A New Wave of Innovation for Teaching and Learning?” EDUCAUSE Review, 41(2).

Skinner, Emily N. & Hagood, Margaret C. (2008). “Developing Literate Identities With English Language Learners Through Digital Storytelling.” The Reading Matrix, 8(2), 12 – 38.

November 22, 2009   2 Comments

Web 2.0 Toolbox

For my Rip.Mix.Feed activity, I have compiled a list of bookmarks in del.icio.us of most of the online tools and resources we have come across in ETEC540. Some I have used before, but most are new to me. I have added tags and brief descriptions of what the tools are for. What this lacks in creativity, it makes up for in usefulness.

Click here to open my Web 2.0 Toolbox.

November 20, 2009   2 Comments

Remix Culture: Fair Use is Your Friend

As many of us have made digital materials, including video, for MET courses, this video created by the American University Center for Social Media may be of interest. It describes fair use in the context of creating online videos and offers some best practices that educators can apply to their own work. Though fair use is a doctrine of American copyright law, the video offers guidelines that we can take away and apply to our own contexts in the absence of any other documentation.

For the American University Center for Social Media’s full report see: Code of Best Practices in Fair Use for Online Video

View Remix Culture: Fair Use is Your Friend Video Here

November 14, 2009   No Comments

RipMixFeed using del.icio.us

For the RipMixFeed activity, I collected a set of resources using the social bookmarking tool del.icio.us. Many of us have already used this application in other courses to create a class repository of resources or to keep track of links relevant to our research projects. What I like about this tool is that users can collect all of their favourite links, annotate them, and then easily search them according to the tag words they created. This truly goes beyond the limitations of web browser bookmarks.

For this activity I focused on finding resources specifically related to digital and visual literacy and multiliteracies. To do this, I conducted web searches as well as searches of other del.icio.us users’ links. As there are so many resources – too many for me to adequately peruse – I have subscribed to the tag ‘digitalliteracy’ in del.icio.us so that I can connect with others tagging related information. You can find my del.icio.us page at: http://delicious.com/nattyg

Use the tags ‘Module4’ and ‘ETEC540’ to find the selected links, or just search ‘ETEC540’ to find all of my links related to this course.
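To illustrate why tag search goes beyond a browser’s bookmark folders, here is a minimal sketch of tag-based filtering in Python. The bookmark entries and URLs below are made-up placeholders, not my actual del.icio.us data; only the tag names come from the post.

```python
# Illustrative sketch of tag-based filtering, the idea behind
# narrowing del.icio.us links by one or more tags.
# The bookmark entries below are hypothetical examples.

bookmarks = [
    {"url": "http://example.com/barthes", "tags": {"ETEC540", "Module4", "semiotics"}},
    {"url": "http://example.com/rheingold", "tags": {"ETEC540", "Module4", "digitalliteracy"}},
    {"url": "http://example.com/bolter", "tags": {"ETEC540", "hypertext"}},
]

def find_by_tags(items, *tags):
    """Return every bookmark that carries all of the requested tags."""
    wanted = set(tags)
    return [b for b in items if wanted <= b["tags"]]

# 'Module4' + 'ETEC540' narrows the list; 'ETEC540' alone returns everything,
# since a bookmark can carry many tags at once (unlike a single folder).
module4_links = find_by_tags(bookmarks, "Module4", "ETEC540")
course_links = find_by_tags(bookmarks, "ETEC540")
```

Because each bookmark can carry any number of tags, the same link shows up under every tag it has – something a folder hierarchy cannot do.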

A couple of resources that I want to highlight are:

  1. Roland Barthes: Understanding Text (Learning Object)
    Essentially, this is a self-directed learning module on Roland Barthes’s ideas on semiotics. The section on Readerly and Writerly Texts is particularly relevant to our discussions on printed and electronic texts.

  2. Howard Rheingold on Digital Literacies
    Rheingold states that a lot of people are not aware of what digital literacy is. He briefly discusses five different literacies needed today. Many of these skills are not taught in schools, so he poses the question: how do we teach them?

  3. New Literacy: Document Design & Visual Literacy for the Digital Age Videos
    University of Maryland University College faculty member David Taylor created a five-part video series on digital literacy. For convenience’s sake, here is Part II, where he discusses the shift to the ‘new literacy’. Toward the end of the video, Taylor (2008) makes an interesting statement that “today’s literacy means being capable of producing fewer words, not more”. This made me think of Bolter’s (2001) notion of the “breakout of the visual” and the shift from textual to visual ways of knowing.

Alexander (2006) suggests that social bookmarking can work to support “collaborative information discovery” (p. 36). I have no one in my network as of yet. I think it would be valuable to connect with some of my MET colleagues, so if you would like to share del.icio.us links, let’s connect! My username is nattyg.

References

Alexander, B. (2006). Web 2.0: A new wave of innovation for teaching and learning? Educause Review, 41(2), 33-44.

Bolter, J.D. (2001). Writing space: Computers, hypertext and the remediation of print. London: Lawrence Erlbaum Associates, Publishers.

Taylor, D. (2008). The new literacy: document design and visual literacy for the digital age: Part II. Retrieved November 13, 2009, from http://www.youtube.com/watch?v=RmEoRislkFc

November 14, 2009   2 Comments