Article for Handbook of Distance Education

Alex Kuskis and I recently completed the final revisions to this article on “interaction” for the 3rd edition of the Handbook of Distance Education (proof version only). We worked on emphasizing the historical rooting of the concept in cybernetics and the work of its father, Norbert Wiener. Wiener initially conceived of interaction in terms of messages, communication, and control, and like some in distance education, he did not see any qualitative difference between human-human and human-machine interaction:

To me, personally, the fact that the signal in its intermediate stages has gone through a machine rather than through a person is irrelevant and does not in any case greatly change my relation to the signal. Thus the theory of control in engineering, whether human, animal, or mechanical, is a chapter in the theory of messages. (Wiener, 1950, p. 25)

It is largely in this systematic and cybernetic form that the term “interaction” has been integrated into the discourses of distance education, lifelong learning, educational technology, and other educational sub-domains: “Much of learning theory and instructional systems design is founded in or explained by analogous reference to concepts borrowed from General Systems Theory” (Larsen, 1985, p.

When applied to engagement between student and teacher, student and content, and among students, the term interaction clearly has a very broad semantic range. One recent factor that makes this range even broader (in a perhaps slightly worrisome way) is the increasing importance of analytics for education: the (automated) analysis of patterns of student interactions to individually customize their environment and (let’s face it) to monitor and evaluate their work. We make reference to Pariser’s The Filter Bubble to suggest some of the challenges presented by this “interaction as shaped through analytics.” Here’s what we write:

In his book The Filter Bubble (2011), Eli Pariser focuses at length on how services like Facebook, Google, and Yahoo use sophisticated and hidden algorithms to customize content (e.g., feed items, search results, and advertising) according to users’ past behaviors and current inputs. Pariser argues that these mechanisms can lead to a kind of “information determinism,” a situation in which our past queries, selections, and even evasions may “entirely decide” what is made available for selection and interaction in “our future” (p. 90).

Is the future of interaction in educational environments one of computerized surveillance and of customization of engagement according to user metrics? This would be ironic for a project which is supposed to help learners become autonomous and accountable as decision makers and citizens.
