Category Archives: C. Embodied Learning

Embodied learning in maths and sciences

This week, the two most interesting articles I read were Mirror Worlds and Deepening Students' Scientific Inquiry Skills During a Science Museum Field Trip. I found some elements of the field-trip article problematic because it takes for granted that teachers are too overworked to fully plan a museum visit that meets their curricular outcomes; even so, its focus on developing inquiry while on site was interesting. In my personal experience, field trips are carefully planned, with pre-visits to the site. The site is then used to answer questions asked beforehand or to spark inquiry that will be brought back into the classroom for later learning. In this article, Gutwill and Allen focus on teaching students skills for inquiry through a shift from scientific knowing to scientific questioning.

Gautam et al., on the other hand, focus on "mirror worlds," in which the entire field trip is virtual and learners meet and collaborate with others in an online environment without ever leaving the classroom. Immersive education provides learners with the feeling of "being there" even when physical presence is not possible (Gautam et al., 2018), and the environment offers a digital representation of real-world objects. This version of embodied instruction provides an exciting possibility because, as Gautam et al. describe, it allows users in remote locations, represented as avatars, to collaborate in a common environment. It opens up possibilities for rich socio-cognitive learning among peers without the expense and hassle of traveling to an onsite learning environment.

For me, the learning this week was in separating embodied learning from situated learning. Situated cognition, Gautam et al. write, is best achieved when knowledge is situated in authentic contexts; the focus is on developing learning that is useful, not learning for the sake of learning. For the learning to be effective in this instance, the learner must feel present in the environment in order to construct their understanding. For this to happen there must be immediacy (synchronous interactions) and intimacy (the ability to interact with others via proximity, eye contact, etc.). This leads me to question the difference in cognitive benefits between learning onsite and learning in virtual environments. I would welcome your thoughts and further readings.

  1. What do you think is the impact of "virtual field trips" for students who may already be disconnected from their immediate environment? If students can have in-situ experiences in exotic places but are not familiar with what is around them, how does this affect their understanding?
  2. I had a conversation this week with a colleague regarding a project for the upcoming year that I would distinctly categorize as inquiry, due to its open-ended nature and the fact that we are beginning with a question. The colleague, however, cautioned that we should not call it inquiry when introducing it to the staff because inquiry had such negative connotations, which shocked me a little. In your context, how is inquiry used to help your learners understand learning contexts on both virtual and lived field trips?
  3. How might you tweak a lesson you have recently taught in maths or sciences to integrate embodied learning?

 

Gautam, A., Williams, D., Terry, K., Robinson, K., & Newbill, P. (2018). Mirror worlds: Examining the affordances of a next generation immersive learning environment. New York: Springer.

Gutwill, J. P., & Allen, S. (2011). Deepening students’ scientific inquiry skills during a science museum field trip. Journal of the Learning Sciences, 21(1), 130-181.

Embodied Learning and Umwelt

Embodied learning is an interesting concept and one that I hadn't previously encountered. William Winn (2003) describes embodied learning as the body, brain, and environment acting as one; these facets are interdependent and cannot be separated from one another. Winn further describes learning as a physical act that involves the whole body. As Winn states, "Learning is no longer confined to what goes on in the brain" (2003, p. 22). While exploring Winn's work, as well as delving into the depths of this module's readings, a few key points stood out to me: umwelt, collaboration, and physical movement in the form of gestures.

Winn's article introduced me to the concept of umwelt, a term that originated in Germany in 1934, which Winn describes as "used to refer to the environment as seen and understood, idiosyncratically, by different individuals" (2003, p. 12). Each person's umwelt is different from everyone else's. Further, umwelts are ever changing. This is something, particularly after taking a few MET courses, that I think about often in my teaching. Students all come to my class with different knowledge systems and experiences. Winn (2003) argues that these are all connected. What I hadn't fully thought about before was that we as teachers can never fully understand a student's umwelt and how they will react to a certain situation. This made me cautious not to underestimate the impact of the environment on students' knowledge.

Collaboration is a topic that comes up over and over again, though I had not previously thought about it in relation to physical movement and the connection to the brain. Roschelle et al. (2010) argue that tasks "…should be designed so that individual contributions are needed for group success" (p. 405), the "we sink or swim together" mentality. They believe that this collaborative learning will become more structured and easier for teachers to deliver as technological practice becomes more embedded.

Novack et al.'s (2014) work on using hand gestures to teach mathematics was intriguing. This article was of high interest to me as I often use physical manipulatives to teach math. The authors' research found that action (acting directly on physical objects) produced a relatively shallow understanding of a novel math concept, while gesturing (representational hand movements) developed deeper and more flexible learning that also transferred better to other tasks. This made me think about my own teaching of math; when I taught Kindergarten I think I did much more gesturing, whereas in grade 6 I do a lot less. This is certainly an issue that I will explore further. (Full disclosure, I still use the ol' hand trick for my nine times tables!)
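For anyone unfamiliar with the nine-times hand trick, here is a minimal sketch of the arithmetic behind it (my own illustration, not from the readings): hold up ten fingers, fold down the nth one, and the fingers to the left and right give the tens and ones of 9 × n.

```python
def nine_times(n):
    """Finger trick for 9 x n (n from 1 to 10): fold down the nth finger.
    Fingers left of the fold are the tens, fingers to the right are the ones."""
    assert 1 <= n <= 10
    tens = n - 1        # fingers to the left of the folded finger
    ones = 10 - n       # fingers to the right of the folded finger
    return tens * 10 + ones

# The trick works because 10(n - 1) + (10 - n) = 9n.
for n in range(1, 11):
    assert nine_times(n) == 9 * n
print([nine_times(n) for n in range(1, 11)])
```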

I enjoyed the readings this week as they introduced me to some new concepts but also helped me increase my understanding of topics that I was already familiar with. My new goals, which have grown out of several readings combined, are to pair collaboration with physical movement, as well as to gesture away while teaching math!

Questions:

  1. Can anyone provide any examples of how they have included technology to harness the benefits of collaborative learning? How do you hand out specific roles or tasks to ensure that each group member contributes?
  2. Do you use gesturing when teaching math? Can you provide a specific example?

 

References

Novack, M. A., Congdon, E. L., Hemani-Lopez, N., & Goldin-Meadow, S. (2014). From action to abstraction: Using the hands to learn math. Psychological Science, 25(4), 903-910.

Roschelle, J., Rafanan, K., Bhanot, R., Estrella, G., Penuel, B., Nussbaum, M., & Claro, S. (2010). Scaffolding group explanation and feedback with handheld technology: Impact on students’ mathematics learning. Educational Technology Research and Development, 58(4), 399-419.

Winn, W. (2003). Learning in artificial environments: Embodiment, embeddedness, and dynamic adaptation. Technology, Instruction, Cognition and Learning, 1(1), 87-114.

Embodied Learning with Mobile 3D/AR/VR

As I have mentioned in the past, my primary role is to work with all K-12 teachers to incorporate technology into the curriculum. This year, my middle and senior school teachers have been incorporating 3D design and 3D printing into problem-solving and design activities for various courses, including math, physics, and environmental science. As stated by Zydney and Warner (2016), "additional research is also needed in order to determine how mobile apps can serve as problem-solving tools through the scientific process in addition to scaffolds or supports" (p. 14). One type of mobile app that I have yet to explore is mobile-based 3D scanners. These apps take a series of 2D pictures to create a 3D model; some utilize augmented reality (AR) and virtual reality (VR) to aid in target acquisition. Winn (2003) argues that "exposure to an environment can lead to physical changes in the brain, resulting in heightened perceptual sensitivity, which leads a person to actually see things differently in the environment" (p. 18). This type of technology could help students investigate and manipulate physical objects, and develop structural modifications and solutions.

An example of this type of technology is Qlone, which is available for free on the Apple App Store. This application requires the user to place the target object on a template that can be printed at any size.

Here is a video that demonstrates the Qlone application: https://www.youtube.com/watch?time_continue=49&v=BkOxvT_esQo

The power of this type of mobile application will certainly grow once these apps develop the ability to quickly capture 3D objects and landscapes within the natural environment. This leads to my questions for you:

  1. Some obvious applications for 3D scanning would include measuring the surface area of actual objects (math) and analyzing the external structures of insects (biology); a rough sketch of the surface-area idea follows the questions below. Do you see any applications for 3D scanning in your classroom?
  2. Dunleavy, Dede, and Mitchell (2009) mention that students may not be accustomed to AR, which requires a large investment in modeling, facilitation, and scaffolding to be built into the virtual learning environments. Do you think that it is worth the time/money to develop and implement these artificial environments, or is that time better spent working with AR/VR tools built for real-world applications?
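On the surface-area idea in question 1: most 3D scanning apps can export a triangle mesh (e.g., OBJ or STL), although the exact export options in Qlone are something you would need to verify. Given such a mesh, the math is just summing triangle areas; here is a minimal sketch of my own, not from the readings:

```python
import numpy as np

def mesh_surface_area(vertices, faces):
    """Sum the areas of a triangle mesh.
    vertices: (V, 3) array of xyz points from the scan.
    faces: (F, 3) array of vertex indices, one row per triangle."""
    v = np.asarray(vertices, dtype=float)
    f = np.asarray(faces, dtype=int)
    a = v[f[:, 1]] - v[f[:, 0]]
    b = v[f[:, 2]] - v[f[:, 0]]
    # each triangle's area is half the magnitude of the cross product
    return 0.5 * np.linalg.norm(np.cross(a, b), axis=1).sum()

# toy check: one right triangle with legs 3 and 4 has area 6
print(mesh_surface_area([(0, 0, 0), (3, 0, 0), (0, 4, 0)], [(0, 1, 2)]))  # 6.0
```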

Dunleavy, M., Dede, C., & Mitchell, R. (2009). Affordances and limitations of immersive participatory augmented reality simulations for teaching and learning. Journal of Science Education and Technology, 18(1), 7-22.

Winn, W. (2003). Learning in artificial environments: Embodiment, embeddedness and dynamic adaptation. Technology, Instruction, Cognition and Learning, 1(1), 87-114.

Zydney, J. M., & Warner, Z. (2016). Mobile apps for science learning: Review of research. Computers & Education, 94, 1-17.

Lightbot – Implications of Embodiment in Coding

“Learning is considered to arise from the reciprocal interaction between external, embodied, activity and internal, cerebral, activity, the whole being embedded in the environment in which it occurs.” (Winn, 2003, p.22)

The premise of embodiment relies on constructivist ideas of learning. Students learn to use their bodies to process and demonstrate conceptual understanding. The research on embodiment also resonates with other notable theories, such as social-communicative learning, in which the context where the learning occurs serves as a decisive factor for concept attainment.

Let's explore the ways in which the use of embodiment principles can support the learning of logic and coding in the application Lightbot.

Coding: Lightbot

  • Makes learning concepts concrete, tangible and accessible

Some theorists believe that "the body is a public resource for thinking, learning, and joint activity" (Stevens, 2012, p. 338). More specifically, like manipulatives, the body acts as a medium for processing information; Stevens (2012) points to the "bodily basis of the conceptual system we use to think mathematically" (p. 342). Adding muscle-memory cues can also enhance the rate of information retrieval.

In the coding application Lightbot, users solve command puzzles to direct a robot to light up a specific tile. Using hand gestures also allows learners to indicate the direction in which the robot will move, supporting the problem-solving process.
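To make the command-sequencing idea concrete, here is a toy grid-world model in the spirit of a Lightbot puzzle (my own sketch; the app's actual command set and engine differ):

```python
# Directions as (row, col) steps: north, east, south, west.
DIRS = [(-1, 0), (0, 1), (1, 0), (0, -1)]

def run(program, start=(0, 0), facing=1, targets=frozenset()):
    """Execute a list of commands and report whether all target tiles were lit."""
    pos, lit = start, set()
    for cmd in program:
        if cmd == "FORWARD":
            dr, dc = DIRS[facing]
            pos = (pos[0] + dr, pos[1] + dc)
        elif cmd == "LEFT":
            facing = (facing - 1) % 4
        elif cmd == "RIGHT":
            facing = (facing + 1) % 4
        elif cmd == "LIGHT" and pos in targets:
            lit.add(pos)
    return lit == set(targets), lit

# Puzzle: light the tile two squares east of the start (facing=1 means east).
print(run(["FORWARD", "FORWARD", "LIGHT"], targets={(0, 2)}))  # (True, {(0, 2)})
```

Tracing each FORWARD and LEFT with a hand gesture while reading the program is exactly the kind of embodied rehearsal described above.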

  • Embodiment allows for and utilizes learning from the first person perspective

The idea of embodiment expects learners to internalize and re-represent concepts they have heard or seen. They command their body parts to materialize thoughts and communicate understanding. Moreover, it is possible that learners are mobilizing mirror neurons (i.e., copying the agent and/or the object) to re-represent and internalize explanations. This implies using the body to enable first-hand experience, thereby increasing opportunities for direct experience. Kim, Roth and Thom (2011) allude to the idea of re-representing with the body, likening it to using slow motion to take a closer look at concepts and to reduce misconceptions.

In the case of solving puzzles in Lightbot, learners have to impersonate the robot and problem solve from their avatar's point of view. Thus, the only way to solve the puzzle is to follow the commands through while imagining oneself as the robot.

  • Providing social learning opportunities to make concepts more explicit

In one study, it was found that almost all gestures were used for social interactions. More specifically, "[t]he body participates in abstracting the ideas unfolded in the interaction imaginatively and spontaneously" (Kim, Roth & Thom, 2011, p. 224). Naturally, body parts are accessible tools for social negotiation. When young learners lack the vocabulary to share ideas, body parts serve as convenient tools to express meaning. Embodiment theorists therefore claim that verbalizing alone is insufficient to create learning pathways.

With more challenging Lightbot puzzles, students will have to work together to figure out how to solve them, using gestures and other body movements to share expertise.

Wonderings

How can the use of embodiment reduce misconceptions?

How can VR and AR technology support learning in coding?

How does immersive technology support the concept of embodiment?

 

References

Kim, M., Roth, W. M., & Thom, J. (2011). Children’s gestures and the embodied knowledge of geometry. International Journal of Science and Mathematics Education, 9(1), 207-238. http://ezproxy.library.ubc.ca/login?url=http://dx.doi.org/10.1007/s10763-010-9240-5

Stevens, R. (2012). The missing bodies of mathematical thinking and learning have been found. Journal of the Learning Sciences, 21(2), 337-346. http://ezproxy.library.ubc.ca/login?url=http://dx.doi.org/10.1080/10508406.2011.614326

Winn, W. (2003). Learning in artificial environments: Embodiment, embeddedness, and dynamic adaptation. Technology, Instruction, Cognition and Learning, 1(1), 87-114. Full-text document retrieved on January 17, 2004, from: http://www.hitl.washington.edu/people/tfurness/courses/inde543/READINGS-03/WINN/winnpaper2.pdf

Embodied Learning – Simple Technology and Rich Social Practices

For my topic, I chose to work with probeware on handheld devices. Winn (2003) states his premise that "cognition is embodied in physical activity, that this activity is embedded in a learning environment, and that learning is the result of adaptation". He posits that learning is the reciprocal interaction between the external (embodiment) and the internal (cognition), embedded in the environment. His point here is that "learning is no longer confined to what goes on in the brain". Our gestures, movements, and spatial positioning contribute to our understandings; for example, we remember the movements and gestures of children's songs even as adults. Role plays, too, can be very effective in supporting long-term retention of understandings, as they help us relate to the roles and, through empathy with our given role, see their value and importance.

In a similar vein, Niebert et al. (2012) use the examples of metaphors and analogies to support learning beyond direct experience and cognition. A metaphor or analogy properly used allows students to relate complicated concepts to their everyday life. I see this as a sort of mental role-play: rather than acting it out, our brains visualize how the concept works through understanding the analogy. Niebert et al. write, "it takes more than making a connection to everyday life to communicate science fruitfully. We show that good instructional metaphors and analogies need embodied sources. These embodied sources are everyday experiences conceptualized in, for example, schemata such as containers, paths, balances, and up and down" (2012). They use the theory of experientialism to boldly claim that "thinking about and understanding science without metaphors and analogies is not possible" as support for the need for embodiment when students must relate to concepts that they cannot physically experience.

Zucker et al. (2008) wrote that the use of probeware with PDAs in the classroom resulted in "substantial learning gains" compared to instruction without them. They reported on the TEEMSS II project (Technology Enhanced Elementary and Middle School Science), in which handheld probes were used to collect, share, and analyse data effectively while actively engaging the students. Roschelle (2003) agrees, but also recognizes a number of significant challenges to the effective use of handheld mobile devices in the classroom. He argues that for them to be used effectively, teachers need to grow in their TPCK base. Technology can very easily be ineffective or even disruptive to learning, but it also has great potential. He presents three case studies to demonstrate that "simple, well-honed technology and rich, pedagogically developed social practices" can greatly increase understanding without allowing technology to control the students or driving up significant costs. The three case studies put forward are: classroom response systems, participatory simulations, and collaborative data gathering. All three use a specific, uniform technology to perform a simple, well-defined function that the students can engage and interact with. The teaching and learning are supported by the tech, but occur outside of it through designing experiments, critiquing, analysing results, discussing patterns, and explaining responses. I find this approach very helpful, as the teacher can still direct the learning but in an engaging and effective way.
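As a small illustration of the collaborative data-gathering case (my own sketch, with made-up readings rather than the TEEMSS II software): each group pools its probe readings, and the conversation about patterns happens around the combined data rather than inside the device.

```python
from statistics import mean

# Hypothetical temperature readings (in Celsius) collected by each group's probe
group_readings = {
    "Group A": [18.2, 18.9, 19.4, 20.1],
    "Group B": [18.0, 18.5, 19.0, 19.8],
    "Group C": [18.4, 19.1, 19.9, 20.6],
}

for group, temps in group_readings.items():
    print(f"{group}: mean {mean(temps):.1f} C, rise {temps[-1] - temps[0]:.1f} C")

# Pool every group's data for the whole-class discussion of the pattern.
class_mean = mean(t for temps in group_readings.values() for t in temps)
print(f"Class mean: {class_mean:.1f} C")
```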

  1. Roschelle indicated that one of the biggest challenges to handheld devices and probe ware is the lack of uniformity and compatibility in available technology: devices and apps. I found this to be a problem in my class last year when I experimented with BYOD.  How can we as teachers support effective use of student-owned devices when there is such diversity of incompatible platforms and apps, and without mandating a particular one?
  2. If, as Winn claims, "cognition is embodied in physical activity", in your opinion, does the use of tech and personal devices support or counteract this claim?
  3. The TELEs we looked at previously all involved learning being immersed in a tech environment, but Roschelle and Zucker see the use of tech as being embedded in the social practice of learning. Which of these models is better from a pedagogical viewpoint?

 

  • Niebert, K., Marsch, S., & Treagust, D. F. (2012). Understanding needs embodiment: A theory‐guided reanalysis of the role of metaphors and analogies in understanding science. Science Education, 96(5), 849-877. http://ezproxy.library.ubc.ca/login?url=http://dx.doi.org/10.1002/sce.21026
  • Roschelle, J. (2003). Keynote paper: Unlocking the learning value of wireless mobile devices. Journal of Computer Assisted Learning, 19(3), 260-272. 10.1046/j.0266-4909.2003.00028.x
  • Winn, W. (2003). Learning in artificial environments: Embodiment, embeddedness, and dynamic adaptation. Technology, Instruction, Cognition and Learning, 1(1), 87-114. Full-text document retrieved on January 17, 2013, from: http://www.hitl.washington.edu/people/tfurness/courses/inde543/READINGS-03/WINN/winnpaper2.pdf
  • Zucker, A., Tinker, R., Staudt, C., Mansfield, A., & Metcalf, S. (2008). Learning science in grades 3-8 using probeware and computers: Findings from the TEEMSS II Project. Journal of Science Education and Technology, 17, 42-48. http://ezproxy.library.ubc.ca/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=ehh&AN=28816389&site=ehost-live

Embodied Learning and Assessment

In the Winn article, it is explained that embodied learning happens when the learner is able to use all of their body to interact with the environment. "Learning is no longer confined to what goes on in the brain… sometimes the coupling between a person and the environment is so tight that it is more convenient to think of person and environment as one evolving system rather than two interacting ones" (Winn, 2003, p. 22). Kinesthetic learning allows the learner to move around, rather than passively listen to lectures or read from textbooks. This creates an environment where student learning is constantly changing and evolving. Students should be provided with opportunities to actively participate in projects and problems while collaborating and communicating with their peers. Knowledge is not constructed in isolation or in a vacuum, but rather through critical thinking with groups of people (Winn, 2003, p. 4). These 21st century skills are what students will need as they enter the workforce, and they allow students to become lifelong learners. Since technology is evolving at a rapid pace, it is even more important for research to keep up and stay current (Winn, 2003, p. 22).

According to the Winn (2003) article, “[l]earning is best explained in terms of the student’s evolving, contextualized understanding and is valued on that criterion, rather than on the basis of traditional objective assessments” (p. 3). Assessment is a topic that I have spent a lot of time thinking about lately and I’ve decided to include some of that in this week’s post. It has been years since I’ve taught grade 7 and I had completely forgotten about the standardized assessment (FSA) that BC teachers are required to administer to their grade 4 and 7 students. I disagree with this assessment for many, many reasons (I won’t get into all of them).

How can an assessment that is administered to all of these students (regardless of socioeconomic status, ethnicity, etc.) give us a clear understanding of what our students are able to achieve? According to Cowley and Easton (2017), "[t]he act of publicly rating and ranking schools attracts attention and can provide motivation. Schools that perform well or show consistent improvement are applauded. Poorly performing schools generate concern, as do those whose performance is deteriorating. This inevitable attention provides one more incentive for all those connected with a school to focus on student results" (p. 3). Are we not publicly shaming schools that score poorly on this assessment? It is believed that research needs to focus on student socioeconomic status, ethnicity, family support (or lack thereof), and the quality/preparedness of teachers. Focusing on these areas will give us a more complete picture of what happens to students as "they go through the education system" (Winn, 2003, p. 4). None of these areas are accounted for when administering a standardized test. The test does not allow students to collaborate with others, use technology to look topics up, or even type the written components (there are four components: one math and one English are done digitally, and one math and one English are written in the booklet provided; most of the questions are multiple choice). When in life would we have students sitting in isolation without access to technology for support?

In the chapter "Enhancing math learning for all students," it is argued that graphing calculators help support higher-level thinking because students are able to solve multi-step problems that could not otherwise be tackled in the classroom. The use of these calculators prevents students from being bogged down in the calculations and allows them to focus on the process (p. 951). These tools are relatively inexpensive, which makes them accessible to all classrooms (unlike expensive computers or tablets), and students should have access to them all of the time. Studies show that students who used graphing calculators daily learned more than those who used them infrequently (Voogt & Knezek, 2001, p. 956). According to the National Center for Education Statistics, 11% of high school math classrooms use computers, whereas 40% use graphing calculators (as cited in Voogt & Knezek, 2001, p. 952). "Cutting edge research is exploring the latest new advance – graphing calculators that are connected via a wireless network. In simple uses, the wireless network can enable teachers to engage in formative assessment. For example, a teacher can take a quick poll of students' responses to a conceptual question and display the results instantly. Teachers can use this capability to give students feedback and to adjust instruction" (Voogt & Knezek, 2001, p. 952). It is important to note that calculators should not be used for learning basic math skills (mental math, estimation, etc.), as these are still very important skills that are required in our daily lives (Voogt & Knezek, 2001, p. 953). Going back to my assessment piece: if we are allowing our students to use calculators, computers, and tablets in our classrooms to enhance their learning, how can we possibly expect them to succeed on a standardized test that does not allow them to use these types of technology?
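To make the wireless-polling idea in that quote concrete, here is a minimal sketch (my own, with made-up responses) of the kind of instant tally a teacher might display before deciding whether to reteach:

```python
from collections import Counter

# Hypothetical responses to one conceptual multiple-choice question,
# as they might arrive from networked calculators or clickers.
responses = ["B", "C", "B", "A", "B", "D", "C", "B", "B", "A"]

tally = Counter(responses)
for choice in sorted(tally):
    share = 100 * tally[choice] / len(responses)
    print(f"{choice}: {tally[choice]:2d} ({share:.0f}%)")
```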

Questions:

  1. How can we possibly expect our students to be successful on a standardized test when its delivery looks nothing like how we teach our students on a daily basis (little or no technology, no discussion, no embodied learning, no collaboration, etc.)? What does this do to their self-esteem when they get the results back (we have to send home the graded booklets)?
  2. How can we integrate STEM projects into standardized assessments? Why are math and English the only two subjects represented on these assessments? Doesn’t this make students think that other subject areas are not as important?  

Cowley, P., & Easton, S. (2017, February). Report Card on British Columbia’s Elementary Schools 2017. Retrieved March 11, 2018, from Report Card on British Columbia’s Elementary Schools 2017

Enhancing math learning for all students. In J. Voogt & G. Knezek (Eds.), International Handbook of Information Technology in Primary and Secondary Education, Springer, 951-959. Retrieved on March 11, 2018, from http://ezproxy.library.ubc.ca/login?url=http://www.springerlink.com/content/k044345111t8v102/

Winn, W. (2003). Learning in artificial environments: Embodiment, embeddedness, and dynamic adaptation. Technology, Instruction, Cognition and Learning, 1(1), 87-114. Full-text document retrieved on January 17, 2004, from: http://www.hitl.washington.edu/people/tfurness/courses/inde543/READINGS-03/WINN/winnpaper2.pdf

Embodied Learning: Primary Learner

  • When discussing your practice, describe a topic that you teach that you think would benefit from an embodied learning approach and explain why.
  • E-Portfolio: How could you use what is developed in these studies to design learning experiences for younger learners that incorporate perception/motion activity and digital technologies? What would younger children learn through this TELE (technology-enhanced learning experience)?

The notion that your body influences your mind, and that learning occurs when people adapt to their environment, is the central premise of Winn's (2003) article. Winn (2003) claimed that "we must think of the learner as embedded in the learning environment and physically active in it, so that cognition can be thought of as embodied as well as cerebral activity" (p. 3). Additionally, Lindgren and Johnson-Glenberg's (2013) research showed that conceptual development and comprehension are enhanced when learners engage and interact with their physical surroundings through creation and manipulation. Moreover, they found that mixed reality (MR) technologies, which blend virtual environments with physical ones, are "well suited for facilitating embodied learning because they combine physical activity with salient and compelling representational supports" (p. 447).

Personally, I have seen a rapid shift in the classroom, where students can now connect with abstract concepts in virtual and online learning environments. Klopfer and Sheldon (2010) noted that participatory simulation "enables students to see the world around them in new ways and engage with realistic issues in a context with which students [are] already connected" (p. 86).

I am lucky to be part of my school division's STEAM cohort (mostly elementary teachers), which incorporates art (A) into the standards of science, technology, engineering, and math. We recently changed a typical paper-and-pencil animal research project to be more immersive and embodied by incorporating mixed reality and a Learning-for-Use environment (motivate, construct, and reflect).

Design Challenge: Can you create an animal that would help you SURVIVE?

Note: Station 1 uses the Animal VR cards. The cards give students the opportunity to bring the animals to "life" and to connect them with other cards (food, predators, or prey). As Klopfer and Sheldon (2010) concluded, these embodied environments have the "potential to engage students by seeing information in context and providing a platform through which they creatively explore content by designing and exploring scenarios through the lens of games" (p. 93).

The TPACK framework is useful here because it supports active and collaborative blended learning. Typically, however, most MR applications for primary students offer embodied learning environments with few opportunities to collaborate with peers; in other words, they are mostly single-user applications.

  1. How can primary students be better supported to work with their peers in an embodied environment?
  2. How is it possible for primary students to mix virtual and augmented realities? Is it essential to manipulate realities at a young age?

Klopfer, E., & Sheldon, J. (2010). Augmenting your own reality: Student authoring of science‐based augmented reality games. New Directions for Youth Development, 2010(128), 85-94.

Lindgren, R., & Johnson-Glenberg, M. (2013). Emboldened by embodiment: Six precepts for research on embodied learning and mixed reality. Educational Researcher, 42(8), 445-452.

Winn, W. (2003). Learning in artificial environments: Embodiment, embeddedness, and dynamic adaptation. Technology, Instruction, Cognition and Learning, 1(1), 87-114.

 

Physicality in a Virtual World?

Cognitive Learning Theory Renaissance

I found this week's readings to be quite engaging, as the material was mostly new to me. I mean, I've always felt that there was a clear benefit to having students physically engage with content in order to solidify their learning, but I've never had it explained in the detailed and justified way it was by Winn (2003). I learned that cognitive learning theories were set aside for some time, essentially replaced by constructivist and social learning theory approaches. Their downside seemed to be how they segregated the brain's "internal, cerebral activity" (Winn, 2003) from the immediate environment.

Well, it turns out that our internal, cerebral activities are inextricably linked to external, "embodied" activities: two seeming opposites engaged in an endless, reciprocal dance named "dynamic adaptation", resulting in learning. These concepts are what engaged me; considering learning not only as the result of internally-generated knowledge structures but as a series of environmentally-triggered "distinctions" which pressure us to adapt. Perhaps even more fascinating is the rabbit hole that opens once you start to consider learning as being fundamentally linked to environment. This means that everyone's learning, or perceived world, is literally unique, shaped by their environment and naturally-varied experiences as well as genetics, while being constrained by sensory limitations (all essentially related to the concept of "Umwelt").

Environment guides Learning

What I found perhaps most impressive about Winn's writing was the clarity of the explanations of how learning can be guided by the environment itself. There are four stages:

  1. Declare a Break
    • Activity is somehow interrupted by noticing something new or unaccounted for
  2. Draw a Distinction
    • Sorting the new from the familiar
  3. Ground the Distinction
    • Integrate the new distinction into the existing knowledge network (or, if it defies deeply-rooted beliefs, it will simply be memorized then forgotten – sound familiar??)
  4. Embody the Distinction
    • The new distinction is applied to solve a problem

Artificial environments (e.g. video games, VR) can allow us to go beyond scaffolding (such as seen in SKI/WISE) and, by understanding these four stages, embed pedagogical strategies into the environment itself. I mean, why not? Rarely can we design every aspect of our real-world environments, but we certainly can in video games and VR experiences. Of course, not everyone is a game designer, but most of us could manage to create a virtual field trip, for example. The experiences could be designed so that they force students to create a "series of new distinctions" which could lead them to understanding whole environments (Winn, 2003); something extremely powerful, especially for students who could never visit the environment in person.

A Variety of Applications

I think that even topics like quadratic equations and parabolas could benefit from this embodied learning approach. This could look like anything from a teacher designing a "tactile" activity in Desmos Activity Builder to leveraging tools like those found on GeoGebra or NCTM (e.g. sliders, tap-and-drag functionality) for a more interactive, embodied experience. Tech allows us to explore abstract concepts in an embodied way, which is perhaps one of its greatest affordances.
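As a rough, desktop-Python stand-in for that kind of slider-driven exploration (not Desmos or GeoGebra themselves, just an illustration of the interaction), matplotlib's widgets can wire sliders for a, b, and c to a live plot of y = ax² + bx + c:

```python
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.widgets import Slider

# Drag the a, b, c sliders and watch y = a*x**2 + b*x + c update in real time.
x = np.linspace(-10, 10, 400)
fig, ax = plt.subplots()
fig.subplots_adjust(bottom=0.30)
line, = ax.plot(x, x**2)
ax.set_ylim(-25, 25)

s_a = Slider(fig.add_axes([0.15, 0.18, 0.7, 0.03]), "a", -3.0, 3.0, valinit=1.0)
s_b = Slider(fig.add_axes([0.15, 0.12, 0.7, 0.03]), "b", -10.0, 10.0, valinit=0.0)
s_c = Slider(fig.add_axes([0.15, 0.06, 0.7, 0.03]), "c", -10.0, 10.0, valinit=0.0)

def update(_):
    line.set_ydata(s_a.val * x**2 + s_b.val * x + s_c.val)
    fig.canvas.draw_idle()

for s in (s_a, s_b, s_c):
    s.on_changed(update)

plt.show()
```

Dragging a slider and watching the vertex slide around is the kind of perception-action loop the readings describe; Desmos and GeoGebra simply do it with far less setup.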

Affordances of VLEs

All this thinking led me to explore more recent papers on the subject of virtual environments. It turns out significant research has been done on VLEs (virtual learning environments). For example, I came across a paper by Dalgarno and Lee (2010) that identified five affordances of VLEs that translate directly into learning benefits:

  1. spatial knowledge representation,
  2. experiential learning,
  3. engagement,
  4. contextual learning, and
  5. collaborative learning.

These probably come as little to no surprise to most of us, but it is certainly nice to have them listed so simply and to know that significant research has determined their effectiveness.

VLE’s Unique Characteristics

Dalgarno and Lee's work also argued that 3D VLEs have two unique characteristics, "representational fidelity" and "learner interaction", both of which I feel are particularly essential to video game and VR design.

Unique Characteristics of 3D VLEs

Representational Fidelity

  • Realistic display of environment
  • Smooth display of view changes and object motion (e.g. high frame rate)
  • Consistency of object behaviour (e.g. realistic physics)
  • User representation (e.g. avatars)
  • Spatial audio (e.g. 7.1 surround)
  • Kinesthetic and tactile force feedback (e.g. rumble functionality)

Learner Interaction

  • Embodied actions (e.g. physical manipulatives in a virtual environment)
  • Embodied verbal and non-verbal communication (e.g. chat functionality, online/local multiplayer)
  • Control of environmental attributes and behaviour (e.g. customization interface)
  • Construction/scripting of objects and behaviours (e.g. programmed functionalities are possible and at the whim of the programmer/designer)
(all examples in brackets above are my own contributions)

Basically, when these two characteristics mingle, deeper learning experiences are bound to take place as they leverage the five affordances that translate directly into learning benefits. However… there’s a limit! There is an optimum level of interactions between them that maximizes learning; going “beyond the optimum” can actually lead to limited or negative returns with respect to their learning benefits (Fowler, 2015). Pretty crazy hey? I guess this is a case of “too much of a good thing”?

Questions Linger

A few questions still linger as I come down from learning all this new stuff. Perhaps you can help? 🙂

  • If contact with environment can trigger particular genetic “programs”, does this mean that genes also determine student learning capabilities? If that’s the case, is there some way we can engineer environments to “trigger the right programs”, while avoiding the “wrong” programs?
  • I mentioned manipulating sliders earlier when referencing a VLE. Does this type of interaction actually "count" as physical interaction, or does embodied learning need to incorporate gross motor skills, for example?
  • How does one determine the optimum level of interaction between the representational fidelity and learner interaction of a VLE?
    • I mean, I feel like The Legend of Zelda: Breath of the Wild does a pretty dang great job of mixing these two and teaching the player without using any words, but how did they find that exact sweet spot without leading to negative returns?
  • Bonus Question: I’ve never participated in a distance/online course that takes full advantage of any of the affordances of TELEs and VLEs. Has anyone else?

 

Thanks for reading, and apologies for being late here. Kinda struggling to keep my head above water at the moment.

Scott

 

References

Dalgarno, B. & Lee, M. (2010). What are the learning affordances of 3-D virtual environments? British Journal of Educational Technology, 41, 10-32.

Fowler, C. (2015). Virtual reality and learning: Where is the pedagogy? British Journal of Educational Technology, 46(2), 412-422.

Winn, W. (2003). Learning in artificial environments: Embodiment, embeddedness and dynamic adaptation. Technology, Instruction, Cognition and Learning, 1(1), 87-114.

 

Appendix

There were a few extra things I came across that were very interesting but would have made the body of the post even longer than it already was. I still wanted to share them because of how useful they seem. Mainly there’s a Table, a Figure and supporting Context, which all relate to a “design for learning” framework for deriving appropriate learning activities. In essence, these resources can help teachers clearly define the learning context before learning takes place in order to maximize effectiveness.

Specifically, the learning context should include/combine variables such as:

  • Locus of control (teacher or learner)
  • Group dynamics (individual or group)
  • Teacher dynamics (one-to-one, one-to-many, many-to-many)
  • Activity or task authenticity (realistic or not realistic)
  • Level of interactivity (high, medium, or low)
  • Source of information (social, reflection, informational, experiential)

The idea is that combining these variables based on the requirements of a given learning context can help a teacher determine the most appropriate teaching and learning approach.

Finally, Table 1 and Figure 3 below (Fowler, 2015) are meant to be used to help derive learning activities that will take place within this clearly defined learning context. I hope you find them useful!

[Table 1: Fowler, 2015]

 

[Figure 3: Fowler, 2015]

Embodied Learning and Math

Though not necessarily tied to the idea of technology, one excerpt from this week’s readings reminded me of this graphic that’s been floating around my social media feeds:

 

[Graphic: MindShift.com, 2018]

Winn writes, “Some recent thinking suggests that it is better to consider students to be tightly coupled to the environment rather than embedded in it. Being embedded suggests the student is passive, carried along as the environment changes. Successful students are anything but passive.” (Winn, 2003).

To be brief, Winn argues that "Artificial environments can use computer technology to create metaphorical representations in order to bring to students concepts and principles that normally lie outside the reach of direct experience" (2003). Essentially, technology supports the learning and provides a form of adaptation, in that the learner interacts with their environment significantly more than was previously possible or realized.

In another article, I read about the application of a program on handheld devices called TechPALS to mathematical problem solving. It was a great reminder that the software does not have to offer an immersive experience within the specific curriculum area to be effective. The study compared control classes with classes integrating TechPALS, in which students worked on "repeated practice, feedback, and cooperative learning"; this creates embedded experiences within the content and shapes the environment in which the students interact with the subject matter. Roschelle et al. write that TechPALS is important because "technology can socialize learning, encouraging positive behaviors such as asking questions, giving explanations, and discussing disagreements. These social behaviors, in turn, may engage students in connecting conceptual and procedural aspects of mathematics content" (Roschelle et al., 2010). The embodiment of their learning is intrinsically tied to what they refer to as "positive interdependence" and "individual accountability". As far as setting up a similar scenario in my own practice, I could imagine a mobile app like Kahoot, or something comparable but perhaps less gamified, in which students fit pieces of learning together like a jigsaw; this could serve similar aims for embodied learning (a rough sketch of the jigsaw idea follows this paragraph).

From my perspective, reading for usefulness and engagement, the instructional design of the lessons had everything to do with the embodiment of the content, and little to do with the actual technology. As the environment changes, the students interact with it in various ways, and the ability to engage in conversation about those observations, question each other respectfully, and have their views challenged goes back to the "adaptive" learning environment Winn was referring to. The role of technology in facilitating those goals is outlined in his idea of learning as adaptation and the possibility of "us[ing] technology to reduce the limits imposed by our sensory [or cognitive] bandwidth" (Winn, 2003), creating more spaces for students to interact with the environment as it happens.
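Here is that jigsaw idea sketched out (my own illustration, not the TechPALS software): each group member holds a distinct piece of the task, so "positive interdependence" and "individual accountability" are built into the structure rather than hoped for.

```python
import itertools

# Made-up students and task pieces; the point is the structure, not the names.
students = ["Ava", "Ben", "Cam", "Dee"]
pieces = ["estimate the answer", "compute it exactly",
          "explain the reasoning", "check with a different method"]

# Positive interdependence: every piece goes to someone, and the group
# answer cannot be assembled until every piece comes back.
assignments = dict(zip(students, itertools.cycle(pieces)))
for student, piece in assignments.items():
    print(f"{student}: {piece}")

# Individual accountability: track whose piece is still missing.
submitted = {"Ava", "Cam"}
missing = [s for s in students if s not in submitted]
print("Still waiting on:", ", ".join(missing) if missing else "no one, group complete!")
```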

Finally, the last article I read was about mental mathematics strategies, by Jérôme Proulx. It was a very interesting take on embodied and embedded learning, as it is a current article linked to the theoretical ideas of John Threlfall (2002), and not necessarily what I would instinctively teach. I think I have some researching to do! Proulx argues that teaching strategies for mental maths is almost unnecessary and could be readdressed in education. He writes, "This is at the grass roots of Threlfall's argument for the futility of classification and choice of strategies, for no mapping of classifications of strategies produced by students appears satisfactory. This said, even if some authors, as Threlfall highlights, recognize the variety in strategies as too great to contain them in categories and that these would need to be broadened enough to encompass them all, he insists that not even broad categories would successfully account for the diversity in strategies from one author to another. Categories or classifications somehow become useful fictions, that can even be seen to serve a questionable purpose, especially when it comes to teaching these strategies" (Proulx, 2013). His article cites perspectives "grounded in enactivism," where students interact with the problem as it happens and use what is comfortable for them to solve it. So my first question to you is based on his writings:

Q1: Is there value in naming strategies (specifically for mental maths) if Proulx has determined “it does not give much justice or credit to the nature of students’ mathematical activity when they engage in these strategies in a mental mathematics context” (2013)?

Q2: How does an educator monitor differentiation in embodied learning?

 

References:

Roschelle, J., Rafanan, K., Bhanot, R., Estrella, G., Penuel, B., Nussbaum, M., & Claro, S. (2010). Scaffolding group explanation and feedback with handheld technology: Impact on students' mathematics learning. Educational Technology Research and Development, 58(4), 399-419.

Proulx, J. (2013). Mental mathematics, emergence of strategies, and the enactivist theory of cognition. Educational Studies in Mathematics, 84, 309-328.

Winn, W. (2003). Learning in artificial environments: Embodiment, embeddedness, and dynamic adaptation. Technology, Instruction, Cognition and Learning, 1(1), 87-114.

Embodied Learning and Mixed-Realities

It was fascinating to read and learn about embodied learning this week because, as a primary teacher, I see and use this type of learning regularly across the school, especially in the early years. Think of the songs "I'm a Little Teapot" or "The Itsy-Bitsy Spider," for instance. While learning those songs, everyone learns the lyrics along with the gestures that go with them, and I would put money on the fact that most adults can still recite the words along with the movements years after being taught. According to Winn (2003), cognition does not just involve the brain but the whole body. Embodied learning, then, is "how our physical bodies serve to externalize the activities of our physical brains in order to connect cognitive activity to the environment" (Winn, 2003). Lindgren and Johnson-Glenberg (2013) explain that in recent years a lot of attention has been given to improving and introducing new instructional methods that focus on using the body to make meaningful connections with content in math and science. These innovations have led to the emergence of "new technologies that accept natural physical movement" such as gestures, touch, and body positioning. Combining the physical world with the virtual world is what Lindgren and Johnson-Glenberg (2013) describe as mixed reality (MR). In many schools today we have a wide range of tools at our disposal to immerse students in mixed realities. "These technologies typically involve [using] real-world objects, such as our bodies…[alongside] some type of digital display" (Lindgren & Johnson-Glenberg, 2013). While participating in MR, students are in a controlled context where they have the chance to interact physically with the content to further understand concepts and explore cause and effect directly.

Reflecting on embodied learning and Module B, I think that planning with embodied learning and mixed realities in mind could fit into any of the TELEs we investigated. This type of approach aligns well with the principles we discussed in the last module: it promotes learning through collaboration, it is motivating, it reaches all learning styles, and it gives the teacher an opportunity to rethink and administer unique, well-planned assessments.

In math class, I occasionally use an embodied learning approach and mixed realities. For instance, a few weeks ago we made human graphs in different ways to teach students about scale, and we incorporated the Sphero robot when teaching students about angles. This approach definitely has a place in the classroom, and I can appreciate how using the whole body and getting kids involved can help them make more connections, as well as be more involved and invested in their learning.
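On the Sphero-and-angles idea: this is only an analogy to what a driving robot does, not the Sphero Edu API itself, but Python's built-in turtle module lets students command a screen "robot" through the same forward-and-turn moves, which makes the exterior-angle relationship (360 ÷ number of sides) something they can predict and then watch play out.

```python
import turtle

def trace_polygon(sides, length=80):
    """Drive a screen 'robot' around a regular polygon by turning
    through the exterior angle at each corner."""
    bot = turtle.Turtle()
    exterior_angle = 360 / sides
    for _ in range(sides):
        bot.forward(length)
        bot.left(exterior_angle)

trace_polygon(5)   # a pentagon takes five 72-degree turns
turtle.done()
```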

Questions I have are:

When should we introduce mixed/virtual/augmented realities, etc.? Should young students be exploring and making connections with their natural environments before exploring things outside of their physical reach?

Is embodied learning something that teachers do naturally? Does it need to be explicitly planned? Or is it something that is woven into planning organically through best practices?

References:

Lindgren, R., & Johnson-Glenberg, M. (2013). Emboldened by embodiment: Six precepts for research on embodied learning and mixed reality. Educational Researcher, 42(8), 445-452. http://www.move2learn.education.ed.ac.uk/wp-content/uploads/2015/04/Lindgren-2013-Embodied-Learning-and-Mixed-Reality.pdf

Winn, W. (2003). Learning in artificial environments: Embodiment, embeddedness, and dynamic adaptation. Technology, Instruction, Cognition and Learning, 1(1), 87-114.