Author Archives: Michael Hengeveld

Shared Video: TELEs in the STEM classroom

Assignment #2, The Design of TELEs, encouraged us to share our videos.  I produced a guide to PBL that finally gives a more research-grounded basis for our STEM program.  Very satisfying!

This video was designed to highlight one aspect of the guide:  The Role of Technology in STEM.  I hope it provides concrete examples of how we can use the affordances of technology to transform how we teach.

I’ve enjoyed learning from you all.  It’s been awesome to be in an environment for sharing so many good ideas.  Enjoy the rest of the summer!

Michael

T-GEM and Buoyancy

TL;DR:  We designed and flew a helium balloon probe and studied the complex system of forces through PhET simulations.  I’d conclude that the age and stage of the students matter a lot, making the teaching of “real world” science to K-10 a unique challenge requiring more than good simulations.

***

This summer I ran a STEM program with a colleague.  One of our projects was to make a cheap, simple aerial probe using helium balloons.  The ultimate goal was to send various probes to collect data at inaccessible heights in the area around the school.  Jacobson and Wilensky (2006) describe a complex system as having many interacting, interdependent parts.  With no fewer than five independent parameters, this certainly qualified!  Our first probe (by student preference) was a camera.  We ran the lesson in five parts, loosely following a T-GEM model of exploring and modifying a model.

Lesson 1:  What is buoyancy?

In groups, students defined buoyancy, and tried to describe the model by which it worked, using words and pictures.  Everyone agreed that buoyancy acts upward, but there was a split as to what causes the upward motion.  Competing theories included reduced gravity, air pushing up, and helium pushing up.  Given ten balloons each, groups had to measure and report how much buoyancy each balloon offered, in units of their own choosing.  Most went with things like “paperclips/balloon”.
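For a rough sense of the numbers involved, the lift of one balloon can be estimated from the density difference between air and helium.  This is my own back-of-envelope sketch, not part of the lesson; the balloon size, rubber mass, and paperclip mass are all assumed values:

```python
import math

# Back-of-envelope lift of one helium party balloon.
# Balloon size, rubber mass, and paperclip mass are assumed ballpark values.
rho_air = 1.2      # kg/m^3, room-temperature air
rho_he = 0.18      # kg/m^3, helium
r = 0.14           # balloon radius (m), assumed
m_balloon = 0.003  # mass of the rubber (kg), assumed
m_clip = 0.001     # mass of one paperclip (kg), assumed

V = (4 / 3) * math.pi * r ** 3
net_lift_kg = (rho_air - rho_he) * V - m_balloon  # payload one balloon can carry
print(f"Net lift: {net_lift_kg * 1000:.1f} g, "
      f"or about {net_lift_kg / m_clip:.0f} paperclips per balloon")
```

Dividing a payload’s mass by this net lift also gives the balloon count we would need later, in Lesson 3.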

Lesson 2:  Balloons and Buoyancy

We discussed standardized units and the SI system, following on from the previous class.  Students were given access to the PhET module “Balloons and Buoyancy” and asked to answer the question:  What parameters can you control to make the balloon have the most buoyancy?  This activity was difficult to use: the display had too many options for them, but they did eventually agree that a cold, heavy-species gas outside the balloon and a hot, light-species gas inside was the best combo.  As Stephens and Clement (2015) note, hands on the keyboard sometimes means mind elsewhere, and students often need heavy scaffolding to use these simulations effectively.  We tried “pair programming,” in which two students share one machine and trade off, which seemed to help.  Stephens and Clement (2015) also suggest that student-generated questions can be very effective.

Lesson 3:  Actually Flying the Balloons

We determined how many balloons were required for our load, and flew a balloon camera.  It was awesome.  Students agreed that the lift we observed was less than our model predicted.  Wind also complicated our model.  We collected observational and height data.

Lesson 4:  Buoyancy as a Force

We agreed that, based on observation, many things were pushing on the balloon at the same time.  We used PhET a second time, with another buoyancy simulation.  This time, students were asked to name and describe the forces pushing on the blocks, and how to maximize the buoyancy.  They seemed unbothered by the fact that it was blocks in water, not balloons in air.  Everyone figured out to minimize the mass and maximize the volume.  Nobody referred to this in terms of density.
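In hindsight, the density framing they missed is only one algebra step away from what they did discover.  For an object of mass $m$ and volume $V$ in a fluid of density $\rho_{fluid}$, the net upward force is

$$F_{net} = \rho_{fluid} V g - mg = (\rho_{fluid} - \rho_{object})\,Vg,$$

so “minimize the mass, maximize the volume” is exactly “make your density smaller than the fluid’s.”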

Lesson 5:  Putting It All Together

In designing our second probe, students were asked to list all the parameters that the system depended upon, and how to optimize them.  This was compared to their Lesson 1 drawings and definitions, and they had a discussion about the difference.  Ultimately, they did internalize that buoyancy is actually the combination of many forces.  They could show physically how to control buoyancy.  They could not, however, make the words come out.  It is hard to say if this is problematic.  Most of them identified buoyancy as the central controlling force that acted upward.  While true from an outside perspective, it lacks granularity or a causal link between buoyancy and forces with other origins.  On the positive side, most of them agreed that they had not considered that more than one force could be acting at the same time, which is perhaps the beginning of a “strong” or “radical” conceptual change (Jacobson & Wilensky, 2006).  The second probe flew better, but had very wobbly footage because of the wind.

Q:  When is the best age/stage to engage with more complex systems?
Q:  If students are motivated by real phenomena, but studying those phenomena is complex, is there a high-quality middle ground?

Friedrichsen, P. M., & Pallant, A. (2007). French fries, dialysis tubing & computer models: Teaching diffusion & osmosis through inquiry & modeling. The American Biology Teacher, 69(2), 22-27.
Jacobson, M. J., & Wilensky, U. (2006). Complex systems in education: Scientific and educational importance and implications for the learning sciences. Journal of the Learning Sciences, 15(1), 11-34.
Stephens, A. & Clement, J. (2015). Use of physics simulations in whole class and small group settings: Comparative case studies. Computers & Education, 86, 137-156.

Embedded Networks: The International Boiling Point Project

“Speculate on how such networked communities could be embedded in the design of authentic learning experiences in a math or science classroom setting or at home. Elaborate with an illustrative example of an activity, taking care to consider the off-line activities as well.”

I chose to explore GLOBE as an example of networked communities. It is a fascinating and extensive resource and instantly called to mind a very similar networked learning project that I would like to share: The International Boiling Point Project.  Every year, schools from across the globe collaborate to answer the simple question: what factors control how water boils? Each school team contributes their results to a common database. From a teaching perspective, this allows students to explore the effects of altitude, local air pressure, water types, and methodology in a way that would not be possible in isolation. The magic of this experiment touches on two main issues from this lesson:

1) Students socially construct knowledge
2) Communities of knowledge have normative practices

When students first post their data, there is invariably confusion. Do we use Celsius, Kelvin, or Fahrenheit? Why did that group not include altitude data? That group has a boiling point of over 100 °C! The networking requires that students agree on normative language and practices in a very real way. By interacting with and critiquing other groups, they are forced to look more carefully at their own practice in a way that is difficult to motivate in a normal “lab” activity. The paper by Driver et al. speaks to the importance of providing these social constructions in the classroom:

“Science classrooms are being recognized as forming communities that are characterized by distinct discursive practices…researchers are experimenting with ways of organizing classrooms so as to reflect particular forms of collaborative enquiry that can support students in gradually mastering some of the norms and practices that are deemed to be characteristic of scientific communities” (Driver et al., 1994, p. 9)

One thing I have noticed in these collaborations is that some students who are very active in informal learning environments do not contribute when in the formal learning environment. It’s almost as though they feel that since it is “official” it is not safe to contribute. How do we encourage meaningful participation in a formal learning environment?  Also, Driver (1994) suggests that communities of practice have very specific language and symbols.  Is combining subjects in a STEM environment problematic in this regard?  That is, does mixing the symbols and practices of mathematics, science, and technology come with problems?

Driver, R., Asoko, H., Leach, J., Scott, P., & Mortimer, E. (1994). Constructing scientific knowledge in the classroom. Educational Researcher, 23(7), 5-12.
Means, B. & Coleman, E. (2000). Technology supports for student participation in science investigations. In M.J. Jacobson & R. B. Kozma (Eds.), Innovations in science and mathematics education: Advanced designs for technologies of learning (pp. 287-320). New Jersey: Lawrence Erlbaum Associates, Publishers.

Penuel, W. R., & Means, B. (2004). Implementation variation and fidelity in an inquiry science program: Analysis of GLOBE data reporting patterns. Journal of Research in Science Teaching, 41(3), 294-315.

Embodied and Informal Learning

It’s Safe to Come Back Now.

Essentially, early applications of AI to model the brain and human learning failed because they viewed cognition as internal and sequestered from the environment.  Constructivism and Situated Learning theories filled the gap, exploring best practices from a broader learning-environment perspective.  Their success as theories seems to permeate the MET program.  After significant developments in neuroscience, it’s safe to come back to cognitive learning theories!

Can the Brain Operate in the Absence of an Environment?

Embodied cognition seems to come down to this question.  In the old AI model, once “loaded with programs,” the brain could operate independently of its environment: a computer floating through space, just doing its own thing.  The key change was to overthrow this “isolated brain” model and replace it with a complex, adaptive cognitive system floating in an environmental soup.  In this model the “computer” cannot operate without a context.  Moreover, the embedded connection between the corporeal organs of the cognitive system (eyes, ears, etc.) and the environment forms a unique “umwelt”.  In other literature, I’ve heard this called a “lifeworld”.  Jones et al. (2013) note that students are naturally motivated to learn and develop successful adaptations to their environment when involved in informal learning activities, like geocaching.  I believe motivation and the concept of umwelt are very strongly connected.  That is, it is easier to be motivated to learn things when you perceive them clearly and see subtleties, in the same way that “beer tasters…[have]…heightened perceptual discrimination” (Winn, 2003, p. 13).  Núñez (2012) argues that the time is right to develop and use a more rigorous scientific approach to this theory of learning.

Learning as Adaptation.

This section of the Winn paper produces a teaching “road map” of sorts for providing the environmental pressures that drive learning.  If biochemistry and genetic history provide the basis for our cognition, then the environment provides the pressure to adapt, or “learn,” in stages:

  1. Notice something is wrong with a concept.  (Declare a break)
  2. Disambiguate the effect.  (Draw a distinction)
  3. Embed the “new rule” in the existing conceptual network.  (Ground the distinction)
  4. Give the idea a trial run to test its usefulness.  (Embody the distinction)

This seems a lot like Scaffolded Knowledge Integration with an additional “usefulness testing” stage.  These readings have made me more aware of the situated learning in my own practice as it relates to the senses.  I can see that the design and building of physical artifacts in PBL is of crucial importance!

Questions for Colleagues

  1. There is a mention of “Genetic predisposition to change” (Winn, 2003, p. 19).  Does this suggest that some students are genetically better at learning?
  2. Further in the paper, Winn states “The rules or procedures, that specify how the student interacts with the environment in the first place also change through adaptation, based on their success at producing fruitful behaviour.”  (Winn, 2003, p. 20).  Is this the same as saying that winning begets winning?  Is learning exponential or self-rewarding?
  3. Finally, in reference to Jones’ (2013) study of informal learning structures, how do we leverage the intrinsic motivational features of informal learning and make it count for our more formal processes?  Can understanding student “umwelt” and making their learning visible help us choose more motivating projects and approaches to teaching?

References:

Jones, A., Scanlon, E., & Clough, G. (2013). Mobile learning: Two case studies of supporting inquiry learning in informal and semiformal settings. Computers & Education, 61, 21-32.

Linn, M., Clark, D., & Slotta, J. (2003). WISE design for knowledge integration. Science Education, 87(4), 517-538.

Núñez, R. (2012). On the science of embodied cognition in the 2010s: Research questions, appropriate reductionism, and testable explanations. Journal of the Learning Sciences, 21(2), 324-336.

Winn, W. (2003). Learning in artificial environments: Embodiment, embeddedness, and dynamic adaptation. Technology, Instruction, Cognition and Learning, 1(1), 87-114.

Synthesis of Models: Is Motivation King?

Comparison and Contrast
I found that covering Anchored Instruction, SKI, LfU, and T-GEM in such a short time left them pretty jumbled in my head, so this was a great chance to sort things out!  I conducted a review by re-reading the main literature surrounding the different frameworks, as well as reviewing our posts.  It is interesting to note that the theories span 1992 to 2007.  In the context of changing technology, I think this is relevant.  In comparing the four theoretical frameworks, I found commonalities and stand-out features.

Commonalities
All of the approaches we studied are rooted in constructivism, inquiry, and collaboration.  I would probably describe this to a parent as “hands-on learning in groups,” and I’m all in favour!  I have four kids in the school system, and I can honestly say that this appears to be how things are done in K-7 here in BC for the most part; there is a lot of social learning and project-based learning.  The 8-12 years and higher education are a different story, and don’t fully adopt any of these features.  Despite the “flatness” of knowledge that tech offers, my own high school remains highly prescriptive, fragmented, and individual at both the staff and student level.  The movement to a more student-centered mantra is messy and filled with uncertainty.  What are they learning?  What if it isn’t the same thing?  How do I fairly assess students who are working in groups on differentiated projects?  For my money, this revolution can and will only start with encouragement and pro-D investment at the teacher level.

Stand-out Features (for me)
1)  The features of “Anchored Instruction” seem to flow from its focus on authentic problem solving.  I found the Jasper Series a bit dated, but it could easily be mapped onto more modern tools or an entirely different delivery mode.  I’m not sure that the ideas need to rest on a video series at all.  In my own classes, I have found that building real structures like greenhouses and wind turbines is the authentic tie-in to content that students find engaging.  We experience a lot of failure in tying together the procedural and the declarative, but we are making progress.

2)  The standout feature of “SKI” for me is the focus on misconceptions.  This was my least favorite approach, because it seems to presuppose that something worthwhile is being studied in the first place.  Dealing with misconceptions is really important, but from my view of how learning works in the classroom, motivation to learn must come first.  I found that most of what might be accomplished in SKI is also covered incidentally by any theory of learning that is constructivist or iterative.

3)  I really like the “LfU” model’s focus on motivation, especially this quote:

“The problem with these traditional approaches is not that they attempt to communicate knowledge instead of giving students opportunity to construct it through direct experiences, but that the transmission approach does not acknowledge the importance of the motivation and refinement stages of learning and relies too strongly on communication to support knowledge construction.”  (Edelson, 2001, p. 378)

This framework is the strongest fit with my own experiences in teaching science and mathematics.  Our STEM team at Templeton has begun asking a lot more questions about the “lifeworlds” of students and how they engage with school.  This goes beyond “real world” and requires looking at what is relevant to students.  No easy feat, and I’m not yet sure how to do it.  Mostly we have been collecting surveys and reflecting on the choices students make when they are allowed to choose their own “capstone” project topics.

4)  The stand-out feature of “T-GEM” for me was the focus on data-driven models.  I really like how the approach is inquiry based and iterative.  This “accretion” or refinement method is a great way to expose and resolve misconceptions or contradictions.  The only weakness is the extreme level of scaffolding required.  The literature stresses “experienced teacher” so many times that I wonder if it is perhaps not the best way to coax teachers into a more “student centered” learning strategy.  I personally find that time is the scarcest of resources.  This model could be a disaster in our current “all go all the time” K-12 system.

The summary below reviews the papers and posts, based on the terms and explicit focus given for each framework.  I used it to organize my synthesis.

Explicit focus, by how many of the four frameworks (LfU, T-GEM, SKI, Anchored Instruction) name it:

All four:  Constructivist, Inquiry, Student-Centered, Collaborative
Three of four:  Real World, Engagement, Situated
Two:  Iterative, Technology
One:  Lifelong Learning, Differentiated, Complex

References:

Cognition and Technology Group at Vanderbilt (1992b). The Jasper series as an example of anchored instruction: Theory, program, description, and assessment data. Educational Psychologist, 27(3), 291-315.

Edelson, D.C. (2001). Learning-for-use: A framework for the design of technology-supported inquiry activities. Journal of Research in Science Teaching, 38(3), 355-385.

Khan, S. (2007). Model-based inquiries in chemistry. Science Education, 91(6), 877-905.

Linn, M., Clark, D., & Slotta, J. (2003). WISE design for knowledge integration. Science Education, 87(4), 517-538.

Mishra, P., & Koehler, M. (2006). Technological pedagogical content knowledge: A framework for teacher knowledge. The Teachers College Record, 108(6), 1017-1054.


Rocketry and Resistance

Noticing the effects of air resistance is easy.  Predicting the effects of air resistance on the motion of an object, however, is mathematically complex and beyond the scope of high school.  In Physics 11, students are introduced to motion without the effects of an atmosphere, which keeps things simple but highly unrealistic (especially for really fast things like bullets or rockets).  After years of this, my question is: why even bother?  Students effectively learn that physics is only true in books and exams, which only solidifies the separation between their informal and formal learning.  Empirical tools can do an excellent job of modelling the real motion of particles in an atmosphere while also introducing authentic challenges in science, which is more compelling for students (CTGV, 1992a).  The partial sacrifice is the simple analytical math part of the model.
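To show how tractable the realistic model is numerically (where it is not analytically), here is a minimal sketch of projectile motion with quadratic drag using simple Euler time-stepping.  This is my own illustration, and the mass, drag constant, and launch values are placeholder guesses, not measured ones:

```python
import math

# Minimal Euler-step sketch of projectile motion with quadratic air drag.
# All parameter values are placeholder guesses for a small foam air rocket.
m = 0.05     # mass (kg), assumed
k = 0.002    # drag constant (kg/m), lumping together 0.5 * rho * Cd * A
g = 9.81     # gravitational acceleration (m/s^2)
dt = 0.001   # time step (s)

v0, angle = 30.0, math.radians(45)   # launch speed (m/s) and angle, assumed
x, y = 0.0, 0.0
vx, vy = v0 * math.cos(angle), v0 * math.sin(angle)

while y >= 0.0:
    speed = math.hypot(vx, vy)
    ax = -(k / m) * speed * vx        # drag always opposes the velocity
    ay = -g - (k / m) * speed * vy
    vx, vy = vx + ax * dt, vy + ay * dt
    x, y = x + vx * dt, y + vy * dt

print(f"Range with drag:  {x:.1f} m")
print(f"Drag-free range:  {v0 ** 2 * math.sin(2 * angle) / g:.1f} m")
```

The point is not that students should write this, but that the realistic model is a few lines of bookkeeping rather than hard calculus, which is exactly what the simulator and video analysis below let them experience.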

The diagram below summarizes a T-GEM approach to a Compressed Air Rocketry project in which students are given the challenge of designing a rocket that will fly as far as possible on a short blast of air.

The project incorporates the affordances of social learning and making learning visible (Linn, 2003).  Students work iteratively in teams, making their learning visible through diagrams, group meetings, and presentations.  Three e-learning resources are needed for this:

1)  a camera with 60 fps or higher (most phones and all iPads)
2)  access to the PhET Projectile Motion online simulator  https://phet.colorado.edu/sims/projectile-motion/projectile-motion_en.html
3)  access to the free video-analysis program Tracker  http://physlets.org/tracker/

Special attention should be paid to helping the students collect quality data (this is where scaffolding is necessary), or the evaluation part of the activity will collapse.  Rich scientific data collection is not a teenage instinct!  On that note, Khan’s study references “experienced science teachers” so often that I am left wondering: is it implied that T-GEM as a framework is difficult to wield without appropriate experience or deep grounding in TPACK?
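On the data-collection side, the core arithmetic is just a finite difference: Tracker reports a position for each frame, and at 60 fps successive frames are 1/60 s apart.  A hypothetical sketch, with made-up sample positions:

```python
# Speeds from frame-by-frame positions (e.g., exported from Tracker).
# The positions are made-up sample data in metres, one value per frame at 60 fps.
fps = 60
positions = [0.00, 0.12, 0.23, 0.33, 0.42, 0.50]

dt = 1 / fps
speeds = [(b - a) / dt for a, b in zip(positions, positions[1:])]
print([round(v, 1) for v in speeds])  # m/s between successive frames
```

Even this tiny made-up data set shows a rocket slowing frame by frame, which is the drag signature students are hunting for.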

Cognition and Technology Group at Vanderbilt (1992a). The Jasper experiment: An exploration of issues in learning and instructional design. Educational Technology, Research and Development, 40(1), 65-80.

Khan, S. (2007). Model-based inquiries in chemistry. Science Education, 91(6), 877-905.

Khan, S. (2010). New pedagogies for teaching with computer simulations. Journal of Science Education and Technology, 20(3), 215-232.

Linn, M., Clark, D., & Slotta, J. (2003). WISE design for knowledge integration. Science Education, 87(4), 517-538.

A “Bridge Building” reflection through an LfU lens

Using the LfU literature as a lens, I reflected on the “Back Country Bridge” project that my STEM 11/12 class did this past September.  At the time, I had never heard of Learning For Use as a design theory.  This is what we presented to the students on the first day:

“Research, design, and construct the lightest possible wood-frame bridge that will safely allow a 100 kg person to cross a 4.0 m crevasse.”

Students worked collaboratively in groups of three.  Evaluations were set as 50% for the final bridge performance, 25% for written tests, and 25% for shop procedures. There was some direct instruction on how to shape and fasten wooden members, how to analyze forces, and how to test for strength.  We assigned homework problem sets with relevant math and physics, and set a “prototype” testing day at the 2.5 week mark to keep the students from procrastinating.  The entire project was 4.5 weeks long.
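For context, the direct instruction on force analysis boils down to one statics result: for a symmetric pair of members meeting under a central load, equilibrium at the joint gives the member force from a single trig function.  A sketch using the 100 kg figure from the brief and an assumed member angle:

```python
import math

# Force in each leg of a symmetric two-member support under a central load.
# Equilibrium at the loaded joint: 2 * F_member * sin(theta) = W.
m_person = 100.0          # kg, from the design brief
g = 9.81                  # m/s^2
theta = math.radians(30)  # member angle above horizontal, assumed

W = m_person * g
F_member = W / (2 * math.sin(theta))
print(f"Force in each member: {F_member:.0f} N")
```

Real bridge designs are more complicated than this, of course, but shallower member angles visibly drive the forces up, which is the intuition we wanted the homework sets to build.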

In retrospect, here is how I think we fared relative to my paraphrasing of the four LfU tenets of design found in Edelson (2001, p. 375):

1.   Learning takes place through construction and modification of knowledge structures.

This is later defined as constructivism, and I think we are following a constructivist model of learning. We are quite purposeful in our attempt to ensure that projects end with the creation of an artifact, be that physical or digital.  The students really got into building and testing the bridges, so that seemed like a success.

2.  Learning can only be initiated by the learner, whether it is through conscious goal-setting or as a natural, unconscious result of experience.

I feel that this is really about authentic engagement.  Later in the paper, it clarifies:

“…although a teacher can create a demand for knowledge by creating an exam that requires students to recite a certain body of knowledge, that would not constitute a natural use of the knowledge for the purposes of creating an intrinsic motivation to learn”  (Edelson, 2001, p. 375)

I like that he emphasizes that “academic threats” or extrinsic motivation are not authentic engagement.  I think we failed here in our bridge project.  Although many of the students got into the building and testing, we spent zero time considering whether this project was relevant to students or how they experience their environment.  I chose the project because I do back-country travel, and I like bridges.  In other words, it was relevant to me.  In future, I would like to be more deliberate in our choice of projects, or find some way to involve students in the selection process.

3.  Knowledge is retrieved based on contextual cues, or “indices”.

This is called situated learning elsewhere in the literature (NLG, 1996).  For those of you who teach physics and mathematics, you’ll know that the analysis of bridge structures is about as situated as trigonometry and “static equilibrium” can get.  We did test to see if students could recognize contextual cues and transfer this knowledge to similar structures, like bicycle frames and chairs.  The results were so-so, and we discussed that as colleagues.  Perhaps we need to include more transfer exercises or reflections that ask students to place bridge analysis in a larger context, something Garcia & Morrell (2013) call “Guided Reflexivity”, and Gee (2007) calls “Critical Learning”.   We didn’t do much in the way of meta-cognition at that point in the year.

4.  To apply declarative knowledge, an individual must have procedural knowledge.

I had trouble with this tenet.  Isn’t this just a repeat of principle 1?  Since our students are working in groups in a constructivist model, the development of common vocabulary and declarative knowledge is fully necessary to communicate, or the project doesn’t move forward (which sometimes happens and requires intervention).  The act of design and successful iteration is the application of “procedural knowledge,” which has declarative knowledge embedded.  Maybe I’m missing something here.

Overall, I feel like LfU is just a merger of constructivism and basic cognitive learning theories.  My school’s program and projects would benefit a lot by being more purposeful about authentic engagement and helping students see their project as part of a larger domain of related problems.

Edelson, D.C. (2001). Learning-for-use: A framework for the design of technology-supported inquiry activities. Journal of Research in Science Teaching, 38(3), 355-385.

Garcia, A., & Morrell, E. (2013). City youth and the pedagogy of participatory media. Learning, Media and Technology, 38(2), 123-127. http://dx.doi.org/10.1080/17439884.2013.782040

Gee, J. (2007). Semiotic domains: Is playing video games a “waste of time”? In What video games have to teach us about learning and literacy (pp. 17-45). New York: Palgrave Macmillan.

New London Group. (1996). A pedagogy of multiliteracies: Designing social futures. Harvard Educational Review, 66(1), 60-92.

The Hot Box

I explored and customized the WISE project Solar Radiation and Solar Ovens.  This is a retrospective activity for me, because our class did a Solar Thermal Heating project this year that uses the same concepts in a different use case.  I wish I had known about the WISE project database; the interactives make excellent use of simulations, visualizations, and feedback!  I customized the WISE project to fit our Solar Thermal Heating unit.  We used a constructivist approach.
We started by watching a short video of a Navajo girl who grew up with her grandparents off the grid in rural Arizona.  They use wood for heating in the winter, and as an engineering student she wanted to do something to help her family, so she designed, built, and installed a passive solar thermal unit on their adobe house.  The kids found it really inspiring and strongly connected with the real-world application of science, which is highly motivating (Fernandes et al., 2014).  The girls in the class also thought it was pretty cool that the girl in the video was an engineering rock star.  Our goal to build a solar thermal unit was clear, and the students were all pretty certain that they were up to the task.  Hattie and Timperley (2007) suggest that this is key to reducing the gap in affective processes, like effort and engagement.
In small groups, students shared what they thought engineering was as a career.  Later in the week, a City of Vancouver engineer gave us a tour of the solar thermal heating units at the local swimming pool.  He also told them about his career as an engineer.  Students wrote a reflective paragraph on their perception of engineering as a career.  Although I did not see this at the time, this is one example of Scaffolded Knowledge Integration in action.  Many of the WISE project slides mirror what we did, but in a much slicker way.

In groups of three, students researched, designed, presented, and built solar thermal units.  This makes thinking visible, makes science accessible, and helps students learn from one another: three of the four tenets of Linn et al. (2003).  Team members agreed on group roles and responsibilities.  They were responsible for designing a test for the units, and there were many misconceptions about the difference between heat and temperature.  At this point, and many others, the instructors (including myself) were responsible for providing cognitive feedback cues (Hattie & Timperley, 2007) that addressed the faulty interpretations.  Eventually, they collaboratively chose to measure the temperature difference between intake and outtake air.

Once the basic units had been tested, they went back to the books to see if they could design additional efficiencies.  Some chose to silver the outside (for radiation losses), other teams built reflecting “wings” to capture more incident radiation, and one group installed a fan to increase the flow of air through the unit.  This “Beta Model” iteration design is key for students to learn to critique, compare, revise, and rethink (Linn et al., 2003).
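The intake/outtake test the students converged on also yields a genuinely useful number: airflow times the specific heat of air times the temperature rise gives the collector’s thermal power.  A sketch with assumed flow and temperature values:

```python
# Thermal power delivered by a solar air heater, from intake/outtake data.
# The flow rate and temperatures are assumed example values.
rho_air = 1.2              # kg/m^3
c_p = 1005.0               # specific heat of air (J/(kg*K))
flow = 0.01                # volumetric airflow (m^3/s), assumed
T_in, T_out = 18.0, 35.0   # measured intake/outtake temperatures (deg C)

mass_flow = rho_air * flow                 # kg/s
power = mass_flow * c_p * (T_out - T_in)   # watts
print(f"Collector output: {power:.0f} W")
```

A calculation like this would let teams compare their “Beta Model” improvements in watts rather than by eye.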

Fernandes, S., Mesquita, D., Flores, M. A., & Lima, R. M. (2014). Engaging students in learning: findings from a study of project-led education. European Journal of Engineering Education, 39(1), 55-67.

Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77(1), 81-112.

Linn, M., Clark, D., & Slotta, J. (2003). WISE design for knowledge integration. Science Education, 87(4), 517-538.

Consider a block of mass m at rest on an inclined plane…

The Jasper Experiment is responding to what I might call the “Block On An Incline” issue:

It is a classic Physics 12 lecture in which students develop an algorithm for analyzing the possible motion of a block placed on a sloped surface.  The analysis is completely canned and stripped of any context, but involves an impressive collage of math and reasoning skills.  It is considered a traditional pinnacle of achievement to solve these problems in the study of dynamics.  The group at Vanderbilt points out that skills or tools transmitted by a teacher in the absence of context or discussion are “inert knowledge” (CTGV, 1992a, p. 67).  I completely agree.
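For readers outside physics, the whole “algorithm” reduces to a static check followed by one kinetic case.  With incline angle $\theta$ and friction coefficients $\mu_s$ and $\mu_k$, the block stays at rest if

$$\tan\theta \le \mu_s,$$

and otherwise slides with acceleration

$$a = g(\sin\theta - \mu_k \cos\theta).$$

Elegant, but as traditionally taught it arrives with no context at all.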

Their main idea is effectively:

CONSTRUCTIVIST + SOCIAL + COMPLEX PROBLEMS = DEEPER UNDERSTANDING

Authentic, complex problems, they argue, are key because the act of exploring the solution space of a problem (e.g., What is possible?  Can I estimate values?) is a more relevant ability than memorizing algorithms.  In a related study of complex problem solving, Vye et al. (1997) note that students from traditional classrooms are good at calculating things but pretty weak problem solvers.  Collaborative work, they find, has the potential to improve the quality of problem solving.  Effectively, if the effort is focused and roles are understood, groups come up with much better solutions to problems than individuals.

Over a decade later, Park and Park (2012) worry that the complex and open form of problem- and project-based learning allows students to spend too much time on failed ideas.  This lost time means fewer topics are covered, and students leave with a thin toolkit of knowledge.  Their fix is to structure the problem solving to cut out the failures, which is ultimately a return to algorithms disguised as real-world problems.  This recommendation doesn’t deal with the original Jasper issue: students have trouble identifying which sub-skills are required to solve a problem because they lack exposure to these types of questions.

The contemporary videos in this question set (cf. Khan Academy, etc.) are not Jasper-type videos.  Instead, they form a fantastic repository of guided practice for the many specific sub-skills that may pop up, while teachers get on with helping students learn how to select the right tools for the job.  I use them frequently.

Cognition and Technology Group at Vanderbilt (1992a). The Jasper experiment: An exploration of issues in learning and instructional design. Educational Technology, Research and Development, 40(1), 65-80.

Vye, N. J., Goldman, S. R., Voss, J. F., Hmelo, C., & Williams, S. (1997). Complex mathematical problem solving by individuals and dyads. Cognition and Instruction, 15(4), 435-450.

Park, K., & Park, S. (2012). Development of professional engineers' authentic contexts in blended learning environments. British Journal of Educational Technology, 43(1), E14-E18.

Billy’s Bike Speedometer

I found the concept of PCK closely related to James Gee’s (2007) concept of “semiotic domains,” which deals with groups or concepts that are hard to understand from the outside.  Consider the analogy of being introduced to hockey.  It is not enough to read and learn the rules of hockey.  Nor is it enough for an outside party to be really good at explaining it.  To truly understand, you must also watch and be invited to play the game (at whatever level), observe the history, the rivalries, etc.  You must get involved and become a “tribe member” at a level that transcends mere anthropological observation.  The same idea can be applied to jazz music, physics, gourmet cooking…whatever.  So, in some sense, PCK is about knowing the best way to induct new tribe members.  The addition of technology to the mix, TPACK, is PCK enacted with technology, without falling into the trap of “tech for the sake of tech” or the limitations of “enhancement only” in Sarah’s SAMR model from our Design of TELEs posts.

Here is a recent example of PCK from my own practice.  In one of my courses, we do PBL all year.  Billy chose to design and make a bike speedometer because he “always wanted to know how fast he was going, but couldn’t afford a speedometer.”  We knew enough to first teach him how to code and use Arduinos (simple computer boards), as well as basic algebra.  Then, over three weeks, he delved on his own into the content for calculating speed and how magnetic switches work, and came up with this.
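The core of what he worked out reduces to one wheel circumference per reed-switch pulse.  A sketch of that arithmetic (the wheel size and pulse interval are made-up example values, and this is my illustration, not his actual code):

```python
import math

# Speed from a magnetic (reed) switch that closes once per wheel revolution.
# Wheel diameter and pulse interval are made-up example values.
wheel_diameter = 0.66   # m (roughly a 26-inch wheel)
pulse_interval = 0.35   # s between successive switch closures

circumference = math.pi * wheel_diameter
speed = circumference / pulse_interval
print(f"{speed:.1f} m/s = {speed * 3.6:.1f} km/h")
```

On the Arduino, the same idea runs off the timestamps between switch closures.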
We knew enough to create the conditions in which he could follow his passion to learn more about technology, math, and physics.  Through these experiences, the core of my practice is evolving into “how can I best guide students to become tribe members?”

Gee, J. (2007). Semiotic domains: Is playing video games a “waste of time”? In What video games have to teach us about learning and literacy (pp. 17-45). New York: Palgrave Macmillan.
Mishra, P., & Koehler, M. (2006). Technological pedagogical content knowledge: A framework for teacher knowledge. The Teachers College Record, 108(6), 1017-1054.