Author Archives: jan lewis

Exploring Density and Buoyancy with T-GEM Cycles in the Elementary Classroom

It’s been a few years since I’ve taught either Science or Math, and that was to Grades 1-3 most recently, but as I read about GEM and T-GEM I was intrigued and wondered how this model could be applied to primary science. The challenging concept I selected was that of buoyancy and its relationship with gravity/mass/density. In Grade 3, students in Ontario study material forces. I recall classes having a hard time understanding why the buoyant force allowed certain objects to seemingly overcome the force of gravity but not others. The concepts of mass, volume, and density are not solidified at this stage so explanations or even demonstrations were not usually deeply understood. If I had an opportunity to teach this Science unit again using T-GEM cycles it might look something like this (over a series of classes, I’m sure):

 

Compile Information – I would begin by showing students a data table for five blocks of mystery objects from the PhET Density & Buoyancy simulation and demonstrating how to read a two-column chart. Then we would briefly discuss what students know about these objects, the abbreviations (what do L and kg stand for?), and the numbers beside them (making a connection to money when reading decimals: are these numbers placed in any particular order?). Add keywords to a word wall: litres (L), kilograms (kg).

Source: https://phet.colorado.edu/sims/density-and-buoyancy/density_en.html

 

GEM Cycle 1:

Generate – First, I would ask students to find trends or generate some relationship statements about the data in groups and then share with the class. For example, “The water has the smallest number but the gold has the largest number” or “I wonder why gasoline is smaller than water? Aren’t they both liquids?” Other questions about the nature of the data that the teacher might guide discussion toward include: “What might ‘density’ be measuring?” and “Why is the pool measured as 100 L but the scale measuring 0 kg right now?”

Then, I would direct their attention to the cubes and ask them to explain what they see: each shape is a cube, they’re five different sizes, five different colours, labelled with five different letters. I would ask them to put the cubes in order in two ways (letter label and size) and then to predict what the scale would read if I were to measure each cube. (I would deliberately not use the term “weigh” or introduce the term “mass” at this point.) I would also ask them to predict whether they think any of the numbers on the data table might appear on the kg scale and whether/how the 100 L measurement of the pool might change.

Then, I would ask students to predict which cube would measure the highest number when I place it on the kilogram scale. (They will likely say the largest cube will be highest and the smallest lowest, and I will add the word “size” to our word wall.) I would ask them to write a rule explaining what they think the relationship is between a cube’s size and how many kilograms it measures.

 

Evaluate – Now, I would begin the simulation by placing each cube on the scale and have students (or a student scribe for the class) record the data in a new table:

Block Label | Size (1–5, 5 = largest) | Kilograms (kg)

I’m deliberately constraining them at this point by doing this part of the simulation as a demonstration, to keep them from dropping the blocks into the pool or changing any of the other variables. I’d ask them to reflect on what they saw: Were your predictions completely correct? How can you explain this? I’d also ask them to notice how the L measurement changes, measure the block on the kg scale again, and compare these numbers…

 

Modify – Finally, I’d ask the students if their original rule needs to be changed now that they’ve seen the measurements, and have their groups try to make a new rule explaining why each block received the measurement it did, since it can’t be because of its size.

 

GEM Cycle 2:

Generate – To start the second cycle, I would ask students to predict what will happen when each block is dropped into the pool and explain their thinking. If they say the same thing will happen to all five blocks (i.e., all sink or all float), I would not correct that at this point. I would ask them to draw what would happen in a two-column drawing, one side for “prediction” and the other for “actual”. I’d again ask them in their groups to write a “rule” for predicting what will happen to a block when it’s dropped into a pool of water.

 

Evaluate – Now, the students would stay in their groups and load the simulation themselves using the Mystery button and experimenting with dropping the blocks into the water, and drawing what actually happened beside their predictions. I’d ask them to reflect on what they saw: Were your predictions completely correct? How can you explain this?

 

Modify – I’d again ask the students if this original rule needs to be changed now that they’ve run the simulation, and have their groups try to make a new rule explaining why each block behaved as it did when dropped into the pool and write that down. I’d ask them to record the kg of each block on the “actual” side of their diagrams and to consider, as they make their new rule, whether this number might be connected to what they observed in the simulation.

 

GEM Cycle 3:

Generate – At this point, I would ask students to predict what will happen when each block is dropped into the pool if the simulation is changed so that all blocks have at least one thing the same (if they all measured the same kilograms, for example). I’d again ask them in their groups to write three “rules” for predicting what will happen to blocks that are all the same (a) mass, (b) volume, and (c) density when they’re dropped into a pool of water (it doesn’t matter that they don’t yet know what these terms mean; this is part of the discovery).

 

Evaluate – Then, the students would stay in their groups, load the simulation themselves, use the “Same” buttons to set the blocks to the same value, and continue experimenting with dropping the blocks into the water. I’d ask them to reflect on what they saw and evaluate their predicted rules. I’d assign one group member the role of keeping a running record of “things we want to know/don’t understand” and another the role of recorder, writing or drawing the group’s predictions, rules, and actual results. At the conclusion of this part of the simulations, I’d ask them to come up with working definitions of the terms “mass”, “volume”, and “density” that they’ve been observing.
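The working definitions the groups are converging on boil down to a single relationship: density is mass divided by volume. A minimal sketch of that relationship, using hypothetical block values (not the actual PhET simulation data), might look like this:

```python
# Two hypothetical blocks with the SAME mass but different volumes:
# the relationship students are discovering is density = mass / volume.
blocks = {
    "A": {"mass_kg": 2.0, "volume_l": 5.0},  # big, light block
    "B": {"mass_kg": 2.0, "volume_l": 1.0},  # small, dense block
}

for label, b in blocks.items():
    density = b["mass_kg"] / b["volume_l"]  # kg/L
    print(f"Block {label}: density = {density:.2f} kg/L")
```

Same mass, very different densities, which is exactly why size alone couldn't explain the scale readings in Cycle 1.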

 

Modify – I’d again ask the students if their original rules need to be changed now that they’ve run the simulations, and have their groups try to make new rules explaining why each block behaved as it did. I’d challenge them to incorporate their definitions of “mass” and “density” into their explanations.

 

GEM Cycle 4:

Generate – Finally, I would ask students to think about four familiar materials that come in blocks: ice, metal, wood, and Styrofoam. I would ask them to explain what would happen if I threw a block of each material, all with the same mass, into the pool. (I might bring in these objects and a bowl to help them imagine.) I would then return to the Mystery simulation screen and reveal to them that the blocks in this simulation are each made of a different mystery material, just like my example objects. I would ask them to generate a rule using material names for what would happen when we dropped those materials into a pool.

 

Evaluate – Now, the students would stay in their groups and load the simulation, but go to the Custom button. I’d ask them to experiment with the different materials, their masses, and their volumes, and to explore/record/discuss what happens.

 

Modify – I’d ask the groups to use this information to try to identify the material of each mystery block, using the data from all the simulation tabs. As an extension, I’d introduce the Buoyancy Simulation (Intro tab only) and allow them to gather information from those materials to inform their hypotheses. We’d discuss the concept of weight (N) versus mass (kg) at this point, while formally introducing the topic of the buoyant force. Finally, we’d return to the data table from the first cycle and ask students to explain what it means that wood has a density of 0.40 kg/L while lead and gold (both metals) have a much higher density. What would happen when we drop wood into the pool versus metal? Then, in the Buoyancy Playground tab of that simulation, they’d compare materials such as Styrofoam, wood, or metal and craft a final truth statement about gravity (N or kg) and density, as well as density and buoyancy.
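The "final truth statement" the students are working toward amounts to a simple comparison rule: a block floats when its density is less than the fluid's density. A minimal sketch, using rounded reference densities for these materials (only wood's 0.40 kg/L comes from the data table discussed above; the rest are approximate textbook values, not PhET data):

```python
# Sink-or-float rule: a block floats if its density is below the fluid's.
WATER_DENSITY = 1.00  # kg/L

# Approximate material densities in kg/L (rounded reference values).
materials = {
    "styrofoam": 0.15,
    "wood": 0.40,
    "ice": 0.92,
    "aluminum": 2.70,
    "gold": 19.3,
}

for name, density in materials.items():
    verdict = "floats" if density < WATER_DENSITY else "sinks"
    print(f"{name}: {density} kg/L -> {verdict}")
```

The same comparison explains the mass-versus-density distinction: a one-kilogram wood block and a one-kilogram gold block behave oppositely in water because the rule depends on density, not mass.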

 

Extension/Next Steps – I’d draw their attention to the bottom of both tabs in the Buoyancy Simulation, where the density of the fluid within the pool can be changed, and have them observe what happens to the blocks when the fluid is converted to “air”, “gasoline”, “olive oil”, “water”, or “honey”.

 

Have you used GEM cycles in Primary Science or Math in your practice?  I’d be interested in how that went in terms of student understanding, management, and time.

Theoretical Framework & Affordances of Anchored Instruction

I chose the second question: “What is the theoretical framework underpinning the development of the Jasper series? What kind of teaching and learning activities do the materials support and what is the role of technology? In your view, what are the potential cognitive and social affordances of the technology; in other words, how can video technology enhance learning? What are these affordances for students with learning challenges or learning issues in math? Take a look now at Math Pursuits at the University of Cincinnati and their “Classroom Connections” Video Clips. In what ways do the videos in Classroom Connections and the support materials on the site exemplify these affordances?”

 

The theoretical framework underpinning the development of the Jasper series is constructivist and founded on the “anchored instruction” approach to instructional design. The CTGV appears to be promoting a problem-based learning (PBL) model of student-centered mathematics discovery; however, they also accommodate two other models of instruction that embrace teacher-centered controls to varying degrees (CTGV, 1992a). Their guiding paradigm “emphasize[s] the need to rethink the goals of education and the assumptions about learning that underlie many curricula and teaching practices” (CTGV, 1992a, p.66). The goal of teaching is not seen as the improvement of test scores; on the contrary, they claim “[t]ests serve to define the goals of one’s instruction” (CTGV, 1992a, p.66). The math assessments were traditional pencil-and-paper. To this end, Jasper can be considered “technology-based” only in the delivery of the complex and engaging narrative-based problems, and it allows teachers a great deal of leeway in determining the depth and approaches students will take when exploring these mathematical issues. It is interesting that CTGV (1992b) noted that “our Jasper teachers and students hated our pencil-and-paper assessment instruments” (p.309), and that after identifying a need for formative assessment as an indicator of increased time with the problems, CTGV selected teleconferencing. This suggests that the video-based narrative, complex, authentic problems, and highly interactive group-based solutions organically suggested a richer style of assessment than was originally provided. The technological advances of today would have served CTGV well to that end.

Jasper is grounded in the pedagogical philosophy of “generative learning” as opposed to “inert knowledge” (CTGV, 1992a, p.67), and its designers don’t focus on computational skills or pre-teaching the foundational concepts so much as on attempting to help students “learn to become independent thinkers and learners rather than simply become able to perform basic computations and retrieve simple knowledge facts [and]…identify and define issues and problems on their own rather than simply respond to problems that others have posed” (CTGV, 1992a, p.66). They borrow from the concept of “apprenticeship learning” by “situating instruction in meaningful problem-solving contexts…and enable them to understand the kinds of problems and opportunities that experts in various areas encounter and the knowledge that these experts use as tools” (CTGV, 1992a, p.67). They assert that by allowing students to self-generate information they will retain more; however, they note that self-generation causes “considerable interference if the information that is generated is incorrect” (CTGV, 1992a, p.68).

They specifically reference Gibson’s (1977) concept of “affordances” and point out that the video-based narrative nature of the adventure affords the posing of complex, authentic, and open-ended mathematical problems and the much deeper cognitive demands that solving such problems requires (CTGV, 1992a). They’ve also embedded enrichment options for students that include “what if” thinking and connections from Math across the curriculum and into the outside world. This allows teachers to provide additional opportunities for students with gifted IEP designations, as well as affording inquiry-based learning that isn’t strictly scripted to the plot sequence of the original mathematics adventure. The problems are complex enough that they’ve been designed to facilitate group solutions and discussions, which is especially useful for children with learning difficulties. Socially, the narrative video allows students to engage at a much richer level, both with the content and with each other, as they consider and discuss any number of aspects of the story and the problems posed.

This style of narrative-based instruction reminds me of the book series Science Adventures (2015) by Richard and Louise Spilsbury which embeds hands-on experiments within a real-world context woven through by an engaging narrative.

These books don’t afford generative learning, however, because their problems are not open-ended enough, though it could be argued that the use of narrative in book format would afford even greater cross-curricular opportunities, such as students making their own videos of the narrative or creating something similar to present other science topics to an audience of their peers. Similarly, the videos in Classroom Connections (2018) make use of the engaging video vignette to pose a problem that has cross-curricular and real-world connections. The problems encourage learners to discuss the problem, and the handouts provided as PDFs on each video page offer scaffolding for the group-work and task-thinking processes. These videos do not meet the level of complexity that the Jasper series affords, however, and the vignette length eliminates the power of narrative to engage and motivate students. There is no sense of adventure or continuity because these are not stories; this makes the problems seem more like a video version of a textbook question than an authentic mathematical need.


References

Cognition and Technology Group at Vanderbilt. (1992a). The Jasper experiment: An exploration of issues in learning and instructional design. Educational Technology Research and Development, 40(1), 65-80. Retrieved from http://www.jstor.org.ezproxy.library.ubc.ca/stable/30219998

Cognition and Technology Group at Vanderbilt. (1992b). The Jasper series as an example of anchored instruction: Theory, program, description, and assessment data. Educational Psychologist, 27(3), 291-315. Retrieved from http://web.a.ebscohost.com.ezproxy.library.ubc.ca/ehost/pdfviewer/pdfviewer?vid=1&sid=ec15d6d4-9b0b-4b0e-9e24-ee5b2c6a30cd%40sessionmgr4010

TPCK, SAMR & Backwards Design

As I was reflecting on the concepts of PCK and TPCK this week, I was reminded of the importance of planning as part of sound PK, whether with an additional C or T or just in general. Specifically, the idea of “backwards design”, or “assessment up” planning, where teachers start with the end in mind, usually the assessment of specific curriculum expectations to be demonstrated in a certain way, and then work back from that end point to discover the nitty gritty of content delivery that will best position students to be successful in their eventual learning demonstration.

I believe this is a foundational practice for teachers who produce successful learners and who hope to use a technology integration framework, such as TPCK, or analyze their current and future integration using the SAMR lens, effectively.  As with all things that get suggested for teachers to do or add or implement for their practice, however, it becomes a question of how does one start?  There is a lack of time and a sense of wasting one’s time “re-inventing the wheel” that goes unaddressed.  Having some guidance already prepared to assist teachers in their backwards planning with TPCK goes a long way, in my opinion.  Like others mentioned in their posts, this is not my first exposure to TPACK.  A 2009 paper by Harris, Mishra & Koehler included the importance of adding what they’ve termed “Learning Activity Types” to the toolkit of the teacher using TPCK to backwards plan lessons that provide higher level/rich-technology integration. It’s not enough just to know what we want to teach and try to fit technology in as an afterthought.  As Mishra and Koehler (2006) mentioned, “Merely introducing technology into the educational process is not enough” (p.1018).  These authors go further in their 2009 paper and state, “effective teaching requires knowledge of both the activity types that are appropriate for teaching specific content and the manners in which particular technologies can be utilized as part of the lesson, project, or unit design” (p.406, emphasis added).  This, to me, sounds like the essence of backwards design planning.

I once took an ISTE Schoology Course (cleverly named iPadeology) which had specific resource pages about TPCK, SAMR, differentiation with technology, and various instructional models, including STEM. They provided two very useful forms (licensed under Creative Commons for our use) that I’d saved and wanted to share in light of this week’s topic.  I hope you find them useful:

On a personal professional note, I’ve recently found myself applying some PK in my creation of group structures for our school’s newly formed Minecraft: Education Edition STEAM Club. This year’s club challenge is building our school, to scale, where 1 metre = 1 Minecraft block. There are many different areas in our school to be measured, graphed, and built, and I’ve spent the last week creating multi-grade zone crews to oversee each area. As I finished, I suddenly recalled something I’d read in a previous MET course about the effects of gender on gaming technology behaviour. The gist of the relevant findings was that when mixed-gender groups were given the chance to play a digital game, the girls consistently back-seated themselves while the boys took over. However, if placed in same-gender groupings, the girls often excelled, taking risks, learning socially from each other, and expressing a greater sense of accomplishment, satisfaction, and enjoyment with the game and with themselves as users of technology. I think this relates to the Math confidence piece Christopher brought up in the news article he posted in announcements this week, as well. I wanted to give the girls just as much opportunity to problem solve, get “mathy”, and create with this challenge as the boys, and I believe my initial grouping based on “fairness” was actually about to work against that. Therefore, I revamped my entire crew list to reflect this research-enhanced PK and will hopefully provide an equally fun and fulfilling learning experience for both boys and girls. If I had actively been using a framework guide for TPCK such as the one provided above, I wonder if I would have surfaced the memory of that research before I started the planning, rather than having it come up as a sort of pedagogically sound coincidence? It certainly would have saved me time!

References

Harris, J., Mishra, P., & Koehler, M. (2009). Teachers’ Technological Pedagogical Content Knowledge and Learning Activity Types. Journal of Research on Technology in Education, 41(4), 393–416. Retrieved from ERIC database. (EJ844273)

Mishra, P., & Koehler, M. (2006). Technological pedagogical content knowledge: A framework for teacher knowledge. The Teachers College Record, 108(6), 1017-1054. Retrieved from http://one2oneheights.pbworks.com/f/MISHRA_PUNYA.pdf

Ideal Definitions & Reality-Based Designs

My most appealing technology metaphors from the list are Jonassen’s idea of technology being something students learn with rather than from.  His concept of “Mindtools” as a category of technology which supports this meaning-making truly resonated with me.  My unspoken definition of “technology” places an emphasis on the rich potential of the tool to enhance student exploration, creation, communication, and collaboration in our perpetual quest for deeply connected “learning”.  Dede’s notion that technology includes interactive media which are “tools in service of richer curricula…” accurately describes many of the examples I would have chosen to illustrate what pedagogically-minded technologies might look like.  

Ideally, the designers of the learning experience should also be the deliverers of those very same experiences, so that their TELEs can be crafted to cater to the unique (and changing!) affordances and constraints of a real group of children in a real place, rather than a one-size-fits-all plan or program.  When otherwise, designers should be required to pilot their designs as the only adult in actual classrooms from several different socioeconomic neighbourhoods before marketing their products or handing down their initiatives.  In other words, how would I design a TELE?  It depends!

Competitive Play, Privacy and Time+Expertise

For this activity, I interviewed a colleague teaching at a K-8 public school in Ontario with a population of roughly 300 students. “Kathryn” has been teaching in public elementary education for 12 years and is a Drama and Music major who currently teaches 26 students in a 6/7 homeroom residing in one of only three closed classrooms in the building. She has been teaching sixth grade, either as a straight grade or as a split, for four years at this location. She teaches all subjects, including the STEM subjects, except for French, PhysEd, and Visual Arts. As a Grade 6 teacher, she has a particular focus on math because in Ontario sixth grade is a testing year for the province-wide standardized test. The school also has a school-wide improvement plan with improving overall math scores as a goal for this year. The interview was conducted face-to-face on January 24, 2018, in a closed room over the lunch break. Audio was recorded onto the interviewer’s phone.

While transcribing my interview with Kathryn three themes or topics wove a common thread through her responses: Competitive Play, Privacy, and Time+Expertise.

 

Competitive Play

Kathryn repeatedly used words related to the power of technology, particularly games, to connect, motivate, and engage students. She describes herself as “enthusiastic” towards technology integration and measures whether her views of technology have changed over the course of her 12-year career in terms of her enthusiasm and the positive feelings she receives from student responses. She identifies her favourite apps and the typical uses of technology in her teaching by relating them to how well students “connect” to a screen versus a textbook, both for literary understanding and for engagement, and how “positive” her students are about getting to play. She mentions the use of “competition” as a motivational tool and identifies several apps that her students respond positively to because of the competition created, referencing things like “it can be played like a race” and “they can see their results real-time”, and noting that they compete to get them posted first. She also uses “house leagues” in her classroom to manage her students and ties the competitive aspects of certain favourite apps to these house league points and monthly celebrations for the winning team.

 

Privacy

Another theme that was repeated throughout the interview in one form or another was the value Kathryn perceived her students placed on technologies which allowed them to play and participate in real time but to do so anonymously if they chose. Kathryn’s class is a blend of introverted and gregarious preteen boys and girls, some with persistent behaviour challenges stemming from underlying anxiety disorders. The option for students to engage with or without broadcasting their identity to their peers was mentioned positively more than once.  She identified her wish that the school provided the money such that every student could have their own device “that doesn’t come from home so we could have more control over it” and that would then allow her to “really differentiate instruction but in a private way” [emphasis in original].  It’s clear that Kathryn respects the sensitivities of her students and finds value in technological tools that allow their wish for privacy and teacher-student confidentiality to be maintained even while engagement and differentiation are being leveraged.

 

Time+Expertise

By far the biggest theme weaving through my discussion with Kathryn was that of time. Technology examples which she used and praised almost always carried qualifiers such as “efficient”, “productive”, “that I didn’t have to make”, “saves time”, “not reinventing the wheel”, allows the class to “get through [content] more quickly”, “doesn’t take as much [class] time”. The problems she identified with technology were the converse, described in terms such as how they “hold us back”, “I don’t have the time to sit down and search for what I need”, “takes me a while”. When asked whether she felt she needed to be an expert to integrate technology into STEM subjects, she answered immediately, “No, I need time. I don’t need expertise, I need time.” However, I noticed that when asked to expand on this, her example connected the amount of time it takes to troubleshoot an app she is unfamiliar with to the deficit in time and the slowdown of productive work in her classroom. During transcription, I made the following note: Although K said she did not need to be an expert, her example of time is actually built on the inference that a lack of expertise on her part, in how to model, troubleshoot, and train her students amid changing updates, was the reason that time became the issue in the first place.

This dual theme of expertise having an impact on time arose again at the end of the interview, when questions dealt with what she felt her school could do to better support teachers using technology in her building and with the biggest hindrance in supporting elementary students in STEM learning. She identified the lack of content expertise (“specialized instruction”) in elementary teachers of STEM and the lack of well-trained, available support teachers for technology assistance and training for the rest of the staff. She stated that for technology use in STEM or any subject to support student learning in greater measures in more classrooms, schools needed to “devise a system that doesn’t make one person responsible for its success. We need more staff for those types of things and more training.” Lack of money for these human resources and technologies was identified as the root of these deficiencies. The underlying themes remain, however: Kathryn feels that teachers in general are not given adequate access to technology for educational purposes and/or made sufficiently expert in it. This lack of expertise increases the amount of time individual teachers must invest to teach themselves technology, and it simultaneously decreases the amount of class time available for efficient, productive work, since the teacher feels ill-equipped both to train her students in its use and to troubleshoot any issues that arise.

CiteULike links inactive?

Anyone else having issues accessing articles from the CiteULike folders? So far, one of the articles of interest from Folder 2 and all four of the articles of interest I tried to open from Folder 3 have returned various errors, from saying access is forbidden, to the page not being found, to bringing me to the organization’s homepage with no record of the article in question. And every article of interest I’ve clicked on from Folder 7, regardless of the Full Text link option I choose, brings me to a periodical service provider and suggests I purchase the article to view more than the abstract and a couple of pages.

Chris, is this the expectation for these resources or is there a trick to accessing this data that I just don’t know?

Technology in the Elementary Class: Video Case 5 and 8

Summary of the issues raised through the two elementary school case videos:

I found it interesting that of the two case videos I watched, the only teacher who is really deliberately embracing technology has also embraced Project Based Learning and the constructivist “chaos” that goes along with that kind of learning. This teacher was seasoned enough to be comfortable with this style of teaching and the loss of control such a style entails. Even the new teachers, who are closer to or within the generation that is much more technically skilled, are not comfortable with using technology in education as a given, but see it as an alternative, a wish, or a maybe-if (fill in the blanks with things such as “students are the appropriate age”, “students already know the content”, “I have enough support”, “I have extra time”).

The main issues that arose through both these cases at an elementary school level are:

  1. why and how technology is used (SAMR can guide this);
  2. the lack of teacher training/application/access/support, which directly contributes to teachers’ lack of comfort with the use of technology for their students; and,
  3. the limited amount of time in the face of the quantity of learning objectives teachers are expected to guide students through within a year.
SAMR EdTech InfoGraphic

Retrieved from: http://lingomedia.com/stages-of-edtech-the-samr-model-for-technology-integration/

For more specific details related to what I noticed about SAMR and the various interviewees’ views of technology, see below…

“Good” Digital Tech for STEM

  1. My “good” STEM tech wish-list…
  • What is a good use of digital technology in the math and science classroom?

I think good use of tech in Math and Science needs to be purposeful, either for the reinforcement of foundational skills and concepts, or the exploration of ideas and themes, or the expansion/presentation of student ideas and learning.  

  • What would such a learning experience and environment look like? What would be some characteristics of what it is and what it isn’t?

This means any program or app used should allow students to have their own account in which to save their progress and review it as needed. It should not be a “one size fits all” generic drill-and-kill. It should include aspects of game-based learning to inspire motivation, including a relevant or engaging narrative, rewarded accomplishments, and the ability for users to replay “levels” until they have successfully solved that area. The best tech will also have the ability to assess, or have input as a starting level, the content students are working with, connected to the curriculum. If used to reinforce or build on ideas, it will track and be responsive to whether students are choosing their answers correctly, in Math for example, and tailor the next questions to them, limiting (and rewarding!) the skills they appear to have already mastered and focusing on the ones they still need to develop. Finally, it will allow any on-screen text to be read aloud to students in a variety of languages, as well as giving them a place to make notes and share their progress or findings with others.

  • How might a learning experience with technology address a conceptual challenge, such as the one you researched in the last lesson?

In order to address a conceptual challenge, the technology must provide a narrative scenario that requires a concept-linked problem to be accurately solved.  It must therefore allow for verbal or textual responses rather than just number-punching or clicking pre-provided multiple choice answers.  Where misconceptions become evident through failed attempts to successfully answer or solve the problem, the tech must have embedded or linked video and audio content that relates to that misconception or reiterates the problem in a relevant way that encourages the student to rethink her position and reasoning based on questions or new information.
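The misconception-linked feedback described above could look something like the following sketch. The question ids, wrong answers, and remediation strings are all hypothetical placeholders; the point is only the structure: known misconceptions map to targeted content rather than a generic "incorrect".

```python
# Map of (question id, known misconception answer) -> targeted remediation.
# All entries here are invented examples, loosely themed on heat/temperature.
REMEDIATION = {
    ("heat-1", "the cold flows into the water"):
        "video: heat transfers into cold objects; 'cold' is not a substance",
    ("heat-1", "metal is colder than wood"):
        "video: conductors feel colder even at the same temperature",
}

def feedback(question_id, answer, correct_answer):
    """Return targeted remediation for known misconceptions, else a retry prompt."""
    if answer == correct_answer:
        return "correct"
    # A recognised misconception triggers content that reframes the problem
    # instead of simply marking the attempt wrong.
    return REMEDIATION.get(
        (question_id, answer),
        "prompt: re-read the scenario and try again",
    )
```

A real system would need far richer matching than exact-string lookup, but even this shape captures the idea that failed attempts should route students to content aimed at the specific misconception behind the error.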

2. Reflection and reality…

  • What makes this a good use of digital technology?

This would be a “good” use of digital technology because it enhances the thinking of the student, allows for differentiated content and responses, provides relevant and real-world feedback, encourages motivation, and allows for collaboration and sharing.

  • Is this a vision or is it possible in real classrooms? What makes this vision a challenge to implement and what might be needed to actualize it?

Honestly, this is probably a vision at this point.  There are some technologies that do some of these things.  IXL Math aligns to curriculum and provides elements of gamification.  Prodigy does an excellent job including narrative and game-based learning in its skill-and-drill design, and its video claims to use diagnostic tests, presumably to place students at the proper content level.  Platforms like Edmodo or ClassCraft allow for collaboration, rewards, and sharing of student-created content, which could easily include Math or Science lessons.  Furthermore, some BBC website simulations for beginning Science concepts (see here) do a good job letting students experiment with ideas, reading the text aloud to them, and providing a problem-based scenario to guide their explorations.  But to my knowledge there is nothing out there that meets everything on my wish-list.

This vision is a challenge because differentiated technology requires immense human planning and front-loading, not to mention 1:1 device access and reliable, high-speed Internet.  It also requires teachers who are on board with the idea, who will take the time to set things up and become somewhat comfortable with the tech and the interface itself, and who actually possess, or are willing to make, the time in the school day to promote at-home use or provide at-school use to introduce, train, troubleshoot, use, and follow up on this technology.  I don’t think we’re ready for something like this yet, on several different levels.

Constructivism’s Answer to Children’s Misconceptions in Science

(Disclaimer: It’s been a long time since I’ve taught Science and an even longer time since I’ve taught Math, so I found this activity challenging because I don’t really have anything to comment on that is relevant to my personal practice.  I tried to read the course article about Children’s Conceptions of Heat and Temperature because I remember teaching a Grade 7 unit related to that, but the UBC link is broken and I couldn’t find a version of the article online that didn’t cost money to read.  I did, however, read an article that references Erickson’s work: Children’s Ideas About Hot and Cold (Appleton, 1984).  The last Science I’ve taught was to Grades 1-3 students, so I really wanted to read Children’s Understandings of Science: Goldilocks and the Three Bears Revisited (McClelland & Krockover, 1996), which studied first grade students and their understanding of heat, and compare these ideas, but again there was no access available.)

 

From McClelland & Krockover’s abstract and introduction, however, I have found that many of the peculiarities of children’s understandings are echoed in the video and the other readings.  McClelland & Krockover (1996) found that students adopted misconceptions about scientific concepts based on their exposure to literature, in this case the fairytale of Goldilocks and the Three Bears.  This is consistent with the findings of others in this week’s readings that children are prone to make contradictory statements about the nature of scientific phenomena when those phenomena are presented in a different way, for example when temperature and temperature change are described qualitatively rather than quantitatively (Appleton, 1984).

Researchers also found that children often rely on sensory input even when it has been shown to be unreliable, as when a cold hand registers cool water as quite hot despite the actual temperature; children may choose to “live with the contradiction” rather than challenge their personally held conception (Appleton, 1984).  This reminded me of Vygotsky’s ZPD and of Piaget’s understanding of the symbiotic dance of learner and teacher in a child’s schema-construction, rather than the “tabula rasa” Shapiro (1988) references, which had been the guiding pedagogy of the 1960s-80s.  McClelland & Krockover (1996) supported Piaget’s and Vygotsky’s worldviews when they found that the first graders were able to change their conceptions when presented with activities that contradicted their previous beliefs.  This is similar to what Heather was able to do in the video when she reassessed what she believed to be the shape of the Earth’s orbit, and what Mark (Shapiro, 1988) was able to do when he connected previous lessons to the reflection of light from objects into our eyes, revising the hypothesis he had recorded before the unit began.

All the researchers I read made a strong case for the use of constructivist and constructionist practices in the Science classroom.  Shapiro (1988) did this when she pointed out the value placed on both hands-on and self-directed “experiments” by her research subjects; Appleton (1984) did this when he commented on the value of using relevant, accessible situations rather than abstract examples or those beyond the children’s experience; and McClelland & Krockover (1996) directly identified the present view “that learning is the result of the interaction between what children are taught or what they experience, and their current ideas or conceptions (Driver, 1981)” (p. 33), targeting the constructivist concepts of prior knowledge and social interaction as active meaning-makers in children’s understanding of scientific concepts.  All of this points, in my opinion, to the necessity of rethinking the amount of information teachers are expected to “cover” in each science unit.  A better alternative is a streamlined curriculum focusing on the topics most children hold misconceptions about within each strand of scientific thought, so that teachers can actively and deliberately tailor learning in a pattern of (1) misconception identification, (2) contradiction exposure, and (3) independent and guided exploration in order to construct more accurate understandings.

 

References

Appleton, K. (1984). Children’s ideas about hot and cold. Learning About Science Project (Primary), Working Paper No. 127. Retrieved from https://files.eric.ed.gov/fulltext/ED252407.pdf

McClelland, A. K., & Krockover, G. H. (1996). Children’s understandings of science: Goldilocks and the Three Bears revisited. Journal of Elementary Science Education, 8, 32. https://doi.org/10.1007/BF03173747

Shapiro, B. (1988). What children bring to light: Towards understanding what the primary school science learner is trying to do. Retrieved from https://files.eric.ed.gov/fulltext/ED309081.pdf

 

Chrome Getting Huffy About PDFs Anyone?

Hi all,

I’m working primarily from my Mac and generally prefer Chrome over the native Safari browser for course stuff, but lately when I attempt to open things (like PDF files of course readings, for example) I will more often than not get a black screen instead of the document. Then I have to copy the URL into Safari before I can read anything.  It’s definitely a first world problem LOL but annoying nonetheless.

Anyone else experiencing these issues and (more importantly) might know a way to fix the problem?

Thanks if you can help!

Jan