Student resistance to active learning in engineering: How can pedagogy and educational technology be used to engage students in post-secondary engineering education?

Background and Context

I support a first-year engineering design course with about 1000 students. In the first semester, the design process is taught by engaging students in a common problem: in teams, students translate the client’s problem into engineering terms and recommend solutions. In the second semester, students iterate through the design process again in new teams, each working with its own real client.

Given the nature of learning design, I think the course’s scaffolding and experiential elements are clever and well supported through project work and teamwork. Our lectures are taught by different members of our teaching team depending on the topic, and student teams sit at their assigned lecture tables. Each class is a mix of direct instruction, modelling, and facilitated activities. Outside of synthesis and discussion-based activities, we use student response systems like Mentimeter and Kahoot!. However, my impressions are inconsistent with the student feedback we receive. Perhaps because unhappy students are more vocal, I have heard many complaints that the course is too boring, too obvious, or too difficult to participate in.

As a non-engineer, I am not always familiar with the course content, so I sympathize with students who feel that they cannot participate because they do not know the material. The complaints make me wonder whether active learning is appropriate given low motivation, preparation, and engagement. I know that active learning can lead to better learning, but students require structure to orient and transition themselves to these non-traditional approaches. As a result, I want to know how to better organize and leverage our learning management system (LMS) with pre-lecture tasks to support our students and address their feeling of learning.

Modified Annotated Bibliography

Deslauriers, L., McCarty, L. S., Miller, K., Callaghan, K., & Kestin, G. (2019). Measuring actual learning versus feeling of learning in response to being actively engaged in the classroom. Proceedings of the National Academy of Sciences of the United States of America, 116(39), 19251-19257. doi:10.1073/pnas.1821936116

I really enjoyed this article. It used a great crossover design that compared passive and active instruction in an undergraduate physics course. Notably, the learning objectives and handouts were identical in both conditions, the intervention occurred in two consecutive classes, and it was taught by two instructors who had no prior interaction with the students. The experimental instructors and the authors of the test of learning were blinded: the instructors did not see the test of learning, and the test was created independently of the instructors, based on detailed learning objectives.

The finding that students learned more with active learning but perceived that they learned less was so interesting! The authors attribute this to the poor metacognition of novices, overestimation of learning during passive instruction, and unfamiliarity with active learning. Their suggestions of early assessment, clarifying the meaning of learning and cognitive load, and responding to student feedback are important. Active learning is challenging because it is an ongoing commitment between instructors and students: it requires engagement and motivation.

Saterbak, A., Volz, T., & Wettergreen, M. (2016). Implementing and assessing a flipped classroom model for first-year engineering design. Advances in Engineering Education, 5(3), 3.

This course is strikingly similar to the one I support. Although no statistically significant difference in learning was found between the experimental and control groups, I still think the flipped classroom was interesting. As the authors point out, design is already a heavily inquiry- and project-based topic. At the same time, it was difficult to control for confounding given that the pre/post tests were not administered under controlled conditions (i.e., they were take-home assignments), and I wonder whether the comparison to the second-semester iteration introduced confounding due to the students’ transition into engineering and post-secondary education. Regardless, a future iteration of this experiment could vary the assessments (e.g., problem statement, design, team dynamics) in addition to the project-management assessment through Gantt chart analysis. As an observer of the engineering course I support, I’ve noticed that it also emphasizes these concepts, and I wonder how active instruction may or may not affect learning gains or losses in these elements of design.

The authors took on a large commitment in creating resources (videos, quizzes, exercises) for their course. I wonder how they decided on this direction. It would also be interesting to survey whether students are using these resources, how and when they are using them, and what their perceptions are.

Kaewunruen, S. (2019). Enhancing railway engineering student engagement using interactive technology embedded with infotainment. Education Sciences, 9(2), 136. doi:10.3390/educsci9020136

Similar to Deslauriers et al. (2019), this study found that students learned more with active instruction, but they preferred more traditional methods and did not like the technology as much. Students preferred face-to-face interaction and networking. I really like that the author included a digital-native orientation survey. I didn’t expect this, and I think it’s an interesting way to gauge familiarity, comfort, and readiness with technology. In a future iteration of a similar study, I wonder if the researchers could look at the user experience of the learning management system: how users actually use it and what they expect. Eye-tracking paths and time spent on specific assigned tasks could be measured.

Conclusion

The emergent theme from these articles is that successful active learning requires:

  • addressing the benefits and perceived challenges of active learning,
  • using frequent and early assessment to help students regulate their behaviour,
  • scaffolding active lecture exercises with pre-lecture activities,
  • orienting and transitioning students to technology,
  • framing technology within the pedagogical goals.

In my own context, this might take the form of:

  • pre-lecture tasks to scaffold learning for in-lecture activities: the challenge with such a large course is assessing student learning frequently and providing immediate feedback. Pre-lecture tasks in the LMS could help address this. Currently, we assign unstructured readings, but I find that textbooks are often written for experts rather than novices. Embedding formative assessment within a reading task would provide immediate feedback to learners while increasing their engagement. The pre-lecture tasks could target specific items depending on the point in the term. For instance, the first pre-lecture task could be a syllabus reading and quiz, along with a reading and quiz on active learning and how learning works. In other cases, the pre-lecture task could include a reading (text, video, etc.) and a follow-up quiz, activity, or reflection. As Saterbak et al. (2016) suggest, these pre-lecture tasks work at the lower levels of Bloom’s taxonomy and scaffold learning for the more complex in-lecture exercises.
  • orientation and transition to technology: this is always a tricky process. I wonder if it would be useful for learners to complete a technology readiness assessment that gives a general readiness score and directs students to resources they can access to improve (a rough sketch of how such an assessment might be scored follows this list). A welcome video would also help orient students to the LMS and how it works. At a larger scale, a common course homepage would be helpful for students.
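To make the readiness idea concrete, here is a minimal sketch of how such a self-assessment might be scored and mapped to follow-up resources. None of this comes from the articles above: the item wording, the 1-5 rating scale, the threshold, and the resource names are all placeholder assumptions, written in Python purely for illustration.

    # Hypothetical sketch only: scoring a short technology readiness self-assessment.
    # Item wording, the 1-5 scale, the threshold, and resource names are assumptions.
    from dataclasses import dataclass

    @dataclass
    class ReadinessItem:
        prompt: str    # the self-assessment statement shown to the student
        rating: int    # student's self-rating on a 1-5 scale (5 = very comfortable)
        resource: str  # resource to suggest if the rating falls below the threshold

    def readiness_report(items, low_threshold=3):
        """Return an overall readiness score and the resources to suggest."""
        overall = sum(item.rating for item in items) / len(items)
        suggestions = [item.resource for item in items if item.rating < low_threshold]
        return {"overall_score": round(overall, 1), "suggested_resources": suggestions}

    # Example with made-up items:
    survey = [
        ReadinessItem("I can find course announcements in the LMS.", 4, "LMS welcome video"),
        ReadinessItem("I have used a student response system before.", 2, "Mentimeter quick-start guide"),
        ReadinessItem("I know where pre-lecture tasks are posted.", 3, "Course homepage tour"),
    ]
    print(readiness_report(survey))
    # -> {'overall_score': 3.0, 'suggested_resources': ['Mentimeter quick-start guide']}

In practice, the same idea could just as easily live in an ungraded LMS quiz; the point is only that a low rating on a given item routes the student to a specific orientation resource rather than a generic help page.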

Underlying assumptions

Something I notice as I look back at this post is the underlying transition from behaviourist and cognitivist methods to constructivist ones. All the pre-lecture tasks have strong behaviourist and cognitivist leanings, which is definitely more teacher-centred. I don’t know if this is necessarily a good or bad thing, but it does assume that learners initially need an expert guiding them, with that guidance fading as mastery develops.
