IP 2 – Annotated Bibliography

Tsai, Y. L., & Tsai, C. C. (2020). A meta-analysis of research on digital game-based science learning. Journal of Computer Assisted Learning, 36(3), 280–294. https://doi.org/10.1111/jcal.12430

Tsai and Tsai (2020) performed a meta-analysis of 26 peer-reviewed empirical studies, published between 2000 and 2018, that examined the use of digital games for science learning (including physics, chemistry, biology, and natural science). The study design followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) and APA meta-analysis reporting standards. Comprehensive Meta-Analysis software was used to calculate overall effect sizes (random-effects model) for two groups of studies, gameplay design (GD; n = 14) and game-mechanism design (GMD; n = 12), and subgroup analyses (mixed-effects model) were used to compare education level, single-player versus multiplayer design, games with versus without roleplay, and learning mechanisms versus gaming mechanisms.
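For readers unfamiliar with the method: although Tsai and Tsai do not reprint the estimator, random-effects pooling of the kind implemented in Comprehensive Meta-Analysis generally takes the following form (a minimal sketch; the specific between-study variance estimator, e.g. DerSimonian–Laird, is an assumption here, as the paper does not name it):

\[
\hat{\theta} = \frac{\sum_i w_i \hat{\theta}_i}{\sum_i w_i},
\qquad
w_i = \frac{1}{v_i + \hat{\tau}^2}
\]

where \(\hat{\theta}_i\) is the effect size from study \(i\), \(v_i\) is its within-study variance, and \(\hat{\tau}^2\) is the estimated between-study variance. Setting \(\hat{\tau}^2 = 0\) recovers a fixed-effect model; a larger \(\hat{\tau}^2\) makes the weights more uniform across studies, which is why the random-effects choice matters when studies are as heterogeneous as these.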

In the GD group, the results showed that students’ science knowledge acquisition significantly improved at all educational levels when digital games were used for learning in place of other teaching methods, with no significant difference between single-player and multiplayer games or between games with and without roleplay. In the GMD group, both added learning mechanisms and added gaming mechanisms significantly increased science knowledge acquisition at all educational levels, with no difference between the two. The authors suggest that the results support Piaget’s theories connecting play and cognitive development, and they note that although many children lack motivation to learn science because it is perceived as complex, digital game-based learning may engage them in the subject matter.

The authors of this meta-analysis provided sound reasoning for their research design and drew relevant connections between the results and the learning theories that inform pedagogy. They acknowledged several limitations of their study and presented compelling arguments for further research on topics such as the connections between digital game-based learning and student problem solving and gameplay behavior. The paper is clearly written and provides empirical evidence that students can develop their scientific knowledge through digital gameplay.


Wang, L.-H., Chen, B., Hwang, G.-J., Guan, J.-Q., & Wang, Y.-Q. (2022). Effects of digital game-based STEM education on students’ learning achievement: A meta-analysis. International Journal of STEM Education, 9(1), 1–13. https://doi.org/10.1186/s40594-022-00344-0

Wang et al. (2022) performed a meta-analysis of 33 studies, yielding a total of 36 effect sizes, published between 2010 and 2020. The studies were selected in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines, based on the following criteria: a focus on games and STEM learning (specifically science, math, and engineering/technology) by K-12 or higher education students, the presence of a control group, sufficient quantitative data, and publication in English. Comprehensive Meta-Analysis 3.0 software was used to calculate the effect size, expressed as the standardized mean difference under a random-effects model, and the variables considered included the control treatment, educational level, subject, intervention duration, game type, and gaming platform.
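As with the first entry, the statistic itself is worth spelling out. A minimal sketch of the standardized mean difference, assuming the common Cohen’s d with Hedges’ small-sample correction (the paper does not specify which variant the software applied):

\[
d = \frac{\bar{X}_{\text{game}} - \bar{X}_{\text{control}}}{SD_{\text{pooled}}},
\qquad
SD_{\text{pooled}} = \sqrt{\frac{(n_1 - 1)SD_1^2 + (n_2 - 1)SD_2^2}{n_1 + n_2 - 2}},
\qquad
g = d\left(1 - \frac{3}{4(n_1 + n_2) - 9}\right)
\]

where \(n_1\) and \(n_2\) are the sample sizes of the game and control groups; the bias-corrected \(g\) is the per-study value that would enter the random-effects pooling shown above.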

The results indicate a significant positive effect of digital game-based learning on STEM students’ achievement. Digital games outperformed non-digital games, but there was no significant difference across subject disciplines, between control groups receiving traditional versus multimedia instruction, or among gaming platforms. Primary school students showed significantly better learning achievement when learning from digital games than students at other educational levels, but there was no significant difference among intervention durations. The authors note, however, that interventions of less than one week produced the largest gains in achievement compared with each of the longer intervention periods, perhaps because of a novelty effect.

Although the authors provided details about how the studies were chosen, I would like to read more about how learning achievement was measured and whether pre-tests and post-tests were used for assessment. Wang et al. (2022) do acknowledge the limitations of their meta-analysis and indicate that a follow-up study from another perspective (e.g., cognitive skills, affective influences) is warranted. The research methods would have benefitted from more detail, and a connection to pedagogical practices would make the paper more relevant for practicing educators.
