Evaluation Tool and Motivation
For this assignment, I chose OnTask as the learning analytics tool for analysis. I want to look into the ways this tool goes beyond monitoring student behavior patterns for "early alert" by providing real-time feedback to students and recommending additional resources for further improvement. In addition, because it is not limited to a single LMS, it is a potentially useful add-on in a variety of contexts. I reflect on how this tool would be especially helpful using the UBC OnTask pilot study in a first-year physics course as an example.
I chose the Cooper framework (Cooper, 2012) as the guide for evaluating this learning analytics tool, as it takes a "descriptive, rather definitive approach which allows to deal with real-world complexity of how analytics is" (p. 3). The framework describes "characteristics of learning analytic tools which may overlap and are assumed to be extensible and adaptable" (p. 3), which I believe is important for benefiting from the affordances of such tools while considering best practices for their use in different educational contexts.
OnTask Tool Functions (OnTask, n.d.)
The OnTask project "aims to improve the academic experience of students through the delivery of timely, personalized and actionable student feedback throughout their participation in a course" (OnTask, n.d.). Some core functions of OnTask include:
- Assesses data about student activities throughout the semester and allows instructors to design personalized feedback with suggestions about learning strategies, so students can adjust their learning progressively
- LMS-agnostic, receiving data from various sources (e.g. online engagement, assessments, student information systems, electronic textbooks, discussion forums, etc.)
- Directs students to specific chapters/examples in textbooks, suggests additional readings/resources, prompts enrollment in required workshops/tutorials/labs, and redirects students to university support services
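To make the feedback mechanism above concrete, here is a minimal sketch of the rule-based idea behind OnTask-style personalized messages. This is not OnTask's actual API; the column names (`quiz_avg`, `forum_posts`, `lab_attendance`) and the `build_feedback` helper are hypothetical, chosen only to illustrate how conditions over student data can assemble a tailored message:

```python
# Hypothetical sketch of rule-based personalized feedback (not OnTask's real API).
# Each rule pairs a condition on a student's data record with a message fragment;
# fragments whose conditions hold are assembled into one personalized message.

def build_feedback(student, rules):
    """Join the message fragments whose conditions are true for this student."""
    return "\n".join(msg for cond, msg in rules if cond(student))

rules = [
    (lambda s: s["quiz_avg"] < 60,
     "Your quiz average is below 60%: review the worked examples in Chapter 3."),
    (lambda s: s["forum_posts"] == 0,
     "You have not posted in the discussion forum yet; try this week's thread."),
    (lambda s: s["lab_attendance"] < 0.8,
     "Lab attendance counts toward your grade; the next session is this week."),
]

# Illustrative student record (invented values, not pilot data).
student = {"quiz_avg": 55, "forum_posts": 0, "lab_attendance": 0.9}
print(build_feedback(student, rules))
```

In OnTask itself, instructors author these condition/text pairs through the interface rather than in code, but the underlying logic is the same: data columns drive which fragments each student receives.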
Cooper’s Framework – UBC OnTask Pilot Study
Based on UBC's OnTask pilot project (Moosvi, 2019), the tool was used in the first-year physics course PHYS1117 by instructor Simon Bates and co-instructor Mateus Fandino to increase engagement and to provide each student with personalized weekly feedback based on their performance. Using Cooper's proposed framework (Cooper, 2012), I analyze this case study as follows:
| Framework Characteristic | UBC OnTask Pilot Study |
| --- | --- |
| Analysis subjects, objects, and clients | Subjects: first-year students in PHYS1117. Objects: those same students, specifically their performance within the course. Clients: course instructors. |
| Data origin | Data collected from the Canvas LMS (e.g. assignments, exams and other assessments, student engagement data); in-class attendance and participation; laboratory sections. |
| Orientation and objectives | Orientation: primarily reflective, even though the pilot itself is exploratory, since it looks back at student performance to investigate whether personalized feedback can improve it. A natural extension of the pilot would be predicting outcomes for future cohorts, which would shift toward a more predictive mode. Objective: to enhance student performance and outcomes based on a mix of quantitative and qualitative measures. |
| Technical approach | Assumed to be statistical and hypothesis-testing in nature: students' course performance without personalized feedback from OnTask is compared against students' course performance with the personalized feedback from OnTask. |
| Embedded theories | Socio-constructivist approaches: students co-construct their learning process by receiving and acting on feedback given by instructors. |
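The statistical comparison assumed under the technical approach could be sketched as a simple two-sample test. The scores below are invented for illustration (they are not pilot data), and the `welch_t` helper is my own; a real analysis would compare the statistic against a t-distribution (e.g. with `scipy.stats.ttest_ind`) to obtain a p-value:

```python
# Minimal sketch of the assumed cohort comparison: Welch's t statistic for
# midterm scores with vs. without personalized feedback (illustrative data only).
from statistics import mean, variance

def welch_t(a, b):
    """Two-sample t statistic that does not assume equal variances."""
    va, vb = variance(a) / len(a), variance(b) / len(b)
    return (mean(a) - mean(b)) / (va + vb) ** 0.5

with_feedback = [72, 68, 81, 77, 70, 85, 74]      # hypothetical cohort scores
without_feedback = [65, 70, 62, 74, 68, 60, 71]   # hypothetical cohort scores

t = welch_t(with_feedback, without_feedback)
print(f"t = {t:.2f}")  # a positive t favors the feedback cohort here
```

A positive statistic alone is not evidence, of course; the pilot would also need appropriate sample sizes and controls before attributing any difference to the feedback itself.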
Student feedback indicated a generally positive perception of OnTask: it provided guidance on what to focus on in the readings, reminders of upcoming assessments and deadlines, notes on areas of confusion and commonly made mistakes, and prompts for reflection on previous and future coursework, while also helping students gauge their own progress in the course. There was also an increase in overall student engagement and interaction, driven by the weekly personalized newsletters the instructors sent out.
Reflection
The pilot study results align with the findings of Pardo et al. (2019) in a first-year undergraduate engineering course. Although the effect size of personalized feedback on midterm scores was not large, the results still showed a significant positive impact on student satisfaction with feedback and on academic performance in the midterm exam (p. 136). The researchers identified feedback as an important factor in supporting student success and highlighted the need to connect student data to the provision of high-quality feedback. They suggested creating comment templates tailored to a course, or algorithms that match comments to observed data.
They also called for further research into better techniques for identifying individual differences in students' participation in learning experiences and in their study habits, as this missing information could enable interventions with potentially larger effects.
In my own context of foreign language teaching and learning, given the smaller number of students per class and the larger share of qualitative assessment, there might not be a big difference between using OnTask and having the instructor provide feedback directly. However, I still believe OnTask could be a useful tool for identifying common challenges students face in their qualitative assignments, and it can provide insight into student performance in courses regardless.
References
Cooper, A. (2012). A framework of characteristics for analytics. CETIS Analytics Series, 1(7). http://publications.cetis.ac.uk/2012/524

Lim, L., Gentili, S., Pardo, A., Dawson, S., & Gašević, D. (2018). Combining technology and human intelligence to provide feedback and learning support using OnTask. In Companion Proceedings of the 8th International Conference on Learning Analytics & Knowledge (LAK'18), Sydney, Australia.

Moosvi, F. (2019, June 20). Learning analytics project OnTask: A case study. https://learninganalytics.ubc.ca/ontask-a-case-study/

OnTask. (n.d.). https://www.ontasklearning.org/

Pardo, A., Jovanovic, J., Dawson, S., Gašević, D., & Mirriahi, N. (2019). Using learning analytics to scale the provision of personalised feedback. British Journal of Educational Technology, 50(1), 128–138. https://doi.org/10.1111/bjet.12592