Changing Students' Learning Behaviour via Learning Analytics
Abstract: The three main audiences for learning analytics are institutions, instructors and course designers, and students. Our focus is on students. We take the stance that the goal of learning analytics for students should be not only to inform them about their performance, but also to influence them to modify their learning behaviour and achieve better learning outcomes. As the learning sciences show, students' learning is heavily influenced by their individual differences. Our research aims to develop an understanding of how the information presented in learning analytics visualizations, and the form that information takes, affects individual students. In this talk I will elaborate on these theoretical concepts and present the results of our study showing the varying effects of learning analytics visualizations on students with different goals. Our findings highlight the methodological importance of considering individual differences and pose important implications for the future design and research of learning analytics visualizations.
Dr. Marek Hatala is a Professor at the School of Interactive Arts and Technology at Simon Fraser University and Director of the Laboratory for Ontological Research. He received his PhD in Artificial Intelligence from the Technical University of Kosice (Slovakia). His research is driven by problems arising between computing systems and their users. His prior interests include configuration in engineering design, organizational learning, semantic interoperability, ontologies and the semantic web, user modeling in ubiquitous and ambient intelligence environments, and software engineering and service-oriented architectures. Dr. Hatala's current research is framed within the area of learning analytics. Specifically, he builds on the learning sciences to establish theories of the effects of open learner models on learners' motivation, with the goal of improving their learning outcomes in online learning environments.
In October 2015 the Learning Analytics Visual Analytics (LAVA) group held the first ever learning analytics hackathon at UBC. During the two-day event, more than 70 participants with a wide range of backgrounds and expertise applied a variety of approaches to analyzing learning-related data. Some used classroom observation data to better understand how learning unfolds, while others used data from a learning management system to identify patterns in how learners use available materials.
According to Leah Macfadyen, program director of Evaluation and Learning Analytics at the Faculty of Arts, the idea for the hackathon came about in one of the group’s weekly meetings, after a member brought in a large data set and asked for help. The group had a lot of fun tossing around ideas about how to analyze the data and how to best present the results. The outcome was so successful that the group wondered, “Why not make it bigger? Why not have a hackathon?”
Macfadyen and other event organizers were pleasantly surprised with the large turnout. “People are interested in doing this?” asked Megan Barker, a Science Teaching and Learning Post-Doctoral Fellow with the Carl Wieman Science Education Initiative and data presenter at the hackathon. “We had no idea. We thought maybe a couple of people would find this interesting. Then all of a sudden our registration was full and we had people on the waitlist.”
The event began with five researchers from across UBC pitching learning data sets to participants. Each data set captured a different aspect of learning, for example, student interactions with video content and moment-by-moment actions in a virtual physics lab. Data presenters then challenged participants to find the story behind the data.
The hackathon attracted undergraduate and graduate students as well as faculty, staff and professionals. Participants formed groups depending on which of the five data sets they chose to work on as well as their disciplinary area of expertise and preferred analytics approach. “We wanted to bring together people who were interested in working with data. If they didn’t have the same background it’d be even better. They could learn from each other,” said Macfadyen.
Tyler Robb-Smith is a student at the British Columbia Institute of Technology and has a background in nanohydrodynamics. He came to the event to meet people and build on his data analysis skills. “It’s interesting looking at different perspectives. It’s interesting how everyone has an area of expertise, and added together it made [the process] quite easy,” he said. “You are able to go a lot further in a project that individually would take you a lot longer.”
The event was an opportunity for like-minded people to meet and share their passion for data analysis as well as learn about LAVA. The group started with Macfadyen and a few students who were interested in working with learning research data, but it quickly grew. It wasn’t long before other faculty and staff started asking to join the weekly meetings.
“A lot of people didn’t know then and still don’t know that this type of research actually exists at UBC. Learning analytics is a new field,” said Macfadyen. According to her, the university already has a lot of learning and teaching data available, for example, from student registration systems and course evaluation systems. “These goldmines of potential insight are just sitting around and they could and should be used to inform decisions about planning throughout departments,” she added.
That’s why the event was also aimed at raising the profile and visibility of this emerging field and showing what can come out of this type of analysis. Learning research data can tell instructors which teaching methods are most effective and how those methods are working in their classrooms. It can inform departments about why certain classes are more popular than others. It can help instructors plan their courses.
“We really wanted to raise awareness about how data can be used to improve teaching and learning practices,” said Ido Roll, senior manager for Research and Evaluation in the Centre for Teaching, Learning, and Technology and one of the event’s organizers. “The hackathon was a great way to combine interesting questions about how people learn, large data sets and a group of eager and motivated experts.”
This year’s event was a success. At the end of the hackathon nine participant-led groups presented their research findings. Several of these projects have become full-scale research projects following the event. According to Roll and Macfadyen, the most common question from participants was how soon there would be another hackathon.
An NLP-informed learning analytics approach for extracting and measuring aspects of argumentation
Venue: Buchanan C105C
This paper/presentation reports on work in progress and shares preliminary results from an attempt to use NLP-informed learning analytics methods to extract and measure aspects of students’ argumentation while they learn how to think and argue like scientists. The approach explored in this paper targets aspects of deep learning and detects the flow of argumentation directly from the structure and composition of the language that students use in their writing. The model integrates insights from natural language processing techniques and argumentation theory in a way that derives metalinguistic features of argumentation directly from the linguistic units produced in students’ written language.
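To make the general idea concrete: one very simple way to derive argumentation features directly from linguistic units is to count discourse markers associated with different argumentative moves. The sketch below is a minimal illustration of that idea, not the authors' actual model; the marker lexicon and category names are invented for this example.

```python
import re
from collections import Counter

# Hypothetical marker lexicon: a tiny stand-in for a real
# NLP-informed feature set grounded in argumentation theory.
ARGUMENT_MARKERS = {
    "claim":    ["i think", "i believe", "clearly", "it follows that"],
    "evidence": ["because", "since", "for example", "according to"],
    "contrast": ["however", "although", "on the other hand", "but"],
}

def argumentation_profile(text: str) -> Counter:
    """Count occurrences of each marker category in a piece of writing."""
    lowered = text.lower()
    profile = Counter()
    for category, markers in ARGUMENT_MARKERS.items():
        for marker in markers:
            # Word boundaries so e.g. "but" does not match inside "button".
            pattern = r"\b" + re.escape(marker) + r"\b"
            profile[category] += len(re.findall(pattern, lowered))
    return profile

sample = ("I believe the reaction rate doubles because the temperature rose. "
          "However, the control trial, for example, showed no change.")
print(argumentation_profile(sample))
```

A real model would go well beyond surface markers, using syntactic structure and discourse relations, but even marker counts give a per-student profile of argumentative moves that can be tracked over time.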
Hackathon participants, please share your files with us:
– Manipulated data
– Results (including failed results)
Please include a readme.txt file with the following information:
– Names of group members
– Names and descriptions of the included files
– A short description of the process – what was your approach and what have you found (or did not find)? (do not spend much time on this)
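For illustration, a readme.txt along these lines would cover everything requested (the names and filenames below are placeholders, not real submissions):

```text
Group members: Jane Doe, John Smith

Files:
  cleaned_clickstream.csv - LMS clickstream data with test accounts removed
  cluster_analysis.py     - clusters students by resource-use patterns

Process: We grouped students by how often they accessed each resource
type and compared the groups' final grades. The clusters did not
separate clearly by grade, so we consider this a null result.
```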
This week we watched this video from Educause, in which various professionals discuss why measuring learning is difficult.
Some key ideas from the video that had us talking were descriptions of the process of learning as a “black box” or “magic”. We tried to bring the discussion of measuring and studying learning into the context of learning analytics.