In this LAVA session (April 18th, 2-3ish pm), Gabriel Smith from the Faculty of Land & Food Systems will present a custom Canvas integration that lets students decide, within ranges defined by the instructor, how the assessments in a course are weighted, and then applies those weights to their final grades. He will discuss the rationale for the project, give a demonstration of the tool, and ask us to help brainstorm ideas for future analytics features.
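As a rough illustration of the idea (this is not the actual integration; the categories, ranges, and grades below are made up), a sketch like the following clamps each student's chosen weights to the instructor's allowed ranges, normalizes them to sum to 100%, and applies them to that student's category averages:

```python
# Hypothetical sketch of "student-chosen weights within instructor ranges".
# All names and numbers are illustrative, not taken from the actual tool.

def weighted_final_grade(chosen, ranges, category_averages):
    """chosen / ranges / category_averages are dicts keyed by category name."""
    # Keep each chosen weight inside the instructor's allowed range.
    clamped = {
        cat: min(max(chosen[cat], lo), hi)
        for cat, (lo, hi) in ranges.items()
    }
    # Normalize so the weights sum to 100%.
    total = sum(clamped.values())
    weights = {cat: w / total for cat, w in clamped.items()}
    # Weighted average of the student's category averages.
    return sum(weights[cat] * category_averages[cat] for cat in weights)

ranges = {"Quizzes": (20, 40), "Project": (30, 50), "Exam": (20, 40)}
chosen = {"Quizzes": 20, "Project": 50, "Exam": 30}
averages = {"Quizzes": 78.0, "Project": 91.0, "Exam": 70.0}
print(round(weighted_final_grade(chosen, ranges, averages), 1))  # 82.1
```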
For this week’s LAVA Meeting (Feb 15, 2024, 2pm) we will continue meeting on Zoom. In this session, Alison, Craig, and Justin will be looking for feedback on our Hackathon LAK Practitioner report. I will be sharing the paper shortly (and apologise in advance for a bunch of LAVA notifications!). For those of you who don’t know, the LA Hackathons started as an idea from an early LAVA session and have become a regularly occurring event since.
Our hope is to do a quick introduction to the LA hackathons and hear from you which parts of the paper you found the most interesting or had follow-up questions about. Do you agree that hackathons are a useful form of student engagement in LA? This will help us identify the most interesting and important parts of the paper from an outside perspective (since, obviously, we think the whole thing is great) and may shape the direction of our 15-minute LAK presentation.
This month’s LAVA meeting will be even more informal than most – but for those of you online before the holidays, I couldn’t not have at least one chatGPT-themed LAVA session to wrap up 2023. My question is: what does chatGPT “know” about Learning Analytics? Let’s find out!
Homework: send me (or prepare) your prompts – what questions would you ask to find out what chatGPT “knows” about Learning Analytics? If you are an expert in a particular area of Learning Analytics, what is something you would consider “common knowledge” at this point? We will see if chatGPT gets it right.
In the session we will review the prompts, see what chatGPT comes up with, and evaluate the responses.
In this week’s meeting Leah Macfadyen and Alison will be sharing work that we’ve dubbed “The IKEA Model”.
The project was initially developed as a Directed Study in the MET program, with the goal of building an instructor-facing learning analytics dashboard from data that an instructor can access in Canvas. The “project” consists of a series of steps and scripts that pull together Canvas and New Analytics data into a Tableau dashboard. We will discuss the background of the project, our roles, and its current state.
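As a rough sketch of the kind of step such scripts perform (this is not the project's actual code; the URL, token, and course ID are placeholders, and a real script would also follow the API's pagination links), the example below pulls student enrolments from the Canvas REST API and writes them to a CSV that Tableau can read:

```python
# Minimal sketch: Canvas REST API -> CSV for Tableau.
# BASE_URL, TOKEN, and COURSE_ID are placeholders you would supply yourself.
import csv
import requests

BASE_URL = "https://canvas.example.edu"   # your institution's Canvas URL
TOKEN = "YOUR_API_TOKEN"                  # generated under Account > Settings
COURSE_ID = 12345

headers = {"Authorization": f"Bearer {TOKEN}"}
resp = requests.get(
    f"{BASE_URL}/api/v1/courses/{COURSE_ID}/enrollments",
    headers=headers,
    params={"type[]": "StudentEnrollment", "per_page": 100},
)
resp.raise_for_status()

with open("course_enrollments.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["user_id", "current_score"])
    for enrollment in resp.json():
        # The fields returned depend on your role and permissions in the course.
        grades = enrollment.get("grades", {})
        writer.writerow([enrollment["user_id"], grades.get("current_score")])
```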
An essential part of analytics is the communication of insights to colleagues and stakeholders. In this LAVA session, we will look at Quarto, an open-source scientific and technical publishing system that combines code and its outputs with descriptive text and exports them to many formats, including HTML, PDF and MS Word. We will cover standard tools to author Quarto documents, how to format them using Markdown, and how to include code with R as an example (Python and Julia are also supported).
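In case you have not seen one before, here is a minimal sketch of what a Quarto source file can look like (the title, text, and code are placeholders, and the chunk uses Python here, whereas the session will use R as its example):

````
---
title: "A minimal Quarto example"
format: html
---

## Results

Regular Markdown text goes here, with *formatting* as usual.

```{python}
# A code chunk: when the document is rendered, both the code
# and its output appear in the final HTML/PDF/Word file.
total = sum([1, 2, 3])
print(f"The total is {total}")
```
````

Running `quarto render` on such a file produces the chosen output; changing `format: html` to `format: pdf` or `format: docx` switches the export target.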
Preparation (entirely optional):
We will briefly cover Markdown but will only have time to go over some things in the basic syntax. If you would like to familiarize yourself with Markdown beforehand, here is an excellent interactive tutorial (10–20 min): https://commonmark.org/help/tutorial/
If you would like to follow along during the session and create a Quarto document, you have two options:
1) Use a cloud-based system by creating a free Posit Cloud account (recommended; Posit Cloud was formerly RStudio Cloud): https://posit.cloud/plans/free
Stephan previously worked as a Postdoctoral Teaching and Learning Fellow in the Microbiology and Immunology Department at UBC, where he coordinated the Experiential Data science for Undergraduate Cross-disciplinary Education (EDUCE) program. He designed and delivered data-science-focused modules across multiple MICB courses, conducted and analyzed surveys, and maintained open education resources. He then joined Skylight (the Science Centre for Learning and Teaching) as the Science Education Specialist in the Computer Science Department, where he supports instructors with the migration to a new online assessment tool that offers students the opportunity to practice with randomized questions for mastery learning, along with more flexibility when scheduling exams in a new computer-based testing facility. He focuses on the impact of learning technologies on student learning, well-being and inclusivity, and on how their implementation and course design should inform each other.
In this week’s LAVA meeting, Alison shared how Sauder Learning Services takes advantage of Canvas APIs. Alison is a Research Analyst in the Sauder Learning Services team. The team supports our faculty’s use of technology in their teaching practice. In her role, she works to understand what data is available and how it can be used to inform teaching practices.
From Alison:
In this LAVA talk I will be discussing how my team uses available Canvas APIs for completing routine tasks and extracting data. I will briefly share what we have learned about the differences between REST and GraphQL APIs at a high level (as the technical details are beyond me). I will also share some of the projects that we have used the APIs for, and discuss our approach to the development of these projects from an operations standpoint (how we try to “reduce clicks” and save time), and from a learning analytics standpoint (how to use the data available to inform teaching practices).
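To give a feel for the high-level difference, here is a minimal sketch with a placeholder Canvas URL, token, and course ID: the REST call fetches the fixed representation of one resource from its own endpoint, while the GraphQL query goes to a single endpoint and names exactly the fields it wants back (the field names shown are typical of the Canvas GraphQL schema, so treat them as illustrative):

```python
# REST vs GraphQL against Canvas, with placeholder URL/token/course id.
import requests

BASE_URL = "https://canvas.example.edu"
headers = {"Authorization": "Bearer YOUR_API_TOKEN"}

# REST: one resource per endpoint, with a fixed set of fields in the response.
rest = requests.get(f"{BASE_URL}/api/v1/courses/12345", headers=headers)
print(rest.json()["name"])

# GraphQL: a single endpoint; the query itself specifies the response shape.
query = """
query {
  course(id: "12345") {
    name
    enrollmentsConnection { nodes { user { name } } }
  }
}
"""
gql = requests.post(f"{BASE_URL}/api/graphql", headers=headers, json={"query": query})
print(gql.json()["data"]["course"]["name"])
```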
In our August 4th LAVA meeting, we will be hearing from Dr. Fatemeh Salehian. Fatemeh is a postdoctoral research fellow in the School of Information at UBC and is currently affiliated with the University of Michigan, Ann Arbor, and the University of South Australia. She has been working on learning analytics projects for over seven years. Her research focuses on understanding how learners self-regulate their learning processes and how learning analytics can help support self-regulated learning, which is critically associated with students’ performance and learning outcomes. She uses data to study learners’ behaviors in online learning environments and to develop indicators of learning behavior.
From Fatemeh:
I will talk about two projects focusing on providing actionable information for students.
First is the OnTask platform, which offers an intuitive interface for uploading data about student engagement into a matrix and for defining a set of simple when/do rules that customize email messages for students. Instructors use the engagement indicators to select or ignore text portions, which are then collated and sent to the students as regular email messages. These messages provide students with personalized support and suggestions to improve their engagement. Our recent analysis showed a significant association between the messages’ topics and the students’ performance.
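OnTask provides its own web interface for all of this; purely to illustrate the when/do idea, here is a sketch with made-up engagement data and rule text:

```python
# Not OnTask itself: a sketch of the "when/do" idea it is built on.
# Each rule pairs a condition over the engagement matrix with a text snippet,
# and the snippets a student matches are collated into one email.

# One row of a hypothetical engagement matrix.
student = {"name": "Sam", "videos_watched": 2, "quiz_attempts": 0}

rules = [
    # (when, do)
    (lambda s: s["videos_watched"] < 5,
     "You have watched only a few of this week's videos; they cover material on the midterm."),
    (lambda s: s["quiz_attempts"] == 0,
     "The practice quiz is still open; attempting it is a good way to check your understanding."),
]

snippets = [text for when, text in rules if when(student)]
email_body = f"Hi {student['name']},\n\n" + "\n\n".join(snippets)
print(email_body)
```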
Second, My Learning Analytics (MyLA) is a student-facing dashboard that provides students with information about their engagement with course materials and resources, assignments, and grades in a Canvas course. A set of three learning analytics data visualizations has been designed to reveal behavioral patterns associated with good learning skills, guide decisions about actions students can take that may improve their academic outcomes, and provide a transparent view of students’ course standing.
This week we will be hearing from Dr. Bowen Hui. Dr. Hui is Associate Professor of Teaching & Associate Head of Undergraduate Affairs (CMPS) in the Computer Science program at UBCO.
From Bowen:
Teamable Analytics: A Team Formation and Analytics Tool
Forming effective teams for large classes is a challenge for educators due to the complexity of project needs, the diversity of individual characteristics, and the criteria different educators have for forming teams. Although many researchers over the past several decades have studied the success factors of a team, there is still little consensus on how a team should ideally be formed. Consequently, how one decides to form teams in a class depends on the domain, classroom context, and pedagogical objectives.
In this demo, we present Teamable Analytics, a web application that offers several algorithms to support the team formation process. Teamable Analytics is compatible with any learning management system (LMS) that uses the LTI protocol. Our tool provides a dashboard for educators to elicit student characteristics and to customize how those responses are combined to form teams. In contrast to existing team formation software, our tool supports more use cases for building teams, provides a general end-to-end solution to the team formation process, is integrated with the LMS so as to minimize data setup and privacy risks, and makes use of visual analytics to diagnose problematic formations, increase user trust, and monitor ongoing team performance.
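For readers unfamiliar with the problem, here is a small generic sketch of team formation that balances a single numeric characteristic across teams. It is not one of Teamable Analytics' algorithms, just an illustration of the kind of assignment the tool automates:

```python
# Generic sketch: split students into teams of a target size while roughly
# balancing one numeric characteristic (e.g., prior experience).
import random

def balanced_teams(students, team_size):
    """students: list of (name, score) pairs; returns a list of teams."""
    n_teams = max(1, len(students) // team_size)
    # Sort by the characteristic, then deal students out like cards so each
    # team receives a mix of high and low scores.
    ranked = sorted(students, key=lambda s: s[1], reverse=True)
    teams = [[] for _ in range(n_teams)]
    for i, student in enumerate(ranked):
        teams[i % n_teams].append(student)
    return teams

random.seed(0)
roster = [(f"Student {i}", random.randint(1, 10)) for i in range(12)]
for team in balanced_teams(roster, team_size=4):
    print([name for name, _ in team], "avg:", sum(s for _, s in team) / len(team))
```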
To date, our team has integrated Teamable Analytics with the Canvas LMS and completed 7 pilot studies with interdisciplinary classes consisting of 15 to 210 students across both UBC campuses. We are currently seeking collaborators to further the research and development in the visual analytics component of the project.
This week’s LAVA session [Thursday April 7, 2-3pm, Zoom] will be led by Annay Slabikowska (PAIR) and Craig Thompson (CTLT), who will present recent work on the Student Flows Project.
From Annay and Craig:
The project is a collaboration between the Learning Analytics team, Planning and Institutional Research (PAIR), and the Faculty of Science, including contributions from analysts in the Faculty of Arts and the Sauder School of Business.
The goals of the Student Flows Project are threefold: to create a shared common data set that analysts across the institution can use to explore student flow scenarios; to develop visualizations of student flows and recommend tools for visually analyzing them; and to collaboratively document and share best practices for exploring student flow scenarios.
By “Student Flows” we mean student progression through courses, specializations, and degree programs during their time at UBC. This can include analysis of time to degree completion, identifying common patterns of switching between faculties or majors, investigation of common course co-enrolment or sequencing, and more. In addition to these enrolment-based data elements, we are also interested in student characteristics and how they relate to enrolments. For example, we need to be able to explore whether there are differential course or program outcomes for various student cohorts, such as international students, mature students, or Indigenous students, as well as temporal cohorts.
From our initial exploration of this space, it became clear that there is not a one-size-fits-all solution to data analysis and reporting regarding student flows. For example, the level of detail needed varies by audience: one Dean may be interested in understanding how students flow into and out of their Faculty, whereas a department head may be interested in how students choose to sequence courses within their major. Additionally, at any given level of detail, different stakeholders will have different needs in terms of what information they want to see and how they would like it to be presented. Thus, we have endeavored to create a framework (dataset, documentation, and established best practices) that will enable a distributed team of analysts to undertake work in this space, to more easily collaborate and share, and to answer the plethora of questions raised by various institutional stakeholders.
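As a toy illustration of one question the shared data set is meant to support, the sketch below counts program-to-program transitions between two years, using made-up enrolment records and pandas:

```python
# Toy student-flow example: count transitions between programs across years.
import pandas as pd

records = pd.DataFrame({
    "student_id": [1, 1, 2, 2, 3, 3],
    "year":       [2020, 2021, 2020, 2021, 2020, 2021],
    "program":    ["Science", "Science", "Science", "Arts", "Arts", "Arts"],
})

# One row per student, with their program in each year.
wide = records.pivot(index="student_id", columns="year", values="program")

# Count how many students made each 2020 -> 2021 transition.
flows = wide.groupby([2020, 2021]).size().reset_index(name="students")
print(flows)
```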
We draw inspiration from other universities, such as the University of Michigan, an early leader in establishing a common data set for institutional researchers, and from numerous institutions exploring visualization techniques for student flows, such as UC Davis, the University at Buffalo, and the University of New Mexico.
This week (Thursday, 2-3pm, Zoom) our LAVA session will be led by Warren Code, the Associate Director at Skylight.
From Warren:
Running for the six weeks from early July to the start of Jumpstart in mid-August, the Academic Essentials program ( https://you.ubc.ca/academic-essentials/ ) launched in Summer 2020 as a set of three optional Canvas courses intended to help incoming undergraduate students prepare for their first year at UBC’s Vancouver and Okanagan campuses. I was one of the developers of the “Readiness for University Mathematics” course, which provides a review of mathematical concepts, practice of the associated skills, and recommendations for studying and other engagement in a first Calculus course at UBC. In this session, I will describe our approach so far to evaluating this review course in terms of outcomes in the credit-carrying math courses in the Fall term. We are drawing on a self-assessment taken at the start of the course, student activity in Canvas and WeBWorK (the online homework system where most of the practice occurs), and some diagnostic measurements taken in the Calculus courses that have been running for several years now.
Our key questions are:
– Does the summer review course bolster students’ pre-calculus skills?
– How does participation in the summer impact students’ success in their Fall term Calculus course?
– How prepared have our incoming students been over the last few cohorts?
I will discuss some preliminary findings for the first two questions from the 2020 cohort, as we are preparing to add data from the 2021 cohort later this term. Our main team for this analysis so far consists of myself, Zohreh Moradi (Skylight’s Research Analyst), and Costanza Piccolo (Associate Professor of Teaching in Mathematics and one of the other co-developers of the “Readiness for University Mathematics” course).
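To make the data-joining step behind this kind of evaluation concrete, here is a rough sketch with entirely hypothetical column names and values (not the team's actual pipeline): it merges summer-course activity with Fall term outcomes on a shared student identifier and compares outcomes by engagement level.

```python
# Hypothetical sketch: combine summer review-course activity with Fall term
# Calculus outcomes, keyed on a shared student identifier.
import pandas as pd

summer = pd.DataFrame({
    "student_id": [1, 2, 3],
    "webwork_problems_attempted": [40, 5, 0],
    "self_assessment_score": [0.8, 0.5, None],
})
fall = pd.DataFrame({
    "student_id": [1, 2, 3],
    "calc_diagnostic": [0.75, 0.55, 0.60],
    "calc_final_grade": [82, 68, 71],
})

merged = summer.merge(fall, on="student_id", how="outer")
# Compare Fall outcomes for students who did vs. did not engage in the summer.
merged["engaged"] = merged["webwork_problems_attempted"] > 10
print(merged.groupby("engaged")["calc_final_grade"].mean())
```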