Marko Prodanovic from Sauder Learning Services will be presenting his work on accessing and making use of Panopto video data.
Over the past year we’ve seen a sharp increase in teaching and learning happening through asynchronous video lectures. This also means a lot of new data!
In this session, Marko will show some of the work he’s done developing a Python tool for fetching and managing viewing data from the Panopto video platform. We’ll look at how the data is accessed (through various APIs), the transformations we perform, and the reasoning behind them. We’ll also walk through a set of video analytics dashboards, built in Tableau, that utilize this data. As this project is ongoing, the last 15 minutes of the session will be dedicated to feedback, suggestions, and discussion around the broader applications of this kind of viewing data.
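To give a flavour of the kind of transformation such a tool performs, here is a rough sketch that rolls raw per-viewer records up into per-video summaries. The record fields and shapes are assumptions for illustration, not Panopto's actual API schema, and this is not Marko's implementation:

```python
from collections import defaultdict

def summarize_views(records):
    """Aggregate raw viewing records into per-video summaries
    (unique viewers and total minutes watched)."""
    summary = defaultdict(lambda: {"viewers": set(), "seconds": 0.0})
    for r in records:
        s = summary[r["video_id"]]
        s["viewers"].add(r["user_id"])
        s["seconds"] += r["seconds_viewed"]
    return {
        vid: {"unique_viewers": len(s["viewers"]),
              "minutes_viewed": round(s["seconds"] / 60, 1)}
        for vid, s in summary.items()
    }

# Hypothetical raw records: one row per viewing session.
records = [
    {"video_id": "lec01", "user_id": "a", "seconds_viewed": 600},
    {"video_id": "lec01", "user_id": "b", "seconds_viewed": 300},
    {"video_id": "lec01", "user_id": "a", "seconds_viewed": 120},
    {"video_id": "lec02", "user_id": "a", "seconds_viewed": 450},
]
print(summarize_views(records))
```

A table in this shape (one row per video) is also the kind of tidy input a Tableau dashboard can consume directly.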
Continuing the theme of learning analytics instrumentation, Jeff Longland will provide an update on UBC’s work towards a data system for learning events. We’ll look at how learning tools can be instrumented to emit structured events, how events are ingested and stored, then finally how they can be accessed. A live demonstration will be attempted, so even if the topic isn’t of particular interest, it might be worthwhile for the humour alone!
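The announcement doesn't name a particular event specification, but one common way to structure learning events is the actor–verb–object shape used by xAPI. A minimal sketch of building such a statement (the field choices are illustrative assumptions, not UBC's actual pipeline):

```python
import json
from datetime import datetime, timezone

def make_statement(user_email, verb, activity_id, activity_name):
    """Build a minimal xAPI-style learning event statement."""
    return {
        "actor": {"objectType": "Agent", "mbox": f"mailto:{user_email}"},
        "verb": {"id": f"http://adlnet.gov/expapi/verbs/{verb}",
                 "display": {"en-US": verb}},
        "object": {"id": activity_id,
                   "definition": {"name": {"en-US": activity_name}}},
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

# Hypothetical event: a student completes a quiz in the LMS.
stmt = make_statement("student@example.org", "completed",
                      "https://canvas.example.org/quiz/42", "Week 3 Quiz")
print(json.dumps(stmt, indent=2))
```

A learning tool would emit statements like this to an ingestion endpoint, where they can be stored and later queried for analysis.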
Learn more about UBC Learning Analytics: https://learninganalytics.ubc.ca/
Our LAVA meeting this week (Wednesday June 10, 3pm PT, on Zoom) will be led by Fabian Froehlich.
Fabian Froehlich joined the Faculty of Education as a graduate student in 2018. Specializing in Media & Technology Education Studies, Fabian focuses his research on inclusive instructional design through educational technology. He is a SoTL specialist (Scholarship of Teaching and Learning) at the Centre for Teaching, Learning and Technology and has worked for UBC’s Learning Analytics team as a videographer.
The presentation summarizes the findings of my master’s thesis: social network analysis as a progressive tool for learning analytics. To investigate the research question “Do students presented with social network analysis data on online course discussions adjust their engagement behavior?”, a quasi-experiment was conducted using an embedded mixed-methods research design. Students (n=18) participated in three online discussions. Two of the online discussions allowed students to access social network analysis visualizations through Threadz, a Canvas plugin. The overall inquiry focuses on how this exposure to learning analytics data might influence the students.
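To get a feel for the kind of data behind tools like Threadz, here is a small sketch that computes a simple degree count from hypothetical reply edges in a discussion board. The data and the measure are illustrative only; the thesis's actual analysis and Threadz's visualizations may differ:

```python
from collections import Counter

# Hypothetical reply edges (replier, replied_to) from one discussion.
replies = [("ana", "ben"), ("cai", "ben"), ("ben", "ana"),
           ("dev", "ana"), ("ana", "cai")]

def degree_centrality(edges):
    """Count how many reply ties each student has (incoming + outgoing),
    a basic indicator of engagement in the discussion network."""
    deg = Counter()
    for src, dst in edges:
        deg[src] += 1
        deg[dst] += 1
    return dict(deg)

print(degree_centrality(replies))
```

Showing students a network built from edges like these is what lets them see, at a glance, who is central and who is peripheral in a discussion.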
If you would like to attend but are not part of the LAVA emailing list, please contact Alison Myers (firstname.lastname@example.org) for the Zoom information. You can also request that Alison add you to the LAVA mailing list, where we share information about upcoming LAVA sessions.
This week’s LAVA session will be led by Stoo Sepp (Manager, Learning Design in the Faculty of Education’s Educational Technology Support team). The focus of the session will be on Interaction Treatments. Stoo will be providing an overview and leading a discussion on how Learning Analytics and Learning Design theory can be advanced by this idea.
First proposed in 1989, Interaction Treatments refer to the types of interactions that typically occur in technology-enabled learning environments. Applying this concept to the field of learning analytics, we can extend it to incorporate more contemporary theorizing around types of interactions, while refocusing on the ‘L’ in Learning Analytics. In this session, a brief overview of the concept of Interaction Treatments will be presented, along with types of learning analytics used to inform pedagogical action. Finally we’ll have a discussion about how these concepts relate to instructor intention and the design of learning experiences.
For this session, we are asking for feedback on the upcoming Learning Analytics Hackathon. We will walk through this year’s approach and what has been developed so far, and then ask for the group’s feedback! If you want to take a look at the GitHub repo that will be the home base for the hackathon, please do! The overall goal for the hackathon is for students with any level of programming experience to be able to participate and learn, so it will be useful for us to have some fresh eyes from people with various levels of experience as we make our final adjustments ahead of the hackathon.
More information about the hackathon can also be found on the registration page.
I’ll be discussing my work at the Visual Cognition Lab under Dr. Ron Rensink, where my team and I study the perception of correlation in visualizations. The purpose of this is twofold. Using visualizations as stimuli can help us understand how the visual system extracts statistical information from scenes. Conversely, this understanding can lead to better visualizations by giving us rigorous ways to measure the effectiveness of a design.
For example, consider the pair of graphs below, each representing an identical set of age and height measurements for a group of individuals.
The graph on the left is clearly superior, revealing relationships that are invisible in the graph on the right. But we don’t really know why. As designers of visualizations, the best we can do right now is appeal to our intuition, to the “best practices” identified by our colleagues, or to the results of field studies.
While these methods may have worked well enough so far, they may not scale as visualizations become increasingly complex and high-dimensional. In my presentation, I’ll show how our research can eventually let us develop methods for judging visualizations from first principles.
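Studies of correlation perception typically present scatterplots whose underlying correlation is controlled precisely. As an illustration of how such stimuli can be generated, here is the standard Gaussian construction (an assumption for this sketch, not necessarily the lab's exact procedure):

```python
import math
import random

def sample_with_correlation(r, n=500, seed=1):
    """Draw (x, y) pairs whose true correlation is r, using the
    construction y = r*x + sqrt(1 - r^2) * noise."""
    rng = random.Random(seed)
    xs = [rng.gauss(0, 1) for _ in range(n)]
    ys = [r * x + math.sqrt(1 - r * r) * rng.gauss(0, 1) for x in xs]
    return xs, ys

def pearson(xs, ys):
    """Empirical Pearson correlation, for checking the stimulus."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

xs, ys = sample_with_correlation(0.7)
print(round(pearson(xs, ys), 2))  # close to 0.7 for large n
```

Plotting the same (x, y) sample under two different designs is what lets an experiment attribute any difference in perceived correlation to the design rather than the data.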
Rama has been working on building an Evaluation Visualizer, “EvalVis”, which gives an overview of some of the ISoTL projects going on at UBC (http://isotl.ctlt.ubc.ca/). “EvalVis” is an interactive visual interface that shows innovation projects, their areas of impact, and their evaluation approaches. Rama will be showing the in-progress version of the tool and discussing some of the challenges of the project so far.
This week we watched this video from Educause which had various professionals discussing why measuring learning is difficult.
Some key ideas from the video that had us talking were descriptions of the process of learning as a “black box” or “magic”. We tried to bring the discussion of measuring and studying learning into the context of learning analytics.