Monday Nov 26, 2018: Stoo Sepp

Stoo is a PhD Candidate in Educational Psychology at the University of Wollongong, Australia, exploring the role that gestures play in cognition and learning. In this session, he will tap into multimodal learning analytics as a means to capture such physical interactions:

“To explore the role of gestures, a novel instrument for data collection was developed, which seeks to capture the physical interactions learners have with multimedia learning materials, focusing on learning geometry. By leveraging the touch-based technologies in iPads, physical interactions with learning materials, as well as quantitative data related to these interactions, can be captured and parsed. This novel instrument has implications for the field of Multimodal Learning Analytics (MMLA), potentially opening the door to a new field of Embodied Learning Analytics (ELA), and presents an interesting path forward for the measurement of embodied data and how it may inform teaching and learning. This presentation will provide a brief theoretical overview, as well as a demo of the developed apps.”

Monday October 15, 2018: Kazem Banihashem

Role of Learning Theories in the Use of Learning Analytics

Kazem is a PhD candidate in the field of educational technology at Allameh Tabataba’i University (ATU), Tehran, Iran, and is currently a visiting PhD scholar at UBC. In this session, Kazem will discuss:

“In my presentation, I would like to talk about the role of learning theories in the use of learning analytics in education, and then I will explain my dissertation and the goals my supervisors and I are hoping to achieve in that research. My dissertation focuses on the development of an instructional design model based on constructivist theory in higher education, with a focus on learning analytics.”

Monday Oct 1, 2018: Jeff Longland & Scott McMillan

Toilets, Pipes and Accidents

Jeff Longland, Solutions Architect, and Scott McMillan, System Analyst II, are involved in the development of the technical infrastructure for the two-year funded UBC Learning Analytics Project. In this session, they will give a summary of year one and their plans for year two.

“The Data Janitors return. Jeff and Scott will discuss their lessons learned during the first year of the Learning Analytics project and share the technical working group’s recommendations for year two.”

Monday Sep 17, 2018: Sauder’s Learning Services Team

“Module Progress in Canvas” Pilot Project

UBC Sauder’s Learning Services team provides teaching staff and faculty with services and tools that support the design, development, and delivery of rich learning experiences for students. In this session, the team will talk about a dashboard they have developed for monitoring module progress. From Alison Myers:

“We have developed a pilot dashboard that visualizes Module Progress in Canvas. When you build a module in Canvas, you have the option to add prerequisites and requirements that enforce a sequence of completion. However, the only option for viewing students’ progress is at the level of the individual student. Using the Canvas API and Tableau, we were able to create a dashboard that allows an instructor to see the overall progress of the class, with the option of viewing the progress of individual students. At LAVA we will be presenting the Tableau prototype, as well as discussing the project from three roles: our Manager of Learning Ecosystems Support and Solutions, our Canvas Tech Rover, and our Research Analyst.”
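For those curious about the plumbing behind such a dashboard, here is a minimal sketch of pulling per-student module progress from the Canvas REST API. It relies on Canvas’s “List modules” endpoint, which reports a completion state per module when given a student_id parameter. This is not the team’s actual code; the host, token, course ID, and student IDs below are placeholders.

```python
# Minimal sketch (not the team's actual code): per-student module
# progress via the Canvas REST API. Host, token, course ID, and
# student IDs are placeholders; pagination is omitted for brevity.
import requests

BASE = "https://canvas.example.edu/api/v1"            # placeholder Canvas host
HEADERS = {"Authorization": "Bearer YOUR_API_TOKEN"}  # placeholder token
COURSE_ID = 12345                                     # placeholder course

def module_progress(student_id):
    """Return (module name, completion state) pairs for one student.

    When given a student_id, the "List modules" endpoint reports each
    module's state for that student (locked/unlocked/started/completed).
    """
    resp = requests.get(
        f"{BASE}/courses/{COURSE_ID}/modules",
        headers=HEADERS,
        params={"student_id": student_id},
    )
    resp.raise_for_status()
    return [(m["name"], m["state"]) for m in resp.json()]

# One row per (student, module) -- a tidy shape that Tableau can consume.
rows = [
    {"student": sid, "module": name, "state": state}
    for sid in [101, 102, 103]  # placeholder student IDs
    for name, state in module_progress(sid)
]
```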

Monday August 13, 2018: Julie Wei

Application of Assessment Analytics to Improve Teaching and Learning: Why and How

Julie Wei is a Senior Research Analyst at the UBC Faculty of Arts. Julie is an interdisciplinary researcher, with graduate degrees in Educational Psychology (PhD, University of Illinois at Urbana-Champaign, 2016), Statistics (Master’s, UIUC, 2013), and Curriculum & Pedagogy (Master’s, Henan Normal University, 2005). Before moving to Canada, Julie taught at K-12 schools and universities for more than ten years. Julie’s research has focused primarily on how to effectively design and use assessment, evaluation, and learning analytics to improve teaching and learning.

In this session, Julie will discuss why and how assessment analytics has the potential to make a valuable contribution to the field of Learning Analytics by broadening its scope and increasing its usefulness.

“Assessment is a central element of the learning process: it not only defines what learners consider important and measures their achievement across all phases of the learning process, but also gives instructors and institutions valuable feedback about whether program goals and institutional objectives have been achieved. Thus, the benefits of assessment analytics extend to all stakeholders involved in the educational process. To get the most out of assessment analytics, assessments should be designed mindfully so as to collect finer-grained test data. In contrast to traditional methods of assessment, which provide only a final score, a well-constructed diagnostic assessment has the potential to provide more detailed information about the specific areas in which learners do well or need to improve; it can thus collect, analyze, and report data about learners for the purpose of “understanding and optimizing learning and the environments in which it occurs” (Siemens, 2013). Diagnostic assessment is especially helpful for understanding complicated learning processes in content domains that are traditionally considered difficult. In this talk, I will share with the audience how assessment should be designed mindfully and revised iteratively, as well as how this granular information can be used to help teachers, students, and schools make improvements.”

For this session, we will be meeting in *DL-005*, which is the classroom right across from our normal meeting place (DL-011).

Monday July 16, 2018: Review the state of LA

Join us to review and discuss the state of Learning Analytics today compared to its early days.

We will watch some of the videos from the LAK 2012 conference, then talk about whether the concepts still apply, what progress has been made, and what has changed in the last six years.

Three sessions of about 20 minutes each have been chosen for us to watch and discuss:

1. [LAK 2012] April 30: 2B – The Learning Analytics Cycle: Closing the loop effectively

2. [LAK 2012] April 30: 1B – Using an Instructional Expert…

3. [LAK 2012] May 1: 5B – Does the Length of Time Off-Task Matter?

Monday June 18, 2018: Sanam Shirazi

Supporting Students’ Use of Learning Analytics

I am a Senior Research Analyst at the UBC Faculty of Arts and work on learning analytics and academic analytics projects in that faculty. I stepped into the world of learning analytics when I started my Master’s degree at the School of Interactive Arts & Technology, SFU. My research focused on student-facing LA visualizations and dashboards. In this session, I will discuss how we can support students’ use of learning analytics in the classroom:

Drawing on the literature, I will talk about some of the known challenges students face when using learning analytics, particularly around interpreting the presented information and taking action.

Then I will discuss a proposed framework (from the literature) that addresses some of the identified challenges and integrates engagement with analytics as part of the larger teaching and learning activity to support its productive use.

I will show some examples of how the framework can be implemented.

Monday May 7, 2018: Learning Analytics and Knowledge Conference 2018 Keynote

Towards User-Centred Analytics: User-Adaptive Visualizations

In this session, we will watch and discuss one of the keynotes from the Learning Analytics and Knowledge Conference 2018, delivered by Cristina Conati, Professor in the Department of Computer Science, University of British Columbia. Cristina is interested in integrating research in Artificial Intelligence, Human-Computer Interaction, and Cognitive Science to create intelligent user interfaces that can effectively and reliably adapt to the needs of each user.

“As digital information continues to accumulate in our lives, information visualizations have become an increasingly relevant tool for discovering trends and shaping stories from this overabundance of data. Education is not an exception, with learner and teacher visualization dashboards being extensively investigated as new means to change pedagogy and learning. Visualizations are typically designed based on the data to be displayed and the tasks to be supported, but they follow a one-size-fits-all approach when it comes to users’ individual differences such as expertise, cognitive abilities, states and preferences. There is, however, mounting evidence that these characteristics can significantly influence user experience during information visualization tasks. These findings have triggered research on user-adaptive visualizations, i.e., visualizations that can track and adapt to relevant user characteristics and specific needs. In this talk, I will present results on which user individual differences can impact visualization processing, and on how these differences can be captured using predictive models based on eye-tracking data. I will also discuss how to leverage these models to provide personalized support that can improve the user’s experience with a visualization.”

Here is the link to slides, video and details of this keynote: https://latte-analytics.sydney.edu.au/keynotes/
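As a toy illustration of the general approach the keynote describes (not Professor Conati’s actual models), the sketch below trains a standard classifier to predict a hypothetical user characteristic from summary eye-tracking features. All feature names, values, and labels here are invented.

```python
# Toy sketch (not the keynote's actual models): predicting a user
# characteristic from summary eye-tracking features with a standard
# classifier. All features, values, and labels below are invented.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical per-user features: mean fixation duration (ms),
# fixations per second, mean saccade length (px).
X = np.array([
    [220, 3.1, 80], [340, 2.2, 55], [260, 2.9, 72], [310, 2.4, 60],
    [230, 3.3, 85], [330, 2.1, 50], [250, 3.0, 78], [320, 2.3, 58],
])
y = np.array([1, 0, 1, 0, 1, 0, 1, 0])  # e.g. high vs. low perceptual speed

# Cross-validated accuracy of the user-characteristic predictor.
scores = cross_val_score(RandomForestClassifier(random_state=0), X, y, cv=4)
print(scores.mean())
```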

Monday April 23, 2018: Sarah Perez

Sarah is a Data Scientist and Researcher at the UBC Centre for Teaching and Learning. Sarah has a background in data analysis and visualization from the field of bioinformatics and is applying that expertise to education research. Working with faculty members and researchers, she coordinates research projects that assess how the use of technologies or teaching methods affects student learning.

In this session, she will be presenting a project she has been working on and asking for some feedback:

“For the last few months I have been looking at what students do in virtual science labs, mostly focusing on the strategies they use to learn from them. You may have seen visualizations of students’ clickstream data at previous LAVA meetings. On Monday, I will show you some data on how students explore the interface and the physical phenomena we ask them to model. Since exploration is an important inquiry skill, I am hoping you can help me figure out how to assess how students explore. Get ready to play with some data and get inside students’ heads!”
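As a purely hypothetical example of the kind of metric the group might discuss (not Sarah’s actual analysis), one simple way to quantify exploration from clickstream data is to measure how many distinct controls a student touches and how evenly their clicks are spread across them. The control names below are invented.

```python
# Hypothetical sketch (not the actual analysis): two simple exploration
# metrics from clickstream data -- coverage of distinct interface
# controls, and the evenness of clicks across them.
from collections import Counter
from math import log

ALL_CONTROLS = {"voltage_slider", "resistor", "ammeter", "reset", "graph"}  # invented lab UI

def exploration_scores(clicks):
    """clicks: list of control names in the order the student clicked them."""
    counts = Counter(clicks)
    coverage = len(counts) / len(ALL_CONTROLS)  # breadth: fraction of controls tried
    total = sum(counts.values())
    # Shannon entropy of the click distribution: higher = more even exploration.
    entropy = -sum((c / total) * log(c / total, 2) for c in counts.values())
    return coverage, entropy

print(exploration_scores(
    ["voltage_slider", "voltage_slider", "ammeter", "graph", "voltage_slider"]
))
```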

Monday April 9, 2018: Craig Thompson

OnTask: A platform to provide timely, personalized, and actionable feedback to large cohorts

Craig Thompson is a Research Analyst working on the UBC Learning Analytics Project. In this presentation, Craig will provide an overview of the recent literature on mass-personalized feedback, and give a technical demo of OnTask. 

“OnTask is a tool that enables instructors to send targeted, customized messages based on the metrics they set for their courses. This allows instructors to provide students with timely, personal feedback that can scale to large courses. Feedback can range from targeted remediation suggestions for at-risk students to enrichment opportunities for high-achieving students.

OnTask is currently being pilot tested as part of the UBC Learning Analytics project.”
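To illustrate the general idea behind rule-based, mass-personalized feedback, here is a minimal sketch in the spirit of OnTask rather than its actual implementation: conditions over per-student metrics select which message fragments each student receives. All metrics, thresholds, and messages are placeholders.

```python
# Illustrative sketch of rule-based mass-personalized feedback, in the
# spirit of OnTask (not its actual implementation). Conditions over
# per-student metrics select the message fragments each student gets.
students = [  # placeholder per-student metrics, e.g. exported from the LMS
    {"name": "Alice", "quiz_avg": 92, "videos_watched": 10},
    {"name": "Bob",   "quiz_avg": 48, "videos_watched": 2},
]

# Each rule pairs a condition over a student's metrics with a fragment.
rules = [
    (lambda s: s["quiz_avg"] < 50,
     "Your quiz average is below 50%; consider visiting office hours."),
    (lambda s: s["videos_watched"] < 5,
     "You've watched few of the lecture videos; they cover exam material."),
    (lambda s: s["quiz_avg"] >= 90,
     "Great work so far! Here is an optional enrichment problem set."),
]

# Assemble and send (here, print) one personalized message per student.
for s in students:
    body = " ".join(msg for cond, msg in rules if cond(s))
    print(f"Dear {s['name']}, {body}\n")
```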
