*Note: this meeting was rescheduled from February 11th.*
Firas Moosvi is a Learning Scientist working on the UBC Learning Analytics Project. In this session, he will discuss the potential of Learning Analytics to surface classroom inequities.
“Though the field of learning analytics is new to me, I continue to be fascinated by its potential to transform teaching practice. For the first part of this LAVA meeting, I will present examples of how Learning Analytics has been used to surface trends of classroom inequities to instructors and institutions. Inequities can relate to gender, ethnicity, prior knowledge, and even personality. I have a couple of papers already, but if you know of any examples that you think are particularly striking, feel free to send them my way! In the second part, I hope to generate some discussion with the following prompt: “It is our responsibility as members of a university community to help highlight and address classroom inequities (where possible).” No prior readings; come by for a fun and (maybe) spirited discussion! As always, standard disclaimer: views and opinions are my own.”
Stoo is a PhD Candidate in Educational Psychology at the University of Wollongong, Australia, exploring the role that gestures play in cognition and learning. In this session, he will tap into multimodal learning analytics as a means to capture such physical interactions:
“To explore the role of gestures, a novel instrument for data collection was developed that captures the physical interactions learners have with multimedia learning materials, focusing on learning geometry. By leveraging the touch-based technologies in iPads, physical interactions with learning materials, as well as quantitative data related to those interactions, can be captured and parsed. This novel instrument has implications for the field of Multimodal Learning Analytics (MMLA), potentially opening the door to a new field of Embodied Learning Analytics (ELA), and presents an interesting path forward for the measurement of embodied data and how it may inform teaching and learning. This presentation will provide a brief theoretical overview, as well as a demo of the developed apps.”
Role of Learning Theories in the Use of Learning Analytics
Kazem is a PhD candidate in educational technology at Allameh Tabataba’i University (ATU) in Tehran, Iran, and currently a visiting PhD scholar at UBC. In this session, Kazem will discuss:
“In my presentation, I would like to talk about the role of learning theories in the use of learning analytics in education, and then I will describe my dissertation and the goals my supervisors and I hope to achieve in that research. My dissertation focuses on the development of an instructional design model based on constructivism in higher education, with a focus on learning analytics.”
We would like to invite you to the 4th UBC Learning Analytics Hackathon, which will take place on October 27-28, 2018. This event is organized by UBC Learning Analytics Pilot, LAVA and CAPICO.
Did you know that Canvas includes an API for accessing and modifying your learning data in your own programs and scripts? Have you ever wondered if you could build a Canvas app that improves your own learning? This fall, the hackathon will explore how the Canvas API can be used to improve student learning and experiences.
This event brings together students, researchers, faculty, staff, and any other interested individuals to get hands-on experience with analyzing and working with the Canvas API. During this two-day hackathon, participants will form teams, work with Canvas’s REST API, design and build apps and dashboards, and then show off what they accomplished at the end of the weekend with a brief presentation. Prizes and awards will be given out for interesting projects.
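To give a feel for what working with the Canvas REST API looks like, here is a minimal sketch in Python. Every Canvas endpoint lives under `/api/v1/` and is authorized with a personal access token; the instance URL and token below are placeholders, not real credentials:

```python
# Minimal sketch of an authenticated Canvas API request (placeholder host/token).
import urllib.parse
import urllib.request

CANVAS_URL = "https://canvas.ubc.ca"   # hypothetical instance URL
API_TOKEN = "YOUR_ACCESS_TOKEN"        # generate one under Account > Settings

def canvas_request(endpoint, params=None):
    """Build an authenticated GET request for a Canvas API endpoint."""
    query = "?" + urllib.parse.urlencode(params) if params else ""
    return urllib.request.Request(
        f"{CANVAS_URL}/api/v1/{endpoint}{query}",
        headers={"Authorization": f"Bearer {API_TOKEN}"},
    )

# For example, list the courses visible to the token's owner:
req = canvas_request("courses", {"per_page": 50})
# data = json.load(urllib.request.urlopen(req))  # needs `import json`; uncomment to call
```

From here, a team could feed the JSON responses into whatever app or dashboard they build over the weekend.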
Jeff Longland, Solutions Architect, and Scott Mcmillan, System Analyst II, are involved in developing the technical infrastructure for the two-year funded UBC Learning Analytics Project. In this session, they will give a summary of year one and their plans for year two.
“The Data Janitors return. Jeff and Scott will discuss their lessons learned during the first year of the Learning Analytics project and share the technical working group’s recommendations for year two.”
UBC Sauder’s Learning Services team provides teaching staff and faculty with services and tools that support the design, development, and delivery of rich learning experiences for students. In this session, the team will talk about a dashboard they have developed for monitoring module progress. From Alison Myers:
“We have developed a pilot dashboard that visualizes Module Progress in Canvas. When you build a module in Canvas, you have the option to add prerequisites and requirements that enforce a sequence of completion. However, the only built-in option for viewing progress is at the level of the individual student. Using the Canvas API and Tableau, we were able to create a dashboard that allows an instructor to see the overall progress of the class, with the option of viewing the progress of individual students. At LAVA, we will present the Tableau prototype and discuss the project from three roles: our Manager of Learning Ecosystems Support and Solutions, our Canvas Tech Rover, and our Research Analyst.”
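The class-level rollup at the heart of such a dashboard can be sketched as follows. This is my illustration, not the team's code: it assumes per-student module states (Canvas reports states such as "completed", "started", "unlocked", or "locked") have already been fetched via the API, and aggregates them into counts an instructor could chart:

```python
# Sketch: aggregate per-student Canvas module states into class-level counts.
from collections import Counter

def module_progress(per_student_states):
    """per_student_states: {student: {module_name: state}}.

    Returns {module_name: Counter of states across all students}.
    """
    progress = {}
    for states in per_student_states.values():
        for module, state in states.items():
            progress.setdefault(module, Counter())[state] += 1
    return progress

# Toy data standing in for API results:
states = {
    "alice": {"Week 1": "completed", "Week 2": "started"},
    "bob":   {"Week 1": "completed", "Week 2": "locked"},
}
print(module_progress(states)["Week 1"]["completed"])  # → 2
```

A table like this (modules as rows, state counts as columns) is exactly the shape a Tableau dashboard can consume directly.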
Application of Assessment Analytics to Improve Teaching and Learning: Why and How
Julie Wei is a Senior Research Analyst at the UBC Faculty of Arts. Julie is an interdisciplinary researcher, with graduate degrees in Educational Psychology (PhD, University of Illinois at Urbana-Champaign, 2016), Statistics (Master’s, UIUC, 2013), and Curriculum & Pedagogy (Master’s, Henan Normal University, 2005). Before moving to Canada, Julie taught at K-12 schools and universities for more than ten years. Julie’s research has focused primarily on how to effectively design and use assessment, evaluation, and learning analytics to improve teaching and learning.
In this session, Julie will discuss why and how assessment analytics has the potential to make a valuable contribution to the field of Learning Analytics by broadening its scope and increasing its usefulness.
“Assessment is a central element of the learning process: it not only defines what learners consider important and measures their achievement across all phases of the learning process, but also gives instructors and institutions valuable feedback about whether program goals and institutional objectives have been achieved. The benefits of assessment analytics would thus extend to all stakeholders involved in the educational process. To get maximal benefit from assessment analytics, assessments should be designed mindfully to collect finer-grained test data. In contrast to traditional methods of assessment that provide only a final score, a well-constructed diagnostic assessment has the potential to provide more detailed information about the specific areas in which learners do well or need to improve; it can thus collect, analyze, and report data about learners for the purpose of “understanding and optimizing learning and the environment in which it occurs” (Siemens, 2013). Diagnostic assessment is especially helpful for understanding complicated learning processes in content domains that are traditionally considered difficult. In this talk, I will share with the audience how assessment should be designed mindfully and revised iteratively, as well as how this granular information can help teachers, students, and schools make improvements.”
This session, we will be meeting in *DL-005* which is the classroom right across from our normal meeting place (DL011).
Justin Lee is a Programmer Analyst at the UBC Faculty of Land & Food Systems. In this session, Justin will invite us into his world and discuss some principles of programming.
“Composition is a style of programming that encourages us to break a complex problem down into smaller subproblems, build solutions to the subproblems, and compose the solutions together to solve the original problem. On Monday, I’ll demonstrate how I use composition as a powerful tool to reduce complexity (and bugs), and increase the expressiveness of code. I hope you’ll be able to use some of these ideas to help solve problems in the future!”
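As a minimal illustration of the idea (my own example, not Justin's code): a `compose` helper chains small single-purpose functions into a solution to a larger problem, here computing the average word length of a sentence:

```python
# Sketch of composition: solve subproblems, then glue the solutions together.
from functools import reduce

def compose(*fns):
    """Compose functions right-to-left: compose(f, g)(x) == f(g(x))."""
    return reduce(lambda f, g: lambda x: f(g(x)), fns)

# Three small, independently testable pieces:
words = str.split                        # sentence -> list of words
lengths = lambda ws: [len(w) for w in ws]  # words -> list of lengths
mean = lambda xs: sum(xs) / len(xs)        # lengths -> average

avg_word_length = compose(mean, lengths, words)
print(avg_word_length("learning analytics is fun"))  # (8+9+2+3)/4 = 5.5
```

Each piece can be tested and reused on its own, which is where the reduction in complexity and bugs comes from.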
Join us to review and discuss the state of Learning Analytics now compared to its early days.
We will be watching some of the videos from the LAK 2012 conference. Then we can talk about whether the concepts still apply, what progress has been made, and what changes have occurred in the last six years.
Three sessions of about 20 minutes each have been chosen for us to watch and discuss:
1. [LAK 2012] April 30: 2B – The Learning Analytics Cycle: Closing the loop effectively
2. [LAK 2012] April 30: 1B – Using an Instructional Expert…
3. [LAK 2012] May 1: 5B – Does the Length of Time Off-Task Matter?