Role of Learning Theories in the Use of Learning Analytics
Kazem is a PhD candidate in educational technology at Allameh Tabataba’i University (ATU), Tehran, Iran, and is currently a visiting PhD scholar at UBC. In this session, Kazem will discuss:
“In my presentation, I would like to talk about the role of learning theories in the use of learning analytics in education, and then I will explain my dissertation and the goals my supervisors and I hope to achieve in that research. My dissertation focuses on the development of an instructional design model, based on constructivist theory, for higher education with a focus on learning analytics.”
We would like to invite you to the 4th UBC Learning Analytics Hackathon, which will take place on October 27-28, 2018. This event is organized by UBC Learning Analytics Pilot, LAVA and CAPICO.
Did you know that Canvas includes an API for accessing and modifying your learning data in your own programs and scripts? Have you ever wondered if you could build a Canvas app that improves your own learning? This fall, the hackathon will explore how the Canvas API can be used to improve student learning and experiences.
This event brings together students, researchers, faculty, staff, and any other interested individuals to gain hands-on experience working with the Canvas API. During this two-day hackathon, participants will form teams, work with Canvas’s REST API, design and build apps and dashboards, and then show off what they accomplished at the end of the weekend with a brief presentation. Prizes and awards will be given out for interesting projects.
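For anyone planning ahead: authenticated requests to the Canvas REST API are plain HTTPS calls carrying a bearer token. The sketch below is illustrative only, not official hackathon material; the base URL and token are placeholder assumptions you would replace with your institution’s Canvas instance and a personal access token (generated under Account > Settings in Canvas).

```python
import json
import urllib.request

BASE_URL = "https://canvas.ubc.ca/api/v1"  # assumed instance URL -- use your own
TOKEN = "YOUR_ACCESS_TOKEN"  # placeholder personal access token

def auth_headers(token):
    # Canvas accepts an OAuth2 bearer token on every REST endpoint
    return {"Authorization": f"Bearer {token}"}

def get_json(path, token=TOKEN):
    # Minimal helper: GET a Canvas API path and parse the JSON response
    req = urllib.request.Request(BASE_URL + path, headers=auth_headers(token))
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Example usage (requires a valid token and network access):
#   for course in get_json("/courses?per_page=50"):
#       print(course["id"], course["name"])
```

From there, teams can explore any endpoint the token has access to; the hackathon is a good place to discover which data each role (student, TA, instructor) can actually see.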
Jeff Longland, Solutions Architect, and Scott Mcmillan, System Analyst II, are involved in developing the technical infrastructure for the two-year funded UBC Learning Analytics Project. In this session, they will give a summary of year one and their plans for year two.
“The Data Janitors return. Jeff and Scott will discuss their lessons learned during the first year of the Learning Analytics project and share the technical working group’s recommendations for year two.”
UBC Sauder’s Learning Services team provides teaching staff and faculty with services and tools that support the design, development, and delivery of rich learning experiences to students. In this session, the team will talk about a dashboard they have developed for monitoring module progress. From Alison Myers:
“We have developed a pilot dashboard that allows the visualization of Module Progress in Canvas. When you build a module in Canvas you have the option to add prerequisites and requirements that enforce a sequence of completion. However, the only option to view the progress of students is at the level of the individual student. Using the Canvas API and Tableau we were able to create a dashboard that would allow an instructor to see the overall progress of the class with the option of viewing the progress of individual students. At LAVA we will be presenting the Tableau prototype, as well as discussing the project from three roles: our Manager of Learning Ecosystems Support and Solutions, our Canvas Tech Rover, and our Research Analyst.”
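As a rough idea of the data wrangling behind such a dashboard (this is a sketch, not the Sauder team’s actual implementation): the Canvas “list modules” endpoint accepts a `student_id` parameter and returns a completion `state` for each module, which can be flattened per student and aggregated into a tidy CSV that Tableau reads directly. The record shape below is a hypothetical intermediate format.

```python
import csv
from collections import Counter

def module_state_counts(records):
    """Aggregate per-student module states into per-module counts.

    `records` is an iterable of dicts like
    {"student_id": ..., "module": ..., "state": "completed" | "started" | ...},
    e.g. flattened from GET /api/v1/courses/:id/modules?student_id=<sid>.
    """
    counts = {}
    for rec in records:
        counts.setdefault(rec["module"], Counter())[rec["state"]] += 1
    return counts

def write_tableau_csv(counts, path):
    # One row per (module, state) pair -- a tidy shape Tableau pivots easily
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["module", "state", "students"])
        for module, states in counts.items():
            for state, n in states.items():
                writer.writerow([module, state, n])
```

Keeping the aggregation separate from the CSV writer makes it easy to swap the output target, for example a Tableau extract or a database table, without touching the counting logic.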
Application of Assessment Analytics to Improve Teaching and Learning: Why and How
Julie Wei is a Senior Research Analyst at the UBC Faculty of Arts. Julie is an interdisciplinary researcher with graduate degrees in Educational Psychology (PhD, University of Illinois at Urbana-Champaign, 2016), Statistics (Master’s, UIUC, 2013), and Curriculum & Pedagogy (Master’s, Henan Normal University, 2005). Before moving to Canada, Julie taught at K-12 schools and universities for more than ten years. Julie’s research has focused primarily on how to effectively design and use assessment, evaluation, and learning analytics to improve teaching and learning.
In this session, Julie will discuss why and how assessment analytics has the potential to make a valuable contribution to the field of Learning Analytics by broadening its scope and increasing its usefulness.
“Assessment is a central element of the learning process: it not only defines what learners consider important and measures their achievement across all phases of the learning process, but also gives instructors and institutions valuable feedback about whether program goals and institutional objectives have been achieved. The benefits of assessment analytics therefore extend to all stakeholders involved in the educational process. To get the maximal benefit from assessment analytics, assessments should be designed mindfully to collect finer-grained test data. In contrast to traditional methods of assessment that provide only a final score, a well-constructed diagnostic assessment has the potential to provide more detailed information about the specific areas in which learners do well or need to improve; it can thus collect, analyze, and report data about learners for the purpose of “understanding and optimizing learning and the environment in which it occurs” (Siemens, 2013). Diagnostic assessment is especially helpful for understanding complicated learning processes in content domains traditionally considered difficult. In this talk, I will share with the audience how assessment should be designed mindfully and revised iteratively, as well as how this granular information can be used to help teachers, students, and schools make improvements.”
This session, we will be meeting in *DL-005* which is the classroom right across from our normal meeting place (DL011).
Justin Lee is a Programmer Analyst at UBC Faculty of Land & Food Systems. Last session Justin invited us to his world and discussed some principles of programming.
“Composition is a style of programming that encourages us to break a complex problem down into smaller subproblems, build solutions to the subproblems, and compose the solutions together to solve the original problem. On Monday, I’ll demonstrate how I use composition as a powerful tool to reduce complexity (and bugs), and increase the expressiveness of code. I hope you’ll be able to use some of these ideas to help solve problems in the future!”
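Justin’s own examples aren’t reproduced here, but a minimal Python illustration of the style he describes: small, single-purpose functions glued together by a generic `compose` helper (the helper and the example functions below are my own, for illustration only).

```python
from functools import reduce

def compose(*fns):
    """Compose functions right to left: compose(f, g)(x) == f(g(x))."""
    return reduce(lambda f, g: lambda x: f(g(x)), fns)

# Small, single-purpose solutions to subproblems...
def strip_punctuation(text):
    return "".join(ch for ch in text if ch.isalnum() or ch.isspace())

def words(text):
    return text.lower().split()

def unique_sorted(items):
    return sorted(set(items))

# ...composed into a solution to the original problem:
# extract the sorted vocabulary of a piece of text.
vocabulary = compose(unique_sorted, words, strip_punctuation)
```

Each piece can be tested and reasoned about in isolation, which is exactly the complexity-and-bug reduction the abstract describes: `vocabulary("Hello, hello world!")` yields `["hello", "world"]` without any one function knowing about the whole pipeline.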
Join us to review and discuss the state of Learning Analytics today compared to its early days.
We will be watching some of the videos from the LAK 2012 conference. Then we can talk about whether the concepts still apply, what progress has been made, or what changes have occurred in the last 6 years.
Three sessions of about 20 minutes each have been chosen for us to watch and discuss:
1. [LAK 2012] April 30: 2B – The Learning Analytics Cycle: Closing the loop effectively
2. [LAK 2012] April 30: 1B – Using an Instructional Expert…
3. [LAK 2012] May 1: 5B – Does the Length of Time Off-Task Matter?
I am a Senior Research Analyst at the UBC Faculty of Arts and work on learning analytics and academic analytics projects in that faculty. I stepped into the world of learning analytics when I started my Master’s degree at the School of Interactive Arts & Technology, SFU. My research focused on student-facing LA visualizations and dashboards. In this session, I will discuss how we can support students’ use of learning analytics in the classroom:
“Drawing on the literature, I will talk about some of the known challenges students face when using learning analytics, particularly around the interpretation of presented information and action taking.
Then I will discuss a proposed framework (from the literature) that addresses some of the identified challenges and integrates engagement with analytics as part of the larger teaching and learning activity to support its productive use.
I will show some examples of how the framework can be implemented.”
Data Visualization Critique: A Graphic Design Perspective (Session 2)
Emma Novotny, Senior Graphic Designer in the UBC Faculty of Arts, has agreed to come back and lead another Visualization Feedback session (which was really great last time)!
If you have a visualization that you think could be improved, that you would like the eye of a graphic designer on, or that you would otherwise like feedback from Emma about, please submit it to Alison before the end of day on May 25th. With the visualization, please include:
The context (what project is this for?)
What is the goal of the visualization?
Who is the audience?
How will it be used?
Any other detail you think is important
If you don’t have your own visualization to show but know of one that you would like to see improved, send that! Emma will choose 1-3 of the visualizations and spend time on the 4th talking about her thoughts on how to improve each visualization, what she might change, and other feedback.
Towards User-Centred Analytics: User-Adaptive Visualizations
In this session, we will watch and discuss one of the keynotes from the Learning Analytics and Knowledge Conference 2018, given by Cristina Conati, Professor in the Department of Computer Science, University of British Columbia. Cristina is interested in integrating research in Artificial Intelligence, Human-Computer Interaction, and Cognitive Science to create intelligent user interfaces that can effectively and reliably adapt to the needs of each user.
“As digital information continues to accumulate in our lives, information visualizations have become an increasingly relevant tool for discovering trends and shaping stories from this overabundance of data. Education is not an exception, with learner and teacher visualization dashboards being extensively investigated as new means to change pedagogy and learning. Visualizations are typically designed based on the data to be displayed and the tasks to be supported, but they follow a one-size-fits-all approach when it comes to users’ individual differences such as expertise, cognitive abilities, states, and preferences. There is, however, mounting evidence that these characteristics can significantly influence user experience during information visualization tasks. These findings have triggered research on user-adaptive visualizations, i.e., visualizations that can track and adapt to relevant user characteristics and specific needs. In this talk, I will present results on which user individual differences can impact visualization processing, and on how these differences can be captured using predictive models based on eye-tracking data. I will also discuss how to leverage these models to provide personalized support that can improve the user’s experience with a visualization.”