Monday Jan 29, 2018: Emma Novotny

Data Visualization Critique: A Graphic Design Perspective

Emma Novotny is a Senior Graphic Designer at the UBC Faculty of Arts. Before starting at UBC, she worked in Toronto as a graphic designer in the Knowledge Translation Department at St. Michael’s Hospital, where she designed web- and print-based tools and infographics for patients and primary care providers. She has also worked at a small design studio called Tennis (formerly ALSO Collective), where she produced branding, editorial, and web projects for a range of clients in the healthcare, not-for-profit, and arts and culture sectors.

In this session, Emma will draw on her design expertise to critique data visualization projects. She will choose one to three of the visualizations submitted to her by LAVA folks and talk through how she would improve each one, what she might change, and other feedback. She will also use the time to share some general guidance on visualization.

Monday Jan 15, 2018: Envisioning Hackathon 3.0

A few people have expressed interest in running a Hackathon this year, so at the next LAVA meeting we will discuss as a group what Hackathon 3.0 could look like. Alison is a Research Analyst in the UBC Sauder School of Business who helped organize Hackathon 2.0 at UBC last year. She writes:

“For those unaware, we have run two hackathons in the past. In the first, researchers brought their own datasets and questions; in the second, we worked from a single dataset (MOOC data) with a range of questions. Groups were formed, data was hacked, and results were presented. In the past we have used “Hackathon” somewhat more loosely than you may have seen elsewhere, in that we haven’t actually required those attending to have any previous experience.

We will quickly review what we have done in the past and then it will be an open discussion.

If you are interested in planning a Hackathon – what would you like it to look like? If you have participated in a LAVA Hackathon (or another) – what did you like, and what would you change?”

 

Monday Dec 4, 2017: Alison Myers

Power BI vs. Tableau (a totally biased review from a Tableau user)

Alison is a Research Analyst at the UBC Sauder School of Business. As part of her role, she is involved in a variety of data analytics and data visualization projects that require adopting different tools and techniques. She writes:

“I have recently been exploring Power BI as an alternative data visualization tool and would like to share my impressions with the group. I’m happy for this to be more of a discussion than a presentation, so anyone with experience with either Tableau or (especially) Power BI, please be prepared to chime in. In the comparison, I plan on:

1) Showing a quick demonstration of the two tools with a shared dataset

2) Highlighting some of the main functionality differences that I have noticed

3) Discussing which tool might better suit different needs

If anyone has any specific questions about functionality in either tool (i.e., “can you build X?”), please let me know and I will try to answer or bring it up for discussion.”

You can contact Alison via email.

Monday Nov 20, 2017: Leah Macfadyen

“Because it’s 2017”: Equipping educators and scholars for the learning analytics era

Leah Macfadyen is the Program Director of Evaluation & Learning Analytics at the UBC Faculty of Arts. As of next year, she will be moving to a new instructor position in the UBC Faculty of Education. As part of the interview process, Leah was asked to suggest an outline for a course that she would develop for the Masters in Educational Technology (MET) program.

“What do educators need to know about learning analytics in 2017? In September, as part of the interview process for my new position in the Faculty of Education, I was given instructions that I should plan to deliver “a 45-minute talk that provides an overview of a core course that I would develop and teach for the MET program, as well as how I see it fitting within the broader MET program.” You can learn more about the Masters in Educational Technology (MET) program at http://met.ubc.ca/. In this session, I’ll share with you the outline I developed and spoke about for a course in learning analytics, and explain the underlying logic of my design ideas. I’ll be very pleased to gather feedback from all of you, as well as further ideas.”

Monday Nov 6, 2017: Craig Thompson

Learning Analytics @ The University of Saskatchewan: A Perspective

Craig Thompson is a Research Analyst working on the UBC Learning Analytics Project. He joined UBC in September, having spent the previous 3.5 years at the University of Saskatchewan developing Learning Analytics pilot projects.

In this presentation, Craig will walk through several tools developed and used at the University of Saskatchewan, including: (1) a personalized student messaging system for delivering automated, tailored advice; (2) a dashboard for instructors to view aggregate demographics about students in their courses; (3) an interactive dashboard for administrators to explore demographics and performance characteristics of students in their programs; and (4) ribbon visualizations of student flows through academic programs (a tool developed at UC Davis). Having first-hand experience with these pilot programs, Craig will also share lessons learned from the trenches of Learning Analytics.

Monday Oct 23, 2017: Alain Prat

Working against the WeBWorK clock: What are the behaviour patterns of students who struggle to complete online calculus assignments?

Alain Prat is a Science Teaching and Learning Fellow in the Math Department at UBC. His research focuses on understanding and supporting the lowest performing first year calculus students. He writes:

“Since 2010, the math department at UBC has been gradually adopting the WeBWorK online homework system in most first- and second-year courses. Instructors typically give students several days to complete their WeBWorK assignments, and allow students several attempts at each problem. Despite this, many students struggle to complete their online assignments. In this talk, I’ll discuss how the timing of answer submissions recorded in WeBWorK log files can reveal the behaviour patterns of students who struggle with WeBWorK. In particular, students who don’t complete their WeBWorK assignments start closer to the deadline, have shorter login sessions, and don’t persist as long once they encounter a problem they can’t solve. I’ll discuss what these observations can reveal about the mindset of struggling students, and how assignments could be restructured to help increase their completion rate.”
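Alain’s actual analysis wasn’t shared in this summary, but a minimal sketch conveys the kind of timing metrics he describes. It assumes a simplified log of (student, problem, timestamp) submission records and an arbitrary 30-minute idle gap to split login sessions; real WeBWorK logs carry more fields:

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical records: (student_id, problem_id, submission_time).
log = [
    ("s1", "p1", datetime(2017, 10, 2, 21, 15)),
    ("s1", "p1", datetime(2017, 10, 2, 21, 40)),
    ("s1", "p2", datetime(2017, 10, 2, 23, 5)),
    ("s2", "p1", datetime(2017, 10, 3, 22, 50)),
    ("s2", "p1", datetime(2017, 10, 3, 23, 55)),
]
deadline = datetime(2017, 10, 4, 0, 0)
SESSION_GAP = timedelta(minutes=30)  # assumed idle gap that separates two sessions

by_student = defaultdict(list)
for student, _problem, t in log:
    by_student[student].append(t)

for student, times in sorted(by_student.items()):
    times.sort()
    lead_time = deadline - times[0]  # how far before the deadline work began
    # Split the submission times into sessions at every large idle gap.
    sessions = [[times[0]]]
    for t in times[1:]:
        if t - sessions[-1][-1] > SESSION_GAP:
            sessions.append([t])
        else:
            sessions[-1].append(t)
    longest = max(s[-1] - s[0] for s in sessions)
    print(f"{student}: started {lead_time} before deadline, "
          f"{len(sessions)} session(s), longest {longest}")
```

Per-student lead time and session lengths like these are exactly the quantities that separate the completers from the strugglers in Alain’s description.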

Monday Sep 25, 2017: Abdel Azim Zumrawi & Leah Macfadyen / SEoT Data (Summary)

On Sep 25th, Abdel Azim Zumrawi (Statistician, UBC Centre for Teaching and Learning) and Leah Macfadyen (Program Director, Evaluation and Learning Analytics, UBC Faculty of Arts) spoke about the challenges of meaningfully capturing, summarizing and presenting Student Evaluations of Teaching (SEoT) data at UBC.

Leah opened the session by talking about the history of SEoT at UBC. The UBC Senate has been considering student evaluations since 1974. In 2007, the Senate approved an updated policy, recommended by the Teaching and Learning Committee, that requires every course section or learning experience to be evaluated by students every time it is offered (with some exceptions). For more information, visit http://teacheval.ubc.ca/.

Based on this policy, UBC has implemented a modular model in which the student evaluation questionnaire includes university-wide questions as well as faculty- and department-specific ones. Most of these questions use a 5-point Likert scale to measure respondents’ agreement, and the response categories are then translated into quantitative scores.

[Figure: a visual representation of a 5-point Likert scale]

Note: images are not present in the original evaluation questionnaire.

The original SEoT data is ordinal, not ratio-scale: the points are ordered along one spectrum, but the distances between them are not known. As Abdel Azim pointed out, this poses challenges when summarizing and presenting SEoT data. For instance, using the average to compare evaluations across individuals and units can be misleading. To demonstrate this point, Abdel Azim shared an example of six distributions of SEoT scores that all have the same average but clearly show very different patterns.

One would naturally think that a measure of variability is needed to better describe and distinguish these patterns. Abdel Azim argued that the standard deviation is not an accurate measure of variability for ordinal SEoT data, and instead suggested a simple, intuitive “dispersion index” suited to ordinal data. The index ranges from 0 (complete agreement) to 1.0 (a 50-50 split between the two extreme scores).

In addition to the dispersion index, Abdel Azim suggests looking at the “percent of favorable responses” (i.e., those rated 4 or 5) when summarizing SEoT data. Several years of UBC data show that, overall, students tend to give instructors the higher ratings of 4 and 5, although the percentage can differ from one course offering to another.
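The exact formula from the talk wasn’t given in this summary, but one standard index of ordinal variation matches the behaviour described above (0 for complete agreement, 1 for a 50-50 split between the extremes). Here is a minimal sketch computing both metrics for hypothetical 5-point distributions that all share an average of 3, echoing the same-average example:

```python
def dispersion_index(counts):
    """Dispersion of ordinal category counts (lowest to highest score).

    One common linear index of ordinal variation: average 4*F*(1-F)
    over the cumulative proportions F at the k-1 cut points.
    0 = complete agreement; 1 = a 50-50 split between the extremes.
    (The exact index from the talk was not specified; this one simply
    matches the 0-to-1 behaviour described above.)
    """
    n, k = sum(counts), len(counts)
    total, acc = 0, 0.0
    for c in counts[:-1]:
        total += c
        f = total / n
        acc += 4 * f * (1 - f)
    return acc / (k - 1)


def percent_favorable(counts):
    """Share of responses rated 4 or 5 on a 5-point scale."""
    return 100 * (counts[3] + counts[4]) / sum(counts)


# Hypothetical distributions (counts for scores 1..5), all with average 3:
examples = {
    "all threes": [0, 0, 100, 0, 0],
    "polarized":  [50, 0, 0, 0, 50],
    "uniform":    [20, 20, 20, 20, 20],
}
for name, counts in examples.items():
    print(f"{name:10s} dispersion={dispersion_index(counts):.2f}  "
          f"favorable={percent_favorable(counts):.0f}%")
```

Running this shows dispersion 0.00 / 1.00 / 0.80 and favorable 0% / 50% / 40% for the three cases: identical means, very different summaries.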

Revisiting the six distributions of SEoT data from the earlier example, Abdel Azim pointed out that while the averages are exactly the same, both the dispersion index and the percent of favorable responses differ markedly from case to case. This underscores the importance of adopting appropriate metrics for summarizing SEoT data.

Abdel Azim explained that “response rate” is one other factor that should be taken into account when analyzing SEoT data. Not all students in all classes choose to complete the evaluations, resulting in varying response rates. Extensive statistical analysis of UBC SEoT data has been done to determine the minimum response rates recommended for generating reliable score distributions across different class sizes, where scores were classified as “favorable” or “unfavorable”:

Zumrawi, A., Bates, S. P., & Schroeder, M. (2014). What response rates are needed to make reliable inferences from student evaluations of teaching? Educational Research and Evaluation: An International Journal on Theory and Practice, 20(7-8), 557-563.
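The paper’s own procedure is not reproduced here, but the textbook margin-of-error calculation for a proportion, with finite-population correction, already gives a feel for why smaller classes need higher response rates:

```python
import math

def margin_of_error(class_size, respondents, p=0.5, z=1.96):
    """Half-width of a 95% CI for a proportion, with finite-population
    correction. NOT the procedure from Zumrawi, Bates & Schroeder (2014);
    just the standard calculation illustrating why small classes need
    high response rates for a reliable favorable/unfavorable estimate."""
    n, N = respondents, class_size
    fpc = math.sqrt((N - n) / (N - 1)) if N > 1 else 0.0
    return z * math.sqrt(p * (1 - p) / n) * fpc

for size in (20, 60, 200):
    for rate in (0.3, 0.5, 0.8):
        n = max(1, round(size * rate))
        print(f"class={size:3d}  response rate={rate:.0%}  "
              f"margin of error=±{margin_of_error(size, n):.0%}")
```

A 30% response rate in a 200-student class yields a far tighter estimate than the same rate in a 20-student class, which is why the recommended minimum rate varies with class size.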

Justin Lee (Programmer Analyst, UBC Faculty of Land and Food Systems) closed the session by sharing his visualization work: an interactive dashboard that lets users explore his faculty’s SEoT data using the metrics above.

March 28th, 2017: Ian Linkletter / Mattermost

Note: Room change this week to Buchanan C105C

Ian is a Learning Technology Specialist in UBC’s Faculty of Education, and will talk about the Mattermost tool: what it is, how it works, and whether we can get data out of it that might tell us anything useful about learning or engagement. He writes:

“Mattermost is an open source communication tool that facilitates communication and collaboration in a chat-type environment. You could call it an open source, UBC-hosted Slack alternative. I’ll be going over the pilot so far, how Mattermost was selected, and how it is currently being used in Education, followed by a hands-on demonstration and then opening things up for discussion. That discussion might include analytics potential, as well as whether it would be useful for the LAVA group to connect between meetings.

A PhD student from the Department of Language and Literacy Education will be joining me, as he’s interested in using Mattermost as part of a study on team collaboration tools for language learning. I’m hoping he will be willing to share a little about his research.”

Ahead of the meeting, Ian would like to encourage people to register for the Mattermost LAVA group. This was created a few months back in conversation with Leah. It’s just an experiment for now but who knows!

  1. Register for the LAVA Mattermost group: https://mattermost.elearning.ubc.ca/signup_user_complete/?id=gqe8d991uj8oxjtzzg51ar39io
  2. Verify your email, then log in at https://mattermost.elearning.ubc.ca/lava.
  3. Check out the desktop/mobile apps: https://about.mattermost.com/download/#mattermostApps

Hackathon 2.0

In January 2017, the Institute for the Scholarship of Teaching and Learning and the Learning Analytics Visual Analytics (LAVA) group held Hackathon 2.0 at UBC. The two-day event brought together over 80 students, researchers, faculty and staff to explore educational data. Organizers welcomed participants with a range of expertise and encouraged beginners to sign up for the event.

“My interest in having a hackathon was to bring together people who either want to learn more about doing data analysis, know about data analysis but want to bring it into a new context, which is learning analytics, or don’t know anything about either of those things but are generally interested,” said Alison Myers, data analytics specialist at the UBC Sauder School of Business.

This year the hackathon began with a series of workshops on visual analytics, temporal data analysis and statistics using R. The idea was to support participants in expanding their knowledge base and taking the first steps in their analyses.

Hackathon participants were given data from two UBC Massive Open Online Courses: one course focused on Chinese philosophy, the other focused on the science behind climate change. Data included event-trace data, student demographics, discussion-forum posts, attitude surveys, and summative data, enabling a broad range of analytical approaches. Participants formed groups depending on their expertise and interests.

“I have done research on learning data in the past. Patrick is doing computer science so he knows more about coding. And Vesta has experience with visual analytics,” explained Mario Cimet, a student studying Cognitive Systems at UBC, about his team.

The event was an opportunity for like-minded people to meet and share their passion for data analysis. The hackathon was also aimed at raising the profile and visibility of learning analytics. Learning research data can give instructors feedback about their teaching approaches and resources, and how they’re working in their classrooms. It can inform departments about why certain classes are more popular than others and thus support planning at the program level.

“Learning analytics is using evidence about learners to improve the process [of teaching],” said Cimet. “I think it’s important because any decision that you make that is going to deal with their education, you should make with as much evidence as possible. You should do it based on facts.”


Here are three examples of what the participants were able to achieve during the hackathon weekend:

  • Video watching behaviour, showing how students skip forward (top) and backward (bottom): https://www.youtube.com/watch?v=QTROm-ImO7M&feature=youtu.be by Xueqin Zhang (Qin) and Matthew Fong
  • Course tree, where circle size shows either activity level across all learners, time spent per learner, or frequency of this being the last visited page of the course: http://link.landfood.ubc.ca/courseTree/ by Anh Nguyen, Shirley Lin and Justin Lee
  • Another course tree, where the width of a line represents the movement from node to node, the size of a circle is the number of unique learners, and the colour of a circle shows whether this was a student’s last activity in the course: http://static.useit.today/ubcxhack.html by Patrick Coleman, Mario Cimet and Vesta Sahatciu (a sketch of the kind of aggregation behind these encodings follows this list)
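The teams’ code isn’t linked above, so as a rough illustration only, here is a minimal sketch of the aggregation that could feed such a course tree: transition counts between pages (line width), unique learners per page (circle size), and last-visited pages (circle colour), computed from hypothetical event-trace rows:

```python
from collections import Counter, defaultdict

# Hypothetical event-trace rows: (learner_id, page_id), in time order per learner.
events = [
    ("a", "home"), ("a", "week1"), ("a", "quiz1"),
    ("b", "home"), ("b", "week1"), ("b", "video1"),
    ("c", "home"), ("c", "quiz1"),
]

transitions = Counter()               # line width: volume of movement between nodes
learners_per_node = defaultdict(set)  # circle size: unique learners per node
last_activity = Counter()             # circle colour: where learners stopped

previous = {}
for learner, page in events:
    learners_per_node[page].add(learner)
    if learner in previous and previous[learner] != page:
        transitions[(previous[learner], page)] += 1
    previous[learner] = page
for page in previous.values():
    last_activity[page] += 1

print(dict(transitions))
print({page: len(s) for page, s in learners_per_node.items()})
print(dict(last_activity))
```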

LAVA Hackathon: Bringing interdisciplinary skills together for incredible impact

Reflections on the hackathon experience by data presenter/scientist/instructor Dr. Megan Barker

I’d been looking forward to the hackathon for a few months, and it completely blew away my expectations! As a data presenter at the hackathon, my role was to share ideas and data from my collaborative project characterizing classroom practices in UBC biology – all in the hopes of tempting data-savvy hackers to play with the data for the weekend. In my research project team, we currently have pedagogical expertise but are sorely lacking skills in visual and data analysis. This hackathon was a perfect opportunity for us to share the dataset with analysts and students looking for real educational data to work with. The event was a smash success: we built teams, worked on real projects together, and had tangible successes by the end.

In our research, my colleagues and I ask the basic question:

[Image: research question about classroom learning]

We approach this by observing and collecting data from many classes in our department… Read the full article.