Monday Jan 15, 2018: Envisioning Hackathon 3.0

A few people have expressed interest in running a Hackathon this year, so at the next LAVA meeting we will discuss as a group what Hackathon 3.0 could look like. Alison is a Research Analyst at the UBC Sauder School of Business who helped organize Hackathon 2.0 at UBC last year. She writes:

“For those unaware, we have run two hackathons in the past. For the first, researchers brought their own datasets and questions; for the second, we worked with a single dataset (MOOC data) and a range of questions. Groups were formed, data was hacked, and results were presented. In the past we have used “Hackathon” somewhat more loosely than you may have seen elsewhere, in that we haven’t required those attending to have any previous experience.

We will quickly review what we have done in the past and then it will be an open discussion.

If you are interested in planning a Hackathon – what would you like it to look like? If you have participated in a LAVA Hackathon (or another) – what did you like, and what would you change?”


Monday Dec 4, 2017: Alison Myers

Power BI vs. Tableau (a totally biased review from a Tableau user)

Alison is a Research Analyst at UBC Sauder School of Business.  As part of her role, she is involved in a variety of data analytics and data visualization projects that require adopting different tools and techniques.  She writes:

“I have been recently exploring Power BI as an alternative data visualization tool and would like to share with the group. I’m happy for this to be more of a discussion than a presentation, so anyone with experience with either Tableau or (especially) Power BI please be prepared to chime in. In the comparison, I plan on:

1) Showing a quick demonstration of the two tools with a shared dataset

2) Highlighting some of the main functionality differences that I have noticed

3) Discussing which tool might better suit different needs

If anyone has any specific questions about functionality in either tool (i.e., “can you build X?”), please let me know and I will try to answer or bring it up for discussion.”

You can contact Alison via email.

Monday Nov 20, 2017: Leah Macfadyen

“Because it’s 2017”: Equipping educators and scholars for the learning analytics era

Leah Macfadyen is the Program Director of Evaluation & Learning Analytics in the UBC Faculty of Arts. As of next year, she will be moving to a new instructor position in the UBC Faculty of Education. As part of the interview process, Leah was asked to propose an outline for a course that she would develop for the Master of Educational Technology (MET) program.

“What do educators need to know about learning analytics in 2017? In September, as part of the interview process for my new position in the Faculty of Education, I was given instructions that I should plan to deliver “a 45-minute talk that provides an overview of a core course that I would develop and teach for the MET program, as well as how I see it fitting within the broader MET program.” You can learn more about the Masters in Educational Technology (MET) program at http://met.ubc.ca/. In this session, I’ll share with you the outline I developed and spoke about for a course in learning analytics, and explain the logic underlying my design ideas. I’ll be very pleased to gather feedback from all of you, as well as further ideas.”

Monday Nov 6, 2017: Craig Thompson

Learning Analytics @ The University of Saskatchewan: A Perspective

Craig Thompson is a Research Analyst working on the UBC Learning Analytics Project. He joined UBC in September, having previously worked at the University of Saskatchewan, where he spent 3.5 years developing Learning Analytics pilot projects.

In this presentation, Craig will present several tools developed and used at the University of Saskatchewan, including:

1) A personalized student messaging system for delivering automated, tailored advice

2) A dashboard for instructors to view aggregate demographics about the students in their courses

3) An interactive dashboard for administrators to explore the demographics and performance characteristics of students in their programs

4) Ribbon visualizations of student flows through academic programs (a tool developed at UC Davis)

Having first-hand experience with these pilot programs, Craig will also share lessons learned from the trenches of Learning Analytics.

Monday Oct 23, 2017: Alain Prat

Working against the WeBWorK clock: What are the behaviour patterns of students who struggle to complete online calculus assignments?

Alain Prat is a Science Teaching and Learning Fellow in the Math Department at UBC. His research focuses on understanding and supporting the lowest performing first year calculus students. He writes:

“Since 2010, the math department at UBC has been gradually adopting the WeBWorK online homework system in most first- and second-year courses. Instructors typically give students several days to complete their WeBWorK assignments, and allow students several attempts at each problem. Despite this, many students struggle to complete their online assignments. In this talk, I’ll discuss how the timing of answer submissions recorded in WeBWorK log files can reveal the behaviour patterns of students who struggle with WeBWorK. In particular, students who don’t complete their WeBWorK assignments start them closer to the deadline, have shorter login sessions, and don’t persist as long once they encounter a problem they can’t solve. I’ll discuss what these observations can reveal about the mindset of struggling students, and how assignments could be restructured to help increase their completion rate.”
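As a rough sketch of this kind of log analysis, the snippet below computes two of the signals Alain mentions: the hours between a student’s first submission and the deadline, and an approximate login-session count. The record layout, the 30-minute session gap, and all data here are illustrative assumptions; WeBWorK’s actual answer logs are structured differently.

```python
from datetime import datetime, timedelta

# Hypothetical, simplified records: each submission is ultimately
# (student, problem, timestamp, correct), whatever the real log format.
submissions = [
    {"student": "s1", "problem": 1, "time": datetime(2017, 10, 2, 14, 5), "correct": False},
    {"student": "s1", "problem": 1, "time": datetime(2017, 10, 2, 14, 9), "correct": True},
    {"student": "s2", "problem": 1, "time": datetime(2017, 10, 4, 22, 40), "correct": False},
    {"student": "s2", "problem": 1, "time": datetime(2017, 10, 4, 22, 43), "correct": False},
]
deadline = datetime(2017, 10, 5, 23, 59)
SESSION_GAP = timedelta(minutes=30)  # assumed gap that separates login sessions

def per_student_metrics(subs, deadline):
    """Hours before the deadline of the first attempt, and a rough session count."""
    metrics = {}
    for student in {s["student"] for s in subs}:
        times = sorted(s["time"] for s in subs if s["student"] == student)
        # A new session starts whenever consecutive submissions are far apart.
        sessions = 1 + sum(1 for a, b in zip(times, times[1:]) if b - a > SESSION_GAP)
        metrics[student] = {
            "hours_before_deadline": (deadline - times[0]).total_seconds() / 3600,
            "sessions": sessions,
        }
    return metrics

print(per_student_metrics(submissions, deadline))
```

Persistence after hitting an unsolvable problem could be measured similarly, e.g. by how long submissions continue after the last incorrect attempt on a problem.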

Monday Sep 25, 2017: Abdel Azim Zumrawi & Leah Macfadyen / SEoT Data (Summary)

On Sep 25th, Abdel Azim Zumrawi (Statistician, UBC Centre for Teaching and Learning) and Leah Macfadyen (Program Director, Evaluation and Learning Analytics, UBC Faculty of Arts) spoke about the challenges of meaningfully capturing, summarizing, and presenting Student Evaluations of Teaching (SEoT) data at UBC.

Leah opened the session by talking about the history of SEoT at UBC. The UBC Senate has been considering student evaluations since 1974. In 2007, the Senate approved an updated policy, recommended by its Teaching and Learning Committee, requiring every course section or learning experience to be evaluated by students each time it is offered (with some exceptions). For more information, visit http://teacheval.ubc.ca/.

Based on this policy, UBC implements a modular model in which the student evaluation questionnaire includes university-wide questions as well as faculty- and department-specific ones. Most of these questions use a 5-point Likert scale to measure respondents’ agreement, and the response categories are then translated into quantitative scores.

[Figure: a visual representation of a 5-point Likert scale. Note: images are not present in the original evaluation questionnaire.]

The original SEoT data is ordinal, not ratio-scaled: the points are ordered along a spectrum, but the distances between them are not known. As Abdel Azim pointed out, this poses challenges when summarizing and presenting SEoT data. For instance, using the average to compare evaluations across individuals and units can be misleading. To demonstrate the point, he shared an example of six distributions of SEoT scores that all have the same average but clearly show very different patterns.

One would naturally think that a measure of variability is needed to better describe and distinguish these patterns. Abdel Azim argued that the standard deviation is not an appropriate measure of variability for ordinal SEoT data, and instead suggested adopting a simple and intuitive “dispersion index” suited to ordinal data. Such an index ranges from 0 (complete agreement) to 1 (a 50-50 split between the two extreme scores).
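The summary above pins down the endpoints but not the formula. One standard measure with exactly those endpoints is the linear ordinal dispersion index of Blair and Lacy (2000); whether this is the specific index Abdel Azim proposed is an assumption here. A minimal sketch for a 5-point scale:

```python
def dispersion_index(counts):
    """Ordinal dispersion on [0, 1]: 0 = complete agreement,
    1 = a 50-50 split between the two extreme categories.

    counts: responses per category, lowest to highest, e.g. [n1, ..., n5].
    Assumption: this uses Blair & Lacy's (2000) linear measure; the talk
    did not specify the exact formula.
    """
    n = sum(counts)
    k = len(counts)
    cum, total = 0, 0.0
    # Sum F_i * (1 - F_i) over the cumulative proportions F_1 .. F_{k-1}.
    for c in counts[:-1]:
        cum += c
        f = cum / n
        total += f * (1 - f)
    # Normalize by the maximum (k - 1) / 4, attained at the extreme 50-50 split.
    return 4 * total / (k - 1)

print(dispersion_index([0, 0, 30, 0, 0]))   # 0.0 (complete agreement)
print(dispersion_index([15, 0, 0, 0, 15]))  # 1.0 (extreme 50-50 split)
```

The key design choice is working with cumulative proportions, which respects the ordering of the categories without assuming anything about the distances between them.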

In addition to the dispersion index, Abdel Azim suggested looking at the “percent of favorable responses” (i.e., those rated 4 or 5) when summarizing SEoT data. Several years of UBC data show that, overall, students tend to give instructors high ratings of 4 or 5; the percentage, however, can differ from one course offering to another.
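This metric is simpler to pin down, since it is defined directly as the share of responses rated 4 or 5:

```python
def percent_favorable(counts):
    """Percent of responses rated 4 or 5 on a 5-point scale.

    counts: responses per category from 1 (lowest) to 5 (highest).
    """
    return 100 * (counts[3] + counts[4]) / sum(counts)

print(percent_favorable([2, 3, 5, 10, 10]))  # 20 of 30 responses, ≈ 66.7
```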

Revisiting the six distributions of SEoT scores from the earlier example, Abdel Azim pointed out that while their averages are identical, both the dispersion index and the percent of favorable responses differ markedly from case to case. This underscores the need to adopt appropriate metrics when summarizing SEoT data.
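To make this concrete, here is a small demonstration that reuses the two sketch functions above on two invented 30-response distributions (not the actual six from the talk) that share the same average:

```python
def mean_score(counts):
    """Average score, treating the ordinal categories 1..5 as numbers."""
    return sum(score * c for score, c in enumerate(counts, start=1)) / sum(counts)

# Two invented 30-response sections with identical averages.
unanimous = [0, 0, 30, 0, 0]   # everyone answers 3
polarized = [15, 0, 0, 0, 15]  # half answer 1, half answer 5

# Assumes dispersion_index and percent_favorable from the sketches above.
for name, counts in [("unanimous", unanimous), ("polarized", polarized)]:
    print(name, mean_score(counts),
          round(dispersion_index(counts), 2),
          round(percent_favorable(counts), 1))
# unanimous 3.0 0.0  0.0
# polarized 3.0 1.0 50.0
```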

Abdel Azim explained that the response rate is another factor to take into account when analyzing SEoT data. Not all students in all classes choose to complete the evaluations, so response rates vary. Extensive statistical analysis of UBC SEoT data has been carried out to determine the minimum response rates needed to draw reliable inferences from score distributions across class sizes, with scores classified as “favorable” or “unfavorable”:

Zumrawi, A. A., Bates, S. P., & Schroeder, M. (2014). What response rates are needed to make reliable inferences from student evaluations of teaching? Educational Research and Evaluation: An International Journal on Theory and Practice, 20(7-8), 557-563.

Justin Lee (Programmer Analyst, UBC Faculty of Land and Food Systems) closed the session by sharing his visualization work: an interactive dashboard that lets users explore his Faculty’s SEoT data using the metrics above.
