Working against the WeBWorK clock: What are the behaviour patterns of students who struggle to complete online calculus assignments?
Alain Prat is a Science Teaching and Learning Fellow in the Math Department at UBC. His research focuses on understanding and supporting the lowest performing first year calculus students. He writes:
“Since 2010, the math department at UBC has been gradually adopting the WeBWorK online homework system in most first and second year courses. Instructors typically give students several days to complete their WeBWorK assignments, and allow students several attempts at each problem. Despite this, many students struggle to complete their online assignments. In this talk, I’ll discuss how the timing of answer submissions recorded in WeBWorK log files can reveal the behaviour patterns of students who struggle with WeBWorK. In particular, students who don’t complete their WeBWorK assignments start them closer to the deadline, have shorter login sessions, and don’t persist as long once they encounter a problem they can’t solve. I’ll discuss what these observations can reveal about the mindset of struggling students, and how assignments could be restructured to help increase their completion rate.”
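The talk didn’t specify how these timing metrics are computed from the raw logs, but the three behaviours mentioned (late starts, short sessions, low persistence) can all be derived from per-student submission timestamps. The sketch below is purely illustrative: it assumes the log has already been parsed into a sorted list of submission times per student, and the 30-minute session gap is an arbitrary choice, not a value from the talk.

```python
from datetime import datetime, timedelta

def hours_before_deadline(first_submission, deadline):
    """How long before the deadline a student first attempted the assignment."""
    return (deadline - first_submission).total_seconds() / 3600.0

def session_lengths(submissions, gap=timedelta(minutes=30)):
    """Group a student's sorted submission timestamps into login sessions:
    a new session starts whenever two consecutive submissions are more
    than `gap` apart. Returns the duration of each session."""
    sessions = []
    start = prev = submissions[0]
    for t in submissions[1:]:
        if t - prev > gap:          # idle too long: close the current session
            sessions.append(prev - start)
            start = t
        prev = t
    sessions.append(prev - start)   # close the final session
    return sessions
```

Persistence could then be estimated similarly, e.g. as the length of the submission streak on a single problem before the student gives up without a correct answer.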
On Sep 25th, Abdel Azim Zumrawi (Statistician, UBC Centre for Teaching and Learning) and Leah Macfadyen (Program Director, Evaluation and Learning Analytics, UBC Faculty of Arts) spoke about the challenges of meaningfully capturing, summarizing, and presenting Student Evaluations of Teaching (SEoT) data at UBC.
Leah opened the session by talking about the history of SEoT at UBC. The UBC Senate has been considering student evaluations since 1974. In 2007, an updated policy, recommended by the Teaching and Learning Committee, was approved by the Senate; it requires every course section or learning experience to be evaluated by students every time it is offered (with some exceptions). For more information visit http://teacheval.ubc.ca/.
Based on this policy, a modular model is implemented at UBC: the student evaluation questionnaire includes university-wide questions as well as faculty- and department-specific ones. Most of these questions adopt a 5-point Likert scale to measure respondents’ agreement. The response categories are then translated into quantitative scores. Below is a visual representation of a Likert scale.
The original SEoT data is ordinal rather than ratio scale, meaning that the points are ordered along one spectrum but the distances between them are not known. This poses some challenges when summarizing and presenting SEoT data, as Abdel Azim pointed out. For instance, using the average to compare evaluations across individuals and units can be misleading. To demonstrate this point, Abdel Azim shared an example of six distributions of SEoT scores that all have the same average but clearly show very different patterns.
One would naturally think that a measure of variability is required to better describe and distinguish these patterns. Abdel Azim argued that the standard deviation is not an accurate measure of variability for ordinal SEoT data, and suggested instead adopting a simple, intuitive “dispersion index” suited to ordinal data. This index ranges from 0 (complete agreement) to 1 (a 50-50 split between the two extreme scores).
In addition to the dispersion index, Abdel Azim suggested looking at the “percent of favorable responses” (i.e., those rated 4 or 5) when summarizing SEoT data. Several years of data at UBC show that, overall, students tend to give instructors the higher ratings of 4 and 5, though the percentage can differ considerably from one course offering to another.
Revisiting the six distributions of SEoT data in the earlier example, Abdel Azim pointed out that while the averages are exactly the same, the dispersion index and the percent of favorable responses differ markedly from case to case. This underscores the need to adopt appropriate metrics for summarizing SEoT data.
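The talk didn’t give the exact formula for the dispersion index, but Leik’s (1966) measure of ordinal dispersion, based on cumulative proportions, has precisely the endpoints described: 0 when every respondent picks the same category and 1 for a 50-50 split between the two extremes. A minimal sketch of both metrics, assuming `counts` holds the number of responses for scores 1 through 5:

```python
def percent_favorable(counts):
    """Share of responses rated 4 or 5 on a 5-point scale.
    `counts` lists response counts for scores 1..5."""
    total = sum(counts)
    return 100.0 * (counts[3] + counts[4]) / total

def leik_dispersion(counts):
    """Leik's ordinal dispersion: 0 when all respondents choose the same
    category, 1 for a 50-50 split between the two extreme categories."""
    total = sum(counts)
    m = len(counts)
    cum, acc = 0.0, 0.0
    for c in counts[:-1]:   # the last cumulative proportion is always 1
        cum += c / total
        # each category contributes its distance from certainty (0 or 1)
        acc += cum if cum <= 0.5 else 1.0 - cum
    return 2.0 * acc / (m - 1)
```

For example, a section where every student answers 5 gets dispersion 0, while a section split evenly between 1s and 5s gets dispersion 1, even though other summaries (such as the mode count) might look similar.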
Abdel Azim explained that “response rate” is another factor that should be taken into account when analyzing SEoT data. Not all students in all classes choose to complete the evaluations, resulting in varying response rates. Extensive statistical analysis of UBC SEoT data has been carried out to determine the minimum response rates recommended for generating reliable score distributions at different class sizes, with scores classified as “favorable” or “unfavorable”.
Justin Lee (Programmer Analyst, UBC’s Faculty of Land and Food Systems) closed the session by sharing his visualization work that allows users to explore SEoT data for his faculty using the above metrics in one interactive dashboard.
Ian is a Learning Technology Specialist in UBC’s Faculty of Education, and will talk about the Mattermost tool: what it is, how it works, and whether we can get data out of it that might tell us anything useful about learning or engagement. He writes:
“Mattermost is an open source communication tool that facilitates communication and collaboration in a chat-type environment. You could call it an open source and UBC-hosted Slack alternative. I’ll be going over the pilot so far, how Mattermost was selected, how it is currently being used in Education, followed by a hands-on demonstration and then opening things up for discussion. That discussion might include analytics potential as well as whether it would be useful for the LAVA group to connect between meetings.
A PhD student from the Department of Language and Literacy Education will be joining me, as he’s interested in using Mattermost as part of a study on team collaboration tools for language learning. I’m hoping he will be willing to share a little about his research.”
Ahead of the meeting, Ian would like to encourage people to register for the Mattermost LAVA group. This was created a few months back in conversation with Leah. It’s just an experiment for now but who knows!
I’ll be discussing my work at the Visual Cognition Lab under Dr. Ron Rensink, where my team and I study the perception of correlation in visualizations. The purpose of this is twofold. Using visualizations as stimuli can help us understand how the visual system extracts statistical information from scenes. Conversely, this understanding can lead to better visualizations by giving us rigorous ways to measure the effectiveness of a design.
For example, consider the pair of graphs below, each representing an identical set of age and height measurements for a group of individuals.
The graph on the left is clearly superior, revealing relationships that are invisible in the graph on the right. But we don’t really know why. As designers of visualizations, the best we can do right now is appeal to our intuition, to the “best practices” identified by our colleagues, or to the results of field studies.
While these methods may have worked well enough so far, they may not scale well as visualizations become increasingly complex and high-dimensional. In my presentation, I’ll show how our research can eventually let us develop methods to judge visualizations from first principles.
Meetings have moved to a new time (every other Monday, 3pm) this term.
Note the temporarily different meeting space on Feb 1st: DL-011 (the boardroom in the Sauder building where we have met in previous terms).
This first meeting will be a planning and brainstorming session. Please send Alison 2-3 slides that a) introduce yourself and your work and b) propose a presentation/demo/workshop/paper/talk that you would be willing to give to the group. We will then spend time on the 1st going through the slides and creating a schedule for the coming meetings.
If you can’t make the meeting, please still send the slides along with a little blurb and I will share with the group.
Rama has been working on building an Evaluation Visualizer, “EvalVis”, which gives an overview of some of the ISoTL projects going on at UBC (http://isotl.ctlt.ubc.ca/). “EvalVis” is an interactive visual interface that will show innovation projects, areas of impact, and evaluation approaches. Rama will be showing the in-progress version of the tool, as well as discussing some of the challenges of the project so far.
Work in Progress: Development of an app to visualize a learner’s own learning data
In this session, final year COGS student Valerie Wyns will give a ‘work in progress’ presentation on her project to develop an app, ‘modusloci’, that will allow learners to visually analyze their own ‘learning data’ (e.g. school notes). This development project builds on the hypothesis that if learners can visually make connections between sources, subjects, and topics (from the particular to the general), it will offer them a new perspective on the meta-system in which their knowledge resides, and allow them to understand the material more deeply. Valerie will offer more details of the logic of her project and will explain her plan to visualize both a data map of a learner’s input data and patterns of the learner’s habits. In particular, she will concentrate on the data mapping function, asking: What aspects of data are salient in a meta-system way? How can she create a platform that is playful, fun, and ultimately useful to the end user?
Visiting scholar and HCI pioneer Dr. Ben Shneiderman will lead an informal workshop meant to teach use of EventFlow software – a tool developed by his team for temporal sequence analysis and visualization. Bring your laptop and your data!
ABSTRACT: Event analytics is rapidly emerging as a new topic for extracting insights from the growing set of temporal event sequences that come from medical histories, e-commerce patterns, social media log analysis, cybersecurity threats, sensor nets, online education, sports, etc. Our current work on EventFlow (www.cs.umd.edu/hcil/eventflow) supports analysis of point events (such as heart attacks or vaccinations) and interval events (such as medication episodes or long hospitalizations). In this hands-on session, Dr. Shneiderman will show how domain-specific knowledge and problem-specific insights can sharpen the analytic focus so as to enable more successful pattern and anomaly detection.
BEN SHNEIDERMAN (http://www.cs.umd.edu/~ben) is a Distinguished University Professor in the Department of Computer Science and Founding Director (1983-2000) of the Human-Computer Interaction Laboratory (http://www.cs.umd.edu/hcil/) at the University of Maryland. He is a Fellow of the AAAS, ACM, and IEEE, and a Member of the National Academy of Engineering, in recognition of his pioneering contributions to human-computer interaction and information visualization. His contributions include the direct manipulation concept, clickable web-link, touchscreen keyboards, dynamic query sliders for Spotfire, development of treemaps, innovative network visualization strategies for NodeXL, and temporal event sequence analysis for electronic health records.
Ben is the co-author with Catherine Plaisant of Designing the User Interface: Strategies for Effective Human-Computer Interaction (5th ed., 2010) http://www.awl.com/DTUI/. With Stu Card and Jock Mackinlay, he co-authored Readings in Information Visualization: Using Vision to Think (1999). His book Leonardo’s Laptop appeared in October 2002 (MIT Press) and won the IEEE book award for Distinguished Literary Contribution. His latest book, with Derek Hansen and Marc Smith, is Analyzing Social Media Networks with NodeXL (www.codeplex.com/nodexl, 2010).
Changing Students’ Learning Behaviour via Learning Analytics
Abstract: The three main audiences for learning analytics are institutions, instructors and course designers, and students. Our focus is on students. We take the stance that the goal of learning analytics for students should be not only to inform them about their performance, but also to influence learners to modify their learning behaviour and so lead to better learning outcomes. As the learning sciences show, students’ learning is heavily influenced by their individual differences. Our research aims to develop an understanding of how the information presented in, and the form of, learning analytics visualizations affects individual students. In this talk I will elaborate on these theoretical concepts and present results of our study showing the varying effects of learning analytics visualizations on students with different goals. Our findings highlight the methodological importance of considering individual differences and have important implications for the future design and study of learning analytics visualizations.
Dr. Marek Hatala is a Professor at the School of Interactive Arts and Technology at Simon Fraser University and Director of the Laboratory for Ontological Research. He received his PhD in Artificial Intelligence from the Technical University in Kosice (Slovakia). His research is driven by the problems arising between computing systems and their users. His prior interests include configuration engineering design, organizational learning, semantic interoperability, ontologies and the semantic web, user modeling in ubiquitous and ambient intelligence environments, and software engineering and service-oriented architectures. Dr. Hatala’s current research is framed within the area of learning analytics. Specifically, he builds on the learning sciences to establish theories of the effects of open learner models on learners’ motivation, with the goal of improving their learning outcomes in online learning environments.
An NLP-informed learning analytics approach for extracting and measuring aspects of argumentation
Venue: Buchanan C105C
This paper/presentation reports on work in progress and shares preliminary results from an attempt to use NLP-informed learning analytics methods to extract and measure aspects of students’ argumentation while they learn to think and argue like scientists. The approach explored in this paper caters to aspects of deep learning, detecting the flow of argumentation directly from the structure and composition of the language that students use in their writing. The model integrates insights from natural language processing techniques and argumentation theory so as to derive the metalinguistic features of argumentation directly from the linguistic units produced in students’ written language.