Assessment Tools
May 16th, 2009 by Ed Leung
Reflection: Using Moodle’s Assessment Tools
I have never created a test within an LMS before. After exploring Moodle’s quiz-building features, I’d say I have mixed feelings: while the “quiz” function can be a powerful, time-efficient form of formative assessment, it also has some significant limitations that dampen its usefulness.
Background
I have created a pre-test in my Biology 11 Moodle course. The intention of the pre-test is to give students an opportunity to gauge their preparedness for the unit test, which will be written in class. In designing the pre-test, I have modified some material from course exercises and from actual test questions, to give students a relevant flavour of what to expect on the real unit test. In choosing the types of questions, I have tried to sample the broad scope of concepts covered in the taxonomy and evolution unit of my course. I have also tried to provide a variety of question types to prepare students in different ways: of the ten questions on the pre-test, three are multiple-choice, three are matching, and the remaining four are short-answer and essay questions (two of each type).
In setting the opening and closing dates of the quiz, I was mindful of the fact that most students do not begin studying early enough to truly absorb and understand course material. Hence, students are told that they must complete the pre-test a few days before the actual unit test; I believe this will push them to begin studying earlier. To stay consistent with the idea of using formative assessment to foster growth in students’ understanding without creating a sense of frustration, I have chosen NOT to give concrete marks for the pre-test – students receive a participation mark for completing it before the due date. To further reduce anxiety, I am also allowing students to attempt the pre-test twice. It is my hope that after their first attempt, students will reflect on the areas they found challenging and will study or seek help to overcome them.
Positives
- Moodle’s quiz function allows the instructor to build a bank of questions. When creating a new quiz, the instructor can draw on questions that have already been built. This lets an instructor either: a) create multiple tests for the different sections that he/she teaches, or b) create tests for different phases of a teaching unit – the latter would provide great insight into how students are progressing through the unit.
- Moodle offers a relatively easy way to put a picture into a test question. Instructors can use the “file” function on the main course screen to upload a collection of images to the Moodle site; then, as test questions are created, those pictures can easily be added.
- Gibbs and Simpson (2005) argue that it is important to support student learning by giving immediate feedback. This is one of the prominent features of Moodle’s quiz function: a student attempting a quiz can instantly see whether an answer is correct. If the quiz is used for summative assessment, the instructor can disable this feature so that students only learn their mark after submitting. Instant feedback also lets the instructor provide the correct answer and/or an explanation of the answer. This works particularly well for short answer questions, where a partial answer can prompt a screen that explains the question more thoroughly. A small sketch of this idea follows below.
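To make the instant-feedback idea concrete, here is a minimal sketch in Python. It is purely illustrative and is not Moodle’s code; the question, model answer, and feedback text are invented. It simply shows the pattern of pairing an automatic correctness check with an explanatory message, which is roughly what the quiz’s feedback fields let an instructor configure.

```python
# A minimal sketch of instant feedback on an auto-marked question.
# Illustrative only -- Moodle's real quiz engine is far more sophisticated.

def give_feedback(correct_answer, explanation, student_answer):
    """Return a feedback message the moment a student submits an answer."""
    if student_answer.strip().lower() == correct_answer.strip().lower():
        return "Correct! " + explanation
    return ("Not quite. The expected answer was '" + correct_answer + "'. "
            + explanation)

# Hypothetical question from the taxonomy unit: "Which kingdom do mushrooms belong to?"
print(give_feedback(
    correct_answer="Fungi",
    explanation="Mushrooms are heterotrophs with chitin in their cell walls.",
    student_answer="fungi",
))
```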
Challenges
- Objective Answers’ Nightmares: While the Moodle quiz works very well for multiple-choice, true-false, and matching questions, the same cannot be said of short-answer questions, where students type in their response. In my experience, a response to a short answer question can be marked wrong even when it uses almost all of the same words as the “standard” answer (e.g. “Because all of these primates have fur” is marked wrong when the standard answer is “Because all of them have fur”). See the first sketch after this list.
- Misleading Score: When short-answer and essay questions make up a significant portion of a Moodle quiz, a student’s reported score will be skewed substantially lower than reality. This is because Moodle does not have the ability to grade an essay-type question (which in itself is educationally sound); when calculating how a student has done on the quiz, it simply reports a mark of zero for any such question. I have placed a disclaimer in the test instructions, but the surprisingly low mark may still affect students’ motivation negatively. See the second sketch after this list.
- Scrambling Woes: Teachers who give multiple-choice tests generally welcome a feature that automatically scrambles (randomizes) the order of the choices. This makes cheating more difficult and may dissuade students from attempts at academic dishonesty. Logistically, however, the scrambling feature makes a variety of higher-level multiple-choice questions difficult to write. For example, I cannot give possible answers as choices “a” to “c”, then have choice “d” be “all of the above” and choice “e” be “none of the above.” Nor can I write a question where some choices combine earlier answers (e.g. choice “d” is “only a and c,” or choice “e” is “all of the above except c”). Moodle does not allow the instructor to turn the randomizing feature on for selected questions only, and this greatly decreases the feature’s usefulness. See the third sketch after this list.
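To illustrate the short-answer matching problem, here is a small sketch in Python. It is purely illustrative and is not how Moodle actually grades; the keyword-overlap threshold is an arbitrary assumption. It shows why an exact string comparison rejects a response that is nearly identical to the model answer, while a looser comparison based on shared words would accept it.

```python
# Sketch of strict vs. lenient short-answer matching (illustrative only;
# not Moodle's grading logic).

def exact_match(model, response):
    """Mark correct only if the response equals the model answer verbatim."""
    return response.strip().lower() == model.strip().lower()

def keyword_match(model, response, threshold=0.7):
    """Mark correct if most of the model answer's words appear in the response.
    The 0.7 threshold is an arbitrary assumption for this sketch."""
    model_words = set(model.lower().split())
    response_words = set(response.lower().split())
    overlap = len(model_words & response_words) / len(model_words)
    return overlap >= threshold

model = "Because all of them have fur"
student = "Because all of these primates have fur"

print(exact_match(model, student))    # False -- marked wrong, as on my pre-test
print(keyword_match(model, student))  # True  -- a more forgiving comparison
```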
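The misleading-score issue is easy to see with a bit of arithmetic. The sketch below assumes, purely for illustration, that each of the ten pre-test questions is worth one mark (which need not match a real mark scheme) and imagines a strong student whose essays are not auto-graded and whose wording on one short answer is rejected.

```python
# Sketch of how the displayed quiz score gets skewed (illustrative only;
# assumes one mark per question).

mc_and_matching = 6   # 3 multiple-choice + 3 matching, auto-graded reliably
short_answer = 2      # auto-graded, but near-miss wording is marked wrong
essay = 2             # not auto-gradable; Moodle reports these as zero

total = mc_and_matching + short_answer + essay   # 10 marks

# A strong student: every objective question right, solid written answers,
# but one short answer rejected for its wording and both essays unmarked.
earned = mc_and_matching + 1 + 0

print(f"Displayed score: {earned}/{total} = {earned / total:.0%}")  # 7/10 = 70%
```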
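Finally, the scrambling limitation can be seen in a few lines: an option such as “all of the above” is written on the assumption that it sits in a fixed position, so randomizing the order of the choices makes its wording meaningless. The choices below are invented for illustration.

```python
import random

# Sketch: why randomized choice order breaks positionally worded options
# such as "all of the above" (choices below are invented).

choices = [
    "Homologous structures",
    "Analogous structures",
    "Vestigial structures",
    "All of the above",     # written assuming it appears last...
]
random.shuffle(choices)     # ...but shuffling can place it anywhere

for position, choice in enumerate(choices, start=1):
    print(position, choice)  # "All of the above" may now come first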
Reference:
Gibbs, G. and Simpson, C. (2005). “Conditions under which assessment supports students’ learning.” Learning and Teaching in Higher Education. Accessed online 25 June 2009: http://www.open.ac.uk/fast/pdfs/Gibbs%20and%20Simpson%202004-05.pdf