Creating a computer-assisted assessment is two separate tasks for me: first, bending Moodle to my will to create the required questions; second, designing assessment tasks for learning. The second task requires the art and skill of a seasoned teacher, and while Gibbs & Simpson's (2005) conditions make it a bit easier, it remains extremely difficult in a small assessment task like this one.
My Moodle course, Developing a Personal Learning Network, will be an online course: a once-a-week Elluminate meeting, with the Moodle shell serving as a resource and communication hub. Target participants are teachers and administrators from primary and secondary education. The assessment is set up with a 750-minute time limit and may be taken twice, with the highest score recorded. I added graphics, embedded a video, and added links to websites in both questions and feedback. I also included one multiple-answer question.
The real power of a Learning Management System is its ability to deliver sufficient, timely, appropriate, and purposeful feedback. LMSs provide multiple locations for "guiding the choice of further instructional or learning activities to increase mastery" (Gibbs & Simpson, 2005) via feedback. With general feedback plus feedback for each correct and incorrect answer, I can guide students to the appropriate websites and page numbers for remediation. I had to be careful not to overdo it and bombard students with too much information. At the same time, adding quality feedback is quite cumbersome in Moodle's quiz creation. When creating multiple-choice questions there are so many places for feedback that the creation page for a single question requires scrolling. I found entering feedback into all those text boxes extremely tedious and simply stopped doing it after a few questions. Moodle needs a more efficient question-creation workflow.
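One partial workaround for those tedious web forms is Moodle's plain-text GIFT import format, which lets you write questions and feedback in any text editor and import them in bulk. The question below is invented purely for illustration; GIFT marks per-answer feedback with `#` and general feedback with `####`:

```text
// Hypothetical multiple-choice question in Moodle GIFT format
::PLN definition::What does "PLN" stand for in this course? {
    =Personal Learning Network   #Correct -- see the course glossary page.
    ~Public Library Network      #Not quite; review the week 1 reading.
    ~Personal Lesson Notes       #No; revisit the introductory video.
    ####General feedback shown to every student after answering.
}
```

A file of questions like this can be brought in via Question bank > Import > GIFT format, which is far faster than scrolling through the feedback boxes one question at a time.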
While feedback is one key condition in designing assessment tasks for learning, I must keep in mind that the task must also "engage students in productive learning activities" (Gibbs & Simpson, 2005), which I think is difficult given Moodle's question types. It is not impossible, but extra care must be taken before I use this type of formative assessment repeatedly. Moodle generates quick, easy marks and provides instant feedback for students, but that does not necessarily make it the right tool for the job. The multiple-choice questions I added to this course demanded very low-order thinking, requiring little more than memorization, which will not engage learners in critical assessment of their own learning. With a larger test bank and a feedback loop, however, it would reinforce key concepts more effectively.
Moodle's capability to embed objects, graphics, and links is more closely aligned with active learning. In many of my questions and feedback I added links, embedded video, and suggested websites for students to refer to or build from. My essay question required students to create a mind map of their Personal Learning Network and linked to an appropriate mind-mapping site. In another, I embedded a video and asked students to reflect on and apply what they had learned in that context. I think these interactive and multimedia objects engage and challenge the learner more than straight text.
One area of concern with my Moodle assessment is how the grade is calculated when non-auto-scored items are mixed with auto-scored items. Students receive a skewed score that is either too high or too low: the total will either include the essay questions, which have yet to be assessed, making the student's score appear low, or exclude the essay questions from the total, making the score seem high. Either way, the student must return to the assessment for feedback after already receiving a mixed-message score. Ensuring that feedback is "received, attended to and acted upon by the student" (Gibbs & Simpson, 2005) after that mixed message will be extremely difficult. If I were to deliver this assessment, I would remove the essay questions and turn them into an upload-files assignment, or ask students to respond in a forum so that we could use peer assessment and commenting.
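The arithmetic behind that skew can be made concrete. The numbers below are entirely hypothetical (an invented 10-question quiz), but they show how the two ways of totalling unmarked essays pull the provisional score in opposite directions:

```python
# Hypothetical quiz: 8 auto-scored questions worth 1 point each,
# plus 2 essay questions worth 5 points each (10 essay points total).
auto_points = 8    # student answered all 8 auto-scored questions correctly
essay_max = 10     # essays not yet hand-marked, so they contribute 0 points

# Option 1: count unmarked essays in the total -> score looks too low
including_essays = auto_points / (8 + essay_max) * 100   # 8/18

# Option 2: leave essays out of the total -> score looks too high
excluding_essays = auto_points / 8 * 100                 # 8/8

print(f"Including unmarked essays: {including_essays:.0f}%")  # 44%
print(f"Excluding essays:          {excluding_essays:.0f}%")  # 100%
```

Neither provisional number reflects the mark the student will eventually earn, which is exactly the mixed message described above.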
I think that for this to be an assessment for learning I would need to develop quite a large test bank, and I am not convinced this course needs one. While I can see creating a test bank for the plethora of computer jargon, I found a better solution in Hot Potatoes. I think the matching and flashcard activities will be far more effective than a graded Moodle quiz for my audience. The feedback will be timely, though not specific. The time on task, short and sweet, will orient students to the importance of the material. I believe that for my participants a "grade" will act as a deterrent, not a motivator. The Hot Potatoes activities fulfill the need for feedback without assigning a grade.
Assessment must match student and teacher styles. While I am getting a better understanding of how computer-assisted assessment can be used for learning, I more frequently see it used as assessment of very low-order thinking. This type of assessment needs careful planning and implementation, and it is not something I can see working for me right now. Last year I almost completely abolished testing in my classroom and replaced it with collaborative, inquiry-based learning projects. So while I can see how Moodle quizzing could work as a formative assessment tool, it does not mesh well with my teaching philosophy.
Gibbs, G., & Simpson, C. (2005). "Conditions under which assessment supports students' learning." Learning and Teaching in Higher Education. Accessed online 17 October 2010: http://www.open.ac.uk/fast/pdfs/Gibbs%20and%20Simpson%202004-05.pdf

Jenkins, M. (2004). "Unfulfilled Promise: formative assessment using computer-aided assessment." Learning and Teaching in Higher Education, 1, 67-80. Accessed online 17 October 2010: http://www.glos.ac.uk/shareddata/dms/2B72C8E5BCD42A03907A9E170D68CE25.pdf