
Assessment Reflection
E-Portfolio Assignment #4

Date of entry: July 3, 2011

First of all, as I reflect on developing my first assessment quiz in Moodle, I am amazed at how question construction in itself can be such a worthy challenge! Who knew that coming up with practical, relevant questions and answers that were cohesive as a complete quiz would entail so much thought, not to mention the learning curve involved in understanding how to set it up in Moodle. I definitely found it helpful to do some research on question construction; several YouTube tutorials on creating different types of questions (multiple choice, matching, short answer and essay) and the Moodle Docs pages were also very helpful. Additionally, reading Chapter 6 on quizzes in Using Moodle: Teaching with the Popular Open Source Course Management System (2nd edition) by Jason Cole and Helen Foster was very informative and came in handy.

For my first assessment, I developed an information literacy quiz to gauge students’ current skills in locating, evaluating and effectively utilizing information. I added a time limit of 45 minutes and set the grading method to highest grade. Here are the components I created, for a total of 35 marks:

  • 3 multiple choice questions (worth 1 mark each)
  • 3 matching questions (worth 6, 6, and 8 marks respectively)
  • 2 short answer questions (worth 1 mark each)
  • 2 essay questions (worth 5 marks each)

I also utilized the auto-assess/grading feature and embedded images in 3 of my questions. I pre-programmed feedback for students with an overall performance comment and grade.

At one point, as I was attempting to add more questions to my quiz, I found that I couldn’t since a couple of my fellow MET students had attempts in progress. After consulting Moodle’s discussion forums and Cole & Foster’s book, I found that I had to remove their attempts before I could add more questions. After creating my quiz with the required components (3 multiple choice questions, 3 matching questions, 2 short answer questions and 2 essay questions), I tested it out using the preview mode. I found that the inclusion of a time limit really amped up the pressure to finish before the timer ran out, as the timer follows the student as they scroll down the page or click through to the next set of questions. Another unique feature was setting the quiz to adaptive mode, which offers students a “submit” button for each question; if their answer is incorrect, they have the opportunity to try again, with a 0.1 penalty applied to their score. For the question display option, I inserted page breaks since I found it a bit annoying to continuously scroll down the page; this broke my quiz into 3 pages which could be navigated with a mouse click. Additionally, I had a colleague test it out and was disappointed to see a low grade at the end of the quiz. This was due to the essay questions (each worth 5 marks for a total of 10), which I found would have to be manually assessed. Thus, I added a note to those questions letting students know that the essays would be manually graded and the marks added to the quiz afterwards.
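As I understand the adaptive-mode penalty, each incorrect try deducts the penalty fraction (here 0.1) of the question’s marks from what the student can still earn on that question. A minimal sketch of that arithmetic (my own illustration of the scheme, not Moodle’s actual code):

```python
def adaptive_mark(max_mark, penalty_fraction, wrong_attempts):
    """Mark earned once a question is answered correctly in adaptive mode.

    Each wrong attempt deducts penalty_fraction of the question's maximum
    mark; the result is clamped so it never goes below zero.
    """
    deduction = max_mark * penalty_fraction * wrong_attempts
    return max(0.0, max_mark - deduction)

# A 1-mark multiple choice question answered correctly on the third try:
# two wrong attempts deduct 0.1 marks each, leaving 0.8 of the 1 mark.
print(adaptive_mark(1.0, 0.1, 2))

# Many wrong attempts simply bottom out at zero rather than going negative:
print(adaptive_mark(1.0, 0.1, 20))
```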

Another aspect I found while testing my quiz was that even if no attempt was made to answer the questions, students would still receive my general feedback (which I had initially set to “Good attempt”). Thus, I went back and removed the general feedback, since students would already receive feedback for each question (correct or incorrect) as well as an overall percentage grade with a comment. Providing constructive and motivating feedback is one of the areas I identified as potentially problematic for online assessments: it’s difficult to offer unique feedback customized to each student so that they can determine their own learning path (i.e. identify areas they need to work on). However, I did like the option of unlimited attempts, which I opted for since this is an informal quiz and it allows students the freedom to try it again without any restrictions. I also appreciated that I could see a history of the responses students attempted before they got the answer correct. This helps me understand their thinking pattern so I can better assist them in their learning.

After reading some of the discussions going on in our course site, it triggered the thought that adding audio files to students’ online assessments would be beneficial, rather than just offering text-based commentary. I think this would be particularly helpful since comments (especially in an online medium) may lack clarity, be ambiguous, carry certain nuances and/or be interpreted differently than originally intended. One of the suggestions by Chris, a fellow classmate, was the use of NanoGong at: http://gong.ust.hk/nanogong/. I found that you can also upload and attach audio files to students’ work: https://www.youtube.com/watch?v=YoAxVOVnZxA. Another tool to consider would be the Wimba Voice tools in Moodle, which offer an array of applications (not limited to assessment); there is an overview in Moodle Docs at: http://docs.moodle.org/20/en/Wimba_Voice_Tools_module. I also found some articles discussing the merits of audio feedback at: http://clt.lse.ac.uk/voice-tools/ (in the recommended reading section). I’ll also be searching the ERIC database to find out the various uses and results of incorporating voice tools in an LMS.

I also found merit in using different types of assessments in Moodle to address students’ diverse learning styles. The auto-assess feature is useful as a timesaver for grading most questions, with the exception of essays. Yet for the short answers, I found that to take advantage of auto-assessment I had to pre-program a list of acceptable answers. For example, for my short answer question “What does the acronym EBM stand for?” I had to list both “evidence based medicine” and “evidence-based medicine” to anticipate the different ways students might phrase the answer. I also ensured that case sensitivity was set to “no, case is unimportant.” I particularly liked the shuffle-within-questions feature in Moodle, so that when students attempted the quiz again they couldn’t merely record or memorize the correct sequence of questions/answers. I did try the overall shuffle questions feature (which mixed the arrangement of all question types), but ended up reverting to my previous version since I had set up the quiz so that question types were grouped together (i.e. multiple choice, matching, etc.).
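The short-answer grading above boils down to literal string matching against the pre-programmed list, which is why every acceptable variant (hyphenated or not) must be listed while casing can be ignored via the case-sensitivity setting. A rough sketch of that matching logic (my own illustration, not Moodle’s code):

```python
# Acceptable answers pre-programmed for "What does the acronym EBM stand for?"
ACCEPTED = ["evidence based medicine", "evidence-based medicine"]

def grade_short_answer(response, accepted=ACCEPTED, case_sensitive=False):
    """Return 1 mark if the response matches any accepted answer, else 0.

    With case_sensitive=False (i.e. "no, case is unimportant"), both sides
    are lowercased before comparison; hyphenation differences are NOT
    normalized, which is why each variant must appear in the list.
    """
    answer = response.strip()
    if not case_sensitive:
        answer = answer.lower()
        accepted = [a.lower() for a in accepted]
    return 1 if answer in accepted else 0

print(grade_short_answer("Evidence-Based Medicine"))  # casing ignored: 1 mark
print(grade_short_answer("EBM expanded"))             # not in the list: 0 marks
```

This is also why an unlisted but reasonable variant (say, a stray extra space inside the phrase) would score 0, so the acceptable-answer list needs to anticipate the likely combinations.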

I also realized that students would need to be aware that their displayed grade would not reflect all of their submitted answers, since their essay marks would not be added until after I had manually assessed them. Thus, I added this information to the introduction section of my quiz. I also realized that the instructions for my essay questions should be clearer about the word limit (so that students know how much they should write); I revised my essay questions to limit responses to 500 words or less. Additional assessment strategies and tips that I’ll need to keep in mind for future reference are the effective quiz practices by Cole and Foster (2008), the assessment conditions that support students’ learning by Gibbs & Simpson (2005), as well as the references below.

Future considerations will include:

  • developing relevant feedback for students using Moodle
  • finding the appropriate balance of offering formative assessments without overwhelming healthcare staff
  • designing assessments that support and motivate student learning (which are also reliable)
  • designing assessments that test a range of students’ abilities and are relevant to their learning

Lastly, here’s a link to my Moodle site: http://moodle.met.ubc.ca/course/view.php?id=243 and quiz at: http://moodle.met.ubc.ca/mod/quiz/view.php?id=11570/

References
Chickering, A.W. & Gamson, Z.F. (1987). Seven principles for good practice in undergraduate education. American Association for Higher Education Bulletin, 39(7), 3-7. Retrieved from http://www.aahea.org/bulletins/articles/sevenprinciples1987.htm

Chickering, A.W. & Ehrmann, S.C. (1996). Implementing the seven principles: Technology as lever. American Association for Higher Education Bulletin, 49(2), 3-6. Retrieved from http://www.aahea.org/bulletins/articles/sevenprinciples.htm

Cole, J. & Foster, H. (2008). Using Moodle (2nd ed.). Sebastopol, CA: O’Reilly Media, Inc. Retrieved from http://docs.moodle.org/20/en/Using_Moodle_book

Gibbs, G. & Simpson, C. (2005). Conditions under which assessment supports students’ learning. Learning and Teaching in Higher Education, 1. Retrieved from http://www.open.ac.uk/fast/pdfs/Gibbs%20and%20Simpson%202004-05.pdf

Jenkins, M. (2004). Unfulfilled promise: Formative assessment using computer-aided assessment. Learning and Teaching in Higher Education, 1, 67-80. Retrieved from http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.97.1604&rep=rep1&type=pdf

The TLT Group. (n.d.). Seven principles: Collection of ideas for teaching and learning with technology. Retrieved from http://www.tltgroup.org/Seven/Library_TOC.htm
