Assessment

Initially, I believed that this was going to be a relatively easy component within this course. I have made hundreds of quizzes, and I thought I would simply move one over to Moodle.  However, the Gibbs and Simpson (2005) reading had me re-thinking assessment.  Then I had to consider the fact that the students will be completing the quiz online.  This changes the nature of the questions that need to be asked, as simple factual statements could be looked up in a textbook or online.  It also changes the way I would use the quiz tool within Moodle for an online rendition of a provincially examinable course.  As I reflected on it, I realized that quizzes can be a powerful way for students to monitor their own learning of the key concepts and terms along the way, as Jenkins (2004) points out.  Thus, the quiz functions better as a formative assessment than as a major summative test.  However, adding short essays ties core concepts together through a synthesis of ideas, so there is some summative aspect to the quiz as well.

There were several conditions that Gibbs and Simpson (2005) outlined that I kept in mind.  The second condition deals with orienting students to those things which are most important.  Thus, I tried to focus my questions on the key concepts and overall learning targets rather than on basic factual questions.  I had to weigh this against the tension of preparing them for the final provincial exam, so as much as possible I tried to emulate the provincial exam’s question style and format.  I also kept in mind the third condition.  Gibbs and Simpson (2005) write, “probably the only way to solve problems is to solve problems” (p. 15).  Thus, for my essay questions I gave them problems to answer.  The questions are geared to get them thinking and analyzing.  The first question asks them to interpret a political cartoon and relate it to the ongoing Senate debate.  [A side note on the political cartoon: Graeme Mackay has given educators free use of his cartoons – http://mackaycartoons.net/about/]  For the second essay, students were asked to choose one of four topics related to ongoing issues in Canadian politics.

Working with the quiz tool was simple and frustrating at the same time.  I found the creation of the questions to be quite easy within the tool, but I struggled with the feedback aspect.  Feedback for the multiple choice and matching questions is fairly self-explanatory.  Thus, as Jenkins (2004) points out, multiple choice questions can provide regular feedback, and I can see how a regular quiz made of multiple choice questions can help reinforce key principles.  However, creating the possible answers for the short answer questions seems daunting.  Moreover, I rarely use short answer in my classes.  I believe that even though I have generated a list of possible responses, a student could phrase things slightly differently and still be correct, so I would have to go and manually check the answers.  I know this defeats some of the advantages of using the technology, but a worthwhile short answer question invites some variance of expression that a computer-based tool just cannot assess.
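
One partial workaround is the wildcard that Moodle’s short answer questions allow.  As a minimal sketch, here is a hypothetical question written in Moodle’s GIFT import format (the question and answers are invented for illustration, not taken from my quiz); the asterisk acts as a wildcard so that slight variations in phrasing still match:

    // Hypothetical short answer question in GIFT format.
    // "=" marks an accepted answer; "*" is a wildcard, so responses
    // like "the Senate" or "Senate of Canada" would also be accepted.
    ::Sober second thought::Which chamber of Parliament is known for
    providing "sober second thought"? {
        =Senate
        =*Senate*
    }

Even with wildcards, a correct response like “the upper house” would still be missed, which is why I would review the short answer responses manually regardless.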

It took me a while to figure out how to set up the quiz review settings so that the right amount of feedback could be shown without giving too much detail.  Given that I would need to mark the essay (and review the short answer responses), I wanted to release the full feedback only once the exam was marked.  After the exam, the students are able to see whether they got the multiple choice and matching questions right.  There is no overall mark, as a mark without the essay component is not particularly useful and may leave students wondering why they did so poorly!
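
For anyone setting up something similar, the relevant panel is the quiz’s Review options.  As a rough sketch of what I described above (the exact labels vary by Moodle version, and this is my approximation rather than a definitive recipe):

    Review options (approximate):
      Immediately after the attempt:  Whether correct
      After the quiz is closed:       Whether correct, Specific feedback,
                                      General feedback, Right answer
      Marks:                          left unchecked, then enabled by hand
                                      once the essays are marked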

With regard to the feedback I included, I kept in mind what Chickering and Gamson (1987) say about feedback: “students need appropriate feedback on performance to benefit from a course.”  Thus, in my overall feedback I suggested that students look at the unit self-assessment guide to see what they need to review.  I also encouraged students to bring their questions to the weekly chats, since within those forums I would be better equipped to deal with general issues; students can also contact me via email with specific inquiries.  Within the multiple choice section, I included brief explanations for each choice so that students could see the rationale for each response.  This also falls in line with Gibbs and Simpson’s fourth condition, that “feedback has to be quite specific to be useful” (p. 17).  Finally, I utilized the provincial exam specifications for marking the essays.  I believe that it is good for students to become accustomed to how the standardized test will be marked, so I included the marking grid in my feedback with the essay so that students will understand the score they received.  I would also include specific comments on their essays so that they know what areas they need to work on.
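
For those wondering how the per-choice explanations are entered, GIFT attaches feedback to an option with a “#”.  The following is a hypothetical item in the spirit of my questions, not one of the actual quiz items:

    // Hypothetical multiple choice question with feedback on each choice.
    // "=" marks the correct option, "~" marks a distractor,
    // and "#" attaches the explanation shown for that choice.
    ::Senate appointments::Who formally appoints Canadian senators? {
        =The Governor General, on the advice of the Prime Minister
            #Correct: the Prime Minister recommends, the Governor General appoints.
        ~The Prime Minister alone
            #Not quite: the PM recommends, but does not make the appointment.
        ~Voters in each province
            #Senators are appointed, not elected.
        ~The provincial premiers
            #Premiers have called for reform, but they do not appoint senators.
    }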

I made the quiz available for three days, with a cut-off on the Wednesday night, so that I could mark the essays prior to the regularly scheduled Thursday chat.  This would also give students time to log in to the course and see the feedback on the quiz prior to our chat.
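
In Moodle this corresponds to the Timing section of the quiz settings; the dates below are placeholders for the pattern I described rather than the actual schedule:

    Timing (placeholder dates):
      Open the quiz:   Monday, 8:00 AM
      Close the quiz:  Wednesday, 11:55 PM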

Overall, the greatest thing I have taken away from this activity is my own wrestling with how to better assess the students in my own classes.  I also have a better appreciation for the challenges that those who teach online courses face in doing assessment, and a better understanding of the different uses and applications of quizzes.  It was also a good reminder of how to phrase effective quiz and test questions.

References:

Chickering, A. W., & Gamson, Z. F. (1987). Seven principles for good practice in undergraduate education. American Association for Higher Education Bulletin, 39(7), 3-7. Retrieved from http://www.aahea.org/articles/sevenprinciples1987.htm

Gibbs, G., & Simpson, C. (2005). Conditions under which assessment supports students’ learning. Learning and Teaching in Higher Education, 1(1), 3-31. Retrieved from http://www.open.ac.uk/fast/pdfs/Gibbs%20and%20Simpson%202004-05.pdf

Jenkins, M. (2004). Unfulfilled promise: Formative assessment using computer-aided assessment. Learning and Teaching in Higher Education, 1(1), 67-80. Retrieved from http://insight.glos.ac.uk/tli/resources/lathe/documents/issue%201/articles/jenkins.pdf
