Computer Mediated Assessment

565A Assignment #3: Quiz making

Designing assessments is no easy task, and more often than not computer-mediated assessment (CMA) seems to be viewed as little more than a new form of Scantron. In reality, CMA offers many new opportunities and provides teachers with countless ways to gauge whether their students are both engaged with the content and understanding it.

When reading the Jenkins article, I almost had to push down the knots in my stomach when I read that 80% of current assessment worldwide is in the form of exams, essays and reports. I would personally like to see a breakdown of how many higher education institutions use CMA, and how they are using it. It is almost infuriating to see products like Knewton offering fully adaptive learning solutions while post-secondary institutions are still using multiple-choice questions to conclude a unit.

The latest NMC Horizon report on higher education predicts that learning analytics is a near-term development that will become more prevalent by the end of 2014. I usually trust their assessment and hope that this prediction holds true, as we are at a crossroads in the implementation of e-learning solutions. Now that institutions have had several years to field test and explore these offerings, it is time they start considering how to use these tools more effectively.

When you consider the 1970s study that Gibbs et al. reference, it is hard to believe that assessment methodology isn't the first thing institutions discuss once they decide to move to online and blended learning platforms. The study by Snyder purported that the most influential aspect of learning was not teaching, but assessment.

Therefore it is imperative that we pay more attention to the ways in which we assess. I was blown away by the story in the Gibbs article of the student who, after shifting his efforts toward just passing, ended up with a 96%. Although he scored quite highly, he didn't necessarily understand much of the content.

This was not only shocking, but made me think twice about both the phrasing of my questions and what I include. It led me to embed a new video in the final essay question. I didn't want students simply responding to material they could memorize, so I instead asked them to review the video and then respond within a timed exam.

This format essentially removes any advantage for students who hunt for "cues," and pushes them to provide an honest response.

Personally, I found the Moodle quiz to be very robust and powerful. The number of features available really allowed me to explore different options. The ability to embed videos and graphics, as well as to place limitations on how students view and access the quiz, puts the control in the hands of the instructor. Although it is something small, requiring that students at least open and view the material before they can access the quiz was a great little feature that gave me peace of mind that my students would be fully aware of what was going to be on the assessment.

Although the assignment required that we ask a certain number of questions, I feel that if I were to do it again, I would spread them out over a few weeks. Reviewing the Jenkins article, I really liked the suggestion that more frequent or weekly quizzes are a great way for a facilitator to keep track of where each student is in the learning process.

Finally, I also wanted to highlight my experiments with SCORM packaging using the Articulate Storyline software as well as Camtasia. This was a phenomenal experience, and you can see the test quizzes in the last topic module. The possibilities with this software are endless: I was able to combine PowerPoints and screen recordings in whatever format and sequence I wanted. However, the most powerful component I discovered was the ability to produce videos with pop-up quizzes. Although this is something that has the potential to be misused or overhyped, I felt it was an amazing real-time evaluation tool that captured responses at precisely the moment you wanted as a facilitator. Instead of being added at the end of a culminating activity, it was placed in the middle, forcing the student to pause and reflect. I definitely see it as more of an activity for a fully online course, but feel that the power and capabilities of these programs provide facilitators with truly limitless horizons when it comes to CMA.

http://www.nmc.org/publications/2014-horizon-report-higher-ed
