Assessment

ETEC 565. My Experience Designing Assignment #4: Assessment Tools

Moodle Course: Climate Change is an enrichment course for high school students enrolled in Grades 10–12 Experiential Science in the NWT. The assessment tool I designed is the course PRETEST, which is linked to the Course Introduction and Outline page.

I must say that I have a greater appreciation of the extremely detailed and organized work that is required to design a course that has integrity, and activities that engage students in meaningful learning.

I had not intended to use any multiple-choice quizzes in my course until I read the requirements of this assignment. Through readings on the topic, I now see the value of this kind of quiz. It can engage students in learning without generating a lot of marking for the instructor, and periodic formative assessment can be motivating, provide a check-in, and give opportunities for feedback (Gibbs and Simpson, 2004, p. 8). With this in mind, I designed this assignment as the Pretest for the Climate Change course. In addition, each module will be followed by a mini quiz, and the course will end with a final Posttest. The quizzes are worth only 15%, but they will serve to focus students on some of the central concepts of the course and of each module.

I found that in order to do this assignment I had to conceptualize and create most of the course outline and content. Then I needed to consider the overall assessment strategy for the course. Since this is an enrichment course, I intended to concentrate formative assessment on ongoing collaborative group projects and discussion forums, with a final ‘capstone project’ (Ragan, Frezza, and Cannell, 2009). Given that overall vision, the requirements of this assignment could best be integrated as a pre- or post-test assessment. To gain some competence with Moodle, I watched many videos and asked questions of classmates with experience. Despite many helpful peers, I found that my specific questions were best answered by watching the Moodle site’s videos on setting up and designing online quizzes, because they let me focus on the exact issues I was facing.

I created this PRETEST as a way to review what students already know, bring their prior knowledge to mind, and at the same time focus attention on upcoming content and ideas in the course – useful for the instructor and the students! It puts the cognitive, teaching, and social presence right up front in the course (Anderson, 2008a, p. 345). At the end of the PRETEST I give students the opportunity to ask their own questions, to help them generate ideas about the issues and, ideally, set a learning goal for the course. Doing this assignment at this point caused me to rethink a number of issues about how I want to design the course (and I presume this was planned?) and what might be best for students. For example, I have decided to add a mini quiz at the end of each module, as mentioned, and I am now considering using these ‘student questions’ during introductions to initiate student interaction with content as they begin to form a learning community (Anderson, 2008b, p. 58). Alternatively, they could seed an introductory discussion item – compare your course goals with those of your classmates, and have the group draw up a student-generated list of course goals right at the start.


I really appreciated the clarity of Gibbs and Simpson’s (2004) paper on how assessment can best support student learning. Their analysis aligns well with the constructivist approach to meaningful assessment that I wanted to use in my course. The paper reminded me of two things that have shaped how I have rethought assessment in my course. First, assessment should be regular throughout, chunking course content for meaningful feedback (p. 16); second, the assessment task should engage students in meaningful learning activities, orient them to significant concepts, and clarify expectations (p. 20). Again, the assignment has me rethinking the rationale for using quizzes throughout – as assessment for learning, for giving timely feedback, and for chunking and focusing content. The short-answer and essay formats not only give students practice writing in the language of the discipline, but also give the instructor an opportunity for immediate feedback and interaction with students (p. 16). I have also decided to integrate self- and peer-assessment with the module quizzes somehow; this will be explored further.


With this in mind, I tried to pick out central concepts from each module for the pretest, even though I have not quite completed the last two modules! I had to reformulate my questions several times to clarify the ideas, clarify the answers, or distinguish one concept from another question’s. It is difficult to ask clear questions! The quiz setup turned out to be fairly intuitive, but details like registering new students, previewing the quiz, and getting a mark generated were not as straightforward. There are also subtleties in how marks are assigned to questions and how hints are given. For the matching question I could have offered more than three alternatives, but as a probe into prior knowledge I felt it would be more useful for students to have three clear choices with subtle differences in their definitions and concepts, so that responding would clarify their ideas. I decided not to give hints in the pretest, since I am looking for prior knowledge, not prompting recall of information. The most problematic issue, which I still need to work on, is the particular format for specifying acceptable ‘words’ or ‘phrases’ for the short-answer questions. Several times I put in correct answers that were marked wrong! I am still learning this particular “language”!
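As I understand it, the “language” in question is Moodle’s short-answer matching, where each accepted answer can use an asterisk as a wildcard standing for any run of characters, and case sensitivity is a per-question setting. The sketch below is my own illustration of that idea in Python, not Moodle’s actual code; the pattern list and helper name are invented for the example, and it assumes case sensitivity is switched off:

```python
from fnmatch import fnmatch

# Hypothetical accepted-answer patterns for one short-answer question.
# "*" is a wildcard, as in Moodle's short-answer answers, so the last
# pattern accepts any response that contains the phrase anywhere.
ACCEPTED_PATTERNS = ["carbon dioxide", "CO2", "*carbon dioxide*"]

def is_accepted(response, patterns=ACCEPTED_PATTERNS):
    """Return True if the response matches any accepted pattern.
    Both sides are lower-cased, modelling a question with the
    case-sensitivity option turned off."""
    cleaned = response.strip().lower()
    return any(fnmatch(cleaned, p.lower()) for p in patterns)

print(is_accepted("Carbon Dioxide"))        # True: exact match, case folded
print(is_accepted("It is carbon dioxide"))  # True: caught by the wildcard
print(is_accepted("carbondioxide"))         # False: no pattern matches
```

Seeing the matching laid out this way explains my “correct answers marked wrong”: without a leading and trailing wildcard, a student who writes a full sentence around the right phrase gets no credit for an exact-text answer.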


I have found the learning curve for designing in Moodle very steep, especially with respect to HTML, and I am still working my way up that exponential slope!! I did not need much HTML for this assignment, but I did need a good portion of my course designed and visualized in order to generate a meaningful assessment activity – after all, it should be tied to the course’s learning objectives, concepts, and expectations. I have set up the gradebook so that when a student takes the test the mark is available to them, and after I mark the essay questions those marks can be entered into the gradebook as well. Weightings for assignments still need to be entered, so that part of the gradebook setup is not complete yet.

As an exercise in self-reflection, this assignment has been very useful for putting in context all the technical skills I have been struggling with, and it supports Gibbs and Simpson’s finding that students focus on what they are assessed on – and that if the task is meaningful, learning follows. The assignment has forced me to take the time to explore Moodle more deeply, to analyze more clearly what I am trying to accomplish with assessment in the course, and to consider how to make the Pretest, like all the other assessments, an opportunity for meaningful learning.

For me this has been an intense exercise in self-directed learning, or intentional learning as Ragan et al. call it. The collaborative element, at this point, is no longer really in our MET course; it has expanded to the online Moodle and YouTube communities. I have renewed appreciation, and gratitude, for all those people who are so generous in sharing their time and expertise with everyone. Without this support I don’t know how one could learn all this material in the absence of f2f support, modeling, demonstration, guided practice, etc.! So this is what I am doing with the internet… apprenticing myself to some people on Moodle and YouTube.

My Climate Change Pretest page:

http://moodle.met.ubc.ca/mod/quiz/view.php?id=17025

Guest password: P@ssw0rd


References

Anderson, T. (2008a). Teaching in an Online Learning Context. In: Anderson, T. & Elloumi, F. (Eds.), Theory and practice of online learning. (pp. 343-365). Athabasca University. Retrieved from: http://www.aupress.ca/books/120146/ebook/14_Anderson_2008_Anderson-DeliveryQualitySupport.pdf

Anderson, T. (2008b). Towards a Theory of Online Learning. In: Anderson, T. & Elloumi, F. (Eds.), Theory and practice of online learning. (pp. 45-74). Athabasca University. Retrieved from: http://www.aupress.ca/books/120146/ebook/02_Anderson_2008_Anderson-Online_Learning.pdf

Gibbs, G. & Simpson, C. (2004). Conditions under which assessment supports students’ learning. Learning and Teaching in Higher Education, (1), 3-31. Retrieved from: http://www.open.ac.uk/fast/pdfs/Gibbs%20and%20Simpson%202004-05.pdf

Ragan, E. D., Frezza, S., & Cannell, J. (2009). Product-Based Learning in Software Engineering Education. Presented at The 39th ASEE/IEEE Frontiers in Education Conference, October 18 – 21, 2009, San Antonio, TX. Retrieved from:  http://fie-conference.org/fie2009/papers/1153.pdf
