Site Assessment
Reflection on Assessment Tools in Moodle
This week I created a formative review quiz in my Chemistry 12 Moodle course shell: see the Reaction Kinetics Self-Assessment Quiz (Topic 4).
My goal for this Chemistry 12 Moodle space is to create another layer of support for my students’ studies and to give them a different avenue for achieving their learning outcomes.
The features that I incorporated in my quiz:
Formative self-assessment (grades not formally collected or recorded)
This quiz is an invaluable self-assessment tool for my students to gauge their mastery of the concepts at the level and thoroughness of a typical summative assessment that meets the BC Ministry of Education curriculum requirements for Chemistry 12. It will give them real experience with questions that ask them to think, spanning knowledge, understanding and application, and higher mental processes. Gibbs and Simpson (2005) suggest that “tackling the assessed task engages students in productive learning activity of an appropriate kind” (p. 15).
Timing – limits / delays
I set a time limit to give students a sense of how long it should take them to answer these review questions adequately. This quiz is half the length of a normal, period-long summative test.
I also set a delay between the first and second attempts, and between later attempts, to encourage students to think about their responses rather than repeatedly clicking through the multiple choice or matching options in one sitting. Returning on another day to try again serves their purpose better than randomly clicking buttons.
Question display
I shuffled the responses within questions (to keep them ever-changing, for those photographic memories out there), but I did not shuffle the question order, since the quiz deliberately progresses by topic and moves from knowledge through understanding and application to higher mental processes.
Attempts
Students are allowed unlimited attempts, since this is a formative review aimed at helping them assess their level in the module so far. Each attempt does NOT build on the last. My reasoning is that I want them to work through the questions afresh the next time and see whether they would still choose the correct response, rather than having the correct response appear automatically.
For this same reason, I turned adaptive mode on, so that if students attempted a question and chose the wrong answer, they could try again right away, with my feedback as guidance.
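To summarize these configuration choices in one place, here is a minimal sketch expressed as a Python dictionary. The key names and the specific time values are my own approximations of Moodle's quiz options, not the exact option names from any particular Moodle version, so treat them as illustrative only.

# Rough sketch of the quiz configuration described above.
# Keys and values approximate Moodle's quiz settings from memory;
# the specific time values are illustrative, not the exact ones used.
quiz_settings = {
    "time_limit_minutes": 40,              # about half a period-long summative test
    "attempts_allowed": "unlimited",       # formative review, so no cap on attempts
    "each_attempt_builds_on_last": False,  # every attempt starts fresh
    "delay_between_first_and_second_attempt_hours": 24,  # discourage rapid re-clicking
    "delay_between_later_attempts_hours": 24,
    "shuffle_within_questions": True,      # responses are shuffled inside each question
    "shuffle_question_order": False,       # keep the topic and difficulty progression
    "adaptive_mode": True,                 # wrong answers can be retried right away, with feedback
    "grades_recorded": False,              # results guide feedback, not marks
}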
Why feedback was of utmost importance in my quiz creation
With adaptive mode turned on, I took the time to provide detailed feedback on every single multiple-choice response. If students choose an answer that is totally off track, the feedback gives them some guidance (not the answer!), directing them to the topic to review so that they can identify the correct response.
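To illustrate what this per-response feedback looks like when a question is prepared outside of Moodle, here is a minimal sketch that builds one multiple-choice question in Moodle's XML import format using Python. The question, answers, and feedback strings are invented examples rather than items from my actual quiz, and the element names should be checked against the Moodle XML format documentation for the version in use.

import xml.etree.ElementTree as ET

# Minimal sketch: one multiple-choice question in Moodle XML format, with
# guiding feedback attached to every response. The content is an invented
# example, not a question from the actual Reaction Kinetics quiz.
quiz = ET.Element("quiz")
question = ET.SubElement(quiz, "question", type="multichoice")

name = ET.SubElement(question, "name")
ET.SubElement(name, "text").text = "Effect of temperature on rate"

question_text = ET.SubElement(question, "questiontext", format="html")
ET.SubElement(question_text, "text").text = (
    "Why does raising the temperature increase the rate of a reaction?"
)

# Shuffle the responses within the question, as described above.
ET.SubElement(question, "shuffleanswers").text = "true"

# (answer text, fraction of the marks, feedback shown when that answer is chosen)
answers = [
    ("A greater fraction of collisions has energy above the activation energy.",
     "100", "Correct - think in terms of the kinetic energy distribution."),
    ("The activation energy of the reaction decreases.",
     "0", "Not quite - review what actually lowers activation energy (hint: catalysts)."),
    ("The concentration of the reactants increases.",
     "0", "Off track - revisit the difference between concentration and temperature effects."),
]

for answer_text, fraction, feedback_text in answers:
    answer = ET.SubElement(question, "answer", fraction=fraction, format="html")
    ET.SubElement(answer, "text").text = answer_text
    feedback = ET.SubElement(answer, "feedback")
    ET.SubElement(feedback, "text").text = feedback_text

# Write a file that Moodle's question import page should accept as Moodle XML.
ET.ElementTree(quiz).write("kinetics_question.xml", encoding="UTF-8", xml_declaration=True)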
I also took the time to populate the Overall Feedback fields with statements such as:
• “WOW! You really know your Reaction Kinetics!”
• “You have a good grasp of this material – but keep working on it!”
• “You aren’t ready for this quiz. Keep working on the unit content.”
The intent here is to let students know where they stand relative to the level of mastery required to move on from this material.
I think I have placed a great deal of importance on providing frequent and detailed feedback, as suggested by Gibbs and Simpson. Throughout my quiz, the “feedback is received by students while it still matters to them and in time for them to pay attention to further learning or receive further assistance” (Gibbs and Simpson, 2005, p. 18).
Feedback in the context of my quiz can be used to:
• “correct errors
• develop understanding through explanations
• generate more learning by suggesting further specific study tasks
• encourage students to continue studying”
(Gibbs and Simpson, 2005, p. 19).
Results
I think the results feature will be an invaluable tool, given that I will be using my quiz for formative assessment. I will be able to use it to create opportunities for dialogue, whether through guided discussions on particular topics or in-class discussions of the formative quiz results (or of preliminary attempts, especially if the results are poor). By collecting the results of students’ attempts, I can diagnose individual strengths and weaknesses, get a sense of how well my entire class is grasping the current content, see how many students are engaging with the topic, and decide how best to improve my teaching! Poor results or poor participation will allow me, before a summative assessment, to alter my f2f delivery accordingly: to improve student participation and motivation, to introduce the material in a different way, or perhaps to include more interactive online content and discussion to help students better process the information.
Security
I found it interesting that there are built-in security features you can enable, such as a full-screen pop-up with some JavaScript security, and Safe Exam Browser.
Although not applicable to my formative review quiz, I see that I could explore this option if I ever wanted to use the quiz feature for a summative assessment.
On the same note, it is important to make sure that the browser’s autofill feature is disabled. I have attempted my quiz so many times while troubleshooting and reviewing that the correct responses now fill the response fields in for me automatically. If I were to let a student sit at my computer, they would surely have no difficulty completing this quiz unless I turned autofill off.
Meeting the aims of this exercise for my application:
The items in this list of effects of formative assessment, based on Gagne (1977), thoroughly summarize my aims in creating this formative self-assessment quiz:
1. Reactivating or consolidating prerequisite skills or knowledge prior to introducing the new material
2. Focusing attention on important aspects of the subject
3. Encouraging active learning strategies
4. Giving students opportunities to practise skills and consolidate learning
5. Providing knowledge of results and corrective feedback
6. Helping students to monitor their own progress and develop skills of self-evaluation
7. Guiding the choice of further instructional or learning activities to increase mastery
8. Helping students to feel a sense of accomplishment.
(Crooks, 1988)
I also believe that this online assessment tool will provide opportunities for:
• repeatability
• student interest and motivation
• open access, which can encourage students to take responsibility for their own learning.
(Jenkins, 2004, p. 70)
Jenkins also points to the use of mock exams to improve students’ learning (p. 76).
The next step in implementing my Quiz:
My intention, when this quiz is used in a class, is to build in a discussion forum where students can discuss the results of various questions on the quiz. Research shows that submissions to online discussion areas encourage more reflective contributions (Anderson, 2008) and improve understanding of course material, resulting in improved responses on exams.
For the purposes of this assessment for 565, it was noted that only the first 3 multiple-choice, the first 3 matching, 2 short-answer, and 2 essay questions would be assessed. I had to edit my submission accordingly, removing, or moving lower down on the page, the questions that didn’t stand out with respect to embedded images or the creative feedback that I emphasized in most of my questions. In a “real” application of this quiz, though, my original question order makes more sense: it starts with introductory, one-word answers and progresses to detailed answers with diagrams.
Working through the E-learning toolkit:
I found the E-learning toolkit to be very useful in helping me navigate through the intricacies of this task.
My particular area of interest in the past few weeks has been the Learning Management System page, specifically the Moodle material, as Moodle is the LMS that I can most easily implement in my school district. I found the links to instructional videos, Moodle information pages, and discussion forums to be very valuable and productive resources for completing this task.
The page on Synchronous Communication Tools was also useful: I realized that I have used many of these applications before, and at the same time that I could put them to more productive use in my own teaching strategies. Wimba and Elluminate seem to be viable options that could be implemented in the Moodle course shell I am building.
I have been exposed to some digital video production in a previous MET course and surprised myself by really enjoying it. The toolkit page addressing production and post-production of digital images and videos was very interesting to me, now that I know the possibilities and my own capabilities in this area. In exploring the video-editing options listed, I have to admit that I may need to move beyond the applications currently in my personal toolkit in order to progress in this area.
Looking forward to the digital story assignment, I can see some further exploration in this area of the E-learning toolkit, and perhaps finally “giving in” to my husband with respect to purchasing some new editing software to “make my job a little easier” and to make those digital productions that I want to make but can’t achieve with my current applications.
Ongoing challenges – the actual creation of the quiz
I have been exposed to Moodle before, but this was my first time creating a quiz. I found the learning curve not SO steep as to be totally deflating, but knowing ahead of time to organize the quiz question bank (pre-typing the question text rather than composing it in Moodle) and to make a zip of the image files used in my quiz would have made the process less time-consuming and less tedious than creating each question as its own entity from scratch.
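For anyone repeating this preparation workflow, a small script can take some of the tedium out of bundling the images. The sketch below simply gathers the quiz images into a single zip file for upload; the folder name, archive name, and file extensions are placeholders, not the actual files from my course.

import zipfile
from pathlib import Path

# Minimal sketch: bundle the quiz images into one zip file for upload to Moodle.
# "quiz_images" and "reaction_kinetics_images.zip" are placeholder names.
image_folder = Path("quiz_images")
archive_name = "reaction_kinetics_images.zip"

with zipfile.ZipFile(archive_name, "w", compression=zipfile.ZIP_DEFLATED) as archive:
    for image in sorted(image_folder.glob("*")):
        if image.suffix.lower() in {".png", ".jpg", ".jpeg", ".gif"}:
            # Store each image at the top level of the archive.
            archive.write(image, arcname=image.name)

print(f"Packed {archive_name}, ready to upload to the course files area.")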
I still had some trouble with the positioning of certain images; sometimes I would populate a feedback field with content and that content would not appear, and font formatting seems to have taken on rules of its own. But these are things that will come with time, and I am very pleased with how interactive, relevant, and diverse a quiz I was able to create, one that is 100% applicable to my course and its students.
References
Anderson, T. (2008). Teaching in an online learning context. In T. Anderson & F. Elloumi (Eds.), Theory and practice of online learning. Athabasca University. Retrieved 3 March 2009 from http://www.aupress.ca/books/120146/ebook/14_Anderson_2008_Anderson-DeliveryQualitySupport.pdf
Chickering, A. W., & Ehrmann, S. C. (1996). Implementing the seven principles: Technology as lever. American Association for Higher Education Bulletin, 49(2), 3-6. http://www.aahea.org/bulletins/articles/sevenprinciples.htm
Crooks, T. J. (1988). The impact of classroom evaluation practices on students. Review of Educational Research, 58(4), 438-481.
Gibbs, G., & Simpson, C. (2005). Conditions under which assessment supports students’ learning. Learning and Teaching in Higher Education, 1. Retrieved 11 March 2009 from http://www.open.ac.uk/fast/pdfs/Gibbs%20and%20Simpson%202004-05.pdf
Jenkins, M. (2004). Unfulfilled promise: Formative assessment using computer-aided assessment. Learning and Teaching in Higher Education, 1, 67-80. Retrieved 17 March 2009 from http://www.glos.ac.uk/shareddata/dms/2B72C8E5BCD42A03907A9E170D68CE25.pdf