“Cracking the Code”:
A Reflection on Moodle Course Creation
I had previously wondered about the creation of online courses: what tasks and tools would one need? I imagined something similar to the drag-and-drop format currently found in certain web page builders, where you can customize to your heart’s content. I had never imagined the opportunities offered by Moodle.
I must admit that I had not expected to be greeted by a blank HTML page. I had never been granted the opportunity to simply create; I had always worked within the affordances of tightly closed systems. This opportunity was empowering, tantalizing, and downright scary. I sat trying to recall how to code: I had used Logo in the 1980s, coded in HTML for a week at university, and navigated around Joomla for a while, yet in each of these scenarios I was following specific guidelines. The blank page provided by Moodle left much to consider, much to do, and much to code. At first I was lost among the barrage of <> and </>, yet I was determined to crack the code. With the help of online resources such as W3Schools, I was off and coding. From there I was able to code every page of the course, delighted each time the code transformed itself on the screen. I then knew that I had at my disposal an amazingly complex language that would allow me to create an online course. Aside from creating the actual course, I was looking forward to experimenting with one of the main affordances of online courses: their ability to provide feedback and engage learners.
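To give a sense of the markup involved, here is a minimal sketch of the kind of HTML I found myself writing for a course page (the headings, text, and file names are illustrative, not taken from the actual course):

<!-- A simple course page: a heading, an introduction, and links to activities -->
<h2>Unit 1: Getting Started</h2>
<p>Welcome! In this unit you will explore the <strong>basics</strong> of the course.</p>
<ul>
  <li><a href="activity1.html">Activity 1</a></li>
  <li><a href="quiz1.html">Quiz 1</a></li>
</ul>

Each opening tag, such as <p>, pairs with a closing tag, such as </p>; once that pattern clicked, the barrage of symbols resolved into structure.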
Designing a Course: The Importance of Feedback
Providing feedback in a timely manner has always been tricky. I often found myself awkwardly trading quality for quantity, similar to the situation described by Gibbs, Simpson, Gravestock, and Hills (2005). The result was less than glamorous. I therefore wanted the sample assessment, and the site in general, to provide learners with relevant and timely feedback.
Badges
Before the MET, I assumed that badges were only suitable for younger grades. However, having tried them myself, I realized their potential at all grade levels. I was curious whether this potential would also be perceived by someone outside the field of education: my husband. After completing the Introductory Unit, I asked him about the feedback he received from obtaining the first badge. He said it “brought a sense of closure. As it leaves a trace, it is a record of what I had accomplished; it makes the activities seemingly more substantial, that they counted for something. (…) A list of checkmarks would not have had the same effect.”
Choice of Assessment
While creating the module, I constantly thought of the question I most often heard from my students: “Is this worth marks?” This experience mirrors the notion of assessment as a driving force for student learning (Bates, 2014; Gibbs et al., 2005). Instructors must ensure that students know where their grades are coming from. I thought about the concepts that would be covered in the course and the types of activities and assessments best suited to each, echoing the notion that learning theories should be applied to the tasks to which they are best suited (Prensky, 2003). The periodic table, for example, is often considered an ideal candidate for “traditional” testing and multiple-choice questions (MCQs), whereas skills such as evaluating environmental impacts are better assessed through other formats, such as case studies.
The amount of work behind the creation of online assessments quickly became apparent. These assessments not only needed to provide students with regular and comprehensive feedback to foster learning (Gibbs et al., 2005), but they also had to, in a certain way, stand alone: the instructor cannot step in to make adjustments or offer assistance should the need arise.
Based on the readings of Bates (2014), Gibbs et al. (2005), and Jenkins (2004), I compiled and refined a list of questions and steps that I considered relevant while creating assessments for an effective learning environment.
- What is the purpose of the assessment?
- What options/affordances from Moodle could I use to cater to this particular purpose (i.e., the type of assessment)?
- What is the appropriate length of assessment to meet this purpose?
- What concepts did I want to evaluate?
- What question format is best suited for the complexity of each concept?
- What do I want the students to get out of the feedback?
- Write the questions clearly and succinctly so that students understand them and do not get caught up in details you cannot clarify on the spot.
- Create a list of possible errors students might make
- Come up with useful and relevant feedback to reinforce correct answers and offer support for incorrect answers
In my context, formative assessments and feedback have been phased out in favour of summative assessments that serve the sole purpose of determining whether students have acquired the desired concepts. These assessments are corrected, and the class moves on to a new subject, never truly providing an opportunity to apply the feedback given. I wanted to create an assessment that would serve both as a way to test students’ current knowledge and as a source of constructive feedback that they would actually consider relevant. Above all, I wanted to create an assessment that would “improve and extend students’ learning” (Bates, 2014).
While browsing the assessment options in Moodle, I was met by a wave of potential, yet I feared that applying an unfamiliar assessment method in a haphazard manner would not give the desired result. I selected the standard quiz format because of its similarity to my current practices, which would ease its proper integration (Ertmer, 2005).
The quiz format in Moodle can easily house many different question formats, support the selected purpose of the assessment, and provide opportunities for immediate feedback. I chose to provide the feedback at the end of the quiz to reduce student stress mid-assessment, especially as it is a timed assessment. To ensure that the feedback is carefully considered by the students, an issue described by Gibbs et al. (2005), the quiz can be attempted twice. These options provide several advantages. First, by decreasing student stress, they leave students’ minds open to learning (Willis, 2011) and to receiving feedback. Second, allowing more than one attempt renders the feedback from the first attempt immediately relevant, addressing the sixth and ninth conditions identified by Gibbs et al. (2005). A 30-minute window between attempts gives the learner time to reflect on their result and consolidate their learning, leading to a higher chance of knowledge transfer (Anderson, 2008a). Finally, the better of the two scores on this particular assessment is kept, allowing the first attempt to serve as a formative assessment. For each automatically graded question, I wrote “remedial feedback” (Gibbs et al., 2005) for incorrect answers and constructive feedback that consolidates the knowledge of those who answered correctly. Furthermore, to diminish the possibility that students simply write down the correct answers in order, an issue akin to the plagiarism that often arises with computer-assisted assessment (Jenkins, 2004), the order of the questions and of the response items was randomized. Through these options I hoped to address the conditions necessary for effective assessment and feedback.
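For concreteness, here is a sketch of how this design maps onto Moodle’s standard quiz settings, based on my recollection of the options (exact labels and locations vary between Moodle versions):

- Timing → Time limit: set, since this is a timed assessment
- Grade → Attempts allowed: 2
- Grade → Grading method: Highest grade (the better of the two scores is kept)
- Timing → Enforced delay between 1st and 2nd attempts: 30 minutes
- Edit quiz → Shuffle questions: Yes (randomizes the question order)
- Question behaviour → Shuffle within questions: Yes (randomizes the response items)
- Review options: feedback shown after the attempt is submitted, not during the attempt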
Conclusion
One of the largest misconceptions I had regarding LMSs and online courses was that they had to stand alone in their entirety, like a webpage. An effective learning environment and online course should provide the scaffolding and the space for discourse and learning, not the entire construct as an immovable object. The main affordances of an LMS are enabled by the participants who inhabit its forums and interact with its content; it does not stand alone, it is an extension of the instructor and the students.
References
Anderson, T. (2008a). Towards a theory of online learning. Theory and practice of online learning, 2, 15-44.
Bates, T. (2014). Teaching in a digital age. Open Textbook.
Ertmer, P. A. (2005). Teacher pedagogical beliefs: The final frontier in our quest for technology integration? Educational Technology Research and Development, 53(4), 25-39. Retrieved from http://www.jstor.org.ezproxy.library.ubc.ca/stable/30221207
Gibbs, G., Simpson, C., Gravestock, P., & Hills, M. (2005). Conditions under which assessment supports students’ learning.
Jenkins, M. (2004). Unfulfilled promise: formative assessment using computer-aided assessment. Learning and Teaching in Higher Education, 1(1), 67-80.
Prensky, M. (2003). Digital game-based learning. Computers in Entertainment (CIE), 1(1), 21-21.
Willis, J. (Producer). (2011). Big Thinker: Judy Willis, Neurologist Turned Educator [Video]. Retrieved from https://www.youtube.com/watch?t=1&v=J6FqAiAbUFs