The Process of Creating the Product

In working through the Assessment assignment I found myself struggling in many ways.  I questioned my own practices at times and patted myself on the back at others.  I welcomed the promise of easy marking on Moodle but questioned the validity of various marking mechanisms in the program.  I struggled with frustration regarding the constraints of the program, MET platform, and course expectations in trying to create a viable quiz that would represent meaningful, manageable assessment.

In this reflection on the assessment design process I will include the three major posts I created throughout the week, as they clearly map out my state of mind as I worked through the assignment expectations.  I will finish with a short commentary on both the process and the resulting product.

Post 1:  Online Assessment:  Challenges and Opportunities

As I look ahead to the next task of making sure I have appropriate assessment, I feel pulled between what I am used to doing (the familiar), what I would like to try (innovation), and the limitations/opportunities (forced parameters) within the Moodle LMS.

New format equals new mechanisms of assessment

While it seems a relatively direct task to put up student work that can be assessed directly by the teacher, I question how to make assessment meaningful in a computer-mediated format.  Assessment in my f2f foods and nutrition class is widely varied and is based on work in a variety of formats:

  • recipe website creation,
  • food advertisements,
  • cooking labs,
  • theory on cooking technique and food knowledge,
  • brochure on eating disorders,
  • etc.

While this is all still possible via an online course, I question the students’ ability to deal with the tech issues.  If they are creating something in digital format, will they become overwhelmed with the creation of the product over the demonstration of their learning?  How can I overcome this?  I want to ensure they have choices in both the format of the assignments (digital versus hand-created) and the submission medium (it becomes difficult to submit assignments digitally that were not created digitally) so I am better able to assess based on what they have learned in relation to the course material versus how tech-savvy they are, and avoid the pitfall outlined by Gibbs and Simpson:

“Much assessment simply fails to engage students with appropriate types of learning” and “Some assessment can mis-orient student effort” (p. 15)

The Foods and Nutrition 9 course, while offered online, is not a technology teaching course.  Without the continual presence of teacher support in creating projects such as brochures and websites, many students might feel overwhelmed by the tech aspect and be unable to budget a reasonable amount of time, potentially drowning in the workload.

Breaking out of imposed limitations

A second issue I am concerned about is the limitations on assessment within the assessment assignment for the course.  I am concerned that the assessment is question-and-answer based and may not provide students with adequate variety in outlets for representing their learning.  Jenkins argues that there are many avenues available for assessment in an online course format (MCQs [multiple choice questions], case studies, online portfolios, group discussion, weblogs, etc.) and that “focusing solely on MCQs limits the possibilities of how ICT can be used for formative assessment.” (p. 3)

I plan to incorporate the required elements into an exam/test assessment as required in our course outline but will also include other assessment throughout my Moodle course to ensure that:

  1. assessment is not overly reliant on tests
    • “Students tend to gain higher marks from coursework assignments than they do from examinations …[and] students also prefer course work… Higher average marks and student preference would not count for much if coursework were inherently less valid as an assessment-but it is not.” (Gibbs and Simpson, p.7)
  2. students can demonstrate their learning in a variety of formats which match the materials to be learned/explored while ensuring specific criteria are clearly outlined:
    • “Criteria need to be explicit and understood by students, and demonstrably used in forming grades… Students need to understand the criteria in order to orient themselves appropriately to the assignment.” (ibid, p. 20)

Feedback now please!

The possibility within Moodle of providing rapid feedback interests me in planning the Moodle course.  I like the idea of students being able to go through mini-tests in order to formatively assess their own learning prior to a summative exam at the end of a section.  The immediate feedback available from a quick multiple choice, matching, or short answer quiz or test is timely and will allow students to:

  1. Self assess for redirecting efforts:
    • “‘Knowing what you know and don’t know focuses learning.'” (Ibid, p. 16)
  2. Determine in theory sections which areas they need to refocus on:
    • “Feedback has to be quite specific to be useful.” (ibid, p. 17)
  3. Receive feedback in time to redirect their efforts more effectively and efficiently:
    • “If students do not receive feedback fast enough then they will have moved on to the new content and the feedback is irrelevant to their ongoing studies and is extremely unlikely to result in additional appropriate learning activity, directed by the feedback.” (ibid, p. 19)

Sorry for the VERY lengthy post… there is so much to think about with assessment: making sure the students’ efforts are a good use of their time, that we are assessing what we are teaching in a manner conducive to the subject matter and performance expectations, and that we don’t overload ourselves as teachers by trying to assess everything in the most thorough manner, leading to inevitable burnout!

I dare say that online assessments may help lighten the load in some areas to create more space for more meaningful assessments in others.

~~Caroline~~

Post 2: Moodle Frustration Anyone?

Is anyone else experiencing frustration with Moodle? (I laugh to myself as I ask this obvious question!) Here are some of my frustration points:

  1. The “short answer” questions are too precision based.  If the answers are not in the exact words, they are marked incorrect.  I had a friend of mine complete the test, and because he used different words he was marked 0/2 on a question on which he should have received 2/2.  How do you correct for this?  Or even allow for this?  Is it simply a matter of knowing that “short answer” actually means an exact answer of one or more words?
  2. The linking of Moodle to the UBC Blackboard site, along with the dormant-time limit.  At the point at which UBC recognizes you are no longer active, it boots you off.  It seems I get booted off Moodle at the exact same time.
  3. I would think automatic saving would be a given in any modern program.  When you do remember to save every sentence or two, you are redirected to a new page and then have to go back into the page you were working on.  This makes one avoid the saving process… with horrible consequences!
  4. Why can I not have more than one weblink per page?  I have a splash page ready to go, but I cannot create more than one link on it, let alone include any graphic icons.

Anyone else out there with similar frustrations?  Or possibly some answers to these issues?

I will get back to my Moodle quiz (which I am thinking would be PERFECT for only multiple choice and matching type of answers!)

~~Caroline~~

Post 3: Damaged Self-Efficacy Levels Through Immediate Feedback?

Between the Moodle LMS limitations and the course expectations for assessment I find I am caught between the proverbial “rock and hard place.”

In creating my quiz I have included all necessary components and am eager to begin work on written feedback. However, I am stuck on one particular issue.

The Moodle platform allows for the important aspect of immediate feedback for students:

“A teaching method which places great emphasis on immediate feedback at each stage of a student’s progress through course units…has been demonstrated in many studies to improve student performance.” (Gibbs and Simpson, p. 18)

I will first outline my particular issue with what I see as a weakness in Moodle.

Has anyone else noticed that when creating the questions, ALL marks are taken into consideration immediately, without regard to the fact that the essay questions will be marked separately?

I went through the entire test a number of times, choosing various options, but every time I failed the exam because of the weight put on the essay questions, which were to be marked later but were still taken into consideration in calculating the initial grade/feedback for the test.

If I were to create this test for real, I would set up the two components as separate tests so that students could receive immediate feedback on Part 1 (multiple choice/matching/short answer responses) and simply wait for Part 2 (the essay questions) to be marked manually and added to their total.

I cannot even imagine the damage this could do to students in seeing a very low mark as one of the first, if not the first, grades they receive in the course. It could completely demotivate them, as well as damage the fragile self-esteem of struggling students.

Imagine completing an exam you worked so hard to study for and then receiving only 48 percent as a mark (a mark actually representative of having got all the correct answers to that point on the test!). No amount of feedback subnotes will be able to take away that immediate blow!
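To make the arithmetic concrete, here is a minimal sketch of the two ways a provisional grade could be computed. This is hypothetical illustration code, not Moodle's actual grading logic: it contrasts counting unmarked essay questions as zero against the whole test total (the behaviour described above) with reporting a provisional mark out of only the auto-marked portion.

```python
def provisional_grades(scores, weights, pending):
    """Return two provisional percentages for a partially marked quiz.

    scores:  points earned per question (None if awaiting manual marking)
    weights: maximum points per question
    pending: set of question indices still awaiting manual marking
    """
    earned = sum(s for s in scores if s is not None)
    # Moodle-style: unmarked essays still count in the denominator
    total_all = sum(weights)
    # Fairer provisional mark: only the auto-marked portion counts
    total_marked = sum(w for i, w in enumerate(weights) if i not in pending)
    return earned / total_all * 100, earned / total_marked * 100

# 48 points of auto-marked questions (all answered correctly),
# 52 points of essay questions not yet marked
scores = [48, None]
weights = [48, 52]
including, excluding = provisional_grades(scores, weights, pending={1})
print(round(including))  # 48  -> the demoralizing mark shown immediately
print(round(excluding))  # 100 -> the mark on the auto-marked portion alone
```

Splitting the quiz into two separate tests, as suggested above, effectively gives students the second calculation for Part 1 while Part 2 waits for manual marking.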

Gibbs and Simpson also discuss this in noting that “A grade is likely to be perceived by the student as indicating their personal ability or worth as a person as it is usually ‘norm referenced’ and tells you, primarily, where you stand in relation to others. A poor grade may damage a student’s ‘self-efficacy’, or sense of ability to be effective.” (p. 11)

And it doesn’t seem enough in this case to argue that “we should design assessment, first, to support worthwhile learning, and worry about reliability later.” (p. 3)

So here I stand perplexed. And I cannot un-weight the essay questions because Moodle asks for a weight for all questions in an exam.

~~(a very perplexed) Caroline~~

Commentary: The Product of the Process

Throughout this process I found myself valuing the ability of LMSs such as Moodle to provide a mechanism for formative assessment through monitoring discussion groups and having students create multimedia projects.  Formative assessment does not have to be teacher-based; it can come from other students as they post questions in the discussion forums and clarify their understanding.  Gibbs and Simpson remind us that in many successful learning situations “what achieved the learning was the quality of student engagement in learning tasks, not the teachers doing lots of marking.” (p. 8)  As well, feedback in this format may be more immediate than when it is teacher-centred, so that perfect quality of feedback can be traded for rapidity: “imperfect feedback from a fellow student provided almost immediately may have much more impact than more perfect feedback from a tutor four weeks later.” (p. 19)

The option to rewrite an exam, after a designated period meant to be spent revisiting the material, with direction as to where students should focus their attention, also allows summative evaluation formats such as tests to take on a formative role.  In this sense, feedback is both immediate and in-depth, supporting the idea that “feedback has to be specific to be useful” (p. 17) and recognizing that “if students do not receive feedback fast enough then they will have moved on to the new content and feedback is irrelevant to their ongoing studies and is extremely unlikely to result in additional appropriate learning activity, directed by the feedback.” (p. 19)

LMSs such as Moodle also limit assessment opportunities, but they can be adapted to meet almost any assessment need.  This may mean less reliance on automarking facilities and more emphasis on teacher-directed marking, but the quality of assessment may be more in line with the ultimate goal of student comprehension, because “if a student is looking for encouragement and only receives correction of errors this may not support their learning in the most effective way.” (p. 20)  Learning is a process which includes levels of self-efficacy that drive the learning and motivation processes.

Formative assessment is important for learners, but it needs to be managed by the instructor so that it builds confidence and aids comprehension.  Allotting marks for works in progress may be distracting and less valuable than in-depth feedback.   A more valuable approach might be to use “two-stage assignments with feedback on the first stage, intended to enable the student to improve the quality of work for a second stage submission, which is only graded.” (p. 24)  This could be difficult in an online quiz assessment such as the one created in Moodle for this course, but the preceding activities (or the ability to retake an exam, as mentioned previously) could be the answer to achieving this goal even in the face of online quizzes and tests.

One of the most important ideas I feel I have taken away from this, however, is that with feedback it is important to try; revisions can always be made later as the need arises.  Gibbs and Simpson deal with this point as well in noting that “…we should design assessment, first, to support worthwhile learning, and worry about reliability later.” (p. 3)  As Eisenhower stated, “Plans are nothing, planning is everything.”

The Quiz:  Safety Quiz

My Moodle project, Faber’s Foods and Nutrition 9 course, can be accessed through the following link; the safety quiz is located in Week 2 of the course: http://moodle.met.ubc.ca/course/view.php?id=136

Or the quiz can be directly accessed at: http://moodle.met.ubc.ca/mod/quiz/view.php?id=4314

References

Gibbs, G., & Simpson, C. (2005). “Conditions under which assessment supports students’ learning.” Learning and Teaching in Higher Education. Accessed online 8 March 2010: http://www.open.ac.uk/fast/pdfs/Gibbs%20and%20Simpson%202004-05.pdf

Jenkins, M. (2004). “Unfulfilled promise: formative assessment using computer-aided assessment.” Learning and Teaching in Higher Education, 1, 67-80. Accessed online 8 March 2010: http://www.glos.ac.uk/shareddata/dms/2B72C8E5BCD42A03907A9E170D68CE25.pdf
