Category Archives: Assignment 1: Rubric

Assignment 1 – Group 5 – Personal Reflection

The Experience

This assignment was a bit of a challenge, both for me personally and, I believe, for my group. This is the first time in the MET program that I have faced challenges connecting with a group and bringing it together to create a plan to complete a project. Due to time zone differences and busy schedules, our group was simply not able to meet for a synchronous discussion, so our project was completed solely through asynchronous communication. This disconnection and delayed communication made a focused, efficient effort difficult, mainly because of the short timeline between forming the group and completing the project. Overall, once our group started an online document, roles were quickly established and the work was spread fairly evenly, as far as I could tell for an asynchronous activity. This was far from the smoothest group process I have participated in within the MET program, but the project is complete and will be submitted on time. Given the timeline and the limited information provided, I feel our group has submitted a well-developed product.

Impact on my Current Practice

This was an interesting exercise for me in terms of impact on my current practice. I am not currently evaluating LMS options; however, our group elected to create a rubric using the SECTIONS model (Bates, 2014). For me, this was practice in creating my own rubric for evaluating educational technology, and perhaps programs, at my organization. The exercise helped me realize the importance of articulating why certain criteria matter to a selection process. Trying to think from a stakeholder perspective was also interesting, because different stakeholders can have different focuses, from technical to andragogical to business. Clearly articulating why criteria have been established helps bring all stakeholders to the same page. I plan to apply this lesson in developing a rubric for use within my own organization.


References

Bates, T. (2014). Teaching in a Digital Age (Chapter 8 on SECTIONS framework). Retrieved January 29, 2016 from http://opentextbc.ca/teachinginadigitalage

Assignment 1 Reflection – Colleen Huck

In our group scenario, BCcampus had two LMS platforms, one vendor-based and the other open source. BCcampus was going to downsize its IT group in three months and had to decide which platform, if either, it wanted to move forward with.

To start the brainstorming process, our group looked at Bates’ SECTIONS model for inspiration. Each component of the SECTIONS model raised important points that BCcampus would need to consider when selecting and implementing one central LMS. Some points that really stood out included ease of use (for both end users and administration), compatibility, and technical support (Bates, 2014). From there we considered Chickering and Ehrmann’s seven principles of good practice. I took each principle and created a list of specific LMS functionality that would reflect it. For example, contact between students and faculty could be accomplished through discussion boards, instant messaging, email, etc. (Chickering & Ehrmann, 1996). We translated the ideas raised by both Bates and Chickering and Ehrmann into five of our six rubric criteria: access, support, functionality, cost, and organizational requirements.

Throughout the planning process, the Spiro article inspired us to think long term. According to Spiro (2014), the end of the LMS is imminent; gone are the days of “one-size-fits-all courses.” He emphasised the importance of adaptability and self-directed learning (Spiro, 2014). This led to our sixth rubric category, customization. Once we decided on our main criteria, the rubric came together very naturally. We all started filling in the subheadings and developing rating standards.

Going into assignment one I had some apprehensions. While I have been the end user of multiple LMS platforms, I have never used an LMS from the administration side. I was worried my inexperience would put me at a disadvantage in creating a rubric. I think being in a group was really beneficial in that way. We all brought different levels of experience to the table and were able to feed off each other’s ideas. While it was challenging to coordinate schedules, I think in the end we all came together and produced a rubric we are proud of.


References

Bates, T. (2014). Teaching in a digital age (Chapter 8 on the SECTIONS framework). Retrieved from http://opentextbc.ca/teachinginadigitalage/

Chickering, A. W., & Ehrmann, S. C. (1996). Implementing the seven principles: Technology as lever. American Association for Higher Education Bulletin, 49(2), 3-6. Retrieved from http://www.aahea.org/articles/sevenprinciples.htm

Spiro, K. (2014). 5 elearning trends leading to the end of the learning management system. Retrieved from http://elearningindustry.com/5-elearning-trends-leading-to-the-end-of-the-learning-managementsystem

Reflections on Assignment 1

Individual Reflections on Assignment 1: Online Delivery Platform Evaluation Rubric

I greatly enjoyed this assignment, as it offered an authentic situation in which we could apply the knowledge from the readings and experience the selection/design process firsthand. The assignment resembled a complex strategy game in which one must carefully navigate needs and aspirations, barriers and context, in order to walk the narrow band between an overly generic rubric and an overly specific one; neither is a useful tool. Below I have identified two key features in creating this balance and have provided my take-home message. At this point in the reflection I would like to thank all members of group 1 for the amazing collaborative work. It was a pleasure working with you. Thank you for this wonderful learning opportunity.

The importance of identifying the context and an example from our rubric

In order to create a useful evaluation rubric, or to make any selection of technology, the context, needs, and culture of the users must be carefully considered (Brown, 2009). By researching BCcampus and carefully considering both the SECTIONS framework (Bates, 2014) and the good practices from the ISTE (2008), we were able to assess BCcampus’s needs and requirements and, to a certain extent, what the agency hoped to achieve through the implementation of its LMS. BCcampus, supporting over 25 post-secondary institutions, needs to serve as a model for all affiliates, and therefore needs to select an LMS that shows innovation, allows for collaboration, and can easily incorporate leading-edge technologies (Coates, James, & Baldwin, 2005). The rubric also needed to help identify an LMS platform that would allow the diverse affiliates of BCcampus to customize and personalize the LMS according to their needs. As BCcampus is proud of its role as a leader in technological innovation, it was very important for us to ensure that the rubric correctly discriminated against LMS platforms that would not be able to adapt to upcoming technologies: we were looking to avoid the potential hang-ups described by Porto (2015) and Spiro (2014). Using these examples, one can see how the context of BCcampus drove many of the decisions in our rubric.

The importance of identifying the barriers and an example from our rubric

Yet in spite of having a grasp on BCcampus’s expectations for its future LMS platform, it was also imperative to consider the potential barriers, similar to those mentioned by Zaied (2005), that may limit the proper implementation of the technology. A rubric that does not address these issues might lead BCcampus to select a platform that is ill-suited to it and would not be used. We attempted to highlight each of these potential barriers through the rubric, giving BCcampus the opportunity to select the best possible platform given its situation. For example, the reduction in IT personnel and the limited three-month timeline meant that cost, provided IT support, and compatibility were essential. As BCcampus was also open to the idea of implementing a new LMS that it had not tried before, our team opted for a simpler set of weighted criteria that did not assume prior experience with the LMS or time to experiment.
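To make the idea of a set of weighted criteria concrete, here is a minimal sketch of how per-criterion ratings might roll up into a comparable total for each candidate platform. The criterion names, weights, ratings, and platform labels below are illustrative assumptions, not values from our actual rubric:

```python
# Minimal sketch of a weighted evaluation rubric.
# Criterion names, weights, and ratings are illustrative only, not taken
# from the group's actual rubric.

# Weights reflect the kind of priorities discussed above (cost, provided IT
# support, and compatibility weighted heavily because of the staffing cuts
# and the three-month timeline).
WEIGHTS = {
    "cost": 0.30,
    "it_support": 0.25,
    "compatibility": 0.25,
    "customization": 0.20,
}

def weighted_score(ratings: dict) -> float:
    """Combine per-criterion ratings (e.g. 1-4) into one weighted total."""
    return sum(WEIGHTS[criterion] * rating for criterion, rating in ratings.items())

# Hypothetical ratings for two candidate platforms.
candidates = {
    "Vendor LMS": {"cost": 2, "it_support": 4, "compatibility": 3, "customization": 2},
    "Open-source LMS": {"cost": 4, "it_support": 2, "compatibility": 3, "customization": 4},
}

for name, ratings in candidates.items():
    print(f"{name}: {weighted_score(ratings):.2f}")
```

The point of the sketch is simply that, with weights made explicit, two evaluators can disagree about individual ratings while still seeing exactly how the overall comparison was reached.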

My take home message

I am astonished by the number of elements that need to be considered in order to create a useful tool and to find that proper balance between too narrow and too generic a rubric. I knew context was very important to consider, yet I had no idea that, when designing a tool, every element must be justified in relation to that context in order for the tool to be deemed relevant. I believe that each of the rubrics generated from this assignment is context-specific and time-sensitive. For example, the rubric from the BCcampus scenario would not necessarily yield the same results if applied to another organization, and it would certainly have been considered less relevant for BCcampus a year earlier, when the cuts were yet to be announced. Generic rubrics are not as useful as I once believed. Selecting a tool or a technology is not about applying a pre-made rubric or a generic set of criteria; one needs to tailor the criteria to one’s specific and temporal context in order to make judicious choices and create an effective learning environment. Just as a one-size-fits-all course is no longer appropriate (Spiro, 2014), neither is a one-size-fits-all rubric.

Again, thank you for this wonderful learning opportunity and thank you to all members of group 1 for the amazing collaborative work.  It was a pleasure working with you.

Danielle


References

Bates, T. (2014). Teaching in a digital age. Open Textbook.

Brown, T. (Producer). (2009). From Design to Design Thinking. TED Talks. Retrieved from https://www.youtube.com/watch?v=UAinLaT42xY

Coates, H., James, R., & Baldwin, G. (2005). A critical examination of the effects of learning management systems on university teaching and learning. Tertiary Education and Management, 11(1), 19-36.

International Society for Technology in Education (ISTE). (2008). Standards for teachers. Retrieved from  http://www.iste.org/standards/standards-for-teachers

Porto, S. (2015). The uncertain future of Learning Management Systems. The Evolllution: Illuminating the Lifelong Learning Movement.  Retrieved from http://www.evolllution.com/opinions/uncertain-future-learning-management-systems/

Spiro, K. (2014). 5 elearning trends leading to the end of the learning management system. Retrieved from http://elearningindustry.com/5-elearning-trends-leading-to-the-end-of-the-learning-management-system

Zaied, A. N. (2005). A framework for evaluating and selecting learning technologies. Learning, 1(2), 6.


Assignment 1 – Reflection

Last week I attended a workshop about teamwork as part of our department retreat. The purpose of the workshop was to understand group development, personalities, and roles within a group, and to learn how groups evolve and improve. During the workshop we learned about the different stages of group/team formation based on the Tuckman model (Tuckman, 1965). The stages are forming, storming, norming, performing, and adjourning. I found it fascinating to observe our group in this context while we worked on our assignment.

The first stage is forming. This is when individuals are not clear on what they are supposed to do and the mission is not owned by the group. We certainly began this way. We were unfamiliar with each other, not sure how best to communicate, and a little lost. I found communication to be a major issue for me during this first stage. I started off writing on the group discussion board, while others sent messages through the Blackboard system. Unfortunately, I didn’t get these messages. I looked at my spam box yesterday and was surprised to find all the messages from Blackboard in there. Apparently there was something in the messages that made them default into this box. There were so many different methods to use for asynchronous communication, but because we were unfamiliar with each other and our preferences were different, it led to a lack of communication, especially on my part.

The second stage is storming. This is when roles and responsibilities are established; there may be some conflict, competition may be high, and some push for position and power. As our group began to work on the Google Doc, we started to move from the first stage into the second. I find that movement through these stages is slower with asynchronous communication. I think one of the things that helped us move from forming to storming was the ability to communicate synchronously on the Google Doc through its chat function. We tried to arrange a Google Hangouts meeting, but due to schedules and time differences this was not possible. However, some of us were working on the Google Doc at the same time, allowing us to chat. We were able to clarify ideas, bounce thoughts off each other, and start to trust one another. With asynchronous communication this exchange is a lot slower, so I find our development as a group to also be slower.

The third stage is norming. This is when success occurs, the purpose is well defined, and team confidence is high. As a group, we were able to accomplish our goal of creating a rubric and justifying our choices, but I’m not sure that we were all on the same page, so I’m not confident to say that we truly reached this stage. We all come from different backgrounds and experiences, and it’s not surprising that our ideas are not perfectly cohesive. But given time and more opportunities to communicate, exchange ideas, and understand each other’s perspectives, we could evolve to become a more cohesive and effective group. Again, I think the challenge is in communication. Technology has made it possible for us to work together from different time zones and locations, but this convenience comes at a cost. From this group experience, I feel that it takes a greater level of communication to progress through Tuckman’s stages, particularly if communication is asynchronous. Especially with a deadline, though we try to exchange ideas, we end up working as individuals in a group rather than as group members with well-defined roles, responsibilities, and expectations. I think we were still hesitant to challenge each other and create conflict, but according to Tuckman (1965), conflict is necessary to progress through the stages. If we had had the opportunity to meet via Google Hangouts, our outcome might have been different. With these online courses, I think group work is challenging but also very rewarding. I enjoy group work, as it gives me an opportunity to get to know my peers better, and I feel that it is a more personal interaction than class discussions. I think if we worked with the same group for a number of projects, giving us more opportunities to communicate, we could become a highly effective group and make it to the performing stage.

References

Tuckman, B. W. (1965). Developmental sequence in small groups. Psychological Bulletin, 63(6), 384-399.

Reflection on Group 4, UMP

I could relate to the specific needs of the UMP through the other course I am currently taking, ETEC 533. There, our assignment last week was to upload video group projects to a system called CLAS, where we then had the option to annotate others’ videos or place our comments on the Connect discussion page. I chose the former option, and since this was the first time I had done this, I could picture what would be required of an LMS for these medical students and faculty.

I would describe my role as “managing editor”: writing the précis, trying to coordinate who would do what, and making most of the final editing decisions, but not really adding any of the ‘meat’ to the rubric. As I explained to the group, I have always taken more of a passive role on MET group projects, but since this is my last semester before graduating, I thought I would try to lead for a change. I think the results were great; everyone contributed and effectively brought in different skills from their varied backgrounds. Kate brought a lot of enthusiasm and common sense; Mo brought us crucial insider knowledge from working in the field and put the rubric together as a PDF; Nidal brought a lot of technical expertise; and Mark had the work experience and know-how most closely linked to our specific task.

Assignment 1 Reflection

Our group was tasked with creating a rubric that would help determine the best LMS for a group of Year 3 medical students who require video-based assessment through a distance education model. Luckily for us, Momoe teaches within the medical field and was able to respond to questions we had regarding the nature of clinical assessment and lend her expertise to building the rubric’s criteria. Randy started us off in a Google document, and Nidal contributed early on with work he had done in another course that looked into the best LMS to use based on the SECTIONS framework of Bates and Poole (2003) and Anderson (2008). Mark also chimed in with research he had conducted about popular criteria for choosing an LMS, which, although it was not cited, helped us narrow down what we all agreed would be required in our rubric.

I did not contribute to this early research-gathering process, as the other members moved very quickly when gathering resources, but I contributed to early drafts of the rubric once we had decided upon criteria. I had assumed we might find a time to live-chat, but due to the size of our group and the challenge of Nidal’s location, that never happened. Thankfully I was placed in a group of initiative-takers, who all contributed where they saw fit, and we were in constant communication over e-mail and in the Google Doc itself. Randy, Momoe, and I spoke through the Google document chat and fleshed out the rubric together during the week, with Nidal contributing when he was available. Randy volunteered to take on the précis and Mark the paragraph rationale. I am usually much more involved in the creation process of group tasks, but I felt very fortunate that this arrangement worked for this particular task. This is my last week at my current school, and the days have been full of administration and goodbyes, with my evenings dedicated in part to another assignment due in another course. I pledged to my group members that I would look for opportunities to take the lead in the future, and thankfully they are all gracious individuals.

I learned a lot from the perspectives and ideas of my group mates, and I hope I contributed in a way that was at least somewhat helpful this round.  I look forward to continuing on with them, and having a more focused personal energy to bring to the table.


References

Anderson, T. (2008). Towards a theory of online learning. In T. Anderson & F. Elloumi (Eds.), Theory and practice of online learning. Athabasca University.

Bates, A. W., & Poole, G. (2003). A framework for selecting and using technology. In Effective teaching with technology in higher education: Foundations for success (pp. 75-105). San Francisco, CA: Jossey-Bass.

Group 4: UBC’s Undergraduate Medical Program

By: Momoe Hyakutake, Nidal Khalifeh, Mark Viola, Kate Willey, and Randy Ray

LINK TO RUBRIC 

PRÉCIS

Our project, following a directive from UBC’s Dean, is to “select an LMS to deliver distance-based, video-based clinical assessment”. The LMS users are 3rd-year medical students and clinical faculty from UBC, UNBC (the University of Northern British Columbia), and UVic (the University of Victoria). The main affordance we are seeking for these users is the ability to record, edit, share, annotate, and play video remotely and asynchronously. Students will be videotaped while demonstrating specific clinical examination skills. They will then upload video of this work to the LMS. There, their peers and faculty will be able to comment on specific points in the video. Because no LMS has these specific video features embedded within it, we would require one that supports plug-ins or apps that play hosted videos (e.g., unlisted YouTube videos). We have also been asked for discussion forums and access to student resources such as a “clinical reasoning framework”, which fit more easily within the capabilities of most, if not all, LMSs. Students will be face-to-face for a clerkship along with one of the faculty members, but others will be located at one of the three cooperating universities; thus, distance education principles must also be considered.

RATIONALE

Whenever an institution decides to implement a change as large as the adoption of a new Learning Management System (LMS), there should be a rather long and exhaustive vetting process in which numerous options are considered and tested. When considering criteria for an LMS evaluation, our main concern was identifying the target features and functions that were key to the program in question: the Year 3 video initiative in UBC’s Undergraduate Medical Program. According to Wright et al. (2014), a selection committee of key stakeholders needs to be created to identify the ‘target’ features and functions of the required LMS. In addition to some of the basic features of an LMS, our selection committee identified three key features that were mandatory and non-negotiable: 1) the ability to handle media streaming and video annotation; 2) the functionality of the LMS on different operating systems and via mobile devices through native apps; and 3) the ability of the LMS to provide collaboration tools for end users. During Year Three Rural Family Practice Clerkships, UBC’s Undergraduate Medical Program requires that students be videotaped demonstrating specific clinical skills while other classmates and faculty annotate the video with feedback. This requirement of the course necessitates the LMS’s ability to handle media streaming. As some of the faculty will not be in the same physical location as the rest of the group members, we identified the need for strong collaborative tools that allow for synchronous and asynchronous communication, including wikis, threaded discussion forums, instant messaging, and chat functions. Finally, the ability to have a full user experience on a mobile platform will allow for more access and a more positive experience for students and faculty. As users are in different locations, the stability of the LMS on different operating systems will ensure a more consistent experience with less reliance on IT support.
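As a rough illustration of the committee’s logic, the sketch below screens out any candidate that is missing one of the three non-negotiable features before the remaining platforms would go on to detailed scoring against the full rubric. The platform names and their feature flags are hypothetical examples, not evaluations of real products:

```python
# Sketch of screening candidate platforms against mandatory features.
# Platform names and feature flags are hypothetical examples only.

MANDATORY = {
    "video_streaming_and_annotation",
    "native_mobile_apps",
    "collaboration_tools",
}

platforms = {
    "LMS A": {"video_streaming_and_annotation", "collaboration_tools"},
    "LMS B": {"video_streaming_and_annotation", "native_mobile_apps", "collaboration_tools"},
}

# Only platforms that satisfy every non-negotiable requirement move on
# to detailed scoring with the rest of the rubric.
shortlist = [name for name, features in platforms.items() if MANDATORY <= features]
print(shortlist)  # ['LMS B']
```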

References

Anderson, T. (2008). The theory and practice of online learning. AU Press.

Bates, T., & Poole, G. (2003). Effective teaching with technology in higher education: Foundations for success (1st ed.). San Francisco, CA: Jossey-Bass.


Wright, C., Lopes, V., Montgomerie, C., Reju, S., & Schmoller, S. (2014). Selecting a learning management system: Advice from an academic perspective. Educause Review. Retrieved from http://er.educause.edu/articles/2014/4/selecting-a-learning-management-system-advice-from-an-academic-perspective

Reflections on the Evaluation Rubric – Assignment #1

For the Online Delivery Platform Evaluation Rubric assignment, our group was tasked with determining which LMS would successfully meet the needs of a new online program being developed to support students enrolled at Le Conseil scolaire francophone de la Colombie Britannique. After reviewing the scenario description and attempting to assess the current and potential future needs of Le Conseil, we collaborated to create an evaluation rubric organized around four fundamental areas of consideration: Logistics, Support, and Management; Communication; Design; and Usability. Much like assessment rubrics that are created in collaboration with students to guide and support their work, our group developed a rubric that included four different rating levels to determine how well a given LMS might meet the needs of Le Conseil. We attempted to create a clearly defined set of criteria to assess the ways in which different LMSs would satisfy the goals described within each of the four categories.

In terms of assessing the current needs of Le Conseil and their new online program, we found this challenging, as we were unsure whether the online courses would be offered to students individually or within a cohort group structure. Therefore, the criteria that we incorporated within our rubric could be utilized to assess the capability of different LMSs to meet the needs of both types of course organization. According to Coates, James, and Baldwin (2005), customizable LMSs provide course instructors and designers with the ability to adapt to the needs of diverse academic cultures and communities (p. 31). I believe that this consideration is crucial in selecting and evaluating an LMS, as the needs of the students will determine the future planning that the online program will be required to implement in order to support student learning and remain effective and relevant in a changing technological landscape.

The content of the course offerings will need to match student needs and expectations; therefore, personalized course layout and design become crucial in helping instructors and students access and create adaptive content to meet changing needs and learning objectives (Spiro, 2014). We believe that these areas are addressed throughout the rubric, and most specifically within the design and usability categories.

Considerations around cost, infrastructure, and support for communications are critical in the LMS evaluation process. Although we did not receive any information about these areas as they apply to Le Conseil, we integrated these components into the evaluation rubric in a format flexible enough that it can be applied to different LMSs for assessment purposes. The SECTIONS framework, as proposed by Bates (2014), provided further guidance for incorporating cost effectiveness, management features, and organizational issues into the overall scheme of the evaluation rubric.

We feel that our evaluation rubric represents the collective work of our team members, and reflects some of the differences in background knowledge and professional experiences that we each brought to the planning and final design.

Our LMS Evaluation rubric may be accessed here

References

Bates, T. (2014). Teaching in a digital age (Chapter 8). Retrieved from http://opentextbc.ca/teachinginadigitalage/

Coates, H., James, R., & Baldwin, G. (2005). A critical examination of the effects of learning management systems on university teaching and learning. Tertiary Education and Management, 11(1), 19-36. Retrieved from http://link.springer.com/article/10.1007/s11233-004-3567-9

Porto, S. (2015). The uncertain future of learning management systems. The Evolllution: Illuminating the Lifelong Learning Movement. Retrieved from http://www.evolllution.com/opinions/uncertain-future-learning-management-systems/

Spiro, K. (2014). 5 elearning trends leading to the end of the learning management systems. Retrieved from http://elearningindustry.com/5-elearning-trends-leading-to-the-end-of-the-learning-management-system

Assignment 1 Reflection — Meghan Gallant

Designing an LMS evaluation rubric challenged me in ways I did not initially expect. While I knew I would have to take into consideration the requirements and limitations outlined in the case study, I did not expect to experience anxiety over assuming I knew what would be important to consider from YESNet’s perspective.

Our group began with a clear structure to get started with: Bates’ (2014) SECTIONS model. I was pleased to be using the SECTIONS model in its intended context and felt confident that, using the framework as a guide, I would contribute relevant and thoughtful criteria. However, when I began to think about what YESNet required, I realized I knew very little about what they would want in an LMS or how they expected an LMS to facilitate blended learning. I could not put myself in YESNet’s shoes, so I started researching LMSs and blended learning.

Ellis and Calvo (2007) suggest that the first step when implementing a blended learning environment is that “staff begin by undertaking some sort of decision-making. Those initial decisions depend on the size and scope of the redevelopment or design of the course, the needs of students, the learning strategies of their department, and the culture of the institution” (p. 63).

Unfortunately, I am not a member of YESNet and I did not undertake any decision-making, so I had to make a lot of assumptions about YESNet’s needs. The first assumption I made was when I chose to contribute criteria for the Ease of Use and Cost components of Bates’ (2014) SECTIONS model. After reflecting on why I chose to work with these particular components, I discovered that these two considerations are what I usually weigh early on when choosing technology for my own classes. Someone had to cover Ease of Use and Cost, but was I biased in choosing them? Had I put my priorities ahead of the needs of the students, the learning strategies of YESNet, and the culture of the institution in assuming that Ease of Use and Cost would be priorities? Would my suggested criteria be useful, or would my bias skew their validity? I wasn’t sure what type of feedback to expect when I went into our second group meeting.

Once I started working with the group, I felt better about my contributions. The group worked together rewording, adding, deleting, and rearranging the criteria. After several hours of work, the group developed a rubric I am proud of. Revision and collaboration are not features unique to our group; this happens all the time. However, it highlighted an important point: choosing evaluation criteria is not a task that should be undertaken by a single person. I feel that working in a group softens personal bias and keeps assumptions to a minimum. As a group, we discussed the criteria and drew from our combined experience; this led to our group addressing many considerations that would never have crossed my mind. Ideally, this is how the development of an evaluation rubric should be approached: as a team working toward a common goal.

References

Bates, T. (2014). Teaching in a digital age (Chapter 8). Retrieved from http://opentextbc.ca/teachinginadigitalage/

Ellis, R.A. & Calvo, R.A. (2007). Minimum indicators to assure quality of LMS-supported blended learning. Journal of Educational Technology & Society, 10(2), 60-70. Retrieved from http://www.jstor.org/stable/jeductechsoci.10.2.60?seq=1#page_scan_tab_contents

Assignment 1 – Reflections – Nidal Khalifeh

Working with a group without having a face-to-face experience is not easy. It was a challenge to know whether I was providing what was expected from me. I enjoyed the task, as the LMS is my main concern these days, and I wanted to know what is relevant and important when people evaluate an LMS. The case was a special one, and a regular LMS would not solve the problem; therefore, the rubric needed to address those challenges.

My IT background led me to suggest that the LMS must support the BYOD (Bring Your Own Device) approach, which means the LMS must be cloud-based and support multiple platforms while running on mobile phones and tablets. Knowing the challenges of integrating and migrating different software, I also suggested that the LMS should comply with the Ed-Fi standard, which gives developers of educational software a common standard and allows seamless integration and communication between applications. Security and privacy are now major concerns; SSL encryption and the Student Privacy Pledge can help guarantee a level of security for the system.

References

Cavoukian, A. (2013). BYOD: (Bring your own device) Is your organization ready? Retrieved from https://www.ipc.on.ca/site_documents/pbd-byod.pdf

Ed-Fi Alliance. (2012). Ed-Fi-powered student information system vendors poised for new ed-tech market opportunities and growth. Retrieved from http://www.ed-fi.org/news/2012/08/ed-fi-powered-student-information-system-vendors-poised-ed-tech-market-opportunities-growth/

Hope, J. (2015). Obama pledges student privacy protection in State of the Union address. Enrollment Management Report, 18(12), 8. doi:10.1002/tsr.30038