How do you fix a problem like the rampant disinformation found on the internet? Most educators would probably say with knowledge, but the reality is that knowledge alone rarely fixes problems. What is typically required to fix a problem? Knowledge plus tools. Therefore, our team set out to create a tool that individuals could apply to determine whether their sources of information were reliable. Originally, we were going to create a learning activity that taught students the different logical fallacies, but a digital learning activity is not a tool. It is not something that the student is going to use again and again, whereas with a learning tool, the hope is that once students learn to use it, they will continue to use it as they build their skills and learn to apply their knowledge in different ways. This kind of tool becomes useful, and “usefulness” for the student user is what I believe to be the most important aspect of “educational usability”.
Ultimately, as a group we settled on designing a tool that would help students determine whether they were using a reliable source of information. The tip we received from our instructors during the proposal stage, to look at a poorly designed tool and then design something better, really helped us narrow our focus and find the ultimate inspiration for our project. Right away, I thought of the Alberta Government’s (2022) printable daily COVID-19 assessment checklist, which has caused endless confusion, in comparison with the Alberta Health Services (2022) digital COVID-19 self-assessment tool. A digital assessment tool with multiple branching scenarios is considerably easier to use than a printable checklist. A digital checklist for assessing sources of information, built around different branching scenarios, would hopefully be easier for students to use than the multitude of checklists and frameworks that can be applied to assess the quality of different sources of information. To the best of our knowledge, a comprehensive digital tool for assessing sources of information based upon established frameworks did not previously exist, and creating one made sense in an era of cultural and media convergence where everything is becoming digitized (Jenkins, 2001).
I was very lucky to work with an amazing group with a wide range of knowledge and expertise. We worked extremely well together, and with regular online meet-ups we were able to bring our tool to life. We tried to distribute the work evenly, with three of us (including myself) consolidating the frameworks and planning the branching scenarios for our tool, while one member built the tool in Articulate Storyline. One of the benefits of using Articulate Storyline was the option to incorporate gamification within the information assessment tool, where users could earn badges as they completed the checklist, provided their information source was proving to be credible. Gamification can help make a tool more engaging and capture students’ attention, making it more likely that they will use a digital tool (de Castell & Jenson, 2004).
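To give a concrete sense of what I mean by a branching checklist with badges, here is a minimal sketch of how one simplified branch could be modelled in code. It is only an illustration under assumed details: the questions, node names, and badge titles are hypothetical placeholders, and the actual tool was built as branching scenarios in Articulate Storyline rather than programmed by hand.

```python
# A minimal sketch of one branch of the checklist, modelled outside of
# Articulate Storyline or H5P. The questions, node names, and badge
# titles are illustrative placeholders, not the wording of our tool.

# Each node holds a question, the node to visit for "yes"/"no", and an
# optional badge awarded when the user passes that checkpoint.
SCHOLARLY_BRANCH = {
    "start": {
        "question": "Is the article published in a peer-reviewed journal?",
        "yes": "authors", "no": "verdict_unreliable",
        "badge": "Peer-Review Spotter",
    },
    "authors": {
        "question": "Are the authors' credentials and affiliations listed?",
        "yes": "citations", "no": "verdict_caution",
        "badge": "Author Checker",
    },
    "citations": {
        "question": "Does the article cite its sources?",
        "yes": "verdict_reliable", "no": "verdict_caution",
        "badge": "Citation Sleuth",
    },
    # Terminal nodes carry a verdict instead of a question.
    "verdict_reliable": {"verdict": "Source appears credible."},
    "verdict_caution": {"verdict": "Use with caution; verify elsewhere."},
    "verdict_unreliable": {"verdict": "Not a reliable scholarly source."},
}


def assess(branch, answers):
    """Walk the branching checklist using pre-recorded yes/no answers,
    collecting badges along the way, and return (verdict, badges)."""
    node_key, badges = "start", []
    while "question" in branch[node_key]:
        node = branch[node_key]
        answer = answers.get(node["question"], False)
        if answer and "badge" in node:
            badges.append(node["badge"])
        node_key = node["yes"] if answer else node["no"]
    return branch[node_key]["verdict"], badges


if __name__ == "__main__":
    demo_answers = {
        "Is the article published in a peer-reviewed journal?": True,
        "Are the authors' credentials and affiliations listed?": True,
        "Does the article cite its sources?": True,
    }
    verdict, badges = assess(SCHOLARLY_BRANCH, demo_answers)
    print(verdict)  # Source appears credible.
    print(badges)   # Badges earned at each checkpoint.
```

The terminal nodes end in an overall judgment about the source rather than another question, which mirrors how each branch of our tool closes with a verdict on credibility.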
While we really appreciated our group member’s skills and willingness to use Articulate Storyline, I do regret using this application for the design rather than H5P. If we had used H5P, I think we would have been able to develop the entire tool, rather than just a proof of concept, by the deadline. Building the entire tool was too much work for one person, as each branching scenario that we created for the different types of information (scholarly, news media, oral history, primary sources, social media, audiovisual, gray literature, nonfiction books) was quite comprehensive. However, Articulate Storyline is quite an expensive application, and not all of us had accounts or the training to use it. I am hopeful that we will eventually develop the entire tool using H5P.
The question of who our user would be did shift a bit over the course of the design process and our usability trials. Originally, I really wanted to design this tool with both high school and undergraduate students in mind. My thought was that the tool would be most useful if it were used and practiced in high school, when teenagers are first learning to assess whether information sources are reliable. There may also be more opportunities for students to practice using it with the support of a teacher at this level. Then, by the time they became adults, they would have experience with this tool regardless of whether they pursued post-secondary studies, hopefully helping all future adults to be more mindful of the quality of their information sources. However, this vision became difficult to assess for usability because the completed proof of concept covered only the scholarly branch, and most high school students would not be familiar with “scholarly sources”, which would make testing difficult.
With only the scholarly branch completed, we had to confine our user to undergraduate students. To assess usability, we developed a usability survey based on Issa and Isaias’s (2015) usability criteria: learnability, flexibility, robustness, efficiency, memorability, errors, and satisfaction. We also encouraged users to provide comments as desired. Overall, the feedback that we received was positive, and the users really liked the concept behind our tool. However, for the most part our test users were well-educated professionals who had already completed an undergraduate degree and are familiar with what makes a source of information reliable. Their feedback was invaluable in assessing all of the components related to the usability criteria, ensuring that we had not missed anything and that the branches made sense. However, in the context of “educational usability”, where you are considering whether the tool is useful for the student, we may have missed the mark a bit: we did not test whether it was useful for actual, current undergraduate students, who would be the intended end users. When considering the educational usability of a tool, it is really important to assess not just whether it is usable for the teacher, but whether it is useful for the student. This will be important to remember in the next stage of development and usability testing.
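For the next round of testing, a survey organized around those criteria could be summarised along the lines of the sketch below. It is only an illustration under assumed details: the one-to-five rating scale, the grouping of items, and the example responses are hypothetical placeholders, not our actual survey instrument or data.

```python
# A minimal sketch of summarising usability-survey responses by criterion.
# The criteria come from Issa and Isaias (2015); the rating scale and the
# example responses are hypothetical placeholders.
from statistics import mean

CRITERIA = ["learnability", "flexibility", "robustness", "efficiency",
            "memorability", "errors", "satisfaction"]

# Each participant rates one or more items per criterion on a 1-5 scale.
example_responses = [
    {"learnability": [5, 4], "flexibility": [4], "robustness": [4],
     "efficiency": [5], "memorability": [4], "errors": [3],
     "satisfaction": [5]},
    {"learnability": [4, 4], "flexibility": [3], "robustness": [4],
     "efficiency": [4], "memorability": [5], "errors": [4],
     "satisfaction": [4]},
]


def summarise(responses):
    """Return the mean rating per usability criterion across participants."""
    return {
        criterion: round(mean(score
                              for person in responses
                              for score in person.get(criterion, [])), 2)
        for criterion in CRITERIA
    }


print(summarise(example_responses))
```

Grouping the ratings by criterion like this would make it easier to see, at a glance, where undergraduate testers struggle compared with the professionals who tested the proof of concept.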
References
Alberta Government. (2022, February 9). COVID-19 Alberta health daily checklist (for adults 18 years and older). https://open.alberta.ca/dataset/56c020ed-1782-4c6c-bfdd-5af36754471f/resource/58957831-a4ab-45ff-9a8e-3c6af7c1622e/download/covid-19-information-alberta-health-daily-checklist-2022-02.pdf
Alberta Health Services. (2022). COVID-19 assessment for Albertans. https://myhealth.alberta.ca/Journey/COVID-19/Pages/COVID-Self-Assessment.aspx
de Castell, S., & Jenson, J. (2004). Paying attention to attention: New economies for learning. Educational Theory, 54(4), 381–397.
Issa, T., & Isaias, P. (2015). Usability and human computer interaction (HCI). In Sustainable design. Springer. https://doi.org/10.1007/978-1-4471-6753-2_2
Jenkins, H. (2001, June 1). Convergence? I diverge. MIT Technology Review. https://www.technologyreview.com/2001/06/01/235791/convergence-i-diverge/