These are some of my posts on these topics:
Posted Date: January 19, 2013 4:44 PM
Subject: Evaluating Internet Resources: the Library Research Service
While it is not a foolproof criterion, and, yes, I have been fooled before, the notion of “Does it all add up?” is appealing when evaluating any information or situation. Having examined the “Library Research Service” (LRS) website, my gut reaction is that it is a good online resource. The LRS identifies itself as a unit of the Colorado State Library, which is an office of the Colorado Department of Education. The site’s URL, domain, publishing body, and content are all consistent with this identification. The LRS site contains a description of the organization’s purpose and mission. Contact information is complete and current. Information contained on the LRS site could be confirmed independently. The Library Research Service site has been online for at least fourteen years and is current: copyright 2013. The site links to appropriate professional documents and shows a history of regular updating. Links to and from the LRS website are associated with organizations with similar mandates and purposes. The site is well used, and its visitors tend to access it from work and to be graduate-school educated. It is this author’s opinion that the Library Research Service website is a valid and reliable internet resource.
References
Alexa: The Web Information Company. (n.d.). Library Research Service (LRS). Retrieved January 19, 2013, from http://www.alexa.com/siteinfo/lrs.org
Library Research Service. (2013). Library research service: Research and statistics about libraries. Retrieved January 19, 2013, from http://www.lrs.org/
University of California Berkeley Library. (2012). Evaluating web pages: Techniques to apply and questions to ask. Retrieved January 19, 2013, from http://www.lib.berkeley.edu/TeachingLib/Guides/Internet/Evaluate.html
Posted Date: January 16, 2013 10:49 PM
Subject: Doing the ERIC Two-Step
Wow…has ERIC ever improved since my undergraduate days. I find it easy to use and the links really helpful as I begin my own literature search. I must admit some frustration with ERIC’s current “limited availability of full-text documents,” but I found the direct links and the “digital object identifier” useful in locating the online articles listed in my ERIC search results.
My journal article:
Tanner, H., & Jones, S. (2007). Using video-stimulated reflective dialogue to learn from children about their learning with and without ICT. Technology, Pedagogy and Education, 16(3), 321-335. doi:10.1080/14759390701614454
My ERIC document:
D’Angelo, F., & Iliev, N. (2012). Teaching mathematics to young children through the use of concrete and virtual manipulatives. Online Submission. Retrieved from ERIC database (ED5344228)
Posted Date: January 16, 2013 7:34 PM
Subject: What Kind of Article is This? Using the Rubric
I found the “Rubric for Evaluating Print and Internet Sources” (Gay, Mills, & Airasian, 2012, p. 94) from our text pretty appealing and thought I’d practice using it to evaluate “Making Learning Fun: Quest Atlantis, A Game Without Guns” (Barab, Thomas, Dodge, Carteaux, & Tuzun, 2005).
Relevancy: 1 – The source does not address the research interest of “my” study.
The article provides a qualitative report on a basic research project. The authors clearly state the article’s purpose in the opening line of the abstract, “This article describes the Quest Atlantis (QA) project…” (Barab et al., 2005, p. 86). I expected a good description of the project and, as the abstract did not include a summary of the authors’ findings, I was not surprised when the findings were not well developed in the main article.
Author: 3 – Author name, contact information, and some credentials are included in the article.
I used ERIC and the Internet to get more information about the authors. It appears the subject of the article is related to the primary interests of a team of authors led by Dr. Sasha Barab. Dr. Barab’s online curriculum vitae provides an extensive list of his academic qualifications and professional experience. It also demonstrates a history of publication and peer recognition. Like the rest of the team, Dr. Barab is affiliated with an accredited university.
Source: 5 – Source is a scholarly or peer-reviewed journal with links to related literature by the same author/s and ability to download fully online versions of articles.
Methodology: 3 – The source includes a full description of the research problem and the appropriateness of the methodology to investigate the problem.
As a basic research project, the authors’ attempt to integrate education, entertainment, and social commitment (Barab et al., 2005, p. 86) seems herculean. However, the article provides an extensive description of and rationale for the development of Quest Atlantis (QA). The description focuses on QA’s theoretical framework, its methods, and its themes. This description is supported by the references provided. My concern with this paper arises from its Discussion and Implications sections. The authors’ assertion that “QA has a number of characteristics that have helped it become a valuable intervention for schools” (Barab et al., 2005, p. 103) is not supported by the project methodology. Further investigation and documentation are required to support the claim that QA actually: advances social commitment, is connected to standards, engages girls, offers a flexibly adaptive curriculum, has a multidisciplinary focus, builds connections (Barab et al., 2005, p. 103), values diverse perspectives, fosters multi-cultural appreciation, and engages users in distance-mediated collaborations (Barab et al., 2005, p. 104). And although some statistics are provided on QA’s popularity (Barab et al., 2005, p. 87), this is not evidence of its value as a school intervention. As I finished the paper, dozens of ideas for “next-step” studies came to mind. I think this article is a good example of basic research, which “is conducted solely for the purpose of developing or refining a theory” (Gay et al., 2012, p. 16), albeit a wide-ranging synthesis of theories and ideas. I would use this article as a reference source or for background information if my interest lay in demonstrating the efficacy of QA as a tool for school intervention or in confirming its effectiveness in fulfilling the claims listed previously.
Date: 5 – Current date of publication with a list of references consulted by the author including links and fully online articles.
The article is current. Given the scope of the project, I expected a lengthy and diverse list of references.
Barab, S., Thomas, M., Dodge, T., Carteaux, R., & Tuzun, H. (2005). Making learning fun: Quest Atlantis, a game without guns. Educational Technology Research and Development, 53(1), 86-107.
Gay, L. R., Mills, G. E., & Airasian, P. W. (2012). Educational research: Competencies for analysis and application (10th ed.). Upper Saddle River, NJ: Merrill Prentice Hall.
Posted Date: January 9, 2013 8:34 PM
Subject: mPortfolios Research Questions
I am interested in conducting research on the topic of Mobile Electronic Portfolios (mPortfolios) for Primary Students. Thanks, Emma, for opening a thread on the topic of electronic portfolios and for providing a good description of them.
Specifically, I am interested in knowing more about how the process and product of mPortfolios impact the ability of young children to set learning goals and to reflect on their performance. I am also interested in the relationship between mPortfolio use and student performance. Finally, I would like to investigate factors that may positively or negatively impact the use of mPortfolios as an alternative to traditional written report cards.
Lots of interests…so I’ll attempt to focus…
1. What attitudes and beliefs do educational stakeholders have about replacing traditional written report cards with mPortfolios?
I am poised to begin using mPortfolios with my students. I have sent samples home to parents for their feedback and will make a presentation to the school board sometime this term. I thought a survey of attitudes and beliefs of students, parents, teachers, and administrators might identify issues I need to address.
2. What is the relationship between the attitudes and beliefs teachers hold about mPortfolios and their willingness to use them with students?
When I started my research on mPortfolios, I thought mostly about the product. The more I learned about the powerful processes involved in developing them, the more willing I became to use them with my students. How are this understanding and this willingness correlated?
3. What impact does viewing a movie of themselves reading have on students’ ability to reflect on their performance and to set goals for improvement? What relationship exists between this process and students’ performance in reading?
I think this is the most interesting topic. It was my work with students that encouraged me to push the boundaries of assessment. I am continually impressed by the ability of young children to think and to take control of their learning. I’d like to know, however, if I am just seeing what I want to see. I thought that narrowing the academic focus to reading and the mPortfolio focus to watching a movie of themselves reading would facilitate my research. I also thought that using a recognized reading assessment that incorporates reflection, like Fountas and Pinnell’s reading assessment, would yield more objective, reliable, and valid data on reflection and reading performance.
Pretty wordy, but I really benefited from reading through the previous postings and the responses they generated. Thanks to everyone who jumped in early and got the discussions going.
Don