Author Archives: Stephen Petrina

Meta-Analysis, the latest chapter in a book titled Methods of Analysis

Just uploaded Meta-Analysis, the latest chapter in a book I’m working on titled Methods of Analysis. It’s basically a side project or backstory to a few other research projects. Download the essays or working chapters from https://blogs.ubc.ca/researchmethods/methodologies/analysis/

Methods of Analysis

I’m working on a manageable book titled Methods of Analysis. It’s basically a side project or backstory to a few other research projects. Download the essays or working chapters from https://blogs.ubc.ca/educ500/methodologies/analysis/. The first paragraph of the introduction is tentatively this:

The purpose of this book is to provide a sense of the histories, philosophies, premises, and procedures of a range of methods of analysis. I try to say something novel, something that has not already been said, about the methods herein. This is obviously meta-analysis, understood to be an analysis of analysis and a synthesis of analysis. Meta-analysis implies analytic theory, requiring a typology of methods or a methodology. Researchers take analysis for granted, having neglected meta-analysis and the history of analysis. Researchers also take methods of analysis for granted, readily overlooking interrelationships and blurring distinctions. Among the expansive volume of texts on methods and methodologies, there is no single text or source that draws out the relationships and distinctions among a range of methods of analysis. This book addresses that oversight. Since the early nineteenth century and the development of logical analysis, methods of analysis have proliferated. This is unique, as the analogous historical journey of synthesis witnessed no such proliferation. Nonetheless, a companion or sequel on methods of synthesis would be helpful.

Did #Brexit supporters consume quantities of Mad Cows?

Coincidence is not correlation, nor is correlation causation. Or is it? One of the more convincing data analyses post-Brexit was the depiction of a near-perfect correlation between the counties and regions in the UK that experienced outbreaks of BSE, or Mad Cow disease, in 1992 and the counties and regions voting to Leave the EU in 2016.

[Image: map of 1992 BSE (Mad Cow disease) outbreaks alongside the 2016 Leave vote by UK county and region]

While it might be a stretch to suggest that those who anticipated Leaving the EU consumed great quantities of Mad Cows in 1992, the near-perfect correlation is telling.
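A minimal sketch of the arithmetic behind such a claim, using entirely made-up per-region numbers rather than the actual UK data, would look something like this:

```python
# Illustrative only: the values below are invented, not the real UK figures.
from statistics import correlation  # Python 3.10+

# Hypothetical per-region values: 1992 BSE cases and 2016 Leave vote share (%)
bse_cases_1992 = [120, 85, 60, 140, 30, 95]
leave_share_2016 = [61.0, 55.5, 48.2, 63.4, 40.1, 57.8]

r = correlation(bse_cases_1992, leave_share_2016)
print(f"Pearson r = {r:.2f}")  # any "near-perfect" r here is baked into the fake data
```

With real data the exercise would also require agreeing on geographic units and sources; even then, a strong region-level correlation says nothing about mechanism, which is rather the point.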

#Unbelievable: experts and common sense

Coincidence is not correlation. At about the same time that the UK was moving towards voting to exit the European Union in a referendum, the WHO’s International Agency for Research on Cancer (IARC) was downgrading coffee and upgrading “hot beverages” as causal agents in cancer.

As the story goes, expert opinion debunked 400,000 years of the human experience of drinking hot beverages while the Remain camp boasted that expert opinion was on their side.

Thank you experts.

Leading up to Brexit, the Conservative Party’s Michael Gove remarked: “I think people in this country have had enough of experts.” Anti-expert sentiment gained momentum and perhaps the upper hand. On the brink of the British exit from the EU, and in its aftermath, experts scrambled for legitimacy:

Expertise can breed arrogance and false certainty; specialisms fall prey to group-think. Experts must be challenged and their work robustly interrogated. But that is very different from attacking evidence merely because it undermines your arguments (something that both Remainers and Leavers have done) and instantly impugning the motives of those who have produced it (ditto).

How could experts have got it so wrong? Immediately post-referendum, one pundit asked: “do the experts know anything?” Another queried whether we are now witnessing “the twilight of the experts.”

Were data saying one thing while experts were hearing another? Intentionally?

Welcome to the summer section of EDUC 500: Research Methodologies!

Getting inside people’s heads via ethnographic and phenomenological interviews

Distinctions between ethnographic and phenomenological interviews are profound and extremely important to maintain. In Research Decisions, Palys & Atchison allocate just two introductory pages to phenomenology, which is reframed as a philosophy of phenomenologism. Why is that?

In the glossary, P&A write that phenomenologism is “an approach to understanding whose adherents assert that we must ‘get inside people’s heads’ to understand how they perceive and interpret the world” (p. 425).

Can or should the phenomenologist or ethnographer ‘get inside people’s heads’? Is this one of the purposes of research with human subjects via ethnographic and phenomenological methods? Add to this historical methods, and so on?

In “Culture and Causality,” Fricke resolves at length:

It is true… that we cannot get inside people’s heads to actually know what motivates them or how they see the world. It is true, in other words, that we operate in terms of theories. But this is the same context within which we live our everyday lives. If I ask myself the question in my everyday dealings with people, “Why did she do that?,” I seek an answer as an attempt to understand that person’s view of the world, her motivations, and the concrete circumstances of a situation. I acknowledge, if I want to get as close to the true reasons as possible, that I might be wrong in my interpretation and that more information, or the reach for more consistency in light of the available information, may cause me to modify my initial understanding. If the thing I seek to explain is important, or if the person is particularly important to me, I may try to include information about her past history and wider networks of kin and association.

In some ways, anthropological fieldwork replicates this prosaic operation of the everyday. Our attempts at understanding are imaginative acts in which we try to get inside of the head of the cultural other. (pp. 476-477)

What do you think? Is this made obvious in research 2.0?

Science 2.0: Analytics predict risk of metabolic syndrome

News Staff, Science20.com, June 27, 2014– Obviously, as the creators of the four pillars of the Science 2.0 concept, we’re interested in new ways to use data to make meaningful decisions, but we recognize that key breakthroughs are more likely to happen in the private sector, where money can be made filling a demand.

A paper by Aetna and GNS Healthcare Inc. in the American Journal of Managed Care demonstrates how analysis of patient records using analytics can predict future risk of metabolic syndrome.

This could be useful, not just because a third of the population has the somewhat fuzzily-defined metabolic syndrome, a condition that can lead to chronic heart disease, stroke and diabetes, or because obesity accounts for almost 20 percent of overall health care costs in the U.S., but because it’s a roadmap for how to port the Science 2.0 approach to lots of fields.

“This study demonstrates how integration of multiple sources of patient data can help predict patient-specific medical problems,” said lead author Dr. Gregory Steinberg, head of clinical innovation at Aetna Innovation Labs. “We believe the personalized clinical outreach and engagement strategies, informed by data from this study, can help improve the health of people with metabolic syndrome and reduce the associated costs.”

“The breakthrough in this study is that we are able to bring to light hyper-individualized patient predictions, including quantitatively identifying which individual patients are most at risk, which syndrome factors are most likely to push that patient past a threshold, and which interventions will have the greatest impact on that individual,” said Colin Hill, co-founder and CEO of GNS. “The GNS automated data analytics platform paired with Aetna’s deep clinical expertise produced these results on extremely large datasets in just three months, a testament to the ability of both groups.”

GNS analyzed data from nearly 37,000 members of one of Aetna’s employer customers who had voluntarily participated in screening for metabolic syndrome. The data analyzed included medical claims records, demographics, pharmacy claims, lab tests and biometric screening results over a two-year period.
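The article does not publish the GNS model itself, so as an illustration only, here is a minimal sketch, on synthetic data and with hypothetical feature names loosely echoing the data types listed above (demographics, claims, lab tests, biometric screening), of the general input-to-risk-score pattern such analytics follow; scikit-learn’s logistic regression stands in for the proprietary platform:

```python
# Illustrative sketch only: this is NOT the GNS/Aetna platform or model.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000

# Hypothetical features loosely echoing the data types named in the article:
# age (demographics), BMI and blood pressure (biometric screening),
# prior claims count (medical/pharmacy claims), fasting glucose (lab tests).
X = np.column_stack([
    rng.normal(45, 12, n),   # age in years
    rng.normal(28, 5, n),    # body mass index
    rng.normal(125, 15, n),  # systolic blood pressure
    rng.poisson(3, n),       # claims filed over the past two years
    rng.normal(100, 20, n),  # fasting glucose
])

# Synthetic outcome: risk rises with BMI and glucose (purely for illustration).
logit = 0.15 * (X[:, 1] - 28) + 0.04 * (X[:, 4] - 100) - 1.0
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Per-patient predicted probability of developing metabolic syndrome.
risk = model.predict_proba(X_test)[:, 1]
print("Five highest-risk patients in the test set:", np.argsort(risk)[-5:])
```

The actual study integrated two years of member data in a far more sophisticated platform; the sketch only shows the shape of the task: tabular patient features in, a per-patient risk probability out.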

Read More: Science20.com

Googling is NOT College and Career Readiness

Paige Jaeger, LibraryDoor, April 25, 2014– This morning I received a desperate plea from a super-librarian who has seen her program go down the tubes with the arrival of one-to-one devices incorrectly implemented in siloed classrooms. What a shame. As a district adopts a new “writing program” with built-in research tasks, old tasks get dropped in order to accommodate new instructional models that have been crafted to increase someone’s bottom line.

Ironically, this school, which has a flexible schedule to allow for innovative learning endeavors, is reverting to a model of one-size-fits-all learning tasks, demoralizing a cutting-edge model of flexible scheduling designed to accommodate curriculum needs.

If this sounds like your scenario, please wrap your head around a few poignant truths for advocacy. The three teacher-assessment questions below are a great starting point for discussion at faculty meetings, principal appointments, or in the lunchroom. Simple truths such as these may help to open research collaboration doors. These are merely three of many possibilities, but they are effective one-liners to help secure and maintain your foothold in research, in spite of new writing programs, learning modules, or other packaged products that arrive in your building!

[Image: the three teacher-assessment questions referenced above]

Inherent in transforming information are synthesis and a conclusion…. Transfer requires only reporting of data without deep understanding. Most commercially sold writing programs do not understand this. If assignments don’t include an element of transforming information, they are low-level thought and do NOT meet our state’s model of investigation or the objectives of the Common Core.

We are living in an Age of Misinformation, not the Information Age. Students need to learn how to access information as well as synthesize it to draw conclusions. This is college and career readiness. Not finding information on Google or even on vetted websites and jotting those notes into a pro-forma document or virtual index cards….

At the New England Library Association conference a few weeks ago, where I helped with pre-conference PD, I met many great librarians who also bemoaned this scenario. We jokingly said we’d come up with a 12-Step program for recovery. Well, we’ve done better than that! We’ve boiled it down to 5 simple steps, because we know that brain research says the brain can’t remember more than 4 at a time!

  1. Administer the Google litmus test 
  2. Insert an Essential Question at the beginning that will foster synthesis of those facts and conclusions
  3. Require credible library resources to be used 
  4. Embed technology for engagement – somewhere 
  5. Ensure that students have an opportunity to “present” their knowledge

Now we really know that there is more to it than that, but these five simple steps will not scare them away from “Repackaging Research.”

Read More: Librarydoor.blogspot.ca

Is Googling Research?

One of the most pressing challenges for librarians and teachers is introducing students to research methods and processes. In the P-12 system and in post-secondary education, teachers and supervisors take pains to distinguish between googling and research or between Wikipedia and reliable sources. This often reduces to guidelines for smart Googling, elaborate “Research Methods Beyond Google,” or lengthier cautions about plagiarism traps within search engines.

Similarly, cautions are raised about Wikipedia as an academic or medical source (“Something you Should know Before Googling“), and critics love to point to founder Jimmy Wales’ infamous comment: “For God’s sake, you’re in college; don’t cite the encyclopedia.”

When Purcell et al.’s report on “How Teens Do Research in the Digital World” was released in 2012, no one was really surprised by the findings. The survey of over 2,000 middle and high school teachers found that ‘research’ for students means Googling. Two years later, few would argue that this has changed.

How can students be wrong? Google’s mission remains: “To organize the world’s information and make it universally accessible and useful.” And Google itself relies on Googling for “Google’s Hybrid Approach to Research.”

Research 2.0

One of the more profound iterations of research over the past decade is its versioning as research 2.0. Yet what is research 2.0? Is it merely that the means of dissemination and translation from basic research to application have changed with new media and technologies? Does research 2.0 mean that we now have research media and technologies that our more senior mentors did not have? Does it mean that we now have open access to a myriad of publications from which we may learn and to which we may contribute? Does it mean that googling and tweeting are research? Or is it something much deeper?

Similarly, one of the more profound insights into this is Latour’s (1998) “From the World of Science to the World of Research?” What does he mean by ‘from science to research’? Doesn’t one imply the other? Things are perhaps worse than they seem: “There is a philosophy of science, but unfortunately there is no philosophy of research,” he says. The difference between science and research begins with a new relationship between science and society: “They are now entangled to the point where they cannot be separated any longer” (p. 208). “That is what has changed most. Science does not enter a chaotic society to put order into it anymore, to simplify its composition, and to put an end to its controversies…. When scientists add their findings to the mix, they do not put an end to politics; they add new ingredients to the collective process” (p. 209).

This changing relationship between science and society is more profound than it seems. More importantly, it can now be understood as a “New Deal between research and society” (p. 209). Latour concludes: “Scientists now have the choice of maintaining a 19th-century ideal of science or elaborating with all of us, the hoi polloi, an ideal of research better adjusted to the collective experiment on which we are all embarked.”

Research 2.0?