I’m working on a manageable book titled Methods of Analysis. It’s basically a side project or backstory to a few other research projects. Download the essays or working chapters from http://blogs.ubc.ca/educ500/methodologies/analysis/ The first paragraph of the introduction is tentatively this:
The purpose of this book is to provide a sense of the histories, philosophies, premises, and procedures of a range of methods of analysis. I try to say something novel, something that has not already been said, about each of the methods herein. This is, of course, meta-analysis, understood as an analysis of analysis and a synthesis of analyses. Meta-analysis implies analytic theory, which in turn requires a typology of methods, or a methodology. Researchers take analysis for granted, having neglected meta-analysis and the history of analysis. They also take methods of analysis for granted, readily overlooking interrelationships and blurring distinctions. Among the expansive volume of texts on methods and methodologies, no single text or source draws out the relationships and distinctions among a range of methods of analysis. This book addresses that oversight. Since the development of logical analysis in the early nineteenth century, methods of analysis have proliferated. This is unique: the analogous historical journey of synthesis witnessed no such proliferation. Nonetheless, a companion or sequel on methods of synthesis would be helpful.
Coincidence is not correlation, nor is correlation causation. Or is it? One of the more convincing data analyses post-Brexit was the depiction of a near-perfect correlation between the counties and regions of the UK that experienced outbreaks of BSE, or Mad Cow disease, in 1992 and those voting to Leave the EU in 2016.
While it might be a stretch to suggest that those who anticipated Leaving the EU consumed great quantities of Mad Cows in 1992, the near-perfect correlation is telling.
Coincidence is not correlation. At about the same time that the UK was moving toward a Leave vote in its referendum on exiting the European Union, the WHO’s International Agency for Research on Cancer (IARC) was downgrading coffee and upgrading “hot beverages” as a causal agent in cancer.
As the story goes, expert opinion debunked 400,000 years of the human experience of drinking hot beverages while the Remain camp boasted that expert opinion was on their side.
Thank you, experts.
Leading up to Brexit, the Conservative Party’s Michael Gove remarked: “I think people in this country have had enough of experts.” Anti-expert sentiment gained momentum, and perhaps the upper hand. At the brink of the British exit from the EU, and in its aftermath, experts scrambled for legitimacy:
Expertise can breed arrogance and false certainty; specialisms fall prey to group-think. Experts must be challenged and their work robustly interrogated. But that is very different from attacking evidence merely because it undermines your arguments (something that both Remainers and Leavers have done) and instantly impugning the motives of those who have produced it (ditto).
How could the experts have got it so wrong? Immediately post-referendum, one pundit asked: “do the experts know anything?” Another queried whether we are now witnessing “the twilight of the experts.”
Were data saying one thing while experts were hearing another? Intentionally?
Welcome to the summer section of EDUC 500: Research Methodologies!