Because experimental research in teaching and learning is a difficult undertaking (especially in actual classrooms rather than the laboratory), we lack evidence-based guidance for some aspects of our teaching. Where evidence does exist, though, I strive to incorporate it into my practice. So what happens when I realize that my evidence-based practice is based on (apparently) fabricated data? First, I take a moment to appreciate the irony that the alleged fabrication was in a study of honesty.
Background
Two years ago, I wrote about my experiences with academic integrity in the classroom, along with one “evidence-based” practice I began using based on the findings of a PNAS article (Shu et al., 2012). In a nutshell, this study found that when asked to self-report information (like on a tax return), people are more likely to report honestly if they are asked to sign a statement affirming veracity before they fill out the information rather than after. This made logical sense to me and was relatively easy to implement in my courses, so from then on I began all of my exams and online quizzes by asking students to affirm that they would follow the relevant rules. All was well.
Cut to August 15, 2021. I once again shared that old academic integrity blog post via Twitter, and then two days later, the Data Colada blog also posted on the topic. Had I been paying attention in the years since my original post, I would have seen that subsequent studies failed to corroborate the findings of the 2012 paper, and that in 2020 a new paper was published in PNAS, titled Signing at the beginning versus at the end does not decrease dishonesty (Kristal et al., 2020). It was written by the same group as the 2012 paper, plus two new authors. As the title suggests, they were unable to replicate the 2012 findings. They noted in part that one experiment in the earlier study, relating to auto insurance, contained a data anomaly that led them to speculate that “…the randomization failed (or may have even failed to occur as instructed) in that study.” The new publication included the raw data from the 2012 study, and a group of anonymous researchers analyzed these data in collaboration with Data Colada. Their analysis led them to conclude that,
“There is very strong evidence that the data [for the auto insurance study included in Shu et al. (2012)] were fabricated.”
(Data Colada & Anonymous, 2021)
All authors of Shu et al. (2012) confirmed that Dan Ariely was the only author in contact with the (undisclosed) insurance company from which those data were purportedly received, and Ariely stated that he did not modify the data file after he received it. However, circumstantial evidence suggests that he created the data (O’Grady, 2021). The article was retracted by PNAS, at the request of the authors, on September 21, 2021.
So What Now?
My original blog post on academic integrity preceded the pandemic. Since then, the rapid transition to online teaching has forced many educators, including me, to grapple more directly with issues of academic integrity, fairness, privacy, and trust. My preferred approach is to minimize the opportunity and motive for cheating while erring on the side of trust. During the pandemic online pivot, this meant that I gave only open-book quizzes and exams, and I did not use any surveillance software. The only forbidden exam activity was collaborating with classmates. I continued to begin exams with my integrity pledge. Do I think some of my students cheated? In a class of 175 students, I’d be surprised if it didn’t happen. That is the choice I made, though, so that my students could work without surveillance or suspicion.
I now know that it likely doesn’t matter whether students sign an academic integrity pledge before or after an exam, which makes me wonder whether signing the pledge makes any difference at all. I know some students may take advantage of my current approach (especially online), but I strongly prefer it to the atmosphere of suspicion and distrust that comes with using something like Proctorio. Still, I now have some hesitation about the integrity pledge. By asking students to sign it, am I implying that I’m concerned they might cheat? Or, by taking them at their word, does it simply show trust?
Perhaps the more important question is whether this approach even works. I haven’t dug into the literature in depth, but there is evidence that honor codes, when implemented at the institutional level, reduce instances of cheating (D. L. McCabe et al., 2002; D. McCabe & Treviño, 2002). At institutions without honor codes (like mine), the evidence is mixed on whether classroom-level honor codes reduce cheating, but they do increase students’ perception of respect and trust from their instructor (Konheim-Kalkstein et al., 2008).
For the time being, I will continue to use an academic integrity pledge (for both in-person and online exams). I don’t usually solicit comments on my posts, but if you would like to share, I’m curious about your ideas for supporting academic integrity in the classroom and online without conveying suspicion or distrust of your students.
References
Data Colada, & Anonymous. (2021, August 17). [98] Evidence of Fraud in an Influential Field Experiment About Dishonesty. Data Colada. http://datacolada.org/98
Konheim-Kalkstein, Y. L., Stellmack, M. A., & Shilkey, M. L. (2008). Comparison of Honor Code and Non-Honor Code Classrooms at a Non-Honor Code University. Journal of College and Character, 9(3), 1–13. https://doi.org/10.2202/1940-1639.1115
Kristal, A. S., Whillans, A. V., Bazerman, M. H., Gino, F., Shu, L. L., Mazar, N., & Ariely, D. (2020). Signing at the beginning versus at the end does not decrease dishonesty. Proceedings of the National Academy of Sciences, 117(13), 7103–7107. https://doi.org/10.1073/pnas.1911695117
McCabe, D. L., Treviño, L. K., & Butterfield, K. D. (2002). Honor Codes and Other Contextual Influences on Academic Integrity: A Replication and Extension to Modified Honor Code Settings. Research in Higher Education, 43(3), 357–378. https://doi.org/10.1023/A:1014893102151
McCabe, D., & Treviño, L. K. (2002). Honesty and Honor Codes. Academe, 88(1), 37. https://doi.org/10.2307/40252118
O’Grady, C. (2021). Honesty study was based on fabricated data. Science, 373(6558), 950–951. https://doi.org/10.1126/science.373.6558.950
[RETRACTED] Shu, L. L., Mazar, N., Gino, F., Ariely, D., & Bazerman, M. H. (2012). Signing at the beginning makes ethics salient and decreases dishonest self-reports in comparison to signing at the end. Proceedings of the National Academy of Sciences of the United States of America, 109(38), 15197–15200. https://doi.org/10.1073/pnas.1209746109
Thanks, Patrick, for this informative post. Since I am in a similar situation to yours (doing online exams, wanting to trust students and promote integrity), I read with much interest. I think your conclusions and approach make a lot of sense. Just a couple of comments/questions:
The studies you refer to appear to be focused on the question of whether signing an integrity pledge first or last makes a difference (rather than the effect of signing a pledge versus not signing a pledge). If the answer to that (first vs. last) appears to be no, then it is still possible that there is an effect of having vs. not having a pledge, right?
Furthermore, it seems that the lack of a detectable (i.e., statistically significant) effect in those studies does not allow us to rule out the possibility that there may in fact be an effect, especially in other contexts. I imagine that you (and I) use a lot of coaching and language about the importance of academic integrity, perhaps more so than in the studies in that second paper?
Perhaps I am just trying to hold out some hope that these pledges help . . .
Best wishes,
Darren
Thanks, Darren! Yes, there is definitely evidence that these pledges do help; I just need to dig into the research a bit more to figure out the best practices. This is especially true since most studies seem to focus on program- or university-wide honor codes. It’s a bit different to implement something in your own course if it isn’t also done in your students’ other courses.