Preprint: Teaching Good Research Practices: Protocol of a Research Master Course

This post is an extended synopsis of Sarafoglou A., Hoogeveen S., Matzke D., & Wagenmakers, E.-J. (in press). Teaching Good Research Practices: Protocol of a Research Master Course. Preprint available on PsyArXiv: https://psyarxiv.com/gvesh/

Summary

The current crisis of confidence in psychological science has spurred field-wide reforms to enhance transparency, reproducibility, and replicability. To solidify these reforms within the scientific community, student courses on open science practices are essential. Here we describe the content of our Research Master course “Good Research Practices”, which we designed and taught at the University of Amsterdam. Supported by Chambers’ recent book The Seven Deadly Sins of Psychology, the course covered topics such as questionable research practices (QRPs), the importance of direct and conceptual replication studies, preregistration, and the public sharing of data, code, and analysis plans. We adopted a pedagogical approach that (1) reduced teacher-centered lectures to a minimum; (2) emphasized practical training in open science practices; and (3) encouraged students to engage in the ongoing discussions in the open science community on social media platforms. In the course, regular classes alternated with classes organized by the students themselves; an example of each is given below. In addition, Table 1 displays a selection of further topics discussed in the course.

Example of a Regular Class: The Sin of Data Hoarding

The fifth week of the course focused on “The Sin of Data Hoarding” (Chambers, 2017), that is, the chapter on data sharing (for a recent special issue, see Simons, 2018). As an expert on this topic, we invited Bobby Lee Houtkoop, a former student of the program. Houtkoop recently conducted and published a survey study on why researchers are reluctant to share their data and what can be done to overcome this reluctance (Houtkoop et al., 2018). In her lecture, Houtkoop discussed the dominant scientific culture in which data sharing is not the norm, even though data sharing offers unequivocal advantages for both the author and the scientific community. In cancer research, for instance, studies for which data were publicly shared received higher citation rates than studies for which data were not available (Piwowar, Day, & Fridsma, 2007). In addition, data sharing may improve the reputation or perceived integrity of the researcher. The scientific community benefits from data sharing because (1) it increases the longevity of the data, (2) data can be reanalyzed and reused efficiently (e.g., for meta-analyses), and (3) statistical or reporting errors are more likely to be found (Vanpaemel, Vermorgen, Deriemaecker, & Storms, 2015; Wicherts, Borsboom, Kats, & Molenaar, 2006). Houtkoop then presented the methods and results of the survey study. The survey results demonstrated that data are shared only infrequently. Most respondents acknowledged the benefits and importance of data sharing in general; however, they perceived data sharing as less beneficial for their own research projects. Among the perceived barriers to data sharing were the respondents’ belief that data sharing is not common practice in their fields, their preference to share data only upon request, their perception that data sharing requires additional work, and their perceived lack of training in data sharing. Houtkoop’s study sparked a lively discussion among the students about future research and about initiatives that encourage data sharing, but also about limitations of the study. In particular, the students were critical of potential biases in the results due to the low response rate of the survey (about 5%, which nevertheless translated into a sample of 600 respondents) and the self-selection of the respondents.

The end of the class featured a “Newsflash”. In that particular week, the scientific community was excitedly debating the results of the just-published “Many Labs 2” project (Klein et al., 2018). In this project, the participating research teams conducted high-powered, preregistered replications of 28 classic and contemporary findings across many samples and settings. Only 15 of the 28 findings (54%) could be replicated. In the Newsflash, students discussed the article by Klein et al. (2018), the related news article in The Atlantic titled “Psychology’s Replication Crisis Is Running Out of Excuses” (Yong, 2018), and the BBC radio episode on the replication crisis (BBC Radio 4, 2018).

Example of a Student Class: The Sin of Data Hoarding

The student lecture continued where Houtkoop’s study left off. The student presenters emphasized the benefits of data sharing and created a tutorial for their peers on how to archive and share the data of simple empirical studies on the Open Science Framework (see also Soderberg, 2018). The objective of this lecture was to encourage their peers to ask their future thesis supervisors for permission to share the collected data in a public repository. The in-class assignment revolved around the Peer Reviewers’ Openness Initiative (PRO; Morey et al., 2016). Specifically, the presenters had their peers create a set of questions for the signatories of the PRO Initiative, inquiring about the signatories’ post-PRO experiences with journals and editors, their attitudes towards data sharing in their own research, and whether and how they would improve the initiative. Students were divided into small groups and were instructed to read the article by Morey et al. (2016) on the PRO Initiative. Each group then had to propose concrete questions for the PRO signatories. In a plenary discussion, the students reviewed the questions, selected the ones they found most relevant, and created a survey. Since this exercise generated items that seemed informative and useful, the students who prepared the class decided to execute the survey as a separate research project. Currently, the PRO Initiative survey has elicited responses from over 120 of the 340 current signatories for whom email addresses could be retrieved (i.e., 37.4%).
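The tutorial itself is not reproduced in this synopsis, but to give a flavor of how such archiving can be scripted, the sketch below uploads a data file to an OSF project with the third-party osfclient Python library. This is our own minimal illustration rather than the students’ tutorial; the project id, token, and file name are placeholders.

```python
# Minimal sketch: uploading a data file to an OSF project with the
# third-party `osfclient` library (pip install osfclient).
# The token, project id, and file name below are placeholders.
from osfclient import OSF

# A personal access token can be generated in the OSF account settings.
osf = OSF(token="YOUR_OSF_TOKEN")

# "abc12" stands in for the five-character id of your OSF project.
project = osf.project("abc12")
storage = project.storage("osfstorage")

# Upload the raw data file; the remote path mirrors the local name.
with open("data.csv", "rb") as fp:
    storage.create_file("data.csv", fp)

# List what is now stored in the project, as a quick sanity check.
for f in storage.files:
    print(f.path)
```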

Table 1. Further Topics Covered and Suggested Literature for the Course “Good Research Practices” (selection)

Topic: Blinded analyses
Description: In this class we discussed analysis blinding as a valuable addition to study preregistration to avoid hidden flexibility in data analysis. Like preregistration, analysis blinding prevents implicit or explicit forms of significance-chasing, but it retains the possibility for the data analyst to account for unexpected features of the data (a minimal illustration follows the table).
Suggested Literature: MacCoun & Perlmutter, 2015, 2018.
Topic: Statistical errors (with Olmo van den Akker)
Description: Statistical reporting errors can lead to erroneous substantive conclusions. In this class we discussed how researchers can minimize the chance of statistical reporting errors by using software, such as statcheck, that automatically detects inconsistencies (a sketch of the underlying check also follows the table). The lecture was given by Olmo van den Akker, a member of the Meta-Research Center at Tilburg University, which specializes in research on scientific misconduct and reproducibility.
Suggested Literature: Chambers, 2017, Chapter 6; Epskamp & Nuijten, 2016; Greenland et al., 2016; Nuijten, Hartgerink, van Assen, Epskamp, & Wicherts, 2016.
Topic: Registered Reports (with Chris Chambers)
Description: Apart from writing the course textbook The Seven Deadly Sins of Psychology, Chambers participated in drafting the TOP guidelines and chairs the Registered Reports committee supported by the Center for Open Science. In his class, Chris Chambers shared his experiences of how he first proposed the Registered Report format to the Cortex editorial board, how the initiative was implemented in the journal, and how Registered Reports are having a growing influence on the scientific community.
Suggested Literature: Chambers, 2017, Chapter 8; Chambers, 2013.
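To make the idea of analysis blinding concrete, here is a minimal sketch in Python (our own illustration, not course material): the condition labels are shuffled before the analyst sees the data, the analysis pipeline is developed and debugged on the blinded data, and only the frozen pipeline is rerun once on the true labels.

```python
# Minimal sketch of analysis blinding for a two-group comparison.
# The data and the pipeline are hypothetical illustrations.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=1)

# Hypothetical raw data: outcome scores and true condition labels.
scores = rng.normal(loc=100, scale=15, size=60)
labels = np.repeat(["control", "treatment"], 30)

# Blinding step: shuffle the labels so that any group difference in the
# blinded data is pure noise. The analyst develops the pipeline on this.
blinded_labels = rng.permutation(labels)

def pipeline(scores, labels):
    """Analysis pipeline: outlier removal, then a two-sample t test."""
    keep = np.abs(stats.zscore(scores)) < 3  # simple outlier rule
    s, l = scores[keep], labels[keep]
    return stats.ttest_ind(s[l == "treatment"], s[l == "control"])

# Develop and debug the pipeline on the blinded data...
print("blinded:", pipeline(scores, blinded_labels))

# ...and only after the pipeline is frozen, unblind and run it once.
print("unblinded:", pipeline(scores, labels))
```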
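The statcheck package cited above is an R tool; as a language-agnostic illustration of the consistency check it automates, the following sketch re-derives the p-value implied by a reported t statistic and its degrees of freedom, and flags a mismatch with the reported p-value. The reported result is made up, and the simple tolerance rule is a simplification of statcheck’s actual rounding logic.

```python
# Sketch of the consistency check behind tools such as statcheck:
# re-derive the p-value from a reported test statistic and compare it
# with the p-value the authors reported. The example input is made up.
import re
from scipy import stats

def check_t_result(text, tolerance=0.0005):
    """Parse an APA-style t test report and recompute its p-value."""
    match = re.search(
        r"t\((\d+)\)\s*=\s*(-?\d+\.?\d*),\s*p\s*=\s*(\.\d+)", text)
    if match is None:
        return None
    df, t_value, p_reported = (float(g) for g in match.groups())
    # Two-sided p-value implied by the reported t and df.
    p_computed = 2 * stats.t.sf(abs(t_value), df)
    consistent = abs(p_computed - p_reported) < tolerance
    return p_reported, p_computed, consistent

# A fabricated reporting error: t(28) = 2.20 actually gives p ≈ .036.
print(check_t_result("t(28) = 2.20, p = .015"))
```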

Link to preprint: https://psyarxiv.com/gvesh/

References

BBC Radio 4. (2018). The replication crisis [Radio broadcast]. BBC. Retrieved from https://www.bbc.co.uk/programmes/m00013p9

Chambers, C. D. (2013). Registered reports: A new publishing initiative at Cortex. Cortex, 49, 609–610.

Chambers, C. D. (2017). The seven deadly sins of psychology: A manifesto for reforming the culture of scientific practice. Princeton: Princeton University Press.

Epskamp, S., & Nuijten, M. B. (2016). statcheck: Extract statistics from articles and recompute p-values (R package version 1.3.0).

Greenland, S., Senn, S. J., Rothman, K. J., Carlin, J. B., Poole, C., Goodman, S. N., & Altman, D. G. (2016). Statistical tests, p-values, confidence intervals, and power: A guide to misinterpretations. European Journal of Epidemiology, 31, 337–350.

Houtkoop, B. L., Chambers, C., Macleod, M., Bishop, D. V., Nichols, T. E., & Wagenmakers, E.-J. (2018). Data sharing in psychology: A survey on barriers and preconditions. Advances in Methods and Practices in Psychological Science, 1, 70–85.

Klein, R., Vianello, M., Hasselman, F., Adams, B., Adams, R., Alper, S., … Nosek, B. (2018). Many Labs 2: Investigating variation in replicability across sample and setting. Advances in Methods and Practices in Psychological Science, 1, 443–490.

MacCoun, R., & Perlmutter, S. (2015). Hide results to seek the truth: More fields should, like particle physics, adopt blind analysis to thwart bias. Nature, 526, 187–190.

MacCoun, R., & Perlmutter, S. (2018). Blind analysis as a correction for confirmatory bias in physics and in psychology. In S. O. Lilienfeld & I. Waldman (Eds.), Psychological science under scrutiny: Recent challenges and proposed solutions (pp. 297–322). John Wiley and Sons.

Morey, R. D., Chambers, C. D., Etchells, P. J., Harris, C. R., Hoekstra, R., Lakens, D., … Zwaan, R. A. (2016). The peer reviewers’ openness initiative: Incentivizing open research practices through peer review. Royal Society Open Science, 3, 150547.

Nuijten, M. B., Hartgerink, C. H., van Assen, M. A., Epskamp, S., & Wicherts, J. M. (2016). The prevalence of statistical reporting errors in psychology (1985–2013). Behavior Research Methods, 48, 1205–1226.

Piwowar, H., Day, R., & Fridsma, D. (2007). Sharing detailed research data is associated with increased citation rate. PLoS ONE, 2, e308.

Simons, D. (Ed.). (2018). Challenges in making data available [Invited Forum]. Advances in Methods and Practices in Psychological Science, 1.

Soderberg, C. (2018). Using OSF to share data: A step-by-step guide. Advances in Methods and Practices in Psychological Science, 1, 115–120.

Vanpaemel, W., Vermorgen, M., Deriemaecker, L., & Storms, G. (2015). Are we wasting a good crisis? The availability of psychological research data after the storm. Collabra, 1, 1–5.

Wicherts, J., Borsboom, D., Kats, J., & Molenaar, D. (2006). The poor availability of psychological research data for reanalysis. American Psychologist, 61, 726–728.

Yong, E. (2018). Psychology’s replication crisis is running out of excuses [Online news article]. The Atlantic. Retrieved from https://www.theatlantic.com/science/archive/2018/11/psychologys-replication-crisis-real/576223/

About the Authors

Alexandra Sarafoglou

Alexandra Sarafoglou is a PhD candidate at the Psychological Methods Group at the University of Amsterdam.

Suzanne Hoogeveen

Suzanne Hoogeveen is a PhD candidate at the Department of Social Psychology at the University of Amsterdam.

Dora Matzke

Dora Matzke is an assistant professor at the Psychological Methods Group at the University of Amsterdam.

Eric-Jan Wagenmakers

Eric-Jan (EJ) Wagenmakers is a professor at the Psychological Methods Group at the University of Amsterdam.