Preprint: Introducing Synchronous Robustness Reports

“Data analysis is like an iceberg: it floats with one-seventh of its mass above water”
Adjusted from a quotation incorrectly attributed to Sigmund Freud

Preprint: https://osf.io/preprints/psyarxiv/edzfj

Abstract

“The vast majority of empirical research articles feature a single primary analysis outcome that is the result of a single analysis plan, executed by a single analysis team. However, recent multi-analyst projects have demonstrated that different analysis teams usually adopt a unique approach and that there exists considerable variability in the associated conclusions. There appears to be no single optimal statistical analysis plan, and different plausible plans need not lead to the same conclusion. A high variability in outcomes signals that the conclusions are relatively fragile and dependent on the specifics of the analysis plan. Crucially, without multiple teams analyzing the data, it is difficult to gauge the extent to which the conclusions are robust. We propose that empirical articles of particular scientific interest or societal importance be accompanied by two or three short reports that summarize the results of alternative analyses conducted by independent experts. The present paper aims to facilitate the adoption of this approach by providing concrete guidance on how such Synchronous Robustness Reports could be seamlessly integrated within the present publication system.”

Overview

In this short manuscript we do the following:

  1. Present a concrete format for Synchronous Robustness Reports (SRRs). The purpose of the SRR is to offer a concise and immediate reanalysis of the key findings of a recently accepted empirical article. An SRR is limited to 500 words and has five sections: Goal, Methods, Results, Conclusion, and Code & Literature (an illustrative skeleton appears after this list).
  2. Outline a workflow that allows SRRs to be integrated seamlessly within the current publication process.
  3. Provide a TOP-guideline menu for the adoption of SRRs.
  4. Showcase the advantages of the SRR with a concrete empirical example (check it out!).
  5. List four advantages of the SRR format.
  6. Counter four possible objections to the SRR format.
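
To make the format of item 1 concrete, here is a minimal SRR skeleton; the layout and the bracketed placeholders are our illustration based on the five named sections and the 500-word limit, not an official template from the preprint:

    Synchronous Robustness Report on [title of the accepted article]
    [Reanalysis author(s), affiliation(s)]

    Goal:              The key finding of the original article that is being
                       reanalyzed.
    Methods:           The alternative analysis plan (e.g., a different
                       plausible model or estimator) and the rationale for it.
    Results:           The outcome of the reanalysis, stated so that it can be
                       compared directly with the original primary result.
    Conclusion:        Whether the original conclusion is robust to the
                       alternative analysis.
    Code & Literature: A link to the reanalysis code and the references used.

    (Total length: at most 500 words.)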

Concluding Comments

“We believe that SRRs hold considerable promise as a method to gauge robustness and encourage a more diverse statistical perspective on research with profound scientific or societal ramifications.

To those journal editors who believe that SRRs are impractical, uninformative, or otherwise inadvisable, we issue a challenge: assign a Methods Editor and task them with arranging SRRs for, say, five empirical articles that will be published in your journal. Based on these concrete outcomes, you can then make an evidence-based decision on whether your journal ought to be open to SRRs on a more permanent basis.

To those journal editors who believe that SRRs fall short of best practices and wish to raise the bar even further: note that SRRs can be seamlessly combined with preregistration (Nosek et al., 2019), registered reports (Chambers, 2013; Chambers, Dienes, McIntosh, Rotshtein, & Willmes, 2015), and analysis blinding (Dutilh, Sarafoglou, & Wagenmakers, 2021; Sarafoglou, Hoogeveen, & Wagenmakers, 2023), or extended to the many-analyst format (Aczel et al., 2021).

In sum, Synchronous Robustness Reports are a straightforward method to test robustness and reveal an important source of uncertainty that usually remains hidden. Their added value can only be assessed by practical implementation, but prior knowledge and limited experience suggest that Synchronous Robustness Reports are both feasible and informative.”

Key Reference

Bartoš, F., Sarafoglou, A., Aczel, B., Hoogeveen, S., Chambers, C., & Wagenmakers, E.-J. (2024). Introducing Synchronous Robustness Reports: Guidelines for journals. PsyArXiv. Retrieved from https://osf.io/preprints/psyarxiv/edzfj

Other References

Aczel, B., Szaszi, B., Nilsonne, G., van den Akker, O. R., Albers, C. J., van Assen, M. A. L. M., . . . Wagenmakers, E.-J. (2021). Consensus-based guidance for conducting and reporting multi-analyst studies. eLife, 10, e72185. Retrieved from https://doi.org/10.7554/eLife.72185

Chambers, C. D. (2013). Registered Reports: A new publishing initiative at Cortex. Cortex, 49(3), 609–610. Retrieved from https://doi.org/10.1016/j.cortex.2012.12.016

Chambers, C. D., Dienes, Z., McIntosh, R. D., Rotshtein, P., & Willmes, K. (2015). Registered Reports: Realigning incentives in scientific publishing. Cortex, 66, A1–A2. Retrieved from https://doi.org/10.1016/j.cortex.2015.03.022

Dutilh, G., Sarafoglou, A., & Wagenmakers, E.-J. (2021). Flexible yet fair: Blinding analyses in experimental psychology. Synthese, 198(Suppl 23), 5745–5772. Retrieved from https://doi.org/10.1007/s11229-019-02456-7

Nosek, B. A., Beck, E. D., Campbell, L., Flake, J. K., Hardwicke, T. E., Mellor, D. T., . . . Vazire, S. (2019). Preregistration is hard, and worthwhile. Trends in Cognitive Sciences, 23(10), 815–818. Retrieved from https://doi.org/10.1016/j.tics.2019.07.009

Sarafoglou, A., Hoogeveen, S., & Wagenmakers, E.-J. (2023). Comparing analysis blinding with preregistration in the many-analysts religion project. Advances in Methods and Practices in Psychological Science, 6(1), 1–19. Retrieved from https://doi.org/10.1177/25152459221128319

About the author

Eric-Jan Wagenmakers

Eric-Jan (EJ) Wagenmakers is a professor in the Psychological Methods Group at the University of Amsterdam.