The past few weeks have been quite an exciting time for psychological science, with the much-anticipated publication of the Open Science Collaboration’s reproducibility project results in the journal Science. As should be well-known to you by now, the findings were bleak. (For what I believe to be the best summary of this paper, see Ed Yong’s blog post.)
As a contributor to this project, I have been thinking about issues surrounding reproducibility & robustness of methods in psychology for some time. Clearly, something is not right in the way we conduct psychological research, and I am convinced that things need to change. Now.
Felix Schönbrodt recently posted a “Voluntary commitment to research transparency and open science” on his blog, where signatories can commit to the statements contained within. Reading over the statements in the commitment, I believed wholeheartedly that were all researchers to conduct their science according to the commitment, psychology would be in a better place.
Today, I signed this voluntary commitment. Below I re-post this commitment. The wording is identical to that of Felix and his colleagues, but I have added one point (point number 4, below). Although Felix has initiated this commitment, each commitment is personal, and he encouraged personalisation of future commitments. What matters is that researchers publicly declare their commitment. This is a very inspiring initiative, and I would like to thank Felix and his colleagues for leading this.
Voluntary Commitment to Research Transparency & Open Science
We embrace the values of openness and transparency in science. We believe that such research practices increase the informational value and impact of our research, as the data can be reanalyzed and synthesized in future studies. Furthermore, they increase the credibility of the results, as an independent verification is possible.
Here, we express a voluntary commitment about how we will conduct our research. Please note that to every guideline there can be justified exceptions. But whenever we deviate from one of the guidelines, we give an explicit justification for why we do so (e.g., in the manuscript, or in the README file of the project repository).
As signatories, we warrant to follow these guidelines from the day of signature on:
- Open Data: For every first authored publication we publish all raw data necessary to reproduce the reported results on a reliable repository with high data persistence standards (such as the Open Science Framework).
- Reproducible scripts: For every first authored publication we publish reproducible data analysis scripts.
- We provide (and follow) the “21-word solution” in every publication: “We report how we determined our sample size, all data exclusions (if any), all manipulations, and all measures in the study.” [1] If necessary, this statement is adjusted to ensure that it is accurate.
- For every first authored publication, we submit a pre-print of the manuscript to a dedicated archive (such as PeerJ or the Social Science Research Network), to ensure accessibility of the material contained within to all researchers in a timely fashion.
- As co-authors we try to convince the respective first authors to act accordingly.
- As reviewers, we add the “standard reviewer disclosure request”, if necessary (https://osf.io/hadz3/). It asks the authors to add a statement to the paper confirming whether, for all experiments, they have reported all measures, conditions, data exclusions, and how they determined their sample sizes.
- As reviewers, we ask for Open Data (or a justification why it is not possible). [2]
Supervision of Dissertations
- As PhD supervisors we put particular emphasis on the propagation of methods that enhance the informational value and the replicability of studies. From the very beginning of a supervisor-PhD student relationship we discuss these requirements explicitly.
- From PhD students, we expect that they provide Open Data, Open Materials and reproducible scripts to the supervisor (they do not have to be public yet).
- If PhD projects result in publications, we expect that they follow the first three points above (Open Data, reproducible scripts, and the 21-word solution).
- In the case of a series of experiments with a confirmatory orientation, it is expected that at least one pre-registered study is conducted with a justifiable a priori power analysis (in the frequentist case), or a strong evidence threshold (e.g., if a sequential Bayes factor design is implemented). A pre-registration consists of the hypotheses, design, data collection stopping rule, and planned analyses.
- The grading of the final PhD thesis is independent of the studies’ statistical significance. Publications are aspired to; however, a successful publication is not a criterion for passing or grading.
Service to the Field
- As members of committees (e.g., tenure track, appointment committees, teaching, professional societies) or editorial boards, we will promote the values of open science.
[1] Simmons, J. P., Nelson, L. D., & Simonsohn, U. (2012). A 21 word solution. Retrieved from: http://dx.doi.org/10.2139/ssrn.2160588
[2] See also the Peer Reviewers’ Openness Initiative: http://opennessinitiative.org/