Reproducibility & Replicability

Reproducibility and replicability are related but distinct concepts, and the distinction matters when discussing how research workflows unfold across disciplines. Historically, not all disciplines have considered reproducibility a core concern, but the increased use of computational tools and methods across all fields of research has made replicability, and in some cases reproducibility, relevant nearly everywhere.


Reproducibility for research means that the workflow and data used in the research project can be used to yield the same results. For research to be reproducible, a new team of researchers must be able to use the same hypothesis, experimental method, data, population, and general conditions of the first experiment to reproduce the same results. Reproducibility of work leads to increased rigour and quality of scientific outputs, and thus to greater trust in science.


Replicability of research means that when the same workflow (e.g., context or population, research question, experimental design or approach, and analysis plan) is applied to a new data set, the outcome is consistent (albeit within a small margin of error). This shows the transferability of the workflow. Replicability is more difficult to achieve than reproducibility because the results rely on the reported workflow/methods of the original research (which are often not completely transparent) being applied to new data.


The Replication Crisis

Over the past decade, many researchers in STEM have been concerned with the “replication crisis”: the documentation of many large-scale studies that have failed to replicate findings previously thought to be ubiquitous phenomena. The Nib has produced a visual narrative explaining the replication crisis in Psychology and its impact (not just within Psychology but in other fields as well). There is an ongoing debate on whether a “replication crisis” applies equally to the humanities and social science disciplines as has been found in STEM. While one-for-one reproducibility does not always reflect the nature of work in non-STEM disciplines, the increasing use of digital tools and methods in research requires us to think about the ways in which it will remain possible to open, view, and manipulate our work in the future. A document written in proprietary software from twenty years ago may not be viewable on a modern device without intervention. Similarly, a digital humanities project that uses a script to draw conclusions from data will not be replicable unless that script is made available along with the data. Thus, a lack of replicability resulting from non-open workflows puts humanities and social science research at risk and makes it difficult to build on knowledge over time.

Scenario – Reproducibility

Let’s consider this scenario: you are starting a new collaborative research study and have engaged your partners in a conversation about how to make the results of the study reproducible. One of the project leads, an experienced researcher with many publications under their belt, makes the following comment:

“All of the results will be in the paper, won’t people be able to reproduce our results from there? If they have any more questions they can reach out directly.”

How would you respond to them?

Possible response

You might state that even an extremely detailed description of the methods leading to the results will, in most cases, not be sufficient to reproduce them. Several factors can get in the way, including different computational environments, differences in software versions, and implicit assumptions that were not clearly stated. Additionally, spending the time and effort to create a detailed workflow together will both increase the scientific validity of the final results and minimize the time required for re-running the analysis or extending it to further studies.
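One small, concrete step toward addressing the environment and version differences mentioned above is to record the computational environment alongside the results. The sketch below, in Python, is purely illustrative (the function name `record_environment` and the file `environment.json` are made up for this example, not part of any standard); it captures the interpreter version, operating system, and the versions of the packages a study depends on.

```python
# Illustrative sketch: save a snapshot of the computational environment
# so others can spot version differences that prevent reproduction.
# The names used here (record_environment, environment.json) are hypothetical.
import json
import platform
import sys
from importlib import metadata


def record_environment(packages):
    """Collect interpreter, OS, and package versions into a dictionary."""
    env = {
        "python": sys.version.split()[0],   # e.g. "3.11.4"
        "platform": platform.platform(),    # OS name and version
        "packages": {},
    }
    for name in packages:
        try:
            env["packages"][name] = metadata.version(name)
        except metadata.PackageNotFoundError:
            env["packages"][name] = "not installed"
    return env


if __name__ == "__main__":
    # List the packages your analysis actually imports.
    env = record_environment(["numpy", "pandas"])
    with open("environment.json", "w") as f:
        json.dump(env, f, indent=2)
```

Shipping a file like this with the data and scripts does not guarantee reproducibility on its own, but it turns an invisible source of discrepancies into something a second team can check directly.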

Dig Deeper

  • Learn more about replicability in the humanities: Peels, R. (2019). Replicability and replication in the humanities. Research Integrity and Peer Review, 4(1), 1-12. https://doi.org/10.1186/s41073-018-0060-4
  • Learn how open science and other practices can help with improving research goals:
    • Frias‐Navarro, D., Pascual‐Llobell, J., Pascual‐Soler, M., Perezgonzalez, J., & Berrios‐Riquelme, J. (2020). Replication crisis or an opportunity to improve scientific production? European Journal of Education, 55(4), 618-631.
    • Shrout, P. E., & Rodgers, J. L. (2018). Psychology, science, and knowledge construction: Broadening perspectives from the replication crisis. Annual Review of Psychology, 69, 487-510.