Week 3
Experiments cannot be reproduced by other labs.
The Challenge
A defining feature of scientific knowledge is that it is replicable: another scientist should be able to repeat the procedure and re-generate the same results. Yet in reality, there are many obstacles to replication. Often the first obstacle is that published procedures are not described in sufficient detail to allow another scientist to reproduce them. Other obstacles to replication will be discussed in later classes (e.g. experimenter degrees of freedom and publication bias mean that published results offer an inflated estimate of effect sizes; professional incentives for scientists often don’t reward reproducing or being reproducible). Today we focus on the challenge of making scientific procedures reproducible. Everyone should read the resources under ‘Everyone’; after that, please read the additional readings most relevant to you.
Everyone
Nosek, B. A., & Errington, T. M. (2020). The best time to argue about what a replication means? Before you do it. Nature, 583, 518-520.
Teytelman, L. (2018). No more excuses for non-reproducible methods. Nature, 560, 411. Or watch a recent (2021) talk (the beginning is funny/sad).
Biological sciences
Errington, T. M. (2019). Barriers to Replicability in the Process of Research (slides for the Reproducibility Project: Cancer Biology). Talk at the MetaScience conference.
Systems Neuroscience
Banga, K., Benson, J., Bonacchi, N., Bruijns, S. A., Campbell, R., Chapuis, G. A., ... & Winter, O. (2022). Reproducibility of in-vivo electrophysiological measurements in mice. bioRxiv.
Computational science
Barton, C. M., et al. (2022). How to make models more useful. PNAS, 119(35), e2202112119.
Pineau, J. (2020). The Machine Learning Reproducibility Checklist.
Developmental Science
Gilmore, R. O., & Adolph, K. E. (2017). Video can make behavioural science more reproducible. Nature Human Behaviour, 1(7), 1-2. https://www.nature.com/articles/s41562-017-0128
Frank, M. C., Bergelson, E., Bergmann, C., Cristia, A., Floccia, C., Gervain, J., ... & Yurovsky, D. (2017). A collaborative approach to infant research: Promoting reproducibility, best practices, and theory‐building. Infancy, 22(4), 421-435. Or watch this talk by Mike Frank.
Social Science
Moody, J. W., Keister, L. A., & Ramos, M. C. (2022). Reproducibility in the Social Sciences. Annual Review of Sociology, 48.
The Tool
Practical Assignment:
1. Find a paper in your field that describes an experiment. Use the ARRIVE and CONSORT checklists to evaluate how well it communicates its methods and design decisions. What is missing? Note any trouble you have applying these checklists to research in your area.
https://arriveguidelines.org/sites/arrive/files/documents/ARRIVE%20Compliance%20Questionnaire.pdf
2. Find the website or tool used for sharing detailed / executable protocols in your discipline (see examples below). Ideally, find an existing template similar to an experiment you have done or are planning to do. Spend 30 minutes beginning to create an experimental protocol in this tool. How far did you get? What was difficult about the task?
In your response paper, describe what you accomplished in these tasks, and any snags you hit.
Then critically evaluate these tools: will they address the obstacles to repeatability in your area of science? What else is needed?
Useful links and resources
Tools for shared protocols:
Biological sciences:
Online behavioural experiments:
Computational biology:
Wikipedia maintains a list of electronic laboratory notebook software packages.
ARRIVE and CONSORT methodology reporting standards:
Videos of a workshop on how to improve reproducibility
Repository of scales used in Psychology:
Collection of articles about reproducibility:
Interview with Teytelman in which he discusses the obstacles to uptake of protocols.io.
A Slate article about how personal and ugly controversies about replication can get. This story concerns a psychology experiment on moral priming; the controversy was dubbed “#repligate”.
More than twenty years ago, there was a famous controversy about the replicability of findings in genetically identical mice.
Freedman, L. P., Cockburn, I. M., & Simcoe, T. S. (2015). The economics of reproducibility in preclinical research. PLoS biology, 13(6), e1002165.
Ashley’s example (in progress): Conducting Online Research With Infants
The Critical Evaluation
The Challenge: Experiments cannot be repeated by other labs. Another scientist should be able to repeat the procedure and re-generate the same results. A major challenge is that procedures are often not described in sufficient detail to allow another scientist to repeat them. Even within labs, people often struggle to replicate prior results. What personal experience have you had with this challenge? What makes repeatable methods particularly challenging in your area of science?
The tool: FAIR protocols and methodology reporting standards. Describe what you did in fulfilling the practical activity. This includes how the paper you found adheres to the guidelines or what it is missing, and your experience trying to create a protocol using an online tool or template. Include any snags you hit.
Critical evaluation of the tool. What is the promise of these tools in addressing this challenge? What are the biggest obstacles? These might include the time and effort of writing or sharing complete protocols, uncertainty about which are the relevant aspects of the context, specific methods that are unusually difficult to faithfully transmit, problems with workflow, incentives, or adoption.
This response paper should be 1-2 pages long, single-spaced. Please tell us which papers you chose to read at the top of your response paper.