Replication: how to address failure


While the scientific method strives for objectivity, experimental results are still prone to unconscious bias and error. The solution is replication, replication, replication. 


When Nature surveyed 1,576 researchers in 2016, more than 70% said they had at some point in their career tried and failed to reproduce another scientist’s experiment, and more than half had failed to reproduce their own results. That’s a major problem because replication is a cornerstone of science: a study yields results, which are then taken up by others and retested. Successful replication helps confirm validity; failure suggests further scrutiny is needed. At least, that’s the theory.

“We need to incentivise methodological rigour and robust data analysis over results,” says Michèle Nuijten, an expert in meta-science from Tilburg University in the Netherlands. Unfortunately, headline-grabbing breakthroughs are more attractive to journals, so failed replication studies often go unpublished: only 13% of Nature’s respondents had this type of paper accepted.

A string of high-profile replication problems has forced scientists to rethink how they evaluate results. One notable admission came from pharmaceutical giant Bayer in 2011, when it revealed that two-thirds of its in-house studies identifying possible drug targets could not be replicated. In 2015, psychologist Brian Nosek made further waves by publishing the much-anticipated results of his Reproducibility Project, in which a group of researchers attempted to replicate 100 notable psychology papers published in 2008. Just 36 proved replicable. In the same year, a study in PLoS Biology estimated that the US spends around $28 billion each year on research that can’t be replicated.

Some implicated scientists have felt publicly shamed, while others have insinuated that the replicators were incompetent. But many more see an opportunity to improve data analysis. “Openness is key,” says Nuijten. “Several journals now enforce policies of data sharing, which turns out to be a highly effective way to drive compliance among researchers.” Funders in Nuijten’s homeland have taken the incentive to replicate further. In 2016, the Netherlands Organisation for Scientific Research (NWO) launched the first grants programme dedicated to replication, worth €3 million. “That’s a relatively small amount,” says Nuijten, “but the announcement generated a lot of interest and sent a positive signal that funding agencies may be willing to invest in this vital type of research.”

By Ben McCluskey @FreelanceSciWri   

To check a PDF or HTML file for errors in statistical reporting, upload it to Statcheck.
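
Under the hood, Statcheck extracts statistical results reported in APA format and recomputes the p-value from the test statistic and degrees of freedom, flagging any mismatch with the value printed in the paper. The Python snippet below is a minimal sketch of that kind of consistency check, not the tool’s actual code; the reported result and the rounding rule are illustrative assumptions.

```python
# Minimal sketch of the kind of consistency check Statcheck performs
# (not the tool's actual code): recompute the p-value implied by a
# reported test statistic and compare it with the p-value in the paper.
from scipy import stats

def check_t_result(t_value: float, df: int, reported_p: float, decimals: int = 2) -> bool:
    """Return True if the reported p-value matches the p-value recomputed
    from t(df) = t_value, after rounding to the reported precision."""
    recomputed_p = 2 * stats.t.sf(abs(t_value), df)  # two-sided p-value
    return round(recomputed_p, decimals) == round(reported_p, decimals)

# Hypothetical reported result: "t(28) = 2.20, p = .03".
# The recomputed two-sided p-value is roughly .036, so the check fails.
print(check_t_result(t_value=2.20, df=28, reported_p=0.03))  # False
```

The real tool covers t, F, correlation, χ² and z tests and uses a more careful rounding rule than the simple two-decimal comparison shown here.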
