Study Shows the Need to Ensure Reproducibility in Research

The IEEE Computer Society suggests improvements

About 60 percent of IEEE conferences, magazines, and journals have no practices in place to ensure reproducibility of the research they publish. That's according to a study by an ad hoc committee formed by the IEEE Computer Society to investigate the matter and suggest remedies.

Reproducibility—the ability to repeat a line of research and obtain consistent results—can help confirm the validity of scientific discoveries, IEEE Fellow Manish Parashar points out. He is chair of the society's Committee on Open Science and Reproducibility.

"Ensuring the robustness and trustworthiness of science that is done using computing and data is absolutely critical," Parashar says.

The inability to reproduce the results of an experiment can have serious consequences, including retraction of the research and damage to the reputations of the authors and of the journal that published it.

For example, after a string of failed attempts to replicate the results of a study on preventing fraud in policy review and insurance claim forms, a request was made in August to the Proceedings of the [U.S.] National Academy of Sciences to retract a 2012 paper on the research. According to the study, when people signed an honesty declaration at the beginning of an insurance policy form, rather than the end, they were less likely to lie about the information they provided. Insurance companies, private organizations, and government agencies then adopted the seemingly inexpensive and effective method to reduce fraud. But other research teams could not confirm the finding, and an anonymous group of scientists then found evidence suggesting that the original experiment used fabricated data.

ENSURING REPRODUCIBILITY AT IEEE

The goal of the ad hoc committee's study was to ensure that research results IEEE publishes are reproducible and that readers can look at the results and "be confident that they understand the processes used to create those results and they can reproduce them in their labs," Parashar says.

"Ensuring the robustness and trustworthiness of science that is done using computing and data is absolutely critical."

The committee's international membership spans academia and national laboratories and includes representatives from IEEE leadership.

To get a better sense of the issue, the group surveyed more than 100 IEEE journals, magazines, and conferences and analyzed the reproducibility models and practices they use. The findings were published online.

Here are three key recommendations from the report:

  • Researchers should include specific, detailed information about the software and tools used in their experiments. When naming a software program, for example, authors should include its version number and all computer code that was written, as sketched in the example following this list. In addition, journals should make submitting that information easier by adding a step to the submission process. The survey found that 22 percent of the society's journals, magazines, and conferences already have infrastructure in place for submitting such information.
  • All researchers should include a clear, specific, and complete description of how the reported results were reached. That includes input data, computational steps, and the conditions under which experiments and analysis were performed.
  • Journals and magazines, as well as scientific societies requesting submissions for their conferences, should develop and disclose policies on achieving reproducibility. Guidelines should include such information as how papers will be evaluated for reproducibility and the criteria that code and data must meet.
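
To make the first two recommendations concrete, here is a minimal sketch of an environment record, assuming a Python-based experiment; the package list and data file name are hypothetical. It captures the interpreter, operating system, exact library versions, and a cryptographic fingerprint of each input file, so another lab can verify it is rerunning the same computation:

    import hashlib
    import json
    import platform
    import sys
    from importlib import metadata

    def capture_environment(packages, input_paths):
        """Record software versions and input-data fingerprints for a rerun."""
        record = {
            # Interpreter and operating-system details.
            "python": sys.version,
            "platform": platform.platform(),
            # Exact versions of the libraries the experiment depends on.
            "packages": {name: metadata.version(name) for name in packages},
            # SHA-256 hashes let reviewers confirm they hold identical inputs.
            "inputs": {},
        }
        for path in input_paths:
            with open(path, "rb") as f:
                record["inputs"][path] = hashlib.sha256(f.read()).hexdigest()
        return record

    if __name__ == "__main__":
        # Hypothetical dependency list and data file, for illustration only.
        env = capture_environment(["numpy"], ["data/measurements.csv"])
        print(json.dumps(env, indent=2))

Publishing a record like this alongside the paper, together with the analysis code itself, is one way to supply the level of detail the recommendations call for.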

ADDRESSING HURDLES

The report covers roadblocks researchers and journals could face when ensuring reproducibility, such as the allocation of responsibilities and economic issues, as well as ways to overcome them.

Also included is an overview of an ongoing pilot program at IEEE Transactions on Parallel and Distributed Systems, which offers incentives such as "reproducibility badges" to authors who make their code and data available for reuse along with their publications.
