- How Science Works
- Sources and Experts: Where to Find Them and How to Vet Them
- Making Sense of Science Stats
- Editing for Story
- Editing Controversial Science
- Holding Science to Account
- Covering Health Care
- Climate and the Environment
- Fact-Checking Science Journalism: How to Make Sure Your Stories Are True
- Illustrating Complex Science Stories
- The Role of Visuals in Science Journalism
- The Process of Building Science-Centric Graphics
- Strategies for Using Visuals to Put Breaking Science in Context
- Special Considerations for Data Visualization
- Uncertainty and Misinformation
- Editorial Illustration, Photography, and Moving Images
- Additional Reading and Resources
- About the Author
- Social Media and Reader Engagement
- Popular Science
- Op-Eds and Essays
- About This Handbook
By Apoorva Mandavilli / 3 minute read
Physicists and mathematicians have for nearly 30 years posted their work on an online archive called arXiv, inviting their peers to comment and even revise their work. The idea is to post work at the same time as, or even before, submitting it to a peer-reviewed journal for publication. Because publishing can be slow, this allows the work to be disseminated more quickly to the community of scientists, keeping science moving. It also allows scientists to get early feedback on their work.
The most popular preprint server for biology, bioRxiv, launched in 2013 but took years to catch on. Many life scientists were reluctant to embrace such openness, worried that competitors would scoop their work. Bioinformaticians and geneticists, fields with strong mathematical roots and ties to the arXiv tradition, eventually led the way. By July 2017, the server was receiving more than 1,000 submissions per month.
Buoyed by bioRxiv’s popularity, its founders launched medRxiv, a preprint server for medical research. Because medical findings can directly affect patient care, medRxiv’s curators vet papers even more thoroughly than bioRxiv does. But on both archives, staff and volunteer experts check preprints against basic standards, screening for plagiarism, content that might pose a health risk, and research that may have been conducted unethically.
In the early months of the coronavirus pandemic, both of these servers suddenly became the place to submit emerging information. Researchers all over the world, beginning with doctors in Wuhan, China, began submitting early studies of the virus and the disease it causes. By mid-May 2020, there were nearly 3,500 coronavirus preprints on the two servers.
How the Pandemic Changed Publishing
Before the 2020 coronavirus pandemic, many reputable publications shied away from writing about preprints, wary of work that had not passed a thorough peer review. But during the pandemic, those barriers vanished. There was simply no time to wait for peer review, so scientists and journalists took to discussing and writing about preprints.
This was helpful in some ways. The first studies from Wuhan of risk factors associated with Covid-19’s severity appeared as preprints, followed swiftly by publication in peer-reviewed journals. These early findings helped public health officials in other countries prepare and warn residents with certain underlying conditions to be extra cautious.
But some preprints have created confusion and panic and seeded conspiracy theories.
For example, in January 2020, a preprint on bioRxiv claimed to have found an “uncanny similarity” between the new coronavirus and HIV that was “unlikely to be fortuitous.” In essence, the preprint suggested that the new coronavirus had been created in a lab, fueling an existing conspiracy theory about its origins.
The paper was swiftly taken apart on Twitter, and before any reputable publication could cover it, bioRxiv pulled it.
The preprint server’s staff began screening coronavirus preprints more stringently and added a warning to the website, reminding readers as well as the news media that the manuscripts are preliminary.
Editors should be cautious when assigning stories based on preprints and, at a minimum, vet a preprint with one or two independent experts before deciding to cover it.
Aiming to match the speed of research, journals have also tried to rise to the occasion, rushing papers through peer review, rapidly posting reviewed papers online and, at least in the initial months of the pandemic, offering them to journalists without embargo. One analysis of 14 journals found that the average time for a manuscript to complete peer review and appear online had dropped from 117 days to 60. This, too, has raised eyebrows among scientists, who note that thorough peer review takes time, and that speed is sometimes the enemy of accuracy.
“What we saw during the pandemic is an acceleration of trends that had been taking place over the past two decades,” says Ivan Oransky, co-founder of Retraction Watch. “My hope is that we learn lessons about how speed can be both really good and really bad.”