Avoiding False Balance
By Melinda Wenner Moyer / 2-minute read
Stories and anecdotes from “real people” — that is, sources who aren’t necessarily experts but whose experiences can be valuable to include — can make scientific trends and discoveries much more relatable to readers. But reporters can fall into the trap of framing opposing beliefs and perspectives as equally valid when, from a scientific or medical standpoint, they are not. This problem, called “false balance” or “false equivalency,” can be exceptionally dangerous.
I constantly have to remind myself to remain reasonably skeptical about everything.
Robin Lloyd, science journalist, former news editor at Scientific American
False balance often arises in stories about controversies that persist among the general public even though the science on the issue is clear-cut — such as climate change and vaccine safety. When journalists covering the ongoing societal debate over human-caused climate change quote scientists on “both sides of the issue,” presenting them as equal in merit and in number, they fail to communicate to readers that there is, in fact, a clear scientific consensus on the issue. Likewise, when journalists covering vaccine safety quote parents who are concerned about childhood vaccines alongside experts who study vaccine safety, they frame these opposing opinions as equally valid, when the science clearly shows that childhood vaccines are safe.
In February 2015, the Toronto Star published an investigation into the human papillomavirus (HPV) vaccine Gardasil under the headline “A Wonder Drug’s Dark Side.” The piece was based largely on anecdotes from young women who believed they had developed serious health complications because of the vaccine. Some said they’d developed egg-sized lumps on the soles of their feet after being vaccinated, and that the vaccine had left them needing feeding tubes and wheelchairs. What was missing from the piece, however, was the expertise of scientists and physician-researchers who have studied Gardasil’s safety profile, as well as a description of that research and why it shows that Gardasil is safe. The Star eventually retracted the article, but not before damage was done to the public’s trust in Gardasil and in vaccine safety in general.
To avoid false balance, it’s important for reporters who cover controversies to get a sense of where the scientific consensus lies. “They should only report scientifically outlier positions if solid evidence supports it, not just because someone is shouting it from their own tiny molehill,” writes the freelance science writer Tara Haelle in a 2016 explainer about false balance for the Association of Health Care Journalists. As an editor, check whether your writers are equating ideas that have significantly different levels of merit. Be wary of patients’ anecdotes, especially if they are framed as medical evidence. Editors might also want to Google the names of sources quoted on two sides of a contentious issue to see whether they deserve equal weight and respect.
Over time, try to develop a reverse Rolodex — people you should never quote, or who should be quoted with caution.
Ivan Oransky, editor in chief, Spectrum
Sources make stories — and can break them, too. It’s crucial for science reporters to ensure that the people they interview have appropriate expertise, are free of major conflicts of interest, and reflect a range of racial and gender identities. Sometimes the difference between a good science piece and a bad one is one additional question asked at the end of an interview, or five extra minutes of online research. If you’re not sure that the voices in a piece truly reflect the scientific consensus, keep digging until you are.