Using the Right Tone
By Yasmin Tayag
It’s tempting to ridicule or belittle people who believe in misinformation and conspiracy theories. But avoiding condescension is crucial: making fun of people won’t change their minds or their behavior. Research suggests that a better way to achieve behavioral change is to use language that encourages people to explore and resolve their own thoughts about a subject.
I admit, I’ve ridiculed people who believe in misinformation (some headlines I have written: “Pelt Trump With Planet Earths to Make Science Great Again,” “Pope Francis: Fake News Spreaders Are S***-Loving Coprophiliacs”). In hindsight, I can see that my goal was not to undo misinformation but to get clicks from people who were already aligned with my perspective. “If you want to be provocative and piss people off and get a reaction, go ahead and be that way,” but don’t expect it to shift anyone’s perspective, says Simon Bacon, who studies behavioral change at Concordia University, in Montreal. To encourage readers to engage with the myth-busting, he suggests, “you have to be a lot more thoughtful.” The Daily Beast’s Neel Patel says we have to think about “who’s reading this that needs to read it — not who’s reading this to affirm what they already know.”
I often imagine a reader who is confused, skeptical, and mistrustful. Maybe this person is trying to navigate the overwhelming amount of information online. Emma Bloomfield, of the University of Nevada at Las Vegas, recommends thinking about why a person might believe a conspiracy theory or an item of misinformation in the first place. Why does this person think 5G is dangerous? Do they have a health issue? Have they gone down a YouTube rabbit hole? Remember that “no one is accidentally getting into these spaces,” says Bloomfield. “People are encouraging the spread of misinformation.”
Empathy for readers allows us to choose a communication style that can best reach them. My goal is always to help people understand what’s going on without judging them or losing their trust — which means avoiding language that makes them feel defensive or stupid. Snide or cynical language and put-downs are “going to set off alarm bells for a certain reader who believes the media is adversarial,” says Damon Beres, of Unfinished. “To get people to trust you, you need a sober, nuanced approach that doesn’t belittle the audience.” That doesn’t always allow for the most stylish or voicey journalism, but, he says, this is “not the place for that kind of writing.” Besides, being judgmental encourages polarization, which is what fuels the spread of misinformation in the first place.
Telling people what to believe can create resistance, so do so with caution. According to Bacon, research on behavioral change suggests that the most effective way to influence people’s thinking is to provide information in a neutral way and allow them to draw their own conclusions, rather than presenting the information and telling them what to think. Looking back on a story about chemtrails that I wrote years ago for Inverse, I’m embarrassed by its condescending and biased tone: “[many] believe the Chemtrails conspiracy is a real thing, despite the fact that they’ve already been confronted with plenty of proof to show that the dense columns of white mist left behind by planes are, in fact, just mist.” In contrast, a far more nuanced story from The New York Times, addressing anti-vaccination ideas spread by the NFL star Aaron Rodgers, did not tell people to get vaccinated but stated as a matter of fact how vaccines actually protect people.
“Normally people will draw the right conclusions if presented that way,” says Bacon. “Rather than be definitive, give people the wiggle room to absorb.” Emphasizing consensus — pointing out, for example, that scientists agree climate change is real, or that the majority of Americans support vaccination — can help readers reach those conclusions on their own.
And yet, throughout the Covid-19 pandemic, public-health messaging has directly told people what to do. “Get vaccinated” is a phrase that I and many others have written countless times. In a situation as urgent as the pandemic, when people are seeking clear guidance, it makes sense to do this. But behavioral research suggests that the approach most likely to lead hesitant people to get vaccinated is to emphasize what vaccines do — safely protect against severe disease and prevent hospitalization and death — and allow people to make the choice for themselves. This is especially important to consider when communicating with resistant audiences, because misinformation often comes from “a negative reaction to being told what to do,” says Bacon. “When you push info on people, they will invariably say, ‘Well, I’m going to justify my position.’ Nobody likes to be attacked.”