Structuring Your Argument
By Yasmin Tayag / 4-minute read
When correcting misinformation, it isn’t enough to provide the correct information — you also have to help your audience remember it. Unfortunately, people tend to be influenced by misinformation even when they remember and accept the correction, says Monash University’s John Cook (a phenomenon known as the “continued influence effect”). One thing that influences the way an argument against misinformation is remembered is its structure: Put too much emphasis on the myth, and you risk its sticking. But if you build what the cognitive scientist George Lakoff calls a “truth sandwich” — sandwiching the misinformation with the facts — there’s a better chance that your audience will remember what’s true. To make it even more robust, explain how the correction overrides the myth.
Including the facts, the myth, and the explanation, and being thoughtful about the order in which they are presented, can help the correction stick. “Highlight the truth as much, if not more, than the falsehood,” says Paul Bisceglio of The Atlantic.
With that in mind, here is a good structure for dealing with misinformation:
State the facts: Editors I spoke with agreed that it’s a good practice to state the facts at the beginning of the story and to make clear the intention to correct misinformation. At the Coronavirus Blog, every time I wrote about the conspiracy theory that the coronavirus was deliberately cooked up in a Chinese lab, I made sure to state up front what the research indicated: that the virus most likely has an animal origin and shows no signs of bioengineering in its genetic sequence. “You would absolutely want to establish early on in the piece, in the nut or before, that you’re debunking something,” says Bisceglio. He steers writers away from long introductions that keep the illusion of the conspiracy in place.
There’s even a strong argument for inserting the correction in the headline, since online readers don’t always click through to read the actual story. “If they just read the headline or the first couple of sentences, they’ll still have that information,” says Neel Patel, science editor at The Daily Beast.
“A lot of fact-checks use the myth as the headline, then repeat the myth,” says Cook. “Avoid that. Put the emphasis on the facts. Make that your headline.” Cook and his co-authors of the excellent Debunking Handbook recommend the “fact-myth-fallacy” structure when debunking a claim: present the correct information first, acknowledge the misinformation, then return to the fact, explaining how it counters the misinformation.
Address the myth: Eventually, the misinformation has to be acknowledged. Editors take different approaches to doing so. In the past, I have been wary of repeating a myth word-for-word, out of concern that it will stick, so I alluded to it in more general terms. When I wrote about the false narrative that Chinese people were responsible for spreading the coronavirus, I gestured to it with language like “anti-Asian sentiment linking Asians and the coronavirus.” Emma Bloomfield, an assistant professor of communications at the University of Nevada at Las Vegas who studies climate-change rhetoric, told me that she also tries to “avoid repeating any kind of specific misinformation” and recommends using broad-strokes language instead.
A different school of thought endorses stating the myth outright in order to debunk it. “We’re ready to engage with the misinformation if we need to — tell the reader what is out there,” says Bisceglio. According to Cook, older research supporting the broad-strokes approach has been supplanted by more recent research showing that “you need to activate the myth in people’s minds before they can tag it as false.” Rather than tiptoe around the myth, state it directly — just once — and clearly identify it as misinformation. Language such as “One myth is X” or “It’s a common misconception that Y” can serve to warn readers, helping put them cognitively on guard. (More on “inoculating” audiences about misinformation later.)
Explain the correction: After stating the myth, return to the facts to ensure that they stick. In the “fact-myth-fallacy” structure, the last part of an argument juxtaposes the fact and the myth, explaining the inconsistency. It isn’t enough to simply state that the misinformation is false, because doing so isn’t memorable. “You have to give an alternative explanation” or story, says Katya Ognyanova, of Rutgers. This is where you might include details about the source of the misinformation, explain why people tend to believe it, or give another reason for what’s happening. As the Debunking Handbook notes, “It is important for people to see the inconsistency in order to resolve it.”
Fact-Myth-Fallacy Structure
To demonstrate this structure, let’s use a fictional misinformation claim: that all dogs are actually surveillance robots working for Big Tech and shouldn’t be kept as pets. Debunking this myth is justified because it’s circulating nationally and animal shelters are overflowing with abandoned dogs. You’ve also learned that the misinformation is being promoted by a large company that profits from selling cat goods. Here’s how you might structure a story:
Fact
State the facts at the beginning of the story (headline, title, lede, nut) in a straightforward, concise way.
Headline: “Dogs Are Actually Just Dogs.”
Lede: Like all animals, dogs succumb to the cycle of life and death.
Myth
Address the myth just once, head-on. Use language that clearly labels it as misinformation.
A popular myth is that dogs are robots designed to spy on their humans on behalf of Big Tech.
Fallacy
Juxtapose the fact with the myth. Explain the inconsistency and provide an alternative explanation.
Unlike robots, which do not age, dogs are living beings that grow old and die, as anyone who has watched a puppy grow or said a tearful goodbye to a senior dog can attest. This false rumor originated with a cat-goods company that profits when people choose to keep cats, rather than dogs, as pets.
At OneZero, Damon Beres assigned a story about a conspiracy theory that had been referenced in Congress claiming that the far-left activists known as antifa were behind the Capitol Hill insurrection in January 2021. “Essentially we wanted to show, ‘Where did these claims come from?’” he says. Crucially, the story explained that the claim had come from an anonymous source on a website associated with conspiracy theories and arose from an analysis by a dubious facial-recognition company. The story deflated the claim, anchoring the explanation in a memorable set of details.