How to Address a Conspiracy Theory
By Yasmin Tayag / 4-minute read
The most important question you face is this: Should you even address conspiracy theories?
When the opportunity to address misinformation arises, an editor's first question should be whether to address it at all: doing so inevitably calls more attention to it, and not necessarily the right kind.
Research consistently shows that misinformation is “sticky”: people tend to remember and repeat it. Covering it journalistically, even with the intention of debunking it, risks lodging it even deeper in the public mind. When deciding whether to cover a particular piece of misinformation, consider two questions: How much harm will it do, and how likely is it to spread?
Sometimes the answers to those questions are fairly clear. When I worked for the Medium Coronavirus Blog, it was easy to make the call to cover President Donald Trump’s suggestion to ingest bleach, since it was obviously both very dangerous and widely discussed across the country. More often, however, those questions are difficult to answer.
To get a sense of the harm a piece of misinformation or a conspiracy theory may cause, look at how it's already leading people to act or respond. Even when national outlets haven't covered these stories, they can sometimes be found in local news or on forums like Reddit. It's easy enough to tell from national coverage that people who believe the Earth is flat don't seem to be causing any direct harm to anyone (though science literacy is taking a hit), so I wouldn't feel pressured to cover that with much urgency. But the conspiracy theory that Chinese people are somehow responsible for Covid-19 clearly drove anti-Asian hate during the pandemic, and many Asian people have been harmed as a result. I feel the need to address that misconception urgently and often.
Gauging the popularity of misinformation is challenging because there often aren't hard data tracking its spread. Sometimes organizations like the Pew Research Center or the Public Religion Research Institute offer survey data on widespread conspiracy theories, such as public belief in QAnon. More often, getting a sense of an idea's spread takes research. I use a keyword search on Google Trends to see whether searches for a particular phrase, such as “fake moon landing,” are on the rise. Performing a similar search on Google News and Twitter, sorted by date, can help, too. You can also search other social media, like Instagram or TikTok, where misinformation proliferates. On these platforms, it's hard to sort posts chronologically, but the number of likes and comments on a post, and the follower count of the profile that shared it, are good indicators of how popular an idea is becoming.
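For editors who want to compare posts systematically, the engagement signals above could be folded into a rough score. The sketch below is purely illustrative and not part of any workflow described here; the weights (comments counted more heavily than likes, reach discounted logarithmically) are assumptions, not established metrics.

```python
import math

def spread_score(likes: int, comments: int, followers: int) -> float:
    """Illustrative gauge of how widely a post's idea may be spreading.

    Assumptions (not from any published methodology): comments signal
    active engagement more strongly than likes, and follower count is
    discounted logarithmically so a viral post from a small account
    still registers.
    """
    engagement = likes + 2 * comments        # comments weighted double
    reach = max(followers, 1)                # guard against zero followers
    # Normalize engagement by audience size, then scale by log of reach.
    return (engagement / reach) * math.log10(reach + 1)
```

A post with no engagement scores zero regardless of audience size, and two posts with the same audience rank by engagement, which is all a heuristic like this can promise; it's a triage aid, not a measurement.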
Accept that your estimations will be inexact. For example, when I decided to write about the niche claim that Covid was caused by exposure to 5G internet, I had seen a handful of posts about it on Twitter but wasn’t sure whether covering the claim was worth the risk of magnifying it. I justified the decision to do so by noting that celebrities had tweeted about it and that it had led a few people to burn down cellphone towers. In hindsight, though, I worry that I fanned a relatively small flame in the pursuit of clicks. Perhaps I didn’t sufficiently account for how far this misinformation could spread.
Paul Bisceglio, science editor at The Atlantic, assigned a profile of the scientist and anti-vaccination figure Robert Malone at a moment when he was relatively little known. But Malone had recently been interviewed by increasingly big-name right-wing pundits, and Bisceglio and his team felt he was on the cusp of becoming a conspiracy-theory celebrity. “We thought, ‘It would be really useful to have a really hard, straight look at who this guy is and what his deal is, so that as he continues to gain fame and infamy,’” there would be a reliable and definitive source of information about him online.
Choosing to cover Malone was a gamble. Had the timing been off, or had the anti-vaccination movement not latched onto him, his profile in the magazine risked giving him more attention than was warranted. But it paid off. Bisceglio says he has “zero regrets” about the piece. “We watched it become one of the go-to sources for readers during the big Joe Rogan blowup — it offered real clarity and value in that moment for anyone trying to make sense of Malone’s claims.” If you Google Malone’s name now, that story is still one of the top hits. A story with such high stakes “does and should take greater resources than a simpler piece,” notes Bisceglio, who worked with The Atlantic’s fact checkers and legal team. “To me, the story’s impact spoke to the importance of writing about Malone, but also to the importance of doing so with considerable scrutiny.”
Stories on misinformation and conspiracy theories are often very popular — the more outlandish, the more attention-grabbing the headline — and this can influence a decision to cover them. I find it impossible not to click on “Fantastically Wrong: The Legendary Scientist Who Swore Our Planet Is Hollow,” Wired’s well-executed investigation, or on “‘Chemtrail’ conspiracy theorists: The people who think governments control the weather,” a refreshingly straightforward story published by the BBC. If there is pressure from others in the newsroom to cover misinformation, tread carefully and push back if doing so seems irresponsible. “Sometimes you have to fight against engagement best practices and take a more responsible approach that results in fewer page views and article shares,” says Damon Beres, editor-in-chief of Unfinished and the former head of the science-tech website OneZero. “Sometimes taking the high road means sacrificing numbers.”