How Wikipedia Prevents the Spread of Coronavirus Misinformation 

A group of hawk-eyed experts operates on a special track to monitor medical information on the site.

“This edit was VERY poor,” wrote James Heilman, an emergency room doctor in British Columbia, to a Wikipedia contributor who had made a couple of changes toward the end of the article on the new coronavirus outbreak. Those edits recommended a special type of mask for blocking the transmission of the virus from those who have it, and Heilman, a prominent figure in reviewing medical Wikipedia articles, wanted to inform the editor that this advice was too sweeping and based on insufficient evidence. More than that, he aimed to send a warning. “Please do not make edits like this again,” he wrote.

Wikipedia’s reputation is generally on the ascent. Just last month, no less a publication than WIRED deemed it “the last best place on the internet.” What was once considered the site’s greatest vulnerability—that anyone can edit it—has been revealed to be its greatest strength. In the place of experts, there are enthusiasts who are thrilled to share their knowledge of a little part of the world with all of humanity. As Richard Cooke, who wrote the WIRED essay, observed: “It’s assembled grain by grain, like a termite mound. The smallness of the grains, and of the workers carrying them, makes the project’s scale seem impossible. But it is exactly this incrementalism that puts immensity within reach.”


His point, and it’s really indisputable, is that this mammoth online project has developed a personality, a purpose, a soul. Now, as the new coronavirus outbreak plays out across its many pages, we can see that Wikipedia has also developed a conscience.

The coronavirus articles on English Wikipedia are part of WikiProject Medicine, a collection of some 35,000 articles that are watched over by nearly 150 editors with interest and expertise in medicine and public health. (A survey for a paper co-written by Heilman in 2015 concluded that roughly half of the core editors had an advanced degree.) Readers of Wikipedia wouldn’t know that an article is part of the project—the designation appears on a separate talk page and really serves as a heads-up to interested editors to look carefully at the entries.

Once an article has been flagged as relating to medicine, the editors scrutinize the article with an exceptional ferocity. While typically an article in The New York Times or The Wall Street Journal would be a reliable source for Wikipedia, the medical editors insist on peer-reviewed papers, textbooks or reports from prominent centers and institutes. On these subjects, Wikipedia doesn’t seem like the encyclopedia anyone can edit, striving to be welcoming to newcomers; it certainly doesn’t profess a laid-back philosophy that articles improve over time and can start off a bit unevenly. The editor chastised by Heilman hasn’t returned to the article and instead is improving articles about sound-recording equipment.

By having these different standards within its pages, Wikipedia can be a guide to the big commercial platforms that have become way stations for fake cures, bogus comparisons to past outbreaks, and political spin. Twitter, Amazon, YouTube, and Facebook have all promised to cleanse their sites of this dangerous disinformation, but they are doing so in fits and starts and by relying in part on familiar, passive tools like acting when others flag dangerous content. Here is how Facebook's Mark Zuckerberg put it in a post on March 3: “It’s important that everyone has a place to share their experiences and talk about the outbreak, but as our community standards make clear, it’s not okay to share something that puts people in danger. So we’re removing false claims and conspiracy theories that have been flagged by leading global health organizations. We’re also blocking people from running ads that try to exploit the situation—for example, claiming that their product can cure the disease.”

Wikipedia shows, however, that extreme circumstances, especially when related to public health, require different, more stringent rules, not better application of existing rules. The stakes are simply too high.

I spoke this week with the Wikipedia editor who guided the article about the new coronavirus from a one-sentence item in early January to a substantial article with charts of infections around the world. She goes by the handle Whispyhistory and is a doctor in South London; she spoke via Skype from her office, which she proudly noted had a new thermometer that looks like a laser gun.

Whispyhistory has only been contributing for three years; she was recruited through an edit-a-thon at a medical library. While at first she was open with her colleagues about her side project, now she prefers to remain anonymous. “You start getting hounded by people about what you are writing,” she said. “It’s just so much easier to not use your real name.”

WikiProject Medicine welcomed her, she said, but she’s had to build a reputation for accuracy and responsibility. “You have to know what you are saying,” she said, and even so it can be intimidating. “You’ve got so many people watching you.” The picture she paints of the project’s contributors is akin to the staff of a demanding teaching hospital. The editors confer on a talk page she calls “the doctors’ mess,” where they perform “triage” to assess which articles require attention immediately. Science and data reign, and above all else, the pledge is to do no harm.

On January 6, she said, a colleague asked her if she had heard of an outbreak of atypical pneumonia in China. She hadn’t, but “being someone who writes for Wikipedia, the first thing you do is see if it’s on Wikipedia. Someone had written the article the day before.” The article was thin, but Whispyhistory had the sense that “this might be something big,” so she added the WikiProject Medicine tag to the article and wrote a note informing her colleagues to pay attention to the outbreak, which they did.

Like a young resident, she pulled all-nighters before showing up at the office at 6 am, keeping watch over the article as the virus spread. In those early days, for instance, she saw a note on the doctors’ mess that linked to a news report claiming that the new coronavirus could survive on surfaces for nine hours. The author wanted to add that information to the Wikipedia page immediately. “That already sends an alert, since there is nothing that’s really so important that you’ve got to add something straight away,” she recalled. She went from the news article to the paper that it cited, and discovered that it was looking at the SARS virus, not the (very similar) one that causes Covid-19. She decided not to include the research.

As Heilman put it in an email, “Keeping Wikipedia reliable and up-to-date involves deleting material just as much as adding it.” I asked both him and Whispyhistory how the article on the new coronavirus managed to exclude the arguments that were being made (at least until recently) by President Trump and his supporters—that the disease is being hyped by Democrats and that it’s comparable to the flu. Don’t they have angry wannabe contributors accusing Wikipedia of bias? “That’s really easy to answer. ... You have to cite everything you write,” Whispyhistory said. Heilman agreed that a requirement for legitimate sourcing filters out unfounded notions.

Bogus claims about the pandemic do show up on Wikipedia, but in a separate article: “Misinformation related to the 2019–20 coronavirus pandemic,” under the heading “Misinformation by governments/United States.” Heilman noted that Wikipedia has a structural advantage over the big social networks: “It takes more time and effort to disrupt Wikipedia than it does to restore Wikipedia to a reliable level. It’s the exact opposite on Twitter and Facebook, where it takes a second to spread false news,” while getting those lies removed will take a lot of time and effort.

Unless Twitter, Facebook, and the others can learn to address misinformation more effectively, Wikipedia will remain the last best place on the internet.

