Politicians Concerned About Coronavirus Misinformation Should Look in the Mirror

April 7, 2020  •  By Luke Wachob

Greg Lukianoff of FIRE wrote insightfully last month about the critical role free speech plays in informing people during crises like the ongoing coronavirus pandemic. Critiquing the “marketplace of ideas” metaphor that suggests truth generally wins out over falsehood in an open debate – an overly rosy view of how people come to believe what they do – Lukianoff offered his own “Iron Law” of free speech’s benefits to knowledge:

“It is always important to know what people really believe, especially when the belief is perplexing or troubling. Conversely, in the overwhelming majority of scenarios you are not safer or better off for knowing less about what people really think.”

Free speech does not ensure that society rejects all the wrong ideas and embraces all the right ones. But it gives each of us an opportunity to learn as much as we can about ourselves and the world around us. No authority has all the answers, so we all have an intrinsic right – and perhaps responsibility – to seek them out for ourselves.

Of course, some people have no answers at all, and many of them are in Washington, D.C. After watching national and international agencies repeatedly give bad advice to the public about a viral pandemic that could kill hundreds of thousands of Americans and many more around the globe, some political leaders are eager to combat misinformation… from random strangers on Facebook.

Enter Senators Mazie Hirono, Kamala Harris, Dick Durbin, and Bob Menendez, who sent a letter to Mark Zuckerberg this week expressing grave concerns about Facebook’s messaging service, WhatsApp. Actually, that’s too kind. They echoed a Mother Jones article in calling the app a “petri dish of coronavirus misinformation.”

Even though the company has adopted new policies to try to limit the spread of misinformation and educate its users, the senators lectured Zuckerberg on how WhatsApp could do better with more top-down control:

“WhatsApp could be altered to include a message asking people ‘are you sure this is true?’ before they forward a message or otherwise make forwarding a message more difficult. Even better, WhatsApp could use a combination of metadata and human content moderation to stop the spread of misinformation altogether and punish bad actors (e.g., by suspending their accounts).

“Facebook’s failure to adopt these commonsense measures suggests an indifference to the problems on WhatsApp that is not only unacceptable, but dangerous.”

If ever there was a time for humility when it comes to controlling speech, now is the moment. We’ve already seen how well-intentioned speech rules backfire in a crisis – this crisis – as Google’s ban on “political” ads about the virus prevented Democrats from countering messages from the White House. Obviously, the government’s speech wasn’t going to be banned. I guess you can’t blame Google. Who knew the government was political?

That policy is gone now, but politicians do not appear to have learned anything from it. Suppose Facebook or any major social media platform had labeled as misinformation the World Health Organization’s January 14 claim that there was no evidence of human-to-human transmission of the virus. Suppose they had done the same for Surgeon General Jerome Adams’ February 29 exhortation to “STOP BUYING MASKS! They are NOT effective in preventing general public from catching #Coronavirus…”

Facebook would have been right, and the WHO and Surgeon General (and CDC) would have been wrong. But does anyone think that would have made a lick of difference? Of course not. Politicians would have attacked Facebook as a menace to public health and probably threatened the company with investigations and retaliatory legislation for good measure.

Policies against misinformation on social media are never just about misinformation on social media. They become battlefields to fight over controversial claims about contested social and political issues. Do we really want the government to have a thumb on that scale? The risk of important information being silenced at the wrong time is too high.

Cracking down on “misinformation” in January and February would have silenced people who were correct about the virus and would have forced users to regurgitate the government’s falsehoods in their posts on the topic. Everyone should be horrified by that thought. It’s the sort of thing you expect from an authoritarian regime like China’s, the primary subject of Lukianoff’s blog post in early March. Indeed, the world found out about the virus thanks to whistleblowers whom governments tried to silence, such as China’s Dr. Li Wenliang.

But even democratic governments seek to silence some speech. That is why we will always need a robust First Amendment and broad cultural support for it. Otherwise the powerful will mandate that we all stick to the party line. And when the party line changes, you had better change with it, without making a fuss.

Call me an idealist, but the best way for government to deter people from turning to untrustworthy sources is to be trustworthy itself. Politicians who want to curb misinformation about the coronavirus should clean up their own act first.

Luke Wachob