June 23rd 2020
Bennat Berger is an entrepreneur, investor, and tech writer.
Dealing with misinformation has always been a touchy subject for Big Tech.
Social media giants have long faced accusations of being too hands-off when it comes to policing fake news. Perhaps predictably, the one most under fire is Facebook — a company that has long held to the PR line that as a social media platform, it has little to no responsibility for what outside users choose to post.
But a platform, at least according to that metaphor, is nothing but pure, empty space. To moderate what constitutes factual content, tech leaders like Zuckerberg argued, would be to infringe on users’ right to free speech.
Faced with the reality of a global pandemic, however, such philosophies have begun to change. In recent weeks, major companies have started working in tandem to prevent the spread of COVID-related misinformation.
As one joint statement put it: “We’re helping millions of people stay connected while also jointly combating fraud and misinformation about the virus, elevating authoritative content on our platforms, and sharing critical updates in coordination with government healthcare agencies around the world. We invite other companies to join us as we work to keep our communities healthy and safe.”
But why the sudden reversal on moderation, after all this time? With COVID, tech platforms are finding themselves just as responsible for protecting lives as any doctors on the front lines.
Some warn readers of impending societal collapse and tell them to grab cash and food while they can; others take the opposite tack, dismissing the risk to young people and encouraging them to gather despite social distancing guidance.
Other fraudulent posts provide false reassurance that readers can protect themselves from disease by drinking teas, ingesting colloidal silver, or diffusing essential oils. One viral post claimed that stomach acid would kill the coronavirus if a person just drank enough water.
These false headlines are dangerous not only for the risk they pose to individuals but also for the harm they can incite across communities. If people think that they can cure their illness by drinking tea and diffusing essential oils, they might dismiss the need for social distancing and infect countless others — and put themselves at risk in the process.
CNN recently reported that at least one young adult caught the illness after attending a “coronavirus party” with their peers. The report notes that “the partygoers intentionally got together ‘thinking they were invincible’ and purposely defying state guidance to practice social distancing.”
Misinformation is dangerous. It poses a health risk that could impact not only the people who act on wrong information, but their neighbors, friends, and community members.
The potential for harm is so apparent that even tech companies, which are notorious for avoiding responsibility as moderators, feel compelled to step in and guide users towards reliable information. In this way, we can see content moderation as a necessity — something that quite literally has the power to save lives.
But what if we were to extrapolate that same awareness out to public health and safety concerns beyond COVID-19? If our experience with the spread of the coronavirus shows us anything, it is that providing access to reliable sources and moderating fake news isn’t about infringing on free speech; it’s about protecting our communities.
Imagine a future where tech companies employ teams of researchers dedicated to debunking hoaxes, flagging pseudoscientific advice, and continually researching topics related to public health; imagine the good that could stem from common-sense moderation.
Tech companies have shown that when pressed, they can do a fantastic job of guiding users towards reliable information. So why can’t they do the same with misinformation about vaccine safety, false cancer cures, climate change denial, and fake political news?
The dangers that misleading information about these topics poses may not be as immediate as the ones we see with COVID-19, but they are nonetheless real and pressing.