Opinion: The consequences will be deadly if we don’t fight vaccine misinformation

K. “Vish” Viswanath, PhD, is director of the Harvard T.H. Chan School of Public Health India Research Center. He is also the Lee Kum Kee Professor of Health Communication at Harvard Chan. The opinions expressed in this commentary are his own.

While a slow but steady rise in Covid-19 vaccine availability and vaccination has offered a light at the end of the tunnel, the challenges that come with ending the pandemic seem endless. Most recently, the US Centers for Disease Control and Prevention and the US Food and Drug Administration recommended pausing use of Johnson & Johnson’s vaccine after six reported cases of a severe type of blood clot. This will surely create a new wave of vaccine hesitancy. But a large number of people had already refused the vaccines outright or adopted a wait-and-see approach early in the pandemic because of misinformation or disinformation they saw on social media platforms.

According to Pew Research Center, about 30% of Americans say they are not likely to get a vaccine, though this figure has declined steadily since September 2020. Misinformation and disinformation about vaccines and their effectiveness, particularly on social media, are among the major drivers of this hesitancy, and governments have had limited means to rein them in. That’s why social media companies must do more to combat vaccine misinformation and disinformation. If they don’t, the consequences could literally be deadly, as vaccinations are key to stemming the pandemic. What’s more, Covid-19 misinformation could have a halo effect on other public health issues in the future.

For anti-vaccine forces, social media platforms are invaluable. The platforms have made it possible to create an information ecosystem and a networked environment that drive anti-vaccine sentiment across the globe. And governments and public health authorities have little ability to rein in this misinformation, given how quickly it spreads. By lowering the barriers to creating, posting and forwarding “information,” the platforms have turned each user into a mass medium. The good, in this case, clearly comes with the bad.


Facebook, which owns Instagram and WhatsApp, recently announced that users in states that have opened appointments to all adults will get notifications about their eligibility at the top of their news feeds. The company has also unveiled a set of tools to help people in the United States and other countries locate places where they can get vaccinated, and it claims it is making data available to governments to address vaccine hesitancy and using WhatsApp chatbots to help with registration. Other platforms, such as TikTok and Twitter, have policies to monitor Covid-19 vaccine-related misinformation and to encourage users to seek information from reliable sources, such as the World Health Organization. But these are small, limited efforts that fail to address a more fundamental problem: Messages aggressively arguing against vaccination are still too easy to find. Here are some more steps social media companies should be taking:

Create tools for public health authorities

The onus for identifying and removing disinformation lies with the platforms, which have the technical capacity, not with users. Social media companies should create tools and products to help public health authorities counter anti-vaccine propaganda, and they should train those authorities to use them. The tools could monitor misinformation, alert health authorities when it gains traction and then actively aid in countering it. And there’s no reason this can’t be a two-way street. Public health authorities should develop their own surveillance systems that allow them to track public health misinformation and disinformation across the information environment, including on social media. Such tools should be made available at no cost, with full access to their features, in the interest of public health.

Tweak the algorithms if necessary

The absence of “gatekeeping,” the series of editorial judgments that a typical news story passes through before it gets on the air or into print at a traditional news outlet, means there is nothing to check baseless claims and distortions of fact that gain a sense of legitimacy simply because they are on the internet. Unlike the more traditional news media, like television and radio, which are subject to laws governing free expression, social media platforms, at least in the United States, are protected under Section 230 of the Communications Decency Act, which absolves them of responsibility for almost all content produced by a third party, such as a person or organization advocating against vaccines.

Several people, including members of Congress, have suggested that the protections offered to social media under Section 230 should be revisited or even rescinded. Social media companies say they are monitoring misleading Covid-19 content and doing their best to flag or even remove it. But is this sufficient? If their efforts were truly effective, vaccine misinformation wouldn’t continue to show up on their platforms. While the efforts of social media companies so far are a step in the right direction, more can be done to tweak their algorithms to more aggressively monitor, identify and downrank misinformation. The question is whether they’re willing to take a closer look at how their algorithms drive misinformation and disinformation and use their technical prowess to stop it. If not, calls for regulation of social media will only grow louder. For those who hold free expression as a core value, regulation of the platforms may or may not be desirable; after all, they are not making widgets, they are dealing with the spread of information and ideas. Any government regulation of ideas is a slippery slope, but if it happens as a result of the platforms’ hesitancy to crack down on vaccine misinformation, then social media companies will have only themselves to blame.

Source: edition.cnn.com
