Opinion: Fighting vaccine disinformation is crucial to ending the pandemic

Ann M. Ravel is the Digital Deception Project director at MapLight and previously served as chair of the Federal Election Commission. Kristin Urquiza is co-founder of Marked By Covid. The opinions expressed in this commentary are their own.

In May 2020, months before a Covid-19 vaccine was available and just before Arizona’s infection rate soared, Governor Doug Ducey encouraged residents not to stay home. “I want to encourage people to get out and about,” Ducey told listeners of a popular radio show. “If you don’t have an underlying health condition, it’s safe out there.” The interview was then shared on Facebook and Twitter. Many people listened to that dangerous advice — and many, including one of our fathers, Mark Urquiza, paid with their lives after contracting the disease.

Disinformation and misinformation about masks, vaccines and other Covid-19-related topics have been spreading like wildfire across social media. Last month, Rep. Marjorie Taylor Greene was suspended from Twitter for a week after making “misleading” claims about vaccines. Meanwhile, Sen. Rand Paul was suspended from YouTube for a week after posting a video falsely claiming masks are ineffective. According to its policy, YouTube prohibits “content that spreads medical misinformation that contradicts local health authorities’ or the World Health Organization’s (WHO) medical information about COVID-19.”

While these short suspensions are grabbing headlines, they’re also obscuring the larger problem that major social media companies are utterly failing to prevent disinformation from spreading rampantly and wreaking havoc on public health. Preliminary research from the nonprofit organization Avaaz suggests that Facebook’s “related pages” algorithm actively recommends pages that promote anti-vaccine content to Facebook users. Meanwhile, Media Matters, a left-leaning media industry watchdog, reported that a new viral video filled with misleading claims about vaccines and masks amassed more than 90 million Facebook engagements (likes, shares, etc.) in a matter of days. And earlier this year, the Center for Countering Digital Hate identified 12 prominent anti-vaccine activists as being responsible for more than half of the anti-vaccine content shared on Facebook and Twitter during a six-week period (Facebook has since pushed back on that study and the idea that its platform is hindering vaccine uptake).

      Here's what the Delta variant means for the economic recoveryThe truth is that too many people are still unvaccinated and refusing to mask up despite scientific validation that both actions are safe and reduce transmission. With more than 600,000 people lost to Covid in the US and case numbers surging yet again, fighting disinformation is critical to ending the pandemic and decreasing health disparities for generations to come. We must demand social media companies take serious steps to curb vaccine misinformation and disinformation. Here’s what these companies should do:Read More

Enhance monitoring of high-reach accounts

Anyone with more than 50,000 followers — that is, anyone with “high reach” on social media — and a history of sharing disinformation about Covid-19 should have their posts subjected to a pre-clearance policy, in which content is fact-checked before it is posted online and before the damage is done. While Facebook and Twitter have a strike system in place to punish repeat violators, a pre-clearance policy would more effectively limit the spread of disinformation before it’s too late.

Be more transparent

Instead of providing updates on how they are handling Covid-19 disinformation in the current piecemeal, unstructured manner, social media and technology companies should standardize their reports on how they’re limiting Covid-19 disinformation, as is already the practice in the European Union. Every month, social media companies are asked to report to the European Commission on how they’re working to address Covid-19 disinformation. The Commission then releases that information to the public in a monthly report that’s easily found online. This simple transparency measure ensures the public has an easy way to stay informed.

Tweak the algorithms

Social media companies should use recommendation algorithms to prioritize authoritative Covid-19 sources and reduce the visibility of posts containing Covid-19 misinformation. While Facebook and Twitter are already working to remove misleading posts and redirect users to reliable information from credible sources, they should make it harder for users to see misleading posts in the first place. There’s no reason that lies and conspiracy theories should appear alongside factual, trustworthy information.

Get the US government more involved

It’s also critical for the federal government to play a more significant role. To begin, the Biden-Harris administration should create a coordinated national response and appoint a disinformation expert to the Covid-19 task force, as a diverse coalition of public interest groups urged in a letter last year.

Congress should also advance legislation that improves public access to critical social media data. In particular, the Social Media DATA Act and the Algorithmic Justice and Online Platform Transparency Act would help hold powerful social media companies accountable by increasing access to information about content moderation and online ad targeting.

There isn’t a silver bullet to eliminate the Covid-19 disinformation that has become so prominent online, so we need social media companies and our government to do everything they can to chip away at the problem. Until then, online conspiracy theories and lies will continue to impact all of us while serving as powerful fuel for one of the deadliest viruses in modern history.

Source: edition.cnn.com
