
Opinion: Trump is banned for now, but Facebook still needs to change its policies


Ann M. Ravel is the Digital Deception Project director at MapLight and previously served as chair of the Federal Election Commission. The opinions expressed in this commentary are her own.

The decision from the Facebook Oversight Board to uphold the suspension on former President Donald Trump’s account marks a short-term step to limit the spread of dangerous online disinformation. But it’s far from a reason to celebrate the social media giant as a beacon of responsibility. After all, the Oversight Board kicked the more consequential decision about whether the ban should be permanent right back to Facebook’s leadership.

If the company is truly interested in accountability, it should permanently ban Trump based on the totality of his behavior, including harassment, incitements to violence and repeatedly spreading disinformation about the results of the 2020 presidential election.

Facebook is neglecting to take responsibility for the disinformation crisis that extends well beyond the former president and that has been fueled by its own and other social media platforms. By creating an oversight board with the particularly narrow mandate of providing advice on sensitive content moderation decisions, the company is ducking accountability on the many ways it amplifies lies, conspiracy theories and disinformation that incites violence.

In a post on its media blog, Facebook says it is combating disinformation and fraud through machine learning and by building new products that help its community and third-party fact checkers better identify and report false news to stop it from spreading. The company said it is also making it more difficult for those who post false news to buy ads and make money from its platform.

But until Facebook does more to proactively limit the reach of harmful content before crises emerge, we should continue to be skeptical of its commitment to addressing disinformation. For a company with more than 3.5 billion worldwide users on its various platforms and that just posted $26 billion in revenue in a single quarter, that’s not too much to ask. As Facebook prepares its next steps following the Board’s ruling, it’s worth examining some of the flawed and irresponsible policies that preceded Trump’s ban and considering how those policies should be changed.

Eliminate the ‘newsworthiness standard’

Before the violent insurrection at the US Capitol Building that led to Facebook banning Trump, Facebook justified serving up false, hateful and dangerous posts from the former President as part of its “newsworthiness exemption.” According to Facebook’s reasoning, content posted by influential users is not subject to Facebook’s community standards “if [the company] believes the public interest in seeing it outweighs the risk of harm.” That logic is not only self-serving, allowing Facebook to leave up inflammatory content that keeps people outraged and engaged; it’s also backwards. It’s the posts from powerful and influential people with a history of spreading disinformation that have the capacity for the highest reach and ability to do the most damage. If anything, such accounts should be subject to more scrutiny, not less.

End algorithmic amplification of disinformation

Facebook’s business model relies on gathering detailed information about users that can be processed and sold to advertisers who bid for the opportunity to place ads on users’ screens. And the longer a user stays on the platform, the more Facebook can profit. That’s why its algorithms continued to reward blatantly false and dangerous posts by the former President.

During the Oversight Board’s review, Facebook refused to answer the board’s questions about whether its algorithms amplified the visibility of then-President Trump’s posts surrounding the deadly siege on the Capitol on January 6. “In this case, the Board asked Facebook 46 questions, and Facebook declined to answer seven entirely, and two partially. The questions that Facebook did not answer included questions about how Facebook’s news feed and other features impacted the visibility of Mr. Trump’s content; whether Facebook has researched, or plans to research, those design decisions in relation to the events of January 6, 2021; and information about violating content from followers of Mr. Trump’s accounts,” the Oversight Board wrote in its decision.

For Facebook to truly grapple with the platform’s role in spreading harmful content, it must fundamentally change its model and treat political disinformation as a genuine threat to democracy. To start, the company should do more to prevent people from engaging with disinformation in the first place and boost authoritative news sources on users’ news feeds.

Pass laws that hold social media accountable

The Facebook Oversight Board was created, financed and given a specifically narrow mandate by Facebook itself. It’s not a public institution accountable to the people. That’s why, in addition to demanding better from Facebook itself, we must work to pass legislation that increases transparency about the content moderation policies at large social media companies and holds them accountable.

While Congress has yet to enact any meaningful legislation to help combat online disinformation, there’s momentum from lawmakers who recognize we can no longer allow a handful of profit-driven companies to call all the shots when it comes to the information powering our democracy. The For the People Act, which passed the House and now awaits action in the Senate, is a good place to start. That bill would improve disclosure rules for online political ads and require platforms to maintain a public database of political ads shown to their users. Another proposed bill, the Protecting Americans from Dangerous Algorithms Act, would hold platforms accountable for the harm caused by their algorithms. It also deserves consideration from Congress. Ultimately, no matter what Facebook says, it’s up to our elected leaders to provide the real oversight.

Source: edition.cnn.com
