Analysis: Zuckerberg pushes back hard against the Facebook whistleblower. But it’s not the full story

(CNN) Hours after Facebook whistleblower Frances Haugen testified before Congress about how the social network poses a danger to children and democracy, Mark Zuckerberg took to the platform he built and posted a 1,300-word screed trying to undermine her.

His main argument was that Haugen was taking Facebook’s research on its impact on children — among the tens of thousands of pages of internal documents and research she took before she left the company — out of context. In essence, he argued she cannot be trusted to properly portray the company’s findings, claiming she painted a “false picture of the company.” But despite employing many talented and diligent researchers, it’s Facebook’s top executive who cannot be trusted when it comes to sharing the work of those researchers with the public.

In August, Facebook (FB) released a report about the most-viewed posts on its platform in the United States. Guy Rosen, Facebook’s vice president of integrity (yes, that’s a real job title at Facebook), said at the time that the company had become “by far the most transparent platform on the internet.”

The report covered Facebook data for the second quarter of this year, and Facebook suggested it painted a rather rosy picture. “Many of the most-viewed pages focused on sharing content about pets, cooking, family,” Facebook said.

There was a catch. The report covered only the second quarter of 2021. What about the first quarter? Had Facebook not gathered data and compiled a report for the first three months of the year?

It had, but Facebook executives chose not to share it with the public “because of concerns that it would look bad for the company,” The New York Times reported. The shelved report showed that the most-viewed link on Facebook in the first quarter of this year was a news article that said a doctor died after receiving the coronavirus vaccine, the Times reported. That an article with clear potential to be reshared in ways that undermine confidence in vaccination was among the most popular content on Facebook amid a pandemic didn’t fit the image the company’s executives are trying to project: that anti-vaccine sentiment isn’t running rampant on the platform and that the company isn’t contributing to America’s vaccine hesitancy problem.

When the research eventually leaked to the Times, Facebook came clean. “We’re guilty of cleaning up our house a bit before we invited company,” said Andy Stone, a Facebook spokesperson.

The next month, the company was criticized after New York University researchers studying misinformation on Facebook said they were booted from the platform. (The company said its decision to deplatform the researchers related to a separate study on political ads, which used a browser extension that let users anonymously share the ads they saw on Facebook with the researchers.)

This blatant cherry-picking of which research to make public and which to hide raises the question: what else does Facebook know that it’s not telling us? And who is really creating a “false picture” of the company and its impact on society?

Facebook whistleblower Frances Haugen appears before the Senate Commerce, Science, and Transportation Subcommittee at the Russell Senate Office Building on October 5, 2021, in Washington, DC. Haugen left Facebook in May and provided internal company documents about Facebook to journalists and others, alleging that Facebook consistently chooses profit over safety.

A low-level employee

Facebook’s other attempt to undermine the whistleblower was to portray Haugen as a low-level employee who doesn’t know what she is talking about. After her testimony Tuesday, Facebook described Haugen as “a former product manager at Facebook who worked for the company for less than two years, had no direct reports, never attended a decision-point meeting with C-level executives.” But that strategy appears to be backfiring, too.

Samidh Chakrabarti was head of “civic engagement” at Facebook. The company had regularly put him forward to speak publicly about the good work Facebook was doing; he was even part of the press tour of the Facebook “war room” for the 2018 US midterm elections. (The war room was later widely mocked as a publicity stunt.)

That prompted Chakrabarti to respond on Twitter: “Well I was there for over 6 years, had numerous direct reports, and led many decision meetings with C-level execs, and I find the perspectives shared on the need for algorithmic regulation, research transparency, and independent oversight to be entirely valid for debate.”

Unfortunately for Facebook, Haugen is on to something.

Source: edition.cnn.com
