
Likes, anger emojis and RSVPs: the math behind Facebook’s News Feed — and how it backfired


(CNN Business) In late 2017, Facebook had a big problem: users were commenting on, “liking” and resharing posts less than they had in the past.

The declines in engagement, visualized in charts on an internal document viewed by CNN Business, looked like wiggly black-and-blue frowns. Facebook decided to make a change — and fast.

That December, other internal documents show, the company quickly came up with a plan: it would refocus its News Feed algorithm on a new metric for ranking people’s interactions on Facebook, which it referred to as “meaningful social interactions,” or MSI. MSI would assign different point values to things such as “likes” and comments on posts, and even RSVPs to events, and take into consideration the relationships between people, such as that between a person writing a post and a person commenting on it. It launched in early 2018, marking a new era in the ways the social network monitors and manipulates its users, and experiments on their data.

The introduction of the MSI metric quickly helped with Facebook’s engagement problem, according to an internal research note from November 6, 2019. Yet, as first reported by The Wall Street Journal last month, the company has also long been aware of the extent to which the metric fostered negativity online, according to disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by Facebook whistleblower Frances Haugen’s legal counsel. The redacted versions were obtained by a consortium of 17 news organizations, including CNN.

This ranking system was built and rolled out rapidly: According to an internal note from December 21, 2017, Facebook put the first iteration together in just over a month so it would be done by the end of 2017 for deployment in early 2018.

A Facebook spokeswoman told CNN Business that the introduction of MSI wasn’t a “sea change” in how the company ranked users’ activity on the social network, as it previously considered likes, comments, and shares as part of its ranking. The spokeswoman also said that a “significant amount of internal and external research” led to MSI, and that Facebook tested “various versions” of it before launch, as it would for “any changes” to its ranking systems.

“When News Feed decided to goal on meaningful interactions in November, the question came up: how can we use rigorous research and science to decide what weight should go on a like vs a comment etc in a little over one month?” the note asked (emphasis the authors’).

The answer, according to the note itself, came via surveying tens of thousands of Facebook users and digging through the company’s massive trove of user data. It was a process with the sheen of statistical rigor, but it was also one of trial and error and human judgment, the documents reveal.

      Move fast and automate things

It’s hard to say how Facebook’s use of algorithms that take into consideration actions and interactions on its platforms compares to other social networks, but one thing is clear: For years now, all major social networks — and many other online content services, such as Netflix and YouTube — have been heavily reliant on algorithms to govern what you see. And as the largest social network of them all, Facebook’s algorithms affect more than a third of the people on the planet. So when Facebook decided to use MSI to inform the algorithms that recommend News Feed content, it was making a change that would affect billions of people.

As the note from December 2017 explained, arriving at the values MSI used at launch was not a simple task. Facebook ran surveys of more than 69,000 people in five countries that are among its largest in terms of monthly active users, asking them “what feedback they find meaningful to give and to receive.” This let the company determine how people valued interactions with different types of people — such as close friends versus acquaintances — and different types of interaction — such as comments on posts versus shares.

Those findings, which included that people found more meaning in whom they interacted with than in the type of interaction, “helped validate” and “fine tune” the company’s data science findings, the document said, and helped Facebook adjust how it weights the relationships between people interacting on the social network.

The company also used surveys, along with existing knowledge about its users and internal data science experiments, to help understand how to build a scale for ranking interactions. For instance, polling detailed in the note found that many users put a low value on having their posts reshared, because they viewed a reshare as an interaction between the sharer and that person’s friends.

The note pointed out that the company did things such as analyzing “a bunch of experiments that give people extra feedback.” In those experiments, some posts were given “a little more likes” and others “a little more comments.” The results were used to predict how many additional original posts people would generate given the number of likes, comments, and reactions they had received on previous posts.

This helped Facebook come up with a scale, labeled in the note as the final weights for the metric in the first half of 2018: each “like,” for instance, would be worth 1 point; a reaction emoji or a reshare of a post without adding any text would be worth 5 points; an RSVP for an event would be worth 15 points; and comments, messages or reshares deemed “significant” — defined as having “at least 5+ unique tokens, or a photo or video (in case of shares and messages)” — would be worth 30 points, according to the note.

The company could then multiply the total by a figure representing the closeness of the relationship between the people interacting: were they members of the same group, perhaps? If so, multiply by 0.5. Total strangers? Multiply by 0.3.

Jenna Burrell, director of research at the nonprofit Data & Society, which studies the social implications of technologies, told CNN Business that the research Facebook conducted on users in this case appeared to be quite limited, as the document doesn’t mention surveying users on the actual content they post — words, pictures, or videos — or the comments they might leave on others’ posts. “What they’re trying to get at is something that’s really hard to reduce to a metric,” she said.

Beyond that, the decision to focus on meaningful social interactions was the kind of switch that would require “thousands of different ways of testing it,” according to Ethan Zuckerman, an associate professor at the University of Massachusetts at Amherst who studies how media can be used to enact social change. “Because you’re really just rewiring the whole network. So you have to be phenomenally careful and phenomenally thoughtful about how you do it,” he said.

Another complicating factor is that while Facebook is an international social network, people don’t use it the same way in every country. In Myanmar, for instance, which in 2018 became well known for the deadly impact of hate speech spread via Facebook, Zuckerman said Facebook is seen as more of a news service than a personal network. “The notion that Facebook is your friends and your personal relationships — that’s just not true for some of the world,” he said.

Before the launch of the MSI scale, the Facebook spokeswoman said, Facebook conducted three separate tests of it, using various strengths of the scale, on a subset of users. This sort of test, followed by any needed tweaks, is standard practice ahead of the introduction of any ranking change, the spokeswoman said.
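Taken together, the point values and the closeness multipliers the documents describe amount to a simple weighted sum. Below is a minimal sketch of that arithmetic in Python. The weights and multipliers are the ones quoted above, but every name in the code is hypothetical, and because the documents don’t spell out whether the closeness figure applies to each interaction or to a post’s running total, applying it per interaction is an assumption here.

```python
# Minimal sketch of the MSI arithmetic described in the internal notes.
# Point values and multipliers are quoted from the documents; all names
# are hypothetical, not Facebook's actual code.

# Per-interaction point values from the first-half-2018 scale
INTERACTION_WEIGHTS = {
    "like": 1,
    "reaction": 5,               # reaction emoji
    "reshare_no_text": 5,        # reshare without added text
    "rsvp": 15,                  # RSVP for an event
    "significant_comment": 30,   # "at least 5+ unique tokens, or a photo or video"
    "significant_message": 30,
    "significant_reshare": 30,
}

# Relationship-closeness multipliers mentioned in the documents
RELATIONSHIP_MULTIPLIERS = {
    "same_group": 0.5,   # members of the same group
    "stranger": 0.3,     # total strangers
}

def msi_score(interactions):
    """Sum point values, each scaled by the closeness of the relationship.

    `interactions` is a list of (interaction_type, relationship) pairs.
    Unknown relationships fall back to a neutral 1.0 (an assumption).
    """
    total = 0.0
    for interaction_type, relationship in interactions:
        points = INTERACTION_WEIGHTS.get(interaction_type, 0)
        closeness = RELATIONSHIP_MULTIPLIERS.get(relationship, 1.0)
        total += points * closeness
    return total

# A like from a stranger, plus an RSVP and a "significant" comment
# from members of the same group:
score = msi_score([
    ("like", "stranger"),                   #  1 * 0.3 =  0.3
    ("rsvp", "same_group"),                 # 15 * 0.5 =  7.5
    ("significant_comment", "same_group"),  # 30 * 0.5 = 15.0
])
print(score)  # 22.8
```

One consequence of such a scale: a single “significant” comment from a fellow group member (30 × 0.5 = 15 points) counts as much as fifty “likes” from strangers (1 × 0.3 = 0.3 points each).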

      “Beneficial for most metrics”

Facebook introduced MSI publicly on January 11, 2018, as a way to prioritize posts from friends, family members, and groups. In an interview with CNN at the time, Adam Mosseri, who was then a VP at Facebook and today heads Facebook-owned Instagram, described the move to push meaningful social interactions as a “rebalancing” of how Facebook’s algorithms rank items in the main feed. “We think that we’re currently slightly overvaluing how much time people spend on our platform and undervaluing how many meaningful interactions they have with other people,” said Mosseri, who at the time oversaw News Feed.

Indeed, throughout 2018, as the November 6, 2019 research note recounted, the use of MSI was “beneficial for most metrics,” both around the world and in the US/Canada region. It led to increases in social interactions, such as likes and comments between users, as well as other “critical ecosystem metrics,” such as the number of people using Facebook daily, revenue, and time users spent looking at their News Feeds.

“Long-term effects on democracy”?

Yet Facebook quickly discovered that its emphasis on MSI wouldn’t have the same impact in every country and across every type of device. For instance, as that same November 2019 note stated, the company found in April 2018 that reliance on the metric was “hurting” Facebook’s daily active Android users in India; the authors of the note wrote that Facebook could make up for this loss by reducing its reliance on MSI and putting more emphasis on recommended videos in users’ feeds. By the end of September that year, the document said, Facebook had identified 11 countries where it used a “more balanced strategy” of MSI plus “appropriate amounts of video.”

The note also pointed out that “the dynamics of Feed are changing constantly,” and in early 2019 the company’s ranking team concluded that optimizing for MSI “was no longer an effective tactic for growing sessions”; public-content ranking, it said, was a “better strategy.”

And less than a year after its launch, documents indicate, Facebook knew there were deeper issues with relying on the metric. A November 2018 research memo titled “Does Facebook reward outrage? Posts that generate negative comments get more clicks” pointed out that an analysis that month showed posts linking to BuzzFeed drew more clicks on the link when they attracted more negative comments. Looking at 13 additional popular publishers and domains, the author of the research found the problem stretched far beyond BuzzFeed.

The memo also pointed out that, because of this trend, some publishers may choose to capitalize on negativity. “With the incentives we create, some publishers will choose to do the right thing, while others will take the path that maximizes profits at the expense of their audience’s wellbeing,” the memo stated.

“Ethical issues aside, empirically, the current set of financial incentives our algorithms create does not appear to be aligned with our mission,” the memo read, emphasis the author’s. “We can choose to be idle and keep feeding users fast-food, but that only works for so long; many have already caught on to the fact that fast-food is linked to obesity and, therefore, its short term value is not worth the long-term cost.”

The move to MSI wasn’t just an issue for publishers: political parties were concerned, too. Another internal research note from April 1, 2019, pointed out that multiple European political parties claimed the arrival of MSI in 2018 “changed the nature of politics. For the worse.”

The parties argued, the note said, that by emphasizing reshared content, Facebook was “systematically” rewarding “provocative, low-quality content,” and parties felt they needed to adjust by pumping out “far more negative content than before” because engagement on positive and policy posts had fallen dramatically. In Poland, for instance, the note said that one party’s social media management team estimated that its posts had shifted from half positive and half negative to 80% negative because of the algorithmic change. In Spain, the document said, parties reported feeling “trapped in an inescapable cycle of negative campaigning by the incentive structures of the platform.”

“Many parties, including those that have shifted strongly to the negative, worry about the long-term effects on democracy,” the document read.

Facebook employees were concerned about the impact of MSI, too. In a comment on that same document, a Facebook employee responded to the Spain data by saying it made their “heart cringe.” “I have seen the effect this has had on my mother and how she has become polarized. It’s hard to rally people to cries of ‘be reasonable’ … to find common ground,” they wrote.

      “Haha”

Over time, internal documents show, Facebook employees proposed changes to MSI, some of which may sound small but illustrate the difficulty of assigning numbers to interactions. For example, the company’s reaction emojis, and the “haha” one in particular, don’t always land the same way in every country. In a December 11, 2019 note on the company’s internal platform titled “We are Responsible for Viral Content,” the author wrote that “haha” reactions “are seen as insulting on non-humorous posts in Myanmar.” They included a cartoon translated from Burmese that, according to the note, read “You have been deferred from this year education because you reacted with ‘Haha’ to all my posts.”

At the time, all reactions were still weighted the same, but the author noted that “promising proposals are in the works to change this.” Changes were made to MSI numerous times after its launch, including in early and late 2020; the Facebook spokeswoman said the formula behind it is “continually updated and refined based on new research and direct feedback from users.”

A post in an internal employee group on September 15, 2020, forecast changes planned for around October 1 intended “to make MSI capture more useful interactions.” These included filtering out some so-called “bad interactions,” such as deleted comments and single-character comments, and rejiggering the weights assigned to reaction icons. Most notably, Facebook said it would make the “angry” reaction, which had previously been demoted to 1.5 points, worth zero points.
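In code terms, the changes described in that post amount to an interaction filter plus a weight update. The sketch below, again in Python with entirely hypothetical names, illustrates only what the documents describe: the examples given for “bad interactions” and the “angry” weight dropping from 1.5 points to zero. The other reaction values shown are the 2018 launch weights and may not reflect the formula as it stood in 2020.

```python
# Illustrative sketch of the changes forecast for around October 1, 2020.
# All names are hypothetical; only the "angry" values (1.5 -> 0) and the
# filter examples (deleted and single-character comments) come from the documents.

REACTION_WEIGHTS = {
    "like": 1,      # 2018 launch value; may have changed by 2020
    "love": 5,      # 2018 launch value for reaction emojis
    "haha": 5,
    "angry": 1.5,   # already demoted below the other reactions by this point
}

def is_bad_interaction(comment):
    """Return True for the "bad interactions" named as examples in the post."""
    return comment.get("deleted", False) or len(comment.get("text", "")) <= 1

def apply_october_2020_weights(weights):
    """Return revised weights with the "angry" reaction zeroed out."""
    revised = dict(weights)
    revised["angry"] = 0   # "angry" drops from 1.5 points to zero
    return revised

# Example: keep only countable interactions, then use the revised weights
comments = [
    {"text": "This is a thoughtful reply", "deleted": False},
    {"text": "k", "deleted": False},      # single character: filtered out
    {"text": "spam", "deleted": True},    # deleted: filtered out
]
countable = [c for c in comments if not is_bad_interaction(c)]
weights = apply_october_2020_weights(REACTION_WEIGHTS)
print(len(countable), weights["angry"])  # 1 0
```

With a zero weight, an “angry” reaction contributes nothing to a post’s MSI score, removing whatever ranking boost it once provided.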

        A document from July 2020 laying out the proposal for weight revisions coming in the second half of the year gave a hint as to how the company landed on that decision. “After discussing with Comms to decide between Angry 0 vs 0.5 to see if one is more externally defendable, this is our final proposal on 2020/07/31,” it read, listing a zero value for “angry” in a table below.

