Facebook froze as anti-vaccine comments swarmed users

Some Facebook employees believed they had found a way to help after claims about the dangers of coronavirus vaccines spread across social media in March.

Researchers at the company discovered they could alter how vaccine posts were ranked in people's news feeds, reducing misleading information about COVID-19 vaccines and surfacing legitimate sources such as the World Health Organization.

One Facebook employee responded to an internal memo about the study by writing, "Given these findings, I'm guessing we're hoping for launch ASAP."

Instead, Facebook shelved some of the study's suggestions. Other changes were not made until April.

In March, another Facebook researcher suggested disabling comments on vaccine posts until the platform could better tackle anti-vaccine messages. That suggestion was rejected.

Critics say Facebook was slow to act on the ideas because it feared doing so would hurt its profits.

"Why would comment removal be a problem? Engagement is what matters most," said Imran Ahmed, CEO of the Center for Countering Digital Hate, an internet watchdog organization. "It drives attention, attention equals eyeballs, and eyeballs equal ad revenue."

Facebook said in an email that it has made "considerable progress this year" in downgrading vaccine misinformation in users' feeds.

Facebook's internal discussions were revealed in disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by the legal counsel of former Facebook employee-turned-whistleblower Frances Haugen. The redacted versions received by Congress were obtained by a consortium of news organizations, including The Associated Press.

The trove of documents shows that Facebook carefully investigated how its platform spread misinformation about life-saving vaccines during the COVID-19 pandemic. The documents also show that rank-and-file employees regularly suggested solutions for countering anti-vaccine content. The Wall Street Journal reported last month on some of Facebook's efforts to address anti-vaccine comments.

Facebook's response raises questions about whether the company prioritized engagement over the health and well-being of its users.

"These people are selling fear and anger," said Roger McNamee, a Silicon Valley venture capitalist and early investor in Facebook. "It is not an accident. It's a business model."

Facebook ranks posts by engagement, meaning the total number of likes, dislikes, comments, and reshares they attract. That ranking system may work well for benign topics like recipes or dog photos. But Facebook's own documents show that when it comes to divisive issues such as vaccines, engagement-based ranking emphasizes polarization, disagreement, and doubt.

To study ways to reduce vaccine misinformation, Facebook researchers changed the ranking system for more than 6,000 users in the U.S. and Mexico. Instead of seeing vaccine posts chosen for their popularity, these users saw posts selected for their trustworthiness.

The results were striking: a nearly 12% decrease in content making claims debunked by fact-checkers, and an 8% increase in content from authoritative public health organizations such as the WHO and the U.S. Centers for Disease Control. These users also saw a 7% decrease in negative interactions.

Employees at the company responded to the study with enthusiasm, according to internal exchanges in the whistleblower's documents, replying approvingly to an internal memo describing how the platform could rein in anti-vaccine content.

Facebook says it eventually implemented many of the study's findings, but not for another month, a delay that came at a pivotal stage of the global vaccine rollout.

Dani Lever, a company spokesperson, said the internal documents "don't reflect the significant progress made since then in promoting reliable information on COVID-19 and expanding policies to eliminate more harmful COVID/vaccine misinformation."

The company said it also needed time to review and implement the changes.

But the urgency to act was clear: At the time, states across the U.S. were offering vaccines to the most vulnerable, including the elderly and the sick, and public health officials were worried. Only about 10% of the population had received a COVID-19 vaccine dose, and a third of Americans were leaning toward skipping the shot altogether, according to a poll by The Associated Press-NORC Center for Public Affairs Research.

Facebook employees admitted they had no idea how widespread anti-vaccine sentiment was in the comment sections of posts. But company research from February revealed that as many as 60% of comments on vaccine posts were anti-vaccine or vaccine-reluctant.

"That's a big problem and we have to fix it," a March 9 internal presentation stated.

Worse, company employees admitted they had no system in place to catch those comments, and no policy for removing them. The resulting free-for-all let users swarm vaccine posts from news outlets and humanitarian organizations with anti-vaccine comments.

Another internal memo, posted March 2, said: "Our ability to detect (vaccine hesitancy) in comments is bad in English -- and basically non-existent elsewhere."

Derek Beres, an author living in Los Angeles, promotes immunizations on his account on Instagram, which is owned by Facebook. Beres and his friends started a podcast last year after noticing conspiracy theories about COVID-19, vaccines, and other topics swirling through the feeds of well-known health and wellness influencers.

Earlier this year, Beres posted a photo of himself getting the COVID-19 shot. Some on social media told him he would likely die within six months.

"The comments section has been a disaster for so many people," Beres said.

Vaccine discourse on Facebook became so hostile that prominent public health agencies such as UNICEF and the World Health Organization would not use the free advertising Facebook had given them to encourage inoculation, according to the documents.

A few Facebook employees had an idea: While the company worked on a plan to reduce anti-vaccine sentiment in comments, why not disable commenting on vaccine posts altogether?

One Facebook employee wrote that he was interested in the proposal to remove ALL in-line comments on vaccine posts, as a stopgap until the platform could detect vaccine hesitancy in comments well enough to refine its removals.

This suggestion was not accepted.

Instead, on March 15, Facebook CEO Mark Zuckerberg announced that the platform would begin labeling posts about vaccines that describe them as safe.

Ahmed, of the Center for Countering Digital Hate, said the move let Facebook continue to profit from the engagement that anti-vaccine comments generated.

"They were looking for steps toward fixing the problems they caused without reducing engagement," he said.

It is unrealistic to expect a multibillion-dollar company like Facebook to voluntarily change a system that has proved so profitable, said Dan Brahmy, CEO of Cyabra, an Israeli tech company that analyzes social media networks and disinformation. Brahmy said government regulation might be the only thing that could force Facebook to act.

"They didn't do it because they didn't have to," Brahmy said. "They won't undo something that hurts the bottom line."

Bipartisan legislation introduced in the U.S. Senate would require social media platforms to give users the option of turning off the algorithms tech companies use to organize their news feeds.

Senator John Thune (R-South Dakota), a sponsor of the bill, asked Haugen to describe the dangers of engagement-based ranking during her testimony before Congress earlier this month.

She said there are better ways to rank content, such as chronologically or by source quality, that would serve users better, but that Facebook will not consider them because they could reduce engagement.

Haugen said Facebook knows that when it chooses the content users see, users spend more time on the platform and Facebook makes more money.

The documents Haugen leaked also show that a small segment of anti-vaccine Facebook users received outsized pageviews under the current ranking system.

On March 24, internal Facebook research warned that most of the "problematic vaccine content" was coming from just a few areas of the platform. In Facebook communities where vaccine distrust was highest, the report found, 50% of anti-vaccine pageviews came from just .016% of Facebook accounts.

The research noted that the top producers were mostly users who serially posted vaccine-hesitancy content to the feed.

The Center for Countering Digital Hate published a social media analysis finding that just a dozen Facebook users were responsible for 73% of anti-vaccine posts on the platform between February and March. In August, Facebook leaders publicly dismissed the study as "faulty," despite months of internal research showing that a small number of accounts were driving anti-vaccine sentiment.

An AP-NORC survey earlier this month found that the majority of Americans blame social media companies like Facebook and their users for spreading misinformation.

Ahmed said Facebook deserves much of the blame for the problem.

Ahmed said Facebook has made decisions that have led to people being misinformed and dying. "At this point, there should be a murder investigation," he said.