Facebook researchers had deep knowledge of how coronavirus and vaccine misinformation moved through the company’s apps, running multiple studies and producing large internal reports on what kinds of users were most likely to share falsehoods about the deadly virus, according to documents disclosed by Facebook whistleblower Frances Haugen.

But even as academics, lawmakers and the White House urged Facebook for months to be more transparent about the misinformation and its effects on the behavior of its users, the company refused to share much of this information publicly, resulting in a public showdown with the Biden administration.

Internally, Facebook employees showed that coronavirus misinformation was dominating small sections of its platform, creating “echo-chamber-like effects” and reinforcing vaccine hesitancy. Other researchers documented how posts by medical authorities, like the World Health Organization, were often swarmed by anti-vaccine commenters, hijacking a pro-vaccine message and wrapping it in a stream of falsehoods.

Taken together, the documents underline just how extensively Facebook was studying coronavirus and vaccine misinformation on its platform as the virus tore across the world, unearthing findings that concerned its own employees.

Yet in public blog posts and in congressional testimony, executives focused on more positive aspects of the social network’s pandemic response, such as how many pieces of misinformation it had taken down and its work to help its users find vaccine clinics near them.

The disconnect between what the company knew and what it shared publicly bolsters demands from lawmakers, who are increasingly amenable to proposals to force greater transparency from tech giants. Some directly support Haugen’s assertion that a separate regulatory body is needed to study algorithms and internal research.

“For months, I’ve repeatedly requested information from Facebook about COVID misinformation, including questions about which users post it, how the platform amplifies it, how Facebook decides what to remove, and much more,” Rep. Anna Eshoo, D-Calif., said in a Tuesday email to The Post. Representing a district that includes Silicon Valley, she sits on the House Energy and Commerce Committee, which has oversight over the tech giants.

“I asked these questions because policymakers need to understand how COVID misinformation spreads and how we can mitigate its harmful effects on vaccine hesitancy and public health. It was the whistleblower documents that shed light on these issues, instead of Facebook releasing them a long time ago,” Eshoo said.

In an emailed statement, Facebook spokesman Aaron Simpson said the company has worked to promote reliable information about the coronavirus throughout the pandemic and that vaccine hesitancy among U.S. Facebook users has gone down by 50 percent since January.

“There’s no silver bullet to fighting misinformation, which is why we take a comprehensive approach which includes removing more than 20 million pieces of content that break our COVID misinformation policies, permanently banning thousands of repeat offenders from our services, connecting more than 2 billion people to reliable information about COVID-19 and vaccines, and partnering with independent fact-checkers,” Simpson said.

The documents were disclosed to the U.S. Securities and Exchange Commission and Congress by whistleblower Haugen’s lawyers and were reviewed by a consortium of media organizations, including The Washington Post. The documents include research reports, slide decks and discussions between employees on the company’s internal message board. The Wall Street Journal previously reported on some of the pandemic-related revelations in the papers, including how the company struggled to police anti-vaccine comments.

How Facebook, which is particularly popular with older Americans, has affected perceptions around vaccines has been a key part of the discussion about the pandemic over the past year. In July, as the delta variant was driving a massive new surge in infections while the rate of new vaccinations leveled off, the White House began placing some of the blame on social media.

The surgeon general said tech platforms had allowed misinformation about vaccines and the virus to “poison our information environment,” a callback to the days of Big Tobacco. A day later, President Joe Biden told reporters that Facebook was “killing people.” (He later backed off the comment.)

Facebook shot back, saying it had helped fight the pandemic and had strict rules to prevent the spread of harmful information.

“While social media plays an important role in society, it is clear that we need a whole of society approach to end this pandemic,” Facebook Vice President Guy Rosen wrote in a blog post. “Facts – not allegations – should help inform that effort.”

Haugen left Facebook in May, before the public fight between the company and the White House, so it’s unclear what kind of information the company had during that time. But employees were looking into the issue well before the public spat.

One study noted that much of the misinformation being spread about vaccines came from “COVID super-spreaders,” who used tactics similar to those of purveyors of falsehoods about the 2020 election and the extremist QAnon ideology. Taking action against these people was “low hanging fruit,” the report said. “We found, like many problems at FB, that this is a head-heavy problem with a relatively few number of actors creating a large percentage of the content growth,” the study read.

The writers suggested that Facebook could put limits on the amount of content people could produce, which would not affect the vast majority of users but might stymie repeat spreaders of coronavirus-related misinformation.

Facebook’s Simpson said the company has permanently banned thousands of “repeat offenders.”

In another study, Facebook researchers again flagged that “problematic vaccine content” was concentrated among a small segment of users. Facebook workers found that half of the views of “problematic vaccine content” came from people in just 10 out of 638 sets of U.S. users, classified together by the company because of their similar social ties.

Within the group with the highest rate of vaccine-hesitant posts, numbering about 3 million people, just five users with more than 50 posts each accounted for 10% of its vaccine hesitancy.

Employees noted that those dynamics had created “self-reinforcing” feedback loops, where vaccine-skeptical posts were overwhelmingly supported in the comments and reactions. Anti-vaccine comments got boosted with many “like” and “love” reactions, while people who posted pro-vaccine content were met with derision in the form of “angry,” “sad” or “haha” reactions, the researchers wrote.

Behind the efforts to get more information from Facebook was a central question: How has the social network shaped real-life behavior? One research report from March 2021 hinted that product decisions did indeed affect people’s offline behaviors during the pandemic.

In one three-week experiment, the company ran a series of interventions that resulted in people seeing more authoritative health information, and less misinformation that had been debunked by the company’s fact-checking partners. People whose feeds had these interventions reported that they were more likely to ensure they were physically distancing in public. However, the changes resulted in a very slight decline in time spent on Facebook’s apps – by 0.06%.

As part of the study, Facebook introduced a news feed model for a small group that didn’t promote content based on its number of comments and re-shares, unless it came from an official health authority, a mainstream news publisher or one of the user’s close friends.

The experiment resulted in a nearly 12% reduction in certain types of misinformation and a 7% reduction in unwanted social interactions on coronavirus-related content. And users liked it better, too.

“Treated users find COVID-19 content on FB more informative and are more likely to cite FB as source of ‘useful’ info,” the report found.

In this case, Facebook implemented the changes across its platform, Simpson said.

Whether a small group of actors is responsible for an outsize amount of coronavirus misinformation has been a central question in the debate over social media and its role in the pandemic for months. A March report from the nonprofit Center for Countering Digital Hate concluded that a “disinformation dozen” of just 12 people were responsible for up to 73% of the anti-vaccine content on Facebook. The analysis was based on a sample of 483 pieces of anti-vaccine content known to be circulating in Facebook Groups. The report was popular among lawmakers, and the White House referenced its findings as well when it went on the offensive against social media.

There are fair questions about whether a study that focused on a small sample of posts represents what’s happening on the sprawling platform. But even so, Facebook’s own internal data showed that the central thesis was largely true – most of the anti-vaccine content was indeed coming from a relatively minuscule number of accounts.

Even before the “disinformation dozen” report, an internal Facebook document reported on by The Post in March showed that the company knew most of the posts its algorithms identified as expressing “vaccine hesitant” sentiments were coming from a tiny percentage of its users.

Still, Facebook aggressively pushed back against the Center for Countering Digital Hate report, releasing a blog post disputing it publicly after the tensions with the White House put a spotlight on the research. “Focusing on such a small group of people distracts from the complex challenges we all face in addressing misinformation about COVID-19 vaccines,” Monika Bickert, Facebook’s vice president of content policy, said in a blog post.

When Facebook chief executive Mark Zuckerberg testified at a House hearing on disinformation in March, Rep. Mike Doyle, D-Pa., pressed him to commit to removing the so-called disinformation dozen. Zuckerberg replied that he and his team would need to “look at the exact examples” to see if those users were breaking Facebook’s rules. Doyle said Wednesday that his office never heard back. “That’s no surprise given Facebook’s lengthy record of dismissing transparency, avoiding accountability, and not owning up to its own mistakes,” he said in a statement to The Post. “I wish I could expect better of them.”

Imran Ahmed, the chief executive of the group that authored the report, said Facebook’s internal research backs up his warnings. “It turns out they were seeing exactly what we were seeing,” he said. “I’ve sat in government forums where Facebook has misled the lawmakers whose job it was to protect us.” If Facebook had shared more information, “lives would have been saved,” he added.

The internal documents also demonstrate that Facebook employees were able to calculate how many views a widely shared piece of misinformation garnered. In March, Facebook employees noted that one post undermining vaccines was able to garner 53,000 re-shares and more than 3 million impressions because of an apparent error with Facebook’s algorithms.

The user posted a meme with the caption “Baaaaaa …” – a common phrase that anti-vaccine activists have used to imply that people getting the vaccine are mindless sheep. But Facebook’s systems thought the post was written in Romanian, a language in which “ba” means “no,” so it was not demoted. One employee called it “a bad miss for misinfo.”

In the months after that report, Facebook continued to rebuff policymakers eager for answers about how much coronavirus-related misinformation was being viewed on the site, or whether the company alerted advertisers when their content appeared alongside vaccine misinformation.

In one July letter, Reps. Jan Schakowsky, D-Ill., and Eshoo urged the company to answer six detailed questions about how many times coronavirus misinformation was being viewed on the site, including whether the company alerted affected advertisers.

The company declined to answer.

“At this time, we have nothing to share in response to the questions you have raised, outside of what Mark [Zuckerberg] has said publicly,” the company wrote in a one-paragraph response, a month later.

On the inside, Facebook employees appeared to be earnestly trying to fight coronavirus misinformation. One document listed about 20 different “signals” the company used, on top of its regular algorithms, to detect whether a piece of such misinformation had been posted. For example, if a user who had posted repeatedly to a COVID-focused group had one of their posts go viral, it was a good indicator the content might be misinformation. The document shows Facebook’s employees went to great lengths to try to detect content that broke its rules, while working within a system where any post could go viral at any time.

Comments were an especially difficult area for the company to police. “Vaccine hesitancy in comments is rampant,” one researcher wrote, noting that the proportion of vaccine-hesitant comments on vaccine-related posts may be “north of 40%.”

“Our ability to detect vaccine-hesitant comments is bad in English, and basically non-existent elsewhere,” the researcher wrote. The Wall Street Journal previously reported the high rate of misinformation in comments.

Sometimes blatant coronavirus-related misinformation was out in the open and required employees to manually report it before action was taken.

In May, Indian actress Kangana Ranaut wrote to her 7.7 million followers on Facebook-owned Instagram that she had tested positive for COVID-19, the disease caused by the coronavirus. Her post downplayed the severity of the disease, which at the time was ripping through the country, killing tens of thousands of people.

“If you are scared it will scare you more,” Ranaut wrote. “Come let’s destroy this COVID-19 it is nothing but a small time flu which got too much press.” The post was flagged by a Facebook employee on an internal thread related to coronavirus misinformation asking why it was still up. Company policies forbid users from saying COVID-19 is no more dangerous than the flu.

The post was soon taken down, but the employee who originally flagged Ranaut’s account wanted to know why the actress was allowed on Instagram, after being kicked off Twitter in May for repeatedly posting abusive and anti-Muslim comments. Ranaut’s Instagram account was still active as of Wednesday, and she now has 7.8 million followers.

Vaccine hesitancy on Facebook predates the coronavirus. Until 2019, the company took a largely hands-off approach to anti-vaccine content. The company announced in March of that year that it would stop recommending pages and groups with vaccine misinformation, and that it would block advertisements that include falsehoods about vaccines. The company cracked down significantly during the pandemic, announcing in December that it would remove misinformation about the safety, efficacy, ingredients or side effects of vaccines.

Those earlier anti-vaccine groups – many of which were spreading misinformation – have contributed to the phenomenon we see today, said Hany Farid, a computer science professor and misinformation researcher at the University of California at Berkeley.

Facebook “really set the stage both for the Jan. 6 insurrection and for COVID by … allowing vast numbers of conspiracies to be promoted on their platform so people don’t trust the experts,” Farid said.

White House press secretary Jen Psaki said at a Wednesday news conference that it is “unfortunately not surprising” for the administration to learn that Facebook knew of its problems with vaccine misinformation. “We’ve continued to see platforms regularly amplify anti-vaccine content over accurate information,” she said. “That’s the basic problem. And that’s what we continue to see happen.”

