Facebook research reportedly finds small number of users responsible for spreading vaccine doubt

Vaccine hesitancy predates COVID-19 and social media, and may not violate platforms’ rules

TV24 DESK: Facebook research into “vaccine-hesitant” beliefs has found that a small group of users is driving many of the discussions that may sow doubt or discouragement about taking a vaccine, The Washington Post reported.

Vaccine hesitancy predates both social media and COVID-19, as the World Health Organization reports, and can derail progress in eradicating vaccine-preventable diseases. The WHO points out that vaccine hesitancy may not be wholly responsible for the 30 percent increase in measles cases around the world over the past several years, but it played a role in measles' resurgence.

Facebook banned false and misleading ads about vaccines back in October, weeks before the first coronavirus vaccines were even available. In December, Facebook announced it would remove false claims about COVID-19 vaccines, and began notifying users if they had interacted with a post that had false information. It’s also taken steps to promote authoritative information about COVID-19 vaccines.

The research described by the Post appears to cover more of a gray area: for example, a user mentioning on Facebook that their symptoms after receiving a vaccine dose were worse than they expected. Comments like that could be used to better understand the vaccine's impact, but could also make other users wary, especially if they're already nervous about the vaccine.

The study appears to confirm what many Facebook users (and critics) have long known: that there’s an echo chamber effect that helps spread misinformation on the platform. Content that helps create this effect may not actually run afoul of any of Facebook’s rules, but can quickly metastasize among groups of susceptible users.

Perhaps unsurprisingly, the Facebook researchers found there was significant overlap between users connected to QAnon conspiracy theories, which Facebook has banned from the platform, and the user communities who expressed skepticism about vaccines.

Facebook spokesperson Dani Lever said in an email to The Verge that the company has partnered with more than 60 global health experts and has studied content related to COVID-19 vaccines and other information to inform its policies. She added that Facebook routinely studies trends that may be part of conversations on its platform, such as voting, bias, hate speech, and nudity, so it can continue to refine its products.

“Public health experts have made it clear that tackling vaccine hesitancy is a top priority in the COVID response, which is why we’ve launched a global campaign that has already connected 2 billion people to reliable information from health experts and removed false claims about COVID and vaccines,” she said. “This ongoing work will help to inform our efforts.”

Source: THE VERGE
