
Facebook Inc. is moving to stop the spread of anti-vaccine information on its social-media sites by reducing the rankings of groups that spread misinformation, removing an option for advertisers to target people interested in “vaccine controversies” and rejecting ads that promote falsehoods about vaccines.

The company, which made the announcement Thursday, has come under increasing pressure in recent weeks to take action on the barrage of anti-vaccine information found on Facebook and Instagram as cases of vaccine-preventable illnesses rise worldwide. A measles outbreak in B.C. has infected 17 people so far this year. Health officials in Moncton say a whooping-cough outbreak at a high school has infected five people. The World Health Organization has named vaccine hesitancy one of the top threats to global health this year.

Many experts say social-media sites are part of the problem, as they allow people to spread false information about vaccines to large audiences. Earlier this week, Ethan Lindenberger, an Ohio teenager who had himself vaccinated despite his mother’s opposition, testified at a U.S. Senate hearing that his mother relied on social-media sites, notably Facebook, as her source for anti-vaccine information.


Tim Caulfield, Canada Research Chair in health law and policy at the University of Alberta, wrote in an e-mail that exposure to anti-vaccine information can contribute to vaccine hesitancy, which is why this decision by Facebook is important.

“Social media, including Facebook, also helps to polarize the discourse. In many ways, social-media platforms are polarization machines. The loud voices win,” he wrote.

Facebook said it will make a series of changes: reduce the ranking of groups and pages that spread anti-vaccine information in the news feed and search functions; remove targeting options that let advertisers promote messages to people interested in “vaccine controversies”; and reject ads that include false information about vaccines. It said it might also disable accounts that continually violate the company’s policies.

Monika Bickert, Facebook’s vice-president of global policy management, wrote in a blog post Thursday that the company is “fully committed to the safety of our community and will continue to expand on this work.”

Facebook, which also owns Instagram, won’t display or recommend content that contains anti-vaccine information on Instagram’s explore function or hashtag pages. Instagram will remove ads that contain vaccine misinformation and will also remove targeting options for ads, such as “vaccine controversies”.

The company said it is also looking at introducing other measures, such as ensuring factual information from credible organizations about vaccines appears at the top of search rankings and adding warning language to invitations to join anti-vaccine groups.

Facebook is stopping short of removing anti-vaccine groups and pages, saying it believes outright removal isn’t helpful and that it makes more sense to ensure people have access to fact-based information to counter anti-vaccine messages.
