A Misinformation Outbreak

In the year 2000, the U.S. government declared measles to be eliminated, meaning that for more than twelve months there had been no endemic cases of the disease within the country (cdc.gov).  Widespread use of the measles vaccine since 1968 had resulted in steadily declining incidence of the disease. Then, it all began to change.  A 1998 study, which has since been retracted and debunked, misguidedly linked vaccines to autism and accelerated an anti-vaccination social movement. Over the past ten years, a steadily increasing number of outbreaks have occurred within communities where the percentage of vaccinated individuals is too low for herd immunity to be effective.  In 2018, the CDC reported 372 confirmed cases of measles in the U.S., more than three times the count in 2017.  2019 is already off to a troubling start, with 159 cases as of February 21st (cdc.gov).  Although “anti-vaxxers” are a relatively fringe group, their misinformation has spread widely and rampantly, and has manifested as physical health consequences.  What role do Facebook’s algorithms play in propagating the spread of misinformation?

Websites, most notably Facebook in this case, give small fringe groups a platform to reach a huge audience.  The ease of sharing articles on the platform means information spreads quickly, with all the associated benefits and consequences.  Facebook’s algorithms are designed to surface content that users are most likely to be interested in.  As a result, people see pages similar to those they already interact with, and content posted by friends, who likely hold similar worldviews. The result is a bubble effect: people end up seeing only one side of a story.  Daniel Kahneman would attribute this to the “availability heuristic,” our tendency to judge how likely or true something is by how easily examples come to mind.  There is no shortage of peer-reviewed journal articles to counter anti-vaccination messages, but because links to anti-vaccination content are far more available in certain networks, that content is what members of those networks see as true and base their beliefs on.
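The bubble effect described above can be made concrete with a toy sketch of interest-based recommendation. All users, pages, and likes below are invented for illustration; this is not Facebook’s actual algorithm, just the simplest version of the “suggest what similar users like” idea.

```python
# Hypothetical users and the pages they follow (all names invented).
likes = {
    "alice": {"HolisticMoms", "EssentialOilsDaily"},
    "bob":   {"HolisticMoms", "EssentialOilsDaily", "VaccineTruthNow"},
    "carol": {"ScienceNews", "CDCUpdates"},
}

def recommend(user):
    """Suggest pages followed by users who share interests with `user`."""
    mine = likes[user]
    suggestions = set()
    for other, theirs in likes.items():
        if other != user and mine & theirs:  # any overlap in interests
            suggestions |= theirs - mine     # suggest what they follow
    return suggestions

print(recommend("alice"))  # alice overlaps with bob, not with carol
```

Because alice and carol share no pages, alice is never shown carol’s science-oriented sources, while bob’s anti-vaccination page flows straight into her suggestions. The bubble emerges from the similarity rule itself, with no one deciding to hide anything.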


Figure 1. The network map of the top 500 publishers about vaccines by hyperlink degree centrality, demonstrating social clustering, who is linking to one another, and how frequently: a vaccine-hesitant community (green), a health and science community (pink), a provaccine community (blue), and a mainstream media community (yellow) that used language common to both provaccine lay audiences and science.

A 2018 study analyzed the influence of vaccination-related information sources based on hyperlink indegree centrality, meaning the number of times a source is linked to from other web pages. The labeling is based on the most common vaccine sentiment found on each of these sources; individual stories from these sources may lean in a different direction.  As shown in the figure, there is very little overlap between the different types of sources. When individuals post on Facebook about a vaccine-related topic, they will see content from similarly hyperlinked sources, and as a result have background data from only a limited set of sources.
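Indegree centrality is simple to compute: count, for each publisher, how many other pages link to it. Here is a minimal sketch on a toy link graph; all page and publisher names are hypothetical and not drawn from the study.

```python
from collections import Counter

# Toy hyperlink graph as (linking page, linked-to publisher) edges.
# All names are invented for illustration.
links = [
    ("blogA",  "NaturalHealthSite"),
    ("blogB",  "NaturalHealthSite"),
    ("forumC", "NaturalHealthSite"),
    ("newsD",  "CDC"),
    ("newsE",  "CDC"),
    ("blogB",  "LocalPaper"),
]

# Indegree: how many pages link TO each publisher.
indegree = Counter(target for _, target in links)

for publisher, count in indegree.most_common():
    print(publisher, count)
```

In a real analysis like the study’s, these counts would be computed over hundreds of publishers and then used to size and position the nodes in a network map such as Figure 1, where densely interlinked clusters fall into distinct communities.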

It is a problem that Facebook has recognized and announced a need to address. A Facebook VP stated, “False news is harmful to our community, it makes the world less informed, and it erodes trust. It’s not a new phenomenon, and all of us — tech companies, media companies, newsrooms, teachers — have a responsibility to do our part in addressing it” (Facebook.com). Potential changes to the algorithm include not featuring known anti-vaccination posts as suggested content for users, or placing anti-vaccination material farther down the list of search results (Cohen & Bonifield, 2019).  It is a complicated balance between censorship and limiting the spread of harmful misinformation, but adjusting the algorithms to promote a wider dispersion of verified information would help limit instances of availability bias when it comes to information about vaccines.
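The “push it farther down the results” option reported by Cohen and Bonifield can be sketched as a scoring adjustment: rather than removing flagged posts, the ranker multiplies their score by a penalty so they sort below other content. The sources, scores, and penalty factor below are all invented for illustration.

```python
# Hypothetical downranking sketch: flagged sources stay visible
# but sort below everything else. All values are invented.
FLAGGED_SOURCES = {"NaturalHealthSite"}
DOWNRANK_FACTOR = 0.1  # shrink the score of flagged posts

def rank_feed(posts):
    """Sort posts by engagement score, penalizing flagged sources."""
    def adjusted(post):
        score = post["engagement"]
        if post["source"] in FLAGGED_SOURCES:
            score *= DOWNRANK_FACTOR
        return score
    return sorted(posts, key=adjusted, reverse=True)

posts = [
    {"source": "NaturalHealthSite", "engagement": 90},
    {"source": "CDC",               "engagement": 40},
    {"source": "LocalPaper",        "engagement": 25},
]
for post in rank_feed(posts):
    print(post["source"])
```

Note the design trade-off the paragraph raises: because the flagged post is demoted rather than deleted, this approach limits amplification without outright censorship, though where to set the penalty, and who decides what gets flagged, remain the hard questions.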

References

Cohen, Elizabeth, and John Bonifield. “Facebook to Get Tougher on Anti-Vaxers.” CNN, Cable News Network, 26 Feb. 2019, http://www.cnn.com/2019/02/25/health/facebook-anti-vaccine-content/index.html.

Kahneman, Daniel. Thinking, Fast and Slow. Farrar, Straus and Giroux, 2015.

“Measles Cases and Outbreaks.” Cdc.gov, Centers for Disease Control and Prevention, 25 Feb. 2019, http://www.cdc.gov/measles/cases-outbreaks.html.

“Working to Stop Misinformation and False News.” Facebook.com, 7 Apr. 2017, http://www.facebook.com/facebookmedia/blog/working-to-stop-misinformation-and-false-news.
