The online spread of health-related misinformation has demonstrable negative effects on public health. Antivaccine discourse on social media, for example, has been cited as a contributing factor to the rising number of parents who refuse to vaccinate their children.1 Similarly, rumors that circulated on social media during the 2014 Ebola outbreak generated hostility toward healthcare workers, hindering efforts to contain the epidemic.2
In a viewpoint article published in JAMA, Wen-Ying Silvia Chou, PhD, MPH; April Oh, PhD; and William M.P. Klein, PhD, research scientists at the National Cancer Institute, described ways in which medical professionals can curb the spread of health-related misinformation.3
Misinformation is amplified within “information silos” and “echo chambers,” the authors wrote. Because social media feeds are “personally curated” by each individual, users are less likely to encounter viewpoints that differ from their own, and misinformation spreads easily within these environments. Research also suggests that “falsehoods spread more easily than truths” on social media and other online forums.4 Mistrust in medical institutions further legitimizes health misinformation in online circles; according to a 2018 Gallup poll, just 36% of individuals expressed “adequate confidence” in the medical system.5 Additionally, a 2017 study suggested that 1 in 5 individuals express “skepticism about scientists,” regardless of field.6 To mitigate the spread of misinformation, the authors outlined recommendations for clinical practice, research, and public health.
To curb health misinformation effectively, scientists must first understand how such information is shared. The authors encouraged the “deployment of innovative methods,” such as social network analysis, to address misinformation on social media. Surveillance is needed to characterize specific information silos and identify which interventions may be effective. Scientists must also study the “context of misinformation exchange,” including the platform on which the information was shared, and understand the dynamics among users sharing misinformation before developing strategies. Finally, the “reach” and consequences of particular health messages must be assessed; real-time behavioral data, linkage to medical records, and marketing research can help elucidate how social media users internalize certain information.
Beyond research to identify effective interventions, the authors wrote, the medical community must also provide support and training for interacting with misinformed patients. Clinicians must be equipped to understand and respond to their patients’ concerns rather than “dismissing [them]…as skeptics.” Social media platforms, for their part, should assess the credibility of content before it is disseminated, the authors wrote. Health-related misinformation can undermine efforts to provide proper health care, and platforms that enable its spread warrant sustained attention from medical and public health professionals.
1. Broniatowski DA, Jamison AM, Qi S, et al. Weaponized health communication: Twitter bots and Russian trolls amplify the vaccine debate. Am J Public Health. 2018;108(10):1378-1384.
2. Jones B, Elbagir N. Are myths making the Ebola outbreak worse? CNN. https://www.cnn.com/2014/08/20/world/africa/ebola-myths/. Updated August 25, 2014. Accessed January 8, 2019.
3. Chou W-YS, Oh A, Klein WMP. Addressing health-related misinformation on social media. JAMA. 2018;320(23):2417-2418.
4. Vosoughi S, Roy D, Aral S. The spread of true and false news online. Science. 2018;359(6380):1146-1151.
5. Saad L. Military, small business, police still stir most confidence. Gallup. https://news.gallup.com/poll/236243/military-small-business-police-stir-confidence.aspx. Published June 28, 2018. Accessed January 8, 2019.
6. Funk C. Mixed messages about public trust in science. Issues Sci Technol. 2017;34(1).