Suicide prevention efforts are illustrative of the dilemma at the intersection of evidence-based care and innovative research, according to an opinion piece published in the Annals of Internal Medicine.
John Torous, MD, of the Department of Psychiatry at Beth Israel Deaconess Medical Center at Harvard Medical School in Boston, Massachusetts, and Ian J. Barnett, PhD, assistant professor of biostatistics in the Department of Biostatistics, Epidemiology, and Informatics at the University of Pennsylvania Perelman School of Medicine in Philadelphia, examined the suicide prevention efforts implemented by Facebook. Facebook uses a computer algorithm to assess users’ interactions on the platform and contacts emergency services when it determines that a user may be at risk.
While this can objectively be viewed as a step in the right direction, Drs Torous and Barnett pointed out that the credentials of the Facebook Community Operations team who review the data are unclear, which, they added, “raises the broader issue of what aspects of Facebook’s approach need to be shared in terms of research ethics, clinical research, and public health.”
Drs Torous and Barnett argued that, in the interest of transparency in clinical research, more details of Facebook’s suicide prevention approach must be shared; because the platform’s approach is considered “innovative,” it “likely falls under what a reasonable person would consider research.” In particular, they indicated that Facebook’s protocols lack the key ethical research principle of respect for persons, in the form of participants’ informed consent.
In addition to the ethical issues, Drs Torous and Barnett noted that “the predictive performance of algorithms for detecting suicide risk are best optimized through transparency that enables cooperation with researchers across the field of statistical and machine learning.” They cited the application of a population-level, rather than an individual-level, model to users’ posts, noting that population-level modeling may have difficulty detecting more nuanced language indicative of suicide risk, such as sarcasm.
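For illustration only, the sketch below contrasts a population-level text model (one classifier pooled across all users) with individual-level models (one classifier per user). The posts, labels, user IDs, and use of scikit-learn are hypothetical and do not describe Facebook’s actual system, whose details are not public.

```python
# Hypothetical sketch: population-level vs individual-level text models.
# All data and identifiers are invented; this is not Facebook's algorithm.
from collections import defaultdict

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# (user, post text, label) where 1 = concerning, 0 = not concerning
posts = [
    ("user_a", "great, another wonderful day", 1),  # sarcastic for this user
    ("user_a", "dinner with friends tonight", 0),
    ("user_b", "great, another wonderful day", 0),  # genuinely positive here
    ("user_b", "nothing is going to get better", 1),
]

texts = [text for _, text, _ in posts]
labels = [label for _, _, label in posts]

# Population-level model: identical wording receives an identical risk score,
# so user-specific context such as sarcasm is averaged away.
population_model = make_pipeline(TfidfVectorizer(), LogisticRegression())
population_model.fit(texts, labels)

# Individual-level models: fit separately on each user's own posting history,
# which can capture personal baselines but needs enough labeled data per user.
history = defaultdict(list)
for user, text, label in posts:
    history[user].append((text, label))

individual_models = {}
for user, rows in history.items():
    user_texts = [text for text, _ in rows]
    user_labels = [label for _, label in rows]
    model = make_pipeline(TfidfVectorizer(), LogisticRegression())
    model.fit(user_texts, user_labels)
    individual_models[user] = model

# The pooled model returns one score for the ambiguous post; the per-user
# models can diverge because they condition on each user's own history.
query = ["great, another wonderful day"]
print(population_model.predict_proba(query)[0, 1])
print(individual_models["user_a"].predict_proba(query)[0, 1])
print(individual_models["user_b"].predict_proba(query)[0, 1])
```

In this hypothetical setup, the population-level model assigns the same score to identical wording from different users, whereas the per-user models can differ, which is the limitation the authors highlight with respect to nuanced language such as sarcasm.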
Finally, the article noted that the scope of these efforts “[seem] more fitting for public health departments than a publicly traded company whose mandate is to return value to shareholders.”
“The approach that Facebook is trialing to reduce death by suicide is innovative and deserves commendation for its ambitious goal of using data science to advance public health, but there remains room for refinement and improvements,” Drs Torous and Barnett concluded.
Reference
Barnett I, Torous J. Ethics, transparency, and public health at the intersection of innovation and Facebook’s suicide prevention efforts [published online February 11, 2019]. Ann Intern Med. doi:10.7326/M19-0366