Thursday, October 4, 2018
In his work in mental health and psychiatry, Jordan Smoller, MD, ScD, with the help of colleagues Ben Reis, PhD, and Matthew Nock, PhD, is tapping into the immense potential at the junction of big data and machine learning. His particular data sets? Unfathomable numbers of electronic health records (EHRs). The ultimate goal? A clinical decision support tool that could be used on demand and in real time, when a patient visits a physician, to determine if he or she is at an increased risk of suicide—and to inform or motivate more comprehensive screening and potential intervention by the clinician.
Dr. Smoller is, in fact, well on his way to achieving that aim. He and his team first began working with large longitudinal EHR databases in 2009 and, thanks to support from the Tommy Fuss Fund, were able to spend time using the data to develop a risk-prediction algorithm, or what Dr. Smoller calls a suicide “early warning alert system.” It incorporates more than 30,000 different potential predictors or variables from the EHR, each contributing to an overall patient risk score. On the algorithm’s initial test within the Partners HealthCare system, it was able to detect around 45 percent of suicide attempts, with 90 percent specificity, an average of roughly two to three years in advance.
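The basic idea of a predictor-based risk score can be sketched in a few lines, purely for illustration — the team's actual algorithm draws on tens of thousands of predictors and machine learning, and every feature name, weight, and threshold below is hypothetical:

```python
# Illustrative sketch only (not the team's actual model): a risk score built
# as a weighted sum of binary EHR features, thresholded to flag patients,
# then evaluated for sensitivity and specificity.

def risk_score(features, weights):
    """Sum the weights of the EHR features present for a patient."""
    return sum(weights.get(f, 0.0) for f in features)

def evaluate(scored_patients, threshold):
    """Sensitivity: fraction of attempts flagged.
    Specificity: fraction of non-attempts correctly left unflagged."""
    tp = fp = tn = fn = 0
    for score, attempted in scored_patients:
        flagged = score >= threshold
        if attempted and flagged:
            tp += 1
        elif attempted:
            fn += 1
        elif flagged:
            fp += 1
        else:
            tn += 1
    sensitivity = tp / (tp + fn) if (tp + fn) else 0.0
    specificity = tn / (tn + fp) if (tn + fp) else 0.0
    return sensitivity, specificity

# Hypothetical weights for a tiny handful of the 30,000+ predictors.
WEIGHTS = {
    "mood_disorder": 2.5,
    "substance_abuse": 2.0,
    "psychiatric_medication": 1.5,
    "orthopedic_fracture": 0.5,   # one of the "surprise" predictors
}

# Toy cohort: (EHR features, whether a suicide attempt was later recorded).
patients = [
    ({"mood_disorder", "substance_abuse"}, True),
    ({"orthopedic_fracture"}, False),
    ({"psychiatric_medication", "mood_disorder"}, True),
    (set(), False),
]

scored = [(risk_score(f, WEIGHTS), label) for f, label in patients]
sens, spec = evaluate(scored, threshold=2.0)
```

Raising the threshold trades sensitivity for specificity: fewer attempts are caught, but fewer false alarms are raised — the same trade-off the real tool must balance when deciding whom to flag.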
“The sad fact is that suicide is one of the leading causes of death in this country, and is the second-leading cause of death among young people,” shares Dr. Smoller. What’s more, he notes, instances of suicide are on the rise, yet healthcare providers in his shoes don’t currently have an effective way to forecast risk. “Clinicians essentially do no better than chance at making accurate predictions,” he says. “This is a tremendous opportunity, we think, to use big data for real-world benefit.”
Of the 30,000 variables involved in the algorithm, many of those that turned up on its initial test as significant determiners of risk are factors clinicians could have feasibly identified on their own: mental health conditions, substance abuse issues, or the use of psychiatric medications, as examples. But surprising risk indicators surfaced as well, including certain kinds of infections and a history of specific orthopedic fractures.
“Those are the things that are interesting,” Dr. Smoller points out, “because no human being could process all of that information simultaneously in a clinical encounter.” He further explains, “With big data and machine learning, you have the ability to potentially find patterns or predictive profiles that may incorporate indicators of risk you would never have thought of.” He cautions, though, that while factors may emerge as important, that doesn’t mean they’re causally related to suicide.
Thus, next steps for Dr. Smoller’s team include grappling with how, as the algorithm and forthcoming tool’s developers, to effectively communicate risk to a clinician while also making clear that it doesn’t offer a perfect prediction, and false positives are likely. “It’s really meant to inform rather than replace clinical judgment,” Dr. Smoller clarifies.
He and his team recently began the process of prototyping the app that will actually turn their algorithm into a clinical tool. But Dr. Smoller acknowledges it’s not ready for primetime. They’re continuing to refine and validate the predictive algorithm through performance tests in other healthcare systems across the country (preliminary data indicates it performs as well in these systems as it did at Partners). Eventually, the tool will go through a formal clinical trial so that Dr. Smoller can confidently say whether or not the information it provides will make a positive difference for medical practitioners.
These verification steps may take several years, but he’s willing to wait. “The need for and the clinical importance of this is something that I face regularly in trying to care for patients,” Dr. Smoller says. “Until this opportunity of big data came along, the idea of a real clinical decision support tool wasn’t something I even anticipated,” he reflects, “because it wasn’t feasible.”
Contact Dr. Jordan Smoller at firstname.lastname@example.org.