Generative AI’s earliest applications in medicine have largely focused on curing not patients, but the plague of lost productivity as physicians wrestle with digital documentation. Now, research suggests a way that large language models like ChatGPT could benefit both patients and providers: by automatically extracting a patient’s social needs from the reams of text in their clinical records.
Factors like housing, transportation, financial stability, and community support play a critical role in patients’ health once they leave the doctor’s office. But it takes concerted effort to screen patients for gaps in these so-called social determinants of health; even when screening occurs, that information is usually scattered across the rambling clinical notes that providers write each time a patient has a visit.
As a physician trying to understand a patient’s needs, “you’re trying to do a needle in a haystack type search for clinical information,” said Danielle Bitterman, a radiation oncologist and artificial intelligence researcher at Mass General Brigham. “Patients oftentimes have thousands of notes.”
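The extraction step itself can be framed as a prompting task. The sketch below is purely illustrative, assuming the OpenAI Python client, a hypothetical model choice, prompt, and category list; it is not the method used in the research described here, just one way a language model could be asked to flag social needs mentioned in a clinical note.

```python
# Illustrative sketch only: one way an LLM could be prompted to pull social
# determinants of health (SDOH) out of a free-text clinical note. The model
# name, prompt wording, and category list are assumptions for demonstration.
import json
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

SDOH_CATEGORIES = ["housing", "transportation", "financial stability", "community support"]

def extract_sdoh(note_text: str) -> dict:
    """Ask the model whether a clinical note mentions an unmet need in each SDOH category."""
    prompt = (
        "Read the clinical note below and report, as JSON, whether it mentions "
        f"an unmet need in any of these categories: {', '.join(SDOH_CATEGORIES)}. "
        "Use the category names as keys and true/false as values.\n\n"
        f"Note:\n{note_text}"
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # hypothetical choice; any capable model could be substituted
        messages=[{"role": "user", "content": prompt}],
        response_format={"type": "json_object"},  # ask for machine-readable output
    )
    return json.loads(response.choices[0].message.content)

if __name__ == "__main__":
    note = "Pt reports difficulty attending follow-ups; no reliable car, relies on a neighbor for rides."
    print(extract_sdoh(note))  # e.g., {"housing": false, "transportation": true, ...}
```

In practice, a system like the one the researchers studied would also need to handle thousands of notes per patient, ambiguous or negated mentions, and validation against clinician review before any output reached the chart.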