AI Tool Interprets Echocardiograms in Minutes
Originally published by Yale School of Medicine, written by Rachel Martin
Cardiologists use echocardiography to diagnose a range of functional and structural abnormalities of the heart. Drawing on often more than a hundred videos and images that capture different parts of the heart, echocardiographers make dozens of measurements, such as the heart's size and shape, ventricle wall thickness, and the movement and function of each heart chamber, to assess a patient's heart health.
A new study in JAMA led by Yale School of Medicine researchers in collaboration with The University of Texas at Austin finds that an artificial intelligence (AI)-enabled tool can interpret echocardiograms with a high degree of accuracy in just a few minutes.
“Echocardiography is a cornerstone of cardiovascular care, but it requires a tremendous amount of clinical time from highly skilled readers to review these studies,” said Rohan Khera, assistant professor of medicine and biostatistics and director of the Cardiovascular Data Science Lab (CarDS) at Yale. “We wanted to develop a technology that can assist these very busy echocardiographers to help improve accuracy and accelerate their workflow.”
The researchers found that the AI tool, PanEcho, could perform 39 diagnostic tasks based on multi-view echocardiography, accurately estimating key measurements such as left ventricular ejection fraction and detecting conditions such as severe aortic stenosis and systolic dysfunction. The study builds on earlier work, including a 2023 paper in the European Heart Journal, that demonstrated the technology’s accuracy.
“We developed a tool that integrates information from many views of the heart to automatically identify the key measurements and abnormalities that a cardiologist would include in a complete report,” said Greg Holste, a Ph.D. student in UT’s Chandra Family Department of Electrical and Computer Engineering, who is co-advised by Khera and Texas Engineering professor Zhangyang (Atlas) Wang.
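The exact architecture is detailed in the paper and the open-source release. As a rough sketch of the multi-view, multi-task idea Holste describes, a model might encode each echo clip with a shared video backbone, pool the embeddings across views, and attach one head per reporting task. Everything below, including class names, dimensions, and the two example heads, is an illustrative assumption, not the released PanEcho implementation.

```python
# Illustrative sketch only: a generic multi-view, multi-task echo model.
# Names, shapes, and tasks are hypothetical, not the released PanEcho code.
import torch
import torch.nn as nn


class MultiViewEchoModel(nn.Module):
    def __init__(self, embed_dim: int = 256):
        super().__init__()
        # Shared video encoder: maps each clip to a fixed-size embedding.
        # A real system would use a 3D CNN or video transformer backbone.
        self.encoder = nn.Sequential(
            nn.Conv3d(1, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),
            nn.Flatten(),
            nn.Linear(32, embed_dim),
        )
        # Task-specific heads: one regression task (e.g., ejection fraction)
        # and one classification task (e.g., severe aortic stenosis).
        # The actual tool reports 39 such tasks.
        self.ef_head = nn.Linear(embed_dim, 1)
        self.as_head = nn.Linear(embed_dim, 2)

    def forward(self, clips: torch.Tensor) -> dict:
        # clips: (num_views, channels, frames, height, width)
        per_view = self.encoder(clips)               # (num_views, embed_dim)
        study = per_view.mean(dim=0, keepdim=True)   # pool across views
        return {
            "lvef": self.ef_head(study),
            "severe_as": self.as_head(study).softmax(dim=-1),
        }


model = MultiViewEchoModel()
dummy_study = torch.randn(8, 1, 16, 112, 112)  # 8 views, 16-frame clips
print(model(dummy_study))
```

Pooling across views lets the study-level prediction draw on whichever views actually capture a given structure, which is why integrating many views matters for producing a complete report.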
PanEcho was developed using 999,727 echocardiographic videos recorded between January 2016 and June 2022 from patients in the Yale New Haven Health System (YNHHS). The researchers then validated the tool using studies from 5,130 YNHHS patients as well as three external data cohorts from the Heart and Vascular Center of Semmelweis University in Budapest, Hungary; Stanford University Hospital; and Stanford Health Care.
“The tool can now measure and assess a wide range of heart conditions, making it much more attractive for future clinical use,” said Evangelos K. Oikonomou, a clinical fellow in cardiovascular medicine at Yale and co-first author of the study. “While it is highly accurate, it can be less interpretable than the read from a clinician. It’s still an algorithm, and it requires human oversight.”
Holste was responsible for processing the echocardiogram data, designing and developing the AI model, and comparing its performance on unseen cases with that of human experts.
“This is a perfect example of what can happen when you put engineers and clinicians in the same room, giving them time to speak each other’s languages,” Holste said. “I have worked with [Khera] and [Oikonomou] since the beginning of my Ph.D., so we have an ease that I know can be difficult to come by. This is exactly the kind of interdisciplinary science I want to continue pushing forward with the rest of my career.”
The full model and its weights are openly available, and the research team encourages other investigators to test the model on their own echocardiographic studies and contribute improvements.
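For readers who want to experiment, a minimal loading sketch in PyTorch might look like the following. The repository path, entry-point name, and input shape here are assumptions to be checked against the project's documentation, not a verified API.

```python
# Hedged usage sketch: loading an openly released model via torch.hub.
# The repo path and entry-point name below are assumptions, not verified.
import torch

model = torch.hub.load("CarDSLab/PanEcho", "PanEcho", trust_repo=True)
model.eval()

with torch.no_grad():
    # Dummy clip; the real input shape and preprocessing (view selection,
    # resizing, normalization) must follow the project's documentation.
    clip = torch.randn(1, 3, 16, 224, 224)
    outputs = model(clip)
```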