Artificial intelligence gives stethoscopes a much-needed upgrade
Over its 200 years of existence, the stethoscope has emerged as the ultimate symbol for the medical community. Dangling from the neck of a doctor or nurse, the acoustic device seems to bestow upon the wearer an instant veneer of competence and credibility.
But here’s something people might find surprising, perhaps even shocking: the health care pros who regularly rely on these acoustic devices during physical examinations are not particularly good at using them. Study after study shows that doctors and nurses frequently either miss telltale acoustic signs of heart or lung problems or mistakenly refer patients to cardiologists for further (and expensive) testing.
In other words, when it comes to identifying evidence of a potential heart attack, doctors are literally just playing it by ear.
But thanks to the efforts of Berkeley Engineering alums Tyler Crouch (B.S.’14 ME) and Connor Landgraf (MEng ’14 BioE), the stethoscope is due for a serious upgrade. They are developing algorithms that, combined with a digital stethoscope and artificial intelligence software, can help physicians predict with much greater accuracy whether a patient is at risk for heart disease.
Last month, the U.S. Food and Drug Administration (FDA) cleared nearly half a dozen of their algorithms, designed to detect heart murmurs and atrial fibrillation, an irregular heartbeat that can lead to stroke or blood clots. And in December, the FDA granted a “breakthrough” device designation to an algorithm that analyzes data from the heart’s electrical impulses for evidence of heart failure. Such a designation allows the agency to fast-track significant innovations for approval.
“We’re very excited about the FDA clearances,” Crouch said, noting that they finally realize the potential of technology that he, Landgraf and Jason Bellet began developing seven years ago, while all three were students at Berkeley, and eventually turned into a company called Eko Devices.
Since its debut in 1819, the stethoscope has not changed much. Doctors place a tube-like device against a patient’s chest and listen for sounds like wheezing, stridor and the swirl of blood flowing through the heart’s valves.
At that time, the idea of using sounds generated from within the body to identify heart and lung conditions was revolutionary, even controversial, as some physicians could not see the connection between noise and disease. But over time, the stethoscope became a standard device in hospitals and doctors’ offices.
The stethoscope, however, started to fall out of favor with the invention of diagnostic technologies like radiography, electrocardiography and echocardiography, which produce images of the heart. Eventually, payers decided to stop reimbursing doctors for phonocardiographs, machines that plot the sounds of hearts. And the American Board of Internal Medicine no longer tests doctors’ ability to listen to sounds inside the body.
Diminishing skills
However, the stethoscope holds a significant advantage over the newer technologies: the device is much cheaper and simpler to use. Besides, insurers often require patients to see their primary care doctor before they can get a pricey echocardiogram.
But several studies that have tested doctors’ and nurses’ stethoscope skills over the years have yielded poor to mediocre results.
For example, in 2014, the journal Medical Devices: Evidence and Research published a study by researchers at Texas Tech University, who sought to measure the accuracy of doctors and nurses in detecting basic sounds on a simulator using different brands of stethoscopes.
The study found that participants correctly identified all of the sounds in only 69% of cases. While the researchers found that volunteers using high-end stethoscopes performed better than those using cheaper models, other factors, like a noticeable lack of skill, played a big role.
“Correct detection rates in our volunteers were modest, with a big room for improvement,” the study said. “We continue to believe that mastering a set of basic physical exam and auscultation skills is mandatory in making rational and cost-effective decisions regarding further testing.”
In another study, published last year in the journal PLOS ONE, researchers tested nearly 200 pulmonologists — doctors who specialize in diseases of the respiratory tract — and medical students on 24 breathing-related sounds in children. According to the study, only 24.1% of the students and 36.5% of the physicians correctly identified the sounds.
Berkeley students tackle the stethoscope
The idea for what became Eko came to Landgraf in a course he was taking from Amy Herr, a professor of bioengineering who leads the Herr Lab in Bioinstrumentation for Quantitative Biology and Medicine.
Crouch said they eventually decided to focus on improving the stethoscope, noting the device has “looked pretty much the same” over the years. Digital stethoscopes were available, but doctors had largely shunned them because they were pricey and quickly lost power.
Crouch said they knew they needed to add features to the digital stethoscope to make it less expensive and more convenient. The company, developed under the guidance of SkyDeck, the university’s startup accelerator, eventually came up with a device equipped with noise cancellation technology. It also featured software that allowed physicians to record, review and share data like electrocardiogram rhythms on mobile devices.
But the emergence of artificial intelligence (AI) provided the real spark of innovation.
AI, or more precisely machine learning, allows computers to recognize patterns in vast troves of data. Show a computer enough images of cats and chairs, and it can eventually tell the difference between a cat and a chair.
The same applies to sounds. The heart of a person suffering from a blood clot sounds different from that of a healthy person, because the heart must pump harder to push blood through the obstructed vessel. So Eko developed algorithms to train computers to distinguish the unique acoustic signatures associated with blood clots from other noises in the body.
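The principle can be illustrated with a toy sketch. The code below is not Eko's method — the signals, frequencies and band choices are all invented for illustration — but it shows the basic recipe: synthesize two classes of fake "heart sounds" (a clean low-frequency tone versus one with an added higher-frequency component standing in for a murmur), extract a single spectral feature from each, and learn a threshold that separates the classes.

```python
import numpy as np

def band_energy(signal, rate, lo, hi):
    """Fraction of the signal's spectral energy in the [lo, hi] Hz band."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / rate)
    band = (freqs >= lo) & (freqs <= hi)
    return spectrum[band].sum() / spectrum.sum()

def make_sound(murmur, rate=2000, seconds=1.0, rng=None):
    """Synthetic 'heartbeat': a 50 Hz base tone, plus a 300 Hz
    component standing in for a murmur, plus background noise."""
    if rng is None:
        rng = np.random.default_rng(0)
    t = np.arange(int(rate * seconds)) / rate
    s = np.sin(2 * np.pi * 50 * t)
    if murmur:
        s += 0.6 * np.sin(2 * np.pi * 300 * t)
    return s + 0.1 * rng.standard_normal(t.size)

# "Training": measure the feature on labeled examples of each class
# and place a decision threshold between the class means.
rng = np.random.default_rng(42)
normal = [band_energy(make_sound(False, rng=rng), 2000, 150, 500) for _ in range(20)]
murmur = [band_energy(make_sound(True, rng=rng), 2000, 150, 500) for _ in range(20)]
threshold = (np.mean(normal) + np.mean(murmur)) / 2

# Classify an unseen recording (this one does contain the "murmur" tone).
test = make_sound(True, rng=rng)
predicted_murmur = band_energy(test, 2000, 150, 500) > threshold
```

A production system replaces the single hand-picked feature and threshold with many learned features and a trained model, but the idea — map sound to numbers, then separate the classes — is the same.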
“We want to ID patients who really need treatment and prevent needless referrals,” Crouch said. “Cardiologist offices get clogged when people who don’t need to be there show up.”
So far, the clinical data amassed by Eko suggests the AI-enhanced stethoscopes outperform doctors and nurses who use the basic stethoscope. In a study, the company said Eko’s AI was able to identify heart murmurs with 87% sensitivity and 87% specificity, a much better performance than the average physician.
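For readers unfamiliar with the metrics: sensitivity is the share of diseased patients a test correctly flags, and specificity is the share of healthy patients it correctly clears. A quick worked example — the 1,000-patient cohort below is hypothetical; only the ~87%/87% rates come from the figures above:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Compute (sensitivity, specificity) from confusion-matrix counts.

    Sensitivity = true positives / all diseased patients (tp + fn).
    Specificity = true negatives / all healthy patients (tn + fp).
    """
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical screen of 1,000 patients, 100 of whom have murmurs,
# plugging in the ~87%/87% rates reported above:
sens, spec = sensitivity_specificity(tp=87, fn=13, tn=783, fp=117)
print(round(sens, 2), round(spec, 2))  # 0.87 0.87
```

Note what the invented numbers imply: even at 87% specificity, 117 of the 900 healthy patients in this hypothetical cohort would still be flagged, which is why the false-positive rate matters so much for referral volume.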
In a major study published in Nature Medicine in January 2019, researchers from Eko and the Mayo Clinic tested an algorithm designed to spot asymptomatic left ventricular dysfunction (ALVD) using records from more than 50,000 patients.
Specifically, the algorithm focused on the QRS complex, the portion of the heart’s electrical signal that corresponds to the contraction of the ventricles. ALVD indicates that the left ventricle is not contracting properly, which means the patient’s heart is not pumping blood effectively and could eventually fail.
According to the study, the algorithm correctly identified ALVD with an accuracy rate of 85.7% and sensitivity rate of 86.3%. Several months later, the FDA granted the algorithm breakthrough status.
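To ground the terminology: in an electrocardiogram trace, each QRS complex shows up as a sharp spike (the R-peak), and locating those spikes is the first step in most ECG analysis. The sketch below is a deliberately naive detector run on synthetic data — not Eko's algorithm, and the signal shape and thresholds are invented — but it shows how software begins to parse an ECG.

```python
import numpy as np

def find_r_peaks(ecg, rate, min_gap_s=0.4):
    """Naive R-peak detector: flag samples that exceed an amplitude
    threshold and are local maxima, enforcing a refractory gap so a
    single beat is not counted twice."""
    threshold = 0.5 * ecg.max()
    peaks, last = [], -np.inf
    for i in range(1, len(ecg) - 1):
        if ecg[i] > threshold and ecg[i] >= ecg[i - 1] and ecg[i] > ecg[i + 1]:
            if i - last >= min_gap_s * rate:
                peaks.append(i)
                last = i
    return peaks

# Synthetic "ECG": one stylized R-peak per second over baseline noise.
rate = 250  # samples per second
t = np.arange(10 * rate) / rate
ecg = 0.05 * np.random.default_rng(1).standard_normal(t.size)
for beat in np.arange(0.5, 10, 1.0):
    ecg[int(beat * rate)] += 1.0  # the spike of each QRS complex

peaks = find_r_peaks(ecg, rate)
bpm = 60 * (len(peaks) - 1) / ((peaks[-1] - peaks[0]) / rate)
```

Real detectors (and models like the one in the Mayo Clinic study) work on far messier signals and look at the shape of the complex, not just its timing, but peak-finding of this kind is the conceptual starting point.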
Crouch said there is still much work to be done. The challenge is to make sure the algorithms achieve high enough sensitivity to avoid missing true cases and high enough specificity to avoid false positives.
To that end, the company is trying to create the world’s largest collection of heart sounds for its AI software to pick through. The more data it has, the better the accuracy.
But right now, Crouch said, the data Eko does possess is “only a drop in the bucket.”