Creating A Novice Echo Screening Pathway

We hear a lot about AI’s potential to expand ultrasound to far more users and clinical settings, and a new study out of Singapore suggests that ultrasound’s AI-driven expansion might go far beyond what many of us had in mind.

The PANES-HF trial set up a home-based echo heart failure screening program that equipped a team of complete novices (no experience with echo, or in healthcare) with EchoNous’s AI-guided handheld ultrasound system and Us2.ai’s AI-automated echo analysis and reporting solution.

After just two weeks of training, the novices performed at-home echocardiography exams on 100 patients with suspected heart failure, completing the studies in an average of 11.5 minutes per patient.

When compared to the same 100 patients’ NT-proBNP blood test results and reference standard echo exams (expert sonographers, cart-based echo systems, and cardiologist interpretations), the novice echo AI pathway…

  • Yielded interpretable results in 96 patients 
  • Improved risk prediction accuracy versus NT-proBNP by 30% 
  • Detected abnormal scans (LVEF <50%) with a 0.880 AUC (vs. NT-proBNP’s 0.651-0.690 AUCs)
  • Achieved good agreement with expert clinicians for LVEF <50% detection (κ = 0.742)
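For readers less familiar with the two headline metrics, AUC measures how well a score ranks abnormal scans above normal ones, while Cohen’s kappa measures agreement beyond chance. A minimal sketch with synthetic labels (not the PANES-HF data) shows what each number is computed from:

```python
# Illustrative only: synthetic labels, not the PANES-HF study data.

def auc(truth, scores):
    """AUC = fraction of (positive, negative) pairs ranked correctly."""
    pos = [s for t, s in zip(truth, scores) if t == 1]
    neg = [s for t, s in zip(truth, scores) if t == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def cohen_kappa(a, b):
    """Chance-corrected agreement between two binary raters."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n  # observed agreement
    pa1, pb1 = sum(a) / n, sum(b) / n           # each rater's positive rate
    pe = pa1 * pb1 + (1 - pa1) * (1 - pb1)      # agreement expected by chance
    return (po - pe) / (1 - pe)

truth     = [1, 1, 1, 0, 0, 0, 0, 1, 0, 0]   # reference echo: LVEF < 50%?
ai_scores = [0.91, 0.85, 0.40, 0.20, 0.10, 0.35, 0.05, 0.77, 0.15, 0.55]
ai_calls  = [int(s >= 0.5) for s in ai_scores]

print(auc(truth, ai_scores))        # ranking quality (0.5 = chance, 1.0 = perfect)
print(cohen_kappa(truth, ai_calls)) # agreement beyond chance
```

An AUC of 0.880 therefore means the novice pathway ranked an abnormal scan above a normal one in 88% of such pairs, and κ = 0.742 sits in the range conventionally described as “good” agreement.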

These findings were strong enough for the authors to suggest that emerging ultrasound and AI technologies will enable healthcare organizations to create completely new heart failure pathways. That might start with task-shifting from cardiologists to primary care, but could extend to novice-performed exams and home-based care.

The Takeaway

Considering the rising prevalence of heart failure, the recent advances in HF treatments, and the continued sonographer shortage, there’s clearly a need for more accessible and efficient echo pathways — and this study is arguably the strongest evidence that AI might be at the center of those new pathways.

A Problem with Private Equity

The funniest physician on the internet, ophthalmologist and comedian Dr. Glaucomflecken, sparked quite a debate over private equity’s healthcare impact last week with this banger of a Valentine’s Day tweet:

“Every physician who sells their practice to private equity is choosing to make health care worse for everybody. I hope the money helps you sleep at night, because you have made life worse for every single patient and employee walking into your PE Daddy’s practice.”

Within three days, Dr. Glaucomflecken’s attack on healthcare PE garnered 1.2M views, 1,150 retweets, and 12k likes, while inspiring some telling conversations about private equity’s impact on radiology.

RadTwitter’s many private equity critics…

  • Celebrated one of their biggest concerns gaining viral attention
  • Warned that this trend is putting MBAs in control of patient care
  • Theorized that PE is “driving physician satisfaction into the ground”
  • Highlighted PE-backed rad practices’ staffing/retention challenges
  • Joked that Dr. Glaucomflecken is now uninvited from the ACR meeting

Meanwhile, a few brave radiology PE leaders and defenders…

  • Countered that Dr. Glaucomflecken’s post was unfairly broad
  • Emphasized the challenges that private practices face on their own
  • Reasoned that health systems are just as money-driven, and worse at leading practices
  • Contended that PE improves radiology access in rural areas (others disputed this)
  • Implied that PE is “in the arena” working to improve care, while critics sit on the sidelines

The hundreds of other comments from non-radiologists in the Dr. Glaucomflecken thread made many of the same arguments about their own specialties, while revealing an overall consensus that the healthcare incentive system is flawed, that insurer influence is playing a big role in practice consolidation, and that many physician practices aren’t in a position to sell exclusively to physician-owned/led organizations.

The Takeaway

Regardless of where you stand in the healthcare private equity debate, Dr. Glaucomflecken’s Twitter responses make it very clear that providers are concerned about the state of U.S. healthcare economics. That same discussion thread also might contain more ideas about areas that the U.S. healthcare system should improve than any published report we’ll cover this year.

Lumitron Raises $20M, Plans to Transform X-Ray

Lumitron Technologies secured another $20M in funding to expand its manufacturing and commercialization capabilities as it works its way to a $1B-plus IPO and the launch of what it calls the biggest breakthrough in the history of X-ray technology.

Lumitron’s HyperVIEW EBCS imaging system boasts 100x greater image resolution and 100x lower radiation exposure than CT, while matching the size and price tag of a current higher-end CT scanner.

  • The HyperVIEW EBCS’ ability to image at the cellular level could also support next-gen “flash radiotherapies” that directly target cancerous cells. 

Lumitron is clearly bullish about its HyperVIEW EBCS scanner, forecasting that it will be used in “every aspect of medicine” and an array of industrial applications.

  • The HyperVIEW’s rollout schedule is equally ambitious, targeting use at research universities and hospitals within the next year and clinical readiness within just two years.

Skeptics might find plenty of reasons to question whether Lumitron can actually achieve these lofty goals. For starters, Lumitron lists just four employees on LinkedIn, the general public has only seen artistic renderings of the HyperVIEW scanner, and launching a completely new modality might be one of the most challenging acts in the business of medical imaging.

  • However, Lumitron also comes with plenty of credibility. The company was founded by well-established medtech and research leaders, its technology was developed at the famous Lawrence Livermore National Laboratory, and it now has $20M to fund its next steps. 

The Takeaway

We cover groundbreaking new imaging technologies all the time, but it’s exceptionally rare for those technologies to actually approach commercialization, especially from a relatively unknown company. 

Because of that lack of precedent, hospitals will need to see a ton of evidence before they start making room for their new HyperVIEW scanners. However, if the scanners truly outperform modern CTs by 100x (with the same price and footprint), the Lumitron HyperVIEW might actually prove to be the biggest breakthrough in the history of X-ray.

GE HealthCare Adds Ultrasound Guidance with Caption Health Acquisition

GE HealthCare took a major step towards expanding its ultrasound systems to new users and settings, acquiring AI guidance startup Caption Health.

GE plans to integrate Caption’s AI guidance technology into its ultrasound platform, starting with POCUS devices and echocardiography exams. GE specifically emphasized how its Caption integration will help streamline echo adoption among novice operators and bring heart failure exams into “doctors’ offices, the home, and alternate sites of care.”

  • That’s particularly notable given healthcare’s major shift outside of hospital walls, especially considering that Caption has already developed a unique home echo exam and virtual diagnosis service. 
  • It’s also another sign that GE sees big potential for at-home ultrasound, coming less than a year after investing in home maternity ultrasound startup Pulsenmore.

GE didn’t disclose the tuck-in acquisition’s value. However, Caption is relatively large for an AI startup (79 employees on LinkedIn, >$62M raised) and is arguably the most established company in the ultrasound guidance segment (FDA & CE approved, CMS-reimbursed, notable alliances).

  • The fact that GE HealthCare has already made two acquisitions since spinning off in early January (after a 16-month pause) also suggests that the newly independent medtech giant has returned to M&A mode.

Of course, the acquisition is another sign that the imaging AI consolidation trend remains in full swing, marking at least the ninth AI startup acquisition since January 2022 and the third so far in 2023.

  • One contributor to that AI consolidation surge appears to be ultrasound hardware vendors acquiring AI guidance companies; GE’s Caption acquisition comes about six months after Exo’s acquisition of Medo AI.

The Takeaway

Ultrasound’s potential expansion to new users and clinical settings could create the kind of growth that most modalities only experience once in their lifetime (or never experience), and ease of use might dictate how far ultrasound is able to expand. That could make this acquisition particularly significant for GE HealthCare and for ultrasound’s path towards far broader adoption.

Radiology NLP’s Efficiency and Accuracy Potential

The past week brought two high-profile studies underscoring radiology NLP’s potential to improve efficiency and accuracy, showing how the language-based technology can give radiologists a reporting head start and let them enjoy the benefits of AI detection without the disruptions.

AI + NLP for Nodule QA – A new JACR study detailed how Yale New Haven Hospital combined AI and NLP to catch and report more incidental lung nodules in emergency CT scans, without impacting in-shift radiologists. The quality assurance program used a CT AI algorithm to detect suspicious nodules and an NLP tool to analyze radiology reports, flagging only the cases that AI marked as suspicious but the NLP tool marked as negative.

  • The AI/NLP program processed 19.2k CT exams over an 8-month period, flagging just 50 cases (0.26%) for a second review.
  • Those flagged cases led to 34 reporting changes and 20 patients receiving follow-up imaging recommendations. 
  • Just as notably, this semi-autonomous process helped rads avoid “thousands of unnecessary notifications” for non-emergent nodules.
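The program’s low-noise design comes down to one rule: only surface cases where the image AI and the report disagree. A hedged sketch of that flagging logic (field and function names are hypothetical, not from the JACR paper):

```python
# Sketch of the QA flagging rule described above: flag only cases where the
# CT AI marked a suspicious nodule but the NLP read the report as negative.
# Field/function names are illustrative, not from the Yale implementation.

def flag_for_review(cases):
    """cases: dicts with 'ai_suspicious' and 'report_mentions_nodule' booleans."""
    return [c for c in cases
            if c["ai_suspicious"] and not c["report_mentions_nodule"]]

exams = [
    {"id": 1, "ai_suspicious": True,  "report_mentions_nodule": True},   # already reported
    {"id": 2, "ai_suspicious": True,  "report_mentions_nodule": False},  # discordant
    {"id": 3, "ai_suspicious": False, "report_mentions_nodule": False},  # no finding
]
print([c["id"] for c in flag_for_review(exams)])  # only the discordant case is flagged
```

Because concordant cases (AI-positive nodules the radiologist already reported, and AI-negative scans) are filtered out, in-shift radiologists never see them, which is how the program held its flag rate to 0.26%.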

NLP Auto-Captions – JAMA highlighted an NLP model that automatically generates free-text captions describing CXR images, streamlining the radiology report writing process. A Shanghai-based team trained the model using 74k unstructured CXR reports labeled for 23 different abnormalities, and tested with 5,091 external CXRs alongside two other caption-generating models.

  • The NLP captions reduced radiology residents’ reporting times compared to when they used a normal captioning template or a rule-based captioning model (283 vs. 347 & 296 seconds), especially with abnormal exams (456 vs. 631 & 531 seconds). 
  • The NLP-generated captions also proved to be most similar to radiologists’ final reports (mean BLEU scores: 0.69 vs. 0.37 & 0.57, on a 0-1 scale).
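For context on those BLEU numbers: BLEU compares a generated caption to a reference text via n-gram overlap, penalizing captions that are too short. A toy sketch of the core idea, using made-up example sentences (real evaluations typically use a library implementation such as NLTK’s):

```python
# Toy sketch of BLEU's core idea: modified n-gram precision with a brevity
# penalty. Example sentences are invented, not from the JAMA study.
import math
from collections import Counter

def ngrams(tokens, n):
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu(candidate, reference, max_n=2):
    cand, ref = candidate.split(), reference.split()
    precisions = []
    for n in range(1, max_n + 1):
        c, r = ngrams(cand, n), ngrams(ref, n)
        overlap = sum((c & r).values())            # clipped n-gram matches
        total = max(sum(c.values()), 1)
        precisions.append(max(overlap, 1e-9) / total)
    geo_mean = math.exp(sum(math.log(p) for p in precisions) / max_n)
    bp = min(1.0, math.exp(1 - len(ref) / max(len(cand), 1)))  # brevity penalty
    return bp * geo_mean

draft = "no acute cardiopulmonary abnormality"
final = "no acute cardiopulmonary abnormality seen"
print(round(bleu(draft, final), 3))
```

So a mean BLEU of 0.69 means the NLP captions shared most of their wording with the radiologists’ final reports, leaving residents with relatively little to edit.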

The Takeaway

These are far from the first radiology NLP studies, but the fact that these implementations improved efficiency (without sacrificing accuracy) or improved accuracy (without sacrificing efficiency) deserves extra attention at a time when trade-offs are often expected. Also, considering that everyone just spent the last month marveling at what ChatGPT can do, it might be a safe bet that even more impressive language and text-based radiology solutions are on the way.

Whole-Body Scanning’s Star Power

The proactive whole-body scanning segment gained even more celebrity-driven momentum last week with the launch of Neko Health, a Sweden-based startup cofounded by Spotify CEO Daniel Ek.

Neko Health launches with the goal of improving early disease detection, allowing physicians to focus on preventive care and reducing late detection’s social and economic impact.

  • The $190 exams combine a 360-degree body scan, cardiovascular scans, sensors, and blood tests to collect 50M data points (“skin, heart, vessels, respiration, microcirculation and more”) that are analyzed with AI to assess patients’ unique risks. 

Neko Health’s cardiovascular exam includes cardiac ultrasound (among other technologies), but its other scanners are based on “cameras, lasers, and radars,” and aren’t the type of modalities that most of you associate with whole-body scanning (no MRI or CT).

  • That said, Neko’s launch prompted the same type of radiologist backlash that we typically see when new whole-body imaging companies emerge, and Neko’s exams could still lead to the cascade of follow-ups that radiologists are concerned about.

Unfortunately for those concerned radiologists, the general public pays much more attention to the rich and famous than what folks are upset about on RadTwitter, and it seems that elites love proactive whole-body exams… 

  • Spotify’s Daniel Ek co-founded Neko (in case you missed that part)
  • Whole-body MRI startup Prenuvo is backed by some A-list investors (Apple’s Tony Fadell, Google’s Eric Schmidt, supermodel Cindy Crawford)
  • AI-driven proactive MRI company Ezra’s investor list is full of execs and entrepreneurs, rather than the VCs that imaging startups typically rely on
  • Whole-body scans have also been endorsed by some very influential celebrities (Oprah, Kim Kardashian, Chamath Palihapitiya, Paris Hilton, Kevin Rose)

Outside of the excellent celebrity endorsement work that Hologic has done for breast cancer screening, we don’t see that type of star power in traditional areas of medical imaging. 

The Takeaway

Neko Health largely steers clear of radiology’s turf from a modality perspective, but whole-body scanning’s recent influx of funding, innovations, and celebrity-driven awareness seem very relevant to radiology.

Understanding AI’s Physician Influence

We spend a lot of time exploring the technical aspects of imaging AI performance, but little is known about how physicians are actually influenced by the AI findings they receive. A new Scientific Reports study addresses that knowledge gap, perhaps more directly than any other research to date. 

The researchers provided 233 radiologists (experts) and internal and emergency medicine physicians (non-experts) with eight chest X-ray cases each. The CXR cases featured correct diagnostic advice, but were manipulated to show different advice sources (generated by AI vs. by expert rads) and different levels of advice explanations (only advice vs. advice w/ visual annotated explanations). Here’s what they found…

  • Explanations Improve Accuracy – When the diagnostic advice included annotated explanations, accuracy improved for both the IM/EM physicians and the radiologists (+5.66% & +3.41%).
  • Non-Rads with Explainable Advice Rival Rads – Although the IM/EM physicians performed far worse than rads when given advice without explanations, they were “on par with” radiologists when their advice included explainable annotations (see Fig 3).
  • Explanations Help Radiologists with Tough Cases – Radiologists gained “limited benefit” from advice explanations with most of the X-ray cases, but the explanations significantly improved their performance with the single most difficult case.
  • Presumed AI Use Improves Accuracy – When advice was labeled as AI-generated (vs. rad-generated), accuracy improved for both the IM/EM physicians and radiologists (+4.22% & +3.15%).
  • Presumed AI Use Improves Expert Confidence – When advice was labeled as AI-generated (vs. rad-generated), radiologists were more confident in their diagnosis.

The Takeaway

This study provides solid evidence supporting the use of visual explanations, and bolsters the increasingly popular theory that AI can have the greatest impact on non-experts. It also revealed that physicians trust AI more than some might have expected, to the point where physicians who believed they were using AI made more accurate diagnoses than they would have if they were told the same advice came from a human expert.

However, more than anything else, this study seems to highlight the underappreciated impact of product design on AI’s clinical performance.
