Once an AI algorithm has been approved and moves into clinical use, how should its performance be monitored? This question was top of mind at last week’s meeting of the FDA’s new Digital Health Advisory Committee.
AI has the potential to radically reshape healthcare, helping clinicians manage more patients with fewer staff and resources.
- But AI also poses a regulatory challenge: because algorithms can keep learning after deployment, within a few years an algorithm may behave very differently from the version the FDA originally authorized. The problem is especially acute with generative AI.
This conundrum was a point of discussion at last week’s DHAC meeting, which was called specifically to focus on regulation of generative AI, and could result in new rules covering all AI algorithms. (An executive summary that outlines the FDA’s thinking is available for download.)
- The stakes are high: FDA Commissioner Robert Califf, MD, opened the meeting by expressing concern that AI's financial impact is taking precedence over its value to patients and clinicians.
Radiology was well represented at DHAC, which is understandable given that the specialty accounts for the lion's share of authorized algorithms (73% of 950 devices at last count).
- A half-dozen radiology AI experts gave presentations over two days, including Parminder Bhatia of GE HealthCare; Nina Kottler, MD, of Radiology Partners; Pranav Rajpurkar, PhD, of Harvard; and Keith Dreyer, DO, PhD, and Bernardo Bizzo, MD, PhD, both of Mass General Brigham and the ACR’s Data Science Institute.
Dreyer and Bizzo directly addressed the question of post-market AI surveillance, discussing ongoing efforts to track AI performance, including …
- The ACR’s launch earlier this year of the ACR Recognized Center for Healthcare-AI (ARCH-AI), a new quality assurance and certification program.
- Assess-AI, an initiative within ARCH-AI launched last week that provides a central data registry for monitoring AI algorithm performance in real-world clinical settings (see the sketch after this list).
- The Healthcare AI Challenge, a community for healthcare AI validation and monitoring run as a collaboration among the ACR, MGB, and several other academic institutions.
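
To make the idea of registry-based performance monitoring a bit more concrete, here's a minimal, purely illustrative sketch of the kind of check such a program might run: comparing an algorithm's live metrics against its approval-time baseline and flagging degradation. The class names, thresholds, and numbers below are hypothetical and are not drawn from Assess-AI, ARCH-AI, or any FDA guidance.

```python
# Hypothetical sketch of post-market AI performance monitoring: compare an
# algorithm's live metrics against its approval-time baseline and flag drift.
# All names, thresholds, and numbers are illustrative only.
from dataclasses import dataclass


@dataclass
class PerformanceSnapshot:
    """Aggregated reading results for one algorithm over one reporting period."""
    true_positives: int
    false_positives: int
    true_negatives: int
    false_negatives: int

    @property
    def sensitivity(self) -> float:
        return self.true_positives / (self.true_positives + self.false_negatives)

    @property
    def specificity(self) -> float:
        return self.true_negatives / (self.true_negatives + self.false_positives)


def flag_drift(baseline: PerformanceSnapshot,
               current: PerformanceSnapshot,
               tolerance: float = 0.05) -> list[str]:
    """Return the metrics that have degraded beyond the allowed tolerance."""
    alerts = []
    if baseline.sensitivity - current.sensitivity > tolerance:
        alerts.append("sensitivity")
    if baseline.specificity - current.specificity > tolerance:
        alerts.append("specificity")
    return alerts


# Example: an algorithm validated at ~92% sensitivity now reads closer to 84%.
baseline = PerformanceSnapshot(92, 10, 90, 8)
current = PerformanceSnapshot(84, 12, 88, 16)
print(flag_drift(baseline, current))  # -> ['sensitivity']
```

A real registry would of course aggregate far richer data across sites, scanners, and patient populations, but the underlying question is the same one DHAC was asking: is the deployed algorithm still performing the way it did when it was authorized?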
The Takeaway
Last week’s DHAC meeting offers a fascinating glimpse into the issues the FDA is wrestling with as it contemplates stronger regulation of generative AI. Fortunately, radiology has blazed a trail by setting up structures like ARCH-AI and Assess-AI to monitor AI performance, and the FDA is likely to follow the specialty’s lead as it develops its regulatory framework.