Unpacking the Biden Administration’s New AI Order

It seems like watershed moments in AI are happening on a weekly basis now. This time, the big news is the Biden Administration’s sweeping executive order that directs federal regulation of AI across multiple industries – including healthcare. 

The order comes as AI is becoming a clinical reality for many applications. 

  • The number of AI algorithms cleared by the FDA has been surging, and clinicians – particularly radiologists – are getting access to new tools on an almost daily basis.

But AI’s rapid growth – and in particular the rise of generative AI technologies like ChatGPT – has raised questions about its future impact on patient care and whether the FDA’s existing regulatory structure is suitable for such a new technology. 

The executive order appears to be an effort to get ahead of these trends. When it comes to healthcare, its major elements are summarized in a succinct analysis of the plan by Health Law Advisor. In short, the order: 

  • Calls on HHS to work with the VA and Department of Defense to create an HHS task force on AI within 90 days
  • Requires the task force to develop a strategic plan within a year that could include regulatory action regarding the deployment and use of AI for applications such as healthcare delivery, research, and drug and device safety
  • Orders HHS to develop a strategy within 180 days to determine if AI-enabled technologies in healthcare “maintain appropriate levels of quality” – basically, a review of the FDA’s authorization process
  • Requires HHS to set up an AI safety program within a year, in conjunction with patient safety organizations
  • Tells HHS to develop a strategy for regulating AI in drug development

Most analysts are viewing the executive order as the Biden Administration’s attempt to manage both risk and opportunity. 

  • The risk is that AI developers lose control of the technology, with consequences such as patients being harmed by inaccurate AI. The opportunity is for the US to become a leader in AI development by developing a long-term AI strategy. 

The Takeaway

The question is whether an industry that’s as fast-moving as AI – with headlines changing by the week – will lend itself to the sort of centralized long-term planning envisioned in the Biden Administration’s executive order. Time will tell.

Trained to Underdiagnose

A new Nature study suggests that imaging AI models might underdiagnose patient populations who are also underdiagnosed in the real world, revealing new ethical and clinical challenges for AI development, regulation, and adoption.

The Study – The researchers trained four AI models to predict whether images would have positive diagnostic findings, using three large and diverse public CXR datasets (one model per dataset, plus one on the combined dataset; 707k total images). They then analyzed model performance across various patient populations.

The Underdiagnosed – The AI models were most likely to underdiagnose patients who are female, young (0-20yrs), Hispanic and Black, and covered by Medicaid (low-income). AI underdiagnosis rates were even more extreme among patients who belonged to multiple underserved groups, such as Hispanic females or younger Black patients.

The Overdiagnosed – As you might expect, healthy patients who were incorrectly flagged by the AI models as unhealthy were usually male, older, White, and higher income.
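The subgroup analysis described above essentially compares false negative rates (missed positive findings) across patient groups. A minimal sketch of that kind of audit is below; the group labels, records, and function name are hypothetical illustrations, not the study's actual code or data.

```python
# Minimal sketch of a subgroup underdiagnosis audit: compute the false
# negative rate (truly positive cases the model called negative) per group.
# All records here are hypothetical, not taken from the study.

def underdiagnosis_rate(records):
    """FNR per group, given (group, true_label, predicted_label) tuples."""
    totals = {}  # group -> [false_negatives, positive_cases]
    for group, y_true, y_pred in records:
        if y_true == 1:  # only truly positive cases can be underdiagnosed
            fn, pos = totals.setdefault(group, [0, 0])
            totals[group] = [fn + (1 if y_pred == 0 else 0), pos + 1]
    return {g: fn / pos for g, (fn, pos) in totals.items()}

# Hypothetical records: (group, true label, model prediction)
records = [
    ("female", 1, 0), ("female", 1, 1), ("female", 1, 0), ("female", 1, 1),
    ("male",   1, 1), ("male",   1, 1), ("male",   1, 0), ("male",   1, 1),
]
print(underdiagnosis_rate(records))  # {'female': 0.5, 'male': 0.25}
```

A gap like the one printed here (the model missing twice as many positives in one group) is the pattern the researchers flagged, and it would be invisible to an aggregate accuracy metric that pools all patients together.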

The Clinical Impact – In clinical use, a model like this would result in traditionally underserved patients experiencing more missed diagnoses and delayed treatments, while traditionally advantaged patients might undergo more unnecessary tests and treatments. And we know from previous research that AI can independently detect patient race in scans (even if we don’t know why).

The Takeaway – AI developers have been working to reduce racial/social bias in their models by using diverse datasets, but it appears that they could be introducing more systemic biases in the process (or even amplifying them). These biases certainly aren’t AI developers’ fault, but they still add to the list of data source problems that developers will have to solve.
