Could AI increase clinician burnout? The answer is yes… and the solution is human beings

By Brian Murphy
Could artificial intelligence (AI) increase physician burdens—and burnout?
This headline from Healthcare Dive (see the reference below for the full article) is provocative … but might not be clickbait.
This is admittedly a complex issue with a lot of “it depends.”
It depends on what product you’re using.
It depends on what your organizational goals are.
It even depends on how you define AI.
But the core issue is this: if AI is used solely as a revenue-enhancing tool so providers can see more patients, it will inevitably lead to burnout—full stop.
Everyone (including mid-revenue leadership) is in a rush to implement AI-powered tools, which are being pitched as a panacea for struggling healthcare organizations.
However, the technology isn’t yet mature enough to be trusted on its own. For example, generative AI occasionally hallucinates, i.e., generates wrong answers to prompts or manufactures references.
That means the output must always be checked. And when the output is a suspected or likely diagnosis, an MD must confirm it.
You can see the potential problem here: a bottleneck. AI tools generate notes faster, putting more, and more complex, decisions in front of physicians, leading to burnout.
If accuracy is the goal, fantastic. If a true reflection of patient complexity is the goal, great.
If giving physicians a sane space to practice medicine is the goal, wonderful. I’m particularly encouraged by ambient listening and virtual scribes, which can greatly ease the end-of-day drudgery of chart completion.
The article lists some fine examples. “Abridge, a startup that uses ambient listening and generative AI to automate clinical notetaking, and Nabla, another ambient AI assistant, both claim to save providers about two hours each day. Physicians using [Oracle] see a 30% decrease in documentation time, according to the company. Meanwhile, Microsoft says its AI documentation product saves doctors five minutes on average per patient visit.”
We should celebrate anything that eases documentation workload and stems provider burnout.
BUT… if the goal is more, you’ll get more. More patients seen, more revenue. And also more stress, more fractured MDs, and probably more denials. Payers will have an equal and opposite reaction to more.
Safe, effective AI use is a moving target. HHS has developed a comprehensive AI strategy that advocates for AI to augment clinical decision-making rather than replace it, ensuring that healthcare professionals maintain ultimate responsibility for patient diagnoses and treatment decisions.
Perhaps someday AI output can be fully trusted… but not yet.
AI is here to stay. But this article serves as a reminder to use these tools with caution. This may mean turning off some functionality, or putting mindful coders or clinical documentation integrity (CDI) professionals in between to eliminate the noise.
TL;DR: AI should allow physicians to take a breath, not do more (also a reminder of why we need to keep moving away from the fee-for-service, or FFS, rat race).
Reference
- Healthcare Dive, “Could Artificial Intelligence Increase Clinician Burden”: https://www.healthcaredive.com/news/could-artificial-intelligence-increase-clinician-burden/741660/