Years ago, physicians in the intensive care unit kept an average of seven pieces of data in their heads. That number has since grown to about 1,300.
“It is impossible for the human brain to be able to manage all of that information that's coming to us,” noted Margaret Lozovatsky, MD, a pediatric hospitalist and the AMA’s vice president of digital health innovations.
Augmented intelligence (AI)—often called artificial intelligence—has the potential to present that data to physicians in a way that enables them to make the clinical decisions they’re trained to make.
“I believe that AI can get us to a place where the information will be presented to the right person at the right time. They can use their clinical expertise and focus on patient care, which is what we all hope to do,” said Dr. Lozovatsky, who joined AMA member Tina Shah, MD, MPH, for a session on health AI use at the most recent International Conference of Physician Health in Nova Scotia, Canada.
The session was moderated by Michael Tutty, PhD, MHA, the AMA’s group vice president for professional satisfaction and practice sustainability.
No one is suggesting that AI technology will replace doctors in making clinical decisions, Dr. Lozovatsky clarified. “We're suggesting that it can be an additive co-pilot to the clinicians making those decisions.”
From AI implementation to EHR adoption and usability, the AMA is fighting to make technology work for physicians, ensuring that it is an asset to doctors—not a burden.
What to consider when investing in AI
Evaluating health AI technologies involves many steps, said Dr. Lozovatsky. There's the value equation: how AI will address all the factors within a health care system and a practice. How will it affect quality-of-care conversations, the patient and clinician experience, and health equity? Cost is another factor.
What problem is AI solving? Is it reducing physician workload?
“Tied into that is: Is it integrated into my clinician's workflow?” added Dr. Shah, chief clinical officer at Abridge, a health care technology company that specializes in generative AI platforms.
Looking at AI technologies within this framework can determine whether AI makes sense in a specific health care environment.
AI “is a team sport”
Two things make AI work: the people and the process, said Dr. Lozovatsky. “And while it's fun to talk about the vision of what is to happen, it's the hard work of setting up governance in your organization that truly makes these implementations successful.”
Governance addresses what skill sets are needed in the practice or organization to evaluate these tools.
There's the middle layer of governance, where a lot of the design work happens, and an executive layer, which outlines a clear direction and vision for implementing these tools. Having clinical staff who understand technology and partner with other leaders is also important.
“It is a team sport,” she said.
How AI can improve clinical workflows
In considering AI implementations, one approach is to start with low-risk, high-gain areas such as administrative tasks, said Dr. Lozovatsky.
AI has been effective in reducing the documentation burden, such as drafting clinical notes. This isn't just happening in pilot form, but at scale in many institutions, said Dr. Shah, whose own company has partnered in scaling AI across the entire Kaiser Permanente system, one of the largest health systems in the U.S.
“Allowing us to take away this extra work so that the clinician can just talk to the patient is really huge,” she said.
Many physicians worry about what AI is going to get wrong, said Dr. Lozovatsky.
“What is it going to miss when I have that encounter with a patient?” The reality is humans miss a lot, are biased and often make mistakes. “The real question to me is not necessarily is this tool perfect, but is this tool going to be better than if I was practicing without the tool?”
A good example of this is radiology. AI tools are now identifying incidental findings that could have been missed, she added.
The AMA has developed advocacy principles that address the development, deployment and use of health care AI, with particular emphasis on:
- Health care AI oversight.
- When and what to disclose to advance AI transparency.
- Generative AI policies and governance.
- Physician liability for use of AI-enabled technologies.
- AI data privacy and cybersecurity.
- Payer use of AI and automated decision-making systems.
What the future looks like for AI
As a hospitalist, Dr. Lozovatsky often manages 20-plus patients at the same time. “You’re trying to make sure that there's no risk of deterioration and you're balancing the orders and the rounds and the documentation and all of those things.”
She envisions a time when integrated AI tools will be able to alert her in real time if a patient is deteriorating. “You will receive the information that you need that's tailored to your specialty, to the care of the patient in that moment.”
Dr. Shah’s company, Abridge, has been using an “ambient AI” technology that listens to a conversation between a clinician and a patient. “In real time, it figures out what the medical pieces are; it generates the medical documentation and generates documentation for the patient as well. That's an example of something I'm so excited about.”
She also touted the clinical decision support capabilities of the Abridge platform, which is able to curate data and prior research for physicians and other health professionals so that a wealth of information is at their fingertips.