As augmented intelligence (AI) becomes more commonplace in physician practices, it’s important that the technology is integrated in a way that is helpful for patients and doctors instead of adding to office burdens.
With that in mind—along with the fact that nearly two out of every three physicians in 2024 reported that they use health care AI, a 78% jump from 2023—the AMA has developed a new toolkit that outlines steps physician practices can follow to help ensure AI technologies are implemented into health care settings in a safe, ethical and responsible way.
The AMA STEPS Forward® “Governance for Augmented Intelligence” toolkit, developed in collaboration with Manatt Health, is a comprehensive eight-step guide for health care systems to establish a governance framework to implement, manage and scale AI solutions.
“AI is becoming integrated into the way that we deliver care. The technology is moving very, very quickly. It’s moving much faster than we are able to actually implement these tools, so setting up an appropriate governance structure now is more important than it’s ever been because we have never seen such quick rates of adoption,” said Margaret Lozovatsky, MD, vice president of digital health innovations at the AMA.
“We doubled the number of physicians who are using AI from 2023 to 2024, which we found in our AI survey, and that rate of change is the fastest that any of us has seen in our career,” Dr. Lozovatsky added. “That’s why taking the time to slow down to set up an appropriate process to be able to vet these tools and to make them meaningful from a clinical perspective right now is so critical, such that you can set up your organization or practice for success in the future.”
From AI implementation to EHR adoption and usability, the AMA is fighting to make technology work for physicians, ensuring that it is an asset to doctors.
Empowering practices
AI—commonly referred to as artificial intelligence—can be helpful with clinical and administrative tasks, including summarizing medical notes, assisting in diagnosis, and detecting and classifying the likelihood of future adverse events. And there is excitement in the medical world about the transformative potential AI has to enhance diagnostic accuracy, personalize treatments, reduce administrative and documentation burden, and speed up advances in biomedical science.
But “at the same time, there is concern about AI’s potential to worsen bias, increase privacy risks, introduce new liability and offer seemingly convincing yet ultimately incorrect conclusions that could affect patient care,” the toolkit explains.
Proper governance can address both the excitement and concerns by empowering health systems to:
- Manage tool identification and deployment.
- Standardize risk assessment and mitigation strategies.
- Maintain comprehensive documentation.
- Ensure safe applications with robust oversight.
- Decrease physician burnout.
- Promote collaboration and alignment across an institution.
8 steps to establish governance
In addition to laying out some key AI concepts that physicians should understand and explaining the challenges that a strong governance model can help address, the AMA toolkit breaks down the eight steps that health systems need to take to establish AI governance.
1. Establish executive accountability and a governance structure. This step looks at the importance of CEO- and board-level commitment and explores the factors organizations may need to consider when establishing an AI governance structure.
2. Form a working group to detail priorities, processes and policies. Learn who should be included, what the group may be responsible for and what topics the group may want to discuss.
3. Assess the current state and establish priorities. Discover how to inventory AI activity and articulate an AI priority framework.
4. Develop AI policies. This step lays out what an AI policy should, at minimum, include and provides a link to a model AI policy. It also offers guidance on how to review existing policies and procedures to determine whether any revisions or cross-references to the AI policy are necessary.
5. Define project intake, vendor evaluation and assessment processes. Discover how to standardize the intake and evaluation process for AI tools, which can ensure safety and efficient resource use, prevent duplicative efforts, and maintain consistency.
6. Update standard planning and implementation processes. This step provides a checklist the working group can use to coordinate with other technology committees and the project management office to review and update standard procedures to incorporate AI-specific considerations.
7. Establish an oversight and monitoring process. Learn what needs to happen to regularly monitor AI tools to validate their performance and to identify and resolve any risks.
8. Support AI organizational readiness. Understand how to involve stakeholders across the health system to make practical adjustments to current operations in anticipation of the launch of new AI tools. A chart helps health systems understand the activities that various parts of the organization should prepare for.
Beyond the toolkit
Discover how members of the AMA Health System Program are using AI to make meaningful change.
In addition to fighting on the legislative front to help ensure that technology is an asset to physicians and not a burden, the AMA has developed advocacy principles (PDF) that address the development, deployment and use of health care AI, with particular emphasis on:
- Health care AI oversight.
- When and what to disclose to advance AI transparency.
- Generative AI policies and governance.
- Physician liability for use of AI-enabled technologies.
- AI data privacy and cybersecurity.
- Payer use of AI and automated decision-making systems.
Learn more with the AMA about the emerging landscape of health care AI. Also, explore how to apply AI to transform health care with the “AMA ChangeMedEd® Artificial Intelligence in Health Care Series.”