Beyond the buzz, state lawmakers weigh in on health care AI
The “era of AI” is here. States seek to mandate AI transparency and regulate clinical decision-making and payer use.
Tim Storey, the CEO of the National Conference of State Legislatures, has joked that the term “AI” should be included in the title of every session of his organization’s annual legislative summit to ensure that each program draws a full room.
“This is the beginning of the era of AI,” Storey said at the 2025 AMA State Advocacy Summit, held in Carlsbad, California.
“You can’t spend four or five minutes in a legislature without somebody talking about AI something or other because it touches everything,” he said. “The buzz about it is so strong.”
Storey said that, sandwiched between budgets and health care, the second-biggest policy issue in state legislatures is “technology,” a catch-all term that includes cybersecurity, young people and social media, and augmented intelligence—sometimes called “artificial intelligence.”
“It’s almost as if AI encompasses every other issue,” Storey said. That includes education, transportation, corrections—and health care.
Later at the State Advocacy Summit, AMA Board of Trustees Chair Michael Suk, MD, JD, MPH, MBA, said the AMA’s vision for health care includes “a system where technology like AI actually improves the patient experience and leads to better outcomes.”
The AMA’s recently updated policy on AI development and use in health care (PDF) guides the Association’s “engagement on this issue with lawmakers, regulators and other stakeholders, who each have a role in bringing the future of health AI into focus,” said Dr. Suk, an orthopaedic surgeon at the Geisinger health system.
“We want to make sure physicians have a seat at the table in those conversations,” he added.
Geisinger is a member of the AMA Health System Program, which provides enterprise solutions to equip leadership, physicians and care teams with resources to help drive the future of medicine.
From AI implementation to EHR adoption and usability, the AMA is fighting to make technology work for physicians, ensuring that it is an asset to doctors—not a burden.
Flurry of AI activity anticipated
The AMA State Advocacy Summit included a well-attended expert panel discussion on how states were balancing innovation and the promised benefits of AI while also protecting patients and consumers.
“It’s going to be a busy year for AI at the state level,” said panel moderator Jared Augenstein, senior managing director of the consulting firm Manatt Health.
He noted that, in early January, there had already been more than a dozen bills introduced in New York, Texas, Virginia, Illinois and other states.
There were more than 100 AI-related bills connected to health care introduced in 2024.
These bills mostly fell into five buckets, Augenstein said: calling for an AI study, mandating transparency, preventing AI-generated discrimination, regulating payer use, and regulating clinical decision-making.
Twenty of those bills passed, including measures in:
- California, where three bills related to AI in health care passed: one required transparency from physicians and health care organizations on disclosure of generative AI use; another required human involvement in health plan medical necessity determinations; and another was described as a “general transparency bill for large model developers.”
- Colorado, which included consumer protections and imposed “significant requirements on developers and deployers of AI tools that are used in high-risk situations.”
- Utah, which provided narrow consumer protections and required disclosure when generative AI is used for “regulated occupations,” many of which are health care related.
“All this can be a bit overwhelming,” said Augenstein. To help keep up, he recommended that physicians and others read the new AMA issue brief on AI state advocacy and policy priorities (PDF), which focuses on health plans’ use of AI, transparency and physician liability.
“While the use of AI tools can create efficiencies by automating processes and streamlining operations, the AMA is concerned that these tools are making automated decisions without considering the nuances of each individual patient’s medical conditions and needs, increasing denials for medically necessary care, and creating access barriers (e.g., delays in care) for patients,” the issue brief says.
Regarding the role of physicians and physician oversight of payer AI use, the brief says:
- Any automated decision-making tool that recommends limitations or denials of care should be automatically referred for review to a physician possessing a current and valid, nonrestricted license to practice medicine in the state in which the proposed services would be provided if authorized, and of the same specialty as the physician who typically manages the medical condition or disease or provides the health care service involved in the request prior to issuance of any final determination.
- Prior to issuing an adverse determination, the treating physician must have the opportunity to discuss the medical necessity of the care directly with the physician who will be responsible for determining if the care is authorized.
- Use of automated decision-making should not replace the individualized assessment of a patient’s specific medical and social circumstances.
Rapid change, rapid improvement
Panelist Justin Norden, MD, MPhil, said the health care industry was not prepared for the rapid evolution of health AI.
“This technology keeps changing every few weeks and we're not used to that—especially in health care,” said Dr. Norden, an AMA member who is the founder and CEO of Qualified Health, which works with health care organizations to build their AI infrastructure and establish governance and security policies.
“When we think of molecules or devices that get approved once and don't change, we think about using them for a couple decades,” he added.
Dr. Norden noted that the rapid pace also brings with it rapid improvements and “this is the worst the technology will ever be.”
Like other digital tools, however, AI applications are often created for other industries and then adapted for health care use.
“We are taking technology that was developed for someone else and people are figuring out how we can use this in our system,” Dr. Norden said.
While people get excited about AI autonomously developing a diagnosis, Dr. Norden said health care organizations should first look at low-risk tasks like submitting claims and quality reports.
“To me, that is the right place to start,” Dr. Norden said, adding that an old maxim in medicine should hold sway.
“Let's let other sectors figure this out first,” he added. “Health care usually comes later and that's not a bad thing. I think, actually, that's a good thing. In general, we want to do no harm, and we can wait to see how it shakes out.”