
Do these 5 things to ensure AI is used ethically, safely in care

By Timothy M. Smith, Contributing News Writer

AMA News Wire

Oct 9, 2024

There are numerous potential uses for augmented intelligence (AI), often called artificial intelligence, in clinical settings. They include informing clinical management; assisting with treatment, diagnosis or screening decisions; and autonomously treating, diagnosing or screening for disease. Physicians are already on board with it too—an AMA survey of more than 1,000 doctors found that nearly two-thirds can see AI’s potential benefits.

But each use comes with risks.

An AMA Ed Hub™ CME series introduces learners to foundational principles in AI and machine learning, a subdomain of AI that enables computers to learn patterns and relationships from data without being explicitly programmed by people. Developed by the AMA ChangeMedEd initiative and the University of Michigan DATA-MD team and geared toward medical students, it is also suitable for residents, fellows, practicing physicians and other health professionals.

The fifth module in the series, “Navigating Ethical and Legal Considerations of AI in Health Care,” explores physicians’ roles in contributing to AI’s ethical and safe use in clinical settings.

From AI implementation to EHR adoption and usability, the AMA is making technology work for physicians, ensuring that it is an asset to doctors—not a burden.

“When integrating AI responsibly in health care, we must rely on the medical ethics of patient autonomy, beneficence, nonmaleficence and justice as our guideposts,” the module says. “There are many reasons why AI may not perform well and possibly cause harm, including bias.”

Bias can be introduced at any stage of AI development—problem identification, data gathering, algorithm development and model implementation—so it is essential that physicians participate in the development and implementation of health care AI to protect patients and safeguard these ethical principles.

The AMA has developed new advocacy principles that build on current AI policy. These new principles (PDF) address the development, deployment and use of health care AI. Meanwhile, AMA Ed Hub also features a separate, 16-credit CME course on artificial and augmented intelligence in health care.

“There are specific actions physicians can take within their professional capacity, contributing to AI’s ethical and safe use to benefit patients and the health care system,” the module says. A few of those actions follow.

“While FDA review offers a degree of quality assurance, medical societies can provide additional guidelines for assessing AI products during implementation and evaluating AI recommendations for individual patients,” the module notes. “These organizations can offer essential guidance, similar to how they define the standard of care for specific medical interventions, to ensure the reliable, safe and effective adoption of health care AI.”

Physicians in hospitals or practice groups should ensure that administrative work on AI algorithm development and deployment aligns with their clinical needs. In addition, as with the evaluation of other medical technology, physicians should advocate for rigorous vetting of AI used in health care. They should also consult their malpractice insurers to understand their coverage when using AI in practice.

“Health care professionals should proactively seek the knowledge and skills necessary to assess and interpret AI algorithms. This includes understanding when to apply specific health care AI and the level of confidence to place in algorithmic recommendations,” the module says. “Clinicians should also understand how to evaluate a model’s performance and engage with its outputs to enhance patient care.”

Learn more with the AMA about the emerging landscape of augmented intelligence in health care (PDF) and how the AMA is advancing health care AI through ethics, evidence and equity.

Physicians can be held liable for clinical decisions informed by AI, so they must exercise caution when implementing AI tools, especially those that haven’t been reviewed by the FDA or assessed by their institutions. Also, with the legal landscape ever changing, it may be prudent to adhere to the established standard of care. In most cases, physicians should use AI as a confirmatory, assistive or exploratory tool and not look to AI to make decisions for them. And since the AI domain is in substantial flux, it is incumbent upon physicians to stay up to date on its changes.

“Laws and regulations related to AI are still developing and will undoubtedly shift,” the module says. “Health care professionals should stay abreast of these changes to ensure that their practices align with the most current requirements and guidelines.”

The module also outlines laws and liability related to health care AI and describes the current governance and regulation landscape. A related AMA Ed Hub module provides a systems perspective on ethics and law in medicine.

Periodic knowledge checks test the user’s understanding of how concepts are applied.

The CME module “Navigating Ethical and Legal Considerations of AI in Health Care” is enduring material and designated by the AMA for a maximum of 0.5 AMA PRA Category 1 Credit™.

It is part of the AMA Ed Hub, an online platform with high-quality CME and education that supports the professional development needs of physicians and other health professionals. With topics relevant to you, it also offers an easy, streamlined way to find, take, track and report educational activities.

Learn more about AMA CME accreditation.
