
To implement health AI, first decide who’s accountable
Augmented intelligence (AI)—commonly called artificial intelligence—is rapidly affecting health care. In 2024, the number of physicians using AI tools nearly doubled, from 38% to 68%, and as their use grows so does the imperative to use AI responsibly, safely and effectively.
The AMA defines AI as augmented intelligence to emphasize that AI’s role is to help health care professionals, not replace them.
“Clinical decision-making must still lie with clinicians,” said Margaret Lozovatsky, MD, chief medical information officer and vice president of digital health innovations at the AMA. “AI simply enhances their ability to make those decisions.”
That distinction is vital. AI directly affects patient care and outcomes, and as it becomes more embedded in daily operations and workflows it poses new challenges and responsibilities.
“There are genuine risks in implementing these technologies,” Dr. Lozovatsky said during a recent AMA webinar (now available on demand). “That makes it important to understand the critical need for governance.”
The foundational pillars of responsible AI adoption are:
Establishing executive accountability and structure.
Forming a working group to detail priorities, processes and policies.
Assessing current policies.
Developing AI policies.
Defining project intake, vendor evaluation and assessment processes.
Updating standard planning and implementation processes.
Establishing an oversight and monitoring process.
Supporting AI organizational readiness.
The AMA STEPS Forward® “Governance for Augmented Intelligence” toolkit, developed in collaboration with Manatt Health, is a comprehensive eight-step guide for health care systems to establish a governance framework to implement, manage and scale AI solutions.
From AI implementation to EHR adoption and usability, the AMA is fighting to make technology work for physicians, ensuring that it is an asset to doctors—not a burden.
Start with strategic leadership
Establishing accountability is the first and most essential step in safe, scalable and meaningful AI integration. Executive leadership determines the vision, provides oversight and ensures that AI and its implementation align with system-wide priorities.
“Engaging the C-suite is critical,” said Dr. Lozovatsky, a pediatric hospitalist and a nationally recognized leader in digital health and health care informatics. “All of their areas will be impacted, so buy-in from those leaders is imperative.”
The executive leadership often includes the:
Chief medical officer.
Chief nursing officer.
Chief quality officer.
Chief operating officer.
Chief information or chief technology officer.
Chief digital officer.
General counsel.
Chief medical information officer.
Such a multidisciplinary governance structure drives implementation based on feedback from those delivering care.
“It is critical to have people representing all of those spaces because they understand the pertinent considerations best,” Dr. Lozovatsky explained. From there, each leader should delegate responsibilities to trusted stakeholders within their domain to shape workflows, policies and project evaluations.
Find out how participants in the AMA Health System Member Program are using AI to make meaningful change.
Establish clinical governance
The AMA recommends a three-tiered model to organize AI governance.
Clinical executive leadership: Top-level support is essential to ensure that AI aligns with organizational priorities and patient care. Leadership must also delegate responsibilities for implementation.
Advisory councils: Committees should review technologies and address clinical concerns while ensuring that the new tools are interoperable with existing systems.
Specialty areas: Real engagement with front-line staff in clinical specialties will help develop and deploy AI tools as per those specialties’ unique needs.
This approach ensures that AI supports broader institutional goals and that resources are allocated properly.
“Organizations likely already have processes for evaluating technology,” said Dr. Lozovatsky. “They will need to consider unique aspects of AI and how those will be addressed with existing models and which additional governance bodies will be necessary to create,” the goal being to ensure consistent oversight and alignment with institutional priorities.
Embed clinical informatics leadership
Clinical informatics encompasses clinical, technical and operational considerations, making this expertise integral to decision-making.
“Governance of any clinical technology relies on a deep understanding of what both technology and people can and should do,” said Dr. Lozovatsky. Including these experts early in the governance conversation is essential.
Organizations must answer several strategic questions before designing a structure:
How does AI support our strategic goals?
Should we take a more cautious approach?
What internal capabilities and external partnerships do we need?
Who will be accountable for oversight and compliance?
Will AI fit into our existing governance model, or do we need new committees or roles?
Establishing a governance framework starts with treating AI as a tool to advance institutional goals. Evaluate existing internal capabilities to determine their readiness for assessing and implementing AI. After that, health care organizations can make fully informed decisions about incorporating AI into existing structures or create new ones.
Governance supports trust and outcomes
Perhaps the most important function of AI governance is building trust among physicians and other health professionals that these tools are safe, among patients that their data is secure, and among leaders that AI adoption will advance their mission of care.
“Health care organizations must ensure that AI is implemented in a safe, thoughtful manner,” said Dr. Lozovatsky. “We must prove that we’re supporting care for our patients and our clinicians in their ability to deliver that care.”
Clarity of purpose and strong structures for implementation empower health systems to move forward with confidence. Establishing executive accountability within a viable governance framework supports collaboration that drives meaningful AI integration.
AI governance isn’t just oversight. It is about creating a culture of innovation within a structured framework that positions AI as a tool to improve care.
“Doing this in a safe, thoughtful manner,” as Dr. Lozovatsky put it, “supports care for our patients and our clinicians in delivering it.”
In addition to fighting on the legislative front to help ensure that technology is an asset to physicians and not a burden, the AMA has developed advocacy principles (PDF) that address the development, deployment and use of health care AI, with particular emphasis on:
Health care AI oversight.
When and what to disclose to advance AI transparency.
Generative AI policies and governance.
Physician liability for use of AI-enabled technologies.
AI data privacy and cybersecurity.
Payer use of AI and automated decision-making systems.
Learn more with the AMA about the emerging landscape of health care AI. Also, explore how to apply AI to transform health care with the “AMA ChangeMedEd® Artificial Intelligence in Health Care Series.”