technical proficiency, improving patient safety, reducing operative times, and leading to improved surgical
outcomes.
MEDICO-LEGAL IMPLICATIONS
Questions persist regarding the assignment of medical liability when the use of AI in plastic surgery results
in negative patient outcomes. While a patient’s health status may deteriorate despite receiving the accepted
standard of care, there may be cases where AI algorithmically steers plastic surgeons toward incorrect or
potentially negligent practices that cause untoward sequelae. Given the nascent state of AI in healthcare, our medico-legal system must develop a liability model that balances patient safety with the aspiration for constructive innovation. Several policy options have been proposed, including revision of the standard of care to incorporate AI or a no-fault adjudication system[22]. In all likelihood, liability
in the case of malpractice will be shared among multiple parties: plastic surgeons leveraging AI, model developers, medical institutions, and regulatory agencies. Determining the extent of each party's culpability may prove complex and could create obstacles to AI's ultimate adoption in plastic surgery.
Plastic surgeons, like all healthcare providers, are held to a standard of care. This necessitates the provision
of consistent, accepted treatment practices as established by academic and specialty societies. Questions are
raised regarding what level of proficiency is expected of plastic surgeons using AI and whether the standard of care will be raised to a level that essentially requires its use, leaving those who forgo it at risk of obsolescence. Malpractice claims could arise if AI provides incorrect recommendations that are followed or if correct recommendations are disregarded, an unintended catch-22. Ultimately, plastic surgeons must demonstrate that they are acting reasonably and
responsibly when using AI in patient care. This places immense importance on the documentation of AI
assistance. Specifically, plastic surgeons should maintain detailed records of AI recommendations, their
rationale for following or deviating from those recommendations, and any subsequent patient outcomes.
AI developers, for their part, must ensure that their tools meet safety and efficacy standards, which remain largely undefined, and provide adequate warnings about potential risks. A common complaint with AI models in their current state is that they offer recommendations without assigning a degree of certainty, potentially misleading the unwary user[23]. Substantial testing must be done before developers can offer their technology for commercial use in plastic surgery in order to avoid potential punitive action.
EARLY LEGISLATION FOR AI IN HEALTHCARE
A few attempts have been made in the United States to impose regulations on the use of AI in healthcare.
These actions have been largely pursued by the federal government, although some of the legislation has
relinquished final decision-making power to state governments. One piece of legislation that was introduced but never passed was the Healthy Technology Act of 2023[24]. Proposed by Republican House Representative David
Schweikert, this bill aimed to clarify that AI and machine learning technologies could qualify as
practitioners with the authority to prescribe to patients if certain criteria were met. Specifically, the
“provider” would have to be authorized by state law, and the medical devices or drugs being prescribed
would need to have federal approval (i.e., FDA approval). This bill was referred to the Subcommittee on
Health on January 20, 2023, but never advanced to a House vote.
Later, in October 2023, President Biden issued an executive order (#14110) on AI development. The goal was to ensure that the United States becomes a global leader in reaping the benefits and managing the risks of the blossoming technology[25]. Although the order was not specific to healthcare, Biden's administration called for the

