
               tissue in diabetic foot ulcers [22]. In the same type of wound, Liu et al. and Viswanathan et al. used AI to
               create color-coded regions to identify ischemia and infection based on real patient images [23,24]. These
               advancements in wound identification and assessment naturally lead to their application in wound
               management.


               AI in wound management
               Many smartphone applications (apps) have been designed to facilitate wound monitoring at home, where
               most wound management happens. These apps assess pictures of wound tissue with automatic color and
               measurement calibration, remove background “noise”, and use a factorization-based segmentation to
               classify and assess chronic wounds accurately [25,26]. One app can also detect subsurface tissue oxygenation
               of wounds [27]. Poor wound oxygenation can delay healing, and catching these complications early on could
               help prevent further deterioration.
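
               To make the image-processing steps above concrete, the following sketch (in Python with OpenCV, and not
               taken from the cited apps) shows one way a wound photo could be calibrated against a reference marker of
               known size, smoothed to suppress background noise, and segmented by color. The published apps use automatic
               calibration and factorization-based segmentation [25,26]; k-means clustering is used here only as a
               simplified stand-in, and all function names are hypothetical.

import cv2
import numpy as np

def estimate_scale(marker_px: float, marker_mm: float = 10.0) -> float:
    """Pixels per mm, derived from a reference marker of known physical size."""
    return marker_px / marker_mm

def segment_wound(image_bgr: np.ndarray, k: int = 3) -> tuple[np.ndarray, np.ndarray]:
    """Cluster pixel colors into k groups; return the label map and cluster centers."""
    blurred = cv2.GaussianBlur(image_bgr, (5, 5), 0)  # suppress background "noise"
    pixels = blurred.reshape(-1, 3).astype(np.float32)
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 20, 1.0)
    _, labels, centers = cv2.kmeans(pixels, k, None, criteria, 5, cv2.KMEANS_PP_CENTERS)
    return labels.reshape(image_bgr.shape[:2]), centers

def wound_area_mm2(label_map: np.ndarray, wound_label: int, px_per_mm: float) -> float:
    """Convert the pixel count of the cluster assumed to be wound tissue into mm^2."""
    wound_px = int(np.count_nonzero(label_map == wound_label))
    return wound_px / (px_per_mm ** 2)

               In practice, which cluster corresponds to wound tissue would itself be decided by a trained model rather
               than by inspection.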

               Researchers are also aiming to sync data gathered from these apps with patients’ EMRs to offer providers
               up-to-date information on wound healing. Previously, patients had to manually measure their wounds at
               home, photograph them without necessarily knowing whether they were infected or had changed, send the
               images to the practice, and then wait for a response. Now, AI-assisted apps connected to medical
               records allow patients to input a single photo and receive several outputs, including the wound’s
               dimensions, classification, possible presence of infection or ischemia, and tissue types. This information can
               be synced with the EMR for immediate access by providers to further guide the patient.
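
               As an illustration of the “single photo in, several outputs out” workflow described above, the sketch
               below defines a hypothetical record of the outputs such an app might return and serializes it for an EMR
               sync. The field names and payload format are assumptions for illustration, not the schema of any
               particular app or EMR.

from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone
import json

@dataclass
class WoundAssessment:
    """Hypothetical outputs an AI-assisted app might return for one wound photo."""
    patient_id: str
    length_mm: float
    width_mm: float
    depth_mm: float | None              # may be unavailable from a 2-D photo
    classification: str                 # e.g., "diabetic foot ulcer"
    infection_suspected: bool
    ischemia_suspected: bool
    tissue_types: dict[str, float] = field(default_factory=dict)  # tissue type -> fraction of wound bed
    captured_at: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())

def to_emr_payload(assessment: WoundAssessment) -> str:
    """Serialize the assessment as JSON; a real integration would map these fields
    onto the EMR vendor's interface (for example, a FHIR Observation)."""
    return json.dumps(asdict(assessment), indent=2)

# Example usage with made-up values
record = WoundAssessment(
    patient_id="example-123", length_mm=32.0, width_mm=18.5, depth_mm=None,
    classification="pressure injury", infection_suspected=False,
    ischemia_suspected=True, tissue_types={"granulation": 0.6, "slough": 0.4})
print(to_emr_payload(record))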


               One example of this kind of technology is “The Wound Viewer”, developed by Zoppo et al. [28]. The Wound
               Viewer is an AI-powered, portable medical device that leverages sensors and algorithms to remotely collect
               and analyze clinical data, including three-dimensional wound measurements and tissue composition, and
               upload interpretations to the EMR [28]. Guadagnin et al. created an image mining-based system that
               automatically interprets tissue types and colors from pressure ulcers, while making selected relevant visual
               information available to providers in the medical record [29]. Given how quickly wounds can deteriorate,
               daily monitoring is crucial to ensure proper healing and timely treatment adjustments.
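
               As a rough illustration of the color-based tissue interpretation mentioned above, the sketch below maps
               wound-bed pixel colors onto the familiar red (granulation), yellow (slough), and black (necrotic)
               categories. The cited systems learn such mappings from annotated images rather than using fixed
               thresholds, so the hue cut-offs here are illustrative assumptions only.

import numpy as np

def label_tissue(hsv_pixel: np.ndarray) -> str:
    """Map an HSV pixel (OpenCV ranges: H 0-179, S and V 0-255) to a coarse
    red-yellow-black wound-bed category. Thresholds are illustrative, not learned."""
    h, s, v = hsv_pixel
    if v < 50:
        return "necrotic (black)"
    if s < 40:
        return "epithelial/other"
    if h < 10 or h > 160:
        return "granulation (red)"
    if 20 <= h <= 35:
        return "slough (yellow)"
    return "other"

def tissue_composition(hsv_wound_pixels: np.ndarray) -> dict[str, float]:
    """Fraction of wound-bed pixels assigned to each tissue category."""
    labels = [label_tissue(px) for px in hsv_wound_pixels]
    total = max(len(labels), 1)
    return {name: labels.count(name) / total for name in set(labels)}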

               Daily monitoring can inform adjustments to wound treatment, since chronic wounds are, by definition,
               difficult to treat due to a number of underlying health conditions. Pressure injuries occur due to localized
               damage to the skin and underlying soft tissue, usually over a bony prominence [30]. This damage is often a
               result of prolonged pressure, shear, and/or frictional forces [30]. Patients with sensory deficits lack the
               pressure feedback response, which results in prolonged pressure over a period of time [30]. The way to
               prevent and heal these types of injuries is to avoid that prolonged pressure. This is especially difficult for
               those who are unable to sense pressure or those with mobility and activity challenges, such as patients in
               wheelchairs. These patients also experience more friction/shear when transferring from chairs to other
               surfaces, are more likely to experience nutritional deficiencies, and have more moisture around their
               wounds. Sensory perception, mobility, activity, friction/shear, nutrition, and moisture are the factors of the
               Braden Scale, a widely used tool that assesses six physical categories affecting wound healing [31].
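
               For reference, the sketch below shows how a Braden Scale total is computed from the six subscale ratings
               (each rated 1-4, except friction/shear, which is rated 1-3) and mapped to a risk band. The cut-offs used
               here are commonly cited but vary somewhat between institutions, so they should be treated as illustrative.

SUBSCALE_RANGES = {
    "sensory_perception": (1, 4),
    "moisture": (1, 4),
    "activity": (1, 4),
    "mobility": (1, 4),
    "nutrition": (1, 4),
    "friction_shear": (1, 3),
}

def braden_score(ratings: dict[str, int]) -> int:
    """Sum the six subscale ratings after range-checking each one."""
    total = 0
    for name, (lo, hi) in SUBSCALE_RANGES.items():
        value = ratings[name]
        if not lo <= value <= hi:
            raise ValueError(f"{name} must be between {lo} and {hi}")
        total += value
    return total  # possible range: 6 (highest risk) to 23 (lowest risk)

def risk_band(total: int) -> str:
    """Map a total score to a commonly cited risk band (illustrative cut-offs)."""
    if total <= 9:
        return "very high risk"
    if total <= 12:
        return "high risk"
    if total <= 14:
        return "moderate risk"
    if total <= 18:
        return "mild risk"
    return "minimal risk"

# Example: a wheelchair user with impaired sensation and frequent moisture exposure
ratings = {"sensory_perception": 2, "moisture": 2, "activity": 2,
           "mobility": 2, "nutrition": 3, "friction_shear": 1}
print(braden_score(ratings), risk_band(braden_score(ratings)))  # 12, high risk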

               Researchers have developed AI that tackles several of the factors included on the Braden Scale, aiming to
               facilitate the wound healing process. To address challenges in mobility, sensory perception, and activity,
               Gabison et al. used data from a noncontact system of load cells placed under a bed [32]. The data were used to
               determine whether a patient was left-side lying, supine, or right-side lying with 94% accuracy [32]. Danilovish
               et al. used an inexpensive “off-the-shelf” camera to classify a patient’s positions into four different postures
               with 95% accuracy [33]. Artificially intelligent load cells and cameras could eventually alert caregivers when a