
Liu et al. Art Int Surg 2024;4:92-108  https://dx.doi.org/10.20517/ais.2024.19                                                                Page 106

               nonoperative time[4]. These studies collectively show that investigations into improving OR
               efficiency heavily rely on manual observation as the substrate for their analysis. Hence, the most significant
               barrier to improving surgery along these dimensions is the ability to analyze video data in a timely manner.
               Our HMR-based approach for surgical activity recognition serves as a foundation for dramatically scaling
               our ability to understand and improve the performance of both individuals and teams in the operating
               room.

               CONCLUSION
               In this paper, we presented a unified approach to systematically analyze the behavior and actions of
               individuals from OR videos using a human mesh-centered approach. Leveraging a novel, ensemble method
               for human detection, tracking, and mesh recovery, we demonstrated that substantial quantitative differences
               between surgical actions can emerge in the form of visual attention, movement patterns, and positional
               occupancy. We further showed that sequences of mesh embeddings formed from 3D joint positions can be
               used to train downstream machine learning models for surgical action recognition, paving the way for
               important downstream surgical tasks that rely on a rich understanding of human behavior. Overall, our
               work presents opportunities for video review programs to study human behavior in the OR in a systematic
               and scalable way. To our knowledge, ours is the first study to investigate the development of HMR-
               based approaches for analyzing OR videos.
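
               The conclusion's central technical idea — turning per-frame 3D joint positions recovered by HMR into embedding sequences that a downstream model classifies into surgical actions — can be sketched as follows. This is a minimal illustration under assumed conventions (24 SMPL-style joints, fixed-length clips, a nearest-centroid classifier on synthetic data), not the authors' implementation.

               ```python
               import numpy as np

               J = 24   # joints per recovered mesh (SMPL-style meshes commonly use 24)
               T = 16   # frames per clip
               rng = np.random.default_rng(0)

               def embed_clip(joints_seq):
                   """Turn per-frame 3D joints (T, J, 3) into one embedding vector,
                   centering each frame on its root joint to discard global translation."""
                   centered = joints_seq - joints_seq[:, :1, :]   # root-relative coordinates
                   return centered.reshape(-1)                    # shape (T * J * 3,)

               # Two synthetic "actions", each defined by a distinct base pose plus noise.
               base_idle = np.zeros((J, 3))
               base_suturing = np.arange(J * 3, dtype=float).reshape(J, 3) / (J * 3)

               def make_clip(base):
                   return base + rng.normal(0.0, 0.05, size=(T, J, 3))

               train = {"idle": [make_clip(base_idle) for _ in range(20)],
                        "suturing": [make_clip(base_suturing) for _ in range(20)]}

               # Minimal nearest-centroid classifier over the mesh embeddings.
               centroids = {action: np.mean([embed_clip(c) for c in clips], axis=0)
                            for action, clips in train.items()}

               def predict(clip):
                   e = embed_clip(clip)
                   return min(centroids, key=lambda a: np.linalg.norm(e - centroids[a]))
               ```

               In practice the nearest-centroid step would be replaced by a sequence model trained on real HMR outputs; the sketch only shows the data flow from joint positions to embeddings to an action label.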


               DECLARATIONS
               Author contributions
               Conceptualization, investigation, methodology, software, validation, visualization, writing - original draft,
               writing - review and editing: Liu B
               Conceptualization, data curation, writing - review and editing: Soenens G
               Conceptualization, writing - review: Villarreal J
               Conceptualization, writing - review, supervision: Jopling J, Yeung-Levy S, Rau A
               Conceptualization, data curation, writing - review, supervision: Van Herzeele I


               Availability of data and materials
               Code will be made available upon request.


               Financial support and sponsorship
               This  work  was  supported  by  Wellcome  Leap  SAVE  (No.  63447087-287892)  and  the  National  Science
               Foundation (No. 2026498). Soenens G was supported by a PhD Fellowship (No. 11A5721-3N), and Van
               Herzeele I was supported by a Senior Clinical Fellowship (No. 802314-24N), both provided by the Fund for
               Scientific Research - Flanders, Belgium.


               Conflicts of interest
               All authors declare that there are no conflicts of interest.


               Ethical approval and consent to participate
               All visible subjects used in the simulated videos provided written informed consent to being filmed and
               agreed to the data’s use in scientific research. As our research does not involve any patient data or, more
               broadly, protected health information (PHI), we did not seek institutional review board (IRB) approval for
               this study.


               Consent for publication
               All visible subjects provided written informed consent for publication.