


[Table continued from previous page: classification of WCE images as inflammatory, polypoid, and ulcer using a neural network; specificity of 52.00%]

AI: Artificial intelligence; MLP: multilayer perceptron; BEEMD: bidimensional ensemble empirical mode decomposition; SVM: support vector machine; CNN: convolutional neural network; SBCE: small bowel capsule endoscopy; LDA: linear discriminant analysis; ROI: region of interest; RCNN: region-based convolutional neural network; MCE: magnetically controlled capsule endoscopy; CCE: colon capsule endoscopy; WCE: wireless capsule endoscopy; KID: Koulaouzidis-Iakovidis database.




 Inflammatory bowel disease
Potential AI tools to improve the detection and assessment of ulcers and mucosal inflammation caused by Crohn’s disease have been researched for over a decade. In 2012, Kumar et al. published their work using a cascade method to classify CD lesions and quantitatively assess their severity[78]. The severity assessments given by the model (normal, mild, and severe) were shown to correlate well with those assigned manually by experts. While multiple machine learning models have achieved reasonable sensitivities and specificities in this field[79-81], deep learning systems have predominated research since 2018[81-92]. In 2022, Ferreira et al. developed a CNN using a total of 8,085 images to detect ulcers and erosions in images from the PillCam™ Crohn’s Capsule, achieving an overall sensitivity of 90% and specificity of 96%[89]. Higuchi et al. published their work using CNN-based models to automatically classify ulcerative colitis lesion severity based on the Mayo Endoscopic Subscore, achieving an accuracy of 98.3%[90]. While reasonable results have been achieved, ulcers and erosions typically have fewer colour features than actively bleeding lesions, making their detection and classification generally more difficult [Table 5].
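As an illustration of the general approach only (not a reproduction of any of the cited models), the sketch below fine-tunes a standard pretrained CNN to grade capsule endoscopy frames into three illustrative severity classes; the class labels, backbone choice, and preprocessing are assumptions made for demonstration.

```python
# Minimal sketch (not the published models): adapting a pretrained CNN to grade
# capsule-endoscopy frames into illustrative severity classes (normal / mild / severe).
import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image

NUM_CLASSES = 3  # hypothetical classes: normal, mild, severe

# Standard ImageNet-style preprocessing; the cited studies may have used different pipelines.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Replace the final fully connected layer of a pretrained ResNet with a three-class head.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)
model.eval()

def grade_frame(path: str) -> int:
    """Return the index of the predicted severity class for a single frame."""
    image = Image.open(path).convert("RGB")
    batch = preprocess(image).unsqueeze(0)   # shape: (1, 3, 224, 224)
    with torch.no_grad():
        logits = model(batch)
    return int(logits.argmax(dim=1).item())
```

In practice such a head would be fine-tuned on labelled endoscopy frames before use; the snippet only shows the inference path.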



 Coeliac disease
Currently, there is a comparatively small body of research on AI detection and analysis of capsule endoscopy video for coeliac disease. Given the recency of the field, all retrieved articles utilised deep learning in their systems[93-96]. In 2017, Zhou et al. developed a deep learning method using the GoogLeNet model[93]. Impressively, a sensitivity and specificity of 100% were found on testing, although only a small number of video clips were used for the study. More recently, in 2021, Li et al. employed principal component analysis (PCA) for feature extraction, including the novel strip PCA (SPCA) method[95]. Using a small database of 460 images, their process was found to have an average accuracy of 93.9% on testing. The small number of studies performed has resulted in a paucity of evidence on the utility of AI tools for this condition [Table 6].
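To illustrate the underlying idea only, the sketch below uses ordinary PCA rather than Li et al.'s strip-PCA variant, on synthetic stand-in data, to compress flattened frames into low-dimensional features before a simple classifier.

```python
# Minimal sketch, not Li et al.'s pipeline: ordinary PCA (not their SPCA variant)
# compresses capsule-endoscopy frames into feature vectors for a simple classifier.
# The data below is random and purely illustrative.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_images, h, w = 460, 64, 64              # toy stand-in for a small image database
X = rng.random((n_images, h * w))         # each row: one flattened grayscale frame
y = rng.integers(0, 2, size=n_images)     # 0 = normal mucosa, 1 = coeliac-pattern (synthetic)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Project onto the leading principal components, then classify in that reduced space.
clf = make_pipeline(PCA(n_components=50), SVC(kernel="rbf"))
clf.fit(X_train, y_train)
print("toy test accuracy:", clf.score(X_test, y_test))
```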



 Hookworm detection
Among the various pathological conditions that AI diagnostic techniques can identify, there is very little published data on the detection of parasitic infestations such as hookworms. In 2016, Wu et al. proposed a new method combining a multi-scale dual matched filter to locate the tubular structure of hookworms with a piecewise parallel region detection method to identify regions potentially containing hookworm bodies on WCE imaging[97]. Testing on a large dataset of 440,000 WCE images demonstrated accuracy, sensitivity, and specificity rates of around 78%. In 2018, He et al. furthered this work by integrating two CNN systems to model the visual appearances and tubular patterns of hookworms concurrently[98]. Testing and validation showed an accuracy of 88.5%. More recently, in 2021, Gan et al. utilised a deep CNN trained using 11,236 capsule endoscopy images of hookworms[99]. The trained CNN system took 403 s to evaluate 10,529 test images, with a sensitivity, specificity, and accuracy of 92.2%, 91.1%, and 91.2%, respectively [Table 7].
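The published dual matched filter is a custom design; as a rough, hedged illustration of the general idea of multi-scale tubular-structure enhancement, a standard Frangi vesselness filter from scikit-image can stand in for it. The function name and scale values below are assumptions for demonstration only.

```python
# Minimal sketch of the general idea only: enhancing elongated, tube-like structures
# in a frame at multiple scales. Wu et al.'s actual dual matched filter is custom;
# here scikit-image's Frangi vesselness filter serves as an illustrative stand-in.
import numpy as np
from skimage import color, filters, io

def tubular_response(path: str, scales=(2, 4, 6, 8)) -> np.ndarray:
    """Return a per-pixel 'tubularness' map for one capsule-endoscopy frame."""
    frame = io.imread(path)
    gray = color.rgb2gray(frame) if frame.ndim == 3 else frame.astype(float)
    # Frangi responds strongly to elongated ridge-like structures across the chosen
    # scales, a rough proxy for hookworm-body candidate regions.
    return filters.frangi(gray, sigmas=scales, black_ridges=False)

# Candidate regions could then be thresholded and passed to a CNN for confirmation,
# loosely mirroring the two-stage designs described above.
```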