(row continued from the previous page)
Method, continued (sketched below):
- Merge the segmented patches to show the needle localization in the 3D volume
- Centers of the detected circular segmentations correspond to the needle shaft
- The most distal bright intensity corresponds to the needle tip
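A minimal sketch of this merge-and-pick logic, assuming NumPy-style boolean masks and a volume whose insertion depth runs along axis 0; the function names and array conventions are ours, not the cited paper's:

```python
import numpy as np
from scipy import ndimage

def merge_patches(patch_masks, patch_origins, volume_shape):
    """Merge per-patch binary segmentations back into the full 3D volume
    (overlapping patches are combined with logical OR)."""
    volume = np.zeros(volume_shape, dtype=bool)
    for mask, (z, y, x) in zip(patch_masks, patch_origins):
        dz, dy, dx = mask.shape
        volume[z:z + dz, y:y + dy, x:x + dx] |= mask
    return volume

def shaft_centers(seg):
    """Centers of the (assumed circular) segmented cross sections in each
    slice, taken as points on the needle shaft; depth is along axis 0."""
    return [(z, *ndimage.center_of_mass(seg[z]))
            for z in range(seg.shape[0]) if seg[z].any()]

def tip_estimate(us_volume, seg):
    """Brightest voxel in the most distal segmented slice = tip estimate."""
    zs, ys, xs = np.nonzero(seg)
    distal = zs == zs.max()
    i = np.argmax(us_volume[zs[distal], ys[distal], xs[distal]])
    return zs[distal][i], ys[distal][i], xs[distal][i]
```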
Lee et al.[74] (2020-01-20), 2D
Method (sketched below):
- Pass the image through the model to obtain a segmentation
- Apply a max-contour algorithm to find the most contiguous segment
- Visualize by drawing a bounding box from the top-right-most pixel to the bottom-left-most pixel and straightening the segmentation into the diagonal of the bounding box
Strengths:
- Evaluated on human data (8 patients)
Limitations:
- Evaluation metrics reported in numbers of pixels rather than in mm
- Compared only against general segmentation architectures, not against earlier needle detection methods
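The max-contour step could plausibly be approximated with a largest-connected-component pass; the sketch below makes that substitution explicitly (ndimage.label is our stand-in, not the paper's algorithm):

```python
import numpy as np
from scipy import ndimage

def needle_endpoints(mask):
    """Keep the largest connected component of the predicted segmentation
    (our stand-in for the max-contour step), then straighten it into the
    diagonal of its bounding box."""
    labels, n = ndimage.label(mask)
    if n == 0:
        return None
    sizes = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
    largest = labels == (int(np.argmax(sizes)) + 1)
    ys, xs = np.nonzero(largest)
    top_right = (int(ys.min()), int(xs.max()))
    bottom_left = (int(ys.max()), int(xs.min()))
    return top_right, bottom_left  # needle drawn as this box diagonal
```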
Mwikirize et al.[65] (2019-10-10), 2D+t
Method (sketched below):
- Enhance the needle tip in consecutive ultrasound images
- Classify the enhanced images and localize the tip in those that contain a needle
Strengths:
- Real time (67 fps)
- Both in-plane and out-of-plane detection
- Robust to intensity variations
- Resilient to high-intensity artifacts in the image
- Incorporates temporal information
Limitations:
- Not robust to motion artifacts such as breathing or pulsation
- Cannot detect a stationary needle tip, since it depends on motion
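A rough illustration of the enhance-then-localize idea, assuming the enhancement amounts to differencing consecutive frames and that some classifier supplies a needle-presence score (both are our simplifications of the method):

```python
import numpy as np

def enhance_tip(prev_frame, curr_frame):
    """Highlight newly appearing intensity by differencing consecutive
    frames; one plausible reading of the temporal-enhancement step."""
    diff = curr_frame.astype(np.float32) - prev_frame.astype(np.float32)
    diff = np.clip(diff, 0.0, None)          # keep intensity that appeared
    return diff / (diff.max() + 1e-6)

def localize_tip(enhanced, needle_score, threshold=0.5):
    """If a (hypothetical) classifier scores the enhanced image as
    containing a needle, take its brightest pixel as the tip estimate."""
    if needle_score < threshold:
        return None   # no motion cue, e.g., a stationary tip
    return np.unravel_index(int(np.argmax(enhanced)), enhanced.shape)
```

The dependence on frame differencing is also what the Limitations column records: a tip that does not move between frames produces no enhancement signal to localize.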
Pourtaherian et al.[90] (2019-02-24), 3D
Method (sketched below):
- Slice the 3D ultrasound volume into 2D slices
- Select 3 consecutive slices (the middle one being the reference slice) to incorporate some 3D information
- Pass the slices as a 3-channel input to a fully convolutional network (autoencoder style)
- Obtain a pixelwise classification of the slices
Strengths:
- Conceptually simple architecture
Limitations:
- Computationally expensive
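The slice-stacking step is easy to make concrete; a small sketch assuming slices run along axis 0 of a NumPy volume:

```python
import numpy as np

def three_slice_inputs(volume):
    """Stack each slice with its two neighbours as a 3-channel sample
    (the middle slice is the reference), so a 2D network sees a little
    through-plane context."""
    return np.stack([volume[i - 1:i + 2]               # (3, H, W)
                     for i in range(1, volume.shape[0] - 1)])
```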
Arif et al.[73] (2019-02-11), 3D+t
Method (sketched below):
- Segment the 3D ultrasound volume using a CNN
- Extract needle candidates from the segmentation using connected-component labelling and PCA
- Combine the needle candidates with those from the previous time step to obtain the real needle by detecting motion between the time steps
- Visualize the needle in two planes, one perpendicular and the other parallel to the transducer
Strengths:
- Incorporates temporal information
- Ablation studies performed on the architecture
- Evaluated on multiple datasets (3 datasets)
- Performance does not vary much, except on the in vivo data
Limitations:
- Not robust to motion artifacts, as it assumes only the needle moves between two consecutive frames
- Assumes linear needle motion
- Not robust to transducer motion (translation or rotation)
- Large standard deviation on in vivo data compared to phantom data (not easily generalizable)
- Computational speed measured on a GPU
- Does not localize the needle tip, only the plane and the segmentation of the needle (presumably the visible shaft)
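A sketch of the candidate-extraction and temporal-matching steps, assuming connected components are reduced to (centroid, direction) pairs via PCA; the thresholds and the matching rule are illustrative, not from the paper:

```python
import numpy as np
from scipy import ndimage

def needle_candidates(seg, min_voxels=50):
    """One (centroid, unit direction) candidate per connected component of
    the binary segmentation; direction from PCA (here via SVD).
    min_voxels is an illustrative noise filter, not from the paper."""
    labels, n = ndimage.label(seg)
    candidates = []
    for i in range(1, n + 1):
        pts = np.argwhere(labels == i).astype(np.float64)
        if len(pts) < min_voxels:
            continue
        centroid = pts.mean(axis=0)
        _, _, vt = np.linalg.svd(pts - centroid, full_matrices=False)
        candidates.append((centroid, vt[0]))
    return candidates

def pick_moving_candidate(curr, prev, min_disp=1.0, min_align=0.98):
    """Keep a current candidate that lines up with a previous one but whose
    centroid has moved, mirroring the assumption that only the needle moves
    (and moves roughly linearly) between frames. Thresholds illustrative."""
    for c_cen, c_dir in curr:
        for p_cen, p_dir in prev:
            moved = np.linalg.norm(c_cen - p_cen) > min_disp
            aligned = abs(float(np.dot(c_dir, p_dir))) > min_align
            if moved and aligned:
                return c_cen, c_dir
    return None
```

The same motion assumption that makes the matching simple is what the Limitations column flags: tissue or transducer motion also changes centroids between frames and breaks the "only the needle moves" rule.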
Pourtaherian[72] (2018-05-31), 3D
Method (sketched below):
- Extract voxels from the 3D ultrasound volume and classify each voxel as needle or background
- Obtain 2D cross-section slices from the 3D ultrasound volume and segment the needle in each slice (various slices perpendicular to the lateral and elevation planes)
- Map the segmentation output onto its corresponding position in 3D
- Estimate the needle axis by fitting a model of the needle to the segmented voxels (the model is a straight cylinder with a fixed diameter)
- Visualize the 2D cross-section plane that contains the entire needle
Strengths:
- Can detect very short needles (5 mm and 10 mm)
- Robust to transducer and patient movements (as it performs repeated detection in the 3D volume)
- Evaluated on data from 2 transducers (of varying resolution), 2 tissue types, and 2 needle types (of different gauge)
Limitations:
- Does not explicitly detect the needle tip (only the plane where the needle and tip are maximally visible)
- Cannot detect the needle in the first 2 mm
- Computationally expensive
- Only in-plane
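The axis-fitting step can be sketched as a least-squares 3D line fit to the needle-classified voxels, with the model's fixed radius used only to test inliers; the RANSAC-style refinement hinted at in the comment is our addition, not the paper's stated procedure:

```python
import numpy as np

def fit_needle_axis(needle_voxels):
    """Least-squares 3D line through the voxels classified as needle:
    returns a point on the axis and a unit direction. The straight-cylinder
    model's fixed diameter does not enter the axis fit itself."""
    pts = np.asarray(needle_voxels, dtype=np.float64)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid, full_matrices=False)
    return centroid, vt[0]

def cylinder_inliers(pts, centroid, direction, radius):
    """Voxels within the model's fixed radius of the fitted axis; a robust
    fit could alternate this test with re-fitting (RANSAC-style)."""
    rel = np.asarray(pts, dtype=np.float64) - centroid
    along = rel @ direction
    radial = rel - np.outer(along, direction)
    return np.linalg.norm(radial, axis=1) <= radius
```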
Mwikirize et al.[83] (2018-03-06), 2D
Method (sketched below):
(1) Detect the needle using a bounding box
(2) Use the bounding box to automatically determine the needle trajectory and tip
Strengths:
- Relies on intensity-invariant features; robust to low-intensity needle features and to the presence of high-intensity artifacts
Limitations:
- Inference time evaluated on a GPU (most ultrasound devices run on a CPU)
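A toy version of step (2), assuming the needle spans the detected box diagonally and that the insertion side is known; the paper infers the trajectory and tip from image evidence, so treat this purely as a sketch:

```python
import numpy as np

def trajectory_and_tip(box, insertion_side="left"):
    """Turn a detected bounding box (x0, y0, x1, y1) into a trajectory and
    a tip estimate, assuming the needle runs along the box diagonal and
    enters near the top corner on `insertion_side` (our assumptions)."""
    x0, y0, x1, y1 = box
    if insertion_side == "left":
        entry, tip = np.array([x0, y0], float), np.array([x1, y1], float)
    else:
        entry, tip = np.array([x1, y0], float), np.array([x0, y1], float)
    direction = (tip - entry) / np.linalg.norm(tip - entry)
    return entry, direction, tip
```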