
Page 4 of 8                   Arora et al. Neuroimmunol Neuroinflammation 2018;5:26  I  http://dx.doi.org/10.20517/2347-8659.2018.11


               Step 2 - blur the image by convolution with the Gaussian filter derived in Step 1;
               Step 3 - compute the x and y components of the image gradient by centered differences.
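The blurring and gradient steps above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the kernel radius (3σ) and the function names are assumptions.

```python
import numpy as np

def gaussian_kernel(sigma):
    # 1-D Gaussian kernel truncated at ~3 sigma, normalized to sum to 1
    radius = max(1, int(3 * sigma))
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()

def blur_and_gradient(image, sigma=1.0):
    k = gaussian_kernel(sigma)
    # separable Gaussian blur: convolve along rows, then along columns
    blurred = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, image)
    blurred = np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, blurred)
    # centered differences: g(i) = (f(i+1) - f(i-1)) / 2
    gx = np.zeros_like(blurred)
    gy = np.zeros_like(blurred)
    gx[:, 1:-1] = (blurred[:, 2:] - blurred[:, :-2]) / 2.0
    gy[1:-1, :] = (blurred[2:, :] - blurred[:-2, :]) / 2.0
    return blurred, gx, gy
```

For a vertical step edge, gx is positive near the transition while gy stays zero, which is what the edge-point detection in the next stage relies on.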


               Algorithm 2: compute_edge_points
               Input - the image gradient vector field g;
               Output - a list of sub-pixel edge points;
               Step 1 - if the gradient modulus at a given pixel is larger than the gradient modulus at the pixels to its
               left and right, the pixel is dubbed a horizontal edge point, provided the gx component of the image gradient
               is larger than or equal to the gy component;
               Step 2 - analogously, a pixel whose gradient modulus is larger than the gradient modulus at the pixels
               above and below is dubbed a vertical edge point if the gy component of the image gradient is larger than or
               equal to the gx component;
               Step 3 - finally, the Devernay scheme is applied along either the horizontal or the vertical axis to compute
               the sub-pixel position of each edge point.
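A minimal sketch of Algorithm 2, assuming the Devernay correction takes the form of a parabola fitted through the three gradient moduli around the local maximum (the function name and return format are hypothetical):

```python
import numpy as np

def compute_edge_points(gx, gy):
    # Returns a list of (x, y) sub-pixel edge points from the gradient field.
    mod = np.hypot(gx, gy)           # gradient modulus
    points = []
    h, w = mod.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            m = mod[y, x]
            if m == 0:
                continue
            # horizontal edge point: local maximum along x, |gx| >= |gy|
            if abs(gx[y, x]) >= abs(gy[y, x]) and m > mod[y, x - 1] and m >= mod[y, x + 1]:
                a, b, c = mod[y, x - 1], m, mod[y, x + 1]
                off = 0.5 * (a - c) / (a - 2 * b + c)   # vertex of fitted parabola
                points.append((x + off, float(y)))
            # vertical edge point: local maximum along y, |gy| >= |gx|
            elif abs(gy[y, x]) >= abs(gx[y, x]) and m > mod[y - 1, x] and m >= mod[y + 1, x]:
                a, b, c = mod[y - 1, x], m, mod[y + 1, x]
                off = 0.5 * (a - c) / (a - 2 * b + c)
                points.append((float(x), y + off))
    return points
```

The strict inequality on one side and non-strict on the other prevents a flat plateau from producing two edge points for the same edge, and also guarantees the parabola's denominator is non-zero.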

               Algorithm 3: chain_edge_process
               Input - the image gradient vector field and a list of sub-pixel edge points computed above;
               Output - modified edge list;
               Step 1 - each point in the list is evaluated;
               Step 2 - the neighbors of each edge point are computed;
               Step 3 - edge points are associated with the pixel that is a local maximum of the gradient modulus, either
               vertically or horizontally;
               Step 4 - two subsets of the neighbor set are formed: one for forward chaining and the other for backward
               chaining;
               Step 5 - the element of each subset with the shortest distance to the edge point is selected as the
               candidate for forward or backward chaining;
               Step 6 - at the end of the algorithm the previously created chains are verified; existing links are broken
               and new links are created whenever the new links are better.
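The forward/backward candidate selection in Steps 4 and 5 can be sketched as below. This is a simplified illustration under assumed conventions: the edge direction is taken perpendicular to the gradient, the sign of the projection onto that direction splits neighbors into forward and backward sets, and the chain-verification step (Step 6) is omitted.

```python
import math

def chain_edge_points(points, grads, max_dist=2.0):
    # points: list of (x, y) sub-pixel edge points
    # grads:  list of (gx, gy) image gradients at those points
    # Links each point to its nearest neighbor on the forward and on the
    # backward side of the edge direction (perpendicular to the gradient).
    fwd = [None] * len(points)
    bwd = [None] * len(points)
    for i, (x, y) in enumerate(points):
        gx, gy = grads[i]
        ex, ey = -gy, gx                 # edge direction
        best_f = best_b = max_dist
        for j, (x2, y2) in enumerate(points):
            if i == j:
                continue
            dx, dy = x2 - x, y2 - y
            d = math.hypot(dx, dy)
            if d >= max_dist:
                continue
            side = dx * ex + dy * ey     # sign selects forward vs backward
            if side > 0 and d < best_f:
                best_f, fwd[i] = d, j
            elif side < 0 and d < best_b:
                best_b, bwd[i] = d, j
    return fwd, bwd
```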

               Stage 3: classification using the k-nearest neighbors
               The k-nearest neighbor (k-NN) method is a non-parametric classification and regression methodology.
               Rather than fitting an analytical function to the input-output relationship, it bases its prediction on the
               K training vectors closest to a given input feature vector. The algorithm relies on a distance function
               (typically Euclidean) and a voting function. Like other methods of its type, k-NN works in two phases,
               training and testing. In the training phase the data points are plotted in an n-dimensional space, n being
               a positive integer. In the testing phase the test data points are fed to the algorithm, which returns the
               training data points nearest to them in that n-dimensional space [3,9-11].


               Algorithm 4: k-NN classifier
               Step 1 - determine a suitable distance metric;
               Step 2 - in the training phase, plot all the data set pairs as points in an n-dimensional space;
               Step 3 - the data to be tested are plotted in the same space and the distances to the training points are
               evaluated;
               Step 4 - the k nearest neighbors are chosen and vote to determine the most plausible classification result.
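The steps of Algorithm 4 can be sketched as a plain Euclidean-distance, majority-vote classifier. This is an illustrative sketch, not the paper's implementation; the class labels are hypothetical placeholders.

```python
import numpy as np
from collections import Counter

def knn_classify(train_X, train_y, query, k=3):
    # Step 1/3: Euclidean distance from the query to every training vector
    dists = np.linalg.norm(np.asarray(train_X, dtype=float)
                           - np.asarray(query, dtype=float), axis=1)
    # Step 4: pick the indices of the k nearest neighbors
    nearest = np.argsort(dists)[:k]
    # majority vote among their labels decides the classification
    votes = Counter(train_y[i] for i in nearest)
    return votes.most_common(1)[0][0]
```

For example, with two well-separated clusters labeled "benign" and "malignant" (hypothetical labels), a query near either cluster is assigned that cluster's label by the vote of its three nearest neighbors.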

               Experimentation
               A dataset of 210 images of brain tumors of various types was collected initially with the type of the tumor