
Implementation of Texture Based Segmentation in Wireless Capsule Endoscopy Images


Call for Papers Engineering Journal, May 2019

  Shanmuga Sundaram[1], Sowmiya K[2], Saranya M[3], Tamilselvan M[4], Navinkumar S[5]

    Assistant Professor, Knowledge Institute of Technology, Tamil Nadu, India1.

    U.G Student, Knowledge Institute of Technology, Tamil Nadu, India2,3,4,5.

    Abstract:- Wireless capsule endoscopy (WCE) has great advantages over traditional endoscopy because it is portable and easy to use, especially in remote health-monitoring services. In this work, segmentation is performed using the K-means clustering technique, an efficient approach for unsupervised segmentation of natural and textural images based on feature extraction. In an image, different kinds of regions can be identified and segmented by their texture. Colour intensity in the image provides distinctive pattern information about the image. Texture interpretation describes regions in terms of the texture of the image: a texture description uses properties of the image, such as smoothness and roughness, determined from the intensities of the pixels in that image. Texture segmentation is a powerful concept for the analysis of images that are rich in texture properties, so the pixels of the image are analysed with filters based on these properties in order to locate spots and blotches on the human body. The main aim of this paper is to identify the affected spot in the captured image by applying both the K-means clustering algorithm and Otsu's algorithm. The proposed system is implemented and simulated using MATLAB software.

    Keywords:- Wireless capsule endoscopy images (WCE), MATLAB software, texture segmentation, energy saving.

    I. INTRODUCTION

    The prototype capsule endoscope was developed at the Royal London Hospital in the UK by Professor Paul Swain. Modern endoscopic and capsule-endoscopic techniques have revolutionized the diagnosis and treatment of diseases of the upper GI tract, i.e. the esophagus, stomach, duodenum, and colon. The last remaining part of the digestive system is the small intestine, which has been a difficult organ in which to make diagnoses and perform treatment without surgery. The technique requires three main components: an ingestible capsule, a portable data recorder, and a workstation equipped with image-processing software.

    Fig. 1 Block diagram of Pill Cam

    Texture segmentation is one of the most challenging problems in image segmentation. The problem begins with the very definition of texture: the human eye can easily recognize different textures, but it is quite difficult to define them in mathematical terms. The basic idea of texture segmentation is the grouping or clustering of pixels based on properties such as contrast, homogeneity, entropy, energy, and/or maximum probability. Pixels that represent the same texture should be grouped together so that further analysis can be done on the various texture regions. However, this is not always straightforward, because there are many situations where two or more different textures have very similar properties, leading to incorrect segmentation results. Another challenge is to identify the boundaries between regions when the regions have similar textures. Texture segmentation can also be quite difficult when noise affects the differentiation between textures. This research aims to develop a robust texture-segmentation algorithm able to cope with these problems.

    Different approaches deal with the extraction of homogeneous features from textures. Among these texture descriptors, some are statistical and others transform-based in structure. Once the feature-extraction step is completed, a fast and robust algorithm is needed to perform the texture segmentation itself. To reach this objective, we develop an algorithm based on the K-means and Otsu models, which is used to derive the boundary of the regions to be segmented.

    This paper mainly concentrates on the software part suitable for detecting tumours, bleeding, and ulcers in WCE images. The approach exploits textural-feature-based selection: features that are robust to illumination change and that characterize the multi-resolution property of WCE images.

    1. PROPOSED METHODOLOGY

      1. IMAGE TEXTURE

        An image texture is a set of metrics calculated in image processing designed to quantify the perceived texture of an image. Image texture gives us information about the spatial arrangement of color or intensities in an image or selected region of an image.

        Image textures can be artificially created or found in natural scenes captured in an image. Texture is one cue that can be used to help in the segmentation or classification of images. To analyze an image texture in computer graphics, there are two ways to approach the issue: the structured approach and the statistical approach.

        1. CO-OCCURRENCE MATRICES

          A co-occurrence matrix captures numerical features of a texture using the spatial relations of similar gray tones. Numerical features computed from the co-occurrence matrix can be used to represent, compare, and classify textures. The following are a subset of standard features derivable from a normalized co-occurrence matrix:

          Energy = Σ_{i,j} p(i,j)^2
          Entropy = − Σ_{i,j} p(i,j) log p(i,j)
          Contrast = Σ_{i,j} (i − j)^2 p(i,j)
          Homogeneity = Σ_{i,j} p(i,j) / (1 + |i − j|)

          where p(i,j) is the (i,j)th entry in the normalized gray-tone spatial-dependence matrix, the sums run over i, j = 1, …, Ng, and Ng is the number of distinct gray levels in the quantized image.

          One negative aspect of the co-occurrence matrix is that the extracted features do not necessarily correspond to visual perception.
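          As an illustration of the features above, the following is a minimal NumPy sketch (not the paper's MATLAB implementation); the quantization to 8 gray tones and the offset (0, 1), meaning "one pixel to the right", are assumptions chosen for simplicity:

```python
import numpy as np

def glcm_features(img, levels=8, offset=(0, 1)):
    """Normalized gray-tone co-occurrence matrix and Haralick-style features."""
    # Quantize the 8-bit image to `levels` distinct gray tones.
    q = np.clip(np.floor(img.astype(float) / 256.0 * levels).astype(int),
                0, levels - 1)
    dr, dc = offset
    P = np.zeros((levels, levels))
    rows, cols = q.shape
    # Count co-occurring gray-tone pairs at the given spatial offset.
    for r in range(rows - dr):
        for c in range(cols - dc):
            P[q[r, c], q[r + dr, c + dc]] += 1
    P /= P.sum()                                   # normalize to probabilities
    i, j = np.indices(P.shape)
    nz = P > 0                                     # avoid log(0) in the entropy
    return {
        "energy":      np.sum(P ** 2),
        "entropy":     -np.sum(P[nz] * np.log2(P[nz])),
        "contrast":    np.sum((i - j) ** 2 * P),
        "homogeneity": np.sum(P / (1.0 + np.abs(i - j))),
        "max_prob":    P.max(),
    }

# Sanity check: a perfectly flat patch has maximal energy and zero contrast.
flat = np.full((16, 16), 128, dtype=np.uint8)
f = glcm_features(flat)
```

For a smooth region the energy approaches 1 and the contrast approaches 0, which is exactly the intuition behind using these features to separate smooth from rough texture regions.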

        2. EDGE DETECTION

          The use of edge detection to determine the number of edge pixels in a specified region helps characterize the texture complexity. After edges have been found, the direction of the edges can also be applied as a characteristic of texture and can be useful in determining patterns in the texture. These directions can be represented as an average or in a histogram.

          Consider a region with N pixels. A gradient-based edge detector is applied to this region, producing two outputs for each pixel p: the gradient magnitude Mag(p) and the gradient direction Dir(p). The edgeness per unit area can be defined by Fedgeness = |{p | Mag(p) > T}| / N for some threshold T.

          To include orientation with edgeness, we can use histograms for both gradient magnitude and gradient direction. Let Hmag(R) denote the normalized histogram of gradient magnitudes of region R, and let Hdir(R) denote the normalized histogram of gradient orientations of region R. Both are normalized according to the size NR of the region. Then

          Fmagdir = (Hmag(R), Hdir(R)) is a quantitative texture description of region R.
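          The edgeness measure and the two histograms can be sketched as follows (a NumPy illustration, not the paper's implementation; the threshold of 10 and the 8 histogram bins are assumptions):

```python
import numpy as np

def edgeness_features(region, threshold=10.0, n_bins=8):
    """Edgeness per unit area plus normalized magnitude/direction histograms."""
    g = region.astype(float)
    gy, gx = np.gradient(g)               # simple gradient-based edge detector
    mag = np.hypot(gx, gy)                # Mag(p)
    direction = np.arctan2(gy, gx)        # Dir(p), in radians
    n = g.size                            # N pixels in the region
    # Fedgeness = |{p : Mag(p) > T}| / N
    f_edgeness = np.count_nonzero(mag > threshold) / n
    # H_mag(R) and H_dir(R), normalized by the region size N_R.
    h_mag, _ = np.histogram(mag, bins=n_bins)
    h_dir, _ = np.histogram(direction, bins=n_bins, range=(-np.pi, np.pi))
    return f_edgeness, h_mag / n, h_dir / n

# A vertical step edge: the two columns straddling the step exceed the threshold.
step = np.zeros((16, 16))
step[:, 8:] = 255.0
fe, hmag, hdir = edgeness_features(step, threshold=10.0)
```

Here 32 of the 256 pixels sit on the step, so the edgeness per unit area is 0.125; both histograms sum to 1 because they are normalized by the region size.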

        3. REGION DETECTION

          Attempts to group or cluster together pixels with similar texture properties.

        4. BOUNDARY DETECTION

        Attempts to delineate regions by finding edges between pixels that come from different texture distributions.

      2. K-MEANS CLUSTERING

        K-means clustering is a method of vector quantization, originally from signal processing, that is popular for cluster analysis in data mining. K-means clustering aims to partition n observations into k clusters in which each observation belongs to the cluster with the nearest mean, serving as a prototype of the cluster. This results in a partitioning of the data space into Voronoi cells.

        The problem is computationally difficult (NP-hard); however, there are efficient heuristic algorithms that are commonly employed and converge quickly to a local optimum. These are usually similar to the expectation-maximization algorithm for mixtures of Gaussian distributions: both employ an iterative refinement approach, and both use cluster centers to model the data. However, k-means clustering tends to find clusters of comparable spatial extent, while the expectation-maximization mechanism allows clusters to have different shapes.
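        The iterative refinement just described can be sketched in a few lines of NumPy (an illustration only; the paper's experiments use MATLAB's kmeans). The farthest-point initialization is an assumption, a simple stand-in for k-means++:

```python
import numpy as np

def kmeans(X, k, n_iter=100):
    """Lloyd's algorithm: iterative refinement toward a local optimum."""
    # Farthest-point initialization: spread the initial prototypes apart.
    centers = [X[0]]
    for _ in range(1, k):
        dist = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
        centers.append(X[dist.argmax()])
    centers = np.array(centers)
    for _ in range(n_iter):
        # Assignment step: each observation joins the cluster with the nearest mean.
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Update step: each center becomes the mean (prototype) of its cluster.
        new_centers = np.array([X[labels == c].mean(axis=0) for c in range(k)])
        if np.allclose(new_centers, centers):
            break                         # converged to a local optimum
        centers = new_centers
    return labels, centers

# Two well-separated blobs of feature vectors are recovered as two clusters.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.1, (50, 2)),
               rng.normal(5.0, 0.1, (50, 2))])
labels, centers = kmeans(X, k=2)
```

In the segmentation pipeline, X would be the per-pixel feature vectors (e.g. the a*b* colour channels), and the returned labels partition the image into texture regions.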

      3. OTSU'S METHOD

        In image processing, Otsu's method is used to automatically perform clustering-based image thresholding, i.e. the reduction of a gray-level image to a binary image. The algorithm assumes that the image contains two classes of pixels following a bi-modal histogram (foreground pixels and background pixels); it then calculates the optimum threshold separating the two classes so that their combined spread (intra-class variance) is minimal. The extension of the original method to multi-level thresholding is referred to as the multi-Otsu method.
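        Minimizing the intra-class variance is equivalent to maximizing the between-class variance, which gives a compact vectorized form. The following NumPy sketch illustrates the idea (it is not the paper's MATLAB listing, which follows the per-threshold loop form):

```python
import numpy as np

def otsu_threshold(gray):
    """Return the gray level maximizing between-class variance (Otsu's method)."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    p = hist / hist.sum()                  # normalized histogram
    omega = np.cumsum(p)                   # class-0 probability up to level t
    mu = np.cumsum(np.arange(256) * p)     # cumulative mean up to level t
    mu_t = mu[-1]                          # global mean gray level
    # Between-class variance for every candidate threshold t.
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b2 = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    sigma_b2 = np.nan_to_num(sigma_b2)     # degenerate splits contribute nothing
    return int(np.argmax(sigma_b2))

# A bi-modal image: dark background (level 40) and bright foreground (level 200).
img = np.concatenate([np.full(500, 40), np.full(500, 200)]).astype(np.uint8)
t = otsu_threshold(img)
bw = img > t                               # binary image
```

Any threshold between the two modes is optimal here, so the returned t falls in [40, 200) and the binarization separates foreground from background exactly.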

      4. APHELION SOFTWARE

        The Aphelion Imaging Software Suite products provide a broad range of computer-vision tools for 2D and 3D images, including hundreds of image-processing functions, a state-of-the-art morphology library, and recognition classifiers, all available from a user-friendly interface or as native libraries and .Net® components. Powerful macro interpreters aid in rapid development and deployment of imaging applications dealing with a single image or a large series of still or live images. The Aphelion Imaging Software Suite products are fully compatible with the latest versions of the Windows® operating system (including Windows® 7).

    2. SIMULATION RESULTS AND PERFORMANCE COMPARISON

      The simulation of the wireless capsule images is based on comparing both the K-means clustering and Otsu's algorithms.

        A. K-means clustering algorithm

      he = imread('image.png');
      imshow(he), title('H&E image');
      cform = makecform('srgb2lab');
      lab_he = applycform(he, cform);
      ab = double(lab_he(:,:,2:3));
      nrows = size(ab, 1);
      ncols = size(ab, 2);
      ab = reshape(ab, nrows*ncols, 2);
      nColors = 3;
      % repeat the clustering 3 times to avoid local minima
      [cluster_idx, cluster_center] = kmeans(ab, nColors, 'distance', 'sqEuclidean', ...
          'Replicates', 3);
      pixel_labels = reshape(cluster_idx, nrows, ncols);
      imshow(pixel_labels, []), title('image labeled by cluster index');
      segmented_images = cell(1, 3);
      rgb_label = repmat(pixel_labels, [1 1 3]);
      for k = 1:nColors
          color = he;
          color(rgb_label ~= k) = 0;
          segmented_images{k} = color;
      end
      imshow(segmented_images{1}), title('objects in cluster 1');
      mean_cluster_value = mean(cluster_center, 2);
      [tmp, idx] = sort(mean_cluster_value);
      blue_cluster_num = idx(1);
      L = lab_he(:,:,1);
      blue_idx = find(pixel_labels == blue_cluster_num);
      L_blue = L(blue_idx);
      is_light_blue = im2bw(L_blue, graythresh(L_blue));
      nuclei_labels = repmat(uint8(0), [nrows ncols]);
      nuclei_labels(blue_idx(is_light_blue == false)) = 1;
      nuclei_labels = repmat(nuclei_labels, [1 1 3]);
      blue_nuclei = he;
      blue_nuclei(nuclei_labels ~= 1) = 0;
      imshow(blue_nuclei), title('tumor detection');

      B. Otsu's method

      clear; close all;
      warning off;
      I = imread('11.jpg');
      if size(I, 3) == 3
          I_gray = rgb2gray(I);
      else
          I_gray = I;
      end
      figure, imshow(I_gray);
      I_double = double(I_gray);
      [wid, len] = size(I_gray);
      colorlevel = 256;
      histCounts = zeros(colorlevel, 1);   % renamed from "hist" to avoid shadowing the built-in
      for i = 1:wid
          for j = 1:len
              m = double(I_gray(i, j)) + 1;   % double() avoids uint8 saturation at level 255
              histCounts(m) = histCounts(m) + 1;
          end
      end
      histCounts = histCounts / (wid*len);
      % global mean gray level
      miuT = 0;
      for m = 1:colorlevel
          miuT = miuT + (m-1)*histCounts(m);
      end
      % search for the threshold maximizing the between-class variance
      xigmaB2 = 0;
      xigma = zeros(colorlevel, 1);
      finalT = 0;
      for mindex = 1:colorlevel
          threshold = mindex - 1;
          omega1 = 0;
          for m = 1:threshold-1
              omega1 = omega1 + histCounts(m);
          end
          omega2 = 1 - omega1;
          if omega1 == 0 || omega2 == 0
              continue;   % skip degenerate splits to avoid division by zero
          end
          miu1 = 0;
          miu2 = 0;
          for m = 1:colorlevel
              if m < threshold
                  miu1 = miu1 + (m-1)*histCounts(m);
              else
                  miu2 = miu2 + (m-1)*histCounts(m);
              end
          end
          miu1 = miu1/omega1;
          miu2 = miu2/omega2;
          xigmaB21 = omega1*(miu1 - miuT)^2 + omega2*(miu2 - miuT)^2;
          xigma(mindex) = xigmaB21;
          if xigmaB21 > xigmaB2
              finalT = threshold;
              xigmaB2 = xigmaB21;
          end
      end
      % fT = finalT/255
      % T = graythresh(I_gray)   % MATLAB built-in, for comparison
      bin = zeros(wid, len);
      for i = 1:wid
          for j = 1:len
              if I_double(i, j) > finalT
                  bin(i, j) = 1;
              else
                  bin(i, j) = 0;
              end
          end
      end
      figure, imshow(bin);
      figure, plot(1:colorlevel, xigma);

      Fig. 2 shows the input captured wireless capsule endoscopy image; Fig. 3 shows the segmentation result of the K-means clustering technique, and Fig. 4 that of Otsu's method. By comparing both algorithms, the better segmentation result can be selected.

      Fig. 2 Input image

      Fig. 3 Output for K-means clustering algorithm

      Fig. 4 Output for Otsu's algorithm

    3. CONCLUSION

      This paper proposed a method for segmenting the affected region from CE images. The segmentation algorithms draw on statistical image-processing techniques, and the segmentation results were assessed through experimental evaluations using both qualitative and quantitative measurements. The current segmentation results fit applications such as image enhancement, in which a low over-segmentation rate is not strictly required; it is expected that the over-segmentation rate can be reduced further. WCE data can be efficiently managed by solving the multi-objective optimization problem based on frame importance. By comparing the two types of segmentation algorithm, different summarization objectives, such as minimum summary length and maximum information coverage, can be accomplished according to the requirements of gastroenterologists.

