International Journal of Scientific & Technology Research

Volume 9 - Issue 4, April 2020 Edition

Website: http://www.ijstr.org

ISSN 2277-8616



Classification of Medical Image Data Using k-Nearest Neighbor and Finding the Optimal k Value


AUTHOR(S)

Preeti Nair, Indu Kashyap

KEYWORDS

Data mining, k-NN, image classification, optimal k value, feature extraction.

ABSTRACT

Rapid developments in data mining technologies have spurred the growth of real-time applications. In general, clustering and classification are the two principal tasks for uncovering hidden patterns in data. Among classification algorithms, k-Nearest Neighbor (k-NN) is one of the most widely employed and serves well for general classification purposes. This paper proposes a method for finding the optimal k value when applying k-NN to medical data. Initially, a set of medical images is collected from a public repository. The collected images contain irrelevant noise and exhibit low contrast and degraded quality. Gaussian filtering is employed to reduce the noise and enhance image quality. The contrast-enhanced images are then segmented using morphological operations. Feature extraction plays a vital role in selecting the optimal k value: using the Gray Level Co-occurrence Matrix (GLCM), the relevant features are estimated and selected. The selected features are given as input to the k-NN algorithm, and the optimal k value is obtained based on classification accuracy. Across the candidate data folds, the achieved optimal k value demonstrates the effectiveness of the research objectives.
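
The pipeline described in the abstract (Gaussian denoising, morphological segmentation, GLCM feature extraction, and a cross-validated search for the best k) can be sketched in a few lines of Python. The following is a minimal illustration using scipy, scikit-image, and scikit-learn, assuming 8-bit grayscale images and integer class labels; the filter sigma, GLCM distances and angles, candidate k range, and fold count are illustrative placeholders rather than the paper's actual settings, and the morphological segmentation step is omitted for brevity.

import numpy as np
from scipy.ndimage import gaussian_filter
from skimage.feature import graycomatrix, graycoprops
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def extract_glcm_features(image, sigma=1.0):
    # Reduce noise with a Gaussian filter (the abstract's preprocessing step).
    smoothed = gaussian_filter(image, sigma=sigma)
    # The GLCM expects an integer-valued image; assume 8-bit grayscale input.
    quantized = np.clip(smoothed, 0, 255).astype(np.uint8)
    # Co-occurrence matrix at distance 1 for horizontal and vertical pixel pairs.
    glcm = graycomatrix(quantized, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    # Standard Haralick-style texture descriptors derived from the GLCM.
    props = ("contrast", "homogeneity", "energy", "correlation")
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

def find_optimal_k(X, y, k_candidates=range(1, 31, 2), folds=5):
    # Score each candidate k by mean cross-validated accuracy; keep the best.
    scores = {k: cross_val_score(KNeighborsClassifier(n_neighbors=k),
                                 X, y, cv=folds).mean()
              for k in k_candidates}
    best_k = max(scores, key=scores.get)
    return best_k, scores

Feature vectors for a collection would then be stacked, e.g. X = np.vstack([extract_glcm_features(img) for img in images]), and find_optimal_k(X, labels) returns the accuracy-maximizing k along with the per-k scores. Odd candidate values of k are used here to avoid ties in binary classification.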

