International Journal of Scientific & Technology Research

IJSTR >> Volume 8 - Issue 5, May 2019 Edition




Website: http://www.ijstr.org

ISSN 2277-8616



Comparative Analysis Of Data Mining Using The Rough Set Method With K-Means Method


 

AUTHOR(S)

Marnis Nasution, Deci Irmayani, Ronal Watrianthos, Sudi Suryadi, Ibnu Rasyid Munthe

 

KEYWORDS

Data Mining, Rough Set, K-Means

 

ABSTRACT

The purpose of this article is to compare two data mining methods, rough sets and k-means, both of which are used for clustering. Data mining is a method for extracting knowledge from large collections of data that have so far only been archived, while clustering is a method for grouping data by tendency; both the rough set method and k-means are used to find tendencies in data and to group it accordingly. Each method has its own advantages depending on the needs, so it is important to know the advantages of each before deciding which one to use.
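As a minimal illustration of the two approaches being compared (a sketch, not code from the paper itself): k-means partitions points by repeatedly assigning each point to its nearest centroid and moving each centroid to its cluster mean, while rough set theory describes a target concept through the lower and upper approximations induced by an indiscernibility partition. The data values, equivalence classes, and target set below are hypothetical.

```python
from collections import defaultdict

# --- K-means (1-D, pure Python sketch): assign each point to the
# nearest centroid, then move each centroid to its cluster mean.
# Simplification: a centroid that attracts no points is dropped.
def kmeans(points, centroids, iters=10):
    for _ in range(iters):
        clusters = defaultdict(list)
        for p in points:
            nearest = min(centroids, key=lambda c: abs(p - c))
            clusters[nearest].append(p)
        centroids = [sum(ps) / len(ps) for ps in clusters.values()]
    return sorted(centroids)

# --- Rough set approximations: the lower approximation unions the
# equivalence classes fully contained in the target set; the upper
# approximation unions every class that intersects it at all.
def approximations(classes, target):
    lower = set().union(*(c for c in classes if c <= target))
    upper = set().union(*(c for c in classes if c & target))
    return lower, upper

points = [1.0, 1.2, 0.8, 8.0, 8.4, 7.6]
print(kmeans(points, centroids=[0.0, 10.0]))   # -> [1.0, 8.0]

classes = [{1, 2}, {3, 4}, {5}]   # hypothetical equivalence classes
target = {1, 2, 3}                # hypothetical target concept
print(approximations(classes, target))  # -> ({1, 2}, {1, 2, 3, 4})
```

The contrast visible here matches the comparison above: k-means produces crisp clusters (every point belongs to exactly one group), whereas rough sets tolerate vagueness by bounding the concept between a lower and an upper approximation.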

 

REFERENCES

[1]. M. Hossein and F. Zarandi, “Application of Rough Set Theory in Data Mining for Decision Support Systems (DSSs),” J. Ind. Eng., vol. 1, pp. 25–34, 2008.

[2]. W. Li, L. Ma, and D. Wei, “Data Mining Based on Rough Sets in Risk Decision-Making: Foundation and Application,” WSEAS Trans. Comput., vol. 9, no. 2, pp. 113–123, 2010.

[3]. M. Sudha, “Comparative Analysis between Rough Set Theory and Data Mining Algorithms on Their Prediction,” vol. 13, no. 7, pp. 3249–3260, 2017.

[4]. M. Jindal and N. Kharb, “K-means Clustering Technique on Search Engine Dataset using Data Mining Tool,” vol. 3, no. 6, pp. 505–510, 2013.

[5]. M. Nasution, “Implementasi Data Mining K-Means untuk Mengukur Kemampuan Logika Mahasiswa (Studi Kasus: AMIK Labuhan Batu)” [Implementation of K-Means Data Mining to Measure Students’ Logical Ability (Case Study: AMIK Labuhan Batu)], Informatika, vol. 5, no. 1, 2017.