A Classifier Based on K-Nearest Neighbors Using Weighted Summation of Reconstruction Errors

Article Type: Electrical Engineering Paper

Authors

1 Assistant Professor, Machine Learning and Deep Learning Laboratory, Faculty of Engineering Modern Technologies, Amol University of Special Modern Technologies, Amol, Iran

2 Assistant Professor, Faculty of Engineering Modern Technologies, Amol University of Special Modern Technologies, Amol, Iran

Abstract

In this paper, a classification method based on the K-nearest-neighbors (KNN) classifier and the reconstruction error is introduced. In the proposed method, the K nearest data points (neighbors) to the test sample are first found within each class of the training data. The test sample is then reconstructed from different numbers of nearest neighbors (from one to K) in each class, and the reconstruction error is computed separately for each number of neighbors. Next, for each class, a total error is computed as a weighted sum of the errors of all reconstructions. The weight of each reconstruction error is proportional to the number of neighbors involved in it; that is, each reconstruction error is multiplied by its number of neighbors. Finally, the test sample is assigned to the class with the smallest total error. In this way, a combination of KNN-based classifiers contributes synergistically to the classification. Ten datasets from the UCR time-series archive and five datasets from the UCI classification repository are used to evaluate the proposed method. The results show that the proposed method substantially improves on KNN classifiers based on the minimum reconstruction error, raising the recognition rate by about 5% for some values of K and by about 1.6% on average over all K from 2 to 15.


Article Title [English]

A Classifier Based on K-Nearest Neighbors Using Weighted Summation of Reconstruction Errors

Authors [English]

  • Rassoul Hajizadeh 1
  • Mohammad Ali Hosseinzadeh 2
1 Machine Learning and Deep Learning Research Laboratory, Faculty of Engineering Modern Technologies, Amol University of Special Modern Technologies, Amol, Iran
2 Faculty of Engineering Modern Technologies, Amol University of Special Modern Technologies, Amol, Iran
Abstract [English]

In this paper, a classifier is introduced based on the K-nearest-neighbors classifier and the reconstruction error for data classification. In the proposed method, the K nearest data points (neighbors) to the test sample are first found within each class of the training data. Then, the test sample is reconstructed from different numbers of nearest neighbors (from one to K) in each class, and the reconstruction error is calculated separately for each number of neighbors. In the next step, for each class, a total error is calculated as the weighted sum of the errors obtained from all reconstructions. The weight of each reconstruction error is proportional to the number of neighbors involved in it; that is, each reconstruction error is multiplied by its number of neighbors. Finally, the test sample is assigned to the class with the lowest total error. This procedure allows a combination of K-nearest-neighbor classifiers to contribute synergistically to the classification. In this paper, 10 datasets from the UCR time-series archive and five datasets from the UCI classification repository are used to evaluate the proposed method. The results of these evaluations show that the proposed method significantly improves the performance of KNN classifiers based on the minimum reconstruction error, achieving a recognition rate approximately 5% higher for some values of K and about 1.6% higher on average over all K values (from 2 to 15).
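The decision rule described in the abstract can be sketched in Python. The abstract does not specify the reconstruction scheme in detail, so this sketch assumes an LLE-style affine least-squares reconstruction (coefficients summing to one); the function names and the regularization constant are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def affine_reconstruct(x, N, reg=1e-6):
    """Reconstruct x from the rows of N with coefficients summing to one
    (LLE-style constrained least squares; reg is an assumed regularizer)."""
    D = N - x                            # neighbors shifted to the test point, (m, d)
    G = D @ D.T + reg * np.eye(len(N))   # regularized local Gram matrix, (m, m)
    a = np.linalg.solve(G, np.ones(len(N)))
    a /= a.sum()                         # enforce sum(a) == 1
    return a @ N

def wsre_knn_predict(X_train, y_train, x_test, K=5):
    """Assign x_test to the class minimizing the weighted sum of
    reconstruction errors over m = 1..K nearest neighbors (weight = m)."""
    best_class, best_total = None, np.inf
    for c in np.unique(y_train):
        Xc = X_train[y_train == c]
        # K nearest neighbors of x_test within class c
        order = np.argsort(np.linalg.norm(Xc - x_test, axis=1))
        neighbours = Xc[order[:K]]
        total = 0.0
        for m in range(1, len(neighbours) + 1):
            recon = affine_reconstruct(x_test, neighbours[:m])
            total += m * np.linalg.norm(x_test - recon)  # weight = m neighbors
        if total < best_total:
            best_class, best_total = c, total
    return best_class
```

Note that for m = 1 the rule reduces to the distance to the nearest neighbor in each class, so the weighted sum blends the behavior of all KNN sub-classifiers from 1-NN up to K-NN, as the abstract describes.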

Keywords [English]

  • Classifier
  • Recognition rate
  • K-nearest neighbors
  • Linear reconstruction
  • Weighted combination