AEkNN: An AutoEncoder kNN-Based Classifier With Built-in Dimensionality Reduction

Title: AEkNN: An AutoEncoder kNN-Based Classifier With Built-in Dimensionality Reduction
Publication Type: Journal Article
Year of Publication: 2018
Authors: Pulgar-Rubio F., Charte F., Rivera-Rivas A.J., and del Jesus M.J.
Journal: International Journal of Computational Intelligence Systems
Volume: 12
Pagination: 436-452
Date Published: 11/2018
ISSN: 1875-6883
Abstract

High dimensionality tends to be a challenge for most machine learning tasks, including classification. There are different classification methodologies, of which instance-based learning is one. One of the best known members of this family is the k-nearest neighbors (kNN) algorithm. Its strategy relies on searching a set of nearest instances. In high-dimensional spaces, the distances between examples lose significance. Therefore, kNN, in the same way as many other classifiers, tends to worsen its performance as the number of input variables grows. In this study, AEkNN, a new kNN-based algorithm with built-in dimensionality reduction, is presented. Aiming to obtain a new representation of the data with lower dimensionality but more informative features, AEkNN internally uses autoencoders. From this new feature vector, the computed distances should be more significant, thus providing a way to choose better neighbors. An experimental evaluation of the new proposal is conducted, analyzing several configurations and comparing them against the original kNN algorithm and classical dimensionality reduction methods. The obtained conclusions demonstrate that AEkNN offers better results in both predictive and runtime performance.
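The abstract describes the general idea (learn a compact representation with an autoencoder, then run kNN on the encoded features) but not the reference implementation. The following is a minimal sketch of that idea only, not the authors' code: it uses scikit-learn's MLPRegressor trained to reconstruct its input as a stand-in autoencoder, and the code dimensionality, hidden-layer size, and k are illustrative assumptions.

```python
# Sketch of the AEkNN idea: (1) fit an autoencoder on the training inputs,
# (2) project training and test data through the encoder, (3) classify with
# kNN in the reduced space. Hyperparameters here are assumptions, not the
# configurations evaluated in the paper.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import MinMaxScaler
from sklearn.neural_network import MLPRegressor
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

X, y = load_digits(return_X_y=True)
X = MinMaxScaler().fit_transform(X)          # scale features to [0, 1]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Stand-in autoencoder: reconstruct the input through a narrow hidden layer.
code_dim = 16                                # assumed reduced dimensionality
ae = MLPRegressor(hidden_layer_sizes=(code_dim,), activation="relu",
                  max_iter=2000, random_state=0)
ae.fit(X_tr, X_tr)                           # target = input (reconstruction)

def encode(data):
    """Hidden-layer activations = learned low-dimensional representation."""
    return np.maximum(0, data @ ae.coefs_[0] + ae.intercepts_[0])

# kNN on the encoded features instead of the raw high-dimensional inputs.
knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(encode(X_tr), y_tr)
print("accuracy:", accuracy_score(y_te, knn.predict(encode(X_te))))
```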

Notes

TIN2015-68854-R; FPU16/00324

URL: https://www.atlantis-press.com/article/125905686
DOI: 10.2991/ijcis.2019.0025