| Title | AEkNN: An AutoEncoder kNN-Based Classifier With Built-in Dimensionality Reduction |
|---|---|
| Publication Type | Journal Article |
| Year of Publication | 2018 |
| Authors | Pulgar-Rubio F., Charte F., Rivera-Rivas A.J., and del Jesus M.J. |
| Journal | International Journal of Computational Intelligence Systems |
High dimensionality is a challenge for most machine learning tasks, including classification. Among the different classification methodologies, instance-based learning is one of the best known, and the k-nearest neighbors (kNN) algorithm is one of its most prominent members. Its strategy relies on searching for a set of nearest instances. In high-dimensional spaces, however, distances between examples lose significance, so kNN, like many other classifiers, tends to perform worse as the number of input variables grows. In this study, AEkNN, a new kNN-based algorithm with built-in dimensionality reduction, is presented. AEkNN internally uses autoencoders to obtain a new representation of the data with lower dimensionality but more informative features. Distances computed from this new feature vector should be more significant, thus providing a way to choose better neighbors. An experimental evaluation of the new proposal is conducted, analyzing several configurations and comparing them against the original kNN algorithm and classical dimensionality reduction methods. The conclusions obtained demonstrate that AEkNN offers better predictive and runtime performance.
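The core idea described in the abstract, training an autoencoder, using its hidden-layer activations as a lower-dimensional representation, and running kNN on that representation, can be sketched as follows. This is a minimal illustrative approximation, not the authors' implementation: it assumes a single-hidden-layer autoencoder built with scikit-learn's `MLPRegressor` (trained to reconstruct its input), with the encoder applied manually via the fitted weights, and the digits dataset as a stand-in for the paper's benchmarks.

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import MinMaxScaler

# Toy high-dimensional data (64 input features).
X, y = load_digits(return_X_y=True)
X = MinMaxScaler().fit_transform(X)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A one-hidden-layer autoencoder: the network learns to reconstruct
# its own input, forcing a 32-dimensional bottleneck representation.
ae = MLPRegressor(hidden_layer_sizes=(32,), activation="relu",
                  max_iter=1000, random_state=0)
ae.fit(X_train, X_train)

def encode(X):
    # Hidden-layer (ReLU) activations serve as the reduced feature vector.
    return np.maximum(0.0, X @ ae.coefs_[0] + ae.intercepts_[0])

Z_train, Z_test = encode(X_train), encode(X_test)

# kNN on the learned low-dimensional representation (the AEkNN idea) ...
acc_ae = KNeighborsClassifier(n_neighbors=5).fit(Z_train, y_train).score(Z_test, y_test)
# ... versus plain kNN on the raw 64-dimensional input.
acc_raw = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train).score(X_test, y_test)
print(f"AE+kNN accuracy: {acc_ae:.3f}, raw kNN accuracy: {acc_raw:.3f}")
```

Whether the reduced representation actually improves accuracy depends on the data and the bottleneck size; the paper's contribution is the systematic evaluation of such configurations.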