Dimensionality Reduction Methods: Comparative Analysis of methods PCA, PPCA and KPCA

Authors

  • Jorge Arroyo-Hernández, Escuela de Matemática, Universidad Nacional, Heredia, Costa Rica

DOI:

https://doi.org/10.15359/ru.30-1.7

Keywords:

Dimensionality Reduction, Point Clouds, Preimage Problem

Abstract

Dimensionality reduction methods are algorithms that map a data set into lower-dimensional subspaces derived from the original space, allowing the data to be described at a lower cost. Because of their importance, they are widely used in machine learning processes. This article presents a comparative analysis of the PCA, PPCA, and KPCA dimensionality reduction methods. A reconstruction experiment on worm-shaped data was performed using structures of landmarks located along the body contour, with each method retaining different numbers of principal components. The results showed that all three methods can be viewed as alternative processes. Nevertheless, thanks to its potential for analysis in the feature space and the preimage computation method presented, KPCA offers a better approach for recognition processes and pattern extraction.
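The comparison described in the abstract can be illustrated with a short numerical sketch. The code below is not the code used in the paper; it is a minimal example, assuming scikit-learn and NumPy, that reconstructs synthetic landmark-style contours with PCA and kernel PCA for several numbers of retained components, using KernelPCA's fit_inverse_transform option as a stand-in for the preimage computation discussed in the article. The data generator, kernel choice, and gamma value are illustrative assumptions, and PPCA is omitted because scikit-learn has no dedicated estimator for it.

```python
# Minimal sketch (not the authors' code): compare PCA and kernel PCA
# reconstruction error on synthetic landmark-style contour data.
import numpy as np
from sklearn.decomposition import PCA, KernelPCA

rng = np.random.default_rng(0)

# Synthetic "contour" samples: 20 landmarks along a noisy sine-like curve,
# flattened to 40-dimensional vectors (all x-coordinates, then all y-coordinates).
n_samples, n_landmarks = 200, 20
t = np.linspace(0, 2 * np.pi, n_landmarks)
amplitude = rng.uniform(0.5, 1.5, size=(n_samples, 1))
phase = rng.uniform(0, np.pi, size=(n_samples, 1))
x = np.tile(t, (n_samples, 1))
y = amplitude * np.sin(x + phase) + 0.05 * rng.standard_normal((n_samples, n_landmarks))
X = np.hstack([x, y])                                  # shape (200, 40)

mse = lambda X_rec: np.mean((X - X_rec) ** 2)          # reconstruction error

for k in (2, 5, 10):                                   # retained components
    pca = PCA(n_components=k).fit(X)
    X_pca = pca.inverse_transform(pca.transform(X))

    # fit_inverse_transform=True learns an approximate preimage map (gamma is assumed).
    kpca = KernelPCA(n_components=k, kernel="rbf", gamma=0.05,
                     fit_inverse_transform=True).fit(X)
    X_kpca = kpca.inverse_transform(kpca.transform(X))

    print(f"k={k:2d}  PCA MSE={mse(X_pca):.4f}  KPCA MSE={mse(X_kpca):.4f}")
```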

References

Amini, A. A., Chen, Y., Elayyadi, M., & Radeva, P. (2001). Tag surface reconstruction and tracking of myocardial beads from SPAMM-MRI with parametric B-spline surfaces. IEEE Transactions on Medical Imaging, 20(2), 94-103. https://doi.org/10.1109/42.913176

Arroyo, J., & Alvarado, J. (2014). A new variant of Conformal Map Approach method for computing the preimage in Input Space. Recent Advances in Computer Engineering, Communications and Information Technology, 301-304. Retrieved from http://www.wseas.us/e-library/conferences/2014/Tenerife/INFORM/INFORM-00.pdf

Honeine, P., & Richard, C. (March 2011). Preimage problem in kernel-based machine learning. IEEE Signal Processing Magazine, 28(2), 77-88. Retrieved from http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=5714388&isnumber=5714377

Lee, J., & Verleysen, M. (2007). Nonlinear Dimensionality Reduction. Springer Science+Business Media, United States. https://doi.org/10.1007/978-0-387-39351-3

Shlens, J. (2005). A tutorial on principal component analysis. Systems Neurobiology Laboratory, Salk Institute for Biological Studies. Retrieved from http://arxiv.org/pdf/1404.1100v1.pdf

Schölkopf, B., Smola, A., & Müller, K. (1999). Kernel principal component analysis. Advances in Kernel Methods - Support Vector Learning, 327-352. Retrieved from http://pca.narod.ru/scholkopf_kernel.pdf

Tipping, M. E., & Bishop, C. M. (1999). Probabilistic principal component analysis. Journal of the Royal Statistical Society, Series B, 61(3), 611-622. https://doi.org/10.1111/1467-9868.00196

Van der Maaten, L., Postma, E., & Van den Herik, H. (2009). Dimensionality reduction: A comparative review. Technical Report TiCC TR. Retrieved from http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.125.6716&rep=rep1&type=pdf

Published

2016-01-01

How to Cite

Dimensionality Reduction Methods: Comparative Analysis of methods PCA, PPCA and KPCA. (2016). Uniciencia, 30(1), 115-122. https://doi.org/10.15359/ru.30-1.7

Issue

Section

Original scientific papers (evaluated by academic peers)
