2012
Garcia-Moreno, Pablo; Artés-Rodríguez, Antonio; Hansen, Lars Kai
A Hold-out Method to Correct PCA Variance Inflation (Proceedings Article)
In: 2012 3rd International Workshop on Cognitive Information Processing (CIP), pp. 1–6, IEEE, Baiona, 2012, ISBN: 978-1-4673-1878-5.
@inproceedings{Garcia-Moreno2012,
title = {A Hold-out Method to Correct PCA Variance Inflation},
author = {Pablo Garcia-Moreno and Antonio Art\'{e}s-Rodr\'{i}guez and Lars Kai Hansen},
url = {http://ieeexplore.ieee.org/lpdocs/epic03/wrapper.htm?arnumber=6232926},
isbn = {978-1-4673-1878-5},
year = {2012},
date = {2012-01-01},
booktitle = {2012 3rd International Workshop on Cognitive Information Processing (CIP)},
pages = {1--6},
publisher = {IEEE},
address = {Baiona},
abstract = {In this paper we analyze the problem of variance inflation experienced by the PCA algorithm when working in an ill-posed scenario where the dimensionality of the training set is larger than its sample size. In an earlier article a correction method based on a Leave-One-Out (LOO) procedure was introduced. We propose a Hold-out procedure whose computational cost is lower and, unlike the LOO method, the number of SVDs does not scale with the sample size. We analyze its properties from a theoretical and empirical point of view. Finally we apply it to a real classification scenario.},
keywords = {Approximation methods, classification scenario, computational complexity, computational cost, Computational efficiency, correction method, hold-out method, hold-out procedure, leave-one-out procedure, LOO method, LOO procedure, Mathematical model, PCA algorithm, PCA variance inflation, Principal component analysis, singular value decomposition, Standards, SVD, Training},
pubstate = {published},
tppubtype = {inproceedings}
}