
CLC number: TP391.4

On-line Access: 2012-12-09

Received: 2012-06-11

Revision Accepted: 2012-11-12

Crosschecked: 2012-11-12



Journal of Zhejiang University SCIENCE C 2012 Vol.13 No.12 P.901-908


Learning robust principal components from L1-norm maximization

Author(s):  Ding-cheng Feng, Feng Chen, Wen-li Xu

Affiliation(s):  Tsinghua National Laboratory for Information Science and Technology, Tsinghua University, Beijing 100084, China

Corresponding email(s):   fdc08@mails.tsinghua.edu.cn, chenfeng@tsinghua.edu.cn, xuwl@tsinghua.edu.cn

Key Words:  Principal component analysis (PCA), Outliers, L1-norm, Greedy algorithms, Non-greedy algorithms

Ding-cheng Feng, Feng Chen, Wen-li Xu. Learning robust principal components from L1-norm maximization[J]. Journal of Zhejiang University Science C, 2012, 13(12): 901-908.

@article{feng2012robust,
title="Learning robust principal components from L1-norm maximization",
author="Ding-cheng Feng and Feng Chen and Wen-li Xu",
journal="Journal of Zhejiang University Science C",
publisher="Zhejiang University Press \& Springer",
volume="13",
number="12",
pages="901--908",
year="2012",
issn="1869-1951",
doi="10.1631/jzus.C1200180"
}

Abstract:  Principal component analysis (PCA) is fundamental in many pattern recognition applications. Because conventional L2-norm based PCA (L2-PCA) is sensitive to outliers, much research has turned to L1-norm based reconstruction error minimization (L1-PCA-REM). Recently, a variance maximization formulation of PCA with the L1-norm (L1-PCA-VM) has been proposed, for which both greedy and non-greedy solutions have been developed. Armed with the gradient ascent perspective for optimization, we show that the L1-PCA-VM formulation is problematic in learning principal components and that only the greedy solution achieves the intended robustness; these findings are verified by experiments on synthetic and real-world datasets.
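The greedy solution mentioned in the abstract (L1-PCA by variance maximization in the style of Kwak, 2008) extracts one component at a time with a simple fixed-point iteration. The sketch below is an illustrative implementation under our own assumptions; the function name, deflation scheme, and default parameters are not taken from the paper.

```python
import numpy as np

def pca_l1_greedy(X, n_components=1, max_iter=200, tol=1e-8, seed=0):
    """Greedy L1-norm variance maximization, one component at a time.

    X is (n_samples, n_features) and assumed already centered.
    Each component maximizes sum_i |x_i . w| subject to ||w|| = 1
    via the fixed-point update w <- normalize(X.T @ sign(X @ w));
    the data are then deflated before extracting the next component.
    """
    rng = np.random.default_rng(seed)
    X = X.astype(float).copy()
    components = []
    for _ in range(n_components):
        w = rng.standard_normal(X.shape[1])
        w /= np.linalg.norm(w)
        for _ in range(max_iter):
            s = np.sign(X @ w)
            s[s == 0] = 1.0              # break sign ties away from zero
            w_new = X.T @ s
            w_new /= np.linalg.norm(w_new)
            if np.linalg.norm(w_new - w) < tol:
                w = w_new
                break
            w = w_new
        components.append(w)
        X -= np.outer(X @ w, w)          # greedy deflation step
    return np.array(components)
```

Each fixed-point update never decreases the objective sum_i |x_i . w|, so the inner loop converges to a local maximum, and the deflation step keeps successive components mutually orthogonal; this one-component-at-a-time structure is exactly what makes the procedure "greedy", in contrast to the non-greedy solutions that optimize all components jointly.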



[1]Bishop, C.M., 2006. Pattern Recognition and Machine Learning. Springer.

[2]de la Torre, F., Black, M.J., 2001. Robust Principal Component Analysis for Computer Vision. Proc. 8th IEEE Int. Conf. on Computer Vision, 1:362-369.

[3]Ding, C., Zhou, D., He, X.F., Zha, H.Y., 2006. R1-PCA: Rotational Invariant L1-Norm Principal Component Analysis for Robust Subspace Factorization. Proc. 23rd Int. Conf. on Machine Learning, p.281-288.

[4]Duda, R.O., Hart, P.E., Stork, D.G., 2001. Pattern Classification. Wiley-Interscience.

[5]Hastie, T., Tibshirani, R., Friedman, J., Franklin, J., 2005. The Elements of Statistical Learning: Data Mining, Inference and Prediction. Springer.

[6]Hyvärinen, A., Hurri, J., Hoyer, P.O., 2009. Natural Image Statistics. Springer.

[7]Ke, Q.F., Kanade, T., 2003. Robust Subspace Computation Using L1 Norm. Technical Report, School of Computer Science, Carnegie Mellon University, Pittsburgh, PA.

[8]Ke, Q.F., Kanade, T., 2005. Robust L1 Norm Factorization in the Presence of Outliers and Missing Data by Alternative Convex Programming. IEEE Computer Society Conf. on Computer Vision and Pattern Recognition, p.592-599.

[9]Kwak, N., 2008. Principal component analysis based on L1-norm maximization. IEEE Trans. Pattern Anal. Mach. Intell., 30(9):1672-1680.

[10]Le, Q.V., Karpenko, A., Ngiam, J., Ng, A.Y., 2011. ICA with Reconstruction Cost for Efficient Overcomplete Feature Learning. Advances in Neural Information Processing Systems 24, p.1017-1025.

[11]Liu, J., Ji, S., Ye, J., 2009. Multi-task Feature Learning via Efficient L2,1-Norm Minimization. Proc. 25th Conf. on Uncertainty in Artificial Intelligence, p.339-348.

[12]Nakajima, S., Sugiyama, M., Babacan, D., 2011. On Bayesian PCA: Automatic Dimensionality Selection and Analytic Solution. Proc. 28th Int. Conf. on Machine Learning, p.497-504.

[13]Nie, F., Huang, H., Cai, X., Ding, C., 2010. Efficient and Robust Feature Selection via Joint L2,1-Norms Minimization. Advances in Neural Information Processing Systems 23, p.1813-1821.

[14]Nie, F., Huang, H., Ding, C., Luo, D., Wang, H., 2011. Robust Principal Component Analysis with Non-greedy L1-Norm Maximization. Proc. 22nd Int. Joint Conf. on Artificial Intelligence, p.1433-1438.

[15]Wright, J., Ganesh, A., Rao, S., Peng, Y.G., Ma, Y., 2009. Robust Principal Component Analysis: Exact Recovery of Corrupted Low-Rank Matrices via Convex Optimization. Advances in Neural Information Processing Systems 22, p.2080-2088.

[16]Zass, R., Shashua, A., 2007. Nonnegative Sparse PCA. Advances in Neural Information Processing Systems 19, p.1561-1568.

[17]Zhang, Y., Teng, Y., 2010. Adaptive multiblock kernel principal component analysis for monitoring complex industrial processes. J. Zhejiang Univ.-Sci. C (Comput. & Electron.), 11(12):948-955.


Journal of Zhejiang University-SCIENCE, 38 Zheda Road, Hangzhou 310027, China
Tel: +86-571-87952783; E-mail: cjzhang@zju.edu.cn
Copyright © 2000 - Journal of Zhejiang University-SCIENCE