CLC number: TP391

On-line Access: 2012-12-09

Received: 2012-05-24

Revision Accepted: 2012-09-20

Crosschecked: 2012-11-12

Journal of Zhejiang University SCIENCE C 2012 Vol.13 No.12 P.881-890

http://doi.org/10.1631/jzus.C1200156


Adaptive online prediction method based on LS-SVR and its application in an electronic system


Author(s):  Yang-ming Guo, Cong-bao Ran, Xiao-lei Li, Jie-zhong Ma

Affiliation(s):  School of Computer Science and Technology, Northwestern Polytechnical University, Xi’an 710072, China

Corresponding email(s):   yangming_g@nwpu.edu.cn

Key Words:  Adaptive online prediction, Least squares support vector regression (LS-SVR), Electronic system


Yang-ming Guo, Cong-bao Ran, Xiao-lei Li, Jie-zhong Ma. Adaptive online prediction method based on LS-SVR and its application in an electronic system[J]. Journal of Zhejiang University Science C, 2012, 13(12): 881-890.

Abstract: 
Health trend prediction has become an effective way to ensure the safe operation of highly reliable systems, and online prediction is often required in practical applications. To obtain acceptable online prediction accuracy and a short computing time at the same time, we propose a new adaptive online method based on least squares support vector regression (LS-SVR). The method combines two strategies. First, we delete certain support vectors by testing the linear correlation among the samples, which increases the sparseness of the prediction model; this limits the loss of useful information in the sample data, improves the generalization capability of the model, and reduces the prediction time. Second, we reduce the number of parameters in the traditional LS-SVR formulation and establish a simplified prediction model, which shortens the calculation time of the adaptive online training process. Simulation results and an application to an electronic system provide preliminary evidence that the proposed method is an effective prediction approach, combining good prediction accuracy with low computing time.
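
The sparsification idea described above can be illustrated with a short sketch: a greedy pass that keeps a sample only when its kernel image is numerically linearly independent of the already-kept samples, followed by a standard LS-SVR fit on the reduced support set. This is a minimal sketch under assumed choices, not the authors' exact algorithm, and it does not cover their parameter-reduction step; the RBF kernel, the tolerance tol, and the function names rbf_kernel, select_supports, lssvr_fit, and lssvr_predict are all illustrative assumptions.

import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Gaussian (RBF) kernel matrix between row-sample matrices A (n_a x d) and B (n_b x d).
    d2 = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
    return np.exp(-d2 / (2.0 * sigma**2))

def select_supports(X, sigma=1.0, tol=1e-3):
    # Greedy pass: keep x_t only if its feature-space image is not (approximately)
    # a linear combination of the images of the samples kept so far.
    keep = [0]
    for t in range(1, len(X)):
        S = X[keep]
        K_ss = rbf_kernel(S, S, sigma)
        k_t = rbf_kernel(S, X[t:t + 1], sigma)                      # (|S|, 1)
        proj = np.linalg.solve(K_ss + 1e-10 * np.eye(len(keep)), k_t)
        # Residual of projecting phi(x_t) onto span{phi(s) : s kept}; small residual
        # means x_t is nearly linearly dependent and is skipped.
        delta = rbf_kernel(X[t:t + 1], X[t:t + 1], sigma)[0, 0] - (k_t.T @ proj)[0, 0]
        if delta > tol:
            keep.append(t)
    return keep

def lssvr_fit(X, y, gamma=10.0, sigma=1.0):
    # Standard LS-SVR: solve [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y].
    n = len(X)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]                                          # b, alpha

def lssvr_predict(X_train, b, alpha, X_new, sigma=1.0):
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b

if __name__ == "__main__":
    # Toy 1-D time-series example (noisy sine) to show the sparse fit end to end.
    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 6.0, 200)[:, None]
    y = np.sin(t).ravel() + 0.05 * rng.standard_normal(200)
    idx = select_supports(t, sigma=0.5, tol=1e-4)                   # reduced support set
    b, alpha = lssvr_fit(t[idx], y[idx], gamma=100.0, sigma=0.5)
    y_hat = lssvr_predict(t[idx], b, alpha, t, sigma=0.5)
    print(len(idx), float(np.mean((y_hat - y)**2)))

Here delta is the squared feature-space distance from the candidate sample's image to the span of the kept samples' images; discarding near-dependent samples is what keeps the model sparse and the online update cheap, at a controlled cost in retained information.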
