CLC number: TP387

Received: 2004-03-27

Revision Accepted: 2004-12-01


Journal of Zhejiang University SCIENCE A 2005 Vol.6 No.5 P.387-392


Clustering-based selective neural network ensemble

Author(s):  FU Qiang, HU Shang-xu, ZHAO Sheng-ying

Affiliation(s):  Laboratory of Intelligence Information Engineering, Zhejiang University, Hangzhou 310027, China

Corresponding email(s):   fuqiang@zju.edu.cn

Key Words:  Neural network, Ensemble, Clustering

FU Qiang, HU Shang-xu, ZHAO Sheng-ying. Clustering-based selective neural network ensemble[J]. Journal of Zhejiang University Science A, 2005, 6(5): 387-392.

@article{fu2005clustering,
title="Clustering-based selective neural network ensemble",
author="FU Qiang and HU Shang-xu and ZHAO Sheng-ying",
journal="Journal of Zhejiang University Science A",
volume="6",
number="5",
pages="387-392",
year="2005",
publisher="Zhejiang University Press & Springer",
doi="10.1631/jzus.2005.A0387",
issn="1673-565X"
}


An effective ensemble should consist of a set of networks that are both accurate and diverse. We propose a novel clustering-based selective algorithm for constructing neural network ensembles: clustering is used to group the trained networks by similarity, and the most accurate individual network is then selected from each cluster to make up the ensemble. Empirical studies on regression over four typical datasets showed that this approach yields significantly smaller ensembles while achieving better performance than traditional methods such as Bagging and Boosting. A bias/variance decomposition of the predictive error shows that the success of the proposed approach may lie in properly tuning the bias/variance trade-off to reduce the prediction error (the sum of bias² and variance).
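The selection procedure described above can be sketched as follows. This is a minimal illustration using scikit-learn, not the authors' implementation: the pool size, cluster count, network architecture, and the choice of k-means over prediction vectors on a validation set are all assumptions made here for concreteness; the paper's actual clustering technique and similarity measure may differ.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

def clustering_based_ensemble(X_train, y_train, X_val, y_val,
                              n_networks=20, n_clusters=5, seed=0):
    """Train a pool of networks, cluster them by the similarity of their
    validation-set predictions, and keep only the most accurate network
    from each cluster. All hyperparameters are illustrative."""
    rng = np.random.RandomState(seed)
    pool = []
    for i in range(n_networks):
        # Bootstrap resampling gives each network a different view of the
        # training data, which encourages diversity in the pool.
        idx = rng.randint(0, len(X_train), len(X_train))
        net = MLPRegressor(hidden_layer_sizes=(10,), max_iter=500,
                           random_state=i).fit(X_train[idx], y_train[idx])
        pool.append(net)

    # Represent each network by its prediction vector on the validation
    # set, so that "similar" networks (ones making similar predictions)
    # land in the same cluster.
    preds = np.array([net.predict(X_val) for net in pool])
    labels = KMeans(n_clusters=n_clusters, n_init=10,
                    random_state=seed).fit_predict(preds)

    # From each cluster, keep the single most accurate member; discarding
    # the rest removes redundancy while preserving diversity.
    selected = []
    for c in range(n_clusters):
        members = [i for i in range(n_networks) if labels[i] == c]
        if members:
            best = min(members,
                       key=lambda i: mean_squared_error(y_val, preds[i]))
            selected.append(pool[best])
    return selected

def ensemble_predict(nets, X):
    # Simple averaging combines the selected networks' outputs.
    return np.mean([net.predict(X) for net in nets], axis=0)
```

Because at most one network survives per cluster, the resulting ensemble is much smaller than the trained pool, consistent with the abstract's claim of significantly smaller ensembles.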



[1] Bauer, E., Kohavi, R., 1999. An empirical comparison of voting classification algorithms: Bagging, Boosting, and variants. Machine Learning, 36(1-2):105-139.

[2] Blake, C., Keogh, E., Merz, C.J., 1998. UCI Repository of Machine Learning Databases. http://www.ics.uci.edu/~mlearn/MLRepository.html. Department of Information and Computer Science, University of California, Irvine, CA.

[3] Breiman, L., 1996. Bagging predictors. Machine Learning, 24(2):123-140.

[4] Breiman, L., Friedman, J.H., 1985. Estimating optimal transformations in multiple regression and correlation (with discussion). Journal of the American Statistical Association, 80:580-619.

[5] Drucker, H., 1999. Boosting Using Neural Networks. In: Sharkey, A. (Ed.), Combining Artificial Neural Nets: Ensemble and Module Multi-net Systems. Springer-Verlag, London, p.42-49.

[6] Friedman, J.H., Grosse, E., Stuetzle, W., 1983. Multidimensional additive Spline approximation. SIAM Journal of Scientific and Statistical Computing, 4:292-301.

[7] Fu, Q., Hu, S.X., Zhao, S.Y., 2004. A PSO-based approach for neural network ensemble. Journal of Zhejiang University (Engineering Science), 38(12):1596-1600 (in Chinese).

[8] Geman, S., Bienenstock, E., Doursat, R., 1992. Neural networks and the bias/variance dilemma. Neural Computation, 4(1):1-58.

[9] Hansen, J.V., 2000. Combining Predictors: Meta Machine Learning Methods and Bias/Variance and Ambiguity Decomposition. Ph.D. Dissertation, Department of Computer Science, University of Aarhus, Denmark.

[10] Hansen, L.K., Salamon, P., 1990. Neural network ensembles. IEEE Transactions on Pattern Analysis and Machine Intelligence, 12(10):993-1001.

[11] Krogh, A., Vedelsby, J., 1995. Neural Network Ensembles, Cross Validation, and Active Learning. In: Tesauro, G., Touretzky, D., Leen, T. (Eds.), Advances in Neural Information Processing Systems, Volume 7. MIT Press, Cambridge, MA, p.231-238.

[12] Lazarevic, A., Obradovic, Z., 2001. Effective pruning of neural network classifier ensembles. Proc. International Joint Conference on Neural Networks, 2:796-801.

[13] Liu, Y., Yao, X., 2000. Evolutionary ensembles with negative correlation learning. IEEE Trans. Evolutionary Computation, 4(4):380-387.

[14] Melville, P., Mooney, R., 2003. Constructing Diverse Classifier Ensembles Using Artificial Training Examples. Proc. of the IJCAI-2003, Acapulco, Mexico, p.505-510.

[15] Navone, H.D., Verdes, P.F., Granitto, P.M., Ceccatto, H.A., 2000. Selecting Diverse Members of Neural Network Ensembles. Proc. 16th Brazilian Symposium on Neural Networks, p.255-260.

[16] Opitz, D., Shavlik, J., 1996. Actively searching for an effective neural network ensemble. Connection Science, 8(3-4):337-353.

[17] Ridgeway, G., Madigan, D., Richardson, T., 1999. Boosting Methodology for Regression Problems. Proc. 7th Int. Workshop on Artificial Intelligence and Statistics. Fort Lauderdale, FL, p.152-161.

[18] Rosen, B.E., 1996. Ensemble learning using decorrelated neural networks. Connection Science, 8(3-4):373-384.

[19] Schapire, R.E., 1990. The strength of weak learnability. Machine Learning, 5(2):197-227.

[20] Zhou, Z.H., Wu, J.X., Jiang, Y., Chen, S.F., 2001. Genetic algorithm based selective neural network ensemble. Proc. 17th International Joint Conference on Artificial Intelligence, 2:797-802.
