
CLC number: TP391

On-line Access: 2011-02-08

Received: 2010-02-01

Revision Accepted: 2010-09-01

Crosschecked: 2010-12-30



Journal of Zhejiang University SCIENCE C 2011 Vol.12 No.2 P.83-87


Binary tree of posterior probability support vector machines

Author(s):  Dong-li Wang, Jian-guo Zheng, Yan Zhou

Affiliation(s):  Glorious Sun School of Business and Management, Donghua University, Shanghai 200051, China; College of Information Engineering, Xiangtan University, Xiangtan 411105, China

Corresponding email(s):   sgirld@163.com

Key Words:  Binary tree, Support vector machine, Handwritten recognition, Classification


Dong-li Wang, Jian-guo Zheng, Yan Zhou. Binary tree of posterior probability support vector machines[J]. Journal of Zhejiang University Science C, 2011, 12(2): 83-87.

DOI:  10.1631/jzus.C1000022

ISSN:  1869-1951

Posterior probability support vector machines (PPSVMs) have proven robust against noise and outliers and require fewer support vectors (SVs) for storage. Gonen et al. (2008) extended PPSVMs to the multiclass case with both single-machine and multimachine approaches. However, these extensions suffer from low classification efficiency, a high computational burden, and, more importantly, unclassifiable regions. To achieve higher classification efficiency and accuracy with fewer SVs, a binary tree of PPSVMs for the multiclass classification problem is proposed in this letter. Moreover, a Fisher ratio separability measure is adopted to determine the tree structure. Several experiments on handwritten recognition datasets are included to illustrate the proposed approach. Specifically, the binary tree of PPSVMs guided by the Fisher ratio separability measure attains overall test accuracy comparable to, if not higher than, that of other multiclass algorithms, while using significantly fewer SVs and much less test time.
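The tree-of-classifiers idea in the abstract can be sketched as follows. This is a minimal illustration, not the authors' exact algorithm: it uses plain binary SVMs (via scikit-learn's `SVC`) in place of PPSVMs, and a simple greedy Fisher-ratio heuristic (peel off the most separable class at each node) in place of the paper's grouping scheme. All function and class names here (`fisher_ratio`, `SVMTreeNode`, `build_tree`, `predict_one`) are invented for the sketch.

```python
# Sketch of a binary tree of binary SVM classifiers for multiclass problems,
# with a Fisher-ratio-style separability measure deciding the split at each
# internal node. Uses SVC as a stand-in for a PPSVM.
import numpy as np
from sklearn.svm import SVC

def fisher_ratio(Xa, Xb):
    """Fisher-style separability of two sample sets: squared distance
    between class means over the sum of within-class variances."""
    ma, mb = Xa.mean(axis=0), Xb.mean(axis=0)
    sa, sb = Xa.var(axis=0).sum(), Xb.var(axis=0).sum()
    return np.sum((ma - mb) ** 2) / (sa + sb + 1e-12)

class SVMTreeNode:
    def __init__(self):
        self.svm = None    # binary classifier at an internal node
        self.left = None   # subtree for the "+1" class group
        self.right = None  # subtree for the "-1" class group
        self.label = None  # class label at a leaf

def build_tree(X, y, classes=None):
    """Recursively split the class set; train one binary SVM per node."""
    if classes is None:
        classes = sorted(set(y.tolist()))
    node = SVMTreeNode()
    if len(classes) == 1:
        node.label = classes[0]
        return node
    # Greedy split: peel off the class most separable from the rest,
    # as scored by the Fisher ratio.
    rest = lambda c: [k for k in classes if k != c]
    scores = [fisher_ratio(X[y == c], X[np.isin(y, rest(c))]) for c in classes]
    best = classes[int(np.argmax(scores))]
    mask = np.isin(y, classes)
    Xn, yn = X[mask], y[mask]
    target = np.where(yn == best, 1, -1)
    node.svm = SVC(kernel="rbf", gamma="scale").fit(Xn, target)
    node.left = build_tree(X, y, [best])
    node.right = build_tree(X, y, rest(best))
    return node

def predict_one(node, x):
    """Walk from the root to a leaf; each node needs one SVM evaluation."""
    while node.label is None:
        side = node.svm.predict(x.reshape(1, -1))[0]
        node = node.left if side == 1 else node.right
    return node.label
```

The efficiency argument of the letter is visible in `predict_one`: a balanced tree over N classes evaluates only about log2(N) classifiers per test sample, versus N for one-versus-rest or N(N-1)/2 for pairwise schemes.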



[1]Cortes, C., Vapnik, V., 1995. Support-vector networks. Mach. Learn., 20(3):273-297.

[2]Dietterich, T.G., Bakiri, G., 1995. Solving multiclass learning problems via error-correcting output codes. J. Artif. Intell. Res., 2(1):263-286.

[3]Duda, R.O., Hart, P.E., 1973. Pattern Classification and Scene Analysis. Wiley, New York.

[4]Fei, B., Liu, J., 2006. Binary tree of SVM: a new fast multiclass training and classification algorithm. IEEE Trans. Neur. Netw., 17(3):696-704.

[5]Gönen, M., Tanuğur, A.G., Alpaydın, E., 2008. Multiclass posterior probability support vector machines. IEEE Trans. Neur. Netw., 19(1):130-139.

[6]Guo, G., Li, S.Z., Chan, K.L., 2001. Support vector machines for face recognition. Image Vis. Comput., 19(9-10):631-638.

[7]Hsu, C.W., Lin, C.J., 2002. A comparison of methods for multi-class support vector machines. IEEE Trans. Neur. Netw., 13(2):415-425.

[8]Hu, Z.H., Cai, Y.Z., Li, Y.G., Xu, X.M., 2005. Data fusion for fault diagnosis using multi-class support vector machines. J. Zhejiang Univ.-Sci., 6A(10):1030-1039.

[9]Huang, P., Zhu, J., 2010. Multi-instance learning for software quality estimation in object-oriented systems: a case study. J. Zhejiang Univ.-Sci. C (Comput. & Electron.), 11(2):130-138.

[10]Kreßel, U.H.G., 1999. Pairwise classification and support vector machines. In: Schölkopf, B., Burges, C.J.C., Smola, A.J. (Eds.), Advances in Kernel Methods: Support Vector Learning. MIT Press, Cambridge, MA.

[11]Leng, B., Qin, Z., Li, L.Q., 2007. Support vector machines active learning for 3D model retrieval. J. Zhejiang Univ.-Sci. A, 8(12):1953-1961.

[12]Müller, K.R., Mika, S., Rätsch, G., Tsuda, K., Schölkopf, B., 2001. An introduction to kernel-based learning algorithms. IEEE Trans. Neur. Netw., 12(2):181-201.

[13]Platt, J., Cristianini, N., Shawe-Taylor, J., 2000. Large margin DAGs for multiclass classification. Adv. Neur. Inform. Process. Syst., 12:547-553.

[14]Takahashi, F., Abe, S., 2002. Decision-Tree-Based Multiclass Support Vector Machines. Proc. 9th Int. Conf. on Neural Information Processing, p.1418-1422.

[15]Tao, Q., Wu, G.W., Wang, F.Y., Wang, J., 2005. Posterior probability support vector machines for unbalanced data. IEEE Trans. Neur. Netw., 16(6):1561-1573.

[16]Vapnik, V., 1995. The Nature of Statistical Learning Theory. Springer-Verlag, New York.

[17]Vapnik, V., 1998. Statistical Learning Theory. John Wiley & Sons, New York.

[18]Xin, D., Wu, Z.H., Pan, Y.H., 2002. Probability output of multi-class support vector machines. J. Zhejiang Univ.-Sci., 3(2):131-134.

[19]Zhang, L., Zhou, W.D., Su, T.T., Jiao, L.C., 2007. Decision tree support vector machine. Int. J. Artif. Intell. Tools, 16(1):1-15.
