CLC number: TP3

On-line Access: 2014-01-29

Received: 2013-06-20

Revision Accepted: 2013-11-09

Crosschecked: 2014-01-15

Citations:  Bibtex RefMan EndNote GB/T7714

Journal of Zhejiang University SCIENCE C 2014 Vol.15 No.2 P.107-118

http://doi.org/10.1631/jzus.C1300167


Transfer active learning by querying committee


Author(s):  Hao Shao, Feng Tao, Rui Xu

Affiliation(s):  School of WTO Research & Education, Shanghai University of International Business and Economics, Shanghai 200336, China

Corresponding email(s):   shaohao@suibe.edu.cn, ftao@ecust.edu.cn, rxu@ustc.edu.cn

Key Words:  Active learning, Transfer learning, Classification


Hao Shao, Feng Tao, Rui Xu. Transfer active learning by querying committee[J]. Journal of Zhejiang University Science C, 2014, 15(2): 107-118.

@article{title="Transfer active learning by querying committee",
author="Hao Shao, Feng Tao, Rui Xu",
journal="Journal of Zhejiang University Science C",
volume="15",
number="2",
pages="107-118",
year="2014",
publisher="Zhejiang University Press & Springer",
doi="10.1631/jzus.C1300167"
}

%0 Journal Article
%T Transfer active learning by querying committee
%A Hao Shao
%A Feng Tao
%A Rui Xu
%J Journal of Zhejiang University SCIENCE C
%V 15
%N 2
%P 107-118
%@ 1869-1951
%D 2014
%I Zhejiang University Press & Springer
%R 10.1631/jzus.C1300167

TY - JOUR
T1 - Transfer active learning by querying committee
A1 - Hao Shao
A1 - Feng Tao
A1 - Rui Xu
JO - Journal of Zhejiang University Science C
VL - 15
IS - 2
SP - 107
EP - 118
SN - 1869-1951
Y1 - 2014
PB - Zhejiang University Press & Springer
DO - 10.1631/jzus.C1300167
ER -


Abstract: 
In real applications of inductive learning for classification, labeled instances are often deficient, and labeling them by an oracle is expensive and time-consuming. Active learning on a single task aims to select only informative unlabeled instances for querying, improving classification accuracy while reducing the querying cost. An inevitable problem in active learning, however, is that the informative measures used to select queries are commonly based on initial hypotheses sampled from only a few labeled instances. In such a circumstance, the initial hypotheses are not reliable and may deviate from the true distribution underlying the target task; consequently, the informative measures may select irrelevant instances. A promising way to compensate for this problem is to borrow useful knowledge from other sources with abundant labeled information, which is called transfer learning. A significant challenge in transfer learning, however, is how to measure the similarity between the source and the target tasks. One needs to be aware of differing distributions or label assignments in unrelated source tasks; otherwise, they will degrade performance during transfer. How to design an effective strategy that avoids querying irrelevant instances also remains an open question. To tackle these issues, we propose a hybrid algorithm for active learning aided by transfer learning, adopting a divergence measure to alleviate the negative transfer caused by distribution differences. To avoid querying irrelevant instances, we also present an adaptive strategy that eliminates unnecessary instances from the input space and unnecessary models from the model space. Extensive experiments on both synthetic and real data sets show that the proposed algorithm queries fewer instances with higher accuracy and converges faster than state-of-the-art methods.
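As a rough illustration of the committee-based querying idea described in the abstract (not the authors' exact algorithm), the sketch below trains one classifier per source task plus one on the few target labels, scores each unlabeled target instance by the Jensen-Shannon divergence among the committee members' predicted class distributions, and queries the oracle for the most controversial instance. All function names, parameters, and the synthetic data are hypothetical.

# Illustrative sketch only: committee-based query selection scored by the
# Jensen-Shannon divergence among the members' predicted class distributions.
import numpy as np
from sklearn.linear_model import LogisticRegression

def js_divergence(dists):
    """Generalized Jensen-Shannon divergence of several probability vectors."""
    dists = np.asarray(dists)
    mean = dists.mean(axis=0)
    def kl(p, q):
        p, q = np.clip(p, 1e-12, 1.0), np.clip(q, 1e-12, 1.0)
        return float(np.sum(p * np.log(p / q)))
    return float(np.mean([kl(p, mean) for p in dists]))

def select_query(committee, X_pool):
    """Index of the pool instance the committee disagrees on most."""
    probas = [m.predict_proba(X_pool) for m in committee]  # one (n, n_classes) array per model
    scores = [js_divergence([p[i] for p in probas]) for i in range(len(X_pool))]
    return int(np.argmax(scores))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    committee = []
    # Two "source" models plus one "target" model, each fit on its own labeled data.
    for shift in (0.0, 0.5, 1.0):
        X = rng.normal(shift, 1.0, size=(100, 2))
        y = (X[:, 0] + X[:, 1] > 2 * shift).astype(int)
        committee.append(LogisticRegression().fit(X, y))
    X_pool = rng.normal(0.5, 1.0, size=(50, 2))
    print("query this instance:", select_query(committee, X_pool))

The queried instance would then be labeled by the oracle and added to the target training set before the committee is refit, which is the usual active learning loop this sketch assumes.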

A study of transfer active learning algorithms based on expert (committee) querying

Research objective: In classification learning, labeled data are often scarce, labeling unlabeled data consumes considerable human and material resources, and large volumes of legacy data go unused. A typical example is the diagnosis of a newly emerging disease such as the H7N9 avian influenza virus. A long time passes between the appearance of symptoms and a confirmed diagnosis, largely because only a handful of confirmed cases exist when a new disease emerges and little is known about it. With no historical data, confirming the new disease is extremely difficult, so many patients are treated for common influenza and the critical treatment window is missed. Faced with many suspected cases, correct diagnoses must be made as quickly as possible to save lives, yet having physicians analyze every suspected case in detail would waste scarce medical resources and time and delay treatment for patients awaiting confirmation. Hospitals, however, keep large databases on other diseases. It is therefore of practical importance to study how existing data (e.g., databases of common influenza or pneumonia) can assist physicians in diagnosing unknown but similar conditions. This study applies transfer learning theory to extract information from old data and, with the aid of expert querying, further improves accuracy, yielding accurate results quickly while saving scarce resources.
Key innovations: An expert committee and a hybrid model are adopted to further improve transfer learning. During expert-guided querying, active learning theory identifies the most valuable instances. This study therefore designs an expert-assisted scheme for the transfer algorithm and uses active learning to select which unknown instances are labeled manually, compensating for the weak performance of transfer learning when the initial labeled data are scarce.
Methodology: A large amount of redundant (source) data is used to form the committee of experts; thresholds applied during the iterations eliminate unqualified experts and data sets, which greatly improves performance. A toy sketch of this pruning step is given after this summary.
Conclusions: Combining active learning with transfer learning compensates for the heavy dependence of transfer learning on the quality of the initial data set, avoids negative transfer, and greatly improves performance.

Key words: Transfer learning, Active learning, Classification, Data mining
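The adaptive elimination mentioned under "Methodology" could, for example, be realized by weighting each source-task expert and dropping experts whose weight falls below a threshold after they disagree with the oracle's answers. The following sketch uses an invented multiplicative penalty and threshold purely for illustration; it is not the paper's exact update rule.

# Illustrative pruning of the committee of source-task "experts".
from typing import List, Tuple

def prune_committee(committee: List, weights: List[float], x_query, y_oracle,
                    threshold: float = 0.2, penalty: float = 0.5) -> Tuple[list, list]:
    """Down-weight experts that mislabel the newly queried instance and drop
    those whose weight falls below the threshold."""
    kept_models, kept_weights = [], []
    for model, w in zip(committee, weights):
        if model.predict([x_query])[0] != y_oracle:
            w *= penalty              # penalize a wrong vote on the oracle-labeled instance
        if w >= threshold:            # keep only experts that are still trustworthy
            kept_models.append(model)
            kept_weights.append(w)
    return kept_models, kept_weights

The same thresholding idea could be applied to source instances as well, mirroring the elimination of unnecessary instances in the input space and models in the model space described in the abstract.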

