
Student: Carlos Ernesto Zavala Lopez
Thesis title: A Generalization of Multi-source Transfer Learning with Multi-view Adaboost
Advisor: Soo, Von-Wun
Committee members: Chien, Been-Chian; Lai, Shang-Hong
Degree: Master
Department: College of Electrical Engineering and Computer Science, Institute of Information Systems and Applications
Year of publication: 2013
Academic year of graduation: 101 (2012-2013)
Language: English
Number of pages: 46
Chinese keywords: none
Keywords: Multi-view Transfer Learning, Multi-Source Transfer Learning
  • There are several current trends in machine learning; one that researchers in the field have studied and improved is transfer learning. Transfer learning is the ability to build and train a classifier on source data so that it can describe the data of a target task. This setting carries several difficulties, such as the lack of labeled data or of large data sets. Several approaches have therefore been developed, such as multi-source and multi-view transfer learning, which, together with Adaboost, help to mitigate the scarcity of data. Nevertheless, these approaches did not generalize into a true multi-view model, and further research in this area was suggested. This thesis aims to generalize the latest such approach, redefine some concepts regarding the error rates of the samples, and observe the effect of using J views on the results. As a consequence, a novel model, G-MsTr-MvAdaboost, was developed. Results show that the new algorithm improves on the performance of Adaboost, works well with fewer attributes, and demonstrates that the error rate decreases as the number of views increases. Comparisons of results per model and per view are presented for observation.
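The two ingredients the abstract combines, Adaboost's weighted weak learners and a partition of the feature set into J views, can be illustrated with a toy, single-source sketch. The thesis's actual G-MsTr-MvAdaboost model is not reproduced here; the data set, the round-robin view split, and all function names below are invented for illustration only.

```python
import math

# Toy dataset: 8 samples, 4 features, labels in {-1, +1}.
X = [
    [1.0, 0.2, 3.1, 0.5],
    [0.9, 0.1, 2.8, 0.4],
    [0.2, 1.5, 0.3, 2.0],
    [0.1, 1.4, 0.2, 2.2],
    [1.1, 0.3, 3.0, 0.6],
    [0.3, 1.6, 0.4, 2.1],
    [1.2, 0.2, 2.9, 0.5],
    [0.2, 1.3, 0.1, 2.3],
]
y = [+1, +1, -1, -1, +1, -1, +1, -1]

# Partition the feature indices into J views (here J = 2, round-robin).
J = 2
views = [list(range(v, len(X[0]), J)) for v in range(J)]

def train_stump(X, y, w, feats):
    """Pick the (feature, threshold, sign) restricted to `feats`
    that minimises the weighted classification error."""
    best = None
    for f in feats:
        for thr in sorted({row[f] for row in X}):
            for sign in (+1, -1):
                pred = [sign if row[f] >= thr else -sign for row in X]
                err = sum(wi for wi, p, yi in zip(w, pred, y) if p != yi)
                if best is None or err < best[0]:
                    best = (err, f, thr, sign)
    return best

def adaboost(X, y, feats, rounds=5):
    """Standard Adaboost over decision stumps, features limited to one view."""
    n = len(X)
    w = [1.0 / n] * n
    ensemble = []
    for _ in range(rounds):
        err, f, thr, sign = train_stump(X, y, w, feats)
        err = max(err, 1e-10)          # avoid log(0) on a perfect stump
        if err >= 0.5:
            break
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, f, thr, sign))
        # Re-weight: increase the weight of misclassified samples.
        for i, row in enumerate(X):
            p = sign if row[f] >= thr else -sign
            w[i] *= math.exp(-alpha * y[i] * p)
        total = sum(w)
        w = [wi / total for wi in w]
    return ensemble

def predict(ensemble, row):
    score = sum(a * (s if row[f] >= t else -s) for a, f, t, s in ensemble)
    return +1 if score >= 0 else -1

# One Adaboost ensemble per view; the final label sums the per-view scores,
# a crude stand-in for the multi-view combination studied in the thesis.
ensembles = [adaboost(X, y, feats) for feats in views]
preds = [predict([h for e in ensembles for h in e], row) for row in X]
print(preds)  # → [1, 1, -1, -1, 1, -1, 1, -1]
```

On this separable toy data each view alone suffices, so the combined ensemble recovers the labels exactly; the thesis's experiments concern the more interesting case where individual views are weak and the error rate falls as J grows.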



    Chinese Abstract I
    Abstract II
    Acknowledgments III
    Table of Content IV
    Chapter 1 Introduction 7
    1.1 Background 7
    1.2 Transfer Learning 8
    1.3 Multi-source Transfer Learning (MsTL) 10
    1.4 Multi-view Transfer Learning (MvTL) 11
    1.5 Adaboost 12
    1.6 Motivation 13
    1.7 Contribution 14
    1.8 Problem Overview 15
    1.9 Thesis Organization 16
    Chapter 2 Related Work 17
    2.1 Adaboost 17
    2.1.1 Adaboost Timeline 18
    2.1.2 Linear Combination 18
    2.1.3 Terminology 18
    2.2 Transfer Learning 19
    2.3 Multi Source Transfer Learning 20
    2.4 Multi View Transfer Learning 21
    Chapter 3 Problem Formulation 22
    3.1 Constraints 22
    Chapter 4 Methodology 25
    4.1 Overview 25
    4.2 Symbols and notations 26
    4.3 G-MsTL-MvAdaboost Pseudo Code 27
    4.4 Construction of the Model G-MsTL-MvAdaboost 28
    4.4.1 Initial settings of data and variables 28
    4.4.2 Steps of the model G-MsTL-MvAdaboost 30
    Chapter 5 Experiments and Discussion 33
    5.1 Overview 33
    5.2 Materials employed 33
    5.3 Data sets and sampling technique 34
    5.4 Division of all features into J views 38
    Chapter 6 Results 39
    Chapter 7 Conclusions and Future Work 43
    References 44


    Full text release date: not authorized for public release (campus network)
    Full text release date: not authorized for public release (off-campus network)
    Full text release date: not authorized for public release (National Central Library: Taiwan NDLTD system)