Graduate Student: 李登豐
Thesis Title: 於類神經網路中以基因演算法篩選屬性並嵌入蜜蜂演算法優化權重之方式處理分類問題 (Algorithm for Classification Tasks: ABC-based Weights Optimization with GA-based Feature Selection in ANN)
Advisor: 葉維彰
Oral Defense Committee:
Degree: Master
Department: College of Engineering - Department of Industrial Engineering and Engineering Management
Year of Publication: 2010
Academic Year of Graduation: 99
Language: English
Pages: 45
Keywords (Chinese): 蜜蜂演算法 (Artificial Bee Colony algorithm)
Artificial neural networks (ANNs) have been under development for nearly seventy years and have been applied widely, with notable contributions to both classification and prediction. Starting from the well-known perceptron architecture, whose weight-training scheme handles simple linearly separable tasks, ANNs evolved into multilayer feedforward (MLFF) networks capable of effective nonlinear classification, with the back-propagation (BP) algorithm as the dominant weight-training method. BP, however, frequently becomes trapped in local minima, which weakens its performance. In addition, when training the weights of such models, the input attributes are rarely screened beforehand to filter out unimportant information.
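The MLFF architecture described above can be sketched in a few lines. The following minimal one-hidden-layer forward pass (NumPy; the layer sizes and function names are illustrative choices, not taken from the thesis) shows how an input flows through sigmoid-activated hidden and output layers:

```python
import numpy as np

def sigmoid(z):
    # Logistic activation, the classic choice in MLFF networks.
    return 1.0 / (1.0 + np.exp(-z))

def mlff_forward(x, W1, b1, W2, b2):
    # One-hidden-layer multilayer feedforward (MLFF) pass:
    # input -> hidden (sigmoid) -> output (sigmoid).
    h = sigmoid(x @ W1 + b1)
    return sigmoid(h @ W2 + b2)

# Toy example: 2 inputs, 3 hidden units, 1 output.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 3)); b1 = np.zeros(3)
W2 = rng.normal(size=(3, 1)); b2 = np.zeros(1)
y = mlff_forward(np.array([0.5, -1.0]), W1, b1, W2, b2)
```

Training then means searching for the weight matrices `W1`, `W2` (and biases) that minimize classification error; BP does this by gradient descent, which is where the local-minimum problem arises.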
In view of this, this thesis proposes a new integrated algorithm, the GAABC algorithm, which combines the Genetic Algorithm (GA) and the Artificial Bee Colony (ABC) algorithm to train an MLFF network augmented with a filtering layer. Because GA chromosomes are mostly binary-encoded and therefore well suited to serve as filters, a GA chromosome first screens the input factors; the ABC algorithm then trains the weights of the MLFF network. Through repeated GA generations, the best filtering chromosome is identified and the optimal weights are trained.
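The two-layer search just described can be sketched roughly as follows. This is a loose illustration, not the thesis's actual implementation: it assumes a one-hidden-layer MLFF, a heavily simplified ABC with only an employed-bee step, toy GA operators (tournament selection plus bit-flip mutation), and accuracy as the fitness; all names and parameter values are invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict(X, w, n_in, n_hid):
    # Unpack a flat weight vector into a one-hidden-layer MLFF network.
    k = n_in * n_hid
    W1 = w[:k].reshape(n_in, n_hid)
    W2 = w[k:k + n_hid]
    return sigmoid(sigmoid(X @ W1) @ W2) > 0.5

def accuracy(X, y, w, n_hid):
    return np.mean(predict(X, w, X.shape[1], n_hid) == y)

def abc_train(X, y, n_hid=3, n_food=10, iters=30):
    # Simplified ABC: each food source is a flat weight vector; a bee
    # perturbs one dimension toward a random neighbor, keeping improvements.
    dim = X.shape[1] * n_hid + n_hid
    food = rng.uniform(-1, 1, (n_food, dim))
    fit = np.array([accuracy(X, y, f, n_hid) for f in food])
    for _ in range(iters):
        for i in range(n_food):
            j, d = rng.integers(n_food), rng.integers(dim)
            cand = food[i].copy()
            cand[d] += rng.uniform(-1, 1) * (food[i][d] - food[j][d])
            c_fit = accuracy(X, y, cand, n_hid)
            if c_fit >= fit[i]:
                food[i], fit[i] = cand, c_fit
    best = np.argmax(fit)
    return food[best], fit[best]

def gaabc(X, y, pop=6, gens=5):
    # GA layer: binary chromosomes act as feature filters; ABC trains the
    # MLFF weights on each filtered feature subset.
    n_feat = X.shape[1]
    chroms = rng.integers(0, 2, (pop, n_feat))
    chroms[chroms.sum(axis=1) == 0, 0] = 1   # keep at least one feature
    best_mask, best_fit = None, -1.0
    for _ in range(gens):
        fits = []
        for c in chroms:
            _, f = abc_train(X[:, c.astype(bool)], y)
            fits.append(f)
            if f > best_fit:
                best_fit, best_mask = f, c.copy()
        fits = np.array(fits)
        # Tournament selection plus bit-flip mutation for the next generation.
        idx = [max(rng.integers(pop, size=2), key=lambda i: fits[i])
               for _ in range(pop)]
        chroms = chroms[idx]
        flip = rng.random(chroms.shape) < 0.1
        chroms = np.where(flip, 1 - chroms, chroms)
        chroms[chroms.sum(axis=1) == 0, 0] = 1
    return best_mask, best_fit

# Toy demo: the label depends only on the first two of four features,
# so a good chromosome should learn to keep them and may drop the rest.
X = rng.random((40, 4))
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)
mask, fit = gaabc(X, y)
```

The design point the sketch tries to convey is the division of labor: the outer GA searches the discrete space of feature subsets, while the inner ABC searches the continuous weight space for each subset, so each chromosome's fitness is the accuracy of the best network ABC can train on its selected features.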
Finally, seven data sets from the UCI Machine Learning Repository are used as benchmarks to compare the newly proposed GAABC algorithm against three other algorithms: the ABC algorithm, the Particle Swarm Optimization (PSO) algorithm, and the BP algorithm. The experimental results show that GAABC outperforms the other three.