Student: 翁瑞霞 Weng, Rey-Xia
Title: Automatic Least Square Support Vector Regression Combined with Chaotic Particle Swarm Optimization for Prediction (自動式最小平方支持向量機結合混沌粒子群演算法應用於預測問題)
Advisor: 葉維彰 Yeh, Wei-Chang
Committee: 唐麗英, 陳茂生
Degree: Master
Department: Department of Industrial Engineering and Engineering Management, College of Engineering
Year of Publication: 2011
Graduation Academic Year: 99 (ROC calendar)
Language: English
Pages: 48
Keywords (Chinese): 支持向量迴歸機 (support vector regression), 最小平方支持向量迴歸機 (least squares support vector regression), 核函數 (kernel function), 粒子群演算法 (particle swarm optimization)
Keywords (English): Support Vector Regression, Least Squares Support Vector Machine Regression, Mixed Kernel, Particle Swarm Optimization (PSO)
Abstract (translated from the Chinese):
In machine learning, the support vector machine is an effective tool for learning from limited samples. Least squares support vector regression (LSSVR) solves prediction problems by adopting a least-squares linear system as the loss function in place of the quadratic programming used by the traditional support vector machine, which simplifies the computation and makes training markedly faster than a standard SVM. This thesis proposes an automatically tuned least squares support vector regression whose architecture has three parts. The first part initializes the parameters, comprising the mixed-kernel parameters and the model parameters: exploiting the chaotic motion underlying chaotic sequences, such a sequence is used to initialize the particle positions in the swarm, strengthening the search diversity of the algorithm and laying a solid foundation for finding better solutions and converging faster. The second part is feature selection, in which particle swarm optimization is used to optimize the feature subset. The third part is parameter optimization: the solutions initialized in the first part are refined by particle swarm optimization while the feature variables trained in the second part are incorporated, and the whole is trained jointly to reach the best solution. This study calls the resulting algorithm for solving least squares support vector regression CP-LSSVR.
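The claim that LSSVR replaces quadratic programming with a linear solve can be made concrete. Following the standard formulation of Suykens et al. (the notation is ours, not reproduced from the thesis), training reduces to a single symmetric linear system:

\[
\begin{bmatrix} 0 & \mathbf{1}^{\top} \\ \mathbf{1} & \Omega + \gamma^{-1} I \end{bmatrix}
\begin{bmatrix} b \\ \boldsymbol{\alpha} \end{bmatrix}
=
\begin{bmatrix} 0 \\ \mathbf{y} \end{bmatrix},
\qquad
\Omega_{ij} = K(x_i, x_j),
\]

after which the predictor is \( f(x) = \sum_{i=1}^{N} \alpha_i K(x, x_i) + b \). The regularization parameter \(\gamma\) and the kernel parameters are exactly the quantities CP-LSSVR tunes.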
For validation, this thesis tests CP-LSSVR on well-known datasets of different types from the UCI repository and compares it against several other algorithms. The experimental results confirm that the proposed CP-LSSVR has better predictive ability.
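As a minimal sketch of the chaotic initialization step described above, the Python snippet below seeds PSO particle positions from a logistic-map sequence. The thesis does not state which chaotic map it uses, so the logistic map with r = 4 (a common choice in chaotic PSO) and all names here are illustrative assumptions.

import numpy as np

def chaotic_init(n_particles, n_dims, lo, hi, seed=0.7):
    # Logistic map x_{k+1} = 4 x_k (1 - x_k): fully chaotic at r = 4,
    # so successive iterates cover (0, 1) ergodically rather than randomly.
    x = seed
    pos = np.empty((n_particles, n_dims))
    for i in range(n_particles):
        for j in range(n_dims):
            x = 4.0 * x * (1.0 - x)
            pos[i, j] = lo + (hi - lo) * x  # rescale the chaotic value into the search range
    return pos

# Example: 30 particles over 4 kernel/model parameters searched in [0.01, 100].
swarm = chaotic_init(30, 4, 0.01, 100.0)

The same ergodicity argument given in the abstract applies here: the chaotic iterates spread the initial swarm over the search space, which is intended to improve diversity relative to uniform random seeding.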
Abstract (English):
In this study, we propose an automatically optimized least squares support vector regression (LSSVR) that uses chaotic particle swarm optimization (CPSO) with a mixed kernel to solve regression problems. The LSSVR model has three parts. The first part initializes the particle positions (candidate solutions) with a chaotic sequence, exploiting its good randomness and ergodicity. The second part employs binary particle swarm optimization (PSO) to select promising combinations of input features. The third part optimizes the kernel and model parameters with PSO, starting from the chaotically initialized solutions and incorporating the selected features; we call the resulting method CP-LSSVR. For illustration and evaluation, CP-LSSVR is used to predict targets on well-known benchmark datasets taken from the UCI repository. The results indicate that the proposed CP-LSSVR yields a predictive model that uses a small number of features and shows higher predictive capability than the other methods compared in this thesis.
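To make the overall pipeline concrete, the sketch below implements the two pieces a CPSO loop would wrap: a mixed kernel and the closed-form LSSVR solve. The convex RBF-plus-polynomial mixture, the parameter names, and the default values are our assumptions for illustration; the thesis's exact mixed kernel is not reproduced here.

import numpy as np

def mixed_kernel(A, B, gamma_rbf=1.0, degree=2, coef0=1.0, rho=0.5):
    # Convex combination of an RBF and a polynomial kernel; rho in [0, 1]
    # trades locality (RBF) against global shape (polynomial). Illustrative only.
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return rho * np.exp(-gamma_rbf * sq) + (1.0 - rho) * (A @ B.T + coef0) ** degree

def lssvr_fit(X, y, gam=10.0, **kp):
    # Solve [[0, 1^T], [1, K + I/gam]] [b; alpha] = [0; y]: one linear solve, no QP.
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = A[1:, 0] = 1.0
    A[1:, 1:] = mixed_kernel(X, X, **kp) + np.eye(n) / gam
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]  # bias b, dual weights alpha

def lssvr_predict(X_train, b, alpha, X_new, **kp):
    return mixed_kernel(X_new, X_train, **kp) @ alpha + b

A CPSO driver in the spirit of the abstract would then encode (gam, gamma_rbf, coef0, rho) plus a binary feature mask in each particle, initialize the swarm chaotically as in the earlier sketch, and score each candidate by validation RMSE of this fit.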