
Author: Hsieh, Tsung-Jung (謝宗融)
Title: Construction of Integrated Systems for Forecasting and Classification Based on Soft Computing (應用柔性演算建構整合型之預測與分類系統)
Advisor: Yeh, Wei-Chang (葉維彰)
Oral defense committee: 林妙聰, 陳茂生, 陳嘉文, 溫于平
Degree: Doctor
Department: College of Engineering, Department of Industrial Engineering and Engineering Management
Year of publication: 2011
Graduation academic year: 100 (ROC calendar)
Language: English
Number of pages: 78
Keywords: Wavelet transform, Stepwise regression-correlation selection, Simple recurrent neural network, Artificial bee colony algorithm, Forecasting, Classification, Feature selection, Grid scheme, Least squares support vector machines, Machine learning, Orthogonal design
Abstract:
    This study addresses data-mining problems by proposing two integrated systems, one for forecasting and one for classification. The first system forecasts international stock prices. It combines a wavelet transform with a simple recurrent neural network (SRNN), whose parameters are optimized by the artificial bee colony (ABC) algorithm. The system operates in three stages. First, the Haar wavelet transform removes noise from the stock-price time series. Second, the SRNN inputs are variables drawn from fundamental and technical analysis in finance, screened by stepwise regression-correlation selection (SRCS) to retain those most relevant to the closing price. Finally, the ABC algorithm optimizes the unknown parameters of the SRNN (the weights and biases). We refer to this system as SRCS-WT-SRNN.
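
    As a rough illustration of the wavelet preprocessing stage described above, the sketch below applies a single-level Haar transform with soft thresholding of the detail coefficients to a synthetic closing-price series. It is not the dissertation's code: the function name, the universal-threshold choice, and the synthetic data are illustrative assumptions, and the SRCS screening and ABC-trained SRNN stages are omitted.

    # Minimal sketch (assumptions noted above): one-level Haar wavelet denoising.
    import numpy as np

    def haar_denoise(prices, threshold=None):
        """Remove high-frequency noise from a 1-D series via a single-level
        Haar transform with soft thresholding of the detail coefficients."""
        x = np.asarray(prices, dtype=float)
        n = len(x) - (len(x) % 2)          # truncate to an even length
        x = x[:n]
        pairs = x.reshape(-1, 2)
        approx = (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2.0)   # low-frequency part
        detail = (pairs[:, 0] - pairs[:, 1]) / np.sqrt(2.0)   # high-frequency part
        if threshold is None:              # universal threshold, a common default
            sigma = np.median(np.abs(detail)) / 0.6745
            threshold = sigma * np.sqrt(2.0 * np.log(n))
        detail = np.sign(detail) * np.maximum(np.abs(detail) - threshold, 0.0)
        # inverse Haar transform reconstructs the denoised series
        rec = np.empty(n)
        rec[0::2] = (approx + detail) / np.sqrt(2.0)
        rec[1::2] = (approx - detail) / np.sqrt(2.0)
        return rec

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        closing = np.cumsum(rng.normal(0, 1, 256)) + 100   # synthetic closing prices
        print(haar_denoise(closing)[:5])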
    The second system embodies a new machine-learning concept for data classification. It introduces a grid scheme (GS) into the least squares support vector machine (LSSVM); the combined model is called GS-LSSVM. The goal of GS-LSSVM is to perform feature selection, apply a mixed kernel function, and optimize the parameters within a single learning procedure. The system operates in three steps. First, orthogonal design (OD) generates the initial parameter solutions, including the number of features to select and the candidate parameter solutions stored in the GS. Second, based on the first grid of the GS, data features are randomly selected from the results of the previous step, and the selected features are fed into the LSSVM. Finally, the ABC algorithm optimizes the LSSVM so as to maximize accuracy (minimize error).
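
    The sketch below is a minimal stand-in for the GS-LSSVM idea: an LSSVM classifier solved through its linear system (with +/-1 labels treated as regression targets) wrapped in a crude random search over feature subsets and kernel parameters. The RBF kernel, the parameter ranges, and the random search that replaces the orthogonal-design grid and ABC optimization are all assumptions made for illustration, not the system proposed in the dissertation.

    # Minimal sketch (assumptions noted above): LSSVM classification with a
    # random feature-subset and parameter search standing in for GS + ABC.
    import numpy as np

    def rbf_kernel(A, B, sigma):
        """Gaussian (RBF) kernel matrix between the row vectors of A and B."""
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-d2 / (2.0 * sigma ** 2))

    def lssvm_fit(X, y, gamma, sigma):
        """Solve the LSSVM linear system, treating +/-1 labels as targets."""
        n = len(y)
        K = rbf_kernel(X, X, sigma)
        A = np.zeros((n + 1, n + 1))
        A[0, 1:] = 1.0
        A[1:, 0] = 1.0
        A[1:, 1:] = K + np.eye(n) / gamma
        sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
        return sol[0], sol[1:]                      # bias b, dual coefficients alpha

    def lssvm_predict(X_train, alpha, b, X_test, sigma):
        return np.sign(rbf_kernel(X_test, X_train, sigma) @ alpha + b)

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        # toy data: 2 informative features out of 6, labels in {-1, +1}
        X = rng.normal(size=(200, 6))
        y = np.sign(X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.normal(size=200))
        train, test = slice(0, 150), slice(150, 200)

        best = (0.0, None)
        for _ in range(30):                         # random search stand-in for ABC
            feats = rng.choice(6, size=rng.integers(1, 7), replace=False)
            gamma, sigma = 10 ** rng.uniform(-1, 3), 10 ** rng.uniform(-1, 1)
            b, alpha = lssvm_fit(X[train][:, feats], y[train], gamma, sigma)
            pred = lssvm_predict(X[train][:, feats], alpha, b, X[test][:, feats], sigma)
            acc = (pred == y[test]).mean()
            if acc > best[0]:
                best = (acc, (tuple(feats), gamma, sigma))
        print("best accuracy:", best[0], "with features/params:", best[1])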
    To evaluate the proposed methods objectively and robustly, the forecasting experiments use four well-known national stock indices: the Dow Jones Industrial Average (DJIA) of the United States, the London FTSE-100, the Tokyo Nikkei-225, and the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX). The classification experiments use ten data sets from the UCI repository. The empirical results show that the forecasting system predicts stock prices effectively and can generate profit in practice, and that the proposed GS-LSSVM classifier achieves better classification results than the other methods compared in this dissertation while using a more parsimonious set of features.


Table of contents:
    ABSTRACT (ENGLISH) I
    ABSTRACT (CHINESE) III
    ACKNOWLEDGEMENT V
    CONTENT VI
    FIGURE CAPTIONS VIII
    TABLE CAPTIONS IX
    CHAPTER 1 INTRODUCTION 1
        1.1 MOTIVATION 1
        1.2 PROBLEM DESCRIPTIONS 1
            1.2.1 Forecasting 1
            1.2.2 Classification 3
        1.3 THE RESEARCH FRAMEWORK 4
    CHAPTER 2 LITERATURE REVIEW 5
        2.1 FORECASTING 5
        2.2 CLASSIFICATION 7
        2.3 ARTIFICIAL BEE COLONY ALGORITHM 11
    CHAPTER 3 METHODOLOGY 14
        3.1 THE FORECASTING SYSTEM: SRCS-WT-SRNN 14
            3.1.1 DATA PREPROCESSING USING WAVELET TRANSFORM 15
            3.1.2 INPUT SELECTION USING SRCS 18
            3.1.3 SIMPLE RECURRENT NEURAL NETWORK 21
                3.1.3.1 SRNN Architecture 21
                3.1.3.2 Solution Representation 22
        3.2 THE CLASSIFICATION SYSTEM: GS-LSSVM 23
            3.2.1 SVM AND LSSVM 23
            3.2.2 GRID SCHEME-LEAST SQUARES SUPPORT VECTOR MACHINE 29
                3.2.2.1 General Learning Framework 29
                3.2.2.2 Orthogonal Design-Based Grid Scheme 32
                3.2.2.3 Parameters Optimization for GS-LSSVM 34
    CHAPTER 4 EXPERIMENTAL RESULTS 36
        4.1 FORECASTING EXPERIMENTS 36
            4.1.1 SRNN Assessment 38
            4.1.2 Forecasting Performance 42
            4.1.3 Profit Evaluation 44
        4.2 CLASSIFICATION EXPERIMENTS 46
            4.2.1 ODBC-Based Feature Selection 47
            4.2.2 Comparisons and Discussion 49
    CHAPTER 5 CONCLUSIONS 53
    APPENDIX 55
        APPENDIX A: FORECASTING RESULTS FOR INTERNATIONAL STOCK MARKETS 55
        APPENDIX B: THE CLASSIFICATION DATA SETS FROM UCI 71
    REFERENCES 72
    CURRICULUM VITAE 78


    Full-text release date: not authorized for public access (campus network)
    Full-text release date: not authorized for public access (off-campus network)
