| Graduate Student: | 張泰隆 |
|---|---|
| Thesis Title: | 重點重覆抽樣應用於類神經網路模型的選擇 (Importance Resampling for Neural Model Selection) |
| Advisor: | 洪文良 |
| Committee Members: | |
| Degree: | Master |
| Department: | |
| Year of Publication: | 2005 |
| Graduation Academic Year: | 93 |
| Language: | Chinese |
| Number of Pages: | 39 |
| Keywords (Chinese): | 自助法、多層認知器、模型選擇、重點重覆抽樣 |
| Keywords (English): | Bootstrap, Multilayer perceptrons, Model selection, Importance resampling |
The bootstrap, also known as a resampling technique, draws a series of new samples from the original data; by computing statistics such as the sample mean and standard deviation, it assesses the stability of the model parameters and thereby serves as a reference for neural network model selection (Kallel et al. [1]). The method has two drawbacks, however. First, if the sample size is large or the model architecture is complex, the computer simulation time becomes long. Second, the bootstrap requires at least 50 to 200 resampling replications to obtain a reliable estimate of the standard error.
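To make the procedure concrete, here is a minimal sketch of the uniform bootstrap scheme just described, not the code used in the thesis: each replication redraws the data with replacement, refits a multilayer perceptron, and records its prediction error on the cases left out, so the mean and standard deviation of the error across replications indicate how stable each candidate architecture is. The choice of NumPy and scikit-learn's MLPRegressor, the simulated data, and the out-of-bag error criterion are assumptions made for illustration.

```python
# Illustrative sketch only: uniform bootstrap for neural model selection,
# roughly in the spirit of Kallel et al. [1].  Library and data choices
# are assumptions, not the thesis implementation.
import numpy as np
from sklearn.neural_network import MLPRegressor

def bootstrap_model_error(X, y, hidden_units, B=50, seed=0):
    """Mean and standard deviation of the out-of-bag prediction error
    of one MLP architecture over B uniform bootstrap replications."""
    rng = np.random.default_rng(seed)
    n = len(X)
    errors = []
    for _ in range(B):
        idx = rng.integers(0, n, size=n)          # uniform resampling with replacement
        oob = np.setdiff1d(np.arange(n), idx)     # cases left out of this replication
        net = MLPRegressor(hidden_layer_sizes=(hidden_units,),
                           max_iter=2000, random_state=0)
        net.fit(X[idx], y[idx])
        errors.append(np.mean((net.predict(X[oob]) - y[oob]) ** 2))
    errors = np.asarray(errors)
    return errors.mean(), errors.std(ddof=1)

# Pick the architecture whose bootstrap error is both small and stable.
rng = np.random.default_rng(1)
X = rng.uniform(-1.0, 1.0, size=(100, 1))
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.standard_normal(100)
for h in (2, 5, 10):
    m, s = bootstrap_model_error(X, y, h)
    print(f"hidden units = {h:2d}: MSE mean = {m:.4f}, std = {s:.4f}")
```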
In view of this, this thesis investigates neural network model selection using importance resampling. Several simulated data sets and one real data set are used for testing. The results show that importance resampling needs roughly 2 to 3 times fewer replications than uniform resampling, which reduces computer simulation time and makes neural network model selection more accurate and more efficient.
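The following is a correspondingly minimal sketch of importance resampling, again illustrative rather than the thesis implementation: instead of drawing every bootstrap index with probability 1/n, indices are drawn from a tilted distribution p, and each replication is reweighted by the importance weight prod_i 1/(n p_i) so that the weighted statistics still estimate the uniform-bootstrap quantities, ideally from fewer replications. The toy statistic (a sample mean rather than MLP parameters) and the particular tilting rule are assumptions.

```python
# Illustrative sketch only: self-normalised importance resampling for a
# bootstrap standard-error estimate.  The tilting rule and the statistic
# are simplifications for illustration.
import numpy as np

def importance_bootstrap_se(x, B=50, seed=0):
    """Importance-resampling estimate of the bootstrap standard error
    of the sample mean."""
    rng = np.random.default_rng(seed)
    n = len(x)
    # Mild tilt: mix uniform weights with |deviation from the mean|,
    # so influential observations are drawn a little more often.
    dev = np.abs(x - x.mean())
    p = 0.5 / n + 0.5 * dev / dev.sum()
    stats, logw = [], []
    for _ in range(B):
        idx = rng.choice(n, size=n, replace=True, p=p)   # non-uniform resampling
        logw.append(-np.sum(np.log(n * p[idx])))         # log importance weight vs. uniform
        stats.append(x[idx].mean())                      # statistic of interest
    stats, logw = np.asarray(stats), np.asarray(logw)
    w = np.exp(logw - logw.max())                        # numerically stabilised weights
    m = np.sum(w * stats) / np.sum(w)                    # weighted bootstrap mean
    var = np.sum(w * (stats - m) ** 2) / np.sum(w)       # weighted bootstrap variance
    return np.sqrt(var)

x = np.random.default_rng(2).standard_normal(60)
print("importance-resampling SE of the mean:", importance_bootstrap_se(x))
print("classical SE formula:", x.std(ddof=1) / np.sqrt(len(x)))
```

In this setting, the abstract's claim corresponds to reaching a comparably stable standard-error estimate with roughly 2 to 3 times fewer replications than the uniform scheme.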
Bootstrap techniques, which are resampling computation techniques, have introduced new advances in model evaluation (Kallel et al., Bootstrap for neural model selection [1]). Using resampling methods to construct a series of new samples based on the original data set makes it possible to estimate the stability of the model parameters. However, two main disadvantages must be outlined. First, if the sample size or the model complexity is high, computation time can be very long. Second, following Efron (1987), at least 50 to 200 bootstrap replications are needed to obtain reasonable standard error estimates.
In this thesis, we apply the importance resampling method to neural model selection. Experiments on several simulated data sets and one real data set indicate that it performs better than uniform resampling: it reduces the number of replications by a factor of about 2 to 3, and hence the computation time.
References
[1] R. Kallel, M. Cottrell, V. Vigneron, Bootstrap for neural model selection, Neurocomputing 48 (2002) 175-183.
[2] B. Cheng, D.M. Titterington, Neural networks: a review from a statistical perspective, Statist. Sci. 9 (1994) 2-54.
[3] J. Hertz, A. Krogh, R. Palmer, Introduction to the Theory of Neural Computation, Addison-Wesley, Redwood City, CA, 1991.
[4] B. Efron, R. Tibshirani, An Introduction to the Bootstrap, Chapman & Hall, London, 1993.
[5] P. Hall, The Bootstrap and Edgeworth Expansion, Springer-Verlag, New York, 1992.
[6] S. Haykin, Neural Networks: A Comprehensive Foundation, Prentice Hall, NJ, 1999.
[7] M.T. Hagan, H.B. Demuth, M. Beale, Neural Network Design, Thomson Learning, Australia, 1996.
[8] Y. Hamamoto, S. Uchimura, S. Tomita, A bootstrap technique for nearest neighbor classifier design, IEEE Trans. PAMI 19 (1997) 73-79.
[9] A. Zapranis, A.P. Refenes, Principles of Neural Model Identification, Selection and Adequacy, Springer, London, 1999.
[10] N.R. Draper, H. Smith, Applied Regression Analysis, 3rd edition, Wiley-Interscience, 1998.
[11] S.M. Ross, Simulation, 3rd edition, Academic Press, 2002.