| Field | Value |
|---|---|
| Graduate student | 蔡忠良 (Tsai, Chung-Liang) |
| Thesis title | 架構在不同損失函數下的深度學習 / Deep Learning under Various Loss Functions |
| Advisor | 洪文良 (Hung, Wen-Liang) |
| Committee members | 張延彰 (Chang, Yen-Chang); 沈冠甫 (Shen, Kuan-Fu) |
| Degree | Master |
| Department | Department of Applied Mathematics (南大校區系所調整院務中心 - 應用數學系所) |
| Year of publication | 2019 |
| Graduating academic year | 107 |
| Language | English |
| Pages | 11 |
| Keywords | deep learning; loss function; CNN; convolutional neural network |
This study asks whether replacing the usual loss function with Rényi entropy or the Gini index still performs well in deep learning.
First, a simulated dataset was generated from a uniform distribution and used to compare four information measures: Shannon entropy, Rényi entropy, cross entropy, and the Gini index; Rényi entropy and cross entropy performed best in this simulation. Next, Rényi entropy (a generalization of Shannon entropy) and the Gini index were adopted as loss functions and compared, within deep learning models, against cross entropy, the loss function most commonly used in deep learning.
The best performers were Rényi entropy and cross entropy.
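For concreteness, here is a minimal NumPy sketch of the four measures compared above, using their standard textbook definitions; the choice α = 2 and the example distributions are illustrative only, since the record does not state the values used in the simulation:

```python
import numpy as np

def shannon_entropy(p, eps=1e-12):
    """H(p) = -sum_i p_i * log(p_i)."""
    p = np.asarray(p, dtype=float)
    return -np.sum(p * np.log(p + eps))

def renyi_entropy(p, alpha=2.0, eps=1e-12):
    """H_alpha(p) = log(sum_i p_i^alpha) / (1 - alpha); tends to Shannon as alpha -> 1."""
    p = np.asarray(p, dtype=float)
    return np.log(np.sum(p ** alpha) + eps) / (1.0 - alpha)

def cross_entropy(p, q, eps=1e-12):
    """H(p, q) = -sum_i p_i * log(q_i)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return -np.sum(p * np.log(q + eps))

def gini_index(p):
    """G(p) = 1 - sum_i p_i^2."""
    p = np.asarray(p, dtype=float)
    return 1.0 - np.sum(p ** 2)

# Compare the measures on a uniform vs. a peaked distribution over 4 classes.
uniform = np.full(4, 0.25)
peaked = np.array([0.85, 0.05, 0.05, 0.05])
for name, f in [("Shannon", shannon_entropy),
                ("Renyi(a=2)", renyi_entropy),
                ("Gini", gini_index)]:
    print(name, f(uniform), f(peaked))
print("Cross entropy", cross_entropy(uniform, peaked))
```

All four measures are maximal on the uniform distribution and shrink as the distribution becomes more peaked, which is what makes each of them a candidate training signal for a classifier.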
This research replaces the cross-entropy loss function of deep learning with Rényi entropy and the Gini index, and examines whether training and test performance remain good. First, we generated a simulated dataset to compare four entropies: Shannon entropy, Rényi entropy, cross entropy, and the Gini index. Rényi entropy and cross entropy performed well on this simulated dataset.
Next, we took Rényi entropy and the Gini index as loss functions and compared them with cross entropy, the default loss function commonly used in deep learning. The main finding of this thesis is that the proposed Rényi entropy loss performs slightly better than cross entropy.
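The record does not reproduce the exact loss definitions used in the experiments, so the following PyTorch sketch is only one plausible way to swap cross entropy for a Rényi- or Gini-based loss. The helpers `renyi_loss` and `gini_loss`, the label-smoothing step, and α = 2 are all assumptions for illustration, not the thesis's stated method:

```python
import torch
import torch.nn.functional as F

def renyi_loss(logits, targets, alpha=2.0, smoothing=0.1, eps=1e-12):
    """Renyi divergence D_alpha(p || q) between a label-smoothed target
    distribution p and the softmax output q, averaged over the batch.
    With hard one-hot targets D_alpha reduces exactly to cross entropy,
    so smoothing is what keeps the alpha parameter meaningful here."""
    n_classes = logits.size(1)
    q = F.softmax(logits, dim=1)
    p = torch.full_like(q, smoothing / (n_classes - 1))
    p.scatter_(1, targets.unsqueeze(1), 1.0 - smoothing)
    inner = torch.sum(p.pow(alpha) * q.pow(1.0 - alpha), dim=1)
    return torch.mean(torch.log(inner + eps) / (alpha - 1.0))

def gini_loss(logits, targets):
    """One Gini-style variant: mean of 1 - q_y^2, pushing probability
    mass onto the true class y (again, a guess at the exact definition)."""
    q = F.softmax(logits, dim=1)
    q_true = q.gather(1, targets.unsqueeze(1)).squeeze(1)
    return torch.mean(1.0 - q_true.pow(2))

# Drop-in replacement for F.cross_entropy in a CNN training loop:
logits = torch.randn(8, 10, requires_grad=True)  # batch of 8, 10 classes
labels = torch.randint(0, 10, (8,))
loss = renyi_loss(logits, labels)
loss.backward()
print(loss.item())
```

Because the Rényi divergence collapses to ordinary cross entropy on one-hot targets, any practical comparison of the two losses has to differ somewhere, here via soft targets; the thesis's own construction may differ.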