
Author: Lee, Fang-Yu (李芳妤)
Title: Function approximation with neural network - A direct construction (以神經網路進行函數逼近)
Advisors: Lin, Te-Sheng (林得勝); Lee, Chiun-Chang (李俊璋)
Committee members: Chen, Jen-Hao (陳人豪); Zeng, Yu-Hao (曾昱豪)
Degree: Master
Department: College of Science - Institute of Computational and Modeling Science
Year of publication: 2021
Academic year of graduation: 109 (2020-2021)
Language: English
Pages: 147
Keywords (Chinese): 神經網路、人工智慧、機器學習、函數逼近
Keywords (English): Neural network, Artificial intelligence, Machine learning, Function approximation

Abstract: In this thesis we demonstrate that any continuous function, as well as any decision function, can be approximated by a neural network with only a single hidden layer. We construct the neural network directly. Taking a continuous sigmoidal function as the activation function, we show that if the parameters of the network are chosen appropriately, any N-dimensional continuous function can be approximated to any desired accuracy. Similarly, any N-dimensional continuous function can be approximated by a neural network that uses the ReLU function as its activation function.
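
The abstract's claim has a constructive core: the hidden-layer weights can be written down in closed form, with no training. Below is a minimal NumPy sketch of one way to do this in one dimension, in the spirit of the bump-function construction mentioned in the appendix; the helper name build_sigmoid_network and the parameters k and n_bumps are illustrative assumptions, not the thesis's own notation. Each pair of steep sigmoid neurons forms an approximate indicator of a small interval, and scaling each bump by a sample of f gives a piecewise-constant approximant.

    import numpy as np

    def sigmoid(z):
        # Clip the argument to avoid overflow in exp for very steep inputs.
        return 1.0 / (1.0 + np.exp(-np.clip(z, -30.0, 30.0)))

    def build_sigmoid_network(f, n_bumps=200, k=1.0e5, lo=0.0, hi=1.0):
        """Directly assign (not train) the weights of a one-hidden-layer network.

        sigmoid(k*(x - a)) - sigmoid(k*(x - b)) is close to 1 on [a, b) and close
        to 0 elsewhere when k is large, so scaling each such bump by f at the
        interval midpoint gives a piecewise-constant approximation of f.
        The steepness k should be large relative to n_bumps.
        """
        edges = np.linspace(lo, hi, n_bumps + 1)
        mids = 0.5 * (edges[:-1] + edges[1:])
        W1 = np.full(2 * n_bumps, k)                            # hidden weights: all equal to k
        b1 = np.concatenate([-k * edges[:-1], -k * edges[1:]])  # hidden biases: -k * edge
        W2 = np.concatenate([f(mids), -f(mids)])                # output weights: +/- f(midpoint)

        def network(x):
            x = np.atleast_1d(x)[:, None]        # shape (n_points, 1)
            hidden = sigmoid(x * W1 + b1)        # shape (n_points, 2*n_bumps)
            return hidden @ W2                   # shape (n_points,)

        return network

    if __name__ == "__main__":
        f = lambda x: np.sin(2.0 * np.pi * x) + 0.5 * x
        net = build_sigmoid_network(f)
        x = np.linspace(0.0, 1.0, 1000)
        print("max error:", np.max(np.abs(net(x) - f(x))))

Increasing n_bumps (with k kept large relative to it) drives the maximum error down, which is the quantitative content of the universality statement. Section 3.2 of the thesis develops the same idea with ReLU activations; a sketch of that construction follows the table of contents below.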

Table of contents:
1 Introduction (p. 1)
2 Universal approximation - Theorem and experiments (p. 3)
  2.1 Theoretical results for function approximation (p. 3)
  2.2 Numerical experiments of universal approximation (p. 4)
  2.3 Theoretical results for classification (p. 8)
  2.4 Classification experiments (p. 9)
    2.4.1 Binary classification (p. 10)
    2.4.2 Multiple classification (p. 13)
3 Direct construction of universal approximation (p. 17)
  3.1 Universality of neural networks constructed by a continuous sigmoidal function (p. 17)
    3.1.1 Approximate one-dimensional functions (p. 17)
    3.1.2 Approximate two-dimensional functions (p. 85)
    3.1.3 Approximate three-dimensional functions (p. 97)
    3.1.4 Approximate multiple-dimensional functions (p. 100)
  3.2 Universality of neural networks constructed by the continuous ReLU function (p. 101)
    3.2.1 Approximate one-dimensional functions (p. 101)
4 Conclusion (p. 117)
Appendices (p. 119)
  .1 Numerical experiments of universal approximation (p. 120)
  .2 Classification experiments of universal approximation (p. 127)
    2.1 Binary classification (p. 127)
    2.2 Multiple classification (p. 132)
  .3 Experiments on the direct construction of universal approximation (p. 139)
    3.1 Approximating a one-dimensional function with an untrained neural network constructed from many one-dimensional bump functions and two step functions (p. 139)
    3.2 Approximating a one-dimensional function with a trained neural network constructed from many one-dimensional bump functions and two step functions (p. 141)
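
Section 3.2 of the outline treats the analogous construction with ReLU activations. A standard way to realize it in one dimension, sketched below under the same caveats (build_relu_network and the uniform grid are illustrative choices, not taken from the thesis), is to pick output weights equal to the slope changes of the piecewise-linear interpolant of f, so that the single hidden layer reproduces that interpolant exactly.

    import numpy as np

    def relu(z):
        return np.maximum(z, 0.0)

    def build_relu_network(f, n_pieces=200, lo=0.0, hi=1.0):
        """Directly assign the weights of a one-hidden-layer ReLU network so that
        it equals the piecewise-linear interpolant of f on a uniform grid."""
        knots = np.linspace(lo, hi, n_pieces + 1)
        slopes = np.diff(f(knots)) / np.diff(knots)  # slope of the interpolant on each piece
        W2 = np.diff(slopes, prepend=0.0)            # output weight = slope change at each knot
        b_out = f(knots[0])                          # output bias fixes the value at the left end

        def network(x):
            x = np.atleast_1d(x)[:, None]            # shape (n_points, 1)
            hidden = relu(x - knots[:-1])            # one ReLU neuron per knot, shape (n_points, n_pieces)
            return hidden @ W2 + b_out               # shape (n_points,)

        return network

    if __name__ == "__main__":
        f = lambda x: np.exp(-x) * np.cos(4.0 * np.pi * x)
        net = build_relu_network(f)
        x = np.linspace(0.0, 1.0, 1000)
        print("max error:", np.max(np.abs(net(x) - f(x))))

Because the network coincides with the linear interpolant, its error is exactly the interpolation error, which for a twice-differentiable f decays like O(1/n_pieces^2).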

