Student: | 吳宜謙 Wu, Yi-Chien |
---|---|
Thesis Title: | 使用卷積-長短期記憶神經網路進行股票交易 (Stock Trading Using a CNN-LSTM Neural Network Model) |
Advisors: | 陳人豪 Chen, Jen-Hao; 李俊璋 Li, Jun-Zhang |
Committee Members: | 劉晉良 Liu, Jin-Liang; 陳仁純 Chen, Ren-Chun |
Degree: | Master |
Department: | College of Science - Institute of Computational and Modeling Science |
Publication Year: | 2021 |
Academic Year: | 109 |
Language: | Chinese |
Pages: | 29 |
Keywords: | Convolutional Neural Network, Long Short-Term Memory, Stock Trading, Data Labeling, Technical Indicators, Trading Backtest |
In this thesis, we transform stock price data from a one-dimensional series into two-dimensional image data and feed it to a deep CNN and LSTM for training. To generate the two-dimensional images, we apply 15 technical indicators over 15 interval lengths, producing a 15×15 image for each trading day. Each image is labeled buy, sell, or hold according to whether the stock price sits at a peak or a valley. Given the model, input data, and labels, we first train the CNN part and extract 225 features as a one-dimensional vector, which we then reshape into a 15×15 matrix as the input to the LSTM part. Backtests on the stock market show that the proposed CNN-LSTM model outperforms CNN, Buy-and-Hold (BaH), and other methods.
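The indicator-to-image conversion and peak/valley labeling described above can be sketched as follows. This is a minimal illustration, not the thesis's implementation: the three indicators below (SMA, momentum, rate of change) stand in for the 15 indicators used in the thesis, and the interval lengths and 5-day extremum window are assumed values.

```python
def sma(prices, n, t):
    """Simple moving average of the (up to) n prices ending at day t."""
    window = prices[max(0, t - n + 1): t + 1]
    return sum(window) / len(window)

def momentum(prices, n, t):
    """Price change over the last n days."""
    return prices[t] - prices[max(0, t - n)]

def roc(prices, n, t):
    """Rate of change (relative price change) over the last n days."""
    past = prices[max(0, t - n)]
    return (prices[t] - past) / past

INDICATORS = [sma, momentum, roc]   # the thesis uses 15 indicators
INTERVALS = list(range(6, 21))      # the thesis uses 15 interval lengths

def to_image(prices, t):
    """Build the 2-D image for day t: one row per indicator, one column
    per interval length (15 x 15 in the thesis)."""
    return [[f(prices, n, t) for n in INTERVALS] for f in INDICATORS]

def label_day(prices, t, half_window=5):
    """Label day t 'buy' at a local valley, 'sell' at a local peak,
    and 'hold' otherwise (window size is an assumption)."""
    lo, hi = max(0, t - half_window), min(len(prices), t + half_window + 1)
    segment = prices[lo:hi]
    if prices[t] == min(segment):
        return "buy"
    if prices[t] == max(segment):
        return "sell"
    return "hold"
```

Each trading day thus yields one labeled image; in the thesis these pairs train the CNN, whose 225 extracted features are reshaped to 15×15 and passed to the LSTM.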