| Field | Content |
|---|---|
| Graduate student | 陳家翔 (Chen, Jia-Siang) |
| Thesis title | 使用用戶行為及產品信息建構基於嵌入向量的遞迴神經網路推薦系統 (Constructing Embedding-based Recurrent Neural Network Recommendation System Using User Behavior and Product Information) |
| Advisor | 林澤 (Lin, Che) |
| Oral examination committee | 翁詠祿 (Ueng, Yeong-Luh); 鍾偉和 (Chung, Wei-Ho) |
| Degree | Master (碩士) |
| Department | Department of Electrical Engineering, College of Electrical Engineering and Computer Science (電機資訊學院 電機工程學系) |
| Year of publication | 2019 |
| Academic year of graduation | 107 |
| Language | Chinese |
| Number of pages | 51 |
| Keywords (Chinese) | 遞迴神經網路、嵌入向量、使用者行為、產品信息、推薦系統 |
| Keywords (English) | recurrent neural network, embedding, user behavior, product information, recommendation system |
A personalized recommendation system can bring enormous benefits to an e-commerce platform: introducing one improves service capability and increases profit. In an increasingly competitive environment, a personalized recommendation system also helps retain users and strengthen their loyalty, preventing customer churn.
In the past, e-commerce platforms mostly relied on manual product recommendation to drive sales. In today's era of big data, transaction records are voluminous and heterogeneous, so continuing to segment customers and recommend products manually would require ever more manpower and corporate resources. By building a personalized recommendation system with machine learning and deep learning instead, a company can analyze user and product data automatically and quickly uncover hidden signals that are difficult for humans to identify.
While working with our partner company, AsiaYo (亞洲遊股份有限公司), we learned that a very high proportion of its users are first-time visitors and that, owing to its marketing strategy, the company tends to avoid collecting users' basic personal information. It is therefore difficult to infer user preferences from historical records, and the company faces both the new-user (cold-start) problem and the sparsity problem. Our strategy therefore focuses on users' browsing histories on the booking platform and infers each user's preferences from the content of the products they have viewed. This approach does not depend on personal information, so even a brand-new user can be given a basis for recommendation.
Personalized recommendation systems are generally built on four types of data: product information, transaction data, user behavior data, and rating data. In this study, we collect only product information together with users' web-browsing and booking records to build a personalized recommendation system. An autoencoder extracts features from the product information to obtain a high-quality representation vector, and the users' browsing and booking records introduce the temporal dimension: a recurrent neural network analyzes these time series to predict user preferences and recommend products. In our experiments, the proposed algorithm raises the partner company's recommendation success rate from 31.5% to 54.2%, and we expect this result to contribute substantially to improving the website's conversion rate.
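To make the feature-extraction step concrete, the following is a minimal sketch of an autoencoder that compresses a product's feature vector into a dense embedding. It assumes products are already described by fixed-length numeric vectors; the layer sizes, dimensions, and training settings are illustrative assumptions, not specifics taken from the thesis.

```python
import torch
import torch.nn as nn

class ProductAutoencoder(nn.Module):
    """Compresses a product feature vector into a dense embedding.
    Layer sizes are illustrative assumptions, not thesis specifics."""
    def __init__(self, input_dim: int, embed_dim: int = 32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(input_dim, 128), nn.ReLU(),
            nn.Linear(128, embed_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(embed_dim, 128), nn.ReLU(),
            nn.Linear(128, input_dim),
        )

    def forward(self, x):
        z = self.encoder(x)          # product embedding used later by the recommender
        return self.decoder(z), z    # reconstruction plus embedding

def train_autoencoder(product_features: torch.Tensor, embed_dim: int = 32, epochs: int = 50):
    """Hypothetical training loop over a (n_products x input_dim) feature matrix."""
    model = ProductAutoencoder(product_features.shape[1], embed_dim)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        optimizer.zero_grad()
        recon, _ = model(product_features)
        loss = loss_fn(recon, product_features)  # reconstruction objective
        loss.backward()
        optimizer.step()
    return model
```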
A well-designed personalized recommendation system can be tremendously beneficial to e-commerce platforms. Such a system not only improves profitability but also helps retain loyal customers and prevent churn.
In the past, e-commerce companies marketed to consumers with manually curated recommendations. In today's era of big data, building a personalized recommendation system with machine learning or deep learning lets a company analyze customer and product data automatically and systematically, and quickly adapt to hidden consumer preferences that are difficult for humans to identify.
In the process of collaborating with our partner company, AsiaYo, we learned that a very high percentage of their customers are new to the service. Furthermore, because of its marketing strategy, the company tends to collect little information from its customers. In this setting it is difficult to identify customers' preferences, since we face both the new-user (cold-start) problem and the sparsity problem. As a result, we turned to customers' browsing histories on AsiaYo's platform to predict their preferences without relying on personal information.
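As a rough illustration of how browsing histories can drive prediction without any personal information, the sketch below turns a browsing log into next-item training sequences: each user's viewed items, ordered by time, form an input history whose following item is the prediction target. The field names (user_id, item_id, timestamp) are hypothetical and not taken from AsiaYo's data schema.

```python
from collections import defaultdict
from typing import Dict, List, Tuple

def build_sequences(browse_log: List[Dict]) -> List[Tuple[List[str], str]]:
    """Group page views by user, sort by time, and emit (history, next_item)
    pairs for next-item prediction. Rows are assumed to look like
    {"user_id": ..., "item_id": ..., "timestamp": ...}."""
    per_user = defaultdict(list)
    for row in browse_log:
        per_user[row["user_id"]].append((row["timestamp"], row["item_id"]))

    samples = []
    for _, events in per_user.items():
        items = [item for _, item in sorted(events)]
        # every prefix of at least one viewed item predicts the item that follows it
        for t in range(1, len(items)):
            samples.append((items[:t], items[t]))
    return samples
```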
Personalized recommendation systems are generally based on four types of data: product information, transaction data, user behavior data, and rating data. In this study, we collect only product information and customers' web-browsing and booking records to establish a personalized recommendation system. We use an autoencoder to extract valuable features from the product information and obtain a high-quality representation vector, then analyze the resulting time-series data with a recurrent neural network (the combined AE-RNN model) to predict customer preferences and make personalized recommendations. In our tests, the proposed AE-RNN improves the recommendation success rate from 31.5% to 54.2%. Given this promising result, we expect this work to contribute significantly to improving AsiaYo's order conversion rate and, potentially, its bottom line in the near future.
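The following is a minimal sketch, in the spirit of the AE-RNN described above, of how the pieces could fit together: pretrained product embeddings (for example, from the autoencoder sketch earlier) feed a recurrent layer whose final hidden state scores every candidate product. The choice of a GRU cell and all dimensions are assumptions made for illustration; the abstract does not specify the architecture details.

```python
import torch
import torch.nn as nn

class AERNNRecommender(nn.Module):
    """Embedding-based sequence recommender: a frozen lookup table of
    autoencoder-derived product vectors feeds a recurrent layer, and the
    final hidden state scores all products. Cell type and sizes are
    illustrative assumptions, not thesis specifics."""
    def __init__(self, item_embeddings: torch.Tensor, hidden_dim: int = 64):
        super().__init__()
        n_items, embed_dim = item_embeddings.shape
        self.item_emb = nn.Embedding.from_pretrained(item_embeddings, freeze=True)
        self.rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.output = nn.Linear(hidden_dim, n_items)  # score every product

    def forward(self, item_id_seq: torch.Tensor) -> torch.Tensor:
        # item_id_seq: (batch, seq_len) integer indices of previously viewed products
        x = self.item_emb(item_id_seq)       # (batch, seq_len, embed_dim)
        _, h = self.rnn(x)                   # h: (1, batch, hidden_dim)
        return self.output(h.squeeze(0))     # (batch, n_items) preference scores

# A recommendation "success" could then be measured, for instance, as whether
# the item the user eventually booked appears among the top-k scored products
# for that user's browsing sequence.
```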