Graduate Student: Wang, Yong-Ting (王詠婷)
Thesis Title: Real-Time Continuous Gesture Recognition with Wireless Wearable IMU Sensors (無線穿戴式裝置上之慣性測量單元的即時連續姿態辨識)
Advisor: Ma, Hsi-Pin (馬席彬)
Committee Members: Tsai, Pei-Yun (蔡佩芸); Huang, Po-Chiun (黃柏鈞)
Degree: Master
Department: Department of Electrical Engineering, College of Electrical Engineering and Computer Science
Publication Year: 2017
Academic Year: 105
Language: English
Pages: 88
Keywords (Chinese): 無線穿戴式裝置、慣性測量單元、即時連續姿態辨識
Keywords (English): IMU, Continuous Gesture Recognition, Real-Time
Gesture recognition is a topic in computer science and language technology whose goal is to interpret human gestures through mathematical algorithms.
In this thesis, raw data for ten kinds of hand gestures are recorded by a wrist-worn inertial measurement unit (IMU) providing six-axis data (accelerometer and gyroscope) and transmitted to a computer over Bluetooth Low Energy (BLE), where the proposed algorithm recognizes the subject's movements. The recognition system is built on a machine-learning pipeline. The gestures fall into two categories: the first is single gestures, comprising ten basic movements; the second is continuous gestures, composed of combinations of those ten basic movements.
To achieve higher recognition accuracy, we follow a machine-learning classification flow and further perform feature selection and extraction, applying principal component analysis (PCA) followed by linear discriminant analysis (LDA) to extract well-separated features. The advantage of PCA and LDA is that they reduce the dimensionality of the data, shortening the subsequent classifier training time, while retaining as much of the original information as possible. We also develop a method for constructing a feature matrix suited to the downstream classifier to achieve better performance. The classifiers used are the support vector machine (SVM) and dynamic time warping (DTW): the SVM offers higher recognition accuracy, lower computation time, and support for high-dimensional data, while DTW compensates for differences in movement duration across subjects and enables recognition of continuous gestures. In the experiments, single gestures are divided into 10 classes with 40 subjects: accuracy reaches 100% in the user-dependent case and 90% in the user-independent case. For fixed continuous combinational gestures under the user-independent condition we obtain 86.99% accuracy, and for arbitrary continuous combinational gestures, 60%.
We also relax the real-time limitation of the SVM, shortening the prediction time from 2.118 seconds to 0.195 seconds and improving its applicability to real-time use.
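The PCA step described above projects high-dimensional sensor features onto the directions of maximum variance. As a minimal illustration (not the thesis's implementation, which uses PCA followed by LDA on real IMU features), the first principal component can be estimated in pure Python by power iteration on the covariance matrix; the toy data and iteration count below are assumptions for the sketch.

```python
import math
import random

def first_principal_component(data, iters=200):
    """Estimate the first principal component of `data` (a list of
    equal-length feature vectors) by power iteration on the covariance
    matrix. Projecting samples onto this direction reduces them to one
    dimension while keeping the direction of maximum variance."""
    n, d = len(data), len(data[0])
    # Center the data so the covariance is X^T X / n.
    means = [sum(row[j] for row in data) / n for j in range(d)]
    centered = [[row[j] - means[j] for j in range(d)] for row in data]
    v = [random.random() for _ in range(d)]  # random start direction
    for _ in range(iters):
        # Compute C v as X^T (X v) / n without forming C explicitly.
        xv = [sum(r[j] * v[j] for j in range(d)) for r in centered]
        w = [sum(centered[i][j] * xv[i] for i in range(n)) / n
             for j in range(d)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]  # renormalize each iteration
    return v

random.seed(1)
pts = [[t, t] for t in range(-5, 6)]  # perfectly correlated 2-D points
pc = first_principal_component(pts)
print(pc)  # → approximately [0.7071, 0.7071]
```

For perfectly correlated points the principal direction is the diagonal, so both components come out equal in magnitude; LDA would then be applied on top of such projections to sharpen class separation.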
Gesture recognition is a broad topic in computer science and language technology, with the goal of interpreting human gestures via mathematical algorithms.
In this thesis, we record signals of ten kinds of hand movements using a wearable wireless inertial measurement unit (IMU) device with six-axis data (accelerometer and gyroscope). The sensor is worn on the wrist, and the raw data are transmitted to the computer via Bluetooth Low Energy (BLE). To recognize the captured data, a recognition system based on a machine-learning classification process is built. Our movements can be divided into two categories: the first is single gestures, which include ten basic movements, and the second is continuous combinational gestures, which are composed of the ten basic movements in different combinations.
In order to achieve higher recognition accuracy, we use a machine-learning process in the system together with two analyses, principal component analysis (PCA) and linear discriminant analysis (LDA), to extract well-distinguished features. The main advantage of PCA and LDA is that they reduce the dimensionality of the data while preserving as much of the class-discriminatory information as possible; the reduced dimensionality also decreases later processing time. Classification is then performed with a support vector machine (SVM) and dynamic time warping (DTW). With the SVM, we can recognize movements with higher accuracy and less computation time; high-dimensional data are also supported, and even non-linear relations can be modeled with more precise classification thanks to SVM kernels. Dynamic time warping increases recognition accuracy by categorizing movements through measuring the resemblance among temporal sequences that may vary in speed. In our experiments, recognition accuracy reaches 100% for 10 classes of single gestures with 40 subjects in the user-dependent case, and 90% in the user-independent case. For continuous combinational gestures in the user-independent case, accuracy is 86.99% for fixed combinations and 60% for arbitrary combinations.
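The DTW comparison described above, matching sequences that "may vary in speed," can be sketched with the classic dynamic-programming recurrence. The toy sequences and the absolute-difference local cost below are illustrative assumptions, not the thesis's exact configuration.

```python
def dtw_distance(a, b):
    """Classic dynamic time warping between two 1-D sequences.

    Returns the accumulated alignment cost; a smaller value means the
    sequences are more similar after non-linear time alignment."""
    INF = float("inf")
    n, m = len(a), len(b)
    # dp[i][j] = minimal cost of aligning a[:i] with b[:j]
    dp = [[INF] * (m + 1) for _ in range(n + 1)]
    dp[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])  # local distance
            dp[i][j] = cost + min(dp[i - 1][j],      # insertion
                                  dp[i][j - 1],      # deletion
                                  dp[i - 1][j - 1])  # match
    return dp[n][m]

# A slow and a fast rendition of the "same" gesture still align closely,
# while a differently shaped sequence yields a larger distance.
slow = [0, 1, 2, 3, 3, 2, 1, 0]
fast = [0, 2, 3, 2, 0]
other = [3, 3, 0, 0, 3, 3, 0, 0]
print(dtw_distance(slow, fast) < dtw_distance(slow, other))  # → True
```

This speed-invariance is what lets a template-based matcher absorb per-subject differences in movement duration.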
We have also overcome one of the restrictions of the support vector machine: instead of running the algorithm offline after all the data have been measured, the algorithm runs during the measurement process, which greatly shortens the prediction time from 2.118 seconds to 0.195 seconds and enhances the real-time applicability.
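Classifying during measurement rather than after it can be organized as a sliding window over the live sample stream. The abstract does not give implementation details, so the window length, hop size, and the stand-in threshold classifier below are all hypothetical; in the thesis the classifier at this point would be the trained SVM.

```python
from collections import deque

WINDOW = 52  # samples per decision window (hypothetical value)
STEP = 13    # hop size: classify once every STEP new samples

def stream_classify(samples, classify):
    """Run the classifier while data are still arriving, instead of once
    after the whole recording: a bounded deque holds the most recent
    WINDOW samples, and a decision is emitted every STEP samples."""
    window = deque(maxlen=WINDOW)
    decisions = []
    for k, s in enumerate(samples, 1):
        window.append(s)  # oldest sample drops out automatically
        if len(window) == WINDOW and k % STEP == 0:
            decisions.append(classify(list(window)))
    return decisions

# Toy stand-in for the SVM: thresholds the mean absolute amplitude.
label = lambda w: "active" if sum(abs(x) for x in w) / len(w) > 0.5 else "rest"
stream = [0.0] * 60 + [1.0] * 60  # rest, then motion
print(stream_classify(stream, label))
```

Because each decision touches only the current window, the per-decision latency stays constant regardless of how long the recording runs, which is the property behind the reported drop in prediction time.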