
Graduate Student: 羅世宏 (Lo, Shih-Hong)
Thesis Title: AirType: In-Air Typing-Recognition System Based on Machine Learning
(Chinese title: AirType: 基於機器學習之空中輸入識別系統)
Advisor: 周百祥 (Chou, Pai H.)
Committee Members: 周志遠 (Chou, Jerry), 蔡明哲 (Tsai, Ming-Jer)
Degree: Master
Department:
Year of Publication: 2018
Academic Year of Graduation: 106
Language: English
Number of Pages: 37
Chinese Keywords: 嵌入式系統, 機器學習, 手勢辨識
Foreign Keywords: Embedded System, Machine Learning, Gesture Recognition
  • This thesis proposes a series of algorithms for the data segmentation, processing, and classification needed for in-air typing recognition. We also design a finger-worn motion-sensing device, equipped with a miniature inertial sensor and a microcontroller with Bluetooth capability, which collects finger-motion data and transmits it over Bluetooth to a host computer for processing. The proposed algorithms perform data segmentation and feature extraction, then classify gestures with the machine-learning-based k-nearest-neighbors algorithm; mapping the classification result onto a predefined imaginary keyboard identifies which key the user pressed. We implemented the proposed algorithms on our finger-worn device. Experimental results show that our algorithms achieve high accuracy whether the model is trained per user or across users: in our experiments, the user-dependent and user-independent models reach accuracies of 90.8% and 90.2%, respectively.
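The segmentation step mentioned in the abstract could, in principle, look like the following sketch, which splits an accelerometer stream into candidate keystroke segments. The thesis's actual segmentation algorithm is not detailed in this record, so the magnitude threshold, minimum segment length, and function name here are illustrative assumptions only.

```python
import math

def segment_strokes(samples, threshold=1.2, min_len=5):
    """Split a stream of (ax, ay, az) accelerometer samples (in g) into
    candidate keystroke segments wherever the acceleration magnitude
    exceeds `threshold`. Values are hypothetical, not from the thesis."""
    segments, current = [], []
    for ax, ay, az in samples:
        if math.sqrt(ax * ax + ay * ay + az * az) > threshold:
            # High motion energy: accumulate into the current segment.
            current.append((ax, ay, az))
        elif current:
            # Motion ended: keep the segment only if it is long enough
            # to plausibly be a keystroke rather than noise.
            if len(current) >= min_len:
                segments.append(current)
            current = []
    if len(current) >= min_len:
        segments.append(current)
    return segments
```

For example, a stream of resting samples (magnitude ≈ 1 g from gravity) surrounding a short burst of high-magnitude samples would yield one candidate segment.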


    We propose a series of algorithms for air-typing recognition from data collected by a finger-wearable motion-sensing ring. This finger-worn unit consists of a miniature inertial measurement unit (IMU) and a microcontroller unit (MCU) with an on-chip Bluetooth Low Energy (BLE) transceiver. Our proposed algorithms perform data segmentation, feature extraction, and classification based on k-Nearest Neighbors (kNN) to recognize the gestures and map them onto an imaginary keyboard. Experimental results show that our air-typing system achieves recognition accuracies of 90.8% and 90.2% in the user-dependent and user-independent cases, respectively.
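The kNN classification step described above can be sketched as follows, assuming each segmented gesture has already been reduced to a fixed-length feature vector. The feature values, key labels, and function name below are made up for illustration; the thesis's actual features and distance metric are not specified in this record.

```python
import math
from collections import Counter

def knn_classify(train, query, k=3):
    """train: list of (feature_vector, key_label) pairs; query: a feature
    vector. Returns the majority key label among the k nearest training
    examples by Euclidean distance."""
    # Sort all training examples by distance to the query.
    dists = sorted((math.dist(vec, query), label) for vec, label in train)
    # Majority vote among the k closest neighbors.
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]
```

A usage sketch with two hypothetical key classes:

```python
train = [([0.0, 0.0], "a"), ([0.1, 0.2], "a"),
         ([5.0, 5.0], "b"), ([5.1, 4.9], "b")]
knn_classify(train, [0.05, 0.1])  # nearest neighbors vote for "a"
```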

    Contents i
    Acknowledgments vi
    1 Introduction 1
      1.1 Motivation 1
      1.2 Contributions 2
      1.3 Thesis Organization 2
    2 Related Work 3
      2.1 Projection-Based Approach 3
      2.2 Ring-Based Approach 4
    3 Background Theory 5
      3.1 K-Nearest Neighbors 5
      3.2 Support Vector Machine 6
      3.3 Decision Tree 7
      3.4 Naive Bayes 9
      3.5 Logistic Regression 9
    4 Technical Approach 11
      4.1 Gesture Representation 11
      4.2 Motion Sensing System 12
      4.3 Segmentation Algorithm 13
      4.4 Recognition Algorithm 15
        4.4.1 Normalization 15
        4.4.2 Feature Extraction 15
        4.4.3 Principal Components Analysis 18
        4.4.4 Classification 19
        4.4.5 Keystroke Identification 20
    5 System Architecture and Implementation 21
      5.1 System Architecture 21
      5.2 Node Subsystem 22
        5.2.1 Microcontroller Unit 22
        5.2.2 Inertial Measurement Unit 22
        5.2.3 Bluetooth Low Energy 23
      5.3 Host Subsystem 24
    6 Evaluation 25
      6.1 Experimental Setup 25
      6.2 Experimental Results 26
        6.2.1 Dimensionality Comparison 26
        6.2.2 Recognition Accuracy 27
        6.2.3 Performance 30
      6.3 Discussion 30
    7 Conclusions and Future Work 34
      7.1 Conclusions 34
      7.2 Future Work 34
        7.2.1 Typing on any plane 34
        7.2.2 Full-sized keyboard 35
        7.2.3 Mouse Mode 35

