| Field | Value |
|---|---|
| Graduate student | 李彥儀 Lee, Yan-Yi |
| Thesis title | 利用慣性測量系統與基於物理模型機器學習方法提高手臂動態運動之量測精準度 (Enhancing Measurement Accuracy of Arm Motion Dynamics with Inertial Measurement Systems and Model-Based Machine Learning Methods) |
| Advisor | 張禎元 Chang, Jen-Yuan |
| Committee members | 陳榮順 Chen, Rong-Shun; 李志鴻 Li, Chih-Hung; 馮國華 Feng, Guo-Hua; 李俊則 Lee, Chun-Tse |
| Degree | Master |
| Department | Department of Power Mechanical Engineering, College of Engineering |
| Year of publication | 2023 |
| Academic year | 111 |
| Language | Chinese |
| Pages | 92 |
| Keywords (Chinese) | 慣性式量測系統、手臂物理模型、基於物理模型機器學習 |
| Keywords (English) | Inertial measurement system; arm physical model; physics-guided machine learning based on physical models |
Abstract:

With the rapid advancement of modern technology, precise measurement of dynamic arm motion has become essential in fields such as medicine, robotics, and motion analysis. Several methods already exist for tracking the dynamic position and posture of the human arm, but they tend to be costly, to restrict arm mobility, or to be susceptible to interference and noise. This study therefore proposes combining an inertial measurement system with Physics-Guided Neural Networks (PGNN) to improve the measurement accuracy of dynamic arm motion. PGNN is a model-based machine learning approach that incorporates physical knowledge as prior information into the neural network's loss function, improving both predictive performance and interpretability. In addition, this study establishes two physical models of the human arm: an arm kinematic chain model and a humanoid seven-axis robotic arm model. Sensor placement is examined from an anatomical perspective, the arm joint angles are derived mathematically, and the position and posture of the arm endpoint are obtained from them. The PGNN method is then applied to improve the accuracy of this endpoint pose, after which inverse kinematics is used to recover the arm joint angles from the refined endpoint, and the differences between this study and other literature are discussed.
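The PGNN idea described in the abstract — augmenting the usual data-fitting loss with a penalty that punishes physically implausible predictions — can be sketched as follows. This is a minimal illustration, not the thesis's actual formulation: the reach-limit constraint, the link lengths, and the weighting factor `lam` are all assumed here for the sake of the example.

```python
import numpy as np

def data_loss(y_pred, y_true):
    """Mean-squared error between predicted and measured endpoint positions."""
    return np.mean((y_pred - y_true) ** 2)

def physics_loss(y_pred, link_lengths):
    """Penalize predicted endpoints whose distance from the shoulder origin
    exceeds the arm's total reach -- a simple physical-consistency check."""
    reach = sum(link_lengths)
    dist = np.linalg.norm(y_pred, axis=-1)
    return np.mean(np.maximum(dist - reach, 0.0) ** 2)

def pgnn_loss(y_pred, y_true, link_lengths, lam=1.0):
    """Composite PGNN objective: data fit plus weighted physics penalty."""
    return data_loss(y_pred, y_true) + lam * physics_loss(y_pred, link_lengths)
```

A prediction inside the reachable workspace incurs only the data term, so the physics penalty steers training without distorting feasible predictions.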
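The endpoint pose of a kinematic chain such as the seven-axis arm model is conventionally obtained by chaining Denavit-Hartenberg transforms, one per joint. The sketch below shows the standard D-H transform and its composition; the parameters used in any call are illustrative placeholders, not the thesis's actual arm parameters.

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Standard Denavit-Hartenberg homogeneous transform for one joint."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(joint_angles, dh_params):
    """Chain per-joint transforms; returns the 4x4 end-effector pose.

    joint_angles: iterable of joint angles theta_i (radians)
    dh_params:    iterable of (d, a, alpha) tuples, one per joint
    """
    T = np.eye(4)
    for theta, (d, a, alpha) in zip(joint_angles, dh_params):
        T = T @ dh_transform(theta, d, a, alpha)
    return T
```

For example, a two-link planar chain with link lengths of 0.3 m and both joints at zero places the endpoint 0.6 m along the x-axis; bending the first joint by 90 degrees moves it to 0.6 m along the y-axis.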