Graduate Student: 劉宇倫 Liu, Yu-Lun
Thesis Title: 擴增實境中人機協作之輔助性互動介面研究 (An Experimental Study on Interactive Interface of Human Robot Collaboration in Augmented Reality)
Advisor: 瞿志行 Chu, Chih-Hsing
Committee Members: 黃瀅瑛 Huang, Ying-Yin; 王怡然 Wang, I-Jan
Degree: Master (碩士)
Department: Department of Industrial Engineering and Engineering Management, College of Engineering
Year of Publication: 2022
Academic Year of Graduation: 110
Language: Chinese
Number of Pages: 72
Keywords (Chinese): 人機協作, 擴增實境, 視覺回饋, 觸覺回饋, 人因評估, 組裝作業
Keywords (English): Human robot collaboration, Augmented reality, Visual feedback, Haptic feedback, Ergonomic assessment, Manual assembly
Abstract (translated from Chinese): Global manufacturing has gradually shifted from mass production to customized, small-batch, and highly flexible production. Artificial intelligence is already widely deployed on the shop floor, yet many manual operations remain whose automation is too difficult or too costly. A more feasible approach is to combine human decisions and actions with artificial intelligence through appropriate assistive functions, realizing so-called human-robot collaboration. HRC technology has gradually matured; by combining the high efficiency of robots with the human ability to react to uncertain environments, it can effectively carry out highly complex tasks. When collaborating with a machine, ensuring personnel safety and avoiding mutual interference or even collision becomes a key issue. This study uses augmented reality technology to develop an interactive interface between humans and robots, targeting sorting and assembly scenarios to improve the safety of, and trust in, human-robot collaboration. Experimental results show that real-time information feedback improves work efficiency: with the visual assistance provided in augmented reality, subjects learned the robot arm's motion path in advance, and once familiar with the tactile assistance, they could still gather enough information to judge the arm's motion path even while visually focused on the task. Analyses of the subjects' eye movements and of their proximity to the robot arm likewise showed the benefit of the feedback. This study derives design guidelines for interactive interfaces across different sensory channels that effectively improve the efficiency of human-robot collaboration, and it demonstrates an innovative application of augmented reality technology.
Abstract (English): The modern manufacturing industry has transformed from mass production to customized, small-batch, and highly flexible production. Artificial intelligence (AI) has been applied to a wide spectrum of manufacturing activities, but manual operations remain indispensable on the shop floor because automating them is too complex and/or too costly. A more practical approach is to combine AI with human decisions and actions via assistive functions, thus realizing the idea of human-robot collaboration (HRC). With recent progress, HRC technologies can integrate the high efficiency of machines with the human capability for exception handling under uncertainty to perform highly complex tasks. One critical issue in HRC is to ensure the operator's safety by avoiding interference and collision with the robot during collaboration. This work develops an interface between human and robot using augmented reality (AR) technology, focusing on improving safety and trustworthiness in HRC. Experimental results show that instant feedback can improve the work performance of an operator collaborating with a robot. Assisted by a visual interface in AR, subjects recognized the robot's moving trajectory in advance. Moreover, they could stay visually focused on the task at hand while receiving sufficient information about the robot's movement via tactile feedback. Analyses of the subjects' eye-tracking behavior and of their proximity to the moving robot show similar benefits of the feedback. This work derives important insights into the design of sensory feedback for assisting human-robot collaboration, and it opens up new AR applications in smart manufacturing.
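The proximity-driven tactile feedback the abstract describes can be illustrated with a small sketch. The Python snippet below is a hypothetical example, not the thesis implementation: it assumes the robot arm's planned path is available as a list of 3-D waypoints, and the names SAFE_DISTANCE, min_distance_to_path, and haptic_intensity are all invented for illustration.

```python
import math

# Hypothetical sketch (not the thesis implementation): trigger haptic
# feedback when the operator's hand approaches the robot's planned path.
# Positions are (x, y, z) tuples in metres; all names are assumptions.

SAFE_DISTANCE = 0.40  # assumed warning radius around the trajectory (m)

def distance(p, q):
    """Euclidean distance between two 3-D points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def min_distance_to_path(hand, waypoints):
    """Smallest distance from the hand to any waypoint on the planned path."""
    return min(distance(hand, w) for w in waypoints)

def haptic_intensity(hand, waypoints):
    """Map proximity to a 0-1 vibration intensity: zero outside the safe
    radius, rising linearly to one as the hand reaches the path itself."""
    d = min_distance_to_path(hand, waypoints)
    if d >= SAFE_DISTANCE:
        return 0.0
    return 1.0 - d / SAFE_DISTANCE

if __name__ == "__main__":
    planned_path = [(0.3, 0.0, 0.20), (0.3, 0.1, 0.25), (0.3, 0.2, 0.30)]
    hand_position = (0.35, 0.05, 0.22)
    print(f"vibration intensity: {haptic_intensity(hand_position, planned_path):.2f}")
```

Scaling the intensity with proximity, rather than using a single on/off alarm, is one plausible way to realize the behavior the abstract reports: subjects could judge the arm's motion path from touch alone while keeping their eyes on the task.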