| Field | Value |
|---|---|
| Graduate student | 呂祐任 (Lu, Yu-Jen) |
| Thesis title | 基於擴增實境之人機協作任務分配與排程 (Task Allocation and Scheduling for Human-Robot Collaboration in Augmented Reality) |
| Advisor | 瞿志行 (Chu, Chih-Hsing) |
| Committee members | 李昀儒 (Lee, Yun-Ju); 丁慶榮 (Ting, Ching-Jung); 田凱文 (Tien, Kai-Wen) |
| Degree | Master |
| Department | College of Engineering — Department of Industrial Engineering and Engineering Management |
| Year of publication | 2024 |
| Academic year of graduation | 112 |
| Language | Chinese |
| Number of pages | 76 |
| Keywords (Chinese) | 人機協作、任務分配、排程、動素分析、擴增實境 |
| Keywords (English) | Human-Robot Collaboration, Task Allocation, Scheduling, Motion Analysis, Augmented Reality |
Abstract (translated from the Chinese): With the recent emergence of the Industry 5.0 concept, human-robot collaboration has become an important research topic. By combining the superior execution capabilities of robots with human decision-making in response to environmental changes, operations can be planned and designed in a complementary manner to effectively complete highly complex manufacturing processes. However, owing to safety regulations, the division of labor between humans and robots has not yet been clearly defined. This study therefore proposes a comprehensive operational planning model to effectively realize human-robot collaboration based on augmented reality. The model distinguishes between the left and right hands: it first performs a Therblig (motion-element) analysis of the work process, recombines the motion elements into single-handed tasks, allocates the tasks from the perspectives of time and ergonomics, and determines an optimized schedule based on the execution time of each task. In addition, an augmented reality head-mounted device and a collaborative robot are deployed to provide real-time assistive cues, improving the operator's understanding of the robot arm's motion. An experimental environment was built according to the model's output. Quantitative performance analysis shows that the workflow planned by the optimization model improves process completion time compared with a traditional approach that does not distinguish between the left and right hands, and demonstrates the feasibility of augmented reality-assisted human-robot collaboration.
Abstract (English): Human-robot collaboration (HRC) has emerged as a crucial research area in realizing the vision of Industry 5.0. Complex manufacturing operations can be efficiently and effectively completed by combining the physical capabilities of robots with human adaptability and decision-making in dynamic environments. However, task allocation in manufacturing systems involving HRC still faces challenges due to strict safety regulations and the variability of human behavior. To address this issue, this study introduces a comprehensive operational planning model that enables effective HRC supported by head-mounted augmented reality (AR). The model distinguishes between tasks performed by the left and right hands based on Therblig analysis and subsequently reorganizes them into single-handed operations. Tasks are allocated from the perspectives of efficiency and ergonomics and are optimally scheduled by considering the completion time of each possible task. In addition, AR-assisted functions are developed to enhance human understanding of real-time robot movements through visual cues. Test results from an HRC scenario validate the advantages of the proposed model compared to a traditional approach that does not differentiate between left- and right-hand tasks. The scenario also demonstrates the practicality of head-mounted AR in supporting HRC.
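The core idea of allocating single-handed tasks by execution time can be illustrated with a toy sketch. The task names, durations, and the greedy longest-processing-time heuristic below are all hypothetical illustrations, not the thesis's actual model, which combines Therblig analysis with exact optimization and ergonomic criteria:

```python
# Illustrative sketch only: greedily assign single-handed tasks to the
# resource (left hand, right hand, or robot arm) whose finish time stays
# smallest, approximating makespan minimization. Durations are invented.

# Each task maps to a duration in seconds per capable resource; None = infeasible.
tasks = {
    "pick_screw":   {"left": 2.0, "right": 1.5, "robot": 1.0},
    "hold_part":    {"left": 3.0, "right": 3.0, "robot": None},
    "tighten_bolt": {"left": None, "right": 4.0, "robot": 2.5},
    "inspect":      {"left": 2.5, "right": 2.5, "robot": None},
}

def greedy_allocate(tasks):
    """Assign each task to the feasible resource with the earliest finish time."""
    load = {"left": 0.0, "right": 0.0, "robot": 0.0}
    plan = {}
    # Schedule longer tasks first: a standard heuristic for balancing loads.
    order = sorted(tasks, key=lambda t: -max(d for d in tasks[t].values() if d))
    for t in order:
        feasible = {r: d for r, d in tasks[t].items() if d is not None}
        best = min(feasible, key=lambda r: load[r] + feasible[r])
        plan[t] = best
        load[best] += feasible[best]
    return plan, max(load.values())

plan, makespan = greedy_allocate(tasks)
print(plan, makespan)
```

In a real HRC setting, the durations would come from predetermined time standards for each Therblig-derived task, and an exact solver or metaheuristic would replace the greedy rule.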