Graduate Student: 黃彥婷 Huang, Yan-Ting
Thesis Title: 擴增實境中沿邊作業之輔助性感官回饋 (Assistive Sensory Feedback for Trajectory Tracking in Augmented Reality)
Advisor: 瞿志行 Chu, Chih-Hsing
Committee Members: 李昀儒 Lee, Yun-Ju; 黃瀅瑛 Huang, Ying-Yin; 王怡然 Wang, I-Jan
Degree: Master (碩士)
Department: College of Engineering, Department of Industrial Engineering and Engineering Management
Year of Publication: 2022
Graduation Academic Year: 110
Language: Chinese
Number of Pages: 137
Chinese Keywords: 擴增實境、沿邊作業、績效表現、多感官回饋資訊、介面設計
English Keywords: Augmented Reality, Trajectory Tracking, Task Performance, Sensory Feedback, Interface Design
Augmented Reality (AR) technology has developed rapidly in recent years and has been successfully applied in fields such as education, medicine, the military, industry, and entertainment to assist related manual operations. However, comprehensive interface design guidelines are still lacking for creating real-time interactive AR functions that effectively improve task performance. Manual trajectory tracking tasks are common in robot programming by demonstration on the manufacturing floor and in assistive tools for surgical operations. Various forms of sensory feedback have been proposed for such tasks, but a complete performance comparison among them is still missing, so the benefit of the interactive interface cannot be optimized. This study conducts an in-depth ergonomic evaluation experiment on manual trajectory tracking along real trajectories in augmented reality, comparing the performance of different sensory feedback modalities in assisting the tracking task according to multiple quantitative measures. The experimental results show that, in tracking tasks of higher difficulty, the feedback information should not only indicate error states but also suggest corrective actions to the participant; moreover, tactile feedback shows the best fit compared with visual and auditory feedback. Participants tended to rely on real-time feedback to complete the more difficult tasks, suggesting that a confirmatory feedback design that appropriately and proactively intervenes in the task behavior should be adopted. Finally, in manual trajectory tracking, judging whether the motion stays within the acceptable trajectory range is affected by the participant's working posture, which in turn changes task performance; it is therefore recommended that posture constraints be considered when designing assistive functions. The experimental results and findings of this study provide a reference for the design of assistive interfaces for manual operations in augmented reality.
Augmented Reality (AR) technology has made significant progress in recent years, with AR applications successfully assisting manual operations in various domains such as manufacturing, the military, education, and entertainment. However, there is still a lack of design guidelines for engineers to develop AR user interfaces that can effectively enhance users' situation awareness through instant sensory feedback. Manual trajectory tracking frequently occurs in robot programming by demonstration in manufacturing and in major surgical operations. AR functions and interfaces have been developed to assist manual trajectory tracking tasks, but these functions may not be optimally designed without in-depth comparisons of the effectiveness of the different sensory feedback modalities employed in the interfaces. Therefore, this work conducts an ergonomic assessment of AR user interfaces with different sensory feedback designs intended to assist manual trajectory tracking. The focus is to thoroughly compare the different sensory feedback designs using quantitative measures. According to the experimental results, the feedback information displayed in the AR interface should not only signify the occurrence of tracking errors but also continuously suggest corrective actions to the user. Moreover, tactile feedback outperforms visual and auditory feedback, as the user tends to rely on the haptic feedback force to complete the task, especially in tracking tasks of high complexity. Lastly, the user's posture in holding the tracking pen influences task performance when judging whether the tracking accuracy falls within an acceptable range; AR-assisted functions thus need to consider this factor for optimal interface design. The findings of this work provide useful insights into sensory feedback designs for AR-assisted functions developed for manual trajectory tracking.