
Graduate Student: 黃彥婷 (Huang, Yan-Ting)
Thesis Title: 擴增實境中沿邊作業之輔助性感官回饋 (Assistive Sensory Feedback for Trajectory Tracking in Augmented Reality)
Advisor: 瞿志行 (Chu, Chih-Hsing)
Committee Members: 李昀儒 (Lee, Yun-Ju), 黃瀅瑛 (Huang, Ying-Yin), 王怡然 (Wang, I-Jan)
Degree: Master
Department: College of Engineering, Department of Industrial Engineering and Engineering Management
Year of Publication: 2022
Graduation Academic Year: 110 (2021–2022)
Language: Chinese
Number of Pages: 137
Chinese Keywords: 擴增實境、沿邊作業、績效表現、多感官回饋資訊、介面設計
English Keywords: Augmented Reality, Trajectory Tracking, Task Performance, Sensory Feedback, Interface Design
Augmented reality (AR) technology has flourished in recent years and has been successfully applied in education, medicine, the military, industry, and entertainment to assist related manual operations. However, comprehensive interface design guidelines for creating real-time interactive AR functions that effectively improve task performance are still lacking. Manual trajectory tracking tasks are common in robot programming by demonstration on the manufacturing floor and in assistive tools for surgical operations. Although different forms of sensory feedback have been proposed for such tasks, a complete performance comparison among them is still missing, so the benefit of the interactive interface cannot be optimized. This study conducted in-depth human-factors experiments on manual trajectory tracking along real trajectories in augmented reality, comparing the performance of different sensory feedback modes in assisting the tracking task according to multiple quantitative evaluation metrics. The experimental results show that in tracking tasks of higher difficulty, the feedback should not only indicate error states but also give participants suggestions for corrective actions; in addition, compared with visual and auditory feedback, tactile feedback showed the best fit for the task. Participants tended to rely on real-time feedback to complete the more difficult tasks, suggesting that confirmatory feedback designs that appropriately and proactively intervene in task behavior should be adopted. Finally, it was found that in manual trajectory tracking, judging whether the motion stays within the acceptable trajectory range is affected by the participant's working posture, which in turn changes task performance; it is therefore recommended that posture constraints be considered when designing assistive functions. The experimental results and findings of this study can serve as a reference for the design of assistive interfaces for manual operations in augmented reality.


Augmented Reality (AR) technology has made significant progress in recent years, with applications successfully deployed to assist manual operations in domains such as manufacturing, the military, education, and entertainment. However, there is still a lack of design guidelines to help engineers develop AR user interfaces that effectively enhance the user's situation awareness through instant sensory feedback. Manual trajectory tracking frequently occurs in robot programming by demonstration in manufacturing and in major surgical operations, and AR functions and interfaces have been developed to assist such tasks. These functions may not be optimally designed, however, without in-depth comparisons of the effectiveness of the different types of sensory feedback employed in the interfaces. This work therefore conducts an ergonomic assessment of AR user interfaces with different sensory feedback designs intended to assist manual trajectory tracking, with the focus on thoroughly comparing the feedback designs using quantitative measures. According to the experimental results, the feedback information displayed in the AR interface should not only signify the occurrence of tracking errors but also continuously suggest corrective actions to the user. Moreover, tactile feedback outperforms visual and auditory feedback, as users tend to rely on the haptic feedback force to complete the task, especially in tracking tasks of high complexity. Lastly, the posture with which the user holds the tracking pen influences task performance when the tracking accuracy falls within an acceptable range, so AR-assisted functions need to take this factor into account in optimal interface design. The findings of this work provide useful insights into sensory feedback design for AR-assisted manual trajectory tracking.

Table of Contents
Abstract (Chinese)
Abstract (English)
Chapter 1: Introduction
  1.1 Research Background
  1.2 Research Objectives
Chapter 2: Literature Review
  2.1 Multisensory Feedback
  2.2 Multisensory Feedback and Motor Control
  2.3 Applications of Sensory Feedback in Target-Tracking Tasks
    2.3.1 Positioning Tasks
    2.3.2 Trajectory Tracking Tasks
  2.4 Summary
Chapter 3: Research Method
  3.1 Experiment Content
  3.2 Experimental Design
    3.2.1 Participants
    3.2.2 Experimental Site Layout
    3.2.3 Calibration Procedure
    3.2.4 Experimental Procedure
    3.2.5 Evaluation Metrics
  3.3 Experimental Limitations
Chapter 4: Development of Experimental Functions
  4.1 Robot Arm Software Implementation
  4.2 HoloLens 2 Software Implementation
  4.3 Experiment Control Software Implementation
Chapter 5: Results and Discussion
  5.1 Data Preprocessing and Analysis
  5.2 Differences in Gaze Frequency between Central and Peripheral Vision
  5.3 Effect of Trajectory Complexity on Tracking Performance
  5.4 Effect of Sensory Feedback on Tracking Performance
  5.5 Effect of Tracking Speed on Accuracy
  5.6 Effect of Trajectory Feature Regions on Tracking Accuracy
  5.7 Effect of Left- versus Right-Side Trajectories on Performance in Simple-Trajectory Tracking
  5.8 Summary of Subjective Interview Results
  5.9 Discussion
    5.9.1 Summary and Analysis of Hypotheses
    5.9.2 Summary of Experimental Results
Chapter 6: Conclusions and Future Work
References
Appendices
  Appendix 1: Overall Tracking Performance in the Accuracy Phase
  Appendix 2: Overall Tracking Performance in the Efficiency Phase
  Appendix 3: Descriptive Statistics of Overall Tracking Performance
  Appendix 4: Gaze Frequency in the Accuracy Phase
  Appendix 5: Gaze Frequency in the Efficiency Phase
  Appendix 6: Simple-Trajectory Tracking Performance in the Accuracy Phase
  Appendix 7: Simple-Trajectory Tracking Performance in the Efficiency Phase
  Appendix 8: Descriptive Statistics of Tracking Performance by Region on the Simple Trajectory
  Appendix 9: Complex-Trajectory Tracking Performance in the Accuracy Phase
  Appendix 10: Complex-Trajectory Tracking Performance in the Efficiency Phase
  Appendix 11: Tracking Performance by Region on the Complex Trajectory
  Appendix 12: Simple-Trajectory Tracking Performance in the Accuracy Phase
  Appendix 13: Simple-Trajectory Tracking Performance in the Efficiency Phase
  Appendix 14: Descriptive Statistics of Tracking Performance on Left- and Right-Side Simple Trajectories
  Appendix 15: Statistical Results for H1
  Appendix 16: Statistical Results for H2
  Appendix 17: Statistical Results for H3
  Appendix 18: Statistical Results for H4
  Appendix 19: Statistical Results for H5
  Appendix 20: Statistical Results for H6
  Appendix 21: Statistical Results for H7
  Appendix 22: Statistical Results for H8
  Appendix 23: Statistical Results for H9
  Appendix 24: Statistical Results for H10

