| Field | Value |
|---|---|
| Graduate Student | 劉宇望 Liu, Yu-Wang |
| Thesis Title | 擴增實境中示範學習之運動規劃人機界面 (A Human-Machine Interface of Motion Planning in Augmented Reality based Programming for Demonstration) |
| Advisor | 瞿志行 Chu, Chih-Hsing |
| Committee Members | 陸元平 Luh, Yuan-Ping; 郭嘉真 Kuo, Chia-Chen |
| Degree | Master |
| Department | Department of Industrial Engineering and Engineering Management, College of Engineering |
| Year of Publication | 2018 |
| Graduation Academic Year | 106 (ROC calendar) |
| Language | Chinese |
| Pages | 92 |
| Keywords | Augmented Reality, Programming by Demonstration, Image Processing, Motion Planning, 5-Axis Dispenser, Error Analysis |
Abstract

Augmented Reality (AR) is an interface technology that integrates virtual information into images of the real world. It enables richer human-machine interaction and is well suited to supporting human-machine collaboration. Programming by Demonstration (PbD) is a common motion-planning approach for automation equipment: an operator uses a teach pendant to guide the machine through a task in the real scene, and the recorded process is output as motion commands. Previous work verified the feasibility of an AR-based human-machine interface for the motion planning of a 3-axis glue dispenser; however, the coordinate transformation, camera calibration, and object tracking steps were not optimized and still produced considerable errors. To address these problems, this study improves the depth data captured by a depth camera and evaluates different algorithms to increase the accuracy of object tracking. Planes and 3D lines are automatically extracted from the point cloud, so that the user can directly select the contours of a real workpiece and quickly plan dispensing motions for a 5-axis glue dispenser. In addition, a quantitative analysis of the system's error sources identifies the significant factors, providing directions for future improvement. Finally, verification tests on the 5-axis dispenser demonstrate the effectiveness of the improved human-machine interaction functions, and comparison experiments against a conventional teach pendant show the practical value of the proposed concept.
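The abstract only names the plane- and line-extraction step; as a hedged illustration of how such extraction is commonly done, the sketch below fits a dominant plane to a depth-camera point cloud with RANSAC followed by an SVD refinement. It is a minimal example, not the thesis's actual implementation: it assumes the cloud arrives as an N×3 NumPy array in metres, and the function name, iteration count, and distance threshold are illustrative choices.

```python
import numpy as np


def ransac_plane(points, n_iters=500, dist_thresh=0.005, rng=None):
    """Fit a dominant plane to an N x 3 point cloud with RANSAC.

    Returns (normal, d, inlier_mask) for the plane normal . x + d = 0.
    The iteration count and distance threshold (metres) are illustrative
    values, not parameters taken from the thesis.
    """
    rng = np.random.default_rng() if rng is None else rng
    best_inliers, best_count = None, -1
    n = len(points)
    for _ in range(n_iters):
        # Sample three distinct points and form a candidate plane.
        p0, p1, p2 = points[rng.choice(n, size=3, replace=False)]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-9:                      # degenerate (collinear) sample
            continue
        normal /= norm
        d = -normal.dot(p0)
        # Count points within dist_thresh of the candidate plane.
        inliers = np.abs(points @ normal + d) < dist_thresh
        count = int(inliers.sum())
        if count > best_count:
            best_count, best_inliers = count, inliers
    # Refine the winning plane with a least-squares (SVD) fit on its inliers.
    inlier_pts = points[best_inliers]
    centroid = inlier_pts.mean(axis=0)
    _, _, vt = np.linalg.svd(inlier_pts - centroid)
    normal = vt[-1]
    return normal, -normal.dot(centroid), best_inliers


if __name__ == "__main__":
    # Synthetic check: a noisy z = 0 plane plus scattered outliers.
    rng = np.random.default_rng(0)
    plane = np.column_stack([rng.uniform(-1, 1, (2000, 2)),
                             rng.normal(0.0, 0.002, 2000)])
    outliers = rng.uniform(-1, 1, (200, 3))
    normal, d, mask = ransac_plane(np.vstack([plane, outliers]), rng=rng)
    print("plane normal ~", np.round(normal, 3), "| inliers:", int(mask.sum()))
```

Extracting 3D line segments could follow the same sample-and-score pattern with two-point samples, or operate on the 2D image (edge detection plus a Hough transform) before back-projecting the detected lines into the point cloud; either variant would support the direct selection of workpiece contours described in the abstract.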