
Graduate student: Liu, Shih-Wei
Thesis title: Design and Development of an Eye-in-hand Adaptive Gripper for Object Recognition and Tracking
Advisor: Chang, Jen-Yuan
Committee members: Sung, Cheng-Kuo; Tsao, Che-Chih
Degree: Master
Department: College of Engineering, Department of Power Mechanical Engineering
Year of publication: 2019
Graduating academic year: 107 (ROC calendar)
Language: Chinese
Number of pages: 87
Chinese keywords: proximity sensor, adaptive gripper, object recognition and tracking
English keywords: Eye-in-hand, Adaptive Gripper, Image-based Tracking Control
    With the development of the automation industry, a new industrial model has emerged, and traditional human labor is gradually being replaced by machines. The World Economic Forum (WEF) noted in "The Future of Jobs Report 2018" that the world is undergoing a "workplace revolution," meaning machines will play an increasingly important role in the future. In response, this thesis narrows its scope to conveyor-based production lines and aims to develop a key system for production-line upgrades that lets an industrial robot grasp and classify objects intelligently and autonomously. To achieve this core goal of a smart gripper system, the proposed eye-in-hand adaptive gripper integrates the gripper and a camera sensor in a single module, so that instead of grasping blindly, the gripper can directly recognize an object's shape, orientation, and position, and then carry out the grasp through forward and inverse kinematics.
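The abstract states that grasping is carried out through forward and inverse kinematics. As a self-contained illustration of that idea only, here is a planar two-link arm sketch; this is a textbook model, not the robot or the implementation used in the thesis, and the function names are hypothetical:

```python
import math

def two_link_ik(x, y, l1, l2):
    """Inverse kinematics of a planar 2-link arm: return joint
    angles (q1, q2) that place the end effector at (x, y)."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    q2 = math.acos(c2)  # elbow angle (one of the two solutions)
    q1 = math.atan2(y, x) - math.atan2(l2 * math.sin(q2),
                                       l1 + l2 * math.cos(q2))
    return q1, q2

def two_link_fk(q1, q2, l1, l2):
    """Forward kinematics: joint angles -> end-effector position."""
    x = l1 * math.cos(q1) + l2 * math.cos(q1 + q2)
    y = l1 * math.sin(q1) + l2 * math.sin(q1 + q2)
    return x, y
```

Round-tripping a reachable target through `two_link_ik` and `two_link_fk` recovers the target exactly, which is a convenient sanity check for any IK routine.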
    This study proposes and validates methods for object recognition and tracking on a conveyor using the eye-in-hand adaptive gripper system, targeting production lines that require automatic object classification. Because the camera frame coincides with the gripper frame, the proposed eye-in-hand configuration offers the following advantages: (1) occlusion avoidance, (2) intuitive operation, (3) images from different viewing angles, and (4) simpler camera calibration. The proposed method also uses the eye-in-hand configuration to resolve the problem of the object leaving the field of view when the camera is too close to it. In addition, this study develops the tracking algorithm required by the eye-in-hand adaptive gripper. Results show that, after a settling period, the overall system accurately estimates and predicts the constant speed of the conveyor and keeps the tracking error between the robot and the object within tolerance.
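The abstract says the camera recognizes an object's shape, orientation, and position. A common way to extract a blob's position and in-plane orientation from a binarized image is via image moments; the sketch below is a generic illustration of that technique under the assumption of a single blob, not the thesis's exact pipeline (the name `blob_pose` is hypothetical):

```python
import numpy as np

def blob_pose(binary):
    """Centroid and principal-axis angle (radians) of a binary blob,
    computed from first- and second-order image moments."""
    ys, xs = np.nonzero(binary)            # pixel coordinates of the blob
    if xs.size == 0:
        raise ValueError("empty image")
    cx, cy = xs.mean(), ys.mean()          # centroid (first-order moments)
    mu20 = ((xs - cx) ** 2).mean()         # central second-order moments
    mu02 = ((ys - cy) ** 2).mean()
    mu11 = ((xs - cx) * (ys - cy)).mean()
    angle = 0.5 * np.arctan2(2.0 * mu11, mu20 - mu02)
    return (cx, cy), angle
```

Note that image coordinates put the y-axis pointing down, so the returned angle follows that convention.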


    With the development of the automation industry, a new industrial model has been born, and traditional human labor has gradually been replaced by machines. The World Economic Forum (WEF) pointed out in "The Future of Jobs Report 2018" that the world is experiencing a "workplace revolution," which means that machines will play a more important role in the future. In response to this situation, this research proposes and validates methods and systems for object recognition and tracking on a conveyor using an eye-in-hand gripper, which are useful in production lines for automatic object classification. The eye-in-hand configuration is well suited to combined camera-gripper applications because the camera coordinate frame coincides with the gripper coordinate frame. Its main advantages, as addressed in this research, are as follows: (1) occlusion avoidance, (2) intuitive teleoperation, (3) images from different angles, and (4) simple calibration. The proposed method also resolves, through the eye-in-hand configuration, the out-of-view problem that occurs when the camera is too close to the object. In this research, the eye-in-hand robotic gripper is built and integrated with a tracking system to follow the target object. Results show that the speed of the conveyor can be calculated and predicted by the system, and that the tracking error between the robot and the object falls within tolerance after a period of time.
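The claim that the conveyor's speed can be "calculated and predicted" can be illustrated by fitting a constant-velocity model to timestamped position observations and extrapolating it. This is a generic least-squares sketch of the idea, with hypothetical function names, not the estimator actually used in the thesis:

```python
import numpy as np

def estimate_conveyor_motion(times, positions):
    """Fit the constant-velocity model x(t) = x0 + v*t to timestamped
    object positions and return (x0, v)."""
    t = np.asarray(times, dtype=float)
    x = np.asarray(positions, dtype=float)
    A = np.stack([np.ones_like(t), t], axis=1)   # design matrix [1, t]
    (x0, v), *_ = np.linalg.lstsq(A, x, rcond=None)
    return x0, v

def predict_position(x0, v, t):
    """Extrapolate the fitted model to a future time t."""
    return x0 + v * t
```

With such a prediction the robot can aim ahead of the object rather than chasing its last observed position, which is what drives the tracking error down over time.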

    Chinese Abstract II
    Abstract III
    Acknowledgements IV
    Table of Contents V
    List of Figures VII
    List of Tables X
    Chapter 1 Introduction 1
      1.1 Literature review 1
        1.1.1 Applications of eye-on-hand computer vision systems 3
        1.1.2 Advantages and disadvantages of various grippers 10
      1.2 Research methods 12
        1.2.1 Mechanism design of the custom gripper 12
        1.2.2 Camera calibration and coordinate transformation 13
        1.2.3 Object recognition and tracking 13
    Chapter 2 Theoretical Background 14
      2.1 Camera calibration 14
        2.1.1 Pinhole model and projection matrix 15
        2.1.2 Parameter computation 20
      2.2 Image processing 26
        2.2.1 Grayscale binarization 27
        2.2.2 HSV color space 29
        2.2.3 Edge detection 31
        2.2.4 Motion detection 37
    Chapter 3 Eye-in-hand Adaptive Gripper Integration 39
      3.1 System hardware setup 39
      3.2 System workflow 45
      3.3 Object coordinate recognition 46
      3.4 Positioning and alignment 54
      3.5 Computing the target object's height 62
      3.6 Velocity control 63
    Chapter 4 Results 68
      4.1 Shape and orientation of the target object 68
      4.2 Real-time measurement of target height 71
      4.3 Tracking experiments 73
    Chapter 5 Conclusions 78
    References 84
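Among the image-processing steps listed for Chapter 2 is grayscale binarization. The standard way to choose the binarization threshold automatically is Otsu's method, which maximizes the between-class variance of the gray-level histogram; the sketch below is a minimal illustration assuming 8-bit input, not necessarily the thesis's implementation:

```python
import numpy as np

def otsu_threshold(gray):
    """Otsu's method: pick the gray level that maximizes the
    between-class variance of the histogram (8-bit input)."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()                  # gray-level probabilities
    omega = np.cumsum(p)                   # P(class 0) per threshold
    mu = np.cumsum(p * np.arange(256))     # cumulative class-0 mean mass
    mu_total = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_total * omega - mu) ** 2 / (omega * (1.0 - omega))
    return int(np.argmax(np.nan_to_num(sigma_b)))
```

For a cleanly bimodal image the returned threshold separates the two modes, so `gray > t` yields the foreground mask.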

