
Graduate Student: Jeng-Yu Chiou (邱鎮宇)
Thesis Title: Toward the Design and Control of Real-Time Image Tracking of a Wheel-Robot (輪型機器人之即時影像追蹤系統控制與應用)
Advisor: Jian-Shiang Chen (陳建祥)
Committee Members:
Degree: Master
Department: College of Engineering, Department of Power Mechanical Engineering
Year of Publication: 2008
Academic Year of Graduation: 96 (2007-2008)
Language: Chinese
Number of Pages: 66
Chinese Keywords: two-wheeled robot, mechanical gripper, colored-ball tracking, image processing
English Keywords: wheeled-robot, mechanical gripper, tracking colored-ball, image process
    Abstract (translated from Chinese):
    The development of intelligent robots has become increasingly mature. Through various types of sensing devices, a robot can assess the state of its surroundings and decide on its next action. Because image sensors offer two major advantages, low cost and a large amount of information acquired in a short time, they have been widely adopted in intelligent robots. The applications of intelligent robots are also very broad, including household service, leisure and entertainment, medical care, and guided-tour services.
    This thesis takes an FPGA real-time image tracking module and a two-wheeled robot as the main platform and, combined with real-time image processing, builds a system for real-time tracking and gripping of a target colored ball. In the experiments, a CMOS camera module captures images of the colored ball; the captured images undergo color classification, color selection, two-dimensional filtering, and binarization, after which the position of the target ball can be determined from the resulting binary image. Based on this position information, the two-wheeled robot is driven to track the target ball.
    Once the target ball has been tracked correctly, a mechanical gripper mounted on the two-wheeled robot grasps the tracked ball and places it in a designated box, completing the task of tracking, gripping, and placing the ball. Finally, balls of four colors are tracked and grasped in sequence and placed into boxes of the corresponding colors.


    Nowadays, the development of intelligent robots has matured. A robot can assess its surroundings and decide on its next action based on the sensors mounted on it. Owing to their low cost and the large amount of information they can acquire in a short time, image sensors have been applied extensively in intelligent robots. The applications of intelligent robots are also wide-ranging, including household service, entertainment, health care, and guided-tour services.
    Building on an FPGA-based real-time image tracking module and a two-wheeled robot, this thesis develops a real-time colored-ball tracking and gripping system. In the experimental setup, a CMOS camera module acquires images of the target colored ball. Through color classification, color selection, two-dimensional filtering, and image binarization, the position of the target ball is recognized from the resulting binary image. Given this position information from the real-time image, the wheeled robot is driven to track the target colored ball.
    After the target ball is tracked accurately, a mechanical gripper mounted on the wheeled robot grasps the ball and places it in the designated box, completing the tracking, grasping, and placing tasks. Finally, tracking and grasping four colored balls in sequence and placing each in the box of the matching color has been demonstrated through experiments.
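    The processing chain described above (color classification and selection, two-dimensional filtering, binarization, locating the ball in the binary image, then steering the robot from that position) can be sketched as follows. This is a minimal illustration, not the thesis's FPGA implementation: the channel-dominance thresholds, the 3x3 majority filter, and the proportional steering gains are all assumptions made for the sketch.

```python
import numpy as np

def ball_mask(rgb, target="red", margin=40):
    """Color classification + selection + binarization: a pixel belongs to
    the target class if its target channel dominates the other two."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    if target == "red":
        mask = (r - np.maximum(g, b)) > margin
    elif target == "green":
        mask = (g - np.maximum(r, b)) > margin
    else:  # "blue"
        mask = (b - np.maximum(r, g)) > margin
    return mask.astype(np.uint8)

def majority_filter(mask):
    """A crude 3x3 two-dimensional filter: keep a pixel only if at least
    5 of the 9 pixels in its neighborhood are set (removes speckle noise)."""
    padded = np.pad(mask, 1)
    h, w = mask.shape
    acc = sum(padded[i:i + h, j:j + w] for i in range(3) for j in range(3))
    return (acc >= 5).astype(np.uint8)

def ball_centroid(mask):
    """Centroid (row, col) of the binary blob, or None if no ball is seen."""
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return None
    return ys.mean(), xs.mean()

def wheel_speeds(centroid_x, image_width, v=0.2, k=0.4):
    """Proportional steering for a differential drive: turn so the ball's
    centroid moves toward the image center; gains v, k are illustrative."""
    err = (centroid_x - image_width / 2) / (image_width / 2)  # in [-1, 1]
    return v + k * err, v - k * err  # (left, right) wheel speeds
```

    In the thesis the pixel-level stages run on the FPGA module and only the resulting ball position is passed to the motor controller; the sketch above merely mirrors that data flow in software.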

    Chapter 1  Introduction
      1-1  Research Background and Motivation
      1-2  Literature Review
      1-3  Scenario Description of the Intended Goal
      1-4  Organization of the Thesis
    Chapter 2  Problem Description
      2-1  Introduction
      2-2  Ball Color Discrimination
        2-2-1  Raw Image Capture
        2-2-2  Image Color Classification
        2-2-3  Image Color Selection
        2-2-4  Two-Dimensional Filtering
        2-2-5  Binarization
      2-3  Image Detection and Tracking
        2-3-1  Point-Target Detection
        2-3-2  Image-Based Distance Estimation
      2-4  System Dynamics and Control of the Two-Wheeled Robot
      2-5  Summary
    Chapter 3  Experimental System Architecture
      3-1  FPGA Real-Time Image Tracking Module
      3-2  FLEX 10K Interface Processing System
      3-3  Drive System Architecture of the Two-Wheeled Robot
      3-4  Robotic Arm System
      3-5  Use of Photosensors
      3-6  Ball-Tracking Procedure and Image Parameter Settings
        3-6-1  Ball-Tracking Procedure
        3-6-2  Ball Image Parameter Settings
      3-7  Experimental System Architecture
      3-8  Summary
    Chapter 4  Experimental Results
      4-1  Motor Control Tests and Controller Design for the Two-Wheeled Robot
      4-2  Ball Image Parameter Tuning Results
      4-3  Ball-Tracking Control Experiments of the Two-Wheeled Robot
      4-4  Ambient Illuminance Measurements with Photosensors
      4-5  Experiments on Tracking, Grasping, and Placing the Ball
      4-6  Experiments on Sequentially Grasping and Placing Four Colored Balls
      4-7  Conclusions
    Chapter 5  Contributions and Suggestions for Future Work
      5-1  Contributions
      5-2  Suggestions for Future Work
    References
    Appendix
      A.  Motor Control Pin Measurements of the Two-Wheeled Robot
      B.  Detailed Specifications of the BLS351 Digital Brushless Servo
      C.  Hamamatsu S1133 Photosensor Data Sheet

    [1] B. K. P. Horn and B. G. Schunck, "Determining Optical Flow," Artificial Intelligence, Vol. 17, pp. 185-203, 1981.
    [2] C. S. Fuh and P. Maragos, "Region-Based Optical Flow Estimation," Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, San Diego, CA, pp. 130-133, 1989.
    [3] D. Kragic and H. I. Christensen, "Tracking Techniques for Visual Servoing Tasks," Proc. IEEE International Conference on Robotics and Automation, pp. 1663-1669, April 2000.
    [4] K. Nickels and S. Hutchinson, "Model-Based Tracking of Complex Articulated Objects," IEEE Transactions on Robotics and Automation, pp. 28-36, February 2001.
    [5] M. Kass, A. Witkin, and D. Terzopoulos, "Snakes: Active Contour Models," International Journal of Computer Vision, pp. 321-331, 1988.
    [6] M.-H. Yang, D. J. Kriegman, and N. Ahuja, "Detecting Faces in Images: A Survey," IEEE Trans. on PAMI, Vol. 24, No. 1, pp. 34-58, January 2002.
    [7] Y. Wu and T. S. Huang, "Color Tracking by Transductive Learning," IEEE Conference on Computer Vision and Pattern Recognition, pp. 133-138, June 2000.
    [8] S. Hutchinson, G. D. Hager, and P. I. Corke, "A Tutorial on Visual Servo Control," IEEE Transactions on Robotics and Automation, pp. 651-670, October 1996.
    [9] Y. Ma, J. Kosecka, and S. Sastry, "Vision Guided Navigation for a Nonholonomic Mobile Robot," IEEE Trans. on Robotics and Automation, pp. 521-536, August 1998.
    [10] J.-H. Jean and T.-P. Wu, "Robust Visual Servo Control of a Mobile Robot for Object Tracking in Shape Parameter Space," 43rd IEEE Conference on Decision and Control, pp. 4016-4021, December 2004.
    [11] P. Liang, Y. L. Chang, and S. Hackwood, "Adaptive Self-Calibration of Vision-Based Robot Systems," IEEE Transactions on Systems, Man and Cybernetics, pp. 811-824, July-August 1989.
    [12] R. P. Gibollet and P. Rives, "Applying Visual Servoing Techniques to Control a Mobile Hand-Eye System," IEEE Conf. on Robotics and Automation, Vol. 1, pp. 166-171, May 1995.
    [13] C. Y. Tsai and K. T. Song, "Visual Tracking Control of a Mobile Robot Using a New Model in Image Plane," Proceedings of the 12th International Conference on Advanced Robotics, pp. 540-545, July 2005.
    [14] E. Goubaru and M. Sugisaka, "Visual Tracking in Real-Time Processing," International Joint Conference, pp. 5296-5299, October 2006.
    [15] 俊源科技 (Juniortek), FPGA Real-Time Image Tracking Control Design Operation Manual, 2006.
    [16] 俊源科技 (Juniortek) company website, http://www.juniortek.com.tw/#場景_1
    [17] ActivMedia Robotics, Pioneer 3 Operations Manual, 2004.
    [18] Altera, FLEX 10K Embedded Programmable Logic Device Family, 2001.
    [19] Texas Instruments, MSP430x15x, MSP430x16x, MSP430x161x Mixed Signal Microcontroller, 2007.
    [20] 廖裕評 and 陸瑞強, CPLD Digital Circuit Design Using MAX+plus II: Introductory Volume, 全華科技圖書 (Chuan Hwa Book Co.), September 2002.
    [21] Hamamatsu, Si Photodiode S1087/S1133 Series Data Sheet, April 2001.
    [22] G. F. Franklin, J. D. Powell, and A. Emami-Naeini, Feedback Control of Dynamic Systems, 4th ed., Prentice-Hall, New Jersey, 2002.

    Full-Text Availability: not authorized for public release (campus or off-campus network)
