
Author: Hsieh, Tung-Chin (謝東錦)
Thesis Title: SLAM in Indoor Environment Using SR-3000 Range Imager (應用SR-3000三維影像於室內場景之機器人同步定位)
Advisor: Chen, Yung-Chang (陳永昌)
Degree: Master
Department: Department of Electrical Engineering, College of Electrical Engineering and Computer Science
Year of Publication: 2009
Graduation Academic Year: 99 (ROC calendar)
Language: English
Number of Pages: 62
Chinese Keywords: 同步定位 (simultaneous localization)
Foreign-Language Keywords: SLAM
    Real-time self-localization and map building is an important topic in current robotics research. An autonomous mobile robot must be capable of localizing itself. Self-localization mainly relies on sensors to find salient landmarks in the environment, which serve as reference information for correcting the position estimate. Traditional localization estimates the robot position with an odometer, but the odometer accumulates a large error relative to the true position, so SLAM research often applies the Extended Kalman Filter to correct the localization error.
    In this thesis, we propose a real-time self-localization system for indoor environments that extracts feature points and depth information from the Range Camera SR-3000, which provides 3D data, and uses them as the system's landmarks. Because the depth information in the image can be obtained without calibrating camera parameters, the computational burden on the system is reduced. The system comprises a wheeled robot platform, odometry data, Harris corner feature extraction, 3D landmark position reconstruction, and an Extended Kalman Filter that corrects the odometry error.
    Our method achieves real-time robot localization with a single sensor and is unaffected by lighting and shadows in the environment. The robot completes real-time localization while moving at 0.2 m/s, and the system error and computation time remain within acceptable bounds.


    Simultaneous localization and mapping (SLAM) has become an increasingly important topic in robotics research. The ability of an autonomous mobile robot to simultaneously localize itself and navigate in an unknown indoor environment is a prerequisite. The simplest localization method uses only the odometer to estimate the robot's position and pose, but the accumulated error grows with the running time of the system. The Extended Kalman Filter (EKF) is often applied to correct the system error in the SLAM problem.
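    To make the predict/correct idea concrete, the following is a minimal sketch of one EKF cycle for a planar robot pose. The motion model f, observation model h, their Jacobians, and the noise covariances Q and R are placeholders standing in for the models the thesis derives in Chapter 4, not the thesis's exact equations.

```python
import numpy as np

def ekf_step(mu, P, u, z, f, F_jac, h, H_jac, Q, R):
    """One EKF predict/update cycle.

    mu, P        : state mean and covariance (robot pose x, y, theta, plus landmarks)
    u, z         : odometry control input and landmark observation
    f, h         : nonlinear motion and observation models
    F_jac, H_jac : their Jacobians; Q, R : process/measurement noise covariances
    """
    # Predict: propagate the state with the odometry-driven motion model.
    mu_pred = f(mu, u)
    F = F_jac(mu, u)
    P_pred = F @ P @ F.T + Q

    # Update: correct the prediction with the landmark observation.
    H = H_jac(mu_pred)
    S = H @ P_pred @ H.T + R             # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    mu_new = mu_pred + K @ (z - h(mu_pred))
    P_new = (np.eye(len(mu)) - K @ H) @ P_pred
    return mu_new, P_new
```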
    In this thesis, we propose a system which extracts features from SR-3000 3D image data as landmarks and combines them with the depth information to obtain the landmark positions. The system is based on the EKF. It comprises a wheeled robot, UBOT, which serves as our experimental platform, odometry data, Harris corner detection, 3D landmark position reconstruction, and the Extended Kalman Filter.
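    A sketch of the landmark-extraction step under stated assumptions: Harris corners are detected in the SR-3000 intensity image, and each corner's 3D position is read from the co-registered per-pixel range data, which is why no camera calibration is required. The array names intensity and xyz, and the response threshold, are illustrative assumptions rather than the thesis's actual interfaces.

```python
import cv2
import numpy as np

def extract_landmarks(intensity, xyz, thresh_ratio=0.01):
    """Detect Harris corners in the intensity image and look up their
    3D positions in the co-registered SR-3000 range data.

    intensity : HxW intensity image from the SR-3000
    xyz       : HxWx3 array of per-pixel 3D points (camera frame)
    """
    # Harris corner response (blockSize=2, Sobel ksize=3, k=0.04).
    response = cv2.cornerHarris(np.float32(intensity), 2, 3, 0.04)
    # Keep pixels whose response exceeds a fraction of the maximum.
    ys, xs = np.where(response > thresh_ratio * response.max())
    # Each surviving corner becomes a landmark whose 3D position comes
    # straight from the range data -- no intrinsic calibration needed.
    return [xyz[y, x] for y, x in zip(ys, xs)]
```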
    Our system uses a single sensor to implement EKF-based SLAM in real time. It works without any camera calibration for computing the depth information, which alleviates the computational effort. The robot moves at a speed of 0.2 m/s while simultaneously localizing itself. The estimated error and computation time of this system are acceptable.
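    One step the abstracts leave implicit is data association (Section 4.4 in the table of contents): deciding whether an observed corner matches an already-mapped landmark or starts a new one. A common choice for EKF-SLAM, and one plausible reading of that step, is a chi-square gate on the Mahalanobis distance of the innovation; the sketch below assumes that approach rather than reproducing the thesis's exact criterion.

```python
import numpy as np

def associate(z, predictions, gate=9.21):
    """Match observation z to the landmark whose predicted measurement is
    nearest in Mahalanobis distance, or return None for a new landmark.

    predictions : list of (z_hat, S) pairs, the predicted measurement and
                  innovation covariance of each mapped landmark
    gate        : chi-square threshold (9.21 is roughly 99% for 2 DOF)
    """
    best, best_d2 = None, gate
    for i, (z_hat, S) in enumerate(predictions):
        v = z - z_hat                    # innovation
        d2 = v @ np.linalg.inv(S) @ v    # squared Mahalanobis distance
        if d2 < best_d2:
            best, best_d2 = i, d2
    return best
```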

    Table of Contents

    Abstract
    Table of Contents
    List of Figures
    List of Tables
    Chapter 1: Introduction
        1.1 Overview of Robotics Technology
        1.2 Motivation
        1.3 Thesis Organization
    Chapter 2: Related Works
        2.1 Overview of Simultaneous Localization and Mapping
        2.2 Probabilistic Models in SLAM
            2.2.1 Posterior Estimation
            2.2.2 The Extended Kalman Filter
        2.3 Multi-Modal Sensor Based SLAM
            2.3.1 Sonar and Laser Range Finder Based SLAM
            2.3.2 Vision-Based SLAM
    Chapter 3: System Architecture and Analysis of SR-3000 Range Image for SLAM
        3.1 System Architecture
            3.1.1 The SLAM Process
            3.1.2 System Flowchart
        3.2 Comparison of Sensors for SLAM
            3.2.1 Swiss Ranger SR-3000 Camera
            3.2.2 Description of Indoor Environments
            3.2.3 Problem Analysis of Using SR-3000 for SLAM and System Setup
    Chapter 4: Landmark Extraction and Extended Kalman Filter Based Algorithm
        4.1 Landmark Characteristics
            4.1.1 Landmark Extraction Method for SLAM
            4.1.2 Harris Corner Detection
        4.2 The System Models
            4.2.1 Vehicle Model
            4.2.2 Landmark Model
            4.2.3 Observation Model
        4.3 Extended Kalman Filter in SLAM
            4.3.1 Kalman Filter
            4.3.2 Linearization of System Models
        4.4 Data Association
        4.5 Summary
    Chapter 5: Experimental Results and Discussion
        5.1 Experimental Platform
        5.2 Experimental Results
        5.3 Discussion
    Chapter 6: Conclusion and Future Works
        6.1 Conclusion
        6.2 Future Works
    References

    List of Figures

    Figure 2.1: Correlation between robot path error and map error in [5]
    Figure 2.2: Observations and controls form a network of relationships between the pose of the robot and landmarks in the environment. These relationships are shown as thick lines [5].
    Figure 2.3: A simultaneous estimate of both robot and landmark locations is required. The true locations are never known or measured directly. Observations are made between true robot and landmark locations [6].
    Figure 2.4: The structure of the Kalman Filter in SLAM [25]
    Figure 2.5: EKF applied to a simulated data set [5]
    Figure 2.6: Matching omni-directional image and range data [19]
    Figure 2.7: Robustness of SIFT under different viewpoints
    Figure 2.8: SICK laser range finder horizontal scans
    Figure 3.1: Overview of the SLAM process
    Figure 3.2: Our system flowchart of the SLAM implementation
    Figure 3.3: The Swiss Ranger SR-3000
    Figure 3.4: The distance and intensity images returned by the SR-3000
    Figure 3.5: Corridor environment for implementing SLAM
    Figure 3.6: Distance data of the corridor from the SR-3000
    Figure 3.7: Ceiling of the corridor
    Figure 3.8: The depth image with noise
    Figure 3.9: Distance data of the corridor from the SR-3000
    Figure 3.10: The hardware setup for avoiding the ambiguity region in the depth image
    Figure 3.11: The intensity image of the ceiling with radial distribution
    Figure 4.1: The texture of the ceiling image
    Figure 4.2: SIFT descriptor matching on the depth image
    Figure 4.3: The relationship between the eigenvalues and three distinct cases: corner, line, flat
    Figure 4.4: The corner points extracted from the ceiling image
    Figure 4.5: The differential drive system of UBOT
    Figure 4.6: The robot coordinate of UBOT and the world coordinate
    Figure 4.7: The camera coordinate of the SR-3000
    Figure 4.8: The pin-hole camera model and the relationship between the image plane and the camera coordinate
    Figure 4.9: The camera coordinate of the SR-3000 and the robot coordinate
    Figure 4.10: The range
    Figure 4.11: Recursive process of the Kalman Filter [26]
    Figure 4.12: The two main steps: predict and update [26]
    Figure 4.13: The data association separates the observations into two parts
    Figure 5.1: The experimental platform UBOT
    Figure 5.2: The indoor environment used for this experiment
    Figure 5.3: The path in the corridor used for Test (1)
    Figure 5.4: The corridor navigation test of UBOT
    Figure 5.5: The experimental result of Test (1)
    Figure 5.6: The path in an indoor environment used for Test (2)
    Figure 5.7: The indoor environment used for Test (2)
    Figure 5.8: The experimental result of Test (2)
    Figure 5.9: The landmark positions and the estimated error
    Figure 5.10: The measured real distance between each pair of corner points

    List of Tables

    Table 3.1: Comparison of sensors for SLAM
    Table 5.1: The experimental results of Test (2)
    Table 5.2: The estimated error of the odometry and our system in Test (2)

    References

    [1] R. Smith, M. Self, and P. Cheeseman, “Estimating uncertain spatial relationships in robotics,” in I. J. Cox and G. T. Wilfong, editors, Autonomous Robot Vehicles, pages 167–193. Springer-Verlag, 1990.

    [2] R. C. Smith and P. Cheeseman, “On the representation and estimation of spatial uncertainty,” Technical Report TR 4760 & 7239, SRI, 1985.

    [3] G. Dissanayake, H. Durrant-Whyte, and T. Bailey. “A computationally efficient solution to the simultaneous localisation and map building (SLAM) problem.” Working notes of ICRA’2000 Workshop W4: Mobile Robot Navigation and Mapping, April 2000.

    [4] H. Durrant-Whyte, S. Majumder, S. Thrun, M. de Battista, and S. Scheding. “A Bayesian algorithm for simultaneous localization and map building.” In Proceedings of the 10th International Symposium of Robotics Research (ISRR’01), Lorne, Australia, 2001.

    [5] M. Montemerlo, S. Thrun, D. Koller, and B. Wegbreit, “FastSLAM: A factored solution to simultaneous localization and mapping,” in Proc. Nat. Conf. Artif. Intell., Edmonton, AB, Canada, 2002, pp. 593–598.

    [6] H. Durrant-Whyte and T. Bailey, “Simultaneous localization and mapping: part I,” IEEE Robotics & Automation Magazine, vol. 13, no. 2, pp. 99–110, 2006.

    [7] T. Bailey and H. Durrant-Whyte, “Simultaneous localization and mapping: part II,” IEEE Robotics & Automation Magazine, vol. 13, no. 3, pp. 108–117, 2006.

    [8] P. Moutarlier and R. Chatila, “An experimental system for incremental environment modeling by an autonomous mobile robot,” in 1st International Symposium on Experimental Robotics, June 1989.

    [9] P. Moutarlier and R. Chatila, “Stochastic multisensory data fusion for mobile robot location and environment modeling,” in 5th International Symposium on Robotics Research, Tokyo, 1989.

    [10] R. E. Kalman, “A new approach to linear filtering and prediction problems,” Transactions of the ASME Journal of Basic Engineering, pages 35–45, March 1960.

    [11] R. E. Kalman and R. S. Bucy, “New results in linear filtering and prediction theory,” Transactions of the ASME Journal of Basic Engineering, pages 95–108, March 1961.

    [12] J. J. Leonard and H. F. Durrant-Whyte, “Mobile robot localization by tracking geometric beacons,” IEEE Trans. Robotics and Automation, 7(3):376–382, June 1991.

    [13] A. J. Davison and D. W. Murray, “Simultaneous localization and map-building using active vision,” IEEE Trans. Pattern Anal. Machine Intell., vol. 24, no. 7, pp. 865–880, July 2002.

    [13] K. S. Chong and L. Kleeman, “Feature-based mapping in real large scale environments using an ultrasonic array,” Int. J. Rob. Res., 18(2):3–19, 1999.

    [14] D. G. Lowe, “Distinctive image features from scale-invariant keypoints”, International Journal of Computer Vision, vol. 60(2), pp. 91–110, 2004.

    [15] S. Se, D. G. Lowe, J. Little, “Mobile robot localization and mapping with uncertainty using scale-invariant visual landmarks”. Int. J. of Robotics Research, vol. 21(8), pp. 735–758, 2002.

    [16] S. Se, D. G. Lowe and J. Little, “Vision-based global localization and mapping for mobile robots”, IEEE Trans. on Robotics, vol. 21(3), pp. 364–375, 2005.

    [17] J.-H. Kim and M. J. Chung, “SLAM with omni-directional stereo vision sensor,” in Proceedings of the International Conference on Intelligent Robots and Systems, pp. 442–447, Las Vegas, October 2003.

    [18] C. Drocourt, L. Delahoche, B. Marhic, and A. Clerentin, “Simultaneous localization and map construction method using omni-directional stereoscopic information,” in Proceedings of the International Conference on Robotics and Automation, pp. 894–899, Washington, May 2002.

    [19] S. Kim, Se-Young Oh, “SLAM in Indoor Environments using Omni-directional Vertical and Horizontal Line Features”, Journal of Intelligent and Robotic Systems, Vol. 51, Issue 1, pp. 31-43, ISSN:0921-0296, Jan. 2008.

    [20] B. Büttgen, T. Oggier, M. Lehmann, R. Kaufmann, and F. Lustenberger, “CCD/CMOS lock-in pixel for range imaging: challenges, limitations and state-of-the-art,” in 1st Range Imaging Day, Zurich, Switzerland, June 2005.

    [21] R. Lange and P. Seitz, “Solid-State Time-of-Flight Range Camera,” IEEE J. Quantum Electronics, vol. 37, no. 3, pp. 390–397, March 2001.

    [22] C. Harris and M. Stephens, “A combined corner and edge detector,” in Proceedings of the Fourth Alvey Vision Conference, pp. 147–151, 1988.

    [23] Y. Bar-Shalom and T. E. Fortmann, “Tracking and Data Association.” Academic Press, 1988.

    [24] H. P. Moravec, “Towards automatic visual obstacle avoidance,” in Proceedings of the 5th International Joint Conference on Artificial Intelligence, Pittsburgh, PA, 1977. Carnegie-Mellon University.

    [25] P. Newman. “On the Structure and Solution of the Simultaneous Localisation and Map Building Problem.” PhD thesis, University of Sydney, March 1999.

    [26] Wikipedia, “Kalman filter,” http://en.wikipedia.org/wiki/Kalman_filter.

    Full-text availability: not authorized for public access (campus network)
    Full-text availability: not authorized for public access (off-campus network)
