| Graduate Student | 簡光宏 Chien, Kuang-Hung |
|---|---|
| Thesis Title | 結合光達與彩色攝影機之即時定位系統 (Real-time Localization from LiDAR and RGB Cues) |
| Advisor | 陳煥宗 Chen, Hwann-Tzong |
| Committee Members | 鄭嘉珉 Cheng, Chia-Ming; 王書凡 Wang, Shu-Fan; 陳鼎介 Chen, Ding-Jie |
| Degree | Master |
| Department | College of Electrical Engineering and Computer Science, Institute of Information Systems and Applications |
| Year of Publication | 2018 |
| Academic Year of Graduation | 107 |
| Language | Chinese |
| Number of Pages | 24 |
| Chinese Keywords | 光達 (LiDAR), 即時定位 (real-time localization), 車輛即時定位 (real-time vehicle localization), 光達即時定位 (real-time localization using LiDAR) |
| English Keywords | LiDAR, Real-time Localization, Real-time Vehicle Localization, Real-time Localization using LiDAR |
Abstract (translated from the Chinese):

This thesis presents a real-time vehicle localization system that combines a LiDAR sensor and a monocular RGB camera with simultaneous localization and mapping (SLAM). Existing satellite-based positioning can only serve navigation systems that reference static maps; it cannot respond to dynamic environments in real time, and its signal strength is easily degraded by the surroundings, leading to inaccurate localization. In the proposed system, we therefore use the monocular RGB camera together with depth maps converted from LiDAR data to build a point cloud of the surroundings via SLAM. This point cloud is then matched against a pre-built point-cloud map annotated with accurate GPS coordinates to find the closest position of the vehicle in the map. Finally, as the vehicle moves, the matching and correction are repeated to obtain the vehicle's accurate position and trajectory.
This thesis describes a real-time vehicle localization system that uses LiDAR and RGB cues to achieve simultaneous localization and mapping (SLAM). Existing localization methods that rely merely on GPS information are suited only to navigation systems with static maps; such systems can neither adapt to dynamic scenarios nor benefit from the abundant visual cues in the surroundings. In our system, we use SLAM to construct a point cloud from the visual cues acquired by a monocular RGB camera and the corresponding LiDAR information. Our system compares the currently reconstructed point cloud with a pre-acquired, GPS-aware point cloud in the database, and infers the accurate position and odometry of the vehicle via continuous adjustment as the vehicle moves forward.
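The map-matching step described above (comparing the live reconstructed point cloud against a pre-built, GPS-tagged point-cloud map) is commonly realized with iterative closest point (ICP) registration. The abstract does not state which matching algorithm the thesis uses, so the following is only an illustrative sketch: a minimal point-to-point ICP in plain NumPy, where the function names (`icp`, `best_rigid_transform`) and parameters are hypothetical, not taken from the thesis.

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares rigid transform (R, t) mapping src onto dst (Kabsch/SVD)."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_s).T @ (dst - mu_d)          # cross-covariance of centered clouds
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                   # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_d - R @ mu_s
    return R, t

def icp(src, dst, iters=30):
    """Brute-force point-to-point ICP aligning cloud `src` to cloud `dst`."""
    cur = src.copy()
    R_total, t_total = np.eye(3), np.zeros(3)
    for _ in range(iters):
        # Nearest neighbour in dst for every point in cur (O(N*M); a real
        # system would use a k-d tree or voxel hashing instead).
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        matched = dst[d2.argmin(axis=1)]
        R, t = best_rigid_transform(cur, matched)
        cur = cur @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total, cur
```

In a localization pipeline like the one described, `dst` would be a local window of the GPS-tagged map and `src` the SLAM-reconstructed cloud; the recovered transform then places the vehicle in the map's coordinate frame, and the loop is re-run as the vehicle moves.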