Graduate Student: 黃雅彙 (Huang, Ya-Hui)
Thesis Title: 4D毫米波雷達應用於自駕車自運動估計 (Autonomous Vehicle Ego-Motion Estimation Utilizing 4D Millimeter-Wave Radar)
Advisor: 王培仁 (Wang, Pei-Jen)
Oral Examination Committee: 劉晉良 (Liu, Jinn-Liang), 黃仲誼 (Huang, Chung-I)
Degree: Master
Department: College of Engineering, Department of Power Mechanical Engineering
Year of Publication: 2024
Graduation Academic Year: 113 (ROC calendar)
Language: Chinese
Number of Pages: 74
Keywords (Chinese): 毫米波雷達, 自運動估計, 隨機抽樣一致, 運動分割
Keywords (English): Millimeter-wave Radar, Ego-motion Estimation, Random Sample Consensus, Motion Segmentation
The perception system of an autonomous vehicle must fuse information from several kinds of sensors to determine the vehicle's position and the state of its surroundings; localization capability is therefore a core technology of any autonomous driving system and a key prerequisite for fully unmanned autonomous navigation.
In recent years, the development of 4D millimeter-wave radar has opened up new possibilities for autonomous driving. The marked increase in measurement density allows autonomous vehicles to perform well in highly complex environments, and because the sensor is largely immune to weather interference, it holds a clear advantage over mainstream sensors such as cameras and LiDAR.
This thesis exploits the relationship between radial velocity and target direction that a millimeter-wave radar observes when measuring static objects. A Random Sample Consensus (RANSAC) algorithm segments the radar point cloud into static points, such as buildings and the ground, and dynamic points, such as moving pedestrians and vehicles, while simultaneously recovering the radar's instantaneous velocity; a vehicle motion model then yields the vehicle's motion state in three-dimensional space. The work extends conventional two-dimensional ego-motion estimation to three dimensions, so that an autonomous vehicle can be localized on intricate, undulating roads such as highway loop ramps. Because the algorithm requires neither heavy computing power nor a pre-trained model, it is easy to implement on an on-board vehicle computer, runs in real time, and is robust, making it well suited to safety-critical autonomous-vehicle localization.
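The static-point constraint that drives this segmentation can be written compactly. The following is a sketch of the relation; the azimuth/elevation notation is an assumption chosen for illustration, not the thesis's own symbols:

$$
v_r = -\mathbf{d}^{\top}\mathbf{v}_s,
\qquad
\mathbf{d} =
\begin{pmatrix}
\cos\phi\cos\theta\\
\cos\phi\sin\theta\\
\sin\phi
\end{pmatrix}
$$

Here $v_r$ is the Doppler (radial) velocity measured for a static detection, $\mathbf{d}$ is the unit direction from the radar to the target with azimuth $\theta$ and elevation $\phi$, and $\mathbf{v}_s$ is the radar's instantaneous velocity. Three detections with linearly independent directions suffice to solve for $\mathbf{v}_s$; detections whose residual against this model exceeds a threshold are classified as dynamic.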
The perception system of an autonomous vehicle integrates measurements from various sensors through computational algorithms to determine the vehicle's position and its surrounding environment. Localization capability is considered one of the core technologies of autonomous driving and serves as a crucial key to realizing unmanned autonomous navigation.
In recent years, the emergence of 4D millimeter-wave radar has brought many new possibilities to the field of autonomous vehicles. The significant improvement in measurement density enables effective operation in highly complex driving scenarios. In addition, its resilience to weather and environmental interference gives it an advantage over mainstream cameras and LiDAR.
This thesis exploits the fixed relationship between radial velocity and target direction that a millimeter-wave radar observes when measuring static objects. With the Random Sample Consensus (RANSAC) algorithm, the radar point cloud is segmented into a static point cloud and a dynamic point cloud while the instantaneous velocity of the radar is obtained simultaneously. Finally, a vehicle motion model is used to estimate the vehicle's motion in three-dimensional space.
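As a concrete illustration of this segmentation-and-velocity step, here is a minimal Python sketch assuming per-detection unit direction vectors and Doppler velocities from a single 4D radar scan; the function name, iteration count, and inlier threshold are illustrative assumptions, not the thesis implementation.

```python
import numpy as np

def estimate_ego_velocity(directions, doppler, n_iters=200, tol=0.1, rng=None):
    """RANSAC estimate of the 3D sensor velocity from one radar scan.

    directions : (N, 3) unit vectors from the radar to each detection.
    doppler    : (N,) measured radial velocities in m/s (sign convention
                 assumed: positive = receding; radar drivers may differ).
    For a static target, doppler[i] = -directions[i] @ v_sensor, so three
    detections with linearly independent directions determine v_sensor.
    """
    rng = rng or np.random.default_rng()
    n = len(doppler)
    best_inliers = np.zeros(n, dtype=bool)
    for _ in range(n_iters):
        idx = rng.choice(n, size=3, replace=False)
        D = directions[idx]
        if abs(np.linalg.det(D)) < 1e-6:       # near-degenerate sample, skip
            continue
        v = np.linalg.solve(D, -doppler[idx])   # candidate sensor velocity
        residuals = np.abs(directions @ v + doppler)
        inliers = residuals < tol               # static points fit the model
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # least-squares refit on the consensus set (the static point cloud)
    v_best, *_ = np.linalg.lstsq(directions[best_inliers],
                                 -doppler[best_inliers], rcond=None)
    return v_best, best_inliers                 # outliers form the dynamic cloud
```

In this reading, the inlier set is the static point cloud that supports the ego-velocity estimate, while the outliers form the dynamic point cloud used for motion segmentation of moving road users.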
The research extends conventional two-dimensional vehicle ego-motion estimation to three-dimensional space, assisting autonomous-vehicle localization on complex and undulating roads and highways. The algorithm is easy to implement because it requires neither powerful computing resources nor pre-trained models; its real-time operation and robustness make it well suited to localization for autonomous vehicles, where safety is paramount.
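To indicate how per-scan velocity estimates can be integrated into a three-dimensional motion state, the following is a minimal dead-reckoning sketch; the first-order rotation update and the availability of a body angular-rate vector (e.g., from the vehicle motion model or an IMU) are illustrative simplifications, not the thesis's exact motion model.

```python
import numpy as np

def dead_reckon_step(position, rotation, v_body, omega, dt):
    """Advance the vehicle pose in the world frame by one radar scan interval.

    position : (3,) world-frame position.
    rotation : (3, 3) body-to-world rotation matrix.
    v_body   : (3,) ego velocity in the body frame (e.g., from RANSAC above).
    omega    : (3,) body angular rates (roll, pitch, yaw) in rad/s.
    """
    position = position + rotation @ v_body * dt
    # first-order rotation update via the skew-symmetric matrix of omega
    wx, wy, wz = omega
    skew = np.array([[0.0, -wz,  wy],
                     [ wz, 0.0, -wx],
                     [-wy,  wx, 0.0]])
    rotation = rotation @ (np.eye(3) + skew * dt)
    return position, rotation
```

A first-order update like this keeps the per-scan cost negligible, which is consistent with the abstract's emphasis on real-time operation without heavy computing resources; a production system would typically re-orthonormalize the rotation or integrate on SO(3) instead.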