
Graduate Student: 黃雅彙 (Huang, Ya-Hui)
Thesis Title: 4D毫米波雷達應用於自駕車自運動估計
(Autonomous Vehicle Ego-Motion Estimation Utilizing 4D Millimeter-Wave Radar)
Advisor: 王培仁 (Wang, Pei-Jen)
Committee Members: 劉晉良 (Liu, Jinn-Liang); 黃仲誼 (Huang, Chung-I)
Degree: Master's
Department: College of Engineering, Department of Power Mechanical Engineering
Year of Publication: 2024
Graduation Academic Year: 113 (ROC calendar)
Language: Chinese
Number of Pages: 74
Chinese Keywords: 毫米波雷達, 自運動估計, 隨機抽樣一致, 運動分割
English Keywords: Millimeter-wave Radar, Ego-motion Estimation, Random Sample Consensus, Motion Segmentation
    The perception system of an autonomous vehicle must fuse several kinds of sensor information to determine the vehicle's position and the state of its surroundings. Localization capability is therefore a core technology of an autonomous driving system, and a key enabler of fully unmanned autonomous navigation.
    In recent years, the development of 4D millimeter-wave radar has opened many new possibilities for autonomous driving. The marked increase in measurement density lets autonomous vehicles operate effectively in highly complex environments, and because the sensor is unaffected by weather, it holds clear advantages over mainstream sensors such as cameras and LiDAR.
    This thesis exploits the relationship between radial velocity and target direction that holds when a millimeter-wave radar measures static objects. Using the Random Sample Consensus (RANSAC) algorithm, the radar point cloud is segmented into static points, such as buildings and the ground, and dynamic points, such as moving pedestrians and vehicles, while the radar's instantaneous velocity is obtained at the same time; a vehicle motion model then yields the vehicle's motion state in three-dimensional space. The thesis extends the established two-dimensional vehicle ego-motion estimation to three dimensions, helping autonomous vehicles localize on intricate, undulating roads such as circular freeway interchanges. Because the algorithm requires neither heavy computing power nor a pre-trained model, it is straightforward to implement on an in-vehicle computer, runs in real time, and is robust, making it well suited to safety-critical autonomous vehicle localization.
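    For context, the relation exploited above can be written out explicitly. The following is the standard formulation from the Doppler-radar ego-motion literature, in illustrative notation rather than the thesis's own symbols: for the i-th static detection at azimuth \theta_i and elevation \phi_i, the measured radial velocity is the projection of the negated sensor velocity v_s = (v_x, v_y, v_z)^T onto the line of sight,

        v_{r,i} = -\begin{bmatrix} \cos\theta_i \cos\phi_i & \sin\theta_i \cos\phi_i & \sin\phi_i \end{bmatrix} \begin{bmatrix} v_x \\ v_y \\ v_z \end{bmatrix}.

    Each static detection therefore contributes one linear equation in three unknowns, so three detections with linearly independent lines of sight determine v_s, while detections on moving objects violate the relation and fall out as RANSAC outliers.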


    The perception system of an autonomous vehicle integrates measurements from various sensors via computational algorithms to determine the vehicle's position and the state of its surroundings. Positioning capability is considered one of the core technologies in the field of autonomous driving and serves as a crucial key to establishing unmanned autonomous navigation.
    Recently, the emergence of 4D millimeter-wave radar has brought many new possibilities to the field of autonomous vehicles. The significant improvement in measurement density enables autonomous vehicles to function effectively in highly complex scenarios. Additionally, radar's resilience to weather and environmental interference gives it an advantage over mainstream cameras and LiDAR.
    This thesis exploits the fixed relationship between radial velocity and target direction that holds when a millimeter-wave radar measures static objects. With the Random Sample Consensus (RANSAC) algorithm, the radar point cloud is segmented into a static point cloud and a dynamic point cloud while the instantaneous velocity of the radar is obtained simultaneously. Finally, a vehicle motion model is used to estimate the vehicle's motion in three-dimensional space.
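    As a concrete sketch of this step (not the thesis's actual implementation), the following Python function estimates the sensor's instantaneous 3D velocity from one radar scan using RANSAC over the static-point relation v_r = -d . v_s; the function name, parameter defaults, and NumPy-based structure are assumptions for illustration only.

        import numpy as np

        def estimate_ego_velocity(azimuth, elevation, v_radial,
                                  n_iters=100, inlier_thresh=0.1, min_inliers=10):
            # Unit line-of-sight direction vectors toward each detection (N x 3).
            D = np.column_stack((np.cos(azimuth) * np.cos(elevation),
                                 np.sin(azimuth) * np.cos(elevation),
                                 np.sin(elevation)))
            rng = np.random.default_rng()
            best_inliers = None
            for _ in range(n_iters):
                # Minimal sample: three detections determine the 3D velocity.
                idx = rng.choice(len(v_radial), size=3, replace=False)
                try:
                    v = np.linalg.solve(-D[idx], v_radial[idx])
                except np.linalg.LinAlgError:
                    continue  # degenerate sample; draw a new one
                residuals = np.abs(-D @ v - v_radial)
                inliers = residuals < inlier_thresh
                if best_inliers is None or inliers.sum() > best_inliers.sum():
                    best_inliers = inliers
            if best_inliers is None or best_inliers.sum() < min_inliers:
                return None, None  # too few static points to trust the fit
            # Refit by least squares on all inliers (the static point cloud).
            v, *_ = np.linalg.lstsq(-D[best_inliers], v_radial[best_inliers],
                                    rcond=None)
            return v, best_inliers

    The returned inlier mask is exactly the motion segmentation: inliers correspond to static points such as buildings and the ground, while outliers are detections on moving objects.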
    This research extends conventional two-dimensional vehicle ego-motion estimation to three-dimensional space, assisting autonomous vehicle localization on complex and undulating roads and highways. The algorithm is easy to implement, as it requires neither powerful computing resources nor pre-trained models. Its real-time operation and robustness make it suitable for autonomous vehicle localization technology that prioritizes safety.
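    One step worth making explicit, using standard rigid-body kinematics rather than the thesis's own notation: a sensor mounted at lever arm r from the vehicle reference point, on a body translating with velocity v and rotating with angular velocity \omega, moves with

        v_s = v + \omega \times r,

    so once the mounting geometry is known, the sensor-velocity estimates constrain the vehicle's translational and angular velocity; with a single sensor, the remaining degrees of freedom are supplied by a vehicle motion model.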

    Abstract (Chinese) I
    Abstract (English) II
    Acknowledgements III
    Table of Contents IV
    List of Figures VIII
    List of Tables XI
    Nomenclature XII
    Chapter 1  Introduction 1
      1-1 Research Background 1
      1-2 Research Motivation and Objectives 2
      1-3 Literature Review 3
        1-3-1 Vehicle Ego-Motion Estimation and Localization 3
        1-3-2 Radar Ego-Motion Estimation 6
    Chapter 2  Fundamental Theory 9
      2-1 Analysis of Sensors for Autonomous Vehicles 9
      2-2 Millimeter-Wave Radar Measurement Principles and Challenges 10
        2-2-1 Range 11
        2-2-2 Radial Velocity 12
        2-2-3 Angle of Arrival 13
        2-2-4 Radar Signal Processing Pipeline 15
        2-2-5 Limitations and Challenges of Automotive Radar 15
      2-3 Random Sample Consensus (RANSAC) Algorithm 16
        2-3-1 Parameters and Algorithm Flow 17
        2-3-2 Setting the Maximum Number of Iterations 18
        2-3-3 Setting the Sample Size 19
        2-3-4 Setting the Inlier Error Threshold 19
        2-3-5 Setting the Early-Termination Threshold 20
        2-3-6 Improved RANSAC Variants 20
      2-4 Two-Dimensional Doppler Radar Ego-Motion Estimation 21
        2-4-1 Relationship between Static-Object Velocity and Radar Azimuth 22
        2-4-2 Sensor Instantaneous Velocity Estimation 23
        2-4-3 Vehicle Ego-Motion Estimation 23
    Chapter 3  Three-Dimensional Vehicle Ego-Motion Estimation and Experiments 29
      3-1 Introduction 29
      3-2 Three-Dimensional Vehicle Ego-Motion Estimation Method 29
        3-2-1 Relationship between Static-Object Velocity and Radar Azimuth/Elevation 29
        3-2-2 Sensor Instantaneous Velocity Estimation 30
        3-2-3 Three-Dimensional Vehicle Motion Model and Ego-Motion Estimation 30
        3-2-4 Odometry Computation 32
      3-3 4D Doppler Radar Dataset 33
      3-4 Evaluation Metrics 34
        3-4-1 Relative Pose Error 34
        3-4-2 Average Translational and Rotational Error 35
      3-5 System and Environment 36
        3-5-1 Computer Configuration 36
        3-5-2 Software Tools 36
    Chapter 4  Experimental Results and Analysis 41
      4-1 Introduction 41
      4-2 Dataset and Algorithm Tests and Analysis 41
        4-2-1 Distribution of Radar Measurement Point Counts 41
        4-2-2 Measurement Point Error at Different Radar Instantaneous Velocities 42
        4-2-3 Test Route Description 43
        4-2-4 Relationship between Radial Velocity Error and Static/Dynamic Targets 43
      4-3 Performance Analysis of RANSAC Parameter Configurations 44
        4-3-1 Inlier Error Threshold Tuning 44
        4-3-2 Maximum Iteration Count Tuning 47
        4-3-3 Early-Termination Threshold Tuning 48
        4-3-4 Sample Size Tuning 48
      4-4 Analysis of Odometry Evaluation Results on Test Routes 49
        4-4-1 Relative Pose Error Results 49
        4-4-2 Average Translational and Rotational Error Results 51
      4-5 Verification and Analysis of Angular Velocity Results 52
      4-6 Visual Verification 53
      4-7 Discussion of Verification Results 54
    Chapter 5  Conclusions and Future Work 68
      5-1 Conclusions 68
      5-2 Future Work 69
    References 70

