
Graduate Student: Lin, Yung-Chen
Thesis Title: A 660K pixels/sec Real-Time Depth Sensing Accelerator for MIMO Chaos LiDAR Systems
Advisor: Huang, Yuan-Hao
Committee Members: Tsai, Pei-Yun; Shen, Chung-An; Chen, Kun-Chih
Degree: Master
Department: College of Electrical Engineering and Computer Science, Department of Electrical Engineering
Year of Publication: 2023
Academic Year of Graduation: 111 (2022-2023)
Language: English
Number of Pages: 64
Keywords: Chaos LiDAR, Time of Flight (ToF), MIMO, FPGA


    LiDAR technology estimates the distances of environmental objects by emitting a laser and measuring the time of flight. Recent developments in fields such as autonomous driving and AR/VR have led to considerably growing interest in LiDAR systems. Because of the random, noise-like nature of a chaos laser, it has been shown that using a chaos laser as the light source of a LiDAR provides robust and high-accuracy depth-sensing results. Despite its outstanding precision and resistance to interference, the chaos LiDAR system suffers from two major drawbacks: (1) the depth points it senses are usually too sparse, and (2) the speed of estimating pixel depths is not fast enough for real-time processing. These drawbacks limit the usage scenarios of the chaos LiDAR system. To overcome them, this research proposes a method that enhances spatial and temporal resolution simultaneously. A two-channel multi-input multi-output (MIMO) scheme and the corresponding control flow were designed to acquire twice the number of depth-sensing results, and time-of-flight (TOF) hardware was designed to accelerate the depth-sensing process. The experimental results demonstrate that the proposed chaos LiDAR system captures two depth maps through two scanning channels simultaneously; each channel achieves a throughput of 330 K pixels per second, so the whole MIMO chaos LiDAR system achieves a throughput of 660 K pixels per second.
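    The ranging principle the abstract describes can be illustrated with a minimal sketch: the noise-like reference waveform is cross-correlated with the delayed target return, the correlation peak gives the round-trip lag, and the lag is converted to a one-way distance. All names, the sampling rate, and the toy signal below are illustrative assumptions, not parameters taken from the thesis.

    ```python
    import numpy as np

    C = 3.0e8    # speed of light, m/s
    FS = 1.25e9  # assumed ADC sampling rate, Hz (illustrative)

    def estimate_distance(reference, target, fs=FS):
        """Estimate one-way distance from the cross-correlation peak lag."""
        # Full cross-correlation; index (len(reference) - 1) is zero lag.
        corr = np.correlate(target, reference, mode="full")
        lag = np.argmax(corr) - (len(reference) - 1)
        tof = lag / fs            # round-trip time of flight, s
        return C * tof / 2.0      # one-way distance, m

    # Toy demonstration: a noise-like "chaos" signal delayed by 100 samples.
    rng = np.random.default_rng(0)
    ref = rng.standard_normal(4096)
    tgt = np.concatenate([np.zeros(100), ref])[:4096]
    d = estimate_distance(ref, tgt)
    # 100 samples at 1.25 GS/s -> 80 ns round trip -> 12 m one-way
    ```

    The thesis additionally refines the integer-sample peak with an interpolation algorithm (Section 2.3) to obtain sub-sample depth resolution; the sketch above stops at the integer lag.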

    1 Introduction 1
      1.1 Research Motivation 3
      1.2 Chaos LiDAR 4
      1.3 Organization of This Thesis 6
    2 Depth Map Sensing of Chaos LiDAR 7
      2.1 Square-wave Removal Filter 8
      2.2 Correlation Between the Reference and Target Signals 9
      2.3 Interpolation Algorithm 11
    3 MIMO Chaos LiDAR System 17
      3.1 Important Devices in LiDAR 17
      3.2 MIMO Chaos LiDAR System 21
      3.3 Hardware Block Diagram of MIMO Chaos LiDAR System 22
      3.4 Control Design of MIMO Chaos LiDAR System 25
      3.5 Graphical User Interface (GUI) 32
    4 Time of Flight (TOF) Hardware Optimization 35
      4.1 Throughput Analysis 36
      4.2 Square-wave Removal Filter Design 37
      4.3 Correlator Hardware 40
      4.4 Peak Search Hardware 44
      4.5 Interpolator Hardware 45
    5 System Demonstration and Hardware Results 47
      5.1 Settings of the System Parameters 47
        5.1.1 Capacities of the FIFOs 48
        5.1.2 Hardware and Control Parameters 48
        5.1.3 Delay Parameters 49
      5.2 TOF Hardware Implementation Results 50
      5.3 MIMO Chaos LiDAR System Demonstration 51
        5.3.1 Real-Time MIMO Chaos LiDAR System Demonstration 53
        5.3.2 Resolution Enhancement Results 56
    6 Conclusion and Future Work 59
      6.1 Conclusion 59
      6.2 Future Work 60
    References 61

