Graduate Student: | 島津達則 Shimazu Tatsunori |
Thesis Title: | 適用於混沌光達系統之每秒10幀即時深度圖FPGA系統設計與實作 (A 10 Frames/Sec Real-Time Depth Map FPGA System for Chaos LiDAR Systems) |
Advisor: | 黃元豪 Huang, Yuan-Hao |
Committee Members: | 蔡佩芸 Tsai, Pei-Yun; 陳坤志 Chen, Kun-Chih; 沈中安 Shen, Chung-An |
Degree: | Master (碩士) |
Department: | College of Electrical Engineering and Computer Science - Department of Electrical Engineering |
Year of Publication: | 2022 |
Graduation Academic Year: | 110 |
Language: | English |
Pages: | 57 |
Chinese Keywords: | 混沌、光達、即時、現場可程式化邏輯閘陣列 |
Keywords: | chaos, LiDAR, real-time, FPGA |
LiDAR is a technique for measuring the distance or depth information of a target. The distance to the target can be calculated by estimating the time the optical signal takes to travel from the device to the target. Because the light source used in this system is chaotic light, a noise-like signal, the system is called a chaos LiDAR system.
Chaos LiDAR systems can be applied in various fields, including depth measurement, autonomous driving, 3D modeling, augmented reality, virtual reality, and more. To guarantee the reliability and effectiveness of the chaos LiDAR system, a real-time LiDAR system is imperative. This thesis presents a real-time implementation of chaos LiDAR on a field-programmable gate array and a personal computer, describes the challenges encountered during the research, such as latency between optical instruments and clock-domain-crossing signal processing, and builds a graphical user interface; it explains the solutions to these problems, demonstrates the results, and compares them with existing products. With a throughput of 110 thousand pixels per second, the chaos LiDAR system delivers 10 frames per second of depth maps 100 points long by 100 points wide, with millimeter-level accuracy.
Light detection and ranging (LiDAR) is a technique that measures the distance or depth information of a target. The distance to the target can be calculated by estimating the time the transmitted optical signal takes to travel from the device to the target. Since the light source used in the chaos LiDAR system is a chaotic laser, which produces a noise-like signal, the system is called a chaos LiDAR system. Chaos LiDAR systems can be applied in various fields, including depth measurement, autonomous driving, 3D modeling, augmented reality, virtual reality, etc. To ensure the reliability and validity of the chaos LiDAR system, a real-time LiDAR system is imperative. This thesis presents the real-time implementation of the chaos LiDAR system on a field-programmable gate array (FPGA) and a personal computer (PC), together with the challenges encountered during the research, the solutions to those problems, and the demonstration results. With a throughput of 110 thousand pixels per second, the chaos LiDAR system provides 10 frames per second of 100×100 depth maps with millimeter-level accuracy.
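The time-of-flight principle described above can be sketched numerically: because the chaotic waveform is noise-like, its autocorrelation has a single sharp peak, so cross-correlating the transmitted reference with the received echo and locating the peak lag recovers the round-trip delay. A minimal Python sketch of this idea follows; the sample rate, waveform length, and delay are illustrative assumptions, not the thesis's actual system parameters.

```python
import numpy as np

C = 3.0e8  # speed of light (m/s)

def estimate_distance(reference: np.ndarray, echo: np.ndarray, fs: float) -> float:
    """Estimate target distance from the lag of the cross-correlation peak
    between the transmitted chaotic waveform and the received echo."""
    corr = np.correlate(echo, reference, mode="full")
    lag = int(np.argmax(corr)) - (len(reference) - 1)  # round-trip delay in samples
    tof = lag / fs                                     # round-trip time of flight (s)
    return C * tof / 2                                 # halve for the one-way distance

# Toy demonstration: a synthetic noise-like waveform delayed by a known lag.
rng = np.random.default_rng(0)
fs = 1.25e9                            # assumed ADC sample rate (Hz)
reference = rng.standard_normal(4096)  # stands in for the chaotic waveform
delay = 250                            # simulated round-trip delay in samples
echo = np.concatenate([np.zeros(delay), reference])[: len(reference)]
distance = estimate_distance(reference, echo, fs)  # -> 30.0 m for this setup
```

The sharp, thumbtack-like autocorrelation of a noise-like signal is what makes the peak unambiguous; a periodic source would produce repeated correlation peaks and ambiguous range estimates.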