| Graduate Student | 陳柏君 (Chen, Po Chune) |
|---|---|
| Thesis Title | Indoor Positioning and Tracking System for Drones (應用於無人飛行器的室內定位及追蹤系統) |
| Advisor | 馬席彬 (Ma, Hsi Pin) |
| Committee Members | 楊家驤, 黃元豪, 孫民 |
| Degree | Master |
| Department | Department of Electrical Engineering, College of Electrical Engineering and Computer Science |
| Year of Publication | 2015 |
| Academic Year | 104 |
| Language | English |
| Pages | 108 |
| Keywords (Chinese) | 無人飛行器, 室內定位系統, 圖形識別 |
| Keywords (English) | Drones, Indoor positioning system, Pattern recognition |
In recent years, drone-related applications such as photography, mapping, advertising, real estate, and disaster response have become increasingly popular. This thesis proposes an indoor positioning and tracking system for drones. The system consists of three components: a positioning component that determines the location of a specified wireless device in the indoor space, a tracking component that follows a selected object using pattern recognition, and a movement decision component that decides how the drone should move.
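To make the three-component architecture concrete, the following is a minimal C++ sketch of how the components could be wired together in a control loop. All names here (`estimatePosition`, `trackObject`, `decideMovement`, `Position`, `DroneCommand`) are illustrative placeholders rather than the thesis code, which is built on Parrot's AR.Drone SDK, the CV Drone wrapper, and OpenTLD.

```cpp
// Minimal sketch of the main loop tying the three components together.
// All types and functions are illustrative placeholders, not the thesis code.
#include <opencv2/core.hpp>
#include <opencv2/videoio.hpp>

struct Position     { double x = 0, y = 0; bool valid = false; };   // from positioning
struct DroneCommand { double vx = 0, vy = 0, vz = 0, yawRate = 0; };

// Positioning component: stub standing in for the CoO estimate
// obtained from the wireless device.
Position estimatePosition() { return Position{}; }

// Tracking component: stub standing in for the TLD bounding box
// of the selected object in the current frame.
cv::Rect trackObject(const cv::Mat&) { return cv::Rect(); }

// Movement decision component: stub that would combine both results
// into normalized velocity and yaw commands for the drone.
DroneCommand decideMovement(const Position&, const cv::Rect&,
                            const cv::Size&, const Position&)
{
    return DroneCommand{};
}

// Main loop: position -> track -> decide -> command the drone.
void controlLoop(cv::VideoCapture& camera, const Position& goal)
{
    cv::Mat frame;
    while (camera.read(frame)) {
        Position     pos = estimatePosition();
        cv::Rect     box = trackObject(frame);
        DroneCommand cmd = decideMovement(pos, box, frame.size(), goal);
        // cmd would be sent to the drone here (e.g., as normalized velocities).
        (void)cmd;
    }
}
```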
We implement a modified cell of origin (CoO) indoor positioning algorithm on a wireless device that deliberately limits its communication range and performs part of the computation on its built-in microcontroller. For the tracking component, we implement the tracking-learning-detection (TLD) algorithm, a computer vision algorithm that tracks an unknown object while learning its appearance. Once tracking succeeds, the tracking error is less than 10 cm. The movement decision component uses feedback from the positioning and tracking components to generate the drone's movement: it keeps the tracked object at the center of the camera image and flies the drone autonomously to the indoor location specified by the user.
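As a concrete illustration of the positioning idea, the sketch below computes a CoO-style estimate from the anchors whose short-range beacons are currently heard: because each wireless node limits its communication range, receiving its beacon already implies that the drone is inside that node's cell. The `Anchor` and `Beacon` structures and the RSSI threshold are assumptions made for this example; the modified CoO algorithm in the thesis may differ in detail.

```cpp
// Illustrative CoO-style positioning estimate (not the thesis implementation).
// Anchors are wireless nodes with known coordinates and deliberately short
// range; the position estimate is the centroid of the anchors currently heard.
#include <vector>

struct Anchor   { int id; double x, y; };           // installed node with known position
struct Beacon   { int anchorId; double rssiDbm; };  // one received packet
struct Estimate { double x, y; bool valid; };

// Hypothetical threshold: a beacon weaker than this is treated as noise,
// so only anchors whose cell contains the drone contribute to the estimate.
const double RSSI_THRESHOLD_DBM = -70.0;

Estimate estimateCellOfOrigin(const std::vector<Anchor>& anchors,
                              const std::vector<Beacon>& beacons)
{
    double sumX = 0.0, sumY = 0.0;
    int inRange = 0;
    for (const Beacon& b : beacons) {
        if (b.rssiDbm < RSSI_THRESHOLD_DBM) continue;        // outside the cell
        for (const Anchor& a : anchors) {
            if (a.id == b.anchorId) {                        // this cell is heard
                sumX += a.x;
                sumY += a.y;
                ++inRange;
                break;
            }
        }
    }
    if (inRange == 0) return Estimate{0.0, 0.0, false};      // no cell of origin
    return Estimate{sumX / inRange, sumY / inRange, true};   // centroid of cells
}
```

Using the centroid of all anchors in range, rather than a single strongest anchor, is one simple way to refine a plain CoO estimate when neighboring cells overlap.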
Finally, we propose a drone system that combines the CoO indoor positioning algorithm, the TLD tracking algorithm, and the movement decision algorithm. The system performs positioning with the wireless device, tracks objects through the drone's camera, keeps the camera aimed at the tracked object, and flies the drone to the desired location. Experimental results show a positioning accuracy of 2 m and a successful tracking rate 33% higher than the baseline tracking algorithms used for comparison. The main advantages of the positioning system are that it requires no dedicated model of the indoor environment and that its coverage is scalable. Moreover, the wireless device measures 3 cm × 2 cm × 2 cm, which makes the positioning system portable, and its power consumption of 49.9 mW at a 3.0 V operating voltage lets it run for more than 2 days on an 850 mAh battery.
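The camera-aiming behavior described above can be illustrated with a simple proportional rule that converts the offset between the tracked bounding box and the image center into normalized steering commands. The `DroneCommand` fields and the gains are assumptions made for this example, not the thesis's movement decision logic; in the actual system such commands would go through the drone control library (the thesis platform is a Parrot AR.Drone controlled via the CV Drone wrapper).

```cpp
// Illustrative movement decision: steer so that the tracked object stays
// at the center of the camera image. Gains and command fields are assumed.
#include <opencv2/core.hpp>
#include <algorithm>

struct DroneCommand {
    double yawRate;      // normalized [-1, 1], positive turns right
    double verticalVel;  // normalized [-1, 1], positive climbs
};

DroneCommand centerObject(const cv::Rect& box, const cv::Size& image)
{
    // Offset of the bounding-box center from the image center,
    // normalized to [-1, 1] along each axis.
    double dx = (box.x + box.width  / 2.0 - image.width  / 2.0) / (image.width  / 2.0);
    double dy = (box.y + box.height / 2.0 - image.height / 2.0) / (image.height / 2.0);

    // Simple proportional control; the gains are illustrative.
    const double kYaw = 0.5, kVert = 0.5;
    DroneCommand cmd;
    cmd.yawRate     = std::max(-1.0, std::min(1.0,  kYaw  * dx));
    cmd.verticalVel = std::max(-1.0, std::min(1.0, -kVert * dy));  // image y grows downward
    return cmd;
}
```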