
Graduate Student: Wu, Hsin-Jung (吳欣蓉)
Thesis Title: Object Tracking Based on Multi-tracker Selection (基於多追蹤器選擇之物體追蹤方法)
Advisor: Lin, Chia-Wen (林嘉文)
Committee Members: 葉梅珍, 王鈺強
Degree: Master
Department: Department of Electrical Engineering, College of Electrical Engineering and Computer Science
Year of Publication: 2014
Graduation Academic Year: 102
Language: English
Number of Pages: 47
Keywords (Chinese): 物體追蹤, 多重追蹤器, 特徵權重
Keywords (English): Object Tracking, Multiple Trackers, Feature Weighting
In this thesis, we propose a framework that integrates multiple trackers to improve object tracking performance. Object tracking has been studied for decades, yet, to the best of our knowledge, no existing tracker can adapt to the wide range of conditions found in real-world scenes. Statistics show that, because trackers are built on different methods, each tracker handles some environments relatively well and is especially prone to failure in others. To address this problem, earlier work decomposed the tracking framework into several components, sampled each component to obtain multiple basic trackers, and combined them; however, that framework places too many restrictions on which basic trackers can be combined. This thesis therefore aims to combine multiple trackers so as to adapt broadly to changing environments and improve tracking performance.
During the combination, trackers are classified by the conditions they handle well. The classification covers the environmental factors that cause tracking failure, such as illumination change, occlusion, camera motion, and changes in the target's appearance. Trackers whose properties differ the most are combined, which yields the most pronounced performance gain and the widest coverage of situations. In addition, to make the appearance evaluation more reliable, each feature is weighted according to how similar the object and the background appear under that feature in the current environment, with the weight inversely proportional to that similarity.
The experimental results demonstrate the advantages of the proposed method. First, the combination is not restricted by the trackers' underlying methods; in other words, it can be built on any tracker in order to improve that tracker's performance. Second, the results confirm that, through tracker selection, trackers with different properties can be combined effectively to achieve the largest performance improvement.
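To make the selection criterion concrete, the following minimal Python sketch scores each tracker per challenge factor and picks the pair whose profiles differ the most. The tracker names (Struck, TLD, SCM), the numeric scores, and the Euclidean-distance criterion are illustrative assumptions, not the exact procedure or values used in the thesis.

    from itertools import combinations

    import numpy as np

    # Hypothetical per-tracker robustness scores, one per challenge factor
    # (higher = the tracker handles that factor better). Order of entries:
    # illumination, occlusion, camera motion, appearance change.
    tracker_profiles = {
        "Struck": np.array([0.7, 0.4, 0.6, 0.5]),
        "TLD":    np.array([0.5, 0.7, 0.4, 0.6]),
        "SCM":    np.array([0.6, 0.6, 0.3, 0.8]),
    }

    def complementarity(a, b):
        # How differently two trackers behave across the challenge factors;
        # a larger distance means the pair covers more situations between them.
        return float(np.linalg.norm(tracker_profiles[a] - tracker_profiles[b]))

    # Pick the pair whose property profiles differ the most, i.e. the most complementary pair.
    best_pair = max(combinations(tracker_profiles, 2), key=lambda pair: complementarity(*pair))
    print("most complementary pair:", best_pair)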


In this thesis, we propose a framework that improves object tracking by integrating multiple trackers. Object tracking has received attention for decades, but, to the best of our knowledge, no single tracker can adapt to the changing environments encountered in the real world. Statistics show that, because trackers are built on different kinds of methods, each tracker performs relatively well on some sequences and is prone to failure on others, depending on the circumstances. Previous work addressed this problem by sampling and combining many basic trackers, but that approach places too many restrictions on which trackers can be combined. In this thesis, we therefore aim to adapt broadly to changing environments by combining multiple trackers and thereby enhance object tracking performance.
In the tracker-combination step, we classify trackers into different properties according to the environments in which they perform well. The classification covers the environmental factors that cause tracking failure, such as illumination variation, occlusion, camera motion, and target appearance change. We combine complementary trackers according to this property classification in order to achieve the most significant performance improvement and to cover the widest range of situations. Moreover, to make the appearance evaluation more reliable, each feature is weighted according to the similarity between the object and the background; the weight is inversely proportional to that similarity.
The experimental results show the advantages of our method. First, the combination is not restricted to any particular tracking method; in other words, any tracker can be selected for combination so as to improve its performance. Second, we confirm that our tracker selection can effectively combine trackers with different properties and achieve the most significant performance improvement.
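The adaptive weighting rule, which gives lower weight to features under which the object resembles the surrounding background, can be sketched as follows. The use of normalized histograms per feature channel, the Bhattacharyya coefficient as the similarity measure, and the sum-to-one normalization are illustrative assumptions rather than the thesis's exact formulation.

    import numpy as np

    def feature_similarity(obj_hist, bg_hist):
        # Bhattacharyya coefficient between two normalized histograms:
        # 1.0 means the feature cannot separate object from background,
        # 0.0 means it is fully discriminative.
        return float(np.sum(np.sqrt(obj_hist * bg_hist)))

    def adaptive_weights(obj_feats, bg_feats, eps=1e-6):
        # Weight each feature channel inversely to its object/background
        # similarity, then normalize the weights to sum to one.
        sims = np.array([feature_similarity(o, b) for o, b in zip(obj_feats, bg_feats)])
        weights = 1.0 / (sims + eps)
        return weights / weights.sum()

    # Toy example with two feature channels (e.g. a color histogram and a gradient histogram).
    rng = np.random.default_rng(0)
    obj_feats = [rng.dirichlet(np.ones(16)) for _ in range(2)]
    bg_feats = [rng.dirichlet(np.ones(16)) for _ in range(2)]
    print("adaptive feature weights:", adaptive_weights(obj_feats, bg_feats))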

Table of Contents:
摘要 (Chinese Abstract)
Abstract
Content
Chapter 1 Introduction
  1.1 Research Background
  1.2 Motivation and Objective
  1.3 Thesis Organization
Chapter 2 Related Work
  2.1 Multiple Trackers
  2.2 Analysis of the Trackers
Chapter 3 Proposed Method
  3.1 Training Reference Tracker with the Property
  3.2 Testing with Appearance Evaluation
  3.3 Adaptive Weighting
Chapter 4 Experimental Results and Analysis
  4.1 Overall Performance
  4.2 Analyze the Selection of Tracker
  4.3 Analyze the Penalty of the PoE
  4.4 Robustness of Our Method
Chapter 5 Conclusion
Reference

