
Student: Yu-Chieh Chien (簡郁潔)
Thesis Title: RANSAC-Like Algorithms for Robust PCA (採用隨機抽樣之強固主成份分析演算法)
Advisor: Shang-Hong Lai (賴尚宏)
Committee Members:
Degree: Master
Department: Department of Computer Science, College of Electrical Engineering and Computer Science
Year of Publication: 2006
Graduation Academic Year: 94
Language: English
Number of Pages: 58
Keywords (Chinese): Principal Component Analysis, Robust Estimation, Random Sampling
Keywords (English): PCA, Robust Estimation, RANSAC
    In the field of computer vision, principal component analysis (PCA) is a statistical method commonly applied to problems involving data analysis or subspace learning. In some situations, the input data for PCA inevitably contain erroneous outlying values, the so-called outliers; robust estimation methods can then be used to detect and remove them. In this thesis, we propose robust PCA algorithms based on random sampling to achieve this goal.
    Overall, the algorithms consist of the following steps. First, a few samples are randomly selected from the whole data set to compute a PCA model; the model is then evaluated with the remaining, unselected data; if the model is not good enough, this sample-and-evaluate procedure is repeated until a good model is found. To handle different types of outliers, we use different sampling strategies: one-dimensional sampling for sample outliers and two-dimensional sampling for intra-sample outliers. In addition, the problems faced by traditional random-sampling methods, namely error scale estimation and the choice of stopping criterion, are solved with an extended MSSE (Modified Selective Statistical Estimator) scheme.
    To understand how the algorithms perform under different conditions, we conducted experiments on both simulated and real data. The results show that, compared with standard PCA and robust PCA based on M-estimation, the proposed random-sampling algorithms produce more accurate results; even for data containing up to 80% sample outliers, or 30% randomly distributed intra-sample outliers, they compute PCA models and reconstructions close to the ground truth.


    In this thesis, we propose RANSAC (RANdom SAmple Consensus) [6] based approaches to achieve robust PCA (Principal Component Analysis) [1] for data containing outliers. This problem is relevant to a variety of vision applications that require data analysis or subspace learning, especially when outliers are unavoidable.
    Overall, our algorithms consist of the following steps. We randomly select a subset of the data to compute a PCA model, evaluate the model on the remaining, unselected data, and repeat this hypothesize-and-test procedure until a good model is found. To deal with different types of outliers, we apply different sampling strategies: one-dimensional sampling for sample outliers and two-dimensional sampling for intra-sample outliers. In addition, the problems arising from traditional RANSAC, namely the need for a prior error scale and for stopping criteria, are solved by the developed extended MSSE (Modified Selective Statistical Estimator) [2] framework.
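    The following Python sketch illustrates the hypothesize-and-test loop for the sample-outlier case only. It is a minimal sketch, not the thesis implementation: the function name, subset size, fixed inlier threshold, and iteration limit are assumptions made for illustration, whereas the thesis derives the error scale and stopping criterion from the extended MSSE framework and handles intra-sample outliers with a separate two-dimensional sampling scheme.

    import numpy as np

    def ransac_like_pca(X, n_components=2, subset_size=10,
                        inlier_threshold=1.0, max_iterations=500, seed=None):
        """RANSAC-style robust PCA sketch. X: (n_samples, n_features) array."""
        rng = np.random.default_rng(seed)
        n_samples = X.shape[0]
        best_inliers = None

        for _ in range(max_iterations):
            # Hypothesize: fit a PCA model to a small random subset of rows
            # (one-dimensional sampling: whole samples are kept or rejected).
            idx = rng.choice(n_samples, size=subset_size, replace=False)
            mean = X[idx].mean(axis=0)
            _, _, Vt = np.linalg.svd(X[idx] - mean, full_matrices=False)
            basis = Vt[:n_components]                 # principal directions

            # Test: reconstruction error of all samples under this model.
            coeffs = (X - mean) @ basis.T
            residual = np.linalg.norm((X - mean) - coeffs @ basis, axis=1)
            inliers = residual < inlier_threshold     # MSSE would estimate this scale instead

            if best_inliers is None or inliers.sum() > best_inliers.sum():
                best_inliers = inliers

        # Refit the final PCA model on the consensus set of inliers only.
        mean = X[best_inliers].mean(axis=0)
        _, _, Vt = np.linalg.svd(X[best_inliers] - mean, full_matrices=False)
        return mean, Vt[:n_components], best_inliers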
    Experiments on simulated and real data demonstrate the superior performance of the proposed RANSAC-like algorithms compared to standard PCA and the robust PCA technique based on M-estimation [5]. The results also verify that the proposed algorithms are robust against up to 80% sample outliers or 30% randomly distributed intra-sample outliers.
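    Purely as a hypothetical usage example of the sketch above (not the data or protocol used in the thesis), one might generate a low-rank data set contaminated with arbitrary rows and check which rows the procedure retains:

    rng = np.random.default_rng(0)
    # 200 inlier rows near a rank-2 subspace of a 20-dimensional space, plus
    # 100 arbitrary rows acting as sample outliers (about 33% contamination).
    clean = rng.normal(size=(200, 2)) @ rng.normal(size=(2, 20))
    noisy_inliers = clean + 0.05 * rng.normal(size=(200, 20))
    sample_outliers = rng.uniform(-10.0, 10.0, size=(100, 20))
    X = np.vstack([noisy_inliers, sample_outliers])

    mean, basis, keep = ransac_like_pca(X, n_components=2, subset_size=10,
                                        inlier_threshold=1.0, seed=0)
    print(keep[:200].mean(), keep[200:].mean())   # fraction of true inliers / outliers retained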

    CHAPTER 1 INTRODUCTION 1
      1.1 PRINCIPAL COMPONENT ANALYSIS 3
      1.2 PREVIOUS WORKS 5
      1.3 CONTRIBUTIONS 10
      1.4 SYSTEM OVERVIEW 11
      1.5 THESIS OUTLINE 12
    CHAPTER 2 BACKGROUND 13
      2.1 RANDOM SAMPLE CONSENSUS (RANSAC) 13
      2.2 MODIFIED SELECTIVE STATISTICAL ESTIMATOR (MSSE) 15
      2.3 PROGRESSIVE SAMPLING CONSENSUS (PROSAC) 16
      2.4 INCREMENTAL PCA 17
    CHAPTER 3 ALGORITHM FOR SAMPLE OUTLIERS 19
      3.1 MODEL GENERATION 20
      3.2 PARAMETER UPDATE 23
      3.3 MODEL VERIFICATION 23
      3.4 SAMPLING ACCELERATION 26
      3.5 ALGORITHM SUMMARY 26
    CHAPTER 4 ALGORITHM FOR INTRA-SAMPLE OUTLIERS 28
      4.1 MODEL GENERATION 29
      4.2 ROBUST RECONSTRUCTION 31
    CHAPTER 5 EXPERIMENTAL RESULTS 37
      5.1 SIMULATIONS 38
        5.1.1 Experiments for Sample Outliers 38
        5.1.2 Experiments for Intra-Sample Outliers 41
      5.2 APPLICATION TO REAL IMAGES 48
        5.2.1 Face Training System 48
        5.2.2 Background Modeling 51
    CHAPTER 6 CONCLUSION 53
      6.1 SUMMARY 53
      6.2 FUTURE WORKS 54
    REFERENCE 57

    [1] N. A. Campbell. Robust Procedures in Multivariate Analysis I: Robust Covariance Estimation. Applied Statistics, vol. 29, no. 3, pp. 231-237, 1980.
    [2] A. Bab-Hadiashar and D. Suter. Robust Segmentation of Visual Data Using Ranked Unbiased Scale Estimate. Robotica: International Journal of Information, Education and Research in Robotics and Artificial Intelligence, vol. 17, pp. 649-660, 1999.
    [3] M. J. Black and A. Rangarajan. On the Unification of Line Processes, Outlier Rejection, and Robust Statistics with Applications in Early Vision. International Journal of Computer Vision, vol. 19, no. 1, pp. 57-91, 1996.
    [4] O. Chum and J. Matas. Matching with PROSAC - Progressive Sample Consensus. In IEEE Conference on Computer Vision and Pattern Recognition, vol. 1, pp. 220-226, June 2005.
    [5] F. De la Torre and M. J. Black. Robust Principal Component Analysis for Computer Vision. In International Conference on Computer Vision, pp. 362-369, 2001.
    [6] M. A. Fischler and R. C. Bolles. Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography. Communications of the ACM, vol. 24, no. 6, pp. 381-395, 1981.
    [7] K. R. Gabriel and S. Zamir. Low Rank Approximation of Matrices by Least Squares with Any Choice of Weights. Technometrics, vol. 21, pp. 489-498, 1979.
    [8] M. J. Greenacre. Theory and Applications of Correspondence Analysis. Academic Press: London, 1984.
    [9] P. J. Huber. Robust Statistics. Wiley: New York, first edition, 1981.
    [10] I. T. Jolliffe. Principal Component Analysis. Springer-Verlag: New York, 1986.
    [11] A. Leonardis and H. Bischof. Robust Recognition Using Eigenimages. Computer Vision and Image Understanding, vol. 78, no. 1, pp. 99-118, 2000.
    [12] D. Lowe. Distinctive Image Features from Scale-Invariant Keypoints. International Journal of Computer Vision, vol. 60, no. 2, pp. 91-110, 2004.
    [13] S. Roweis. EM Algorithm for PCA and SPCA. Neural Information Processing Systems, pp. 626-632, 1997.
    [14] F. H. Ruymgaart. A Robust Principal Component Analysis. Journal of Multivariate Analysis, vol. 11, pp. 485-497, 1981.
    [15] D. Skocaj and A. Leonardis. Weighted and Robust Incremental Method for Subspace Learning. In International Conference on Computer Vision, vol. 2, pp. 1494-1501, 2003.
    [16] D. Skocaj, H. Bischof, and A. Leonardis. A Robust PCA Algorithm for Building Representations from Panoramic Images. In European Conference on Computer Vision, vol. 4, pp. 761-775, May 2002.
    [17] L. Xu and A. L. Yuille. Robust Principal Component Analysis by Self-Organizing Rules Based on Statistical Physics Approach. IEEE Transactions on Neural Networks, vol. 6, no. 1, pp. 131-143, Jan 1995.
