
Author: Lo, Shao-Ting (羅少廷)
Thesis title: Dimension reduction in time series regression model (時間序列之逆迴歸降維法)
Advisor: Chou, Rouh-Jane (周若珍)
Oral defense committee:
Degree: Doctor
Department: Institute of Statistics, College of Science
Year of publication: 2009
Graduation academic year: 97 (ROC calendar)
Language: Chinese
Number of pages: 80
Keywords (Chinese): 反切迴歸 (sliced inverse regression), 動態反切迴歸 (dynamical sliced inverse regression)
Keywords (English): sliced inverse regression, dynamical sliced inverse regression
    Abstract (Chinese): For independent data, Li (1991) and Bura and Cook (2001) proposed the sliced inverse regression (SIR) and parametric inverse regression (PIR) methods, respectively, which use inverse regression to reduce the dimension of the explanatory variables X. For time series data, Xia et al. (2002) proposed the minimum average variance estimation (MAVE) method, which applies to time series as well as independent data; Becker et al. (2000) and Huang (2006) extended the SIR and PIR methods, respectively, by including both current and lagged values of the explanatory variables among the predictors. This dissertation examines the problems that can arise from such extensions and proposes the dynamical sliced inverse regression (DSIR) and dynamical parametric inverse regression (DPIR) methods to achieve dimension reduction. In addition to establishing the theoretical foundations of the DSIR and DPIR methods, we use simulation experiments to compare their dimension and direction estimates with those of MAVE, and find that DSIR outperforms both DPIR and MAVE. Finally, several empirical examples illustrate the forecasting performance of each method.
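    As background (following Li, 1991, rather than anything stated in this record), SIR-type inverse-regression methods posit that the response depends on the predictors only through a few linear combinations,

    $$ y = g(\beta_1^\top X, \beta_2^\top X, \ldots, \beta_K^\top X, \varepsilon), $$

    where K is much smaller than the dimension of X, the vectors \beta_1, \ldots, \beta_K span the effective dimension-reduction (e.d.r.) space, and g is an unknown link function. SIR, PIR, MAVE, DSIR, and DPIR all aim to estimate K and the directions \beta_k without specifying g.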


    Abstract (English): Regression analysis is a popular way of studying the relationship between a response variable y and its explanatory variables X. As the dimension of X grows, the required sample size becomes prohibitively large. Li (1991) and Bura and Cook (2001) proposed the SIR (sliced inverse regression) and PIR (parametric inverse regression) methods, respectively, to reduce the dimension of the explanatory variables for independent data. For time series, Xia et al. (2002) proposed the MAVE (minimum average variance estimation) method, which is applicable to both time series and independent data. Becker et al. (2000) and Huang (2006) extended SIR and PIR to time-dependent data by adding lagged variables to the explanatory variables. In this dissertation, we discuss the issues raised by this extension and propose the DSIR (dynamical SIR) and DPIR (dynamical PIR) methods to reduce dimension. We establish the theoretical foundations of the DSIR and DPIR methods and demonstrate their efficiency by simulation; the simulation studies show that both outperform MAVE in estimating the dimensionality. Finally, we illustrate their forecasting performance in several empirical studies.
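    For illustration only, the following is a minimal numpy sketch of the classical SIR estimator on which the methods above build. The function name sir_directions, the slice count, and the lagged-predictor construction in the usage lines are illustrative choices and do not reproduce the author's DSIR or DPIR procedures.

    import numpy as np

    def sir_directions(X, y, n_slices=10, n_directions=2):
        # Minimal sliced inverse regression (Li, 1991) sketch:
        # estimate e.d.r. directions from the inverse regression curve E[X | y].
        X = np.asarray(X, dtype=float)
        y = np.asarray(y, dtype=float)
        n, p = X.shape

        # 1. Standardize the predictors: Z = (X - mean) Sigma^{-1/2}.
        mu = X.mean(axis=0)
        Sigma = np.cov(X, rowvar=False)
        evals, evecs = np.linalg.eigh(Sigma)
        inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
        Z = (X - mu) @ inv_sqrt

        # 2. Slice the range of y into roughly equal-count slices.
        slices = np.array_split(np.argsort(y), n_slices)

        # 3. Weighted covariance of the within-slice means of Z.
        M = np.zeros((p, p))
        for idx in slices:
            m_h = Z[idx].mean(axis=0)
            M += (len(idx) / n) * np.outer(m_h, m_h)

        # 4. Leading eigenvectors of M, mapped back to the original X scale.
        w, v = np.linalg.eigh(M)
        top = v[:, np.argsort(w)[::-1][:n_directions]]
        beta = inv_sqrt @ top
        return beta / np.linalg.norm(beta, axis=0)

    # Hypothetical usage: augment the predictors with their own lagged values
    # (the kind of extension the abstract attributes to Becker et al. (2000)
    # and Huang (2006)), then apply SIR to the augmented predictor matrix.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 3))
    y = np.sin(X @ np.array([1.0, -1.0, 0.5])) + 0.1 * rng.normal(size=500)
    lag = 2
    X_aug = np.column_stack([X[lag:], X[lag - 1:-1], X[:-lag]])  # X_t, X_{t-1}, X_{t-2}
    beta_hat = sir_directions(X_aug, y[lag:], n_slices=8, n_directions=1)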

    Table of Contents
    1 Introduction ... 1
    2 Literature Review ... 4
        2.1 The SIR method ... 4
        2.2 The SAVE method ... 6
        2.3 The DAME method ... 6
        2.4 The PIR method ... 6
        2.5 The MAVE method ... 8
        2.6 The DSIR method ... 9
    3 Dimension Reduction Methods for Time Series Data ... 10
        3.1 Introduction to the DSIR method ... 10
        3.2 Direction estimation and simulation analysis ... 13
            3.2.1 Equivalent vectors ... 13
            3.2.2 Simulation experiments ... 15
        3.3 Modification of the DSIR method and simulation analysis ... 20
        3.4 The DPIR method: introduction, modification, and simulation ... 23
            3.4.1 Introduction to the DPIR method ... 23
            3.4.2 Modification of the DPIR method and simulation analysis ... 26
        3.5 Comparison of the DSIR, DPIR, and MAVE methods ... 28
            3.5.1 Simulation experiments for the MAVE method ... 28
            3.5.2 Comparison of the modified DSIR and DPIR methods with MAVE ... 29
        3.6 Function estimation for the modified DSIR method ... 31
    4 Empirical Studies ... 37
        4.1 Empirical analysis of the effects of pollution and climate on circulatory and respiratory patients ... 37
        4.2 Empirical analysis of TSMC stock returns ... 46
        4.3 Empirical analysis of Taiwan's unemployment rate ... 55
    5 Conclusions and Discussion ... 64
    Appendix ... 68
    References ... 77

    1. Becker, C., Fried, R., and Gather, U. (2000), Applying sliced inverse regression to dynamical data, In Mathematical Statistics with Application in Biometry (J. Kunert, G. Trenkler, eds.) 201-214, Josef Eul, Lohmar.
    2. Bollerslev, T. (1986), Generalized autoregressive conditional heteroskedasticity, Journal of Econometrics, 31, 307-327.
    3. Breidt, F. J. and Davis, R. A. (1991), Time-reversibility, identifiability and independence of innovations for stationary time series, Journal of Time Series Analysis, 13, 377-390.
    4. Breiman, L., Friedman, J. H., Olshen, R. A., and Stone, C. J. (1984), Classification and regression trees, Belmont, CA: Wadsworth.
    5. Bura, E. and Cook, R. D. (2001), Estimating the structural dimension of regressions via parametric inverse regression, Journal of the Royal Statistical Society, Series B, 63, 393-410.
    6. Bura, E. and Cook, R. D. (2001), Extending sliced inverse regression: the weighted chi-square test, Journal of the American Statistical Association, 96, 996-1003.
    7. Carr, D. B., Littlefield, R. J., Nicholson, W. L., and Littlefield, J. S. (1987), Scatterplot matrix techniques for large N, Journal of the American Statistical Association, 82, 424-437.
    8. Chaudhuri, P., Huang, M. C., Loh, W. Y., and Yao, R. (1994), Piecewise-polynomial regression trees, Statistica Sinica, 4, 143-167.
    9. Chen, L. S. (1995), Sliced inverse regression for time series analysis, Ph.D. thesis, University of California, Los Angeles.
    10. Cook, R. D. (1994b), Using dimension-reduction subspace to identify important inputs in models of physical systems, in 1994 Proceedings of the Section on Physical and Engineering Sciences, Alexandria, VA: American Statistical Association, pp. 18-25.
    11. Cook, R. D. (1996), Graphics for regressions with a binary response, Journal of the American Statistical Association, 91, 983-992.
    12. Cook, R. D. and Li, B. (2002), Dimension reduction for the conditional mean, The Annals of Statistics, 30, 455-474.
    13. Cook, R. D. and Li, B. (2004), Determining the dimension of iterative Hessian transformation, The Annals of Statistics, 32, 2501-2531.
    14. Cook, R. D. and Weisberg, S. (1991), Sliced inverse regression for dimension reduction: comment, Journal of the American Statistical Association, 86, 328-332.
    15. Ferre, L. and Yao, A. F. (2003), Functional sliced inverse regression, Statistics, 37, 475-488.
    16. Gather, U., Hilker, T. and Becker, C. (2001), A robustified version of sliced inverse regression, In Statistics in Genetics and in the Environmental Sciences (L. T. Fernholz, S. Morganthaler and W. Stahel, eds.) 145-157, Birkhauser, Basel.
    17. Hsing, T. and Carroll, R. J. (1992), An asymptotic theory for sliced inverse regression, The Annals of Statistics, 20, 1040-1061.
    18. Hu, Y. P. (2009), Identifying the time invariant effective dimension reduction space, manuscript.
    19. Huber, P. (1987), Experiences with three-dimensional scatterplots, Journal of the American Statistical Association, 82, 448-454.
    20. Kato, T. (1976), Perturbation theory for linear operators (2nd ed.), Berlin: Springer-Verlag.

    21. Li, B., Cook, R. D. and Chiaromonte, F. (2003), Dimension reduction for conditional mean in regression with categorical predictors, The Annals of Statistics, 31, 1636-1668.
    22. Li, B. and Dong, Y. X. (2009), Dimension reduction for nonelliptically distributed predictors, The Annals of Statistics, 37, 1272-1298.
    23. Li, K. C. (1991), Sliced inverse regression for dimension reduction, Journal of the American Statistical Association, 86, 316-342.
    24. Li, K. C. (1992), On principal Hessian directions for data visualization and dimension reduction: another application of Stein's lemma, Journal of the American Statistical Association, 87, 1025-1039.
    25. Li, K. C., Aragon, Y., Shedden, K., et al. (2003), Dimension reduction for multivariate response data, Journal of the American Statistical Association, 98, 99-109.
    26. Li, K. C., Lue, H. H., and Chen, C. H. (2000), Interactive tree-structured regression via principal Hessian directions, Journal of the American Statistical Association, 95, 547-560.
    27. Li, Y. X. and Zhu, L. X. (2007), Asymptotics for sliced average variance estimation, The Annals of Statistics, 35, 41-69.
    28. Lue, H. H. (2008), Sliced average variance estimation for censored data, Communications in Statistics-Theory and Methods, 37, 3276-3286.
    29. Lue, H. H. (2009), Sliced inverse regression for multivariate response regression, Journal of Statistical Planning and Inference, 139, 2656-2664.
    30. Naik, P., and Tsai, C. L. (2005), Constrained inverse regression for incorporating prior information, Journal of the American Statistical Association, 100, 204-211.
    31. Schott, J. R. (1994), Determining the dimensionality in sliced inverse regression, Journal of the American Statistical Association, 89, 141-148.
    32. Sheather, S. J., McKean, J. W., and Crimin, K. (2008), Sliced mean variance-covariance inverse regression, Computational Statistics and Data Analysis, 52, 1908-1927.
    33. Stone, C. J. (1985), Additive regression and other nonparametric models, The Annals of Statistics, 13, 689-705.
    34. Tong, H. (1990), Non-linear time series: a dynamical system approach, Oxford: Oxford University Press.
    35. Xia, Y. C., Tong, H., Li, W. and Zhu, L. X. (2002), An adaptive estimation of dimension reduction space, Journal of the Royal Statistical Society, Series B, 64, 363-388.
    36. Zhao, J. L. and Xu, X. Z. (2009), Dimension reduction based on weighted variance estimate, Science in China Series A: Mathematics, 52, 539-560.
    37. Zhou, J. H. and He, X. M. (2008), Dimension reduction based on constrained canonical correlation and variable filtering, The Annals of Statistics, 36, 1649-1668.
    38. Zhu, L. X. and Fang, K. T. (1996), Asymptotics for kernel estimate of sliced inverse regression, The Annals of Statistics, 24, 1053-1068.
    39. Zhu, L. X. and Ng, K. W. (1995), Asymptotics of sliced inverse regression, Statistica Sinica, 5, 727-736.
    40. Huang, C. Z. (2006), A study of dimension reduction in time series regression (in Chinese), Master's thesis, Institute of Statistics, National Tsing Hua University.

    Full-text availability: not authorized for public access (campus and off-campus networks).
