| Field | Value |
|---|---|
| Author | 栢家凱 (Bai, Jia-Kai) |
| Thesis Title | 以貝氏推論進行動態卜瓦松迴歸模型之參數更新 (On-line Updating for Dynamic Poisson Regression Using a Bayesian Framework) |
| Advisor | 徐南蓉 (Hsu, Nan-Jung) |
| Committee Members | 曾勝滄 (Tseng, Sheng-Tsaing), 汪上曉 (Wong, Shan-Hill) |
| Degree | Master |
| Department | Institute of Statistics, College of Science |
| Year of Publication | 2019 |
| Academic Year | 107 |
| Language | Chinese |
| Pages | 37 |
| Keywords (Chinese) | 貝氏推論 (Bayesian inference), 動態卜瓦松迴歸模型 (dynamic Poisson regression model), 卡爾曼濾波器 (Kalman filter) |
| Keywords (English) | Bayesian, Dynamic Poisson Regression, Kalman filter |
This thesis is concerned with monitoring parameter changes for count data. When the data are continuous, the parameters can be estimated with a linear regression model and updated via the Kalman filter. For count data, this thesis approximates the log-likelihood function of the Poisson regression model by a normal density via Taylor expansion and derives the posterior distribution of the parameters through Bayes' theorem, yielding a non-linear filter that updates the parameters systematically and thereby monitors their changes over time.

In addition, identifying important factors is a key issue in model fitting. Adding an L1-norm penalty to the estimating equations of a generalized linear model typically makes the objective non-differentiable. This thesis therefore incorporates a regularization constraint on the parameter estimates into the updating procedure, so that important factors can be identified while the parameters are being monitored.

This study develops predictive inference only for the Poisson model, and examines the benefits of the Lasso, Group Lasso, Adaptive Lasso, and Adaptive Group Lasso penalties. The proposed method, however, applies broadly to other generalized linear models and can accommodate more general parameter constraints.
This thesis proposes an on-line updating scheme for the parameters of a dynamic Poisson regression model. In particular, regularization is incorporated into the updating scheme to simultaneously identify important variables. The proposed method is a two-stage procedure: (1) re-estimate the parameters, and (2) select the important variables. In the first (re-estimation) stage, the likelihood function of the Poisson regression model and the prior are approximated by normal densities via Taylor expansion, so the resulting posterior for the parameters remains normal and its mean and variance can be computed recursively. In the second stage, the updated posterior mean estimates are regularized to remove insignificant effects from the fit. As a result, the algorithm has simple closed forms and is as easy to implement as the Kalman filter. For illustration, the proposed method is applied to failure-count data from a multi-stage, multi-tool manufacturing process, where the updating scheme effectively identifies good production paths during production.
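The two-stage procedure sketched above can be illustrated with a minimal update step. This is only a sketch under the stated assumptions, not the thesis's exact algorithm: it assumes a random-walk state equation with known evolution covariance `W`, a second-order Taylor (Laplace) approximation of the Poisson log-likelihood around the prior mean, and a simple soft-thresholding step standing in for the Lasso-style regularization; the function name `poisson_filter_step` and the penalty parameter `lam` are hypothetical.

```python
import numpy as np

def poisson_filter_step(m, C, x, y, W, lam=0.1):
    """One on-line update for a dynamic Poisson regression sketch.

    Stage 1: normal approximation of the Poisson likelihood via a
    second-order Taylor expansion around the prior mean, giving a
    Kalman-like closed-form posterior update.
    Stage 2: soft-thresholding of the posterior mean (a proximal step
    for an L1 penalty) to drop insignificant coefficients.
    """
    # Prediction: random-walk evolution inflates the prior covariance.
    R = C + W
    # Taylor expansion of the Poisson log-likelihood l(b) = y*x'b - exp(x'b)
    # around the prior mean m: gradient (y - rate)*x, curvature rate*x*x'.
    rate = np.exp(x @ m)
    R_inv = np.linalg.inv(R)
    C_new = np.linalg.inv(R_inv + rate * np.outer(x, x))  # posterior covariance
    m_new = m + C_new @ ((y - rate) * x)                  # posterior mean
    # Stage 2: Lasso-style shrinkage of the updated mean.
    m_reg = np.sign(m_new) * np.maximum(np.abs(m_new) - lam, 0.0)
    return m_reg, C_new

# Example: one observation with a 3-dimensional covariate vector.
m0, C0 = np.zeros(3), np.eye(3)
W = 0.01 * np.eye(3)
x_t, y_t = np.array([1.0, 0.5, 0.0]), 3
m1, C1 = poisson_filter_step(m0, C0, x_t, y_t, W, lam=0.05)
```

Because the approximate posterior stays normal, the same step can be applied recursively to each incoming count, mirroring the recursive mean/variance computation described in the abstract.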