| Student | 高穎 Kao, Ying |
|---|---|
| Thesis Title | 應用於遠端心率偵測的雙通道時間網路 Siamese Temporal Network for Remote Heart Rate Estimation |
| Advisor | 許秋婷 Hsu, Chiou-Ting |
| Committee Members | 陳煥宗 Chen, Hwann-Tzong; 邵皓強 Shao, Hao-Chiang |
| Degree | Master |
| Department | Department of Computer Science, College of Electrical Engineering and Computer Science |
| Year of Publication | 2019 |
| Graduation Academic Year | 107 (ROC calendar) |
| Language | English |
| Number of Pages | 23 |
| Keywords (Chinese) | 遠端心率偵測、孿生網絡、卷積類神經網路 |
| Keywords (English) | Remote heart rate estimation, Siamese network, Convolutional neural network |
In recent years, many researchers have turned their attention to remote heart rate (HR) estimation from facial videos using remote photoplethysmography (rPPG). Existing methods ignore the stability of HR, so the predicted HR may change dramatically within a short time, which is inconsistent with human physiology. To tackle this problem, we propose a Siamese network for remote HR estimation: to stabilize training and obtain more stable HR estimates, the model is learned simultaneously from two temporally shifted spatial-temporal maps. We also develop an rPPG-similarity stream, which uses a circular cross-correlation layer to extract reliable and periodic rPPG features from facial regions. Experimental results on the COHFACE and PURE datasets demonstrate that our proposed method achieves state-of-the-art performance for HR estimation.
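The thesis implementation is not part of this record, but the operation named in the abstract, circular cross-correlation, can be illustrated with a short sketch. The NumPy snippet below is only a minimal, stand-alone illustration of that operation on toy 1-D signals, not the thesis's layer: the function name, frame rate, pulse frequency, and test signals are assumptions made here for demonstration. Because two signals sharing the same period produce a correlation profile that peaks at their relative phase shift, a layer of this kind can check whether two facial regions carry the same periodic pulse.

```python
import numpy as np

def circular_cross_correlation(x, y):
    """Circular cross-correlation of two equal-length real 1-D signals via the FFT.

    Returns c with c[n] = sum_m x[m] * y[(m + n) mod N], computed in O(N log N).
    (Illustrative helper; the name and interface are assumptions, not the thesis code.)
    """
    X = np.fft.rfft(x)
    Y = np.fft.rfft(y)
    return np.fft.irfft(np.conj(X) * Y, n=len(x))

# Toy example: a ~72 bpm (1.2 Hz) "pulse" sampled at 30 fps, and a copy of it
# shifted by 10 samples with a little noise, standing in for rPPG traces taken
# from two facial regions that carry the same heartbeat.
fs, f_pulse, N = 30.0, 1.2, 256
t = np.arange(N) / fs
x = np.sin(2 * np.pi * f_pulse * t)
y = np.roll(x, 10) + 0.05 * np.random.randn(N)

c = circular_cross_correlation(x, y)
period = int(round(fs / f_pulse))        # 25 samples per pulse period
lag = int(np.argmax(c[:period]))         # search within one pulse period
print("estimated relative shift:", lag)  # expected: 10 (up to small noise)
```

For signals of the same length, the FFT-based form is equivalent to the direct sum over circular shifts but avoids the O(N^2) cost, which is the usual reason such a correlation is computed in the frequency domain.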