
Author: 梁耀升 (Liang, Yao-Sheng)
Thesis Title: 應用於可攜式電子鼻資料分類之多類支持向量機晶片
An On-Chip Multi-Class Support Vector Machine Applied to Portable Electronic Nose Data Classification
Advisor: 鄭桂忠 (Tang, Kea-Tiong)
Committee Members: 陳新, 黃聖傑
Degree: Master
Department: College of Electrical Engineering and Computer Science, Department of Electrical Engineering
Year of Publication: 2011
Academic Year of Graduation: 99 (ROC calendar)
Language: Chinese
Number of Pages: 59
Keywords: electronic nose; support vector machine; Gaussian function
  • Electronic noses have been applied widely across many fields in recent years. Traditionally, however, an electronic nose system is a bulky instrument; for convenient use in daily life it must be developed into a portable device. Moreover, in applications such as environmental monitoring, cost and size constraints rule out operating a large electronic nose, so miniaturizing the system is essential. An electronic nose operates by drawing in an odor sample, letting it react with an array of gas sensors, converting and processing the resulting signals, and finally producing a result through data classification; it is thus a device that integrates several disciplines. The odor classification stage relies on pattern recognition techniques. These classification algorithms usually run on a computer or microprocessor, which is not cost-effective for a portable device, so a low-power analog integrated circuit implementation is the better approach.
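    A minimal software sketch of the sensing chain just described (odor sample, gas-sensor array response, signal processing, feature extraction) is given below; the synthetic response model, the eight-sensor array size, and the steady-state feature are illustrative assumptions, not the thesis hardware.

# Sketch of the front end of an electronic-nose pipeline: a gas-sensor array
# responds to an odor pulse, and the transient response is reduced to one
# feature per sensor for later classification.  Synthetic data only; the
# response model and feature choice are assumptions, not the thesis circuits.
import numpy as np

rng = np.random.default_rng(0)

def sensor_array_response(n_sensors=8, n_samples=200):
    """Synthetic transient response of a sensor array to one odor pulse."""
    t = np.linspace(0.0, 1.0, n_samples)[:, None]           # time axis
    peak = rng.uniform(0.5, 1.5, n_sensors)                  # per-sensor sensitivity
    noise = rng.normal(0.0, 0.01, (n_samples, n_sensors))    # measurement noise
    return peak * (1.0 - np.exp(-5.0 * t)) + noise           # rise toward steady state

def extract_features(response):
    """Steady-state change of each sensor: one feature per sensor."""
    return response[-20:].mean(axis=0) - response[:20].mean(axis=0)

features = extract_features(sensor_array_response())
print("feature vector:", np.round(features, 3))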
    Many algorithms have been developed in the field of pattern recognition. Since the support vector machine (SVM) was proposed in the 1990s, statistical learning theory has shown its principles to be advantageous for classification in many respects, and it has therefore been developed and applied extensively in many fields. This thesis proposes a three-class support vector machine chip for a portable electronic nose. The one-against-one method extends the SVM from binary to multi-class classification, and both parameter learning and odor classification are implemented on the same chip, with parameter training realized by a further simplified recurrent neural network circuit. The chip was fabricated in a TSMC 0.18 μm CMOS process and tested with gas data measured from experimental gas sensors. Statistical results of the odor classification experiments show a recognition rate above 70%, and the chip consumes only 125 μW when operating at 1.8 V. In actual measurements the minimum operating voltage reaches 1.2 V, at which the power consumption drops to just 58 μW, making this work well suited to portable electronic nose applications.
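    The one-against-one scheme adopted above can be sketched in software as follows: for three odor classes, one binary SVM with a Gaussian (RBF) kernel, K(x, z) = exp(-gamma * ||x - z||^2), is trained for each of the C(3,2) = 3 class pairs, and a sample is assigned to the class that wins the most pairwise votes. The toy data and the use of scikit-learn's SVC are illustrative assumptions; in the thesis, both parameter training (via a simplified recurrent neural network circuit) and classification are realized in low-power analog hardware.

# One-against-one multi-class SVM: C(3,2) = 3 binary Gaussian-kernel SVMs,
# one per class pair, combined by majority voting.  Toy data and scikit-learn
# stand in for the analog circuits described in the thesis.
from itertools import combinations
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)

# Toy 3-class data set standing in for odor feature vectors (4 features each).
X = np.vstack([rng.normal(loc=c, scale=0.3, size=(20, 4)) for c in range(3)])
y = np.repeat(np.arange(3), 20)

# Train one binary SVM with a Gaussian (RBF) kernel per class pair.
pair_models = {}
for a, b in combinations(range(3), 2):
    mask = (y == a) | (y == b)
    pair_models[(a, b)] = SVC(kernel="rbf", gamma="scale").fit(X[mask], y[mask])

def classify(x):
    """Majority vote over the three pairwise classifiers."""
    votes = np.zeros(3, dtype=int)
    for model in pair_models.values():
        votes[int(model.predict(x.reshape(1, -1))[0])] += 1
    return int(np.argmax(votes))

predictions = np.array([classify(x) for x in X])
print("training accuracy:", (predictions == y).mean())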


    Abstract (Chinese) i
    Abstract (English) ii
    Acknowledgements iii
    Table of Contents iv
    List of Figures vii
    List of Tables x
    Chapter 1  Introduction 1
      1.1 Research Background 1
      1.2 Research Motivation 3
      1.3 Chapter Overview 4
    Chapter 2  Literature Review 5
      2.1 Introduction to Support Vector Machines 5
        2.1.1 Development of Support Vector Machines 5
        2.1.2 Linear Support Vector Machines 5
        2.1.3 The Linearly Non-Separable Case 8
        2.1.4 Nonlinear Support Vector Machines 9
        2.1.5 Common Kernel Functions 11
        2.1.6 Multi-Class Support Vector Machine Methods 11
      2.2 Learning Methods 15
      2.3 Related Circuit Developments 17
    Chapter 3  System Architecture and Simulation 19
      3.1 System Specifications 19
      3.2 Binary Support Vector Machine 20
        3.2.1 Binary Support Vector Machine Learner 20
        3.2.2 Classifier 22
      3.3 Building-Block Circuits 23
        3.3.1 Gaussian Kernel Function Implementation 23
        3.3.2 Current-Direction Selection Circuit 26
        3.3.3 Projection Operator Circuit 27
        3.3.4 Low-Pass Filter 29
        3.3.5 Integrator 31
      3.4 Parameter Ranges 33
      3.5 Simulation Results 35
        3.5.1 Gas Data Source 35
        3.5.2 System Simulation Results 37
      3.6 Circuit Layout 41
    Chapter 4  Measurement Results and Discussion 42
      4.1 Chip Measurement Environment 42
        4.1.1 Chip Micrograph 42
        4.1.2 Measurement Environment 43
      4.2 Building-Block Circuit Measurement Results 43
        4.2.1 Pump Circuit 43
        4.2.2 Low-Pass Filter 45
      4.3 Support Vector Machine Measurement Results 46
        4.3.1 Parameter Training Measurements 46
        4.3.2 Gas Data Classification Results 48
      4.4 Literature Comparison Table 51
    Chapter 5  Conclusion 52
      5.1 Conclusion 52
      5.2 Future Work 53
    References 54


    Full-text availability: not authorized for public release (campus network)
    Full-text availability: not authorized for public release (off-campus network)
