| Field | Value |
|---|---|
| Graduate Student | 陳相廷 Hsiang-Ting Chen |
| Thesis Title | 視點追蹤之大眾化三維投影顯示系統 AR for the Masses: Building a Head-Tracked Projector-Based 3D Display with Off-the-Shelf Components |
| Advisor | 張鈞法 Chun-Fa Chang |
| Committee Members | |
| Degree | Master |
| Department | College of Electrical Engineering and Computer Science, Department of Computer Science |
| Publication Year | 2004 |
| Graduation Academic Year | 92 |
| Language | English |
| Number of Pages | 52 |
| Chinese Keywords | Virtual Environment, Augmented Reality, Projector, Camera, Calibration |
| English Keywords | Virtual Reality, Augmented Reality, Projector, Camera, Calibration |
Abstract (Chinese):
With the rapid progress of computer graphics over the past decade, we can now render complex, photorealistic three-dimensional models in real time. However, the devices used to display such 3D models are still, for the most part, the two-dimensional CRT or LCD monitors of personal computers and workstations. In other words, when a 3D model is shown on a 2D screen, its depth information has already been discarded (hidden). Although stereo-vision techniques can simulate the effect of depth, this is still a passive form of presentation: when users move their viewpoint, they should see the part of the object corresponding to the new viewpoint rather than a static image. In most existing applications, users still have to use a mouse or keyboard to manipulate the virtual objects or to change their own viewpoint, which is inconvenient and unnatural. To give users a truly convincing sense of virtual reality, we believe head-tracking is an indispensable element.
In this thesis we propose a head-tracked, projector-based 3D display system for the masses. With this system, a user can observe virtual objects in the most intuitive way: he or she can freely move the head, lean forward, or step back, and the images seen will look as realistic as if the objects were actually placed in front of the user. Because our display is projector-based, virtual objects can even be blended into the existing physical environment, producing an augmented-reality effect.
To let everyone enjoy our 3D display system, we have also put great effort into making it lightweight, low-cost, and easy to set up, avoiding the drawbacks of most existing augmented-reality and virtual-reality systems, which require users to wear cumbersome gear, rely on expensive equipment, and are difficult to install.
Abstract (English):
With the rapid development of computer graphics, we can now render photorealistic three-dimensional models in real time. However, the dominant device for displaying such realistic models is still the two-dimensional CRT or LCD monitor of a personal computer or workstation. Although stereo vision can compensate for the missing depth information, such a display is still passive: when users move their head, they expect to see the portion of the virtual objects corresponding to the new viewpoint rather than a still image. In most existing applications, the user still has to use a mouse or keyboard to manipulate the virtual objects or to change the view, which is often inconvenient and unnatural. To create a compelling 3D experience, we believe that head-tracking is an essential element.
In this thesis we propose a head-tracked 3D display system that gives the user more intuitive access to virtual objects. By simply moving his or her head, the user can observe the virtual objects from any desired direction and enjoy the illusion that they really exist. More ambitiously, our system can also make the virtual objects appear to coexist with the surrounding environment. In addition, so that everyone can truly enjoy the experience of a 3D display, we have put considerable effort into making the system comfortable to use, low-cost, and portable, in contrast to most existing AR (augmented reality) and VR (virtual reality) systems, which require users to wear cumbersome devices, are expensive, and are rarely portable.
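The central mechanism described above, re-rendering the scene so that the image on a fixed display surface stays perspectively correct for the viewer's current head position, is commonly implemented with an off-axis (asymmetric) view frustum. The C++ sketch below illustrates one standard formulation of this idea; it is not code from the thesis, and all names, parameters, and the glFrustum-style matrix convention are illustrative assumptions.

```cpp
// Minimal sketch of a head-tracked (off-axis) projection: given a fixed,
// planar display surface described by three of its corners and the tracked
// eye position, build the asymmetric view frustum that keeps the projected
// image perspectively correct for that eye.
// (Illustrative only; not taken from the thesis.)
#include <cmath>

struct Vec3 { double x, y, z; };

static Vec3   sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3   cross(Vec3 a, Vec3 b) {
    return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}
static Vec3 normalize(Vec3 v) {
    double len = std::sqrt(dot(v, v));
    return {v.x / len, v.y / len, v.z / len};
}

// Fills a 4x4 column-major projection matrix (glFrustum-style) for eye
// position `pe`, screen corners `pa` (lower-left), `pb` (lower-right),
// `pc` (upper-left), and near/far clip distances `n`, `f`.
void headTrackedProjection(Vec3 pa, Vec3 pb, Vec3 pc, Vec3 pe,
                           double n, double f, double m[16]) {
    // Orthonormal basis of the screen plane.
    Vec3 vr = normalize(sub(pb, pa));   // screen right
    Vec3 vu = normalize(sub(pc, pa));   // screen up
    Vec3 vn = normalize(cross(vr, vu)); // screen normal, toward the viewer

    // Vectors from the eye to the screen corners.
    Vec3 va = sub(pa, pe);
    Vec3 vb = sub(pb, pe);
    Vec3 vc = sub(pc, pe);

    // Perpendicular distance from the eye to the screen plane.
    double d = -dot(va, vn);

    // Frustum extents on the near plane; asymmetric when the head is off-center.
    double l = dot(vr, va) * n / d;
    double r = dot(vr, vb) * n / d;
    double b = dot(vu, va) * n / d;
    double t = dot(vu, vc) * n / d;

    // Standard glFrustum matrix, stored column-major.
    m[0] = 2 * n / (r - l); m[4] = 0;               m[8]  = (r + l) / (r - l);  m[12] = 0;
    m[1] = 0;               m[5] = 2 * n / (t - b); m[9]  = (t + b) / (t - b);  m[13] = 0;
    m[2] = 0;               m[6] = 0;               m[10] = -(f + n) / (f - n); m[14] = -2 * f * n / (f - n);
    m[3] = 0;               m[7] = 0;               m[11] = -1;                 m[15] = 0;
    // A complete renderer would follow this with a view transform that rotates
    // the world into the screen basis (vr, vu, vn) and translates by -pe.
}
```

In a head-tracked renderer, this projection (together with the accompanying view transform) would be recomputed every frame from the latest head position reported by the tracker, so the displayed image continuously follows the viewer's movement.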