
Author: Yung-Chiu Liu (劉用九)
Title: Model-based IBR for Indoor Virtual Walkthrough (以模型與影像為基礎的虛擬室內漫遊系統)
Advisor: Yung-Chang Chen (陳永昌)
Oral Defense Committee:
Degree: Master
Department: Department of Electrical Engineering, College of Electrical Engineering and Computer Science
Year of Publication: 2005
Graduation Academic Year: 93 (2004-2005)
Language: English
Number of Pages: 54
Keywords (Chinese): 虛擬實境, 影像繪圖
Keywords (English): virtual reality, image-based rendering


Abstract

Image-based rendering (IBR) is widely used in computer graphics and virtual reality to provide a photorealistic environment in which the user can walk around freely. However, because IBR carries no model or 3D information about the scene, new objects such as a person or a piece of furniture cannot be added to the system. This limitation is a serious drawback for applications such as virtual video conferencing and interior design, so a method is needed that offers both photographic image quality and the 3D information a model provides.

In this thesis, a model-based IBR system is proposed. First, several panoramic images are selected and the salient vertical edges (wall corners) are marked on them; from these edges a basic wall model of the environment is constructed. For texture mapping, the user's current virtual position is used to pick a suitable panoramic image, which is then resampled to yield wall textures free of panoramic distortion. When the scene is complex, however, the resampling workload is heavy and rendering lags. To raise the display frame rate, several acceleration mechanisms are designed, including adaptive texture resolution, determination of the walls visible over the full 360 degrees, and determination of the walls inside the current field of view; together they reduce the time spent on rendering. In addition, illumination is an important part of computer graphics, so this thesis also discusses the position and color of light sources and the decay of a point light, remedying the absence of lighting information in traditional image-based rendering. Objects added to the scene therefore receive correct lighting and shadow effects, so that not only the scene itself but also inserted objects and characters look realistic.
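The texture-resampling step summarized above, picking a panoramic image near the current viewpoint and remapping it into distortion-free wall textures, can be sketched as follows. The function and parameter names, and the cylindrical-projection assumptions (camera at half wall height, cylinder focal length of W/(2*pi) pixels), are illustrative and not taken from the thesis:

```python
import math
import numpy as np

def wall_texture_from_panorama(pano, cam, p0, p1, wall_height,
                               out_w=256, out_h=256):
    """Resample a cylindrical panorama into a distortion-free wall texture.

    pano: H x W x 3 cylindrical panorama, cam: (x, y) camera position on
    the floor plan, p0 -> p1: the wall's footprint endpoints.  Assumes the
    camera sits at half the wall height and the cylinder's focal length
    is W / (2*pi) pixels (one full turn spans the image width).
    """
    H, W = pano.shape[:2]
    f = W / (2.0 * math.pi)              # cylinder radius in pixels
    tex = np.zeros((out_h, out_w, 3), dtype=pano.dtype)
    for u in range(out_w):
        # Point on the wall footprint corresponding to this texture column.
        t = (u + 0.5) / out_w
        wx = p0[0] + t * (p1[0] - p0[0])
        wy = p0[1] + t * (p1[1] - p0[1])
        dx, dy = wx - cam[0], wy - cam[1]
        dist = math.hypot(dx, dy)        # horizontal distance to the wall
        theta = math.atan2(dy, dx) % (2.0 * math.pi)
        px = int(theta / (2.0 * math.pi) * W) % W
        for v in range(out_h):
            # Height above (+) or below (-) the camera for this row.
            h = (0.5 - (v + 0.5) / out_h) * wall_height
            py = int(H / 2 - f * h / dist)
            if 0 <= py < H:
                tex[v, u] = pano[py, px]
    return tex
```

In a real system this per-texel loop would be vectorized or pushed to the GPU; the nested loops here only make the cylinder-to-plane mapping explicit.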
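The visible-wall determination used to cut rendering time can likewise be illustrated with a simple back-face test on the floor plan. The convention that a wall's front side lies to the left of its p0 -> p1 direction is an assumption of this sketch, not a rule stated in the thesis:

```python
def visible_walls(cam, walls):
    """Back-face culling on the floor plan: keep only the walls whose
    front side faces the camera.

    Each wall is ((x0, y0), (x1, y1)); its interior-facing normal is
    assumed to lie to the left of the p0 -> p1 direction.
    """
    result = []
    for (p0, p1) in walls:
        ex, ey = p1[0] - p0[0], p1[1] - p0[1]
        nx, ny = -ey, ex                  # left-hand normal of the edge
        vx, vy = cam[0] - p0[0], cam[1] - p0[1]
        if nx * vx + ny * vy > 0:         # camera on the front side
            result.append((p0, p1))
    return result
```

A second pass would further intersect the surviving walls with the view frustum, mirroring the thesis's split between 360-degree visibility and in-view visibility.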
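The point-light decay and wall-occlusion ideas from the lighting discussion can be sketched as below. The attenuation constant k and the all-or-nothing occlusion rule are simplifying assumptions of this example, not the thesis's exact model:

```python
import math

def segments_intersect(p, q, a, b):
    """2D segment intersection via signed-area orientation tests."""
    def cross(o, u, v):
        return (u[0] - o[0]) * (v[1] - o[1]) - (u[1] - o[1]) * (v[0] - o[0])
    d1, d2 = cross(a, b, p), cross(a, b, q)
    d3, d4 = cross(p, q, a), cross(p, q, b)
    return (d1 * d2 < 0) and (d3 * d4 < 0)

def point_light_intensity(light_pos, light_color, surface_pos, walls, k=1.0):
    """Inverse-square falloff for a point light with simple wall occlusion.

    walls: list of ((x0, y0), (x1, y1)) segments on the floor plan; a
    light whose path to the surface crosses any wall contributes nothing.
    """
    d = math.dist(light_pos, surface_pos)
    for (a, b) in walls:
        if segments_intersect(light_pos, surface_pos, a, b):
            return (0.0, 0.0, 0.0)       # light blocked by a wall
    atten = 1.0 / (1.0 + k * d * d)      # inverse-square decay term
    return tuple(c * atten for c in light_color)
```

The `1 / (1 + k*d*d)` form keeps the intensity finite at the light's own position; other decay curves could be substituted without changing the occlusion logic.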

Table of Contents

Abstract  i
Chapter 1  Introduction  1
  1.1 Motivation of This Work  2
  1.2 Related Work  2
  1.3 Thesis Organization  4
Chapter 2  Construction of Edge and Plane Model  5
  2.1 Panoramic Cylindrical Optics  5
  2.2 The Construction of Vertical Edges  6
  2.3 Error Estimation of the Constructed Model  9
  2.4 Typical Result of Edge Model Construction  11
Chapter 3  Model-based IBR Texture Mapping  13
  3.1 Texture Mapping of the Walls  13
  3.2 Modeling the Ceiling and the Floor  16
  3.3 Updating the Texture Mapping  20
  3.4 Experimental Results  21
Chapter 4  Accelerating the Speed of Rendering  24
  4.1 Determining the Visible Walls  24
  4.2 Determining Visible Walls in the View Field  29
  4.3 Determining the Vertical and Horizontal DPI of Texture Mapping  32
Chapter 5  Illumination in the System  36
  5.1 Illuminance of a Point Light  37
  5.2 Parameters of Light Reconstruction  38
  5.3 The Lights Occluded by Walls  41
Chapter 6  Experimental Results  45
  6.1 Experimental Results  45
Chapter 7  Conclusion and Future Work  51
Reference  53


Full-text availability: not authorized for public release (campus network)
Full-text availability: not authorized for public release (off-campus network)
