104 research outputs found

    Design of Immersive Online Hotel Walkthrough System Using Image-Based (Concentric Mosaics) Rendering

    Conventional hotel booking websites present their facilities only as static 2D photos, which cannot be moved or rotated. An image-based virtual walkthrough is a promising technology for the hospitality industry to attract more customers. This project creates an image-based rendering (IBR) virtual walkthrough and a panoramic-based walkthrough using Macromedia Flash Professional 8, Photovista Panorama 3.0, and Reality Studio for image interaction; the web front end is built with Macromedia Dreamweaver Professional 8, and the results are displayed in Adobe Flash Player 8 or higher. The image-based walkthrough uses the concentric mosaic technique, while the panoramic-based walkthrough uses image mosaicing. The two walkthroughs are compared, with particular attention to the trade-off between the number of pictures and the smoothness of the walkthrough. Each technique has its own advantages: the image-based walkthrough supports real-time navigation, since the user can move right, left, forward, and backward, whereas the panoramic-based walkthrough only offers a 360-degree view from a fixed spot.

    A virtual reality system using the concentric mosaic: Construction, rendering, and data compression

    This paper proposes a new image-based rendering (IBR) technique called the "concentric mosaic" for virtual reality applications. IBR using the plenoptic function is an efficient technique for rendering new views of a scene from a collection of previously captured sample images. It provides much better image quality and a lower computational requirement for rendering than conventional three-dimensional (3-D) model-building approaches. The concentric mosaic is a 3-D plenoptic function with viewpoints constrained to a plane. Compared with more sophisticated four-dimensional plenoptic functions such as the light field and the lumigraph, the file size of a concentric mosaic is much smaller. In contrast to a panorama, the concentric mosaic allows users to move freely in a circular region and observe significant parallax and lighting changes without recovering the geometric and photometric scene models. Rendering a concentric mosaic is very efficient and involves reordering and interpolating previously captured slit images. A concentric mosaic typically consists of hundreds of high-resolution images, which consume a significant amount of storage and bandwidth for transmission. An MPEG-like compression algorithm is therefore proposed in this paper, taking into account the access patterns and redundancy of the mosaic images. The compression of two equivalent representations of the concentric mosaic, namely the multiperspective panoramas and the normal setup sequence, is investigated. A multiresolution representation of concentric mosaics using a nonlinear filter bank is also proposed.
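    To make the slit reordering described above concrete, the sketch below assembles a novel view from a concentric mosaic by nearest-slit lookup. It assumes a particular capture layout (a camera on a circle of radius R looking along the local tangent, with one slit image stored per rotation step and column); the data layout, names, and the lack of interpolation are illustrative assumptions, not the paper's exact algorithm.

```python
# Hedged sketch of concentric-mosaic rendering (assumed data layout):
# mosaic[k] holds the image captured at rotation angle 2*pi*k/K on a circle
# of radius R; each of its `cols` columns is one slit spanning the FOV `fov`.
import numpy as np

def render_column(mosaic, R, fov, eye, phi):
    """Return the captured slit that best matches the viewing ray starting at
    `eye` (a 2-D point inside the capture circle) with horizontal angle `phi`."""
    K, cols = mosaic.shape[0], mosaic.shape[2]
    d = np.array([np.cos(phi), np.sin(phi)])

    # Intersect the viewing ray with the capture circle x^2 + y^2 = R^2.
    b = 2.0 * np.dot(eye, d)
    c = np.dot(eye, eye) - R * R
    t = (-b + np.sqrt(b * b - 4.0 * c)) / 2.0        # forward intersection
    hit = eye + t * d

    # Camera position on the circle that saw this ray ...
    theta = np.arctan2(hit[1], hit[0])
    k = int(round(theta / (2.0 * np.pi) * K)) % K

    # ... and the column of that camera whose ray direction is phi
    # (assuming the camera looks along the local tangent, theta + pi/2).
    offset = (phi - (theta + np.pi / 2.0) + np.pi) % (2.0 * np.pi) - np.pi
    col = int(np.clip(round((offset / fov + 0.5) * (cols - 1)), 0, cols - 1))
    return mosaic[k, :, col]

def render_view(mosaic, R, fov, eye, heading, out_cols):
    """Assemble a novel view column by column (no slit interpolation)."""
    angles = heading + np.linspace(-fov / 2.0, fov / 2.0, out_cols)
    return np.stack([render_column(mosaic, R, fov, eye, a) for a in angles], axis=1)
```

    In the paper's terminology, interpolation between neighbouring slits would replace the nearest-slit rounding used above.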

    Applying image processing techniques to pose estimation and view synthesis.

    Fung Yiu-fai Phineas. Thesis (M.Phil.), Chinese University of Hong Kong, 1999. Includes bibliographical references (leaves 142-148). Abstracts in English and Chinese.
    Table of contents:
    Chapter 1, Introduction: model-based pose estimation (application: 3D motion tracking); image-based view synthesis; thesis contribution; thesis outline.
    Chapter 2, General Background: notations; camera models (generic, full-perspective, affine, weak-perspective, paraperspective); model-based motion analysis (point, line, and angle correspondences); panoramic representation (static mosaic, dynamic mosaic, temporal pyramid, spatial pyramid); image pre-processing (feature extraction, spatial filtering, local enhancement, dynamic range stretching or compression, YIQ color model).
    Chapter 3, Model-based Pose Estimation: previous work (estimation from established correspondences, direct estimation from image intensities, the perspective-3-point problem); our iterative P3P algorithm (Gauss-Newton method, dealing with ambiguity, 3D-to-3D motion estimation); experimental results (synthetic data, real images); discussions.
    Chapter 4, Panoramic View Analysis: advanced mosaic representation (frame alignment policy, multi-resolution representation, parallax-based representation, multiple moving objects, layers and tiles); panorama construction (image acquisition, image alignment, image integration, significant residual estimation); advanced alignment algorithms (patch-based alignment, global alignment/block adjustment, local alignment/deghosting); mosaic applications (visualization tool, video manipulation); experimental results.
    Chapter 5, Panoramic Walkthrough: problem statement and notations; previous work (3D modeling and rendering, branching movies, texture window scaling and its problems); our walkthrough approach (cylindrical projection onto the image plane, generating intermediate frames, occlusion handling); experimental results; discussions.
    Chapter 6, Conclusion.
    Appendices: formulation of Fischler and Bolles' method for P3P problems; derivation of z1 and z3 in terms of z2; derivation of e1 and e2; derivation of the update rule for the Gauss-Newton method; proof that λ1λ2 - λ4 > 0; derivation of φ and hi; derivation of w1j to w4j; more experimental results on panoramic stitching algorithms.
    Bibliography.
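    As a rough illustration of the "Iterative P3P Algorithm" and "Gauss-Newton Method" entries above, the sketch below refines a camera pose from three 3D-2D point correspondences by Gauss-Newton iteration on the reprojection error. The axis-angle parameterisation, the numerical Jacobian, and all names are assumptions made for illustration, not the thesis's formulation.

```python
# Generic Gauss-Newton pose refinement from three 3D-2D correspondences
# (a sketch, not the thesis's algorithm): params = [rotation vector, translation].
import numpy as np

def rodrigues(w):
    """Axis-angle vector -> 3x3 rotation matrix."""
    theta = np.linalg.norm(w)
    if theta < 1e-12:
        return np.eye(3)
    k = w / theta
    K = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def residuals(params, X, u, f):
    """Reprojection error of 3D points X (Nx3) against pixel coords u (Nx2)."""
    R, t = rodrigues(params[:3]), params[3:]
    Xc = (R @ X.T).T + t                        # points in the camera frame
    proj = f * Xc[:, :2] / Xc[:, 2:3]           # pinhole projection
    return (proj - u).ravel()

def gauss_newton_pose(X, u, f, params0, iters=20, eps=1e-6):
    params = np.asarray(params0, dtype=float).copy()
    for _ in range(iters):
        r = residuals(params, X, u, f)
        J = np.zeros((r.size, 6))               # numerical Jacobian
        for j in range(6):
            dp = np.zeros(6)
            dp[j] = eps
            J[:, j] = (residuals(params + dp, X, u, f) -
                       residuals(params - dp, X, u, f)) / (2.0 * eps)
        delta = np.linalg.lstsq(J, -r, rcond=None)[0]   # solve J delta = -r
        params += delta
        if np.linalg.norm(delta) < 1e-10:
            break
    return params
```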

    Interactive illumination and navigation control in an image-based environment.

    Fu Chi-wing. Thesis (M.Phil.), Chinese University of Hong Kong, 1999. Includes bibliographical references (leaves 141-149).
    Table of contents:
    Chapter 1, Introduction: introduction to image-based rendering; the scene-complexity-independent property; applications of this research work; organization of the thesis.
    Chapter 2, Illumination Control: apparent BRDF of a pixel; sampling illumination information; re-rendering (light direction, light intensity, multiple light sources, types of light source); data compression (intra-pixel and inter-pixel coherence); implementation and results (an interactive viewer, lazy re-rendering); conclusion.
    Chapter 3, Navigation Control - Triangle-based Warping Rule: introduction to navigation control; related work; epipolar geometry (perspective projection manifold); drawing order for pixel-sized entities; triangle-based image warping (image-based triangulation, image-based visibility sorting, topological sorting); results; conclusion.
    Chapter 4, Panoramic Projection Manifold: epipolar geometry (spherical projection manifold); image triangulation (optical flow, image gradient and potential function, triangulation); image-based visibility sorting (mapping criteria, ordering of two triangles, graph construction and topological sort); results; conclusion.
    Chapter 5, Panoramic-based Navigation using Real Photos: introduction; system overview; correspondence matching (basic model of epipolar geometry, epipolar geometry between two neighbouring panoramic nodes, line and patch correspondence matching); triangle-based warping (why warp triangles, patch and layer construction, triangulation and mesh subdivision, layered triangle-based warping); implementation (image sampler and panoramic stitcher, interactive correspondence matcher and triangulation, basic panoramic viewer, formulating the drag vector and vn, controlling walkthrough parameters, interactive web-based panoramic viewer); results; conclusion and possible enhancements.
    Chapter 6, Compositing Warped Images for Object-based Viewing: modeling object-based viewing; triangulation and convex-hull criteria; warping multiple views (methods I, II, and III); results; conclusion.
    Chapter 7, Complete Rendering Pipeline: reviews of the illumination and navigation rendering pipelines; analysis of the two pipelines (combination at the architectural level, ensuring physical correctness); generalizing the apparent BRDF (difficulties in encoding the BRDF with spherical harmonics, the generalized apparent BRDF, related work on encoding it); conclusion.
    Chapter 8, Conclusion.
    Appendices: spherical harmonics; it is rare for cycles to exist in the drawing-order graph (Theorem 3; inside-directed and outside-directed triangles in a triangular cycle; the four possible cases that form a cycle: triangular fan, two, three, or more than three outside-directed triangles; experiment); deriving the epipolar-line formula on a cylindrical projection manifold (notations, general formula, simplification to a sine curve, proof that the epipolar line is a sine-curve segment); publications related to this research work.
    Bibliography.
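    The "apparent BRDF of a pixel" and "re-rendering" entries above describe image-based relighting: each pixel's response to a set of sampled light directions is stored, and a new lighting configuration is synthesised by recombining those samples, exploiting the linearity of light transport. Below is a hedged sketch of that idea under assumed names and a simple nearest-direction weighting; it is not the thesis's exact sampling, compression, or interpolation scheme.

```python
# Per-pixel image-based relighting sketch (assumed data layout): basis[i] is a
# photo of the scene lit by a single distant light from unit direction dirs[i].
import numpy as np

def relight(basis, dirs, new_lights, k=4):
    """basis: (N, H, W, 3) float images; dirs: (N, 3) unit light directions;
    new_lights: iterable of (direction, intensity) pairs."""
    dirs = dirs / np.linalg.norm(dirs, axis=1, keepdims=True)
    out = np.zeros(basis.shape[1:], dtype=np.float64)
    for d, intensity in new_lights:
        d = np.asarray(d, dtype=float)
        d /= np.linalg.norm(d)
        cos = dirs @ d                       # alignment with the new light
        idx = np.argsort(-cos)[:k]           # k closest sampled directions
        w = np.clip(cos[idx], 0.0, None)
        if w.sum() == 0.0:
            continue                         # light faces away from all samples
        w /= w.sum()
        # Images add linearly, so multiple sources are simply summed.
        out += intensity * np.tensordot(w, basis[idx], axes=1)
    return np.clip(out, 0.0, 255.0)          # clamp to an assumed 8-bit range
```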

    Spherical Image Processing for Immersive Visualisation and View Generation

    This research studies the processing of panoramic spherical images for immersive visualisation of real environments and for generating in-between views from two acquired views. For visualisation based on a single spherical image, the surrounding environment is modelled as a unit sphere mapped with the spherical image, within which the user can navigate. For visualisation based on two spherical images, a view-generation algorithm is developed for modelling an indoor man-made environment, so that new views can be generated at arbitrary positions relative to the two existing ones. This allows the scene to be modelled with multiple spherical images and the user to move smoothly from one sphere-mapped image to another by passing through generated in-between views.
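    As a concrete reading of the single-image visualisation described above, the sketch below renders a pinhole view from a panorama mapped onto the unit sphere, assuming the panorama is stored as an equirectangular image. The storage format, naming, and nearest-neighbour sampling are assumptions for illustration rather than the paper's implementation.

```python
# View synthesis from one sphere-mapped (equirectangular) panorama: each output
# pixel's ray is converted to (longitude, latitude) on the unit sphere.
import numpy as np

def view_from_sphere(pano, yaw, pitch, fov, out_w, out_h):
    H, W = pano.shape[:2]
    f = 0.5 * out_w / np.tan(0.5 * fov)                # focal length in pixels

    # Rays through every output pixel, camera frame (x right, y up, z forward).
    x = np.arange(out_w) - 0.5 * (out_w - 1)
    y = np.arange(out_h) - 0.5 * (out_h - 1)
    xx, yy = np.meshgrid(x, y)
    rays = np.stack([xx, -yy, np.full_like(xx, f)], axis=-1)
    rays /= np.linalg.norm(rays, axis=-1, keepdims=True)

    # Orient the camera: pitch about x, then yaw about the up (y) axis.
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    d = rays @ (Ry @ Rx).T

    # Direction -> equirectangular pixel coordinates (nearest neighbour).
    lon = np.arctan2(d[..., 0], d[..., 2])             # [-pi, pi]
    lat = np.arcsin(np.clip(d[..., 1], -1.0, 1.0))     # [-pi/2, pi/2]
    u = ((lon / (2.0 * np.pi) + 0.5) * (W - 1)).astype(int)
    v = ((0.5 - lat / np.pi) * (H - 1)).astype(int)
    return pano[v, u]
```

    The two-sphere view-generation step would additionally require correspondences or recovered scene structure, which this single-sphere lookup deliberately omits.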

    Simulation of High-Visual Quality Scenes in Low-Cost Virtual Reality

    With the increasing popularity of virtual reality, many video games and virtual experiences with high visual quality have been developed recently. However, virtual reality with a high-quality representation of scenes is still an experience tied to high-cost devices. Low-cost virtual reality solutions based on mobile devices exist, but the visual quality of the presented virtual environments must be simplified to run on their limited hardware. In this work, we present a novel image-based rendering technique for low-cost virtual reality and evaluate its performance on three mobile devices with different hardware characteristics. Results show that our technique represents high-visual-quality virtual environments with considerably better performance than traditional rendering solutions.
    Workshop: WCGIV – Computación Gráfica, Imágenes y Visualización. Red de Universidades con Carreras en Informática.

    An efficient approach to layered-depth image based rendering

    Master's thesis (Master of Science).

    Survey of image-based representations and compression techniques

    In this paper, we survey techniques for image-based rendering (IBR) and for compressing image-based representations. Unlike traditional three-dimensional (3-D) computer graphics, in which the 3-D geometry of the scene is known, IBR techniques render novel views directly from input images. IBR techniques can be classified into three categories according to how much geometric information is used: rendering without geometry, rendering with implicit geometry (i.e., correspondence), and rendering with explicit geometry (either approximate or accurate). We discuss the characteristics of these categories and their representative techniques. IBR techniques demonstrate a surprisingly diverse range in their use of images and geometry to represent 3-D scenes. We explore the issues in trading off images against geometry by revisiting plenoptic-sampling analysis and the notions of view dependency and geometric proxies. Finally, we highlight compression techniques specifically designed for image-based representations; such compression techniques are important in making IBR practical.
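    To make the "rendering without geometry" end of that spectrum concrete, the toy sketch below performs a two-plane light-field lookup: a new view's ray is intersected with the two parameterisation planes and the nearest stored sample is returned. The regular-grid storage, plane placement, and names are assumptions for illustration, not the survey's notation.

```python
# Nearest-neighbour lookup into a two-plane light field L(s, t, u, v).
import numpy as np

def lightfield_lookup(L, ray_o, ray_d, st_z=0.0, uv_z=1.0):
    """L: array indexed as [s, t, u, v, rgb]; the (s, t) plane sits at z = st_z
    and the (u, v) plane at z = uv_z, both sampled on the unit square."""
    ray_o = np.asarray(ray_o, dtype=float)
    ray_d = np.asarray(ray_d, dtype=float)

    # Intersect the ray with both parameterisation planes.
    s, t = ray_o[:2] + (st_z - ray_o[2]) / ray_d[2] * ray_d[:2]
    u, v = ray_o[:2] + (uv_z - ray_o[2]) / ray_d[2] * ray_d[:2]

    def to_index(x, n):                       # [0, 1] -> nearest grid index
        return int(np.clip(round(x * (n - 1)), 0, n - 1))

    S, T, U, V = L.shape[:4]
    return L[to_index(s, S), to_index(t, T), to_index(u, U), to_index(v, V)]
```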