
    Visual communication in urban planning and urban design

    This report documents the current status of visual communication in urban design and planning. Visual communication is examined through discussion of standalone and network media, concentrating in particular on visualisation on the World Wide Web (WWW).

    Firstly, we examine the use of Solid and Geometric Modelling for visualising urban planning and urban design. The report documents and compares examples of the use of Virtual Reality Modelling Language (VRML) and proprietary WWW-based Virtual Reality modelling software, including models of Bath and Glasgow built with both VRML 1.0 and 2.0. A review is carried out on the use of Virtual Worlds and their role in visualising urban form within multi-user environments. The use of Virtual Worlds is developed into a case study of the possibilities and limitations of Virtual Internet Design Arenas (ViDAs), an initiative undertaken at the Centre for Advanced Spatial Analysis, University College London. The use of Virtual Worlds and their development towards ViDAs is seen as one of the most important developments in visual communication for urban planning and urban design since the development plan.

    Secondly, photorealistic media in the process of communicating plans is examined. The process of creating photorealistic media is documented, and examples of the Virtual Streetscape and the Wired Whitehall Virtual Urban Interface System are provided. The conclusion is drawn that although the use of photorealistic media on the WWW provides a way to visually communicate planning information, its use is limited. The merging of photorealistic media and solid geometric modelling is reviewed in the creation of Augmented Reality, which is seen to provide an important step forward in the ability to quickly and easily visualise urban planning and urban design information.

    Thirdly, the role of visual communication of planning data through GIS is examined in terms of desktop, three-dimensional and Internet-based GIS systems. The evolution to Internet GIS is seen as a critical component in the development of virtual cities, which will allow urban planners and urban designers to visualise and model the complexity of the built environment in networked virtual reality.

    Finally, a viewpoint is put forward of the Virtual City, linking Internet GIS with photorealistic multi-user Virtual Worlds. At present there are constraints on how far virtual cities can be developed, but a view is provided on how these networked virtual worlds are developing to aid visual communication in urban planning and urban design.
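
    A minimal sketch of the first approach above: the hypothetical Python snippet below (not taken from the report) writes a VRML 2.0 scene file in which each entry of an invented building-footprint list is extruded as a box, the kind of block model that could be served over the WWW and explored in a VRML browser of the period.

        # Minimal sketch: generate a VRML 2.0 block model of a few buildings.
        # The building list is invented; a real urban model would be derived
        # from survey or GIS footprint data.
        buildings = [
            # (x, z, width, depth, height) in metres
            (0.0, 0.0, 20.0, 15.0, 30.0),
            (40.0, 10.0, 25.0, 25.0, 12.0),
        ]

        lines = ["#VRML V2.0 utf8"]
        for x, z, w, d, h in buildings:
            lines.append(
                "Transform {\n"
                f"  translation {x} {h / 2} {z}\n"  # lift the box onto the ground plane
                "  children [ Shape {\n"
                "    appearance Appearance { material Material { diffuseColor 0.7 0.7 0.8 } }\n"
                f"    geometry Box {{ size {w} {h} {d} }}\n"
                "  } ]\n"
                "}"
            )

        with open("city_block.wrl", "w") as f:
            f.write("\n".join(lines))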

    Incorporating interactive 3-dimensional graphics in astronomy research papers

    Most research data collections created or used by astronomers are intrinsically multi-dimensional. In contrast, all visual representations of data presented within research papers are exclusively 2-dimensional. We present a resolution of this dichotomy that uses a novel technique for embedding 3-dimensional (3-d) visualisations of astronomy data sets in electronic-format research papers. Our technique uses the latest Adobe Portable Document Format extensions together with a new version of the S2PLOT programming library. The 3-d models can be easily rotated and explored by the reader and, in some cases, modified. We demonstrate example applications of this technique including: 3-d figures exhibiting subtle structure in redshift catalogues, colour-magnitude diagrams and halo merger trees; 3-d isosurface and volume renderings of cosmological simulations; and 3-d models of instructional diagrams and instrument designs.
    Comment: 18 pages, 7 figures, submitted to New Astronomy. For paper with 3-dimensional embedded figures, see http://astronomy.swin.edu.au/s2plot/3dpd
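
    The paper's own workflow (S2PLOT output embedded in PDF via Adobe's 3-d extensions) is not reproduced here. As a loosely analogous illustration of an interactive, rotatable 3-d figure intended to accompany a paper, the hypothetical sketch below uses the plotly library to write a 3-d scatter plot of invented points, standing in for a redshift catalogue, to a standalone HTML file.

        # Hypothetical sketch: an interactive, rotatable 3-d scatter plot written
        # to HTML with plotly. This is NOT the S2PLOT/PDF technique from the
        # paper; the data are random, standing in for a redshift catalogue.
        import numpy as np
        import plotly.graph_objects as go

        rng = np.random.default_rng(0)
        x, y, z = rng.normal(size=(3, 500))  # invented galaxy coordinates

        fig = go.Figure(data=[go.Scatter3d(x=x, y=y, z=z, mode="markers",
                                           marker=dict(size=2))])
        fig.update_layout(scene=dict(xaxis_title="RA", yaxis_title="Dec",
                                     zaxis_title="redshift"))
        fig.write_html("figure3d.html")  # the reader can rotate and zoom in a browser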

    Seamful interweaving: heterogeneity in the theory and design of interactive systems

    Design experience and theoretical discussion suggest that a narrow design focus on one tool or medium as primary may clash with the way that everyday activity involves the interweaving and combination of many heterogeneous media. Interaction may become seamless and unproblematic, even if the differences, boundaries and 'seams' in media are objectively perceivable. People accommodate and take advantage of seams and heterogeneity, in and through the process of interaction. We use an experiment with a mixed reality system to ground and detail our discussion of seamful design, which takes account of this process, and theory that reflects and informs such design. We critique the 'disappearance' mentioned by Weiser as a goal for ubicomp, and Dourish's 'embodied interaction' approach to HCI, suggesting that these design ideals may be unachievable or incomplete because they underemphasise the interdependence of 'invisible' non-rationalising interaction and focused rationalising interaction within ongoing activity.

    The Evolution of First Person Vision Methods: A Survey

    The emergence of new wearable technologies such as action cameras and smart glasses has increased the interest of computer vision scientists in the First Person perspective. Nowadays, this field is attracting the attention and investment of companies aiming to develop commercial devices with First Person Vision recording capabilities. Due to this interest, an increasing demand for methods to process these videos, possibly in real time, is expected. Current approaches present particular combinations of image features and quantitative methods to accomplish specific objectives such as object detection, activity recognition, user-machine interaction and so on. This paper summarizes the evolution of the state of the art in First Person Vision video analysis between 1997 and 2014, highlighting, among other things, the most commonly used features, methods, challenges and opportunities within the field.
    Comment: First Person Vision, Egocentric Vision, Wearable Devices, Smart Glasses, Computer Vision, Video Analytics, Human-machine Interaction
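
    As a concrete illustration of the low-level processing such methods build on, the hypothetical sketch below (not taken from the survey) uses OpenCV to compute dense optical flow between consecutive frames of an egocentric video, a common building block for ego-motion and activity cues; the file name is invented.

        # Hypothetical sketch: dense optical flow on a first-person video with
        # OpenCV; "egocentric.mp4" is an invented file name.
        import cv2

        cap = cv2.VideoCapture("egocentric.mp4")
        ok, prev = cap.read()
        prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            # Farneback dense optical flow: one 2-d motion vector per pixel.
            flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                                0.5, 3, 15, 3, 5, 1.2, 0)
            # Mean horizontal/vertical motion as a crude camera-motion summary.
            dx, dy = flow[..., 0].mean(), flow[..., 1].mean()
            print(f"mean flow: dx={dx:.2f}, dy={dy:.2f}")
            prev_gray = gray

        cap.release()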

    Utilizing a 3D game engine to develop a virtual design review system

    A design review process is where information is exchanged between the designers and design reviewers to resolve any potential design-related issues and to ensure that the interests and goals of the owner are met. The effective execution of design review will minimize potential errors or conflicts, reduce the time for review, shorten the project life-cycle, allow for earlier occupancy, and ultimately translate into significant total project savings to the owner. However, current methods of design review still rely heavily on 2D paper-based formats, are sequential, and lack a central, integrated information base for the efficient exchange and flow of information. There is thus a need for a new medium that allows for 3D visualization of designs, collaboration among designers and design reviewers, and early and easy access to design review information. This paper documents the innovative utilization of a 3D game engine, the Torque Game Engine, as the underlying tool and enabling technology for a design review system, the Virtual Design Review System (VDRS), for architectural designs. Two major elements are incorporated: 1) a 3D game engine as the driving tool for the development and implementation of design review processes, and 2) a virtual environment as the medium for design review, where visualization of design and design review information is based on sound principles of GUI design. The development of the VDRS involves two major phases: firstly, the creation of the assets and the assembly of the virtual environment, and secondly, the modification of existing functions or the introduction of new functionality through programming of the 3D game engine in order to support design review in a virtual environment. The features included in the VDRS are support for a database, real-time collaboration across a network, viewing and navigation modes, 3D object manipulation, parametric input, a GUI, and organization of 3D objects.
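
    To make the kind of information the system exchanges concrete, the hypothetical sketch below (not the paper's TorqueScript implementation) models a design-review comment anchored to a 3D location and serializes it for storage or transmission between reviewers; all identifiers are invented.

        # Hypothetical sketch of a design-review annotation record, loosely
        # inspired by the VDRS feature list (database support, networked
        # collaboration, 3D object manipulation).
        import json
        from dataclasses import dataclass, asdict

        @dataclass
        class ReviewComment:
            object_id: str    # ID of the 3D object under review
            position: tuple   # (x, y, z) anchor point in the virtual environment
            author: str
            text: str
            status: str = "open"  # e.g. "open", "resolved"

            def to_message(self) -> str:
                # Serialize to JSON for storing in a database or sending over a network.
                return json.dumps(asdict(self))

        # Example: a reviewer flags a clearance problem on an invented door object.
        comment = ReviewComment(object_id="door_entrance_01",
                                position=(12.5, 0.0, 3.2),
                                author="reviewer_a",
                                text="Door swing conflicts with corridor clearance.")
        print(comment.to_message())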

    Using Augmented Reality as a Medium to Assist Teaching in Higher Education

    In this paper we describe the use of a high-level augmented reality (AR) interface for the construction of collaborative educational applications that can be used in practice to enhance current teaching methods. A combination of multimedia information, including spatial three-dimensional models, images, textual information, video, animations and sound, can be superimposed in a student-friendly manner onto the learning environment. In several case studies, different learning scenarios have been carefully designed based on human-computer interaction principles so that meaningful virtual information is presented in an interactive and compelling way. Collaboration between the participants is achieved through the use of a tangible AR interface that uses marker cards, as well as an immersive AR environment based on software user interfaces (UIs) and hardware devices. The interactive AR interface has been piloted in the classroom at two UK universities, in departments of Informatics and Information Science.
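
    Marker-card interfaces of this kind rest on detecting fiducial markers in the camera image and anchoring virtual content to them. The hypothetical sketch below (not the authors' toolkit) uses OpenCV's ArUco module, assuming the opencv-contrib-python package, to detect marker cards in webcam frames and highlight where overlay content could be drawn.

        # Hypothetical sketch: detect marker cards with OpenCV's ArUco module
        # (API shown for OpenCV >= 4.7; older releases use cv2.aruco.detectMarkers
        # and DetectorParameters_create instead).
        import cv2

        aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_6X6_250)
        detector = cv2.aruco.ArucoDetector(aruco_dict, cv2.aruco.DetectorParameters())

        cap = cv2.VideoCapture(0)  # webcam pointed at the marker cards
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            corners, ids, _ = detector.detectMarkers(gray)
            if ids is not None:
                # Each marker's corners give the image region where a 3D model,
                # video or caption could be overlaid for the student.
                cv2.aruco.drawDetectedMarkers(frame, corners, ids)
            cv2.imshow("AR markers", frame)
            if cv2.waitKey(1) == 27:  # Esc to quit
                break
        cap.release()
        cv2.destroyAllWindows()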

    Visualization of and Access to CloudSat Vertical Data through Google Earth

    Online tools, pioneered by Google Earth (GE), are facilitating the way in which scientists and the general public interact with geospatial data in three dimensions. However, even in Google Earth there is no method for depicting vertical geospatial data derived from remote-sensing satellites as an orbit curtain seen from above. Here, an effective solution is proposed to automatically render vertical atmospheric data in Google Earth. The data are first processed through the Giovanni system and then converted into 15-second vertical data images. A generalized COLLADA model is devised based on the 15-second vertical data profile. Using the designed COLLADA models and satellite orbit coordinates, a satellite orbit model is designed and implemented in KML format to render the vertical atmospheric data vividly across its spatial and temporal ranges. The whole orbit model consists of repeated model slices. The model slices, each representing 15 seconds of vertical data, are placed on the CloudSat orbit with the size, scale, and angle to the longitude line precisely and separately calculated on the fly for each slice according to the CloudSat orbit coordinates. The resulting vertical scientific data can be viewed transparently or opaquely in Google Earth. Not only does this research bridge the science and data with scientists and the general public in the most popular way, but it also makes possible the simultaneous visualization and efficient exploration of the relationships among quantitative geospatial data, e.g. comparing the vertical data profiles with MODIS and AIRS precipitation data.
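
    The slice-placement step can be sketched as follows: the hypothetical snippet below (not the paper's implementation) takes consecutive orbit coordinates, computes the heading of each 15-second segment, and emits a KML Model element that places a COLLADA slice at that point; the coordinates and file names are invented.

        # Hypothetical sketch: place COLLADA slices along an orbit track in KML.
        # The orbit points and "slice.dae" are invented; real slices would carry
        # the 15-second vertical data images described above.
        import math

        def heading(lat1, lon1, lat2, lon2):
            """Initial great-circle bearing, in degrees, from point 1 to point 2."""
            p1, p2 = math.radians(lat1), math.radians(lat2)
            dlon = math.radians(lon2 - lon1)
            y = math.sin(dlon) * math.cos(p2)
            x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlon)
            return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0

        orbit = [(-60.0, 10.0), (-59.3, 10.2), (-58.6, 10.4)]  # invented (lat, lon) every 15 s

        models = []
        for (lat1, lon1), (lat2, lon2) in zip(orbit, orbit[1:]):
            models.append(
                "<Placemark><Model>"
                f"<Location><longitude>{lon1}</longitude><latitude>{lat1}</latitude>"
                "<altitude>0</altitude></Location>"
                f"<Orientation><heading>{heading(lat1, lon1, lat2, lon2):.2f}</heading></Orientation>"
                "<Scale><x>1</x><y>1</y><z>1</z></Scale>"
                "<Link><href>slice.dae</href></Link>"
                "</Model></Placemark>"
            )

        kml = ("<?xml version='1.0' encoding='UTF-8'?>"
               "<kml xmlns='http://www.opengis.net/kml/2.2'><Document>"
               + "".join(models) + "</Document></kml>")
        with open("orbit_curtain.kml", "w") as f:
            f.write(kml)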