
    A system for synthetic vision and augmented reality in future flight decks

    Rockwell Science Center is investigating novel human-computer interaction techniques for enhancing situational awareness in future flight decks. One aspect is to provide intuitive displays that convey vital information and spatial awareness by augmenting the real world with an overlay of relevant information registered to it. Such Augmented Reality (AR) techniques can be employed in bad-weather scenarios to permit flying under Visual Flight Rules (VFR) in conditions that would normally require Instrument Flight Rules (IFR). These systems could easily be implemented on head-up displays (HUDs). The advantage of AR systems over purely synthetic vision (SV) systems is that the pilot can relate the information overlay to real objects in the world, whereas SV systems provide a wholly virtual view in which inconsistencies can hardly be detected. The development of components for such a system led to a demonstrator implemented on a PC. A camera grabs video images, which are overlaid with registered information. Orientation of the camera is obtained from an inclinometer and a magnetometer; position is acquired from GPS. In a possible implementation in an airplane, the on-board attitude information can be used to obtain correct registration. If visibility is sufficient, computer vision modules can fine-tune the registration by matching visual cues against database features. This technology would be especially useful for landing approaches. The current demonstrator provides a frame rate of 15 fps, using a live video feed as background with an overlay of avionics symbology in the foreground. In addition, terrain rendering from a 1 arc sec digital elevation model database can be overlaid to provide synthetic vision in case of limited visibility. For true outdoor testing (at ground level), the system has been implemented on a wearable computer.
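    The registration pipeline described above (heading from a magnetometer, pitch from an inclinometer, position from GPS) can be sketched with a pinhole projection. All function names and the focal length / image-centre parameters below are illustrative assumptions, and roll is ignored for brevity:

```python
import math

def enu_offset(lat0, lon0, alt0, lat, lon, alt):
    """Flat-earth approximation: offset (east, north, up) in metres
    from a reference point to a target point (valid for short ranges)."""
    r_earth = 6371000.0
    d_north = math.radians(lat - lat0) * r_earth
    d_east = math.radians(lon - lon0) * r_earth * math.cos(math.radians(lat0))
    return d_east, d_north, alt - alt0

def project(cam_pos, heading_deg, pitch_deg, target_pos,
            f_px=800.0, cx=320.0, cy=240.0):
    """Project a world point into pixel coordinates for a camera whose
    heading comes from a magnetometer, pitch from an inclinometer and
    position from GPS. Returns None if the point is behind the camera."""
    e, n, u = enu_offset(*cam_pos, *target_pos)
    h = math.radians(heading_deg)
    p = math.radians(pitch_deg)
    # Rotate the world offset into the camera frame: x right, y down, z forward.
    fwd = n * math.cos(h) + e * math.sin(h)      # component along heading
    right = e * math.cos(h) - n * math.sin(h)
    z = fwd * math.cos(p) + u * math.sin(p)      # apply pitch
    down = fwd * math.sin(p) - u * math.cos(p)
    if z <= 0:
        return None
    return cx + f_px * right / z, cy + f_px * down / z
```

    A real system would add roll, use the on-board attitude reference, and refine the result with the vision-based matching the abstract mentions.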

    Enabling Self-aware Smart Buildings by Augmented Reality

    Conventional HVAC control systems are usually incognizant of the physical structures and materials of buildings. These systems merely follow pre-set HVAC control logic based on abstract building thermal response models, which are rough approximations to true physical models, ignoring dynamic spatial variations in built environments. To enable more accurate and responsive HVAC control, this paper introduces the notion of "self-aware" smart buildings, such that buildings are able to explicitly construct physical models of themselves (e.g., incorporating building structures and materials, and thermal flow dynamics). The question is how to enable self-aware buildings that automatically acquire dynamic knowledge of themselves. This paper presents a novel approach using "augmented reality". The extensive user-environment interactions in augmented reality not only can provide intuitive user interfaces for building systems, but also can capture the physical structures and possibly materials of buildings accurately to enable real-time building simulation and control. This paper presents a building system prototype incorporating augmented reality, and discusses its applications. Comment: This paper appears in ACM International Conference on Future Energy Systems (e-Energy), 201
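    The contrast between abstract thermal response models and physical models can be illustrated with the simplest lumped-parameter (1R1C) model. The parameter values below are hypothetical; a "self-aware" building in the paper's sense would derive R and C from the structures and materials captured via AR rather than hand-tuning them:

```python
def simulate_room(t_in0, t_out, heater_w, r_kw, c_jk, dt_s, steps):
    """1R1C lumped thermal model: the room is one thermal capacitance C
    (J/K) exchanging heat with outside air through resistance R (K/W),
    integrated with forward Euler:
        dT/dt = ((T_out - T_in)/R + Q_heater) / C
    Returns the indoor-temperature trajectory, starting at t_in0."""
    t = t_in0
    history = [t]
    for _ in range(steps):
        t += ((t_out - t) / r_kw + heater_w) / c_jk * dt_s
        history.append(t)
    return history
```

    With the heater off and a cold exterior, the room temperature decays exponentially toward the outdoor temperature; spatial variation, which this abstract model ignores, is exactly what the AR-captured physical model would restore.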

    VANET Applications: Hot Use Cases

    Current challenges for car manufacturers are to make roads safe, to achieve free-flowing traffic with little congestion, and to reduce pollution through effective fuel use. To reach these goals, many improvements are made in-car, but more and more approaches rely on connected cars with communication capabilities between cars, with an infrastructure, or with IoT devices. Monitoring and coordinating vehicles then makes it possible to compute intelligent ways of transportation. Connected cars have introduced a new way of thinking about cars: not only as a means for a driver to go from A to B, but as smart cars, a user extension like the smartphone today. In this report, we introduce concepts and specific vocabulary in order to classify current innovations and ideas on the emerging topic of the smart car. We present a graphical categorization showing this evolution as a function of societal evolution. Different perspectives are adopted: a vehicle-centric view, a vehicle-network view, and a user-centric view, described by simple and complex use cases and illustrated by a list of emerging and current projects from the academic and industrial worlds. We identified an empty space in innovation between the user and his car: paradoxically, even though they interact, they are separated by different application uses. The future challenge is to interlace the social concerns of the user with intelligent and efficient driving.

    Symbolic representation of scenarios in Bologna airport on virtual reality concept

    This paper is part of a larger project, the Retina Project, which is focused on reducing the workload of an ATCO (air traffic control officer). It uses the latest technological advances, such as the Virtual Reality concept. The work consisted of studying the different awareness situations that occur daily at Bologna Airport. One scenario with good visibility, where the sun predominates, was analysed, along with two scenarios with poor visibility, dominated by rain and fog. From the visibility study of the three scenarios, the conclusion is that the overlay must be shown at a constant size, regardless of the position of the aircraft, so that it remains readable by the ATCO, and that the frame and the flight strip should be coloured in a conspicuous colour (such as red) for better control by the ATCO.
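    The constant-size-overlay conclusion follows directly from the pinhole camera model, in which projected size falls off as 1/distance, so a label must be scaled up in world space in proportion to its distance. The focal-length value below is a hypothetical assumption:

```python
def constant_size_scale(label_height_px, f_px, distance_m):
    """World-space height an overlay label must be given so that it
    projects to a fixed pixel height at any distance, under the pinhole
    model: pixel_height = f_px * world_height / distance."""
    return label_height_px * distance_m / f_px
```

    Doubling the aircraft's distance doubles the required world-space label height, keeping the on-screen size, and hence readability for the ATCO, constant.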

    Uncertainty Estimation and Visualization of Wind in Weather Forecasts

    The Collaborative Symbiotic Weather Forecasting system, CSWF, lets individual users run on-demand small-region, short-term, very high-resolution forecasts. When the regions overlap, a symbiotic forecast can be produced based on the individual forecasts from each region. Small differences in where the center of a region is located, when the region contains complex terrain, lead to significant differences in the forecasted values of wind speed and direction. These differences reflect the uncertainty of the numerical model. This paper describes two different ways of presenting these differences: a traditional map-based approach on a laptop and a display wall, and an augmented reality approach on a tablet. The approaches have distinct advantages and disadvantages depending on the actual use and the requirements of the user.
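    Combining overlapping per-region forecasts into one estimate with an uncertainty measure is a standard circular-statistics computation: wind vectors are averaged component-wise, and the spread of directions is summarized by the mean resultant length. This sketch is an assumption about how such differences could be quantified, not CSWF's actual method:

```python
import math

def wind_stats(speeds, dirs_deg):
    """Combine overlapping per-region forecasts at one grid point:
    mean speed, circular mean direction (degrees, meteorological axes),
    and a [0, 1] spread that grows as the regions disagree."""
    u = [s * math.sin(math.radians(d)) for s, d in zip(speeds, dirs_deg)]
    v = [s * math.cos(math.radians(d)) for s, d in zip(speeds, dirs_deg)]
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    mean_speed = sum(speeds) / len(speeds)
    mean_dir = math.degrees(math.atan2(mu, mv)) % 360.0
    # Direction-only mean resultant length: 1 when all regions agree,
    # approaching 0 for maximally scattered directions.
    cu = sum(math.sin(math.radians(d)) for d in dirs_deg) / len(dirs_deg)
    cv = sum(math.cos(math.radians(d)) for d in dirs_deg) / len(dirs_deg)
    spread = 1.0 - math.hypot(cu, cv)
    return mean_speed, mean_dir, spread
```

    Agreeing forecasts give a spread near 0; opposing directions (e.g. 0° vs 180°) give a spread near 1, which is the kind of quantity either the map-based or the AR visualization could encode.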

    Electronic Chart of the Future: The Hampton Roads Project

    ECDIS is evolving from a two-dimensional static display of chart-related data to a decision support system capable of providing real-time or forecast information. While there may not be consensus on how this will occur, it is clear that to do this, ENC data and the shipboard display environment must incorporate both depth and time in an intuitively understandable way. Currently, we have the ability to conduct high-density hydrographic surveys capable of producing ENCs with decimeter contour intervals or depth areas. Yet, our existing systems and specifications do not provide for a full utilization of this capability. Ideally, a mariner should be able to benefit from detailed hydrographic data, coupled with both forecast and real-time water levels, and presented in a variety of perspectives. With this information mariners will be able to plan and carry out transits with the benefit of precisely determined and easily perceived underkeel, overhead, and lateral clearances. This paper describes a Hampton Roads Demonstration Project to investigate the challenges and opportunities of developing the “Electronic Chart of the Future.” In particular, a three-phase demonstration project is being planned:
    1. Compile test datasets from existing and new hydrographic surveys using advanced data processing and compilation procedures developed at the University of New Hampshire’s Center for Coastal and Ocean Mapping/Joint Hydrographic Center (CCOM/JHC);
    2. Investigate innovative approaches being developed at the CCOM/JHC to produce an interactive time- and tide-aware navigation display, and to evaluate such a display on commercial and/or government vessels;
    3. Integrate real-time/forecast water depth information and port information services transmitted via an AIS communications broadcast.
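    The underkeel-clearance computation at the heart of a time- and tide-aware display reduces to combining charted depth, water level above chart datum, and vessel draft. The formula and helper names below are an illustrative sketch (squat and the safety margin are hypothetical parameters), not the project's actual algorithm:

```python
def underkeel_clearance(charted_depth_m, water_level_m, draft_m, squat_m=0.0):
    """Underkeel clearance relative to chart datum:
    UKC = charted depth + water level above datum - draft - squat.
    A tide-aware display would evaluate this with forecast or
    real-time water levels at each point along the track."""
    return charted_depth_m + water_level_m - draft_m - squat_m

def transit_is_safe(depths, levels, draft_m, margin_m=1.0):
    """Check every sampled point of a planned transit against a
    minimum-clearance safety margin."""
    return all(underkeel_clearance(d, w, draft_m) >= margin_m
               for d, w in zip(depths, levels))
```

    Coupling decimeter-resolution depth areas with forecast water levels is what turns this arithmetic into the precisely determined clearances the abstract envisions.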