352 research outputs found

    Panoramic Images for Situational Awareness in a 3D Chart-of-the-Future Display

    Many early charts featured sketches of the coastline, providing a good picture of what the shore looked like from the bridge of a ship. These helped the mariner to distinguish one port from another during an approach and to establish a rough position within that approach. More recent experimental 3D chart interfaces have incorporated 3D models of land topography and man-made structures to perform the same function. However, topography is typically captured from the air, by means of stereophotogrammetry or lidar, and fails to present a good representation of what is seen from a vessel’s bridge. We have been investigating ways to present photographic imagery to the mariner that better capture the utility of the early coastline sketches. Our focus has been on navigation in restricted waters, using the Piscataqua River as a test area, as part of the “Chart-of-the-Future” project conducted by the Data Visualization Research Lab at the UNH Center for Coastal and Ocean Mapping. Through this investigation, we have developed a new method for presenting photographic imagery to the mariner: a series of panoramic images progressing down the channel. Each panorama consists of images stitched almost seamlessly into a circular arc whose center is intended to lie close to the position of a vessel’s bridge during transit. When viewed from this center there is no distortion; distortion increases to a maximum between two panorama centers. Our preliminary trials suggest that panoramas can provide an excellent supplement to electronic navigation aids by making them visible in the context of what can be seen out the window. We believe panoramas will be especially useful both in familiarizing a mariner with an unfamiliar approach during planning, and in enhancing situational awareness at times of reduced visibility such as fog, dusk, or nightfall.
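The selection geometry implied above — the display shows the panorama whose arc center is nearest the vessel's bridge, and distortion grows as the vessel moves away from that center, peaking between two centers — can be sketched as follows. This is an illustrative sketch under assumed conventions, not the project's code; coordinates, units, and function names are hypothetical.

```python
import math

# Hypothetical sketch: pick which pre-computed panorama to display for a
# vessel at (x, y), given the arc centers laid out along the channel.
# Distortion is taken to grow with distance from the chosen center, so it
# vanishes at a center and is largest roughly midway between two of them.

def nearest_panorama(position, centers):
    """Return (index, distance) of the panorama center closest to the vessel."""
    px, py = position
    dists = [math.hypot(px - cx, py - cy) for cx, cy in centers]
    i = min(range(len(centers)), key=dists.__getitem__)
    return i, dists[i]

# Illustrative centers along a channel transit, in metres.
centers = [(0.0, 0.0), (100.0, 0.0), (200.0, 50.0)]
idx, d = nearest_panorama((60.0, 5.0), centers)  # second panorama is nearest
```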

    Photon Counting Compressive Depth Mapping

    We demonstrate a compressed sensing, photon counting lidar system based on the single-pixel camera. Our technique recovers both depth and intensity maps from a single under-sampled set of incoherent, linear projections of a scene of interest at ultra-low light levels around 0.5 picowatts. Only two-dimensional reconstructions are required to image a three-dimensional scene. We demonstrate intensity imaging and depth mapping at 256 × 256 pixel transverse resolution with acquisition times as short as 3 seconds. We also show novelty filtering, reconstructing only the difference between two instances of a scene. Finally, we acquire 32 × 32 pixel real-time video for three-dimensional object tracking at 14 frames per second.
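The core idea above — recovering a scene from an under-sampled set of incoherent linear projections — can be illustrated with a minimal 1-D compressed-sensing toy. This is not the paper's reconstruction algorithm (single-pixel systems typically solve a regularized 2-D inverse problem); it is a sketch using orthogonal matching pursuit on a synthetic sparse signal, with all sizes and names chosen for illustration.

```python
import numpy as np

# Toy 1-D compressed sensing: measure a k-sparse signal of length n with
# m < n random projections (analogous to single-pixel measurements), then
# recover it greedily with orthogonal matching pursuit (OMP).

rng = np.random.default_rng(42)
n, m, k = 64, 40, 4                            # length, measurements, sparsity
A = rng.standard_normal((m, n)) / np.sqrt(m)   # random sensing patterns
x = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x[support] = rng.uniform(1.0, 2.0, size=k) * rng.choice([-1.0, 1.0], size=k)
y = A @ x                                      # m incoherent linear projections

def omp(A, y, k):
    """Greedy sparse recovery: pick the column most correlated with the
    residual, then re-fit by least squares on the chosen support."""
    idx, residual = [], y.copy()
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in idx:
            idx.append(j)
        coef, *_ = np.linalg.lstsq(A[:, idx], y, rcond=None)
        residual = y - A[:, idx] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[idx] = coef
    return x_hat

x_hat = omp(A, y, k)   # recovers the sparse signal from m < n measurements
```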

    Fusing Information in a 3D Chart-of-the-Future Display

    The Data Visualization Research Lab at the Center for Coastal and Ocean Mapping is investigating how three-dimensional navigational displays can most effectively be constructed. This effort is progressing along multiple paths and is implemented in the GeoNav3D system, a 3D chart-of-the-future research prototype. We present two lines of investigation here. First, we explore how tide, depth, and planning information can be combined (fused) into a single view, in order to give the user a more realistic picture of effective water depths. In the GeoNav3D system, 3D shaded bathymetry, color-coded for depth, is used to display navigable areas. As in ENC displays, different colors make it easy to identify areas that are safe, areas where under-keel clearance is minimal, and areas where depths are too shallow. Real-time or model-generated tide information is taken into account in dynamically color-coding the depths. One advantage of using a continuous bathymetric model, versus discrete depth areas, is that the model can be continuously adjusted for water level. This concept is also extended for planning purposes by displaying the color-coded depths along a proposed corridor at the expected time of reaching each point. In our second line of investigation, we explore mechanisms for linking information from multiple 3D views into a coherent whole. In GeoNav3D, it is possible to create a variety of plan and perspective views, and these views can be attached to moving reference frames. This provides not only semi-static views, such as from-the-bridge and under-keel along-track profile views, but also more dynamic, interactive views. These views are linked through visual devices that allow the fusion of information among the views. We present several such devices and show how they highlight relevant details and help to minimize user confusion. Investigation into the utility of various linked views for aiding real-situation decision-making is ongoing.
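The dynamic color-coding described above — charted depths continuously adjusted by the tide level, then binned into safe / minimal-clearance / too-shallow bands for a given draft — can be sketched as a single classification function. The thresholds, band names, and function signature are illustrative assumptions, not GeoNav3D's actual scheme.

```python
# Hypothetical sketch of tide-adjusted depth classification for one
# bathymetry cell. All units are metres; safety_margin is an assumed
# under-keel clearance threshold, not a value from the paper.

def classify_depth(charted_depth, tide, draft, safety_margin=2.0):
    """Return a display category for one cell.

    charted_depth : depth below chart datum (positive down)
    tide          : water level above chart datum, real-time or forecast
    draft         : vessel draft
    """
    effective = charted_depth + tide   # continuous adjustment for water level
    clearance = effective - draft      # under-keel clearance
    if clearance <= 0:
        return "too-shallow"
    if clearance < safety_margin:
        return "minimal-clearance"
    return "safe"
```

Because the adjustment is applied to a continuous bathymetric model rather than to discrete depth areas, the same function can be re-evaluated at each point of a planned corridor using the tide forecast for the expected time of arrival there.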

    GeoZui3D: Data Fusion for Interpreting Oceanographic Data

    GeoZui3D stands for Geographic Zooming User Interface. It is a new visualization software system designed for interpreting multiple sources of 3D data. The system supports gridded terrain models, triangular meshes, curtain plots, and a number of other display objects. A novel center-of-workspace interaction method unifies a number of aspects of the interface: it provides a simple viewpoint control method, it helps link multiple views, and it is ideal for stereoscopic viewing. GeoZui3D has a number of features to support real-time input. Through a CORBA interface, external entities can influence the position and state of objects in the display. Extra windows can be attached to moving objects, allowing their position and data to be monitored. We describe the application of this system to heterogeneous data fusion, multibeam QC, and ROV/AUV monitoring.
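The center-of-workspace idea above can be sketched geometrically: the camera orbits and zooms about a single 3D center point, so that point stays fixed on screen under every operation. The spherical-coordinate convention and function names below are assumptions for illustration, not GeoZui3D's implementation.

```python
import math

# Hypothetical sketch of center-of-workspace viewpoint control: all camera
# motion is expressed relative to one 3D center, which therefore never
# moves on screen. Angles in radians, distances in arbitrary scene units.

def orbit_camera(center, azimuth, elevation, range_):
    """Camera position on a sphere around the workspace center."""
    cx, cy, cz = center
    x = cx + range_ * math.cos(elevation) * math.cos(azimuth)
    y = cy + range_ * math.cos(elevation) * math.sin(azimuth)
    z = cz + range_ * math.sin(elevation)
    return (x, y, z)

def zoom(range_, factor):
    """Zooming only scales the range toward the center, so the center stays put."""
    return range_ * factor
```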

    Haptic-GeoZui3D: Exploring the Use of Haptics in AUV Path Planning


    Intravital microscopy for evaluating tumor perfusion of nanoparticles exposed to non-invasive radiofrequency electric fields

    Poor biodistribution and accumulation of chemotherapeutics in tumors, due to limitations on diffusive transport and high intra-tumoral pressures (Jain RK, Nat Med. 7(9):987–989, 2001), have prompted the investigation of adjunctive therapies to improve treatment outcomes. Hyperthermia has been widely applied in attempts to meet this need, but it is limited in its ability to reach tumors in deeply located body regions. High-intensity radiofrequency (RF) electric fields have the potential to overcome such barriers, enhancing delivery and extravasation of chemotherapeutics. However, due to factors including tumor heterogeneity and a lack of kinetic information, there is insufficient understanding of the time-resolved interaction between RF fields and tumor vasculature, drug molecules, and nanoparticle (NP) vectors. Intravital microscopy (IVM) provides time-resolved, high-definition images of specific tumor microenvironments, overcoming heterogeneity issues, and can be integrated with a portable RF device to enable detailed observation over time of the effects of the RF field on kinetics and biodistribution at the microvascular level. Herein, we provide a protocol describing the safe integration of IVM with a high-powered, non-invasive RF field applied to 4T1 orthotopic breast tumors in live mice. Results show increased perfusion of NPs in microvasculature upon RF hyperthermia treatment, and increased perfusion, release, and spreading of injected reagents, preferentially in irregular vessels, during RF exposure.

    Electronic Chart of the Future: The Hampton Roads Project

    ECDIS is evolving from a two-dimensional static display of chart-related data to a decision support system capable of providing real-time or forecast information. While there may not be consensus on how this will occur, it is clear that to do this, ENC data and the shipboard display environment must incorporate both depth and time in an intuitively understandable way. Currently, we have the ability to conduct high-density hydrographic surveys capable of producing ENCs with decimeter contour intervals or depth areas. Yet our existing systems and specifications do not provide for full utilization of this capability. Ideally, a mariner should be able to benefit from detailed hydrographic data, coupled with both forecast and real-time water levels, presented in a variety of perspectives. With this information, mariners will be able to plan and carry out transits with the benefit of precisely determined and easily perceived under-keel, overhead, and lateral clearances. This paper describes a Hampton Roads Demonstration Project to investigate the challenges and opportunities of developing the “Electronic Chart of the Future.” In particular, a three-phase demonstration project is being planned:
    1. Compile test datasets from existing and new hydrographic surveys using advanced data processing and compilation procedures developed at the University of New Hampshire’s Center for Coastal and Ocean Mapping/Joint Hydrographic Center (CCOM/JHC);
    2. Investigate innovative approaches being developed at the CCOM/JHC to produce an interactive time- and tide-aware navigation display, and evaluate such a display on commercial and/or government vessels;
    3. Integrate real-time/forecast water depth information and port information services transmitted via an AIS communications broadcast.
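The "time- and tide-aware" idea above can be sketched by evaluating a tide forecast at each waypoint's expected time of arrival and reporting under-keel clearance there. The toy sinusoidal tide model, numbers, and function names below are illustrative assumptions, not the project's data or algorithms.

```python
import math

# Hypothetical sketch of time- and tide-aware transit planning: for each
# point along a planned route, evaluate the forecast water level at the
# expected time of arrival, then report under-keel clearance at that point.

def forecast_tide(t_hours, amplitude=1.2, period=12.42, phase=0.0):
    """Toy semidiurnal tide, in metres above chart datum, at time t."""
    return amplitude * math.sin(2 * math.pi * (t_hours - phase) / period)

def route_clearances(waypoints, draft):
    """waypoints: list of (charted_depth_m, eta_hours).
    Returns the under-keel clearance (m) expected at each waypoint."""
    return [depth + forecast_tide(eta) - draft for depth, eta in waypoints]

# Depth below chart datum and ETA for three points along a transit.
route = [(9.0, 0.0), (7.5, 1.0), (6.0, 2.5)]
clearances = route_clearances(route, draft=5.5)
```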