13 research outputs found

    Integration of georegistered information on a virtual globe

    In collaborative augmented reality (AR) missions, large amounts of georegistered information are collected and sent to a command and control center. This paper describes the concept and prototypical implementation of a mixed reality (MR) based system that integrates georegistered information from AR systems and other sources on a virtual globe. A command and control center can use the application to monitor field operations in which multiple AR users are engaged in a collaborative mission. Google Earth is used to demonstrate the system, which integrates georegistered icons, live video streams from field operators or surveillance cameras, 3D models, and satellite or aerial photos into one MR environment.

    Mixed Reality on a Virtual Globe


    Augmented Reality

    Augmented reality (AR) is a natural development from virtual reality (VR), which emerged several decades earlier, and it complements VR in many ways. Because the user can see real and virtual objects simultaneously, AR is far more intuitive, although it is not free of human-factors issues and other restrictions. AR applications also demand less time and effort, since the entire virtual scene and environment need not be constructed. In this book, several new and emerging application areas of AR are presented, divided into three sections. The first section covers applications in outdoor and mobile AR, such as construction, restoration, security, and surveillance. The second section deals with AR in medicine, biology, and the human body. The third and final section contains a number of new and useful applications in daily living and learning.

    The Land Tool Box is Full


    Evaluation of remote sensing approaches

    Megan M Lewis, Davina White
    http://archive.nwc.gov.au/library/topic/groundwater/allocating-water-and-maintaining-springs-in-the-great-artesian-basi

    KOLAM: human computer interfaces for visual analytics in big data imagery

    In the present day, we are faced with a deluge of disparate and dynamic information from multiple heterogeneous sources. Among these are the big data imagery datasets that are rapidly being generated via mature acquisition methods in the geospatial, surveillance (specifically, Wide Area Motion Imagery or WAMI) and biomedical domains. The need to interactively visualize these imagery datasets by using multiple types of views (as needed) into the data is common to these domains. Furthermore, researchers in each domain have additional needs: users of WAMI datasets also need to interactively track objects of interest using algorithms of their choice, visualize the resulting object trajectories and interactively edit these results as needed. While software tools that fulfill each of these requirements individually are available and well-used at present, there is still a need for tools that can combine the desired aspects of visualization, human computer interaction (HCI), data analysis, data management, and (geo-)spatial and temporal data processing into a single flexible and extensible system. KOLAM is an open, cross-platform, interoperable, scalable and extensible framework for visualization and analysis that we have developed to fulfil the above needs. 
    The novel contributions in this thesis are the following:
    1) Spatio-temporal caching for animating both giga-pixel and Full Motion Video (FMV) imagery;
    2) Human computer interfaces purposefully designed to accommodate big data visualization;
    3) Human-in-the-loop interactive video object tracking: ground-truthing of moving objects in wide area imagery using algorithm-assisted, human-in-the-loop coupled tracking;
    4) Coordinated visualization using stacked layers, side-by-side layers/video sub-windows, and embedded imagery;
    5) Efficient one-click manual tracking, editing, and data management of trajectories;
    6) Efficient labeling of image segmentation regions and passing of these results to desired modules;
    7) Visualization of image processing results generated by non-interactive operators using layers;
    8) Extension of interactive imagery and trajectory visualization to multi-monitor wall display environments;
    9) Geospatial applications: rapid roam, zoom, and hyper-jump spatial operations, interactive blending, colormap and histogram enhancement, spherical projection, and terrain maps;
    10) Biomedical applications: visualization and target tracking of cell motility in time-lapse cell imagery, collection of ground truth from experts on whole-slide imagery (WSI) for developing histopathology analytic algorithms and computer-aided diagnosis for cancer grading, and easy-to-use tissue annotation features.
    Includes bibliographical references.

    Advanced Location-Based Technologies and Services

    Since the publication of the first edition in 2004, advances in mobile devices, positioning sensors, WiFi fingerprinting, and wireless communications, among others, have paved the way for developing new and advanced location-based services (LBSs). This second edition provides up-to-date information on LBSs, including WiFi fingerprinting, mobile computing, geospatial clouds, geospatial data mining, location privacy, and location-based social networking. It also includes new chapters on application areas such as LBSs for public health, indoor navigation, and advertising. In addition, the chapter on remote sensing has been revised to address recent advances.

    Examining spatiotemporal changes in the phenology of Australian mangroves using satellite imagery

    Nicolás Younes investigated the phenology of Australian mangroves using satellite imagery, field data, and generalized additive models. He found that satellite-derived phenology changes with location, frequency of observation, and spatial resolution. Nicolás challenges the common methods for detecting phenology and proposes a data-driven approach.

    New algorithms for atmospheric correction and retrieval of biophysical parameters in earth observation: application to ENVISAT/MERIS data

    An algorithm for the derivation of atmospheric and surface biophysical products from MEdium Resolution Imaging Spectrometer Instrument (MERIS) on board ENVIronmental SATellite (ENVISAT/MERIS) Level 1b data over land has been developed. Georectified aerosol optical thickness (AOT), columnar water vapor (CWV), spectral surface reflectance, and chlorophyll fluorescence (CF) maps are generated. Emphasis has been put on implementing robust software able to provide these products in an operational manner, making no use of ancillary parameters apart from those attached to MERIS images. For this reason, it has been named Self-Contained Atmospheric Parameters Estimation from MERIS data (SCAPE-M). The fundamentals of the algorithm and the validation of the derived products are presented in this thesis. Errors of ±0.03, ±4%, and ±8% have been estimated for AOT, CWV, and surface reflectance retrievals, respectively, by means of a sensitivity analysis. More than 200 MERIS images have been processed in order to assess the method's performance under a range of atmospheric and geographical conditions. Good agreement is found between SCAPE-M AOT retrievals and ground-based measurements taken during the SPectra bARrax Campaigns (SPARC) 2003 and 2004, except for a date when an episode of Saharan dust intrusion was detected. Comparison of SCAPE-M retrievals with data from AErosol RObotic NETwork (AERONET) stations showed a squared Pearson correlation coefficient R2 of about 0.7-0.8. For CWV, the correlation with the same stations rises above 0.9. A good correlation is also found with the ESA Level 2 official CWV product, although slightly different performance with varying surface elevation is detected. Retrieved surface reflectance maps have first been intercompared with reflectance data derived from MERIS images by the Bremen AErosol Retrieval (BAER) method.
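    The validation metric in the abstract above is the squared Pearson correlation coefficient R2 between retrievals and station measurements. As an illustrative sketch only (not part of SCAPE-M; the function name and the sample values are hypothetical), such an R2 can be computed with NumPy like this:

```python
import numpy as np

def squared_pearson(retrieved, measured):
    """Return R^2, the square of the Pearson correlation coefficient
    between two equally sized 1-D samples."""
    r = np.corrcoef(retrieved, measured)[0, 1]  # off-diagonal entry is r
    return r ** 2

# Hypothetical columnar-water-vapor samples (g/cm^2) for illustration:
scape_m_cwv = np.array([1.2, 1.5, 2.1, 2.8, 3.0])
aeronet_cwv = np.array([1.1, 1.6, 2.0, 2.9, 3.2])
print(squared_pearson(scape_m_cwv, aeronet_cwv))
```

    An R2 near 1 indicates the retrievals track the ground-based measurements closely, which is the sense in which the thesis reports values of 0.7-0.8 for AOT and above 0.9 for CWV.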