
    Visualization of and Access to CloudSat Vertical Data through Google Earth

    Online tools, pioneered by Google Earth (GE), are changing the way in which scientists and the general public interact with geospatial data in true three dimensions. However, even in Google Earth there is no method for depicting vertical geospatial data derived from remote-sensing satellites as an orbit curtain seen from above. Here, an effective solution is proposed to automatically render vertical atmospheric data on Google Earth. The data are first processed through the Giovanni system and then converted into 15-second vertical data images. A generalized COLLADA model is devised based on the 15-second vertical data profile. Using the designed COLLADA models and satellite orbit coordinates, a satellite orbit model is designed and implemented in KML format to render the vertical atmospheric data vividly across spatial and temporal ranges. The whole orbit model consists of repeated model slices. The model slices, each representing 15 seconds of vertical data, are placed on the CloudSat orbit; the size, scale, and angle with the longitude line of each slice are precisely and separately calculated on the fly from the CloudSat orbit coordinates. The resulting vertical scientific data can be viewed transparently or opaquely on Google Earth. Not only does this research bring the science and data to scientists and the general public through one of the most popular online platforms, it also enables simultaneous visualization and efficient exploration of the relationships among quantitative geospatial data, e.g., comparing the vertical data profiles with MODIS and AIRS precipitation data.
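
    The slice-placement logic lends itself to a compact illustration. The sketch below is not the authors' code; the COLLADA file name, sample coordinates, and the bearing formula are assumptions. It generates KML that places one curtain-slice model per 15-second orbit segment, with each slice's heading derived from the ground track.

```python
# Minimal sketch (not the paper's implementation): place 15-second COLLADA
# curtain slices along a satellite ground track as KML <Model> placemarks.
# File name, coordinates and heading formula are illustrative assumptions.
import math

def bearing(lon1, lat1, lon2, lat2):
    """Initial bearing (degrees clockwise from north) from point 1 to point 2."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360.0

def orbit_to_kml(orbit_points, dae_href="slice_15s.dae"):
    """orbit_points: [(lon, lat), ...] sampled every 15 s along the ground track."""
    placemarks = []
    for i, (lon, lat) in enumerate(orbit_points[:-1]):
        nlon, nlat = orbit_points[i + 1]
        heading = bearing(lon, lat, nlon, nlat)  # align the slice with the track
        placemarks.append(f"""
  <Placemark>
    <Model altitudeMode="relativeToGround">
      <Location><longitude>{lon}</longitude><latitude>{lat}</latitude><altitude>0</altitude></Location>
      <Orientation><heading>{heading:.2f}</heading><tilt>0</tilt><roll>0</roll></Orientation>
      <Scale><x>1</x><y>1</y><z>1</z></Scale>
      <Link><href>{dae_href}</href></Link>
    </Model>
  </Placemark>""")
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>'
            + "".join(placemarks) + "\n</Document></kml>")

print(orbit_to_kml([(-122.0, 37.0), (-121.9, 37.9), (-121.8, 38.8)]))
```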

    The Hierarchic Treatment of Marine Ecological Information from Spatial Networks of Benthic Platforms

    Measuring biodiversity simultaneously in different locations, at different temporal scales, and over wide spatial scales is of strategic importance for improving our understanding of the functioning of marine ecosystems and for the conservation of their biodiversity. Monitoring networks of cabled observatories, along with other docked autonomous systems (e.g., Remotely Operated Vehicles [ROVs], Autonomous Underwater Vehicles [AUVs], and crawlers), are being conceived and established at a spatial scale capable of tracking energy fluxes across benthic and pelagic compartments, as well as across geographic ecotones. At the same time, optoacoustic imaging is sustaining an unprecedented expansion in marine ecological monitoring, enabling the acquisition of new biological and environmental data at an appropriate spatiotemporal scale. At this stage, one of the main problems for an effective application of these technologies is the processing, storage, and treatment of the acquired complex ecological information. Here, we provide a conceptual overview of the technological developments in the multiparametric generation, storage, and automated hierarchic treatment of biological and environmental information required to capture the spatiotemporal complexity of a marine ecosystem. In doing so, we present a pipeline of ecological data acquisition and processing, organized in discrete steps that are amenable to automation. We also give an example of computing population biomass, community richness, and biodiversity data (as indicators of ecosystem functionality) with an Internet Operated Vehicle (a mobile crawler). Finally, we discuss the software requirements for such automated data processing at the level of cyber-infrastructures, including sensor calibration and control, data banking, and ingestion into large data portals.
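
    As a toy illustration of the indicator-computation step only (not the pipeline described in the paper; the taxa and counts are invented), the sketch below derives community richness and a Shannon diversity index from per-taxon abundance counts such as those produced by annotated crawler imagery.

```python
# Minimal sketch, assuming per-taxon counts are already extracted from imagery.
# Taxon names and abundances below are placeholders, not data from the paper.
import math
from collections import Counter

def richness_and_shannon(counts):
    """counts: mapping of taxon -> number of individuals observed."""
    total = sum(counts.values())
    richness = sum(1 for n in counts.values() if n > 0)          # number of taxa present
    shannon = -sum((n / total) * math.log(n / total)             # Shannon index H'
                   for n in counts.values() if n > 0)
    return richness, shannon

# Example: counts aggregated from annotated crawler images (hypothetical).
obs = Counter({"Pandalus": 42, "Munida": 17, "Sabella": 5, "Gadus": 3})
richness, shannon = richness_and_shannon(obs)
print(f"richness={richness}, Shannon H'={shannon:.3f}")
```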

    Seafloor characterization using airborne hyperspectral co-registration procedures independent from attitude and positioning sensors

    Remote-sensing technology and data-storage capabilities have advanced over the last decade to the point of commercial multi-sensor data collection. There is a constant need to characterize, quantify, and monitor coastal areas for habitat research and coastal management. In this paper, we present work on seafloor characterization that uses hyperspectral imagery (HSI). The HSI data allow the operator to extend seafloor characterization from multibeam backscatter towards land and thus create a seamless ocean-to-land characterization of the littoral zone.
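
    The abstract does not detail the co-registration procedure itself, but one image-content-based approach can be sketched as follows (an illustrative assumption, not necessarily the paper's method): phase correlation recovers the translational offset between two overlapping image tiles without relying on attitude or positioning sensors.

```python
# Illustrative only: estimate the pixel offset between two overlapping tiles by
# phase correlation, i.e. from image content rather than attitude/GPS data.
# Pure NumPy; band selection, windowing and sub-pixel refinement are omitted.
import numpy as np

def phase_correlation_shift(ref, moving):
    """Return the (dy, dx) displacement of `moving` relative to `ref` (same shape)."""
    F1 = np.fft.fft2(ref)
    F2 = np.fft.fft2(moving)
    cross_power = np.conj(F1) * F2
    cross_power /= np.abs(cross_power) + 1e-12       # keep only phase information
    corr = np.fft.ifft2(cross_power).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Map peaks beyond half the image size to negative shifts.
    if dy > ref.shape[0] // 2:
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return int(dy), int(dx)

# Synthetic check: a tile rolled by (3, -5) pixels should yield that displacement.
rng = np.random.default_rng(0)
tile = rng.random((128, 128))
shifted = np.roll(tile, shift=(3, -5), axis=(0, 1))
print(phase_correlation_shift(tile, shifted))        # -> (3, -5)
```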

    Establishing a persistent interoperability test-bed for European geospatial research

    The development of standards for geospatial web services has been spearheaded by the Open Geospatial Consortium (OGC), a group of over 370 private, public and academic organisations (OGC, 1999-2009). The OGC aims to facilitate interoperability between geospatial technologies through education, standards and other initiatives. The OGC Service Architecture, described in the international standard ISO 19119, offers an abstract specification for web services covering data dissemination, processing, portrayal, workflows and other areas. The development of specifications covering each of these categories of web services has led to a significant number of geospatial data and computational services becoming available on the World Wide Web (the Web). A project to establish a persistent geospatial interoperability test-bed (PTB) was commissioned in 2007 by the Association of Geographic Information Laboratories in Europe (AGILE), Commission 5 (Networks) of the European Spatial Data Research (EuroSDR) organisation, and the OGC.
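
    The practical value of such standardisation is that any conforming service can be interrogated in exactly the same way, which is what makes a shared test-bed feasible. The fragment below is a minimal sketch (the endpoint is a placeholder, not a test-bed service): it issues a standard WMS 1.3.0 GetCapabilities request and lists the layers the service advertises.

```python
# Minimal sketch: query a (hypothetical) OGC Web Map Service for its
# capabilities document and list the advertised layer names.
from urllib.parse import urlencode
from urllib.request import urlopen
import xml.etree.ElementTree as ET

ENDPOINT = "https://example.org/wms"   # placeholder endpoint, not from the paper
query = urlencode({"SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetCapabilities"})

with urlopen(f"{ENDPOINT}?{query}", timeout=30) as resp:
    root = ET.fromstring(resp.read())  # standard XML capabilities document

ns = {"wms": "http://www.opengis.net/wms"}
layers = [el.text for el in root.findall(".//wms:Layer/wms:Name", ns)]
print(f"{len(layers)} advertised layers:", layers[:5])
```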

    Global-Scale Resource Survey and Performance Monitoring of Public OGC Web Map Services

    One of the most widely implemented service standards provided by the Open Geospatial Consortium (OGC) to the user community is the Web Map Service (WMS). WMS is widely employed globally, but there is limited knowledge of the global distribution, adoption status, or service quality of these online WMS resources. To fill this void, we investigated global WMS resources and performed distributed performance monitoring of these services. This paper explicates a crawling method used to discover WMSs and a distributed monitoring framework that was used to monitor 46,296 WMSs continuously for over one year. We analyzed server locations, provider types, themes, the spatiotemporal coverage of map layers, and the service versions for 41,703 valid WMSs. Furthermore, we appraised the stability and performance of the basic operations (i.e., GetCapabilities and GetMap) for 1210 selected WMSs. We discuss the major reasons for request errors and performance issues, as well as the relationship between service response times and the spatiotemporal distribution of client monitoring sites. This paper will help service providers, end users, and developers of standards to grasp the status of global WMS resources, as well as to understand the adoption status of OGC standards. The conclusions drawn in this paper can benefit geospatial resource discovery, service performance evaluation, and efforts to improve service performance.
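
    A single monitoring probe of the kind such a framework repeats at scale can be sketched in a few lines (this is an illustration, not the paper's framework; the endpoint, layer name, and bounding box are placeholders): each probe times a GetCapabilities and a GetMap request and records either the HTTP status or the error class.

```python
# Illustrative monitoring probe: time one GetCapabilities and one GetMap
# request against a (placeholder) WMS and record success or the failure type.
import time
from urllib.parse import urlencode
from urllib.request import urlopen

def probe(base_url, params, timeout=60):
    """Issue one request; return (ok, elapsed_seconds, http_status_or_error_name)."""
    url = f"{base_url}?{urlencode(params)}"
    start = time.monotonic()
    try:
        with urlopen(url, timeout=timeout) as resp:
            resp.read()
            return True, time.monotonic() - start, resp.status
    except Exception as exc:             # timeouts, HTTP errors, malformed responses, ...
        return False, time.monotonic() - start, type(exc).__name__

WMS = "https://example.org/wms"          # placeholder service, not from the survey
getcap = {"SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetCapabilities"}
getmap = {"SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
          "LAYERS": "some_layer", "STYLES": "", "CRS": "EPSG:4326",
          "BBOX": "-90,-180,90,180", "WIDTH": "256", "HEIGHT": "256",
          "FORMAT": "image/png"}

for name, req in (("GetCapabilities", getcap), ("GetMap", getmap)):
    ok, elapsed, status = probe(WMS, req)
    print(f"{name}: ok={ok} elapsed={elapsed:.2f}s status={status}")
```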

    Interface refactoring in performance-constrained web services

    This paper presents the development of REF-WS, an approach to enable a Web Service provider to reliably evolve their service through the application of refactoring transformations. REF-WS is intended to aid service providers, particularly in a reliability- and performance-constrained domain, as it permits upgraded 'non-backwards compatible' services to be deployed into a performance-constrained network where existing consumers depend on an older version of the service interface. For this to be successful, the refactoring and message mediation need to occur without affecting functional compatibility with the service's consumers, and must operate within the performance overhead expected of the original service, introducing as little latency as possible. Furthermore, compared to a manually programmed solution, the presented approach enables the service developer to apply and parameterize refactorings with a level of confidence that they will not produce an invalid or 'corrupt' transformation of messages. This is achieved through the use of preconditions for the defined refactorings.
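
    A minimal sketch of the idea (not REF-WS itself; the field names are invented) is a rename refactoring for message mediation guarded by a precondition, so that an old-format consumer message is rewritten for the upgraded interface only when the transformation is known to be safe.

```python
# Minimal sketch, not REF-WS: a "rename element" message-mediation refactoring
# guarded by a precondition, so a corrupt transformation is never emitted.
# Field names below are illustrative, not taken from the paper.

def precondition_rename(message, old_name, new_name):
    """The rename is valid only if the old field exists and the new one does not."""
    return old_name in message and new_name not in message

def rename_element(message, old_name, new_name):
    """Apply the refactoring; refuse rather than produce an invalid message."""
    if not precondition_rename(message, old_name, new_name):
        raise ValueError(f"precondition failed: cannot rename {old_name!r} -> {new_name!r}")
    mediated = dict(message)
    mediated[new_name] = mediated.pop(old_name)
    return mediated

# An old-style consumer request mediated to the upgraded service interface.
old_request = {"custId": "C-42", "amount": 19.99}
print(rename_element(old_request, "custId", "customerId"))
```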