The PLACE Toolkit: exposing geospatial ready digital collections
PLACE, the Position-based Library Archive Coordinate Explorer, is a University of New Hampshire geospatial data server and search interface that enables discovery of digital collections. Identifying geographic coordinates for “geospatial ready” digitized cultural heritage materials is key to the project.
Presented: Open Repositories 2017, Brisbane, Australia. June 27, 2017.
Geospatial Authentication
A software package has been developed that authenticates mobile devices into a network wirelessly and in real time using GPS signal structures, granting access to critical geospatial information only when the rover(s) are within a designated set of boundaries or a specific area. The advantage is that the system admits only clients located within the designated geospatial boundaries or areas. The Geospatial Authentication software has two parts: a server and a client. The server software is a virtual private network (VPN) developed on the Linux operating system in the Perl programming language; it can run as a stand-alone VPN server or be combined with other applications and services. The client software is a Windows CE GUI application (Mobile Graphical Software) that allows users to authenticate into the network; its purpose is to pass the needed satellite information to the server for authentication.
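The core server-side decision described above is a geofence test on the client's reported position. A minimal sketch of such a check, using a standard ray-casting point-in-polygon test (the function and variable names are illustrative, not the package's actual API):

```python
# Hypothetical sketch of the boundary check a geospatial
# authentication server might perform on a rover's GPS position.

def inside_boundary(lat, lon, polygon):
    """Return True if (lat, lon) falls inside the polygon, given as a
    list of (lat, lon) vertices, using the ray-casting test."""
    inside = False
    n = len(polygon)
    for i in range(n):
        lat1, lon1 = polygon[i]
        lat2, lon2 = polygon[(i + 1) % n]
        # Toggle for each polygon edge crossed by a ray from the point.
        if (lon1 > lon) != (lon2 > lon):
            x = (lon - lon1) * (lat2 - lat1) / (lon2 - lon1) + lat1
            if lat < x:
                inside = not inside
    return inside

def authenticate(client_position, allowed_area):
    """Grant access only when the rover reports a position
    inside the designated geospatial boundary."""
    return inside_boundary(*client_position, allowed_area)

# Example: a rough rectangular boundary and two candidate rovers.
area = [(43.0, -71.5), (43.0, -70.5), (44.0, -70.5), (44.0, -71.5)]
print(authenticate((43.5, -71.0), area))  # inside the area  -> True
print(authenticate((45.0, -71.0), area))  # outside the area -> False
```

In the real system this check would run after the client's satellite data has been verified, so the position itself is trusted before the geofence decision is made.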
FogGIS: Fog Computing for Geospatial Big Data Analytics
Cloud Geographic Information Systems (GIS) have emerged as a tool for the analysis, processing and transmission of geospatial data. Fog computing is a paradigm in which fog devices help to increase throughput and reduce latency at the edge of the client. This paper develops a fog-based framework named FogGIS for mining analytics from geospatial data. We built a prototype using the Intel Edison, an embedded microprocessor, and validated FogGIS through preliminary analysis, including compression and overlay analysis. Results showed that fog computing holds great promise for the analysis of geospatial data. We used several open-source compression techniques to reduce transmission to the cloud.

Comment: 6 pages, 4 figures, 1 table; 3rd IEEE Uttar Pradesh Section International Conference on Electrical, Computer and Electronics (09-11 December, 2016), Indian Institute of Technology (Banaras Hindu University), Varanasi, India
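The bandwidth saving FogGIS targets comes from compressing geospatial payloads on the fog device before upload. A minimal sketch of the idea, with gzip standing in for the open-source codecs the paper evaluates (the payload shape is illustrative):

```python
import gzip
import json

# Illustrative: losslessly compress a GeoJSON-style payload on the
# fog device so fewer bytes are transmitted to the cloud.

def compress_payload(features):
    """Serialize a GeoJSON-style feature collection and gzip it;
    return (raw_bytes, compressed_bytes)."""
    raw = json.dumps({"type": "FeatureCollection",
                      "features": features}).encode("utf-8")
    packed = gzip.compress(raw)
    return raw, packed

# A repetitive sensor-point dataset compresses well.
features = [{"type": "Feature",
             "geometry": {"type": "Point", "coordinates": [82.99, 25.26]},
             "properties": {"sensor": f"s{i}"}} for i in range(200)]
raw, packed = compress_payload(features)
print(len(packed) < len(raw))  # -> True: fewer bytes go to the cloud
```

Because geospatial feature collections are highly repetitive (shared keys, similar coordinates), even general-purpose lossless codecs yield large reductions at the edge.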
A linked data approach to publishing complex scientific workflows
Past data management practices in many fields of natural science, including climate research, have focused primarily on the final research output - the research publication - with less attention paid to the chain of intermediate data results and their associated metadata, including provenance. Data were often regarded merely as an adjunct to the publication, rather than a scientific resource in their own right. In this paper, we attempt to address the issues of capturing and publishing detailed workflows associated with the climate research datasets held by the Climatic Research Unit (CRU) at the University of East Anglia. To this end, we present a customisable approach to exposing climate research workflows for the effective re-use of the associated data, through the adoption of linked-data principles, existing widely adopted citation techniques (Digital Object Identifiers) and data exchange mechanisms (Open Archives Initiative Object Reuse and Exchange).
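The linked-data approach above amounts to describing each dataset and its derivation chain as machine-readable triples. A minimal sketch, using placeholder URIs and a placeholder DOI (the real work would use the project's own vocabularies; PROV-O is shown here for the provenance link):

```python
# Hypothetical sketch: emit Turtle triples linking a dataset to its
# DOI and to the intermediate result it was derived from. All
# identifiers below are placeholders, not CRU's actual URIs.

def dataset_turtle(dataset_uri, doi, derived_from):
    """Return a small Turtle document describing one dataset's
    identifier and one provenance link."""
    return "\n".join([
        "@prefix dct:  <http://purl.org/dc/terms/> .",
        "@prefix prov: <http://www.w3.org/ns/prov#> .",
        "",
        f"<{dataset_uri}>",
        f'    dct:identifier "doi:{doi}" ;',
        f"    prov:wasDerivedFrom <{derived_from}> .",
    ])

doc = dataset_turtle("http://example.org/cru/ts-dataset",
                     "10.0000/example",
                     "http://example.org/cru/raw-observations")
print(doc)
```

Chaining `prov:wasDerivedFrom` links across intermediate results is what makes the full workflow, not just the final publication, discoverable and reusable.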
GeomRDF: A Geodata Converter with a Fine-Grained Structured Representation of Geometry in the Web
In recent years, with the advent of the web of data, a growing number of national mapping agencies have tended to publish their geospatial data as Linked Data. However, differences between traditional GIS data models and the Linked Data model can complicate the publication process, which may also require setting several parameters and some expertise in semantic web technologies. In addition, the use of standards such as GeoSPARQL (or ad hoc predicates) is mandatory to perform spatial queries on published geospatial data. In this paper, we present GeomRDF, a tool that helps users easily convert spatial data from traditional GIS formats to the RDF model. It generates geometries represented as GeoSPARQL WKT literals but also as structured geometries that can be exploited using only the RDF query language, SPARQL. GeomRDF was implemented as a module in the RDF publication platform Datalift. A validation of GeomRDF was carried out against the French administrative units dataset (provided by IGN France).

Comment: 12 pages, 2 figures; the 1st International Workshop on Geospatial Linked Data (GeoLD 2014) - SEMANTiCS 2014
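The dual representation described above can be sketched for a single point: the same geometry is emitted once as a GeoSPARQL WKT literal and once as structured coordinate triples that plain SPARQL can filter on. The predicate names below are illustrative, not GeomRDF's actual vocabulary:

```python
# Hypothetical sketch of a WKT-literal-plus-structured-geometry
# output for one point. "ex:" predicates are placeholders.

def point_triples(subject, lat, lon):
    """Return (subject, predicate, object) triples for one point,
    as both a GeoSPARQL WKT literal and structured coordinates."""
    wkt = f"POINT({lon} {lat})"
    return [
        (subject, "geo:asWKT", f'"{wkt}"^^geo:wktLiteral'),
        (subject, "ex:longitude", f'"{lon}"^^xsd:double'),
        (subject, "ex:latitude", f'"{lat}"^^xsd:double'),
    ]

triples = point_triples("<http://example.org/commune/75056>",
                        48.8566, 2.3522)
for s, p, o in triples:
    print(s, p, o, ".")
```

The structured coordinates are what let a consumer run a bounding-box filter with ordinary SPARQL `FILTER` clauses, without requiring a GeoSPARQL-capable triple store.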
An ECOOP web portal for visualising and comparing distributed coastal oceanography model and in situ data
As part of a large European coastal operational oceanography project (ECOOP), we have developed a web portal for the display and comparison of model and in situ marine data. The distributed model and in situ datasets are accessed via an Open Geospatial Consortium Web Map Service (WMS) and Web Feature Service (WFS) respectively. These services were developed independently and readily integrated for the purposes of the ECOOP project, illustrating the ease of interoperability resulting from adherence to international standards. The key feature of the portal is the ability to display co-plotted timeseries of the in situ and model data and to quantify the misfits between the two. By using standards-based web technology we allow the user to quickly and easily explore over twenty model data feeds and compare these with dozens of in situ data feeds without being concerned with the low-level details of differing file formats or the physical location of the data. Scientific and operational benefits of this work include model validation, quality control of observations, data assimilation and decision support in near real time. In these areas it is essential to be able to bring different data streams together from often disparate locations.
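The interoperability described above rests on WMS being a plain key-value HTTP interface: any client that can build a conformant GetMap URL can display any provider's layers. A minimal sketch of that request construction (the endpoint and layer name are placeholders, not ECOOP's actual feeds):

```python
from urllib.parse import urlencode

# Illustrative: build an OGC WMS 1.3.0 GetMap request URL for one
# layer over a geographic bounding box.

def getmap_url(endpoint, layer, bbox, width=512, height=512):
    """Return a WMS 1.3.0 GetMap URL; bbox is (minx, miny, maxx, maxy)
    in the coordinates of the requested CRS."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": "CRS:84",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return endpoint + "?" + urlencode(params)

url = getmap_url("http://example.org/wms", "sea_surface_temp",
                 (-10.0, 45.0, 5.0, 60.0))
print(url)
```

Because every data provider answers the same parameter set, the portal can overlay layers from independently developed services without format-specific client code.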
Global-Scale Resource Survey and Performance Monitoring of Public OGC Web Map Services
One of the most widely implemented service standards provided by the Open Geospatial Consortium (OGC) to the user community is the Web Map Service (WMS). WMS is widely employed globally, but there is limited knowledge of the global distribution, adoption status or service quality of these online WMS resources. To fill this void, we investigated global WMS resources and performed distributed performance monitoring of these services. This paper explicates a distributed monitoring framework that was used to monitor 46,296 WMSs continuously for over one year, and a crawling method to discover these WMSs. We analyzed server locations, provider types, themes, the spatiotemporal coverage of map layers and the service versions for 41,703 valid WMSs. Furthermore, we appraised the stability and performance of the basic operations (i.e., GetCapabilities and GetMap) for 1,210 selected WMSs. We discuss the major reasons for request errors and performance issues, as well as the relationship between service response times and the spatiotemporal distribution of client monitoring sites. This paper will help service providers, end users and developers of standards to grasp the status of global WMS resources, as well as to understand the adoption status of OGC standards. The conclusions drawn in this paper can benefit geospatial resource discovery and service performance evaluation, and guide service performance improvements.

Comment: 24 pages; 15 figures
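The per-service measurement underlying such a monitoring framework is simple: issue a GetCapabilities request and record whether it succeeded and how long it took. A minimal sketch with an injectable fetch function so the probe logic can be exercised without the network (the endpoint is a placeholder):

```python
import time
from urllib.parse import urlencode

# Illustrative sketch of one monitoring probe: time a single
# GetCapabilities call against a WMS endpoint.

def capabilities_url(endpoint):
    """Build a WMS GetCapabilities request URL."""
    return endpoint + "?" + urlencode(
        {"SERVICE": "WMS", "REQUEST": "GetCapabilities"})

def probe(endpoint, fetch):
    """Run one timed GetCapabilities call; return (ok, seconds).
    `fetch` is any callable taking a URL, e.g. urllib.request.urlopen
    in production, so tests can substitute a stub."""
    start = time.perf_counter()
    try:
        fetch(capabilities_url(endpoint))
        return True, time.perf_counter() - start
    except Exception:
        return False, time.perf_counter() - start

# Example with a stub fetcher standing in for a real HTTP client:
ok, elapsed = probe("http://example.org/wms",
                    lambda url: b"<WMS_Capabilities/>")
print(ok, elapsed >= 0.0)  # -> True True
```

Running such probes from monitoring sites spread across regions is what allows response times to be related to the geographic distribution of clients, as the paper's analysis does.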
