
    Technology Integration around the Geographic Information: A State of the Art

    One of the elements that has popularized and facilitated the use of geographic information in a variety of computational applications is the Web map; this has opened new research challenges on different subjects, from locating places and people, to the study of social behavior, to the analysis of the hidden structure of the terms used in a natural language query for locating a place. However, the use of geographic information under technological features is not new; rather, it has been part of an ongoing process of development and technological integration. This paper presents a state-of-the-art review of the application of geographic information under different approaches: its use in location-based services, collaborative user participation, context-awareness, its use in the Semantic Web, and the challenges of its use in natural language queries. Finally, a prototype that integrates most of these areas is presented.

    Constrained set-up of the tGAP structure for progressive vector data transfer

    A promising approach to transmitting a vector map from a server to a mobile client is to send a coarse representation first, which is then incrementally refined. We consider the problem of defining a sequence of such increments for areas of different land-cover classes in a planar partition. In order to transmit well-generalised datasets, we propose a method of two stages: first, we create a generalised representation from a detailed dataset, using an optimisation approach that satisfies certain cartographic constraints; second, we define a sequence of basic merge and simplification operations that transforms the most detailed dataset gradually into the generalised dataset. The obtained sequence of gradual transformations is stored without geometric redundancy in a structure that builds on the previously developed tGAP (topological Generalised Area Partitioning) structure. This structure and the algorithm for intermediate levels of detail (LoD) have been implemented in an object-relational database and tested on land-cover data from the official German topographic dataset ATKIS, from scale 1:50 000 to the target scale 1:250 000. Results of these tests allow us to conclude that the data at the lowest LoD and at intermediate LoDs are well generalised. By applying specialised heuristics, the optimisation method copes with large datasets; the tGAP structure allows users to efficiently query and retrieve a dataset at a specified LoD. Data are sent progressively from the server to the client: first a coarse representation is sent, which is refined until the requested LoD is reached.
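The refinement idea described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the `MergeStep` record and the region names are hypothetical, and it assumes each generalisation step merged one region into a neighbour, so refinement simply replays the stored steps in reverse (coarse to fine) until the requested LoD is reached.

```python
from dataclasses import dataclass

@dataclass
class MergeStep:
    removed_region: str   # region merged away during generalisation
    into_region: str      # neighbour it was merged into

def progressive_refine(coarse_regions, steps_coarse_to_fine, target_lod):
    """Undo stored merge steps one by one until target_lod steps are applied."""
    regions = set(coarse_regions)
    for step in steps_coarse_to_fine[:target_lod]:
        # refining: the previously merged region reappears as its own area
        regions.add(step.removed_region)
    return regions

# usage: two refinement steps restore 'lake' and 'meadow' inside 'forest'
steps = [MergeStep("lake", "forest"), MergeStep("meadow", "forest")]
print(sorted(progressive_refine({"forest"}, steps, 2)))
# prints ['forest', 'lake', 'meadow']
```

A real tGAP structure also stores the geometry of each step without redundancy; here only the region bookkeeping is shown.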

    Non-parametric Statistical Approach to Correct Satellite Rainfall Data in Near-real-time for Rain-based Flood Nowcasting

    Floods resulting from intense rainfall are one of the most disastrous hazards in many regions of the world, since they contribute greatly to personal injury and property damage, mainly as a result of their ability to strike with little warning. The possibility of issuing an alert about a flooding situation at least a few hours in advance greatly helps to reduce the damage. Therefore, scores of flood forecasting systems have been developed during the past few years, mainly at the country and regional levels. Flood forecasting systems based only on traditional methods, such as return periods of flooding situations or extreme rainfall events, have failed on most occasions to forecast flooding situations accurately, because of changes to the territory caused by extensive infrastructure development in recent years, the increased frequency of extreme rainfall events over recent decades, etc. Nowadays, flood nowcasting systems or early warning systems that run on real-time precipitation data are becoming more popular, as they give reliable forecasts compared to traditional flood forecasting systems. However, these kinds of systems are often limited to developed countries, as they need well-distributed gauging station networks or sophisticated surface-based radar systems to collect real-time precipitation data. In most developing countries, and in some developed countries as well, precipitation data from the available sparse gauging stations are inadequate for developing the representative areal samples needed by such systems. As satellites are able to provide global coverage with continuous temporal availability, the possibility of using satellite-based rainfall estimates in flood nowcasting systems is currently being actively investigated. To contribute to the world's requirement for flood early warning systems, ITHACA developed a global-scale flood nowcasting system that runs on near-real-time satellite rainfall estimates.
    The system was developed in cooperation with the United Nations World Food Programme (WFP), to support the preparedness phase of humanitarian assistance agencies such as the WFP, mainly in less developed countries. The concept behind this early warning system is to identify critical rainfall events for each hydrological basin on Earth using past rainfall data, and to use them to identify potentially flood-producing rainfall events in real-time rainfall data. The identification of critical rainfall events was carried out through a hydrological analysis using 3B42 rainfall data, the most accurate product of the Tropical Rainfall Measuring Mission (TRMM) Multi-satellite Precipitation Analysis (TMPA) dataset. These critical events are stored in a database, and when a rainfall event is found in real time that matches or exceeds an event in the database, an alert is issued for the basin area. The most accurate product of TMPA (3B42) is derived by applying bias adjustments to real-time rainfall estimates using rain gauge data, and is therefore available to end-users only 10-15 days after each calendar month. The real-time product of TMPA (3B42RT) is released approximately 9 hours after real time but lacks such bias adjustments, as rain gauge data are not available in real time. Therefore, to produce reliable alerts it is very important to reduce the uncertainty of the 3B42RT product before using it in the early warning system. For this purpose, a statistical approach was proposed to perform near-real-time bias adjustments on the real-time product of TMPA (3B42RT). In this approach, the relationship between the bias-adjusted product (3B42) and the real-time product (3B42RT) was analyzed on the basis of drainage basins for the period from January 2003 to December 2007, and correction factors were developed for each basin worldwide to estimate a near-real-time bias-adjusted product from the real-time product (3B42RT).
    The accuracy of the product was analyzed by comparison with gauge rainfall data from Bangladesh, and the uncertainty of the corrected product was found to be lower even than that of the most accurate product of the TMPA dataset (3B42).
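The per-basin correction described above can be sketched in a few lines. This is a hedged illustration only: the abstract does not specify the form of the correction factors, so this sketch assumes a simple multiplicative factor, the ratio of accumulated gauge-adjusted (3B42) to accumulated real-time (3B42RT) rainfall over the historical archive, applied to each new 3B42RT value; the function names and numbers are invented for the example.

```python
def correction_factor(adjusted_hist, realtime_hist):
    """Ratio of accumulated adjusted (3B42) to real-time (3B42RT) rainfall
    over a basin's historical archive (assumed multiplicative model)."""
    total_rt = sum(realtime_hist)
    if total_rt == 0:
        return 1.0  # no rain in the archive: leave estimates unchanged
    return sum(adjusted_hist) / total_rt

def correct_realtime(value_rt, factor):
    """Apply a basin's factor to a new real-time rainfall estimate."""
    return value_rt * factor

# usage: archive shows the real-time product underestimated rain by 20%,
# so a fresh 3B42RT value of 8.0 mm is corrected upward
f = correction_factor([12.0, 6.0, 6.0], [10.0, 5.0, 5.0])
print(correct_realtime(8.0, f))  # prints 9.6
```

In the system described, one such factor would be stored per drainage basin and looked up whenever a new 3B42RT grid arrives.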

    A generic approach to simplification of geodata for mobile applications


    Integration of forest fire management with SDI: user requirements.

    Dissertation submitted in partial fulfilment of the requirements for the Degree of Master of Science in Geospatial Technologies. Forest fire management is not only an emergency task; the preventive task can be even more important, since it is better to avoid the possibility of a forest fire igniting, or to reduce its hazard, than to try to extinguish it later. To implement useful forest fire management in an SDI, it is crucial to know the user requirements: which spatial information users manage, which GIS applications they use in their work, and which alerts are sent and received in the forest fire context. A survey has been conducted to gain a better comprehension of the reality and the user requirements. A review of Spanish and European work on forest fires and emergency management has been carried out to identify the current challenges in emergency management.

    Training of Crisis Mappers and Map Production from Multi-sensor Data: Vernazza Case Study (Cinque Terre National Park, Italy)

    The aim of this paper is to present the development of a multidisciplinary project carried out in cooperation between Politecnico di Torino and ITHACA (Information Technology for Humanitarian Assistance, Cooperation and Action). The goal of the project was to train students attending Architecture and Engineering courses in geospatial data acquisition and processing, in order to start up a team of "volunteer mappers". Indeed, the project aims to document environmental and built heritage subject to disaster; the purpose is to improve the capabilities of the actors involved in geospatial data collection, integration and sharing. The proposed area for testing the training activities is the Cinque Terre National Park, registered in the World Heritage List since 1997. The area was affected by a flood on 25 October 2011. Following other international experiences, the group is expected to be active after emergencies in order to update maps, using data acquired by typical geomatics methods and techniques, such as terrestrial and aerial LiDAR, close-range and aerial photogrammetry, and topographic and GNSS instruments, or by non-conventional systems and instruments such as UAVs and mobile mapping. The ultimate goal is to implement a WebGIS platform to share all the collected data with local authorities and the Civil Protection.

    Mapping and the Citizen Sensor

    Maps are a fundamental resource in a diverse array of applications, ranging from everyday activities such as route planning, through the legal demarcation of space, to scientific studies such as those seeking to understand biodiversity and inform the design of nature reserves for species conservation. For a map to have value, it should provide an accurate and timely representation of the phenomenon depicted, and this can be a challenge in a dynamic world. Fortunately, mapping activities have benefitted greatly from recent advances in geoinformation technologies. Satellite remote sensing, for example, now offers unparalleled data acquisition, and authoritative mapping agencies have developed systems for the routine production of maps in accordance with strict standards. Until recently, much mapping activity was in the exclusive realm of authoritative agencies, but technological development has also allowed the rise of the amateur mapping community. The proliferation of inexpensive, highly mobile and location-aware devices, together with Web 2.0 technology, has fostered the emergence of the citizen as a source of data. Mapping presently benefits from vast amounts of spatial data, as well as from people able to provide observations of geographic phenomena, which can inform map production, revision and evaluation. The great potential of these developments is, however, often limited by concerns spanning issues from the nature of the citizens, through the way data are collected and shared, to the quality and trustworthiness of the data. This book reports on some of the key issues connected with the use of citizen sensors in mapping. It arises from a European Co-operation in Science and Technology (COST) Action, which explored issues linked to topics ranging from citizen motivation and data acquisition to data quality and the use of citizen-derived data in the production of maps that rival, and sometimes surpass, maps arising from authoritative agencies.