3,405 research outputs found

    A generic architecture for geographic information systems


    Historical collaborative geocoding

    The latest developments in digitisation have provided large data sets that can increasingly easily be accessed and used. These data sets often contain indirect localisation information, such as historical addresses. Historical geocoding is the process of transforming this indirect localisation information into direct localisation that can be placed on a map, which enables spatial analysis and cross-referencing. Many efficient geocoders exist for current addresses, but they do not deal with the temporal aspect and are based on a strict hierarchy (..., city, street, house number) that is hard or impossible to use with historical data. Indeed, historical data are full of uncertainties (temporal aspect, semantic aspect, spatial precision, confidence in the historical source, ...) that cannot be resolved, as there is no way to go back in time to check. We propose an open source, open data, extensible solution for geocoding that is based on building gazetteers composed of geohistorical objects extracted from historical topographical maps. Once the gazetteers are available, geocoding an historical address is a matter of finding the geohistorical object in the gazetteers that best matches the historical address. The matching criteria are customisable and cover several dimensions (fuzzy semantic, fuzzy temporal, scale, spatial precision, ...). As the goal is to facilitate historical work, we also propose web-based user interfaces that help geocode addresses (individually or in batch mode) and display the results over current or historical topographical maps, so that they can be checked and collaboratively edited. The system is tested on the city of Paris for the 19th and 20th centuries; it shows a high return rate and is fast enough to be used interactively. Comment: working paper.
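    The matching step described above can be pictured as a weighted score combining fuzzy dimensions. The Python sketch below is only an illustration of that idea under assumed names and weights (the GeohistoricalObject class, the scoring functions and the 0.6/0.3/0.1 weighting are all hypothetical); it is not the authors' implementation.

```python
# Illustrative sketch (not the paper's implementation): ranking gazetteer
# entries against a historical address along fuzzy semantic, temporal and
# spatial-precision dimensions. All names and weights are hypothetical.
from dataclasses import dataclass
from difflib import SequenceMatcher

@dataclass
class GeohistoricalObject:
    name: str                    # label extracted from a historical map
    start_year: int              # first year the object is attested
    end_year: int                # last year the object is attested
    spatial_precision_m: float   # estimated positional uncertainty in metres

def semantic_score(query: str, candidate: str) -> float:
    """Fuzzy string similarity in [0, 1] (stand-in for a real fuzzy matcher)."""
    return SequenceMatcher(None, query.lower(), candidate.lower()).ratio()

def temporal_score(query_year: int, obj: GeohistoricalObject, tolerance: int = 10) -> float:
    """1.0 inside the object's lifespan, decaying linearly within `tolerance` years."""
    if obj.start_year <= query_year <= obj.end_year:
        return 1.0
    gap = min(abs(query_year - obj.start_year), abs(query_year - obj.end_year))
    return max(0.0, 1.0 - gap / tolerance)

def precision_score(obj: GeohistoricalObject, worst_case_m: float = 500.0) -> float:
    """Prefer spatially precise sources; clamped to [0, 1]."""
    return max(0.0, 1.0 - obj.spatial_precision_m / worst_case_m)

def match(query: str, query_year: int, gazetteer, weights=(0.6, 0.3, 0.1)):
    """Return gazetteer entries ranked by a weighted multi-criteria score."""
    ws, wt, wp = weights
    scored = [
        (ws * semantic_score(query, o.name)
         + wt * temporal_score(query_year, o)
         + wp * precision_score(o), o)
        for o in gazetteer
    ]
    return sorted(scored, key=lambda pair: pair[0], reverse=True)

# Toy gazetteer and query.
gazetteer = [
    GeohistoricalObject("Rue de la Paix", 1806, 1900, 5.0),
    GeohistoricalObject("Rue Napoleon", 1800, 1814, 20.0),
]
print(match("rue de la paix", 1850, gazetteer)[0])
```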

    Fast retrieval of weather analogues in a multi-petabyte meteorological archive

    The European Centre for Medium-Range Weather Forecasts (ECMWF) manages the largest archive of meteorological data in the world. At the time of writing, it holds around 300 petabytes and grows at a rate of 1 petabyte per week. This archive is now mature, and contains valuable datasets such as several reanalyses, providing a consistent view of the weather over several decades. Weather analogue is the term used by meteorologists to refer to similar weather situations. Looking for analogues in an archive using a brute force approach requires data to be retrieved from tape and then compared to a user-provided weather pattern, using a chosen similarity measure. Such an operation would be very long and costly. In this work, a wavelet-based fingerprinting scheme is proposed to index all weather patterns from the archive, over a selected geographical domain. The system answers search queries by computing the fingerprint of the query pattern and looking for close matches in the index. Searches are fast enough that they are perceived as being instantaneous. A web-based application is provided, allowing users to express their queries interactively in a friendly and straightforward manner by sketching weather patterns directly in their web browser. Matching results are then presented as a series of weather maps, labelled with the date and time at which they occur. The system has been deployed as part of the Copernicus Climate Data Store and allows the retrieval of weather analogues from ERA5, a 40-year hourly reanalysis dataset. Some preliminary results of this work were presented at the International Conference on Computational Science 2018 (Raoult et al., 2018).
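    The fingerprinting idea can be sketched as follows: keep only the coarse wavelet coefficients of each archived field, and compare the query's fingerprint to the index with a simple distance. The sketch below assumes PyWavelets and NumPy; the wavelet, decomposition level and distance metric are illustrative choices, not ECMWF's actual scheme.

```python
# Hedged sketch of a wavelet fingerprint index (not ECMWF's actual code).
import numpy as np
import pywt

def fingerprint(field: np.ndarray, wavelet: str = "haar", level: int = 3) -> np.ndarray:
    """Reduce a 2D weather field to a small vector of coarse wavelet coefficients."""
    coeffs = pywt.wavedec2(field, wavelet, level=level)
    approx = coeffs[0]                 # approximation coefficients at the deepest level
    return approx.ravel().astype(np.float32)

def build_index(archive_fields: dict) -> dict:
    """Precompute fingerprints for every archived field (dates are the keys)."""
    return {date: fingerprint(f) for date, f in archive_fields.items()}

def search(query_field: np.ndarray, index: dict, k: int = 5):
    """Return the k archive dates whose fingerprints are closest to the query's."""
    q = fingerprint(query_field)
    dists = {date: float(np.linalg.norm(fp - q)) for date, fp in index.items()}
    return sorted(dists, key=dists.get)[:k]

# Toy usage with random 64x64 fields standing in for archived weather patterns.
rng = np.random.default_rng(0)
archive = {f"2000-01-{d:02d}T00": rng.normal(size=(64, 64)) for d in range(1, 29)}
index = build_index(archive)
print(search(rng.normal(size=(64, 64)), index, k=3))
```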

    A Data Model and Processing Environment for Ocean-Wide Bathymetric Data Compilations

    The compilation of ocean-wide digital bathymetric models (DBM) requires specific features of the bathymetric data storage and great flexibility of the data processing chain. In this article a solution based upon a spatial relational database management system and a Geographical Information System front end is introduced, which will eventually serve the compilation of a new DBM of the North Atlantic Ocean. As shown in a preliminary case study, the abundance of sounding data, both single beam and multibeam, available in that area to date bears an extremely high potential to derive a DBM with much greater accuracy and resolution than the DBMs commonly used today.
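    As a toy illustration of compiling scattered soundings into a gridded DBM, the sketch below bins (lon, lat, depth) points onto a regular grid and takes the median depth per cell. It stands in for only one conceptual step of such a processing chain; the article's solution relies on a spatial relational database and a GIS front end, not on this NumPy code.

```python
# Hypothetical sketch: median-binning scattered soundings onto a regular grid.
import numpy as np

def grid_soundings(lon, lat, depth, cell_deg=0.01):
    """Return (grid, lon_edges, lat_edges); each cell holds the median depth."""
    lon_edges = np.arange(lon.min(), lon.max() + cell_deg, cell_deg)
    lat_edges = np.arange(lat.min(), lat.max() + cell_deg, cell_deg)
    grid = np.full((lat_edges.size - 1, lon_edges.size - 1), np.nan)
    ix = np.digitize(lon, lon_edges) - 1
    iy = np.digitize(lat, lat_edges) - 1
    for j in range(grid.shape[0]):
        for i in range(grid.shape[1]):
            sel = (ix == i) & (iy == j)
            if sel.any():
                grid[j, i] = np.median(depth[sel])   # median is robust to outlier soundings
    return grid, lon_edges, lat_edges

# Toy usage: 10,000 synthetic soundings in a 1x1 degree box of the North Atlantic.
rng = np.random.default_rng(1)
lon = rng.uniform(-30.0, -29.0, 10_000)
lat = rng.uniform(55.0, 56.0, 10_000)
depth = -3000 + 200 * np.sin(lon * 10) + rng.normal(0, 20, 10_000)
dbm, _, _ = grid_soundings(lon, lat, depth, cell_deg=0.05)
print(dbm.shape, np.nanmean(dbm))
```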

    Bibliometrix Analysis: Inventory Management and Supply Management

    Inventory and supply management is an important aspect of efficient business operations. The two are interrelated and have an impact on achieving competitive advantage and customer satisfaction. In the case study of PT. Berkahjaya Sentosa Technique, Bibliometrix analysis is used to evaluate the company's inventory and supply management performance. The results show that effective inventory and supply management can improve operational efficiency and customer satisfaction. Bibliometrics is an analytical method used to map and analyze the scientific literature in a particular field. In the context of inventory management and distribution control, bibliometrics can provide valuable insight into the development and focus of studies in the field. The purpose of this study is to analyze and summarize the scientific literature related to inventory management and distribution control using bibliometric methods. The method used is descriptive analysis, with analysis tools such as three-field plots and tables of main information. The results show that the scientific literature related to inventory management and distribution control experienced a decline in annual growth over the period 2018-2023. However, the average number of citations per document shows that this literature is still very relevant and important in the context of inventory management and distribution control. This research provides valuable insights for researchers and practitioners in the field of inventory management and distribution control.
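    For readers unfamiliar with the descriptive indicators mentioned above (annual production growth, average citations per document), the short pandas sketch below computes them from a hypothetical export of bibliographic records. The study itself uses the Bibliometrix R package; this Python version is only illustrative.

```python
# Hedged illustration only (the study uses the Bibliometrix R package):
# two basic descriptive bibliometric indicators computed with pandas.
import pandas as pd

# Hypothetical bibliographic export: publication year and citation count per record.
records = pd.DataFrame({
    "year":      [2018, 2018, 2019, 2020, 2020, 2021, 2022, 2023],
    "citations": [45,   30,   22,   18,   12,   9,    4,    1],
})

per_year = records.groupby("year").size()          # annual scientific production
growth = per_year.pct_change() * 100                # annual growth rate, in percent
mean_citations = records["citations"].mean()        # average citations per document

print(per_year)
print(growth.round(1))
print(f"Average citations per document: {mean_citations:.1f}")
```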