638 research outputs found

    Investigating the use of semantic technologies in spatial mapping applications

    Get PDF
    Semantic Web technologies are ideally suited to building context-aware information retrieval applications. However, the geospatial aspect of context awareness presents unique challenges, such as the semantic modelling of geographical references for efficient handling of spatial queries, the reconciliation of heterogeneity at the semantic and geo-representation levels, maintaining quality of service and scalability of communication, and the efficient rendering of spatial query results. In this paper, we describe the modelling decisions taken to solve these challenges by analysing our implementation of an intelligent planning and recommendation tool that provides location-aware advice for a specific application domain. This paper contributes to the methodology of integrating heterogeneous geo-referenced data into semantic knowledgebases, and also proposes mechanisms for efficient spatial interrogation of the semantic knowledgebase and for optimising the rendering of dynamically retrieved, context-relevant information on a web frontend.
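The spatial filtering of retrieved, geo-referenced results that such a tool needs can be sketched with a simple great-circle distance check. This is an illustrative Python sketch, not the paper's implementation; the point-of-interest names and coordinates are invented:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two WGS84 points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * asin(sqrt(a))  # mean Earth radius ~6371 km

def nearby(pois, lat, lon, radius_km):
    """Filter geo-referenced results to those within radius_km of the user."""
    return [p for p in pois
            if haversine_km(lat, lon, p["lat"], p["lon"]) <= radius_km]

# Invented points of interest around a user at (51.5074, -0.1278).
pois = [
    {"name": "museum", "lat": 51.5194, "lon": -0.1270},  # ~1.3 km away
    {"name": "castle", "lat": 52.0406, "lon": -0.7594},  # ~70 km away
]
hits = nearby(pois, 51.5074, -0.1278, 5.0)
```

In a real system the distance predicate would typically be pushed into the spatial query itself rather than applied to results client-side.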

    Levels Of Data Interoperability In The Emerging North American Groundwater Data Network

    Full text link
    The Canadian Groundwater Information Network (GIN) and the US National Ground-Water Monitoring Network (NGWMN) connect data from a variety of sources, including states, provinces and federal agencies. Data heterogeneity is a major challenge faced by these networks, one that must be overcome at five distinct levels: systems, syntax, structure, semantics, and pragmatics. This paper discusses the approaches taken at each of the five levels to ensure interoperability between the Canadian and American networks. The result is an emerging North American Groundwater Data Network, which enables users to access data transparently and uniformly on either side of the shared border.
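A pragmatics-level harmonisation step of the kind such networks require can be sketched as a convention normalisation; the field names, conventions and values below are hypothetical and are not taken from GIN or NGWMN:

```python
# Hypothetical sketch: two networks report groundwater levels under
# different conventions (water-level elevation vs. depth to water below
# a reference point), so records must be normalised before comparison.

def to_elevation(record):
    """Normalise a groundwater observation to water-level elevation (m)."""
    if record["convention"] == "elevation":
        return record["value"]
    if record["convention"] == "depth_below_reference":
        return record["reference_elevation"] - record["value"]
    raise ValueError("unknown convention: " + record["convention"])

us_record = {"convention": "elevation", "value": 512.3}
ca_record = {"convention": "depth_below_reference",
             "value": 4.7, "reference_elevation": 517.0}
```

Both records then describe the same water level even though their raw values differ.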

    Viewpoints on emergent semantics

    Get PDF
    Authors include: Philippe Cudré-Mauroux and Karl Aberer (editors), Alia I. Abdelmoty, Tiziana Catarci, Ernesto Damiani, Arantxa Illaramendi, Robert Meersman, Erich J. Neuhold, Christine Parent, Kai-Uwe Sattler, Monica Scannapieco, Stefano Spaccapietra, Peter Spyns, and Guy De Tré. We introduce a novel view on how to deal with the problems of semantic interoperability in distributed systems. This view is based on the concept of emergent semantics, which sees both the representation of semantics and the discovery of the proper interpretation of symbols as the result of a self-organizing process performed by distributed agents that exchange symbols and whose utilities depend on the proper interpretation of those symbols. This is a complex-systems perspective on the problem of dealing with semantics. We highlight some of the distinctive features of our vision and point out preliminary examples of its application.

    Paradigmatic Tendencies in Cartography: A Synthesis of the Scientific-Empirical, Critical and Post-Representational Perspectives

    Get PDF
    Maps have been important elements of visual representation in the development of different societies, and for this reason they have mainly been considered from a practical and utilitarian point of view. This means that cartographers or mapmakers have largely focused on the technical aspects of cartographic products, and cartography has given little attention both to its theoretical component and to its philosophical and epistemological aspects. The current study is dedicated to considering these views. It reviews the main trends, thoughts and different directions in cartography during positivism/empiricism, neo-positivism and post-structuralism, and analyses cartography in the modernist and post-modernist periods. Some of the arguments proposed by philosophers such as Ludwig Wittgenstein and Karl Popper are examined as important contributions to our understanding of the development of cartography and mapping. The study also incorporates the concept of the paradigm, taken from the epistemology of the sciences, which opens a space to analyse cartography in terms of paradigm shifts. In the analysis of each trend within contemporary cartography - from the second half of the twentieth century until today - it is necessary to keep in mind the theoretical scheme of a scientific discipline (object of study, research aims, methods and approaches, and results). This helps to determine the body of knowledge in cartography. It is also important to consider the epistemological context in which these tendencies developed: positivism/empiricism, realism/structuralism and idealism/hermeneutics. In this way, by considering three epistemological levels - essentialist/ontical (scientific), deconstructive (sociological), and ontological (emergent) - some paradigmatic tendencies are postulated.
    The first level results in tendencies such as cartographic communication, cartographic semiotics, analytical cartography and cartographic visualisation, all of which belong to the scientific-empirical perspective. The second level comprises critical cartography, which belongs to the critical perspective and confronts the scientific stances. Finally, in the third level, the so-called post-representational cartography arises in open opposition to traditional representational cartography.

    Geospatial crowdsourced data fitness analysis for spatial data infrastructure based disaster management actions

    Get PDF
    The reporting of disasters has changed from official media reports to citizen reporters who are at the disaster scene. This kind of crowd-based reporting, related to disasters or any other events, is often identified as 'Crowdsourced Data' (CSD). CSD are freely and widely available thanks to current technological advancements, but their quality is often problematic, as they are created by citizens of varying skills and backgrounds. CSD is generally unstructured, and its quality remains poorly defined. Moreover, location information may be missing from CSD, and the quality of any available locations is uncertain. Traditional data quality assessment methods and parameters are often incompatible with the unstructured nature of CSD, due to its undocumented nature and missing metadata. Although other research has identified credibility and relevance as possible CSD quality assessment indicators, the available assessment methods for these indicators are still immature. In the 2011 Australian floods, citizens and disaster management administrators used the Ushahidi Crowdmap platform and the Twitter social media platform to communicate extensive flood-related information, including hazards, evacuations, help services, road closures and property damage. This research designed a CSD quality assessment framework and tested the quality of the 2011 Australian floods' Ushahidi Crowdmap and Twitter data. In particular, it explored location availability and location quality assessment, the semantic extraction of hidden location toponyms, and the analysis of the credibility and relevance of reports. The research was conducted using the Design Science (DS) research method, which is often utilised in Information Science (IS) research. The location availability assessment of the Ushahidi Crowdmap and Twitter data evaluated the quality of available locations by comparing them against three different datasets: Google Maps, OpenStreetMap (OSM) and the Queensland Department of Natural Resources and Mines' (QDNRM) road data. Missing locations were semantically extracted using Natural Language Processing (NLP) and gazetteer lookup techniques. The credibility of the Ushahidi Crowdmap dataset was assessed using a naive Bayesian Network (BN) model commonly utilised in spam email detection. CSD relevance was assessed by adapting Geographic Information Retrieval (GIR) relevance assessment techniques, which are also utilised in the IT sector. Thematic and geographic relevance were assessed using a Term Frequency-Inverse Document Frequency Vector Space Model (TF-IDF VSM) and NLP based on semantic gazetteers. Results of the CSD location comparison showed that the combined use of non-authoritative and authoritative data improved location determination. The semantic location analysis indicated some improvement in the location availability of the tweets and Crowdmap data; however, the quality of the new locations remained uncertain. The credibility analysis revealed that spam email detection approaches are feasible for CSD credibility detection; however, it was critical to train the model in a controlled environment using structured training, including modified training samples. The use of GIR techniques for CSD relevance analysis provided promising results. A separate relevance-ranked list of the same CSD data was prepared through manual analysis, and the two lists generally agreed, indicating the system's potential to analyse relevance in a similar way to humans. This research showed that CSD fitness analysis can potentially improve the accuracy, reliability and currency of CSD and may be utilised to fill information gaps in authoritative sources. The integrated and autonomous CSD qualification framework presented provides a guide for flood disaster first responders and could be adapted to support other forms of emergencies.
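The TF-IDF vector space relevance ranking that the thesis adapts can be sketched in a few lines of Python; the sample reports and query below are invented for illustration and are not the 2011 floods data:

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Build TF-IDF vectors for a small corpus of tokenised documents."""
    n = len(docs)
    df = Counter(t for d in docs for t in set(d))      # document frequency
    idf = {t: math.log(n / df[t]) for t in df}
    return [{t: tf / len(d) * idf[t] for t, tf in Counter(d).items()}
            for d in docs]

def cosine(u, v):
    """Cosine similarity between two sparse term-weight vectors."""
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

reports = [
    "flood water rising near bridge road closed".split(),
    "power outage in suburb after storm".split(),
    "bridge road flooded evacuation advised".split(),
]
query = "flood bridge road".split()
vecs = tfidf_vectors(reports + [query])          # query shares the IDF space
scores = [cosine(vecs[-1], v) for v in vecs[:-1]]
ranked = sorted(range(len(reports)), key=lambda i: -scores[i])
```

Report 0 shares the most weighted terms with the query, so it ranks first, while the off-topic report 1 scores zero.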

    A Two-Level Information Modelling Translation Methodology and Framework to Achieve Semantic Interoperability in Constrained GeoObservational Sensor Systems

    Get PDF
    As geographical observational data capture, storage and sharing technologies, such as in situ remote monitoring systems and spatial data infrastructures, evolve, the vision of a Digital Earth, first articulated by Al Gore in 1998, is getting ever closer. However, there are still many challenges and open research questions. For example, data quality, provenance and heterogeneity remain issues due to the complexity of geo-spatial data and information representation. Observational data are often inadequately semantically enriched by geo-observational information systems or spatial data infrastructures, and so they often do not fully capture the true meaning of the associated datasets. Furthermore, the data models underpinning these information systems are typically too rigid in their data representation to allow for the ever-changing and evolving nature of geo-spatial domain concepts. This impoverished approach to observational data representation reduces the ability of multi-disciplinary practitioners to share information in an interoperable and computable way. The health domain experiences similar challenges in representing complex and evolving domain information concepts. Within any complex domain (such as Earth system science or health), two categories or levels of domain concepts exist: those that remain stable over a long period of time, and those that are prone to change as domain knowledge evolves and new discoveries are made. Health informaticians have developed a sophisticated two-level modelling systems design approach for electronic health documentation over many years and, with the use of archetypes, have shown how data, information and knowledge interoperability among heterogeneous systems can be achieved.
    This research investigates whether two-level modelling can be translated from the health domain to the geo-spatial domain and applied to observing scenarios to achieve semantic interoperability within and between spatial data infrastructures, beyond what is possible with current state-of-the-art approaches. A detailed review of state-of-the-art SDIs, geo-spatial standards and the two-level modelling methodology was performed. A cross-domain translation methodology was developed, and a proof-of-concept geo-spatial two-level modelling framework was defined and implemented. The Open Geospatial Consortium's (OGC) Observations & Measurements (O&M) standard was re-profiled to aid investigation of the two-level information modelling approach. An evaluation of the method was undertaken using two specific use-case scenarios. Information modelling was performed using the two-level modelling method to show how existing historical ocean observing datasets can be expressed semantically and harmonised using two-level modelling. The flexibility of the approach was also investigated by applying the method to an air quality monitoring scenario using a technologically constrained monitoring sensor system. This work has demonstrated that two-level modelling can be translated to the geospatial domain and then further developed for use within a constrained technological sensor system, using traditional wireless sensor networks, semantic web technologies and Internet of Things based technologies. Domain-specific evaluation results show that two-level modelling presents a viable approach to achieving semantic interoperability between constrained geo-observational sensor systems and spatial data infrastructures for ocean observing and city-based air quality observing scenarios. This has been demonstrated through the re-purposing of selected existing geospatial data models and standards. However, it was found that re-using existing standards requires careful ontological analysis per domain concept, and so caution is recommended in assuming the wider applicability of the approach. While the benefits of adopting a two-level information modelling approach to geospatial information modelling are potentially great, translation to a new domain is complex. The complexity of the approach was found to be a barrier to adoption, especially in commercial projects where standards implementation is low on implementation road maps and the perceived benefits of standards adherence are low. Arising from this work, a novel set of base software components, methods and fundamental geo-archetypes has been developed. However, during this work it was not possible to form the required rich community of supporters to fully validate the geo-archetypes; therefore, the findings of this work are not exhaustive, and the archetype models produced are only indicative. The findings can be used as a basis to encourage further investigation and uptake of two-level modelling within the Earth system science and geo-spatial domains. Ultimately, the outcomes of this work recommend further development and evaluation of the approach, building on the positive results thus far and the base software artefacts developed to support it.
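The two-level idea, a small stable reference model (level 1) constrained by exchangeable archetypes (level 2), can be sketched roughly as follows; all names and constraint values are hypothetical and are not the thesis's geo-archetypes:

```python
# Hypothetical two-level modelling sketch: the reference model stays
# fixed while archetype constraints can evolve independently of it.

REFERENCE_FIELDS = {"feature", "property", "value", "unit"}  # level 1: stable

ARCHETYPES = {  # level 2: volatile, domain-specific constraints
    "sea_surface_temperature": {"property": "temperature", "unit": "Cel",
                                "range": (-2.0, 40.0)},
    "urban_no2": {"property": "NO2 concentration", "unit": "ug/m3",
                  "range": (0.0, 1000.0)},
}

def validate(observation, archetype_id):
    """Check a generic observation against an archetype's constraints."""
    arch = ARCHETYPES[archetype_id]
    if set(observation) != REFERENCE_FIELDS:
        return False                      # violates the reference model
    lo, hi = arch["range"]
    return (observation["property"] == arch["property"]
            and observation["unit"] == arch["unit"]
            and lo <= observation["value"] <= hi)

obs = {"feature": "buoy-42", "property": "temperature",
       "value": 18.5, "unit": "Cel"}
```

New observing scenarios are then supported by adding archetype entries, without touching the reference model or the software that processes it.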

    A Language-centered Approach to support environmental modeling with Cellular Automata

    Get PDF
    The application of methods and technologies of software engineering to environmental modeling and simulation (EMS) is common, since both areas share basic issues of software development and digital simulation. Recent developments within the context of "Model-driven Engineering" (MDE) aim at supporting the development of software systems on the basis of models that are relatively abstract compared with programming language code. A basic ingredient of MDE is the development of methods that allow the efficient development of "domain-specific languages" (DSL), in particular on the basis of language metamodels. This thesis shows how MDE, and language metamodeling in particular, may support pragmatic aspects that reflect epistemic and cognitive aspects of scientific investigations. For this, DSLs and language metamodeling are set into the context of "model-based science" and "model-based reasoning". It is shown that the specific properties of metamodel-based DSLs may be used to support those properties, in particular transparency, which are of particular relevance against the background of uncertainty, a characterizing property of EMS. These findings form the basis for the formulation of a corresponding metamodel-based approach to the provision of modeling tools for EMS (Language-centered Approach, LCA), which has been implemented as the modeling tool ECA-EMS, including a new DSL for cellular automata (CA) modeling for EMS (ECAL). On the basis of this implementation, the applicability of the approach is shown, and use cases demonstrate the usability of ECAL and the corresponding metamodel-based tool implementation.
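The cellular-automaton core that a DSL such as ECAL describes can be sketched with a transition rule supplied as a plain function standing in for a rule written in the DSL; the grid, rule and names here are illustrative only, not ECAL itself:

```python
# Minimal cellular-automaton sketch for environmental modeling: a
# generic stepping engine plus a pluggable, toy spread rule.

def step(grid, rule):
    """Apply a Moore-neighbourhood rule to every cell of a 2D torus grid."""
    rows, cols = len(grid), len(grid[0])
    def neighbours(r, c):
        return [grid[(r + dr) % rows][(c + dc) % cols]
                for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                if (dr, dc) != (0, 0)]
    return [[rule(grid[r][c], neighbours(r, c)) for c in range(cols)]
            for r in range(rows)]

def spread(cell, nbrs):
    """Toy fire/contaminant spread: a cell ignites if any neighbour burns."""
    return 1 if cell == 1 or any(n == 1 for n in nbrs) else 0

grid = [[0, 0, 0],
        [0, 1, 0],
        [0, 0, 0]]
grid = step(grid, spread)  # the single burning cell spreads to all neighbours
```

In a language-centered approach, `spread` would instead be written in the DSL and the engine generated from the language metamodel.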