81 research outputs found

    Type-safe two-level data transformation

    A two-level data transformation consists of a type-level transformation of a data format coupled with value-level transformations of data instances corresponding to that format. Examples of two-level data transformations include XML schema evolution coupled with document migration, and data mappings used for interoperability and persistence. We provide a formal treatment of two-level data transformations that is type-safe in the sense that the well-formedness of the value-level transformations with respect to the type-level transformation is guarded by a strong type system. We rely on various techniques for generic functional programming to implement the formalization in Haskell. The formalization addresses various two-level transformation scenarios, covering fully automated as well as user-driven transformations, and allowing transformations that are information-preserving or not. In each case, two-level transformations are disciplined by one-step transformation rules and type-level transformations induce value-level transformations. We demonstrate an example hierarchical-relational mapping and subsequent migration of relational data induced by hierarchical format evolution. Fundação para a Ciência e a Tecnologia (FCT)
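
    To make the idea concrete, here is a minimal Haskell sketch in the spirit of the abstract, not the paper's actual library (Format, View and flattenPairs are illustrative names): the data format is reified at the type level, and a one-step rule returns the new format together with the value-level conversions it induces.

        {-# LANGUAGE GADTs, ExistentialQuantification #-}

        -- Data formats reified as a GADT, so migrations can be type-checked.
        data Format a where
          FInt    :: Format Int
          FString :: Format String
          FList   :: Format a -> Format [a]
          FPair   :: Format a -> Format b -> Format (a, b)

        -- Result of one transformation step: some new format b, together
        -- with the induced value-level conversions in both directions.
        data View a = forall b. View (Format b) (a -> b) (b -> a)

        -- Example rule: turn a list of pairs into a pair of lists, coupling
        -- the type-level change with the data migration (unzip / zip).
        flattenPairs :: Format [(k, v)] -> View [(k, v)]
        flattenPairs (FList (FPair fk fv)) =
          View (FPair (FList fk) (FList fv)) unzip (uncurry zip)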

    Generic Model Refactorings

    Many modeling languages share some common concepts and principles. For example, Java, MOF, and UML share some aspects of the concepts of classes, methods, attributes, and inheritance. However, model transformations such as refactorings specified for a given language cannot be readily reused for another language because their related metamodels may be structurally different. Our aim is to enable a flexible reuse of model transformations across various metamodels. Thus, in this paper, we present an approach allowing the specification of generic model transformations, in particular refactorings, so that they can be applied to different metamodels. Our approach relies on two mechanisms: (1) an adaptation based mainly on the weaving of aspects; (2) the notion of model typing, an extension of object typing in the model-oriented context. We validated our approach by performing some experiments that consisted of specifying three well-known refactorings (Encapsulate Field, Move Method, and Pull Up Method) and applying each of them onto three different metamodels (Java, MOF, and UML).
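
    As a rough Haskell analogy rather than the paper's Kermeta setting (ClassModel and encapsulateField are hypothetical names, and a type class stands in here for the paper's notion of model typing): the refactoring is written once against a structural interface, and any metamodel that conforms to it can reuse the refactoring.

        -- Hypothetical "model type": the structural interface a metamodel
        -- must expose for class-like elements.
        class ClassModel m where
          fields    :: m -> [String]
          methods   :: m -> [String]
          addMethod :: String -> m -> m

        -- A refactoring written once against that interface, reusable for
        -- any metamodel (Java, MOF, UML, ...) with a conforming instance.
        encapsulateField :: ClassModel m => String -> m -> m
        encapsulateField f model
          | f `elem` fields model =
              addMethod ("get" ++ f) (addMethod ("set" ++ f) model)
          | otherwise = model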

    "Last-Mile" preparation for a potential disaster - Interdisciplinary approach towards tsunami early warning and an evacuation information system for the coastal city of Padang, Indonesia

    Extreme natural events such as tsunamis or earthquakes regularly lead to catastrophes with dramatic consequences. In recent years natural disasters caused hundreds of thousands of deaths, destruction of infrastructure, disruption of economic activity and loss of billions of dollars worth of property, and thus revealed considerable deficits hindering their effective management: stakeholders, decision-makers and affected persons need systematic risk identification and evaluation, ways to assess countermeasures, awareness raising, and decision support systems to be employed before, during and after crisis situations. The overall goal of this study is the interdisciplinary integration of various scientific disciplines to contribute to a tsunami early warning information system. In comparison to most studies, our focus is on high-end geometric and thematic analysis to meet the requirements of small-scale, heterogeneous and complex coastal urban systems. Data, methods and results from engineering, remote sensing and social sciences are interlinked and provide comprehensive information for disaster risk assessment, management and reduction. In detail, we combine inundation modeling, urban morphology analysis, population assessment, socio-economic analysis of the population and evacuation modeling. The interdisciplinary results eventually lead to recommendations for mitigation strategies in the fields of spatial planning or coping capacity. DFG/03G0666A-

    Towards system optimum: Finding optimal routing strategies in time dependent networks for large-scale evacuation problems

    Disaster and evacuation planning crucially depend on good routing strategies. This article compares two different routing strategies in a multi-agent simulation of a large real-world evacuation scenario. The first approach approximates a Nash equilibrium, where every evacuee adopts an individually optimal routing strategy regardless of what this solution imposes on others. The second approach approximately minimizes the total travel time in the system, which requires enforcing cooperative behavior among the evacuees. Both approaches are analyzed in terms of the global evacuation dynamics and on a detailed geographic level.
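
    The gap between the two strategies can be illustrated with the classic two-route example in the style of Pigou's routing problem; the numbers below are textbook values, not results from the paper's scenario.

        -- Two routes and a total demand of 1: route A's travel time equals
        -- its load x, route B always costs 1. Total travel time in the system:
        totalTime :: Double -> Double
        totalTime x = x * x + (1 - x) * 1

        -- Nash equilibrium: no individual gains by leaving route A, so x = 1
        -- and the total is 1.0. The system optimum splits the demand at
        -- x = 1/2 and lowers the total to 0.75.
        main :: IO ()
        main = mapM_ (print . totalTime) [1.0, 0.5]   -- prints 1.0 then 0.75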

    Transforming data by calculation

    This paper addresses the foundations of data-model transformation. A catalog of data mappings is presented which includes abstraction and representation relations and associated constraints. These are justified in an algebraic style via the pointfree transform, a technique whereby predicates are lifted to binary relation terms (of the algebra of programming) in a two-level style encompassing both data and operations. This approach to data calculation, which also includes transformation of recursive data models into “flat” database schemes, is offered as an alternative to standard database design from abstract models. The calculus is also used to establish a link between the proposed transformational style and bidirectional lenses developed in the context of the classical view-update problem. Fundação para a Ciência e a Tecnologia (FCT)
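
    The lens connection mentioned at the end can be sketched with standard asymmetric lenses; this is textbook material rather than the paper's relational calculus.

        -- 'get' projects a view from a source; 'put' pushes an updated view
        -- back. Well-behavedness is the usual pair of round-trip laws:
        --   get (put s v) == v   (PutGet)      put s (get s) == s   (GetPut)
        data Lens s v = Lens { get :: s -> v, put :: s -> v -> s }

        -- Example: viewing the first component of a pair.
        fstLens :: Lens (a, b) a
        fstLens = Lens { get = fst, put = \(_, b) a -> (a, b) }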

    Reducing the Cost of Grammar-Based Testing Using Pattern Coverage

    In grammar-based testing, context-free grammars may be used to generate relevant test inputs for language processors, or meta programs, such as programming language compilers, refactoring tools, and implementations of software quality metrics. This technique can be used to test these meta programs, but the number of sentences, and syntax trees thereof, that needs to be generated to obtain reasonable coverage of the input language is exponential. Pattern matching is a programming language feature often used when writing meta programs. Pattern matching helps because it automates the frequently occurring task of detecting shapes in, and extracting information from, syntax trees. However, meta programs which contain many patterns are difficult to test using only randomly generated sentences from grammar rules. The reason is that, statistically, it is uncommon to directly generate sentences which accidentally match the patterns in the code. To solve this problem, in this paper we extract information from the patterns in the code of meta programs to guide the sentence generation process. We introduce a new coverage criterion, called Pattern Coverage, which focuses on providing a test strategy that reduces the number of necessary test cases while covering the relevant parts of the meta program. An initial experimental evaluation is presented and the results are compared with traditional grammar-based testing.
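
    A small Haskell sketch of the problem the coverage criterion targets (the AST, simplifier and seed inputs below are illustrative, not the paper's subject programs): randomly generated sentences rarely match the specific shapes a meta program scrutinises, so inputs derived from the patterns exercise those cases directly.

        data Expr = Num Int | Add Expr Expr | Mul Expr Expr
          deriving Show

        -- The meta program under test: a simplifier built from pattern matches.
        simplify :: Expr -> Expr
        simplify (Add (Num 0) e) = simplify e
        simplify (Mul (Num 1) e) = simplify e
        simplify (Mul (Num 0) _) = Num 0
        simplify (Add a b)       = Add (simplify a) (simplify b)
        simplify (Mul a b)       = Mul (simplify a) (simplify b)
        simplify e               = e

        -- Inputs instantiated from the patterns themselves: each one hits a
        -- match case that random grammar-based generation would rarely reach.
        patternSeeds :: [Expr]
        patternSeeds = [Add (Num 0) (Num 7), Mul (Num 1) (Num 7), Mul (Num 0) (Num 7)]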