71 research outputs found

    A Simple and Scalable Static Analysis for Bound Analysis and Amortized Complexity Analysis

    We present the first scalable bound analysis that achieves amortized complexity analysis. In contrast to earlier work, our bound analysis is not based on general-purpose reasoners such as abstract interpreters, software model checkers, or computer algebra tools. Rather, we derive bounds directly from abstract program models, which we obtain from programs by comparatively simple invariant generation and symbolic execution techniques. As a result, we obtain an analysis that is more predictable and more scalable than earlier approaches. Our experiments demonstrate that our analysis is fast and at the same time able to compute bounds for challenging loops in a large real-world benchmark. Technically, our approach is based on lossy vector addition systems (VASS). Our bound analysis first computes a lexicographic ranking function that proves the termination of a VASS, and then derives a bound from this ranking function. Our methodology achieves amortized analysis based on a new insight into how lexicographic ranking functions can be used for bound analysis.
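    The idea of deriving a bound from a lexicographic ranking function can be illustrated on a toy loop (this is only a hedged sketch of the general intuition, not the paper's algorithm; the loop and the bound formula are chosen for illustration):

```python
def run_loop(x, y, n):
    """Toy VASS-like loop: the inner counter y is refilled whenever x
    decreases.  The lexicographic ranking function (x, y) decreases on
    every transition, which proves termination; summing the refills gives
    the amortized iteration bound x0 * (n + 1) + y0."""
    steps = 0
    while x > 0:
        if y > 0:
            y -= 1       # decreases the minor ranking component
        else:
            x -= 1       # decreases the major component ...
            y = n        # ... and may reset the minor one
        steps += 1
    return steps

# For x0 = 3, y0 = 2, n = 4 the loop runs 13 iterations,
# within the predicted bound 3 * (4 + 1) + 2 = 17.
```

    A purely multiplicative bound would overestimate the y-phase as nested; the amortized view charges each refill of y to the x-decrement that caused it.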

    Relaxed notions of schema mapping equivalence revisited

    Recently, two relaxed notions of equivalence of schema mappings have been introduced, which provide more potential for optimizing schema mappings than logical equivalence: data exchange (DE) equivalence and conjunctive query (CQ) equivalence. In this work, we systematically investigate these notions of equivalence for mappings consisting of s-t tgds and target egds and/or target tgds. We prove that both CQ- and DE-equivalence are undecidable, and so are some important optimization tasks (such as detecting whether some dependency is redundant). However, we also identify an important difference between the two notions of equivalence: CQ-equivalence remains undecidable even if the schema mappings consist of s-t tgds and target dependencies in the form of key dependencies only. In contrast, DE-equivalence is decidable for schema mappings with s-t tgds and target dependencies in the form of functional and inclusion dependencies with the terminating chase property.
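    The role of key dependencies (egds) in the chase can be sketched in a few lines (a hedged toy illustration only; real chase procedures handle arbitrary egds and tgds, and the tuple/null encoding here is an assumption of the sketch):

```python
def chase_key(tuples, key):
    """Toy chase step for a key dependency (an egd): tuples that agree on
    the key positions are merged by equating their remaining values.
    None stands for a labelled null; a clash between two distinct
    constants means the egd is violated and the chase fails."""
    merged = {}
    for t in tuples:
        k = tuple(t[i] for i in key)
        if k not in merged:
            merged[k] = list(t)
        else:
            for i, (a, b) in enumerate(zip(merged[k], t)):
                if a is None:
                    merged[k][i] = b       # instantiate the null
                elif b is not None and a != b:
                    return None            # constant clash: chase fails
    return list(merged.values())
```

    For example, chasing [["a", None], ["a", "x"]] with key position 0 resolves the null to "x", while [["a", "x"], ["a", "y"]] fails.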

    Object Reconstruction In A Bundle Block Environment

    One of the main fields of current research in photogrammetry is concerned with the reconstruction of object surfaces from digital images. The great number of object classes which can be reconstructed by photogrammetric plotting, as well as the great number of possible plotting configurations, suggests investigating common strategies for reconstruction and common structures for object models. In the course of the Austrian Research Program on Digital Image Processing, a framework for object reconstruction is being developed which should work under quite general circumstances. The central part of the framework is the integration of a bundle block adjustment system for consistent modelling of the object surfaces and hybrid robust estimation of model parameters for verification of correspondence hypotheses. These hypotheses will be generated by a task-dependent feature-based matching algorithm. Approximate values will be improved by hierarchical methods (image pyramids) starting from …

    Modern Texture Mapping in Computer Graphics

    Figure 1: Rendering of virtual surface detail (left and center, [Policarpo et al. 2005]) and shadow mapping (right, from the PC game 'Crysis'). The original texture mapping algorithm was introduced to add detail, color, and surface texture to computer-generated graphics. Because texture mapping by itself cannot simulate the changes in lighting, shadowing, and perspective caused by finer surface detail, a number of improvements such as bump mapping, parallax mapping, and relief mapping have been introduced. Another common use of textures is shadow mapping. This paper not only describes those modern-day additions to basic texture mapping but focuses on the latest refinements of those techniques, which finally give them the quality and performance required for real-world applications.
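    The core of classic parallax mapping is a single texture-coordinate shift, which can be sketched outside a shader (a hedged illustration of the standard technique; the `scale` value and tangent-space conventions are assumptions, and production code would run this per fragment in GLSL/HLSL):

```python
def parallax_offset(uv, view_dir, height, scale=0.05):
    """Classic parallax mapping: shift the texture coordinate along the
    tangent-plane projection of the view direction, proportionally to the
    sampled height.  view_dir is a unit vector in tangent space with z
    pointing away from the surface."""
    vx, vy, vz = view_dir
    du = vx / vz * height * scale
    dv = vy / vz * height * scale
    return (uv[0] - du, uv[1] - dv)

# Viewed head-on (view_dir = (0, 0, 1)) the offset vanishes; at grazing
# angles vz shrinks and the shift grows, which is exactly the perspective
# perturbation the basic algorithm cannot produce.
```

    Relief mapping refines this by ray-marching the height field instead of taking a single sample, trading performance for accuracy at steep angles.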

    Towards EPC Semantics based on State and Context
    Jan Mendling, Wil van der Aalst

    The semantics of the OR-join have been discussed for some time, in the context of EPCs but also in the context of other business process modeling languages like YAWL. In this paper, we show that the existing solutions are not satisfactory from the intuition of the modeler. Furthermore, we present a novel approach towards the definition of EPC semantics based on state and context. The approach uses two types of annotations for arcs. As in some of the other approaches, arcs are annotated with positive and negative tokens. Moreover, each arc has a context status denoting whether a positive token may still arrive. Using a four-phase approach, tokens and statuses are propagated, thus yielding a new kind of semantics which overcomes some of the well-known problems related to OR-joins in EPCs.
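    The OR-join intuition that the state-and-context annotations serve can be captured in a minimal predicate (a hedged sketch of the general idea only, not the paper's four-phase propagation; the dictionary keys are hypothetical names):

```python
def or_join_enabled(arcs):
    """OR-join firing intuition: each incoming arc carries a token flag
    and a context flag saying whether a token may still arrive.  The join
    fires once at least one token is present and no token-free arc is
    still waiting for one."""
    has_token = any(a["token"] for a in arcs)
    none_pending = all(a["token"] or not a["may_arrive"] for a in arcs)
    return has_token and none_pending
```

    Without the context flag, the join cannot distinguish an arc that will never deliver a token from one that merely has not delivered it yet, which is the source of the well-known OR-join problems.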

    What is Descriptive Geometry for?

    This is a pleading for Descriptive Geometry. From its very beginnings, Descriptive Geometry has been a method for studying 3D geometry through 2D images, thus offering insight into the structure and metrical properties of spatial objects, processes, and principles. Education in Descriptive Geometry trains the students' intellectual capability of space perception. Drawings are the guide to geometry, but not the main aim.

    CABRISS Project Poster
