59,740 research outputs found

    Identification-based Diagnosis of Rainfall–Stream Flow Data: the Tinderry Catchment

    System identification tools, such as transfer function (TF) model structure identification, recursive estimation, time-varying parameter (TVP) estimation and assessment of data information, are used to evaluate the quality of rainfall–stream flow data from the Tinderry catchment (ACT, Australia) and the time-varying behaviour of the rainfall–stream flow dynamics. Given the wide range and the abrupt changes of the single-input single-output transfer functions describing different periods or events in the catchment, we conclude that further investigation is needed of (i) local rainfall effects, (ii) time-varying time delays (travelling time), (iii) time-varying residence times related to the base flow and (iv) the occurrence of negative residues. Periods with high and low data information content, for further use in effective parameter estimation procedures, are clearly indicated by the analysis.
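    As a concrete illustration of the recursive and time-varying parameter (TVP) estimation mentioned above, the sketch below runs recursive least squares with a forgetting factor on a first-order single-input single-output transfer function. It is a minimal sketch, not the paper's procedure: the data, parameter values, and time delay are synthetic assumptions.

```python
# A minimal sketch of recursive least squares (RLS) with a forgetting
# factor, estimating a first-order SISO transfer function model
#   y[k] = -a*y[k-1] + b*u[k-d].
# All data, parameter values, and the delay d are synthetic assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 2                       # sample count and assumed pure time delay
u = rng.random(n)                   # synthetic "rainfall" input
a_true, b_true = -0.85, 0.4         # synthetic "catchment" dynamics
y = np.zeros(n)
for k in range(d, n):
    y[k] = -a_true * y[k - 1] + b_true * u[k - d] + 0.01 * rng.standard_normal()

theta = np.zeros(2)                 # running estimates of [a, b]
P = np.eye(2) * 1e3                 # parameter covariance
lam = 0.98                          # forgetting factor: lets estimates track time variation
for k in range(d, n):
    phi = np.array([-y[k - 1], u[k - d]])     # regressor vector
    g = P @ phi / (lam + phi @ P @ phi)       # gain
    theta += g * (y[k] - phi @ theta)         # innovation update
    P = (P - np.outer(g, phi @ P)) / lam      # covariance update

print("estimated (a, b):", theta)   # should approach (-0.85, 0.4)
```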

    Generating analyzers with PAG

    To produce high-quality code, modern compilers use global optimization algorithms based on abstract interpretation. These algorithms are rather complex; their implementation is therefore a non-trivial and error-prone task. However, since they are based on a common theory, they share large similar parts. We conclude that analyzer writing should be replaced with analyzer generation. We present the tool PAG, which has a high-level functional input language to specify data flow analyses. It offers the specification of even recursive data structures and is therefore not limited to bit vector problems. PAG generates efficient analyzers which can be easily integrated into existing compilers. The analyzers are interprocedural; they can handle recursive procedures with local variables and higher order functions. PAG has successfully been tested by generating several analyzers (e.g. alias analysis, constant propagation, interval analysis) for an industrial-quality ANSI-C and Fortran90 compiler. This technical report consists of two parts; the first introduces the generation system and the second evaluates generated analyzers with respect to their space and time consumption. Keywords: data flow analysis, specification and generation of analyzers, lattice specification, abstract syntax specification, interprocedural analysis, compiler construction.
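    To make the analyzer-generation idea concrete, here is a minimal sketch of a generic worklist solver parameterized by a lattice (bottom and join) and per-node transfer functions, so that a particular analysis is just a specification handed to a shared engine. The CFG encoding and the one-variable constant-propagation lattice are illustrative assumptions; PAG's actual input language and generated analyzers are far richer.

```python
# A minimal sketch of the idea behind analyzer generation: a generic
# worklist solver parameterized by a lattice (bottom, join) and per-node
# transfer functions, so each concrete analysis is just a specification.
# The CFG, lattice, and transfer functions below are illustrative only.

def solve(succ, transfer, bottom, join, entry, init):
    """Forward data flow analysis over a CFG given as a successor map.
    value[n] holds the incoming data flow value at node n."""
    value = {n: bottom for n in succ}
    value[entry] = init
    work = list(succ)
    while work:
        n = work.pop()
        out = transfer[n](value[n])
        for m in succ[n]:
            joined = join(value[m], out)
            if joined != value[m]:
                value[m] = joined
                work.append(m)
    return value

# Example specification: constant propagation for one variable x over a
# diamond-shaped CFG. Lattice: None = bottom, "T" = top, int = constant.
def join(a, b):
    if a is None: return b
    if b is None: return a
    return a if a == b else "T"

succ = {0: [1, 2], 1: [3], 2: [3], 3: []}
transfer = {0: lambda v: v,          # entry: x undefined (top)
            1: lambda v: 5,          # x = 5
            2: lambda v: 5,          # x = 5 on the other branch too
            3: lambda v: v}          # join point
print(solve(succ, transfer, None, join, entry=0, init="T"))
# node 3 receives the constant 5 from both branches
```

    The point mirrors the report's conclusion: the solver is written once, and each new analysis reduces to a lattice and transfer-function specification.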

    Generalized Points-to Graphs: A New Abstraction of Memory in the Presence of Pointers

    Flow- and context-sensitive points-to analysis is difficult to scale; for top-down approaches, the problem centers on repeated analysis of the same procedure; for bottom-up approaches, the abstractions used to represent procedure summaries have not scaled while preserving precision. We propose a novel abstraction called the Generalized Points-to Graph (GPG), which views points-to relations as memory updates and generalizes them using counts of indirection levels, leaving the unknown pointees implicit. This allows us to construct GPGs as compact representations of bottom-up procedure summaries in terms of memory updates and control flow between them. Their compactness is ensured by the following optimizations: strength reduction reduces the indirection levels, redundancy elimination removes redundant memory updates and minimizes control flow (without over-approximating data dependence between memory updates), and call inlining enhances the opportunities of these optimizations. We devise novel operations and data flow analyses for these optimizations. Our quest for scalability of points-to analysis leads to the following insight: the real killer of scalability in program analysis is not the amount of data but the amount of control flow that it may be subjected to in search of precision. The effectiveness of GPGs lies in the fact that they discard as much control flow as possible without losing precision (i.e., by preserving data dependence without over-approximation). This is the reason why GPGs are very small even for main procedures that contain the effect of the entire program. This allows our implementation to scale to 158kLoC for C programs.
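    The sketch below illustrates one plausible reading of the abstract's core encoding: a memory update as an edge between two variables annotated with indirection counts, plus a single composition step in the spirit of the strength reduction optimization. The edge table and composition rule are a simplified reconstruction, not the paper's exact operations.

```python
# A minimal sketch of a "generalized points-to edge": a memory update is an
# edge (src, i) -> (dst, j) whose indirection counts i, j record how many
# dereferences of src are written and how many dereferences of dst are read.
# The encoding and composition rule are a loose reading of the abstract.

from typing import NamedTuple

class GPGEdge(NamedTuple):
    src: str
    i: int      # indirection level on the written side
    dst: str
    j: int      # indirection level on the read side

def edge_for(stmt: str) -> GPGEdge:
    """Map simple pointer statements to generalized points-to edges."""
    table = {
        "x = &y": GPGEdge("x", 1, "y", 0),
        "x = y":  GPGEdge("x", 1, "y", 1),
        "x = *y": GPGEdge("x", 1, "y", 2),
        "*x = y": GPGEdge("x", 2, "y", 1),
    }
    return table[stmt]

def reduce_strength(e1: GPGEdge, e2: GPGEdge):
    """If e2 reads what e1 defines, lower e2's indirection level through e1
    (in the spirit of the strength reduction the abstract mentions)."""
    if e2.dst == e1.src and e2.j >= e1.i:
        # route e2's read through e1's pointee, lowering the indirection count
        return GPGEdge(e2.src, e2.i, e1.dst, e2.j - e1.i + e1.j)
    return None

# x = &y followed by z = x: z ends up pointing directly at y, i.e. z -(1,0)-> y.
print(reduce_strength(edge_for("x = &y"), GPGEdge("z", 1, "x", 1)))
```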

    Underapproximation of Procedure Summaries for Integer Programs

    We show how to underapproximate the procedure summaries of recursive programs over the integers using off-the-shelf analyzers for non-recursive programs. The novelty of our approach is that the non-recursive program we compute may capture unboundedly many behaviors of the original recursive program for which stack usage cannot be bounded. Moreover, we identify a class of recursive programs on which our method terminates and returns the precise summary relations without underapproximation. In doing so, we generalize a similar result for non-recursive programs to the recursive case. Finally, we present experimental results of an implementation of our method applied to a number of examples. Comment: 35 pages, 3 figures (this report supersedes the STTT version, which in turn supersedes the TACAS'13 version).
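    For contrast with the paper's contribution, here is a minimal sketch of the baseline notion of an underapproximate summary: bounded unfolding of a recursive integer procedure, collecting only the input/output pairs provable within the budget. The paper's method is strictly stronger (its non-recursive program can capture unboundedly many behaviors); the procedure f and the input range are invented for illustration.

```python
# A minimal sketch of the *baseline* notion of an underapproximate procedure
# summary: unfold a recursive integer procedure a bounded number of times and
# keep only the input/output pairs reached. This is NOT the paper's method,
# which captures unboundedly many behaviors; it only illustrates what an
# underapproximation of a summary relation means.

def f(n: int, depth: int):
    """Recursive procedure over the integers: f(n) = n if n <= 0,
    else f(n - 2) + 1. Returns None when the unfolding budget runs out."""
    if depth == 0:
        return None                  # behavior not captured at this budget
    if n <= 0:
        return n
    r = f(n - 2, depth - 1)
    return None if r is None else r + 1

def summary(depth: int, lo: int = -3, hi: int = 8):
    """Underapproximate summary relation: the (input, output) pairs
    provable with at most `depth` nested calls."""
    return {(n, out) for n in range(lo, hi + 1)
            if (out := f(n, depth)) is not None}

print(summary(2))   # only a few pairs are captured at a small budget
print(summary(8))   # more behaviors of f appear as the budget grows
```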

    An Algebraic Framework for Compositional Program Analysis

    Full text link
    The purpose of a program analysis is to compute an abstract meaning for a program which approximates its dynamic behaviour. A compositional program analysis accomplishes this task with a divide-and-conquer strategy: the meaning of a program is computed by dividing it into sub-programs, computing their meaning, and then combining the results. Compositional program analyses are desirable because they can yield scalable (and easily parallelizable) program analyses. This paper presents an algebraic framework for designing, implementing, and proving the correctness of compositional program analyses. A program analysis in our framework is defined by an algebraic structure equipped with sequencing, choice, and iteration operations. From the analysis design perspective, a particularly interesting consequence of this is that the meaning of a loop is computed by applying the iteration operator to the loop body. This style of compositional loop analysis can yield interesting ways of computing loop invariants that cannot be defined iteratively. We identify a class of algorithms, the so-called path-expression algorithms [Tarjan1981, Scholz2007], which can be used to efficiently implement analyses in our framework. Lastly, we develop a theory for proving the correctness of an analysis by establishing an approximation relationship between an algebra defining a concrete semantics and an algebra defining an analysis. Comment: 15 pages.
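    A minimal sketch of the algebraic shape described above: sequencing, choice, and iteration operators over one concrete algebra. Relations on a tiny finite state space stand in for the richer domains the paper targets, an illustrative assumption only.

```python
# A minimal sketch of the paper's algebraic shape: an analysis is an algebra
# with sequencing, choice, and iteration, and the meaning of a structured
# program is computed compositionally. Here the algebra is relations over a
# small finite state space (illustrative; the paper targets richer domains).

STATES = range(4)

def seq(r, s):     # sequencing = relational composition
    return {(a, c) for (a, b) in r for (b2, c) in s if b == b2}

def choice(r, s):  # choice = union
    return r | s

def iterate(r):    # iteration = reflexive-transitive closure (Kleene star)
    closure = {(a, a) for a in STATES}
    while True:
        bigger = closure | seq(closure, r)
        if bigger == closure:
            return closure
        closure = bigger

inc = {(a, a + 1) for a in STATES if a + 1 in STATES}   # "x := x + 1"
skip = {(a, a) for a in STATES}                         # "skip"

# meaning of:  while *: x := x + 1;  then nondeterministically skip or inc
loop = iterate(inc)
prog = seq(loop, choice(skip, inc))
print(sorted(prog))   # all pairs (x, x') with x' >= x, as expected
```

    Computing the loop's meaning by a single application of the iteration operator, rather than by an iterative fixpoint over a control flow graph, is exactly the compositional style the abstract highlights.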

    Data-based mechanistic modelling, forecasting, and control.

    This article briefly reviews the main aspects of the generic data-based mechanistic (DBM) approach to modeling stochastic dynamic systems and shows how it is being applied to the analysis, forecasting, and control of environmental and agricultural systems. The advantages of this inductive approach to modeling lie in its wide range of applicability. It can be used to model linear, nonstationary, and nonlinear stochastic systems, and its exploitation of recursive estimation means that the modeling results are useful for both online and offline applications. To demonstrate the practical utility of the various methodological tools that underpin the DBM approach, the article also outlines several typical, practical examples in the area of environmental and agricultural systems analysis, where DBM models have formed the basis for simulation model reduction, control system design, and forecasting.
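    As a small worked example of the kind of low-order, interpretable model the DBM approach tends to identify, the sketch below simulates a first-order discrete-time transfer function and uses it for one-step-ahead forecasting. The parameter values, delay, and data are invented for illustration; they are not from the article.

```python
# A minimal sketch of a low-order, physically interpretable model of the
# kind the DBM approach identifies: a first-order discrete-time transfer
# function used for one-step-ahead forecasting. Model, parameter values,
# and data are made up here for illustration.
import numpy as np

rng = np.random.default_rng(1)
a, b, delay = 0.9, 0.3, 1          # "identified" parameters (illustrative)
u = rng.random(100)                # exogenous input (e.g. rainfall)
y = np.zeros(100)
for k in range(1 + delay, 100):    # "true" system = model + noise
    y[k] = a * y[k - 1] + b * u[k - delay] + 0.02 * rng.standard_normal()

# one-step-ahead forecasts from the identified model
y_hat = np.zeros(100)
for k in range(1 + delay, 100):
    y_hat[k] = a * y[k - 1] + b * u[k - delay]

rmse = np.sqrt(np.mean((y[2:] - y_hat[2:]) ** 2))
print(f"one-step-ahead RMSE: {rmse:.3f}")   # close to the noise level
```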