
    Value Flow Graph Analysis with SATIrE

    Partial redundancy elimination is a common program optimization that attempts to improve execution time by removing superfluous computations from a program. There are two well-known classes of such techniques: syntactic and semantic methods. While semantic optimization is more powerful, traditional algorithms based on SSA form are complicated, heuristic in nature, and unable to perform certain useful optimizations. The value flow graph is a syntactic program representation modeling semantic equivalences; it allows the combination of simple syntactic partial redundancy elimination with a powerful semantic analysis. This yields an optimization that is computationally optimal and simpler than traditional semantic methods. This talk discusses partial redundancy elimination using the value flow graph. A source-to-source optimizer for C++ was implemented using the SATIrE program analysis and transformation system. Two tools integrated in SATIrE were used in the implementation: ROSE, a framework for arbitrary analyses and source-to-source transformations of C++ programs, and PAG, a tool for generating data flow analyzers from functional specifications.
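
To make the idea concrete, here is a minimal before/after sketch of what partial redundancy elimination does. The toy Python functions are purely illustrative and are not drawn from the SATIrE/ROSE implementation described above:

```python
# Before: 'a + b' is computed on the true path and again at the merge
# point, so the second computation is redundant along that path only --
# a *partial* redundancy.
def before(flag, a, b):
    if flag:
        x = a + b
    else:
        x = 0
    y = a + b          # partially redundant recomputation
    return x + y

# After PRE: the expression is evaluated exactly once on every path and
# reused at the former redundancy.
def after(flag, a, b):
    t = a + b
    x = t if flag else 0
    y = t              # reuse, no recomputation
    return x + y

assert before(True, 2, 3) == after(True, 2, 3)
assert before(False, 2, 3) == after(False, 2, 3)
```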

    Symbol Elimination for Automated Generation of Program Properties

    Automatic understanding of the intended meaning of computer programs is a very hard problem, requiring intelligence and reasoning. In this talk we describe applications of our symbol elimination methods in automated program analysis. Symbol elimination uses first-order theorem proving techniques in conjunction with symbolic computation methods, and derives nontrivial program properties, such as loop invariants and loop bounds, in a fully automatic way. Moreover, symbol elimination can be used as an alternative to interpolation for software verification.
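
As an illustration of the kind of property meant here, consider a textbook summation loop; the invariant and bound below are stated by hand, whereas the methods described in the talk derive such facts automatically:

```python
def sum_below(n):
    i, s = 0, 0
    while i < n:
        # Loop invariant relating the variables: 2*s == i*(i - 1).
        assert 2 * s == i * (i - 1)
        s += i
        i += 1
    # The invariant also holds on exit; moreover the body executes
    # exactly max(n, 0) times -- a loop bound.
    assert 2 * s == i * (i - 1)
    return s

assert sum_below(10) == sum(range(10))
```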

    Solid state, CCD-buried channel, television camera study and design

    An investigation was undertaken of an all-solid-state television camera design that uses a buried-channel charge-coupled device (CCD) as the image sensor. A 380 x 488 element CCD array was utilized to ensure compatibility with 525-line transmission and display monitor equipment. Specific camera design approaches selected for study and analysis included (a) optional clocking modes for either fast (1/60 second) or normal (1/30 second) frame readout, (b) techniques for the elimination or suppression of CCD blemish effects, and (c) automatic light control and video gain control techniques to eliminate or minimize sensor overload due to bright objects in the scene. Preferred approaches were determined and integrated into a design which addresses the program requirements for a deliverable solid state TV camera.
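
For a rough sense of the clocking requirements, the figures in the abstract imply the following element readout rates; this back-of-envelope sketch ignores blanking intervals and transfer overheads that a real CCD clocking design must include:

```python
elements = 380 * 488  # 185,440 sensor elements

# Readout rates for the two optional clocking modes named above.
for name, frame_time in [("fast (1/60 s)", 1 / 60), ("normal (1/30 s)", 1 / 30)]:
    rate_mhz = elements / frame_time / 1e6
    print(f"{name}: ~{rate_mhz:.1f} M elements/s")  # ~11.1 and ~5.6
```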

    Making Rigorous Linear Programming Practical for Program Analysis

    Linear programming is a key technique for the analysis and verification of numerical properties in programs, neural networks, etc. In particular, in program analysis based on abstract interpretation, many numerical abstract domains (such as Template Constraint Matrix and constraint-only polyhedra) are designed on top of linear programming. However, most state-of-the-art linear programming solvers use floating-point arithmetic in their implementations, leading to approximate results that may be unsound. On the other hand, solvers implemented using exact arithmetic are too costly. To address this, this paper focuses on advancing rigorous linear programming techniques based on floating-point arithmetic for building sound and efficient program analyses. In particular, as a supplement to existing techniques, we present a novel rigorous linear programming technique based on Fourier-Motzkin elimination. On this basis, we implement a tool, RlpSolver, combining our technique with existing techniques to improve the effectiveness of rigorous linear programming in analysis and verification. Experimental results show that our technique is complementary to existing techniques, and that their combination (RlpSolver) achieves a better trade-off between cost and precision via heuristic rules.
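
Since Fourier-Motzkin elimination is the core of the proposed technique, a minimal exact-arithmetic sketch of one elimination step may help. Rationals are used here precisely to sidestep the floating-point soundness issue the paper addresses, so this is a generic textbook version, not RlpSolver's rigorous floating-point technique:

```python
from fractions import Fraction
from itertools import product

def fm_eliminate(constraints, k):
    """Eliminate variable k from constraints (coeffs, bound), each
    meaning sum(coeffs[i] * x[i]) <= bound."""
    pos, neg, rest = [], [], []
    for coeffs, b in constraints:
        c = coeffs[k]
        (pos if c > 0 else neg if c < 0 else rest).append((coeffs, b))
    # Pair every upper bound on x_k with every lower bound; the scaled
    # sum cancels x_k and is implied by the original system.
    for (cp, bp), (cn, bn) in product(pos, neg):
        sp, sn = -cn[k], cp[k]            # both strictly positive
        coeffs = [sp * p + sn * n for p, n in zip(cp, cn)]
        assert coeffs[k] == 0
        rest.append((coeffs, sp * bp + sn * bn))
    return rest

# Eliminate y from {x + y <= 4, x - y <= 2}: yields 2x <= 6, i.e. x <= 3.
F = Fraction
print(fm_eliminate([([F(1), F(1)], F(4)), ([F(1), F(-1)], F(2))], k=1))
```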

    Average case analysis of DJ graphs

    Sreedhar et al. [V.C. Sreedhar, G.R. Gao, Y.-F. Lee, A new framework for elimination-based data flow analysis using DJ graphs, ACM Trans. Program. Lang. Syst. 20 (2) (1998) 388–435; V.C. Sreedhar, Efficient program analysis using DJ graphs, PhD thesis, School of Computer Science, McGill University, Montréal, Québec, Canada, 1995] have presented an elimination-based algorithm to solve data flow problems. A thorough analysis of the algorithm shows that the worst-case performance is at least quadratic in the number of nodes of the underlying graph. In contrast, Sreedhar reports a linear time behavior based on some practical applications. In this paper we prove that for goto-free programs, the average case behavior is indeed linear. As a byproduct, our result also applies to the average size of the so-called dominance frontier. A thorough average case analysis based on a graph grammar is performed by studying properties of the J-edges in DJ graphs. It appears that this is the first time that a graph grammar is used to analyze an algorithm. The average linear time of the algorithm is obtained by classic techniques in the analysis of algorithms and data structures, such as singularity analysis of generating functions and transfer lemmas.
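
For readers unfamiliar with DJ graphs, the following sketch shows the objects being counted: the dominator tree contributes D-edges, and the join edges (J-edges) whose properties drive the average case analysis are the remaining CFG edges. The toy CFG and the iterative dominator computation are generic textbook ingredients, not Sreedhar's algorithm itself:

```python
def dominators(cfg, entry):
    """Iterative set-based dominator computation (fine for toy graphs)."""
    nodes = set(cfg)
    preds = {n: [p for p in nodes if n in cfg[p]] for n in nodes}
    dom = {n: set(nodes) for n in nodes}
    dom[entry] = {entry}
    changed = True
    while changed:
        changed = False
        for n in nodes - {entry}:
            new = {n} | set.intersection(*(dom[p] for p in preds[n]))
            if new != dom[n]:
                dom[n], changed = new, True
    return dom

# Diamond-shaped CFG: 0 -> {1, 2}, 1 -> 3, 2 -> 3.
cfg = {0: [1, 2], 1: [3], 2: [3], 3: []}
dom = dominators(cfg, 0)

# J-edges are CFG edges x -> y where x does not strictly dominate y;
# the rest of the structure is the dominator tree (D-edges).
j_edges = [(x, y) for x in cfg for y in cfg[x]
           if not (x in dom[y] and x != y)]
print(j_edges)  # [(1, 3), (2, 3)] -- the join edges into node 3
```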

    Automatic modular abstractions for template numerical constraints

    We propose a method for automatically generating abstract transformers for static analysis by abstract interpretation. The method focuses on linear constraints on programs operating on rational, real, or floating-point variables and containing linear assignments and tests. In addition to loop-free code, the same method also applies to obtaining least fixed points as functions of the precondition, which permits the analysis of loops and recursive functions. Our algorithms are based on new quantifier elimination and symbolic manipulation techniques. Given the specification of an abstract domain and a program block, our method automatically outputs an implementation of the corresponding abstract transformer; it is thus a form of program transformation. Our work is motivated by data-flow synchronous programming languages, used for building control-command embedded systems, but it also applies to imperative and functional programming.
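
A hedged illustration of the quantifier elimination step, using z3's 'qe' tactic as a stand-in for the paper's own QE techniques (the z3-solver package and all variable names here are assumptions for the example): projecting a one-statement transition relation onto the template parameter c and the output variable yields the abstract transformer in closed form.

```python
from z3 import Reals, Exists, And, Tactic  # pip install z3-solver

x, y, x1, c = Reals('x y x1 c')

# Program block: x1 = x + y with an input test 0 <= y <= 2, analyzed
# under the template precondition x <= c.
transition = And(x <= c, y >= 0, y <= 2, x1 == x + y)

# Existentially quantify the concrete variables and eliminate them:
# what remains relates only x1 and c -- the transformer as a function
# of the precondition.
projected = Tactic('qe')(Exists([x, y], transition))
print(projected)  # a formula equivalent to: x1 <= c + 2
```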

    Extending Traditional Static Analysis Techniques to Support Development, Testing and Maintenance of Component-Based Solutions

    Traditional static code analysis encompasses a mature set of techniques for helping understand and optimize programs, such as dead code elimination, program slicing, and partial evaluation (code specialization). It is well understood that compared to other program analysis techniques (e.g., dynamic analysis), static analysis techniques do a reasonable job for the cost associated with implementing them. Industry and government are moving away from more ‘traditional’ development approaches towards component-based approaches as ‘the norm.’ Component-based applications most often comprise a collection of distributed object-oriented components such as forms, code snippets, reports, modules, databases, objects, containers, and the like. These components are glued together by code typically written in a visual language. Some industrial experience shows that component-based development and the subsequent use of visual development environments, while reducing an application's total development time, actually increase certain maintenance problems. This provides a motivation for using automated analysis techniques on such systems. The results of this research show that traditional static analysis techniques may not be sufficient for analyzing component-based systems. We examine closely the characteristics of a component-based system and document many of the issues that we feel make the development, analysis, testing and maintenance of such systems more difficult. By analyzing additional summary information for the components as well as any available source code for an application, we show ways in which traditional static analysis techniques may be augmented, thereby increasing the accuracy of static analysis results and ultimately making the maintenance of component-based systems a manageable task. We develop a technique to use semantic information about component properties obtained from type library and interface definition language files, and demonstrate this technique by extending a traditional unreachable code algorithm. To support more complex analysis, we then develop a technique for component developers to provide summary information about a component. This information can be integrated with several traditional static analysis techniques to analyze component-based systems more precisely. We then demonstrate the effectiveness of these techniques on several real Department of Defense (DoD) COTS component-based systems.
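
A small sketch of the kind of augmentation described: plain call-graph reachability misses handlers that a component invokes through its interface, so summary information (for example, events declared in a type library) contributes extra analysis roots. All names and edges below are invented for illustration:

```python
calls = {                       # call edges recovered from source code
    "main": ["setup"],
    "setup": ["register_grid"],
    "register_grid": [],
    "on_grid_click": ["refresh"],
    "refresh": [],
    "helper_unused": [],
}

# Component summary: the grid control fires this callback at runtime, a
# fact visible only in its type library / IDL, not in the source.
component_callbacks = ["on_grid_click"]

def reachable(roots):
    seen, work = set(), list(roots)
    while work:
        f = work.pop()
        if f not in seen:
            seen.add(f)
            work.extend(calls.get(f, []))
    return seen

print(sorted(set(calls) - reachable(["main"])))
# ['helper_unused', 'on_grid_click', 'refresh'] -- two false positives
print(sorted(set(calls) - reachable(["main"] + component_callbacks)))
# ['helper_unused'] -- only the truly dead code remains flagged
```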

    QFD analysis of RSRM aqueous cleaners

    This paper presents a Quality Function Deployment (QFD) analysis of the final down-selected aqueous cleaners to be used on the Redesigned Solid Rocket Motor (RSRM) program. The new cleaner will replace solvent vapor degreasing: the RSRM Ozone Depleting Compound Elimination program is discontinuing the methyl chloroform vapor degreasing process and replacing it with a spray-in-air aqueous cleaning process. Previously, 15 cleaners were down-selected to two candidates through screening tests involving toxicity, flammability, cleaning efficiency, contaminant solubility, corrosion potential, cost, and bond strength. The two down-selected cleaners were subjected to more intensive testing and evaluated using QFD techniques to assess their suitability for cleaning RSRM case and nozzle surfaces in preparation for adhesive bonding.
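
For readers unfamiliar with QFD, its core mechanics reduce to weighting requirements and scoring each candidate against them. The sketch below mirrors the screening criteria listed above, but every weight and rating is invented for illustration:

```python
criteria = {                   # criterion -> importance weight (1-5)
    "toxicity": 5, "flammability": 4, "cleaning efficiency": 5,
    "corrosion potential": 4, "cost": 3, "bond strength": 5,
}

ratings = {                    # ratings follow the criteria order above
    "cleaner A": [4, 5, 4, 3, 4, 5],
    "cleaner B": [5, 4, 3, 5, 3, 4],
}

weights = list(criteria.values())
for name, r in ratings.items():
    # Weighted roll-up of how well each candidate meets the requirements.
    print(f"{name}: {sum(w * s for w, s in zip(weights, r))}")
```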