TuBound - A Conceptually New Tool for Worst-Case Execution Time Analysis
TuBound is a conceptually new tool for the worst-case execution time (WCET) analysis of programs. A distinctive feature of TuBound is the seamless integration of a WCET analysis component and a compiler in a uniform tool. TuBound enables the programmer to provide hints that improve the precision of the WCET computation on the high-level program source code, while preserving the advantages of using an optimizing compiler and the accuracy of a WCET analysis performed on the low-level machine code. This way, TuBound ideally serves the needs of both the programmer and the WCET analysis by offering each an interface at the abstraction level that is most appropriate and convenient
In this paper we present the system architecture of TuBound, discuss the internal work-flow of the tool, and report on first measurements using benchmarks from Mälardalen University. TuBound also took part in the WCET Tool Challenge 2008
A Domain-Specific Language for Generating Dataflow Analyzers
Dataflow analysis is a well-understood and very powerful technique for analyzing programs as part of the compilation process. Virtually all compilers use some sort of dataflow analysis as part of their optimization phase. However, despite being well-understood theoretically, such analyses are often difficult to code, which makes it hard to experiment quickly with variants. To address this, we developed a domain-specific language, Analyzer Generator (AG), that synthesizes dataflow analysis phases for Microsoft's Phoenix compiler framework. AG hides the fussy details needed to make analyses modular, yet generates code that is as efficient as the hand-coded equivalent. One key construct we introduce allows IR object classes to be extended without recompiling. Experimental results on three analyses show that AG code can be one-tenth the size of the equivalent handwritten C++ code with no loss of performance. It is our hope that AG will make developing new dataflow analyses much easier
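The abstract describes generated dataflow analyses only in general terms. As a minimal illustration of the kind of pass such generators emit (plain Python, not AG or Phoenix code; the CFG and use/def sets are made up for the example), a backward liveness analysis iterated to a fixed point looks like this:

```python
# Illustrative sketch of a classic dataflow analysis: backward liveness.
# live_out[b] = union of live_in over successors of b
# live_in[b]  = use[b] | (live_out[b] - defs[b])

def liveness(blocks, succ, use, defs):
    """Iterate the dataflow equations to a fixed point."""
    live_in = {b: set() for b in blocks}
    live_out = {b: set() for b in blocks}
    changed = True
    while changed:
        changed = False
        for b in blocks:
            out = set().union(*(live_in[s] for s in succ[b])) if succ[b] else set()
            inn = use[b] | (out - defs[b])
            if out != live_out[b] or inn != live_in[b]:
                live_out[b], live_in[b] = out, inn
                changed = True
    return live_in, live_out

# Hypothetical CFG: b0 -> b1, b1 -> b1 (loop), b1 -> b2
blocks = ["b0", "b1", "b2"]
succ = {"b0": ["b1"], "b1": ["b1", "b2"], "b2": []}
use  = {"b0": set(), "b1": {"x"}, "b2": {"y"}}
defs = {"b0": {"x"}, "b1": {"y"}, "b2": set()}

live_in, live_out = liveness(blocks, succ, use, defs)
```

The fussy part a generator like AG hides is exactly this boilerplate: the worklist or round-robin iteration, the change detection, and the plumbing between blocks; only the equations themselves vary per analysis.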
Generating program analyzers
In this work the automatic generation of program analyzers from
concise specifications is presented. It focuses on provably correct
and complex interprocedural analyses for real world sized imperative
programs. Thus, a powerful and flexible specification mechanism
is required, enabling both correctness proofs and efficient
implementations. The generation process relies on the theory of
data flow analysis and on abstract interpretation. The theory of
data flow analysis provides methods to efficiently implement analyses.
Abstract interpretation provides the relation to the semantics
of the programming language. This allows the systematic derivation
of efficient provably correct, and terminating analyses. The
approach has been implemented in the program analyzer generator
PAG. It addresses analyses ranging from "simple" intraprocedural
bit vector frameworks to complex interprocedural alias
analyses. A high level specialized functional language is used as
specification mechanism enabling elegant and concise specifications
even for complex analyses. Additionally, it allows the automatic
selection of efficient implementations for the underlying
abstract datatypes, such as balanced binary trees, binary decision
diagrams, bit vectors, and arrays. For the interprocedural analysis
the functional approach, the call string approach, and a novel
approach especially targeting the precise analysis of loops can
be chosen. In this work the implementation of PAG as well as a
large number of applications of PAG are presented
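PAG's central idea is that an analysis is a concise specification (a lattice with a join, plus transfer functions) from which an efficient solver is generated. The sketch below is hypothetical and not PAG's actual functional specification language; it shows the same separation in Python, with reaching definitions expressed as data handed to a generic forward worklist solver:

```python
# Hypothetical sketch of the specification/solver split behind PAG:
# the analysis author writes only gen/kill sets, a join, and a transfer
# function; the generic solver below is reused unchanged by every analysis.

def solve_forward(blocks, preds, entry_fact, join, transfer):
    """Generic forward worklist solver over a user-supplied lattice spec."""
    fact_in = {b: entry_fact if not preds[b] else frozenset() for b in blocks}
    fact_out = {b: frozenset() for b in blocks}
    work = list(blocks)
    while work:
        b = work.pop(0)
        if preds[b]:
            fact_in[b] = join(fact_out[p] for p in preds[b])
        new_out = transfer(b, fact_in[b])
        if new_out != fact_out[b]:
            fact_out[b] = new_out
            # re-examine the successors of b
            work.extend(s for s in blocks if b in preds[s])
    return fact_in, fact_out

# Specification part (made-up program): reaching definitions via gen/kill.
gen  = {"b0": {"d1"}, "b1": {"d2"}, "b2": set()}
kill = {"b0": {"d2"}, "b1": {"d1"}, "b2": set()}

def rd_transfer(b, facts):
    return frozenset((set(facts) - kill[b]) | gen[b])

preds = {"b0": [], "b1": ["b0", "b1"], "b2": ["b1"]}
fact_in, fact_out = solve_forward(
    ["b0", "b1", "b2"], preds,
    entry_fact=frozenset(),
    join=lambda fs: frozenset().union(*fs),
    transfer=rd_transfer,
)
```

In PAG the analogous specification is written in a specialized functional language, and the generator additionally picks an efficient representation for the lattice elements (bit vectors, balanced trees, BDDs); the frozensets here stand in for that choice.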
Value Flow Graph Analysis with SATIrE
Partial redundancy elimination is a common program optimization that
attempts to improve execution time by removing superfluous computations from
a program. There are two well-known classes of such techniques: syntactic
and semantic methods. While semantic optimization is more powerful,
traditional algorithms based on SSA form are complicated, heuristic in
nature, and unable to perform certain useful optimizations. The value flow
graph is a syntactic program representation modeling semantic equivalences;
it allows the combination of simple syntactic partial redundancy elimination
with a powerful semantic analysis. This yields an optimization that is
computationally optimal and simpler than traditional semantic methods.
This talk discusses partial redundancy elimination using the value flow
graph. A source-to-source optimizer for C++ was implemented using the
SATIrE program analysis and transformation system. Two tools integrated in
SATIrE were used in the implementation: ROSE, a framework for arbitrary
analyses and source-to-source transformations of C++ programs, and PAG, a
tool for generating data flow analyzers from functional specifications
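For intuition about the redundancy that such optimizations remove, the toy sketch below (plain Python, not SATIrE code) handles only the simplest case, fully redundant expressions in straight-line code; real partial redundancy elimination also handles expressions redundant along some but not all control flow paths, and the value flow graph further extends this to semantically equivalent computations:

```python
# Toy redundancy elimination on straight-line three-address code.
# An expression already computed with unchanged operands is reused
# (replaced by a copy) instead of being recomputed.

def eliminate_redundancy(stmts):
    """stmts: list of (target, op, left, right) tuples."""
    available = {}  # (op, left, right) -> temp currently holding the value
    out = []
    for tgt, op, l, r in stmts:
        key = (op, l, r)
        if key in available:
            out.append((tgt, "copy", available[key], None))
            reused = True
        else:
            out.append((tgt, op, l, r))
            reused = False
        # redefining tgt invalidates expressions reading it or cached in it
        available = {k: v for k, v in available.items()
                     if tgt not in (k[1], k[2], v)}
        if not reused and tgt not in (l, r):
            available[key] = tgt
    return out

stmts = [
    ("t1", "+", "a", "b"),
    ("t2", "+", "a", "b"),   # redundant: a+b already in t1
    ("a",  "+", "a", "1"),   # redefines a, killing a+b
    ("t3", "+", "a", "b"),   # not redundant anymore
]
out = eliminate_redundancy(stmts)
```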
Spectral Amplitude and Phase Characterization of Optical Devices by RF scan
Final degree project carried out in collaboration with Politecnico di Torino. Study and evaluate, through both theoretical analysis and simulations with the program VPI, the performance
of the method which uses an RF scan to determine the phase difference and the amplitude difference (in
dB) in order to calculate the dispersion value D
Study to perform preliminary experiments to evaluate particle generation and characterization techniques for zero-gravity cloud physics experiments
Methods of particle generation and characterization with regard to their applicability for experiments requiring cloud condensation nuclei (CCN) of specified properties were investigated. Since aerosol characterization is a prerequisite to assessing performance of particle generation equipment, techniques for characterizing aerosol were evaluated. Aerosol generation is discussed, and atomizer and photolytic generators including preparation of hydrosols (used with atomizers) and the evaluation of a flight version of an atomizer are studied
A research to reduce interior noise in general aviation airplanes. General aviation interior noise study
The construction, calibration, and properties of a facility for measuring sound transmission through aircraft type panels are described along with the theoretical and empirical methods used. Topics discussed include typical noise source, sound transmission path, and acoustic cabin properties and their effect on interior noise. Experimental results show an average sound transmission loss in the mass controlled frequency region comparable to theoretical predictions. The results also verify that transmission losses in the stiffness controlled region directly depend on the fundamental frequency of the panel. Experimental and theoretical results indicate that increases in this frequency, and consequently in transmission loss, can be achieved by applying pressure differentials across the specimen
Analysis of path exclusion at the machine code level
We present a method to find static path exclusions in a
control flow graph in order to refine the WCET analysis.
Using this information, some infeasible paths can be discarded
during the ILP-based longest path analysis which
helps to improve precision. The new analysis works at the
assembly level and uses the Omega library to evaluate Presburger
formulas
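The paper's analysis encodes the longest path as an ILP over machine code and lets exclusion constraints discard infeasible paths. The toy sketch below replaces the ILP with explicit path enumeration on a tiny acyclic CFG with made-up block costs; it only illustrates how a proved exclusion between two blocks tightens the WCET bound:

```python
# Hedged sketch: longest-path WCET bound with path exclusions.
# (The actual analysis is ILP-based; this enumerates paths instead.)

def wcet_bound(succ, cost, entry, exit, exclusions=()):
    """Max cost of an entry->exit path in an acyclic CFG, skipping any
    path containing both blocks of an exclusion pair."""
    best = 0
    stack = [(entry, [entry])]
    while stack:
        node, path = stack.pop()
        if node == exit:
            if not any(a in path and b in path for a, b in exclusions):
                best = max(best, sum(cost[n] for n in path))
            continue
        for s in succ[node]:
            stack.append((s, path + [s]))
    return best

# Two diamonds in sequence: entry -> {then1,else1} -> join -> {then2,else2} -> exit
succ = {"entry": ["then1", "else1"], "then1": ["join"], "else1": ["join"],
        "join": ["then2", "else2"], "then2": ["exit"], "else2": ["exit"],
        "exit": []}
cost = {"entry": 1, "then1": 10, "else1": 2, "join": 1,
        "then2": 8, "else2": 3, "exit": 1}

plain = wcet_bound(succ, cost, "entry", "exit")
# Suppose the analysis proves then1 and then2 never execute together:
refined = wcet_bound(succ, cost, "entry", "exit",
                     exclusions=[("then1", "then2")])
```

In the ILP setting the same refinement is a linear constraint (the execution counts of the two mutually exclusive blocks sum to at most one per iteration), which rules out the 21-cycle path and lowers the bound to 16 here.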