Set-Based Pre-Processing for Points-To Analysis
We present set-based pre-analysis: a virtually universal optimization technique for flow-insensitive points-to analysis. Points-to analysis computes a static abstraction of how object values flow through a program’s variables. Set-based pre-analysis relies on the observation that much of this reasoning can take place at the set level rather than the value level. Computing constraints at the set level results in significant optimization opportunities: we can rewrite the input program into a simplified form with the same essential points-to properties. This rewrite results in removing both local variables and instructions, thus simplifying the subsequent value-based points-to computation. Effectively, set-based pre-analysis puts the program in a normal form optimized for points-to analysis.
Compared to other techniques for off-line optimization of points-to analyses in the literature, the new elements of our approach are the ability to eliminate statements, and not just variables, as well as its modularity: set-based pre-analysis can be performed on the input just once, e.g., allowing the pre-optimization of libraries that are subsequently reused many times and for different analyses. In experiments with Java programs, set-based pre-analysis eliminates 30% of the program’s local variables and 30% or more of computed context-sensitive points-to facts, over a wide set of benchmarks and analyses, resulting in a 20% average speedup (max: 110%, median: 18%).
Efficient and Effective Handling of Exceptions in Java Points-To Analysis
A joint points-to and exception analysis has been shown to yield benefits in both precision and performance. Treating exceptions as regular objects,
however, incurs significant and rather unexpected overhead. We show that in a
typical joint analysis most of the objects computed to flow in and out of a method
are due to exceptional control-flow and not normal call-return control-flow. For
instance, a context-insensitive analysis of the Antlr benchmark from the DaCapo
suite computes 4-5 times more objects going in or out of a method due to exceptional control-flow than due to normal control-flow. As a consequence, the
analysis spends a large amount of its time considering exceptions.
We show that the problem can be addressed both effectively and elegantly by coarsening the representation of exception objects. An interesting find is that, instead of recording each distinct exception object, we can collapse all exceptions of the same type, and use one representative object per type, to yield nearly identical precision (loss of less than 0.1%) but with a boost in performance of at least 50% for most analyses and benchmarks and large space savings (usually 40% or more).
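The coarsening idea can be sketched in a few lines: instead of one abstract object per throw site, the analysis keeps one representative per exception type. The encoding below is invented for illustration and is not the paper's code.

```python
# Illustrative sketch of coarsening exception objects by type. In the
# precise view each throw site yields a distinct abstract object; in the
# coarse view all sites of one type share a single representative.
def coarsen(throw_sites):
    """throw_sites: list of (site_id, exception_type) pairs."""
    precise = {site: (site, typ) for site, typ in throw_sites}
    coarse = {site: ("rep", typ) for site, typ in throw_sites}
    return precise, coarse

sites = [("s1", "IOException"),
         ("s2", "IOException"),
         ("s3", "NullPointerException")]
precise, coarse = coarsen(sites)
print(len(set(precise.values())))  # 3 -- one abstract object per site
print(len(set(coarse.values())))   # 2 -- one representative per type
```

Shrinking the object domain this way directly shrinks the points-to sets that flow through exceptional control-flow edges, which is where the paper observes most of the cost.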
Estimating Fire Weather Indices via Semantic Reasoning over Wireless Sensor Network Data Streams
Wildfires are frequent, devastating events in Australia that regularly cause
significant loss of life and widespread property damage. Fire weather indices
are a widely-adopted method for measuring fire danger and they play a
significant role in issuing bushfire warnings and in anticipating demand for
bushfire management resources. Existing systems that calculate fire weather
indices are limited due to low spatial and temporal resolution. Localized
wireless sensor networks, on the other hand, gather continuous sensor data
measuring variables such as air temperature, relative humidity, rainfall and
wind speed at high resolutions. However, using wireless sensor networks to
estimate fire weather indices is a challenge due to data quality issues, lack
of standard data formats and lack of agreement on thresholds and methods for
calculating fire weather indices. Within the scope of this paper, we propose a
standardized approach to calculating Fire Weather Indices (a.k.a. fire danger
ratings) and overcome a number of the challenges by applying Semantic Web
Technologies to the processing of data streams from a wireless sensor network
deployed in the Springbrook region of South East Queensland. This paper
describes the underlying ontologies, the semantic reasoning and the Semantic
Fire Weather Index (SFWI) system that we have developed to enable domain
experts to specify and adapt rules for calculating Fire Weather Indices. We
also describe the Web-based mapping interface that we have developed, that
enables users to improve their understanding of how fire weather indices vary
over time within a particular region. Finally, we discuss our evaluation results
that indicate that the proposed system outperforms state-of-the-art techniques
in terms of accuracy, precision and query performance.
Comment: 20 pages, 12 figures
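For a concrete sense of what such an index computes from the sensor variables the abstract lists, here is the widely used McArthur Mk 5 Forest Fire Danger Index curve fit, a common Australian fire weather index. This is a standalone illustration; the actual SFWI rules are encoded in the paper's ontologies and may differ in formula and thresholds.

```python
import math

# McArthur Forest Fire Danger Index (Mk 5 curve-fit approximation), an
# illustration of a fire weather index computed from temperature, relative
# humidity, wind speed and a drought factor. Not necessarily the exact
# formulation used by the SFWI system described in the paper.
def ffdi(temp_c, rel_humidity_pct, wind_kmh, drought_factor):
    return 2.0 * math.exp(-0.450
                          + 0.987 * math.log(drought_factor)
                          - 0.0345 * rel_humidity_pct
                          + 0.0338 * temp_c
                          + 0.0234 * wind_kmh)

# A hot, dry, windy day scores far higher than a mild, humid one:
print(round(ffdi(35, 15, 40, 10), 1))
print(round(ffdi(20, 60, 10, 5), 1))
```

The exponential form is why data-quality issues matter: small sensor errors in humidity or wind are amplified into large swings in the danger rating.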
Locality and Singularity for Store-Atomic Memory Models
Robustness is a correctness notion for concurrent programs running under
relaxed consistency models. The task is to check that the relaxed behavior
coincides (up to traces) with sequential consistency (SC). Although
computationally simple on paper (robustness has been shown to be
PSPACE-complete for TSO, PGAS, and Power), building a practical robustness
checker remains a challenge. The problem is that the various relaxations lead
to a dramatic number of computations, only a few of which violate robustness.
In the present paper, we set out to reduce the search space for robustness
checkers. We focus on store-atomic consistency models and establish two
completeness results. The first result, called locality, states that a
non-robust program always contains a violating computation where only one
thread delays commands. The second result, called singularity, is even stronger
but restricted to programs without lightweight fences. It states that there is
a violating computation where a single store is delayed.
As an application of the results, we derive a linear-size source-to-source
translation of robustness to SC-reachability. It applies to general programs,
regardless of the data domain and potentially with an unbounded number of
threads and with unbounded buffers. We have implemented the translation and
verified, for the first time, PGAS algorithms in a fully automated fashion. For
TSO, our analysis outperforms existing tools.
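The singularity result — one delayed store suffices to witness non-robustness — can be illustrated with the classic store-buffering (SB) litmus test. The toy simulation below is only a sketch of the phenomenon, not the paper's source-to-source translation.

```python
# Store-buffering (SB) litmus test, simulated with an explicit store buffer.
# Thread 0: x := 1 ; r0 := y      Thread 1: y := 1 ; r1 := x
# Under SC the outcome r0 == r1 == 0 is impossible; under TSO it becomes
# reachable by delaying a single store in its buffer, matching the
# singularity result stated in the abstract.
def sb(delay_store):
    mem = {"x": 0, "y": 0}
    buf = []                      # thread 0's store buffer
    # thread 0 executes
    if delay_store:
        buf.append(("x", 1))      # store sits in the buffer, invisible
    else:
        mem["x"] = 1              # store hits memory immediately (SC-like)
    r0 = mem["y"]
    # thread 1 executes
    mem["y"] = 1
    r1 = mem["x"]                 # does not see the buffered store
    for var, val in buf:          # buffer finally drains to memory
        mem[var] = val
    return r0, r1

print(sb(delay_store=False))  # (0, 1) -- reachable under SC
print(sb(delay_store=True))   # (0, 0) -- impossible under SC, allowed by TSO
```

A robustness checker built on the paper's results only needs to search executions of this restricted shape, which is what makes the linear-size reduction to SC-reachability possible.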
Probabilistic Model Checking for Energy Analysis in Software Product Lines
In a software product line (SPL), a collection of software products is
defined by their commonalities in terms of features rather than explicitly
specifying all products one-by-one. Several verification techniques were
adapted to establish temporal properties of SPLs. Symbolic and family-based
model checking have been proven to be successful for tackling the combinatorial
blow-up arising when reasoning about several feature combinations. However,
most formal verification approaches for SPLs presented in the literature focus
on static SPLs, where the features of a product are fixed and cannot be
changed at runtime. This is in contrast to dynamic SPLs, which allow the
feature combinations of a product to be adapted dynamically after deployment. The main
contribution of the paper is a compositional modeling framework for dynamic
SPLs, which supports probabilistic and nondeterministic choices and allows for
quantitative analysis. We specify the feature changes during runtime within an
automata-based coordination component, enabling reasoning about strategies for
triggering dynamic feature changes that optimize various quantitative
objectives, e.g., energy or monetary costs and reliability. For our framework
there is a natural and conceptually simple translation into the input language
of the prominent probabilistic model checker PRISM. This facilitates the
application of PRISM's powerful symbolic engine to the operational behavior of
dynamic SPLs and their family-based analysis against various quantitative
queries. We demonstrate the feasibility of our approach by a case study using an
energy-aware bonding network device.
Comment: 14 pages, 11 figures
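The kind of quantitative query described — expected energy cost over runtime feature changes — can be sketched with a tiny Markov chain and value iteration. The states, probabilities and costs below are invented for illustration; they are not the paper's model and this is plain Python, not PRISM's input language.

```python
# Illustrative sketch: a device whose "powersave" feature is toggled at
# runtime, modelled as a small Markov chain, analysed for the expected
# energy consumed until the job finishes. All numbers are made up.
def expected_energy(trans, cost, goal, iters=10_000):
    """trans: state -> [(prob, next_state)]; cost: energy per step."""
    e = {s: 0.0 for s in trans}
    for _ in range(iters):                     # value iteration to fixpoint
        e = {s: 0.0 if s == goal else
                cost[s] + sum(p * e[t] for p, t in trans[s])
             for s in trans}
    return e

trans = {
    "active":    [(0.8, "active"), (0.1, "powersave"), (0.1, "done")],
    "powersave": [(0.5, "powersave"), (0.4, "active"), (0.1, "done")],
    "done":      [(1.0, "done")],
}
cost = {"active": 5.0, "powersave": 1.0, "done": 0.0}
print(round(expected_energy(trans, cost, "done")["active"], 2))  # 43.33
```

A tool like PRISM answers the same query symbolically (e.g. an expected-reward property) and, crucially, does so for whole families of feature combinations at once rather than one chain at a time.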
Symbolic and analytic techniques for resource analysis of Java bytecode
Recent work in resource analysis has translated the idea of amortised resource analysis to imperative languages using a program logic that allows mixing of assertions about heap shapes, in the tradition of separation logic, and assertions about consumable resources. Separately, polyhedral methods have been used to calculate bounds on numbers of iterations in loop-based programs. We are attempting to combine these ideas to deal with Java programs involving both data structures and loops, focusing on the bytecode level rather than on source code.
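The polyhedral side of this combination can be illustrated with the simplest case: a counted loop whose iteration count has a closed-form bound. The example below is invented for illustration and is far simpler than what polyhedral libraries handle.

```python
# Illustrative sketch of a polyhedral-style loop bound: for a loop
# "for (i = lo; i < hi; i += step)" the iteration count is bounded by
# ceil((hi - lo) / step), which an analysis can emit as a symbolic
# resource bound. (Toy example, not the abstract's analysis.)
from math import ceil

def loop_bound(lo, hi, step):
    return max(0, ceil((hi - lo) / step))

def run_loop(lo, hi, step):
    count = 0
    i = lo
    while i < hi:        # concrete execution, to check the bound
        count += 1
        i += step
    return count

for lo, hi, step in [(0, 10, 3), (2, 2, 1), (0, 7, 2)]:
    assert run_loop(lo, hi, step) <= loop_bound(lo, hi, step)
print(loop_bound(0, 10, 3))  # 4
```

The amortised, separation-logic side then accounts for resources tied to heap data structures, which this purely numeric bound cannot see.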
On quantum mean-field models and their quantum annealing
This paper deals with fully-connected mean-field models of quantum spins with
p-body ferromagnetic interactions and a transverse field. For p=2 this
corresponds to the quantum Curie-Weiss model (a special case of the
Lipkin-Meshkov-Glick model) which exhibits a second-order phase transition,
while for p>2 the transition is first order. We provide a refined analytical
description both of the static and of the dynamic properties of these models.
In particular we obtain analytically the exponential rate of decay of the gap
at the first-order transition. We also study the slow annealing from the pure
transverse field to the pure ferromagnet (and vice versa) and discuss the
effect of the first-order transition and of the spinodal limit of metastability
on the residual excitation energy, both for finite and exponentially divergent
annealing times. From the quantum computation perspective, this quantity would
assess the efficiency of the quantum adiabatic procedure as an approximation
algorithm.
Comment: 44 pages, 23 figures
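In the standard convention (a sketch; the paper's normalisation may differ), the fully-connected p-spin model in a transverse field described here is

```latex
H \;=\; -N \left( \frac{1}{N} \sum_{i=1}^{N} \sigma_i^z \right)^{\!p} \;-\; \Gamma \sum_{i=1}^{N} \sigma_i^x ,
```

where $\Gamma$ is the transverse-field strength. For $p = 2$ this reduces to the quantum Curie-Weiss model mentioned in the abstract, with its second-order transition, while $p > 2$ gives the first-order case.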