Robustness against Power is PSPACE-complete
Power is a RISC architecture developed by IBM, Freescale, and several other
companies and implemented in a series of POWER processors. The architecture
features a relaxed memory model providing very weak guarantees with respect to
the ordering and atomicity of memory accesses.
Due to these weaknesses, some programs that are correct under sequential
consistency (SC) show undesirable effects when run under Power. We call these
programs not robust against the Power memory model. Formally, a program is
robust if every computation under Power has the same data and control
dependencies as some SC computation.
Our contribution is a decision procedure for robustness of concurrent
programs against the Power memory model. It is based on three ideas. First, we
reformulate robustness in terms of the acyclicity of a happens-before relation.
Second, we prove that among the computations with cyclic happens-before
relation there is one in a certain normal form. Finally, we reduce the
existence of such a normal-form computation to a language emptiness problem.
Altogether, this yields a PSPACE algorithm for checking robustness against
Power. We complement it by a matching lower bound to show PSPACE-completeness.
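The first idea above, recasting robustness as acyclicity of a happens-before relation, amounts to a plain cycle check over a graph of memory events. The sketch below is illustrative only: the event names and edge kinds are hypothetical, not the paper's formal construction.

```python
# Minimal sketch: robustness reduces to acyclicity of a happens-before
# relation over memory events. Events and edges here are hypothetical.

def has_cycle(edges):
    """Detect a cycle in a directed graph given as {node: [successors]}."""
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {n: WHITE for n in edges}

    def dfs(n):
        color[n] = GRAY
        for m in edges.get(n, []):
            if color.get(m, WHITE) == GRAY:
                return True            # back edge: cycle found
            if color.get(m, WHITE) == WHITE and dfs(m):
                return True
        color[n] = BLACK
        return False

    return any(color[n] == WHITE and dfs(n) for n in list(edges))

# Program-order and reads-from edges of one hypothetical Power computation;
# a happens-before cycle means it matches no SC computation (non-robust).
hb = {
    "w_x_t1": ["r_y_t1"],   # program order in thread 1
    "r_y_t1": ["w_y_t2"],   # from-read edge
    "w_y_t2": ["r_x_t2"],   # program order in thread 2
    "r_x_t2": ["w_x_t1"],   # from-read edge closes the cycle
}
print(has_cycle(hb))  # True: this computation is not robust
```

The real decision procedure never builds such graphs explicitly; it encodes the cycle search as a language emptiness question, which is what yields the PSPACE bound.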
Succinct Representations for Abstract Interpretation
Abstract interpretation techniques can be made more precise by distinguishing
paths inside loops, at the expense of possibly exponential complexity.
SMT-solving techniques and sparse representations of paths and sets of paths
avoid this pitfall. We improve previously proposed techniques for guided static
analysis and the generation of disjunctive invariants by combining them with
techniques for succinct representations of paths and symbolic representations
for transitions based on static single assignment. Because of the
non-monotonicity of the results of abstract interpretation with widening
operators, it is difficult to conclude that some abstraction is more precise
than another based on theoretical local precision results. We thus conducted
extensive comparisons between our new techniques and previous ones, on a
variety of open-source packages. Comment: Static Analysis Symposium (SAS), Deauville, France (2012).
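The non-monotonicity mentioned above is easiest to see with the classic interval widening operator. The toy loop analysis below is not from the paper; it just shows widening jumping past the exact loop bound, which is why local precision arguments do not settle which abstraction wins overall.

```python
# Toy interval analysis with widening for the hypothetical loop:
#   x = 0; while x < 100: x += 1

def join(a, b):
    """Least upper bound of two intervals."""
    return (min(a[0], b[0]), max(a[1], b[1]))

def widen(a, b):
    """Standard interval widening: unstable bounds jump to infinity."""
    lo = a[0] if b[0] >= a[0] else float("-inf")
    hi = a[1] if b[1] <= a[1] else float("inf")
    return (lo, hi)

x = (0, 0)                                  # abstract value at loop head
while True:
    body = (x[0] + 1, min(x[1], 99) + 1)    # effect of "assume x<100; x+=1"
    nxt = widen(x, join(x, body))
    if nxt == x:                            # post-fixpoint reached
        break
    x = nxt
print(x)  # (0, inf): widening overshoots the true invariant x <= 100
```

Distinguishing paths, or narrowing after widening, can recover precision here, but whether that helps on real code is exactly the kind of question the paper answers empirically.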
Developing digital interventions: a methodological guide.
Digital interventions are becoming an increasingly popular method of delivering healthcare as they enable and promote patient self-management. This paper provides a methodological guide to the processes involved in developing effective digital interventions, detailing how to plan and develop such interventions to avoid common pitfalls. It demonstrates the need for mixed qualitative and quantitative methods in order to develop digital interventions which are effective, feasible, and acceptable to users and stakeholders.
Aggregation of Votes with Multiple Positions on Each Issue
We consider the problem of aggregating votes cast by a society on a fixed set
of issues, where each member of the society may vote for one of several
positions on each issue, but the combination of votes on the various issues is
restricted to a set of feasible voting patterns. We require the aggregation to
be supportive, i.e. for every issue the corresponding component f of every
aggregator should satisfy f(x_1, ..., x_n) ∈ {x_1, ..., x_n}. We prove that, in
such a set-up, non-dictatorial aggregation of votes in a society of some size
is possible if and only if either non-dictatorial aggregation is possible in a
society of only two members or a ternary aggregator exists that either on
every issue is a majority operation, i.e. the corresponding component satisfies
f(x, x, y) = f(x, y, x) = f(y, x, x) = x, or on every issue is a minority
operation, i.e. the corresponding component satisfies
f(x, x, y) = f(x, y, x) = f(y, x, x) = y. We then introduce a notion of
uniformly non-dictatorial
aggregator, which is defined to be an aggregator that on every issue, and when
restricted to an arbitrary two-element subset of the votes for that issue,
differs from all projection functions. We first give a characterization of sets
of feasible voting patterns that admit a uniformly non-dictatorial aggregator.
Then making use of Bulatov's dichotomy theorem for conservative constraint
satisfaction problems, we connect social choice theory with combinatorial
complexity by proving that if a set X of feasible voting patterns has a
uniformly non-dictatorial aggregator of some arity, then the multi-sorted
conservative constraint satisfaction problem on X, in the sense introduced by
Bulatov and Jeavons, with each issue representing a sort, is tractable;
otherwise, it is NP-complete.
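The supportiveness condition and the two ternary operations can be checked exhaustively on a small domain. The implementations and the three-element domain below are illustrative choices, not taken from the paper.

```python
from itertools import product

# Illustrative check of the abstract's definitions on a small domain.

def is_supportive(f, domain):
    """Supportive: f's output on any tuple is one of its arguments."""
    return all(f(x, y, z) in (x, y, z)
               for x, y, z in product(domain, repeat=3))

def majority(x, y, z):
    """Majority operation: f(x,x,y) = f(x,y,x) = f(y,x,x) = x."""
    return x if x in (y, z) else z

def minority(x, y, z):
    """Minority operation: f(x,x,y) = f(x,y,x) = f(y,x,x) = y."""
    if x == y:
        return z
    if x == z:
        return y
    return x

D = {0, 1, 2}
print(is_supportive(majority, D), is_supportive(minority, D))  # True True
```

Both operations are supportive by construction, which is why they are admissible aggregator components in the characterization above.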
Adaptive drivers in a model of urban traffic
We introduce a simple lattice model of traffic flow in a city where drivers
optimize their route-selection in time in order to avoid traffic jams, and
study its phase structure as a function of the density of vehicles and of the
drivers' behavioral parameters via numerical simulations and mean-field
analytical arguments. We identify a phase transition between a low- and a
high-density regime. In the latter, inductive drivers may surprisingly behave
worse than randomly selecting drivers. Comment: 7 pages, final version.
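The counter-intuitive result, that adaptive drivers can do worse than random ones, already shows up in a drastically simplified two-route setting: if everyone chases yesterday's emptier route, everyone herds onto it. The simulation below is a hypothetical toy, not the paper's lattice model.

```python
import random

# Toy two-route herding illustration (hypothetical, not the paper's model).

def simulate(n_drivers, adaptive, steps=200, seed=0):
    """Return the average delay proxy (size of the busier route) per step."""
    rng = random.Random(seed)
    last_counts = [n_drivers // 2, n_drivers - n_drivers // 2]
    total_delay = 0
    for _ in range(steps):
        counts = [0, 0]
        for _ in range(n_drivers):
            if adaptive:
                # Everyone picks the route that was emptier last step.
                choice = 0 if last_counts[0] <= last_counts[1] else 1
            else:
                choice = rng.randrange(2)
            counts[choice] += 1
        total_delay += max(counts)   # delay grows with the busier route
        last_counts = counts
    return total_delay / steps

# Adaptive drivers herd (all 100 on one route every step); random drivers
# split roughly evenly, so their busier route stays near 50-60.
print(simulate(100, adaptive=True), simulate(100, adaptive=False))
```

In this caricature the adaptive population oscillates between the two routes in lockstep, paying the maximum delay every step, while uncoordinated random choice spreads the load.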
Variability Abstractions: Trading Precision for Speed in Family-Based Analyses (Extended Version)
Family-based (lifted) data-flow analysis for Software Product Lines (SPLs) is
capable of analyzing all valid products (variants) without generating any of
them explicitly. It takes as input only the common code base, which encodes all
variants of an SPL, and produces analysis results corresponding to all variants.
However, the computational cost of the lifted analysis still depends inherently
on the number of variants (which is exponential in the number of features, in
the worst case). For a large number of features, the lifted analysis may be too
costly or even infeasible. In this paper, we introduce variability abstractions
defined as Galois connections and use abstract interpretation as a formal
method for the calculational derivation of approximate (abstracted)
lifted analyses of SPL programs, which are sound by construction. Moreover,
given an abstraction we define a syntactic transformation that translates any
SPL program into an abstracted version of it, such that the analysis of the
abstracted SPL coincides with the corresponding abstracted analysis of the
original SPL. We implement the transformation in a tool, reconfigurator, that
works on object-oriented Java program families, and evaluate the practicality
of this approach on three Java SPL benchmarks. Comment: 50 pages, 10 figures.
Global semantic typing for inductive and coinductive computing
Inductive and coinductive types are commonly construed as ontological
(Church-style) types, denoting canonical data-sets such as natural numbers,
lists, and streams. For various purposes, notably the study of programs in the
context of global semantics, it is preferable to think of types as semantical
properties (Curry-style). Intrinsic theories were introduced in the late 1990s
to provide a purely logical framework for reasoning about programs and their
semantic types. We extend them here to data given by any combination of
inductive and coinductive definitions. This approach is of interest because it
fits tightly with syntactic, semantic, and proof theoretic fundamentals of
formal logic, with potential applications in implicit computational complexity
as well as extraction of programs from proofs. We prove a Canonicity Theorem,
showing that the global definition of program typing, via the usual (Tarskian)
semantics of first-order logic, agrees with their operational semantics in the
intended model. Finally, we show that every intrinsic theory is interpretable
in a conservative extension of first-order arithmetic. This means that
quantification over infinite data objects does not lead, on its own, to
proof-theoretic strength beyond that of Peano Arithmetic. Intrinsic theories
are perfectly amenable to formulas-as-types Curry-Howard morphisms, and were
used to characterize major computational complexity classes. Their extensions
described here have similar potential, which has already been applied.
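As a loose illustration of the Curry-style view, types as semantic properties of untyped programs rather than syntactic annotations, a coinductive stream can be modeled by a generator and a semantic type by a predicate that every observation must satisfy. This is an analogy only, not the paper's intrinsic theories.

```python
from itertools import islice

# Curry-style analogy: a "type" is a property checked of values a program
# actually produces, not a label attached to the program.

def nats():
    """Coinductive-style stream: defined by what it yields, never fully built."""
    n = 0
    while True:
        yield n
        n += 1

def is_nat(v):
    """Semantic property standing in for the inductive type of naturals."""
    return isinstance(v, int) and v >= 0

# Every finite observation of the infinite stream satisfies the property.
prefix = list(islice(nats(), 5))
print(prefix, all(is_nat(v) for v in prefix))  # [0, 1, 2, 3, 4] True
```

The point of the analogy: global, semantic typing constrains the observable behavior of an untyped program, which is the stance the intrinsic theories formalize for arbitrary mixes of inductive and coinductive data.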