Model Checking CTL is Almost Always Inherently Sequential
The model checking problem for CTL is known to be P-complete (Clarke,
Emerson, and Sistla (1986), see Schnoebelen (2002)). We consider fragments of
CTL obtained by restricting the use of temporal modalities or the use of
negations---restrictions already studied for LTL by Sistla and Clarke (1985)
and Markey (2004). For all these fragments, except for the trivial case without
any temporal operator, we systematically prove model checking to be either
inherently sequential (P-complete) or very efficiently parallelizable
(LOGCFL-complete). For most fragments, however, model checking for CTL is
already P-complete. Hence our results indicate that, in cases where the
combined complexity is of relevance, approaching CTL model checking by
parallelism cannot be expected to result in any significant speedup. We also
completely determine the complexity of the model checking problem for all
fragments of the extensions ECTL, CTL+, and ECTL+.
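To make the object of study concrete, the following Python sketch shows the textbook bottom-up labelling computation for the EX and E[f U g] modalities on an explicit Kripke structure (the data layout and all names are our own illustration, not the paper's); it is exactly this kind of sequential fixpoint computation that the P-completeness results say cannot, in general, be parallelized effectively.

    # Minimal sketch of CTL labelling for EX and E[f U g] on an explicit
    # Kripke structure. All names (states, succ, label sets) are illustrative.

    def check_EX(states, succ, sat_f):
        """States with some successor satisfying f."""
        return {s for s in states if any(t in sat_f for t in succ[s])}

    def check_EU(states, succ, sat_f, sat_g):
        """Least fixpoint: states satisfying E[f U g]."""
        sat = set(sat_g)
        frontier = list(sat_g)
        # Predecessor map, computed once.
        pred = {s: set() for s in states}
        for s in states:
            for t in succ[s]:
                pred[t].add(s)
        while frontier:
            t = frontier.pop()
            for s in pred[t]:
                if s in sat_f and s not in sat:
                    sat.add(s)
                    frontier.append(s)
        return sat

    # Example: a 3-state structure where state 0 reaches a g-state via f-states.
    states = {0, 1, 2}
    succ = {0: {1}, 1: {2}, 2: {2}}
    print(check_EU(states, succ, sat_f={0, 1}, sat_g={2}))  # {0, 1, 2}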
Quantifier-Free Interpolation of a Theory of Arrays
The use of interpolants in model checking is becoming an enabling technology
to allow fast and robust verification of hardware and software. The application
of encodings based on the theory of arrays, however, is limited by the
impossibility of deriving quantifier-free interpolants in general. In this
paper, we show that it is possible to obtain quantifier-free interpolants for a
Skolemized version of the extensional theory of arrays. We prove this in two
ways: (1) non-constructively, by using the model-theoretic notion of
amalgamation, which is known to be equivalent to admitting quantifier-free
interpolation for universal theories; and (2) constructively, by designing an
interpolating procedure, based on solving equations between array updates.
(Interestingly, rewriting techniques are used in the key steps of the solver
and its proof of correctness.) To the best of our knowledge, this is the first
successful attempt at computing quantifier-free interpolants for a variant of
the theory of arrays with extensionality.
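The flavour of "solving equations between array updates" can be conveyed by the standard read-over-write rewriting on select/store terms. The Python sketch below is our own toy illustration (the term representation and all names are invented), not the paper's interpolating procedure:

    # Illustrative sketch of read-over-write simplification, the basic
    # rewriting step behind reasoning about array updates. Terms are
    # nested tuples; the constructors here are invented for the example.

    def select(array, idx):
        """Simplify select(store(a, i, v), j) using read-over-write."""
        if isinstance(array, tuple) and array[0] == 'store':
            _, a, i, v = array
            if i == idx:            # indices syntactically equal: read the write
                return v
            if isinstance(i, int) and isinstance(idx, int):  # distinct constants
                return select(a, idx)                        # skip the write
            # Otherwise the indices may or may not be equal: a real solver
            # would case-split here (i = j vs. i != j).
            return ('select', array, idx)
        return ('select', array, idx)

    a = 'A'  # an uninterpreted base array
    t = ('store', ('store', a, 3, 'x'), 5, 'y')
    print(select(t, 5))    # 'y'
    print(select(t, 3))    # 'x'  (5 != 3, so the outer write is skipped)
    print(select(t, 'k'))  # unresolved: symbolic index 'k'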
Software Model Checking with Explicit Scheduler and Symbolic Threads
In many practical application domains, the software is organized into a set
of threads, whose activation is exclusive and controlled by a cooperative
scheduling policy: threads execute, without any interruption, until they either
terminate or yield the control explicitly to the scheduler. The formal
verification of such software poses significant challenges. On the one hand,
each thread may have an infinite state space, which calls for abstraction. On
the other hand, the scheduling policy is often important for correctness, and
abstracting the scheduler may result in loss of precision and false positives.
Unfortunately, translating the problem into a purely sequential software model
checking problem turns out to be highly inefficient for the available
technologies. We propose a software model
checking technique that exploits the intrinsic structure of these programs.
Each thread is translated into a separate sequential program and explored
symbolically with lazy abstraction, while the overall verification is
orchestrated by the direct execution of the scheduler. The approach is
optimized by filtering the exploration of the scheduler with the integration of
partial-order reduction. The technique, called ESST (Explicit Scheduler,
Symbolic Threads), has been implemented and experimentally evaluated on a
significant set of benchmarks. The results demonstrate that the ESST technique
is significantly more effective than software model checking applied to the
sequentialized programs, and that partial-order reduction can lead to further
performance improvements.
Comment: 40 pages, 10 figures; accepted for publication in Logical Methods in Computer Science.
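The division of labour in ESST can be pictured as a search loop in which the scheduler is executed explicitly while each chosen thread is explored symbolically up to its next yield point. The Python sketch below shows only that shape; the state representation and the explore_until_yield interface are invented stand-ins for lazy-abstraction-based symbolic execution:

    # Shape of an explicit-scheduler / symbolic-threads exploration loop.
    # `explore_until_yield` stands in for symbolic execution of one thread
    # up to its next yield point; everything here is invented for illustration.

    def esst_search(init_state, runnable, explore_until_yield):
        """Explore the product of scheduler choices and thread states."""
        visited = set()
        stack = [init_state]
        while stack:
            state = stack.pop()
            if state in visited:
                continue
            visited.add(state)
            # The scheduler is executed explicitly: enumerate runnable threads.
            for tid in runnable(state):
                # The chosen thread runs (symbolically, in the real technique)
                # until it yields or terminates, producing successor states.
                for succ in explore_until_yield(state, tid):
                    stack.append(succ)
        return visited

    # Toy instance: two threads that each take one step and terminate.
    def runnable(state):
        return [t for t, done in enumerate(state) if not done]

    def explore_until_yield(state, tid):
        s = list(state)
        s[tid] = True  # thread tid runs to completion
        return [tuple(s)]

    print(sorted(esst_search((False, False), runnable, explore_until_yield)))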
Algorithms for Game Metrics
Simulation and bisimulation metrics for stochastic systems provide a
quantitative generalization of the classical simulation and bisimulation
relations. These metrics capture the similarity of states with respect to
quantitative specifications written in the quantitative μ-calculus and
related probabilistic logics. We first show that the metrics provide a bound
for the difference in long-run average and discounted average behavior across
states, indicating that the metrics can be used both in system verification,
and in performance evaluation. For turn-based games and MDPs, we provide a
polynomial-time algorithm for the computation of the one-step metric distance
between states. The algorithm is based on linear programming; it improves on
the previously known exponential-time algorithm based on a reduction to the
theory of reals. We then present PSPACE algorithms for both the decision
problem and the problem of approximating the metric distance between two
states, matching the best known algorithms for Markov chains. For the
bisimulation kernel of the metric, our algorithm works in time O(n^4) for both
turn-based games and MDPs, improving the previously best known O(n^9 log n)
time algorithm for MDPs. For a concurrent game G, we show that
computing the exact distance between states is at least as hard as computing
the value of concurrent reachability games and the square-root-sum problem in
computational geometry. We show that checking whether the metric distance is
bounded by a rational r can be done via a reduction to the theory of real
closed fields, involving a formula with three quantifier alternations, yielding
O(|G|^O(|G|^5)) time complexity, improving the previously known reduction,
which yielded O(|G|^O(|G|^7)) time complexity. These algorithms can be iterated
to approximate the metrics using binary search.
Comment: 27 pages. Full version of the paper accepted at FSTTCS 2008.
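The one-step distance for MDPs mentioned above is, in essence, a Kantorovich-style lifting of the current state-distance matrix to transition distributions, which is a transportation problem solvable by linear programming. A minimal Python sketch of that lifting using scipy.optimize.linprog (the encoding is our simplification, not the paper's algorithm):

    # Kantorovich lifting of a state-distance matrix d to two transition
    # distributions p and q, solved as a transportation LP. This illustrates
    # the one-step metric computation; the encoding here is a simplification.
    import numpy as np
    from scipy.optimize import linprog

    def kantorovich(p, q, d):
        """min sum_ij d[i,j] x[i,j]  s.t. row sums = p, col sums = q, x >= 0."""
        n = len(p)
        c = d.reshape(n * n)
        A_eq = []
        for i in range(n):               # row-sum constraints: marginal p
            row = np.zeros((n, n)); row[i, :] = 1
            A_eq.append(row.reshape(n * n))
        for j in range(n):               # column-sum constraints: marginal q
            col = np.zeros((n, n)); col[:, j] = 1
            A_eq.append(col.reshape(n * n))
        b_eq = np.concatenate([p, q])
        res = linprog(c, A_eq=np.array(A_eq), b_eq=b_eq, bounds=(0, None))
        return res.fun

    p = np.array([0.5, 0.5, 0.0])
    q = np.array([0.0, 0.5, 0.5])
    d = 1.0 - np.eye(3)          # discrete metric on three states
    print(kantorovich(p, q, d))  # 0.5: half the mass must move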
Modal Logics of Topological Relations
Logical formalisms for reasoning about relations between spatial regions play
a fundamental role in geographical information systems, spatial and constraint
databases, and spatial reasoning in AI. In analogy with Halpern and Shoham's
modal logic of time intervals based on the Allen relations, we introduce a
family of modal logics equipped with eight modal operators that are interpreted
by the Egenhofer-Franzosa (or RCC8) relations between regions in topological
spaces such as the real plane. We investigate the expressive power and
computational complexity of logics obtained in this way. It turns out that our
modal logics have the same expressive power as the two-variable fragment of
first-order logic, but are exponentially less succinct. The complexity ranges
from (undecidable and) recursively enumerable to highly undecidable, where the
recursively enumerable logics are obtained by considering substructures of
structures induced by topological spaces. As our undecidability results also
capture logics based on the real line, they improve upon undecidability results
for interval temporal logics by Halpern and Shoham. We also analyze modal
logics based on the five RCC5 relations, with similar results regarding the
expressive power but weaker results regarding the complexity.
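For intuition, the eight Egenhofer-Franzosa/RCC8 relations can be classified directly for the simplest regions, closed intervals on the real line. The Python sketch below is our own toy classification (the paper's regions live in arbitrary topological spaces):

    # Toy classification of the eight RCC8 relations for closed intervals
    # [a1, b1], [a2, b2] on the real line (a < b assumed). Illustrative only.

    def rcc8(a1, b1, a2, b2):
        if (a1, b1) == (a2, b2):
            return 'EQ'
        if b1 < a2 or b2 < a1:
            return 'DC'            # disconnected
        if b1 == a2 or b2 == a1:
            return 'EC'            # externally connected (touch at a point)
        if a2 <= a1 and b1 <= b2:  # first interval inside second
            return 'NTPP' if a2 < a1 and b1 < b2 else 'TPP'
        if a1 <= a2 and b2 <= b1:  # second interval inside first
            return 'NTPPi' if a1 < a2 and b2 < b1 else 'TPPi'
        return 'PO'                # partial overlap

    print(rcc8(0, 2, 1, 3))  # PO
    print(rcc8(0, 1, 1, 2))  # EC
    print(rcc8(1, 2, 0, 3))  # NTPP
    print(rcc8(0, 2, 0, 3))  # TPP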
Assessment of highly distributed power systems using an integrated simulation approach
In a highly distributed power system (HDPS), micro-renewable and low-carbon technologies would make a significant contribution to the electricity supply. Further, controllable devices such as micro combined heat and power (CHP) could be used to assist in maintaining stability, in addition to simply providing heat and power to dwellings. Analysing the behaviour of such a system requires modelling both the electrical distribution system and the coupled microgeneration devices in a realistic context. In this paper a pragmatic approach to HDPS modelling is presented: microgeneration devices are simulated using a building simulation tool to generate time-varying power output profiles, which are then replicated and processed statistically so that they can be used as boundary conditions for a load flow simulation; this is used to explore security issues such as under- and over-voltage, branch thermal overloading, and reverse power flow. Simulations of a section of real network are presented, featuring different penetrations of micro-renewables and micro-CHP within the ranges believed to be realistically possible by 2050. This analysis indicates that well-designed suburban networks are likely to be able to accommodate such levels of domestic-scale generation without problems emerging such as overloads or degradation of the quality of supply.
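As a caricature of the security-screening stage described above, the following Python sketch (the parameters and the linearized voltage-drop formula are our own simplification, not the paper's models) aggregates per-dwelling net power along a radial feeder and flags under/over-voltage, thermal overloading, and reverse power flow:

    # Caricature of load-flow security screening: aggregate net demand
    # (load minus microgeneration) along a radial feeder and flag violations.
    # The numbers and the linearized drop V_next = V - (P*R + Q*X)/V are
    # simplifications for illustration only.
    import numpy as np

    def screen_feeder(net_p, r, x, v0=230.0, v_lim=(0.94, 1.10), i_max=400.0):
        """net_p: per-node net real power (W); r, x: branch impedances (ohm)."""
        q = 0.1 * net_p                        # assume a fixed power factor
        p_flow = np.cumsum(net_p[::-1])[::-1]  # power through each branch
        q_flow = np.cumsum(q[::-1])[::-1]
        v = v0
        violations = []
        for k in range(len(net_p)):
            if abs(p_flow[k]) / v > i_max:     # rough branch current (A)
                violations.append((k, 'thermal overload'))
            v -= (p_flow[k] * r[k] + q_flow[k] * x[k]) / v
            if not (v_lim[0] * v0 <= v <= v_lim[1] * v0):
                violations.append((k, 'voltage violation'))
            if p_flow[k] < 0:
                violations.append((k, 'reverse power flow'))
        return violations

    # Ten houses importing 3 kW net each: under-voltage appears at the far end.
    net_p = np.full(10, 3000.0)
    print(screen_feeder(net_p, r=np.full(10, 0.02), x=np.full(10, 0.01)))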
Perceptual and conceptual processing of visual objects across the adult lifespan.
Making sense of the external world is vital for multiple domains of cognition, and so it is crucial that object recognition is maintained across the lifespan. We investigated age differences in perceptual and conceptual processing of visual objects in a population-derived sample of 85 healthy adults (24-87 years old) by relating measures of object processing to cognition across the lifespan. Magnetoencephalography (MEG) was recorded during a picture naming task to provide a direct measure of neural activity that is not confounded by age-related vascular changes. Multiple linear regression was used to estimate neural responsivity for each individual, namely the capacity to represent visual or semantic information relating to the pictures. We find that the capacity to represent semantic information is linked to higher naming accuracy, a measure of task-specific performance. In mature adults, the capacity to represent semantic information also correlated with higher levels of fluid intelligence, reflecting domain-general performance. In contrast, the latency of visual processing did not relate to measures of cognition. These results indicate that neural responsivity measures relate to naming accuracy and fluid intelligence. We propose that maintaining neural responsivity in older age confers benefits in task-related and domain-general cognitive processes, supporting the brain-maintenance view of healthy cognitive ageing.
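The per-individual responsivity estimate amounts to regressing a measured response onto visual and semantic predictors and using the model fit as the capacity measure. A minimal Python sketch of that idea on synthetic data (the shapes, predictors, and R^2 score are our illustration, not the study's pipeline):

    # Sketch of per-participant multiple linear regression: regress an MEG
    # response onto visual and semantic predictors and use the fit as a
    # "responsivity" score. Synthetic data; names and shapes are invented.
    import numpy as np

    rng = np.random.default_rng(0)
    n_trials = 200
    visual = rng.normal(size=n_trials)     # e.g. low-level image statistics
    semantic = rng.normal(size=n_trials)   # e.g. semantic-feature predictor
    meg = 0.3 * visual + 0.6 * semantic + rng.normal(scale=0.5, size=n_trials)

    X = np.column_stack([np.ones(n_trials), visual, semantic])
    beta, *_ = np.linalg.lstsq(X, meg, rcond=None)
    pred = X @ beta
    r2 = 1 - np.sum((meg - pred) ** 2) / np.sum((meg - meg.mean()) ** 2)
    print(f"betas: {beta[1:].round(2)}, R^2 (responsivity score): {r2:.2f}")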
Governing the governors: a case study of college governance in English further education
This paper addresses the nature of governors in the governance of further education colleges in an English context (1). It explores the complex relationship between governors (people/agency), government (policy/structure) and governance (practice) in a college environment. While recent research has focused on the governance of schooling and higher education, little attention has been paid to the role of governors in the lifelong learning sector. The objective of the paper is to contribute to the debate about the purpose of college governance at a time when the Learning and Skills Council (LSC) commissioning era ends, and new government bodies responsible for further education and training, including local authorities, arrive. The paper analyses the nature of FE governance through the perspectives and experiences of governors, as colleges respond to calls from government for greater improvement and accountability in the sector (LSIS, 2009a). What constitutes creative governance is complex and controversial in the wider framework of regulation and public policy reform (Stoker, 1997; Seddon, 2008). As with other tricky concepts such as leadership, professionalism and learning, college governance is best defined in the contexts, cultures and situations in which it is located. College governance does not operate in a vacuum. It involves governors, chairs, principals, professionals, senior managers, clerks, community, business and wider agencies, including external audit and inspection regimes. Governance also acts as a prism through which national education and training reforms are mediated at local level. While governing bodies are traditionally associated with the business of FE (steering, setting the tone and style, dealing with finance, funding, audit and procedural matters), they are increasingly being challenged to be more creative and responsive to the wider society. Drawing on a recent case study of six colleges, involving governors and key policy stakeholders, this paper explores FE governance in a fast-changing policy environment.
A type reduction theory for systems with replicated components
The Parameterised Model Checking Problem asks whether an implementation
Impl(t) satisfies a specification Spec(t) for all instantiations of parameter
t. In general, t can determine numerous entities: the number of processes used
in a network, the type of data, the capacities of buffers, etc. The main theme
of this paper is the automation of uniform verification for a subclass of PMCP
with a parameter of the first kind, i.e., the number of processes in the network.
We use CSP as our formalism. We present a type reduction theory, which, for a
given verification problem, establishes a function φ that maps all
(sufficiently large) instantiations T of the parameter to some fixed type T^
and allows us to deduce that if Spec(T^) is refined by φ(Impl(T)), then
(subject to certain assumptions) Spec(T) is refined by Impl(T). The theory can
be used in practice by combining it with a suitable abstraction method that
produces a t-independent process Abstr that is refined by φ(Impl(T)) for
all sufficiently large T. Then, by testing (with a model checker) if the
abstract model Abstr refines Spec(T^), we can deduce a positive answer to the
original uniform verification problem. The type reduction theory relies on
symbolic representation of process behaviour. We develop a symbolic operational
semantics for CSP processes that satisfy certain normality requirements, and we
provide a set of translation rules that allow us to concretise symbolic
transition graphs. Based on this, we prove results that allow us to infer
behaviours of a process instantiated with uncollapsed types from known
behaviours of the same process instantiated with a reduced type. One of the
main advantages of our symbolic operational semantics and the type reduction
theory is their generality, which makes them applicable in a wide range of
settings
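The intuition behind the collapsing function φ can be pictured on plain value traces: every value of a large instantiating type T above a cut-off is identified with a single representative in the fixed type T^, and behaviours are transported along φ. The Python sketch below is purely illustrative (the paper works with CSP refinement and a symbolic operational semantics, not Python traces):

    # Toy picture of type reduction: collapse a large value type T into a
    # fixed small type T^ by identifying all values >= CUT, then transport
    # traces (sequences of communicated values) along the collapse phi.

    CUT = 3  # T^ = {0, 1, 2, '*'}, where '*' represents all larger values

    def phi(v):
        """Collapse one value of T to T^."""
        return v if v < CUT else '*'

    def collapse_trace(trace):
        return tuple(phi(v) for v in trace)

    # Two traces of a process instantiated with the large type T = {0..9}
    # become indistinguishable in the reduced type, as intended: checking
    # the T^ instantiation covers both.
    print(collapse_trace((0, 5, 7)))  # (0, '*', '*')
    print(collapse_trace((0, 9, 4)))  # (0, '*', '*')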