Timed Control with Observation Based and Stuttering Invariant Strategies
In this paper we consider the problem of controller synthesis for timed games under imperfect information. Novel to our approach are the requirements on strategies: they should be based on a finite collection of observations and must be stuttering invariant, in the sense that repeated identical observations will not change the strategy. We provide a constructive transformation to equivalent finite games with perfect information, giving decidability as well as allowing for an efficient on-the-fly forward algorithm. We report on the application of an initial experimental implementation.
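The core of such a transformation can be illustrated, in the untimed finite-state case, by the standard knowledge-set (subset) construction: the states of the perfect-information game are the sets of states the controller considers possible after an observation history. The following is a minimal sketch under that reading; the game, its transitions and its observation function are invented for illustration, and the paper's construction additionally handles time and stuttering invariance.

```python
# Illustrative sketch: knowledge-set construction turning a finite game of
# imperfect information into one of perfect information. The model below is
# hypothetical; states s0..s3, actions a/b and observations o0..o2 are
# invented for illustration.
from collections import deque

# (state, action) -> set of successor states
trans = {
    ("s0", "a"): {"s1", "s2"},
    ("s1", "a"): {"s3"},
    ("s2", "a"): {"s3"},
    ("s1", "b"): {"s0"},
    ("s2", "b"): {"s3"},
}
observe = {"s0": "o0", "s1": "o1", "s2": "o1", "s3": "o2"}
actions = {"a", "b"}
observations = set(observe.values())

def knowledge_successors(k, action):
    """Successor knowledge sets of k under `action`, one per observation."""
    succs = {t for s in k for t in trans.get((s, action), ())}
    return [frozenset(t for t in succs if observe[t] == o)
            for o in observations
            if any(observe[t] == o for t in succs)]

# BFS over reachable knowledge sets: these are exactly the states of the
# equivalent perfect-information game.
initial = frozenset({"s0"})
seen, queue = {initial}, deque([initial])
while queue:
    k = queue.popleft()
    for a in actions:
        for k2 in knowledge_successors(k, a):
            if k2 not in seen:
                seen.add(k2)
                queue.append(k2)
```

Because s1 and s2 share the observation o1, the controller can never separate them, and they collapse into the single knowledge state {s1, s2} of the perfect-information game.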
Verification and Control of Partially Observable Probabilistic Real-Time Systems
We propose automated techniques for the verification and control of
probabilistic real-time systems that are only partially observable. To formally
model such systems, we define an extension of probabilistic timed automata in
which local states are partially visible to an observer or controller. We give
a probabilistic temporal logic that can express a range of quantitative
properties of these models, relating to the probability of an event's
occurrence or the expected value of a reward measure. We then propose
techniques to either verify that such a property holds or to synthesise a
controller for the model which makes it true. Our approach is based on an
integer discretisation of the model's dense-time behaviour and a grid-based
abstraction of the uncountable belief space induced by partial observability.
The latter is necessarily approximate since the underlying problem is
undecidable, however we show how both lower and upper bounds on numerical
results can be generated. We illustrate the effectiveness of the approach by
implementing it in the PRISM model checker and applying it to several case
studies from the domains of computer security and task scheduling.
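The grid-based belief abstraction can be illustrated on a much smaller scale than the paper's models. The sketch below is not PRISM's implementation: it uses a hypothetical two-state partially observable model with invented dynamics and rewards, restricts beliefs b = (p, 1-p) to a grid {0, 1/N, ..., 1}, and rounds successor beliefs down or up to grid points. For this example the value function is monotone in p, so the two rounding directions yield lower and upper bounds on the optimal value.

```python
# Illustrative sketch of grid-based belief-space value iteration (not
# PRISM's implementation). All model details are invented for illustration.
N = 10  # grid resolution
grid = [i / N for i in range(N + 1)]

# Hypothetical model with two actions: "wait" earns reward equal to the
# probability p of being in state 0 while the belief drifts towards state 1;
# "reset" earns 0 but pushes the belief back towards state 0.
def step(p, action):
    if action == "wait":
        return 0.9 * p, p            # (next belief, immediate reward)
    else:
        return min(1.0, p + 0.3), 0.0

def value_iteration(round_fn, iters=50, discount=0.9):
    V = {p: 0.0 for p in grid}
    for _ in range(iters):
        V = {p: max(r + discount * V[round_fn(q)]
                    for q, r in (step(p, a) for a in ("wait", "reset")))
             for p in V}
    return V

# Round successor beliefs down (lower bound) or up (upper bound) to the grid.
lower = value_iteration(lambda q: max(g for g in grid if g <= q + 1e-9))
upper = value_iteration(lambda q: min(g for g in grid if g >= q - 1e-9))
assert all(lower[p] <= upper[p] + 1e-9 for p in grid)
```

Refining the grid (increasing N) tightens the gap between the two bounds, mirroring the lower/upper numerical bounds reported by the approach.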
Formal methods for real-time requirements engineering
Timed model checking has turned out to be a very successful technique for the verification of real-time systems. In general, however, large-scale systems require more than a mere real-time perspective: they utilise, for example, abstract data types and fairness aspects. VSE-II (Verification Support Environment) is a general tool which supports the design and verification process of such large-scale systems. The basic machinery within VSE-II is theorem proving rather than model checking, and one of its underlying formalisms is close to TLA (Temporal Logic of Actions), i.e. it is based on linear discrete time. In this thesis we develop a technique to perform an exact discretisation of dense real-time aspects, i.e. a discretisation that is not just an approximation but mirrors dense behaviour exactly. This discretisation is achieved without an explicit or implicit introduction of rational numbers. With the help of the exact discretisation we define an embedding of hybrid automata into VSE-II such that model checking strategies for hybrid automata can be used in VSE-II; vice versa, the embedding allows the model checking strategies to benefit from the proof work done in VSE-II. The thesis also introduces a general methodology for formal requirements analysis, namely observer models, that deals with particular perspectives (views) on a system rather than with particular aspects of it. This way, different specialised approaches can be integrated and used to describe the overall system requirements. One such view, for example, is a real-time view, which uses the new discretisation technique.
On verifying timed hyperproperties
We study the satisfiability and model-checking problems for timed
hyperproperties specified with HyperMTL, a timed extension of HyperLTL.
Depending on whether interleaving of events in different traces is allowed, two
possible semantics can be defined for timed hyperproperties: asynchronous and
synchronous. While the satisfiability problem can be decided similarly to
HyperLTL regardless of the choice of semantics, we show that the model-checking
problem, unless the specification is alternation-free, is undecidable even when
very restricted timing constraints are allowed. On the positive side, we show
that model checking HyperMTL with quantifier alternations is possible under
certain conditions in the synchronous semantics, or when there is a fixed bound
on the length of the time domain.
EP/K026399/1 and EP/P020011/
Isoperimetric Partitioning: A New Algorithm for Graph Partitioning
Timed hyperproperties
We study the satisfiability and model-checking problems for timed hyperproperties specified with HyperMITL, a timed extension of HyperLTL. While the satisfiability problem can be solved similarly to HyperLTL, we show that the model-checking problem for HyperMITL, unless the specification is alternation-free, is undecidable even when very restricted timing constraints are allowed. On the positive side, we show that model checking HyperMITL with quantifier alternations is possible under certain semantic restrictions. As an intermediate tool, we give an ‘asynchronous’ interpretation of Wilke's monadic logic of relative distance (L_d) and show that it characterises timed languages recognised by timed automata with silent transitions.
Adaptive Neural Models of Queuing and Timing in Fluent Action
Temporal structure in skilled, fluent action exists at several nested levels. At the largest scale considered here, short sequences of actions that are planned collectively in prefrontal cortex appear to be queued for performance by a cyclic competitive process that operates in concert with a parallel analog representation that implicitly specifies the relative priority of elements of the sequence. At an intermediate scale, single acts, like reaching to grasp, depend on coordinated scaling of the rates at which many muscles shorten or lengthen in parallel. To ensure the success of acts such as catching an approaching ball, such parallel rate scaling, which appears to be one function of the basal ganglia, must be coupled to perceptual variables, such as time-to-contact. At a fine scale, within each act, desired rate scaling can be realized only if precisely timed muscle activations first accelerate and then decelerate the limbs, to ensure that muscle length changes do not under- or over-shoot the amounts needed for precise acts. Each context of action may require a different timed muscle activation pattern than similar contexts. Because context differences that require different treatment cannot be known in advance, a formidable adaptive engine, the cerebellum, is needed to amplify differences within, and continuously search, a vast parallel signal flow, in order to discover contextual "leading indicators" of when to generate distinctive parallel patterns of analog signals. From some parts of the cerebellum, such signals control muscles. But a recent model shows how the lateral cerebellum may serve the competitive queuing system (in frontal cortex) as a repository of quickly accessed long-term sequence memories. Thus different parts of the cerebellum may use the same adaptive engine design to serve the lowest and the highest of the three levels of temporal structure treated. If so, no one-to-one mapping exists between levels of temporal structure and major parts of the brain. Finally, recent data cast doubt on network-delay models of cerebellar adaptive timing.
National Institute of Mental Health (R01 DC02852)
Verification and Control of Turn-Based Probabilistic Real-Time Games
Quantitative verification techniques have been developed for the formal analysis of a variety of probabilistic models, such as Markov chains, Markov decision processes and their variants. They can be used to produce guarantees on quantitative aspects of system behaviour, for example safety, reliability and performance, or to help synthesise controllers that ensure such guarantees are met. We propose the model of turn-based probabilistic timed multi-player games, which incorporates probabilistic choice, real-time clocks and nondeterministic behaviour across multiple players. Building on the digital clocks approach for the simpler model of probabilistic timed automata, we show how to compute the key measures that underlie quantitative verification, namely the probability and expected cumulative price to reach a target. We illustrate this on case studies from computer security and task scheduling.
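Once the digital clocks translation has produced a finite turn-based game, the reachability probability can be computed by value iteration over the game's Bellman equations: maximising states take the best action value, minimising states the worst, and probabilistic branches average over successors. The sketch below shows only this final step on a hypothetical three-state game (states, actions and probabilities are invented); the integer clock component that digital clocks would add to each state is omitted for brevity.

```python
# Illustrative sketch: value iteration for maximal reachability probability
# in a tiny turn-based stochastic game. The game itself is hypothetical.
# state -> (owner, list of actions, each a distribution [(successor, prob)])
game = {
    "s0": ("max", [[("s1", 1.0)], [("s2", 1.0)]]),
    "s1": ("min", [[("goal", 0.5), ("sink", 0.5)], [("s0", 1.0)]]),
    "s2": ("max", [[("goal", 0.9), ("sink", 0.1)]]),
}
target, sink = "goal", "sink"

def reach_prob(iters=100):
    V = {s: 0.0 for s in list(game) + [target, sink]}
    V[target] = 1.0  # the target is reached with probability 1 from itself
    for _ in range(iters):
        for s, (owner, actions) in game.items():
            vals = [sum(p * V[t] for t, p in dist) for dist in actions]
            V[s] = max(vals) if owner == "max" else min(vals)
    return V

V = reach_prob()
```

Here the minimising player at s1 can hold the controller to probability 0.5, so the controller's best move from s0 is towards s2, giving value 0.9. Iterating from 0 converges to the least fixed point of the Bellman operator, which is the value of the reachability game.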