
    A Survey on Continuous Time Computations

    We provide an overview of theories of continuous time computation. These theories allow us to understand both the hardness of questions related to continuous time dynamical systems and the computational power of continuous time analog models. We survey the existing models, summarize results, and point to relevant references in the literature.

    Polynomial Time Corresponds to Solutions of Polynomial Ordinary Differential Equations of Polynomial Length: The General Purpose Analog Computer and Computable Analysis Are Two Efficiently Equivalent Models of Computations

    The outcomes of this paper are twofold.

    Implicit complexity. We provide an implicit characterization of polynomial time computation in terms of ordinary differential equations: we characterize the class P of languages computable in polynomial time in terms of differential equations with polynomial right-hand side. This result gives a purely continuous (in both time and space), elegant and simple characterization of P. We believe it is the first time such classes have been characterized using only ordinary differential equations. Our characterization extends to functions computable in polynomial time over the reals in the sense of computable analysis. Our results may provide a new perspective on classical complexity, by giving a way to define complexity classes, like P, in a very simple way, without any reference to a notion of (discrete) machine. This may also provide ways to state classical questions about computational complexity via ordinary differential equations.

    Continuous-time models of computation. Our results can also be interpreted in terms of analog computers or analog models of computation: as a side effect, we get that the 1941 General Purpose Analog Computer (GPAC) of Claude Shannon is provably equivalent to Turing machines at both the computability and the complexity level, a fact that had never been established before. This result provides arguments in favour of a generalised form of the Church-Turing Hypothesis, which states that any physically realistic (macroscopic) computer is equivalent to Turing machines both at a computability and at a computational complexity level.
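    To make the statement concrete, the following is a hedged paraphrase of the characterization in display form; the polynomial vectors p and q, the encoding ψ, and the arc-length functional are reconstructed from the abstract for illustration, not quoted from the paper. A language L ⊆ {0,1}* is in P if and only if there are vectors of polynomials p and q such that, for every word w,

    \[
      w \in L \iff \exists t \ge 0 :\; y_1(t) \ge 1 \;\text{ and }\; \operatorname{len}_y(0,t) \le \operatorname{poly}(|w|),
    \]

    where y solves the ordinary differential equation

    \[
      y(0) = q(\psi(w)), \qquad y'(u) = p(y(u)),
    \]

    ψ(w) is a rational encoding of w, and \(\operatorname{len}_y(0,t) = \int_0^t \|y'(u)\|\,du\) is the length of the solution curve. Measuring computation by curve length rather than by t itself matters because a continuous system can rescale time arbitrarily; curve length is the robust measure of how much work the flow performs.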

    Polynomial Time corresponds to Solutions of Polynomial Ordinary Differential Equations of Polynomial Length

    We provide an implicit characterization of polynomial time computation in terms of ordinary differential equations: we characterize the class PTIME of languages computable in polynomial time in terms of differential equations with polynomial right-hand side. This result gives a purely continuous (in both time and space), elegant and simple characterization of PTIME. This is the first time such classes are characterized using only ordinary differential equations. Our characterization extends to functions computable in polynomial time over the reals in the sense of computable analysis, and to deterministic complexity classes above polynomial time. This may provide a new perspective on classical complexity, by giving a way to define complexity classes, like PTIME, in a very simple way, without any reference to a notion of (discrete) machine. This may also provide ways to state classical questions about computational complexity via ordinary differential equations, i.e., by using the framework of analysis.

    An evaluation of approximate probabilistic reachability techniques for stochastic parametric hybrid systems

    Ph.D. thesis. Stochastic parametric hybrid systems allow formalising automata with discrete interruptions, continuous nonlinear dynamics and parametric uncertainty (e.g. randomness and/or nondeterminism), and are a useful framework for cyber-physical systems modelling. The problem of designing safe cyber-physical systems is very timely, given that such systems are ubiquitous in modern society, often in safety-critical contexts (e.g., aircraft and cars) with possibly some level of decisional autonomy. Therefore, the verification of cyber-physical systems (and consequently of hybrid systems) is a problem urgently demanding innovative solutions. Unfortunately, this problem is also extremely challenging. Reachability checking is a crucial element of designing safe systems. Given a system model, we specify a set of "goal" states (indicating (un)wanted behaviour) and ask whether the system evolution can reach these states or not. Probabilistic reachability is the corresponding problem for stochastic systems, and it amounts to computing the probability that the system reaches a goal state. The main problem researched in this thesis is probabilistic reachability analysis of hybrid systems with random and/or nondeterministic parameters. For nondeterministic systems, this problem amounts to computing a range of reachability probabilities depending on how nondeterminism is resolved.

    In this thesis I have investigated and developed three distinct techniques: statistical methods, involving Monte Carlo, Quasi-Monte Carlo and Randomised Quasi-Monte Carlo sampling with interval estimation techniques which give statistical guarantees; an analytical approximation method, utilising Gaussian Processes, which offer a statistical approximation of an (unknown) smooth function over its entire domain; and a promising combination of a formal approach, based on formal reasoning which provides absolute numerical guarantees, and the Gaussian Process regression method.

    This research offers contributions on two different levels to the verification of stochastic parametric hybrid systems. From a theoretical point of view, it offers a proof that the reachability probability function is a smooth function of the uncertain parameters of the model, and hence Gaussian Process techniques can be used to obtain an efficient analytical approximation of the function. From a practical point of view, I have implemented all the statistical and approximation techniques described above as part of the publicly available ProbReach tool, including a Gaussian Process Expectation Propagation algorithm that performs Gaussian Process classification and regression for univariate and multiple class labels. My empirical evaluation of the presented techniques on a number of case studies has shown that the Gaussian Process approach offers a substantial advantage over standard statistical model checking techniques. This work was supported by the SAgE Doctoral Training Scholarships of Newcastle University.
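    As an illustration of the statistical techniques mentioned above, the sketch below estimates a reachability probability by plain Monte Carlo sampling with a Hoeffding-style confidence interval. It is a minimal sketch in Python: the toy simulation, its parameters, and the function names are invented stand-ins, not ProbReach's interface.

        import math
        import random

        def mc_reach_probability(simulate, n_samples, confidence=0.99):
            """Monte Carlo estimate of a reachability probability.

            `simulate` is a hypothetical stand-in: it samples the random
            parameters of the model, runs one trajectory, and returns
            True iff a goal state was reached.
            """
            hits = sum(1 for _ in range(n_samples) if simulate())
            p_hat = hits / n_samples
            # Hoeffding: P(|p_hat - p| >= eps) <= 2 * exp(-2 * n * eps^2),
            # so this eps yields the requested two-sided confidence level.
            eps = math.sqrt(math.log(2.0 / (1.0 - confidence)) / (2.0 * n_samples))
            return p_hat, (max(0.0, p_hat - eps), min(1.0, p_hat + eps))

        def toy_simulation():
            # Toy stochastic hybrid system: a random growth rate decides
            # whether a one-dimensional flow crosses the goal threshold.
            k = random.gauss(1.0, 0.3)
            x = 0.1
            for _ in range(100):          # crude Euler integration
                x += 0.01 * k * x
                if x >= 0.2:              # goal region reached
                    return True
            return False

        p_hat, (lo, hi) = mc_reach_probability(toy_simulation, 100_000)
        print(f"P(reach) ~ {p_hat:.4f}, 99% CI [{lo:.4f}, {hi:.4f}]")

    The Quasi-Monte Carlo variants mentioned in the abstract replace the independent random draws with low-discrepancy point sets, which changes how the estimation error is bounded but not the overall structure of the loop.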

    Acta Cybernetica: Volume 24, Number 4.


    Spatio-temporal logic for the analysis of biochemical models

    Process algebra, formal specification, and model checking are all well studied techniques in the analysis of concurrent computer systems. More recently these techniques have been applied to the analysis of biochemical systems which, at an abstract level, have similar patterns of behaviour to concurrent processes. Process algebraic models and temporal logic specifications, along with their associated model-checking techniques, have been used to analyse biochemical systems. In this thesis we develop a spatio-temporal logic, the Logic of Behaviour in Context (LBC), for the analysis of biochemical models. That is, we define and study the application of a formal specification language which not only expresses temporal properties of biochemical models, but expresses spatial or contextual properties as well. The logic can be used to express, or specify, the behaviour of a model when it is placed into the context of another model. We also explore the types of properties which can be expressed in LBC; various algorithms for model checking LBC, each an improvement on the last; the implementation of the computational tools to support model checking LBC; and a case study on the analysis of models of post-translational biochemical oscillators using LBC. We show that a number of interesting and useful properties can be expressed in LBC and that it is possible to express highly useful properties of real models in the biochemistry domain, with practical application. Statements in LBC can be thought of as expressing computational experiments which can be performed automatically by means of the model checker. Indeed, many of these computational experiments can be higher-order, meaning that one succinct and precise specification in LBC can represent a number of experiments which can be automatically executed by the model checker.
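    As a flavour of the kind of property involved, a behaviour-in-context formula combines a context (another model) with a temporal condition. Schematically, in an illustrative notation (the species, threshold and time interval are invented, and the precise LBC syntax is the one defined in the thesis):

    \[
      \mathit{Inhibitor} \,\rhd\, \mathbf{F}_{[0,60]}\,\bigl([\mathit{Protein}] < \theta\bigr)
    \]

    read as: when the model under study is composed with the Inhibitor context, the concentration of Protein eventually falls below the threshold θ within 60 time units. Model checking such a formula amounts to automatically running the composition experiment and checking the temporal condition on the resulting behaviour.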

    Efficient Model Checking: The Power of Randomness


    Dynamic analysis of Cyber-Physical Systems

    With recent advances in communication and computation technologies, the integration of software into sensing, actuation, and control is common. This has led to a new branch of study called Cyber-Physical Systems (CPS). Avionics, automotive systems, power grids, medical devices, and robotics are a few examples of such systems. As these systems are part of critical infrastructure, it is very important to ensure that they function reliably, without failures. While testing improves confidence in these systems, it does not establish the absence of scenarios where the system fails. The focus of this thesis is on formal verification techniques for cyber-physical systems that prove the absence of errors in a given system. In particular, this thesis focuses on dynamic analysis techniques that bridge the gap between testing and verification.

    This thesis uses the framework of hybrid input/output automata for modeling CPS. Formal verification of hybrid automata is undecidable in general; because of this undecidability result, no algorithm is guaranteed to terminate for all models. This thesis therefore focuses on developing heuristics for verification that exploit sample executions of the system. Moreover, the goal of the dynamic analysis techniques proposed in this thesis is to ensure that the techniques are sound, i.e., they always return the right answer, and relatively complete, i.e., they terminate when the system satisfies certain special conditions. For undecidable problems, such theoretical guarantees are the strongest that can be expected of any automatic procedure. This thesis focuses on safety properties, which require that nothing bad happens. In particular, we consider invariant and temporal precedence properties; temporal precedence properties ensure that the temporal ordering of certain events in every execution satisfies a given specification.

    This thesis introduces the notion of a discrepancy function that aids in dynamic analysis of CPS. Informally, discrepancy functions capture the convergence or divergence of continuous behaviors in CPS. In control theory, several proof certificates, such as contraction metrics and incremental stability, have been proposed to capture the convergence and divergence of solutions of ordinary differential equations. This thesis establishes that discrepancy functions generalize such proof certificates. Further, this thesis proposes a new technique to compute discrepancy functions for continuous systems with linear ODEs from sample executions. One of the main contributions of this thesis is a technique to compute an over-approximation of the set of reachable states using sample executions and discrepancy functions. Using this reachability computation technique, the thesis proposes a safety verification algorithm which is proved to be sound and relatively complete. The technique is implemented in a tool called Compare-Execute-Check-Engine (C2E2), and experimental results show that it is scalable.

    To demonstrate the applicability of the algorithms presented, two challenging case studies are analyzed as part of this thesis. The first case study concerns an alerting mechanism for parallel aircraft landing; for this case study, the dynamic analysis presented for invariant verification is extended to handle temporal properties. The second case study concerns verifying key specifications of a powertrain control system; new algorithms for computing discrepancy functions were implemented in C2E2 for this case study. Both case studies demonstrate that the dynamic analysis techniques give promising results and can be applied to realistic CPS. For distributed CPS implementations, where message passing and clock skews between agents make formal verification difficult to scale, this thesis presents a dynamic analysis algorithm for inferring global predicates. Such global predicates include assertions about the physical state and the software state of all the agents involved in a distributed CPS. This algorithm is applied to coordinated robotic maneuvers for inferring safety and detecting deadlocks.
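    To ground the central notion: a discrepancy function bounds how far two trajectories starting from nearby states can drift apart, i.e. it provides a bound of the form |ξ(x₁, t) − ξ(x₂, t)| ≤ β(|x₁ − x₂|, t), with β vanishing as the initial states approach each other. For a linear ODE x' = Ax, one textbook choice follows from the logarithmic norm of A; the Python sketch below uses that bound to bloat a single simulated execution into a reach-set over-approximation. This is a minimal illustration of the idea under invented parameters, not C2E2's actual algorithm.

        import numpy as np
        from scipy.integrate import solve_ivp

        # For x' = A x, the logarithmic norm mu(A) = lambda_max((A + A^T)/2)
        # gives the discrepancy bound
        #   |xi(x1, t) - xi(x2, t)| <= |x1 - x2| * exp(mu(A) * t),
        # so a tube of radius delta * exp(mu(A) * t) around one simulated
        # trajectory over-approximates the reach set of the delta-ball.

        A = np.array([[0.0, 1.0],
                      [-2.0, -0.5]])      # illustrative stable system
        mu = np.linalg.eigvalsh((A + A.T) / 2).max()

        x0 = np.array([1.0, 0.0])          # centre of the initial ball
        delta = 0.05                       # radius of the initial ball
        ts = np.linspace(0.0, 5.0, 101)
        sol = solve_ivp(lambda t, x: A @ x, (0.0, 5.0), x0, t_eval=ts)

        radii = delta * np.exp(mu * ts)    # discrepancy-based bloating
        for t, centre, r in zip(ts[::25], sol.y.T[::25], radii[::25]):
            print(f"t={t:4.2f}  centre={centre.round(3)}  tube radius={r:.4f}")

    A coarse global exponent like this is sound but conservative; the thesis's point is precisely to compute tighter discrepancy functions from the sample executions themselves.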

    Combining Machine Learning and Formal Methods for Complex Systems Design

    During the last 20 years, model-based design has become standard practice in many fields such as automotive and aerospace engineering and systems and synthetic biology. This approach allows a considerable improvement of the final product's quality and reduces overall prototyping costs. In these contexts, formal methods, such as temporal logics, and model checking approaches have been successfully applied. They allow a precise description and automatic verification of the prototype's requirements.

    In the recent past, market demand for better-performing and safer devices has grown unstoppably, inevitably leading to the creation of more and more complicated devices. The rise of cyber-physical systems, which are on their way to becoming massively pervasive, brings the complexity level to the next step and opens many new challenges. First, the descriptive power of standard temporal logics is no longer sufficient to handle all the kinds of requirements designers need (consider, for example, non-functional requirements). Second, standard model checking techniques are unable to manage such a level of complexity (consider the well-known curse of state space explosion).

    In this thesis, we leverage machine learning techniques, active learning, and optimization approaches to face the challenges mentioned above. In particular, we define signal measure logic, a novel temporal logic suited to describing non-functional requirements. We also use evolutionary algorithms and signal temporal logic to tackle a supervised classification problem and a system design problem involving multiple conflicting requirements (i.e., multi-objective optimization problems). Finally, we use an active learning approach, based on Gaussian processes, to deal with falsification problems in the automotive field and to solve a so-called threshold synthesis problem, discussing an epidemics case study.
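    As an illustration of the active learning component, the sketch below runs a generic Gaussian-process-guided falsification loop: fit a GP to the observed robustness values of a specification, then query next where the lower confidence bound is smallest, i.e. where a negative (falsifying) robustness is most plausible. It is a minimal sketch under invented assumptions: the robustness function, kernel, and budget are illustrative stand-ins, not the tool or benchmarks used in the thesis.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        def robustness(u):
            # Hypothetical stand-in for the robustness of a signal temporal
            # logic specification as a function of an input parameter u;
            # a negative value means the specification is falsified.
            return np.sin(3.0 * u) + 0.6 - 0.4 * u

        rng = np.random.default_rng(0)
        X = rng.uniform(0.0, 4.0, size=(5, 1))       # initial design
        y = np.array([robustness(u[0]) for u in X])

        gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=1e-6)
        grid = np.linspace(0.0, 4.0, 400).reshape(-1, 1)

        for _ in range(30):
            gp.fit(X, y)
            mean, std = gp.predict(grid, return_std=True)
            # Lower-confidence-bound acquisition: sample where the model
            # thinks the robustness could plausibly dip lowest.
            cand = grid[np.argmin(mean - 2.0 * std)]
            val = robustness(cand[0])
            X = np.vstack([X, cand.reshape(1, 1)])
            y = np.append(y, val)
            if val < 0.0:
                print(f"falsifying input u={cand[0]:.3f}, robustness={val:.3f}")
                break
        else:
            print("no falsifying input found within the budget")

    The same loop structure underlies GP-based threshold synthesis: instead of stopping at the first negative sample, one keeps refining the GP until the zero level set of the robustness surface is located to the desired precision.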