
    An expectation transformer approach to predicate abstraction and data independence for probabilistic programs

    In this paper we revisit the well-known technique of predicate abstraction to characterise performance attributes of system models incorporating probability. We recast the theory using expectation transformers, and identify transformer properties which correspond to abstractions that nevertheless yield exact bounds on the performance of infinite-state probabilistic systems. In addition, we extend the developed technique to the special case of "data independent" programs incorporating probability. Finally, we demonstrate the subtlety of the extended technique by using the PRISM model checking tool to analyse an infinite-state protocol, obtaining exact bounds on its performance.
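    The expectation-transformer view described above can be illustrated with the standard weakest pre-expectation rules for a tiny probabilistic language. The sketch below is a minimal, assumed Python encoding of those rules (assignment, probabilistic choice, sequencing) in pGCL style; the mini-language and all names are illustrative, not the paper's formalisation or the PRISM tooling.

        # Weakest pre-expectation transformers for a toy probabilistic language
        # (a sketch; the rules follow the standard pGCL-style semantics).

        def wp_assign(var, expr):
            """wp(var := expr, E) = E[var := expr]."""
            def transform(post):
                return lambda state: post({**state, var: expr(state)})
            return transform

        def wp_pchoice(p, left, right):
            """wp(left [p] right, E) = p * wp(left, E) + (1 - p) * wp(right, E)."""
            def transform(post):
                wl, wr = left(post), right(post)
                return lambda state: p * wl(state) + (1 - p) * wr(state)
            return transform

        def wp_seq(first, second):
            """wp(first; second, E) = wp(first, wp(second, E))."""
            def transform(post):
                return first(second(post))
            return transform

        # Example: x := 0; (x := x + 1 [0.5] skip).  The pre-expectation of the
        # post-expectation "value of x" is 0.5, the expected final value of x.
        prog = wp_seq(
            wp_assign("x", lambda s: 0),
            wp_pchoice(0.5,
                       wp_assign("x", lambda s: s["x"] + 1),
                       wp_assign("x", lambda s: s["x"])),  # skip
        )
        print(prog(lambda s: s["x"])({}))  # 0.5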

    On spatially irregular ordinary differential equations and a pathwise volatility modelling framework

    This thesis develops a new framework for modelling price processes in finance, such as an equity price or foreign exchange rate. This can be related to the conventional Itô calculus-based framework through the time integral of a price's squared volatility. In the new framework, corresponding processes are strictly increasing, solve random ODEs, and are composed with geometric Brownian motion to obtain price processes. The new framework has no dependence on stochastic calculus, so processes can be studied on a pathwise basis using probability-free ODE techniques and functional analysis. The ODEs considered depend on continuous driving functions which are `spatially irregular', meaning they need not have any spatial regularity properties such as Hölder continuity. They are, however, strictly increasing in time, and thus temporally asymmetric. When sensible initial values are chosen, IVP solutions are also strictly increasing, and the IVPs' solution set is shown to contain all differentiable bijections on the non-negative reals. This enables the modelling of any non-negative volatility path which is not zero over intervals, via the time derivative of solutions. Despite this generality, new well-posedness results establish the uniqueness of solutions going forwards in time, and the continuity of the IVPs' solution map. Motivation to explore this framework comes from its connection with the Heston volatility model. The framework explains how Heston price processes can converge to an interval-valued generalisation of the NIG Lévy process, and reveals a deeper relationship between integrated CIR processes and the IG Lévy process. Within this framework, a `Riemann-Liouville-Heston' martingale model is defined which generalises these relationships to fractional counterparts. Implied volatilities from this model are simulated, and exhibit features characteristic of leading `rough' volatility models. Comment: The author's PhD thesis. Major extension of v2. 211 pages, 22 figures.
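    The compositional construction described above (a strictly increasing time-change composed with geometric Brownian motion) can be sketched numerically. The snippet below is only an assumed illustration of that composition: it builds a toy strictly increasing integrated-squared-volatility path and runs a Brownian motion on that clock; the sinusoidal volatility and all parameters are invented for illustration and are not the thesis's model.

        # Toy sketch: compose a strictly increasing "business time" (integrated
        # squared volatility) with geometric Brownian motion to get a price path.
        import numpy as np

        rng = np.random.default_rng(0)
        n_steps, horizon = 10_000, 1.0
        dt = horizon / n_steps
        t = np.linspace(0.0, horizon, n_steps + 1)

        # Illustrative strictly positive volatility path; tau is its running
        # integral of squared values, hence strictly increasing in t.
        sigma = 0.2 + 0.1 * np.sin(6 * np.pi * t)
        tau = np.concatenate(([0.0], np.cumsum(sigma[:-1] ** 2 * dt)))

        # Brownian motion run on the tau-clock, then exponentiated:
        # S_t = S_0 * exp(W(tau_t) - tau_t / 2) in this toy setting.
        d_tau = np.diff(tau)
        w_tau = np.concatenate(([0.0], np.cumsum(rng.normal(0.0, np.sqrt(d_tau)))))
        price = 100.0 * np.exp(w_tau - tau / 2)
        print(price[-1])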

    Formal Verification of Probabilistic SystemC Models with Statistical Model Checking

    Transaction-level modeling with SystemC has been very successful in describing the behavior of embedded systems by providing high-level executable models, many of which have inherent probabilistic behaviors, e.g., random data and unreliable components. It is thus crucial to have both quantitative and qualitative analysis of the probabilities of system properties. Such analysis can be conducted by constructing a formal model of the system under verification and using Probabilistic Model Checking (PMC). However, this method is infeasible for large systems, due to the state space explosion. In this article, we demonstrate the successful use of Statistical Model Checking (SMC) to carry out such analysis directly from large SystemC models and to allow designers to express a wide range of useful properties. The first contribution of this work is a framework to verify properties expressed in Bounded Linear Temporal Logic (BLTL) for SystemC models with both timed and probabilistic characteristics. Second, the framework allows users to expose a rich set of user-code primitives as atomic propositions in BLTL. Moreover, users can define their own fine-grained time resolution rather than the boundary of clock cycles in the SystemC simulation. The third contribution is an implementation of a statistical model checker. It contains automatic monitor generation for producing execution traces of the model-under-verification (MUV), the mechanism for automatically instrumenting the MUV, and the interaction with statistical model checking algorithms. Comment: Journal of Software: Evolution and Process. Wiley, 2017. arXiv admin note: substantial text overlap with arXiv:1507.0818
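    The core statistical-model-checking loop this article relies on can be illustrated independently of SystemC: sample independent simulation traces, check each one with a monitor for a bounded property, and bound the estimation error. The sketch below is a minimal, assumed illustration in Python; the toy lossy-channel model, the monitor, and the Chernoff-Hoeffding sample-size formula stand in for the article's BLTL monitors and SystemC instrumentation, which are not reproduced here.

        # Minimal statistical model checking sketch: Monte Carlo estimation of
        # the probability that a bounded property holds, with a Chernoff-Hoeffding
        # bound on the number of samples.
        import math
        import random

        def simulate_trace(length=50):
            """Toy probabilistic model: each message is dropped independently
            with probability 0.1."""
            return ["drop" if random.random() < 0.1 else "ok" for _ in range(length)]

        def monitor(trace):
            """Toy bounded property (stand-in for a BLTL formula checked on a
            finite trace prefix): at most 3 drops in the first 50 steps."""
            return trace.count("drop") <= 3

        # n samples give an estimate within `delta` of the true probability
        # with confidence at least 1 - alpha.
        delta, alpha = 0.01, 0.05
        n = math.ceil(math.log(2 / alpha) / (2 * delta ** 2))

        successes = sum(monitor(simulate_trace()) for _ in range(n))
        print(f"n = {n}, estimated probability = {successes / n:.3f}")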

    An Approach to Static Performance Guarantees for Programs with Run-time Checks

    Instrumenting programs for performing run-time checking of properties, such as regular shapes, is a common and useful technique that helps programmers detect incorrect program behaviors. This is especially true in dynamic languages such as Prolog. However, such run-time checks inevitably introduce run-time overhead (in execution time, memory, energy, etc.). Several approaches have been proposed for reducing such overhead, such as eliminating the checks that can be statically proved to always succeed, and/or optimizing the way in which the (remaining) checks are performed. However, there are cases in which it is not possible to remove all checks statically (e.g., open libraries which must check their interfaces, complex properties, unknown code, etc.) and in which, even after optimizations, these remaining checks may still introduce an unacceptable level of overhead. It is thus important for programmers to be able to determine the additional cost due to the run-time checks and compare it to some notion of admissible cost. The common practice used for estimating run-time checking overhead is profiling, which is not exhaustive by nature. Instead, we propose a method that uses static analysis to estimate such overhead, with the advantage that the estimations are functions parameterized by input data sizes. Unlike profiling, this approach can provide guarantees for all possible execution traces, and allows assessing how the overhead grows as the size of the input grows. Our method also extends an existing assertion verification framework to express "admissible" overheads, and statically and automatically checks whether the instrumented program conforms with such specifications. Finally, we present an experimental evaluation of our approach that suggests that our method is feasible and promising. Comment: 15 pages, 3 tables; submitted to ICLP'18, accepted as technical communication
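    The key idea of expressing run-time-check overhead as functions of input data size, and checking it against an "admissible" overhead specification, can be sketched as follows. The cost functions and constants below are assumed purely for illustration (they are not produced by the paper's static analysis); only the shape of the comparison matters.

        # Sketch: overhead of run-time checks as a function of input size n,
        # compared against an admissible-overhead specification.

        def cost_original(n):
            """Assumed cost of the uninstrumented program on input size n
            (roughly n log n steps)."""
            return 5 * n * max(n.bit_length(), 1)

        def cost_checks(n):
            """Assumed cost of the run-time shape checks (a linear traversal)."""
            return 3 * n

        def overhead(n):
            """Relative overhead introduced by the checks, as a function of n."""
            return cost_checks(n) / cost_original(n)

        def admissible(n, bound=0.25):
            """Admissible-overhead specification: checks may add at most 25%."""
            return overhead(n) <= bound

        for n in (10, 100, 1_000, 10_000):
            print(n, f"{overhead(n):.3f}", admissible(n))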