
    Recognizable tree series with discounting

    We consider weighted tree automata with discounting over commutative semirings. For their behaviors we establish a Kleene theorem and an MSO-logic characterization. We also introduce weighted Muller tree automata with discounting over the max-plus and the min-plus semirings, and we show their expressive equivalence with two fragments of weighted MSO sentences.

    Near-Optimal Scheduling for LTL with Future Discounting

    We study the search problem for optimal schedulers for linear temporal logic (LTL) with future discounting. The logic, introduced by Almagor, Boker and Kupferman, is a quantitative variant of LTL in which an event in the far future makes only a discounted contribution to the truth value (a real number in the unit interval [0, 1]). The precise problem we study, which arises naturally, e.g., in the search for a scheduler that recovers from an internal error state as soon as possible, is the following: given a Kripke frame, a formula and a number in [0, 1] called a margin, find a path of the Kripke frame that is optimal with respect to the formula up to the prescribed margin (a truly optimal path may not exist). We present an algorithm for the problem; it works even in the extended setting with propositional quality operators, a setting where (threshold) model-checking is known to be undecidable.
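    The bounded-horizon idea behind such a search can be illustrated with a small sketch. Assuming an exponential discount d(i) = λ^i with λ = 0.5 and a single atomic proposition (the paper's setting is far more general), every position beyond a finite horizon contributes less than the margin, so exploring only bounded path prefixes of a toy Kripke frame already yields a value within the margin of the optimum. All state names, labels and numbers below are invented for illustration.

```python
# Toy Kripke frame: a labelling with atomic propositions and an edge relation.
# Names and numbers are illustrative; the paper treats full LTL with
# several discounting functions, not a single atomic proposition.
LABELS = {"init": set(), "err": set(), "ok": {"p"}}
EDGES = {"init": ["err", "ok"], "err": ["err", "init"], "ok": ["ok"]}

LAMBDA = 0.5  # exponential discount: position i is weighted by LAMBDA**i


def value_discounted_eventually(path):
    """Value of a discounted 'eventually p' on a finite path prefix:
    the maximum over positions i of LAMBDA**i where p holds (0 if p never holds)."""
    return max(
        (LAMBDA ** i for i, s in enumerate(path) if "p" in LABELS[s]),
        default=0.0,
    )


def near_optimal_path(start, margin):
    """Return a path prefix whose value is within `margin` of the optimum.
    Positions beyond horizon N contribute at most LAMBDA**N, so exploring
    all prefixes of length N with LAMBDA**N <= margin suffices."""
    horizon = 0
    while LAMBDA ** horizon > margin:
        horizon += 1
    best_path, best_value = [start], -1.0
    stack = [[start]]
    while stack:
        path = stack.pop()
        if len(path) > horizon:
            continue
        value = value_discounted_eventually(path)
        if value > best_value:
            best_path, best_value = path, value
        for succ in EDGES[path[-1]]:
            stack.append(path + [succ])
    return best_path, best_value


print(near_optimal_path("init", margin=0.05))  # e.g. (['init', 'ok'], 0.5)
```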

    Discounting in LTL

    In recent years there has been a growing need and interest in formalizing and reasoning about the quality of software and hardware systems. As opposed to traditional verification, where one handles the question of whether or not a system satisfies a given specification, reasoning about quality addresses the question of how well the system satisfies the specification. One direction in this effort is to refine the "eventually" operators of temporal logic to discounting operators: the satisfaction value of a specification is a value in [0,1], where the longer it takes to fulfill eventuality requirements, the smaller the satisfaction value is. In this paper we introduce an augmentation of Linear Temporal Logic (LTL) with discounting and study it, as well as its combination with propositional quality operators. We show that one can augment LTL with an arbitrary set of discounting functions while preserving the decidability of the model-checking problem. Further augmenting the logic with unary propositional quality operators preserves decidability, whereas adding an average operator makes some problems undecidable. We also discuss the complexity of the problem, as well as various extensions.
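    As a small, hedged illustration of how a discounted "eventually" yields a satisfaction value in [0,1]: with an exponential discounting function η(i) = λ^i (the paper allows arbitrary discounting functions), the value of "eventually p" on an ultimately periodic trace is λ^i for the earliest position i at which p holds, since η is decreasing. The trace representation and the numbers below are assumptions made only for the sketch.

```python
def discounted_eventually(prefix, loop, prop, lam=0.9):
    """Satisfaction value of a discounted 'eventually prop' on the ultimately
    periodic trace prefix . loop^omega, with discounting function lam**i.
    Since lam**i decreases with i, the supremum is attained at the earliest
    position where prop holds; if prop never occurs, the value is 0."""
    one_unfolding = prefix + loop  # one unfolding of the loop shows every label
    for i, letter in enumerate(one_unfolding):
        if prop in letter:
            return lam ** i
    return 0.0


# Example: 'ready' first holds at position 3, so the value is 0.9**3 = 0.729.
print(discounted_eventually([{"busy"}, {"busy"}, set()], [{"ready"}, {"busy"}], "ready"))
```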

    Exact and Approximate Determinization of Discounted-Sum Automata

    A discounted-sum automaton (NDA) is a nondeterministic finite automaton with edge weights, valuing a run by the discounted sum of the visited edge weights. More precisely, the weight in the i-th position of the run is divided by λ^i, where the discount factor λ is a fixed rational number greater than 1. The value of a word is the minimal value of the automaton's runs on it. Discounted summation is a common and useful measuring scheme, especially for infinite sequences, reflecting the assumption that earlier weights are more important than later weights. Unfortunately, determinization of NDAs, which is often essential in formal verification, is, in general, not possible. We provide positive news, showing that every NDA with an integral discount factor is determinizable. We complete the picture by proving that the integers characterize exactly the discount factors that guarantee determinizability: for every nonintegral rational discount factor λ, there is a nondeterminizable λ-NDA. We also prove that the class of NDAs with integral discount factors enjoys closure under the algebraic operations min, max, addition, and subtraction, which is not the case for general NDAs nor for deterministic NDAs. For general NDAs, we look into approximate determinization, which is always possible as the influence of a word's suffix decays. We show that the naive approach, of unfolding the automaton computations up to a sufficient level, is doubly exponential in the discount factor. We provide an alternative construction for approximate determinization, which is singly exponential in the discount factor, in the precision, and in the number of states. We also prove matching lower bounds, showing that the exponential dependency on each of these three parameters cannot be avoided. All our results hold equally for automata over finite words and for automata over infinite words.
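    The valuation scheme can be made concrete with a small sketch: a run's value is the sum of its weights, with the weight read at position i divided by λ^i, and a word's value is the minimum over all runs, computable by a simple dynamic program over prefixes. The automaton, its weights and the discount factor below are invented for illustration (positions are counted from 0 here, a convention the abstract does not fix).

```python
import math

# Transitions of a toy nondeterministic discounted-sum automaton (NDA):
# (state, letter) -> list of (successor, weight). All values are illustrative.
DELTA = {
    ("q0", "a"): [("q0", 1.0), ("q1", 4.0)],
    ("q0", "b"): [("q0", 2.0)],
    ("q1", "a"): [("q1", 0.0)],
    ("q1", "b"): [("q1", 1.0)],
}
INITIAL = "q0"
LAMBDA = 2.0  # an integral discount factor (exactly the determinizable case)


def word_value(word):
    """Value of `word`: the minimum over all runs of the discounted sum
    sum_i w_i / LAMBDA**i, where w_i is the weight read at position i.
    Computed by dynamic programming over prefixes (minimum over partial runs)."""
    best = {INITIAL: 0.0}  # state -> minimal discounted sum of a partial run
    for i, letter in enumerate(word):
        successors = {}
        for state, acc in best.items():
            for succ, weight in DELTA.get((state, letter), []):
                value = acc + weight / LAMBDA ** i
                if value < successors.get(succ, math.inf):
                    successors[succ] = value
        best = successors
    return min(best.values(), default=math.inf)  # inf if the word has no run


print(word_value("aab"))  # 2.0: staying in q0 gives 1 + 1/2 + 2/4, the cheapest run
```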

    A Semantic Framework for Test Coverage (Extended Version)

    Since testing is inherently incomplete, test selection is of vital importance. Coverage measures evaluate the quality of a test suite and help the tester select test cases with maximal impact at minimum cost. Existing coverage criteria for test suites are usually defined in terms of syntactic characteristics of the implementation under test or its specification. Typical black-box coverage metrics are state and transition coverage of the specification. White-box testing often considers statement, condition and path coverage. A disadvantage of this syntactic approach is that different coverage figures are assigned to systems that are behaviorally equivalent but syntactically different. Moreover, those coverage metrics do not take into account that certain failures are more severe than others, and that more testing effort should be devoted to uncovering the most important bugs, while less critical system parts can be tested less thoroughly. This paper introduces a semantic approach to test coverage. Our starting point is a weighted fault model, which assigns a weight to each potential error in an implementation. We define a framework of coverage measures that express how well a test suite covers such a specification, taking the error weights into account. Since our notions are semantic, they are insensitive to replacing a specification by one with equivalent behaviour. We present several algorithms that, given a certain minimality criterion, compute a minimal test suite with maximal coverage. These algorithms work on a syntactic representation of weighted fault models as fault automata. They are based on existing and novel optimization problems. Finally, we illustrate our approach by analyzing and comparing a number of test suites for a chat protocol.
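    The weighted-coverage idea can be sketched with a toy example: a weighted fault model assigns a severity weight to each potential error, and a test suite's coverage is the total weight of the faults it can detect, possibly normalised by the total weight. The fault names, weights and detection relation below are invented; the paper itself works with fault automata and a trace-based notion of detection.

```python
# Illustrative weighted fault model: each potential fault gets a severity weight.
FAULT_WEIGHTS = {
    "lost_message": 8.0,
    "wrong_ack": 5.0,
    "late_timeout": 2.0,
    "bad_checksum": 1.0,
}

# Which faults each test case is able to detect (a stand-in for the
# trace-based detection relation induced by a fault automaton).
DETECTS = {
    "test_handshake": {"lost_message", "wrong_ack"},
    "test_timeout": {"late_timeout"},
    "test_payload": {"bad_checksum"},
}


def absolute_coverage(test_suite):
    """Total weight of the faults detected by at least one test in the suite."""
    covered = set().union(*(DETECTS[t] for t in test_suite)) if test_suite else set()
    return sum(FAULT_WEIGHTS[f] for f in covered)


def relative_coverage(test_suite):
    """Detected weight as a fraction of the total fault weight (a value in [0, 1])."""
    return absolute_coverage(test_suite) / sum(FAULT_WEIGHTS.values())


suite = ["test_handshake", "test_timeout"]
print(absolute_coverage(suite))  # 15.0: the two most severe faults plus late_timeout
print(relative_coverage(suite))  # 0.9375
```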

    Linear Time Logics - A Coalgebraic Perspective

    We describe a general approach to deriving linear time logics for a wide variety of state-based, quantitative systems, by modelling the latter as coalgebras whose type incorporates both branching behaviour and linear behaviour. Concretely, we define logics whose syntax is determined by the choice of linear behaviour and whose domain of truth values is determined by the choice of branching, and we provide two equivalent semantics for them: a step-wise semantics amenable to automata-based verification, and a path-based semantics akin to those of standard linear time logics. We also provide a semantic characterisation of the associated notion of logical equivalence, and relate it to previously defined maximal trace semantics for such systems. Instances of our logics support reasoning about the possibility, likelihood or minimal cost of exhibiting a given linear time property. We conclude with a generalisation of the logics, dual in spirit to logics with discounting, which increases their practical appeal in the context of resource-aware computation by incorporating a notion of offsetting.
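    One of the instances mentioned above, reasoning about the minimal cost of exhibiting a linear time property, can be sketched very simply: in a weighted transition system interpreted over the min-plus semiring, a step-wise evaluation gives the cheapest way to produce a given finite trace. The system, costs and trace below are invented, and the coalgebraic framework is of course far more general than this single instance.

```python
import math

# A small weighted labelled transition system:
# state -> list of (action, cost, successor). Everything here is illustrative.
WLTS = {
    "s0": [("req", 1.0, "s1"), ("req", 3.0, "s2")],
    "s1": [("grant", 2.0, "s0"), ("deny", 0.0, "s2")],
    "s2": [("req", 1.0, "s1")],
}


def min_cost_of_trace(state, trace):
    """Minimal total cost of a path from `state` whose action labels spell `trace`
    (a simple min-plus, step-wise evaluation of a linear time property)."""
    if not trace:
        return 0.0
    action, rest = trace[0], trace[1:]
    costs = [
        cost + min_cost_of_trace(succ, rest)
        for (act, cost, succ) in WLTS[state]
        if act == action
    ]
    return min(costs, default=math.inf)  # inf: the trace cannot be exhibited


print(min_cost_of_trace("s0", ["req", "grant", "req"]))  # 1 + 2 + 1 = 4.0
```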

    Weighted Modal Transition Systems

    Specification theories as a tool in model-driven development processes of component-based software systems have recently attracted considerable attention. Current specification theories are, however, qualitative in nature, and therefore fragile in the sense that the inevitable approximation of systems by models, combined with the fundamental unpredictability of hardware platforms, makes it difficult to transfer conclusions about the behavior, based on models, to the actual system. Hence this approach is arguably unsuited for modern software systems. We propose here the first specification theory which makes it possible to capture quantitative aspects during the refinement and implementation process, thus addressing the problems of the qualitative setting. Our proposed quantitative specification framework uses weighted modal transition systems as a formal model of specifications. These are labeled transition systems with the additional feature that they can model optional behavior which may or may not be implemented by the system. Satisfaction and refinement are lifted from the well-known qualitative setting to our quantitative setting by introducing a notion of distances between weighted modal transition systems. We show that quantitative versions of parallel composition as well as quotient (the dual to parallel composition) inherit the properties from the Boolean setting.
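    The flavour of the distances involved can be conveyed by a small sketch: between two deterministic weighted transition systems over the same actions, one can compute a discounted distance in which each step contributes the difference of the matched weights plus a discounted distance between the successors, obtained as a fixed point by value iteration. The paper's actual distance additionally deals with may/must modalities and nondeterminism; all names and numbers below are illustrative.

```python
import math

LAMBDA = 0.5  # discount factor for differences occurring further in the future

# Two deterministic weighted transition systems over the same action alphabet:
# state -> {action: (weight, successor)}. All values are illustrative.
SPEC = {"p0": {"send": (2.0, "p1")}, "p1": {"ack": (1.0, "p0")}}
IMPL = {"q0": {"send": (3.0, "q1")}, "q1": {"ack": (1.0, "q0")}}


def distance(spec, impl, iterations=50):
    """Discounted simulation-like distance, computed by value iteration:
    d(s, t) = max over actions of |w_s - w_t| + LAMBDA * d(s', t'),
    and infinity if the two states do not enable the same actions."""
    pairs = [(s, t) for s in spec for t in impl]
    d = {pair: 0.0 for pair in pairs}
    for _ in range(iterations):
        new = {}
        for s, t in pairs:
            if set(spec[s]) != set(impl[t]):
                new[(s, t)] = math.inf
                continue
            new[(s, t)] = max(
                (abs(spec[s][a][0] - impl[t][a][0])
                 + LAMBDA * d[(spec[s][a][1], impl[t][a][1])]
                 for a in spec[s]),
                default=0.0,
            )
        d = new
    return d


print(distance(SPEC, IMPL)[("p0", "q0")])  # about 1.33: the gap on 'send', discounted around the loop
```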

    A Semantic Framework for Test Coverage

    Since testing is inherently incomplete, test selection is of vital importance. Coverage measures evaluate the quality of a test suite and help the tester select test cases with maximal impact at minimum cost. Existing coverage criteria for test suites are usually defined in terms of syntactic characteristics of the implementation under test or its specification. Typical black-box coverage metrics are state and transition coverage of the specification. White-box testing often considers statement, condition and path coverage. A disadvantage of this syntactic approach is that different coverage figures are assigned to systems that are behaviorally equivalent but syntactically different. Moreover, those coverage metrics do not take into account that certain failures are more severe than others, and that more testing effort should be devoted to uncovering the most important bugs, while less critical system parts can be tested less thoroughly. This paper introduces a semantic approach to test coverage. Our starting point is a weighted fault model, which assigns a weight to each potential error in an implementation. We define a framework of coverage measures that express how well a test suite covers such a specification, taking the error weights into account. Since our notions are semantic, they are insensitive to replacing a specification by one with equivalent behaviour. We present several algorithms that, given a certain minimality criterion, compute a minimal test suite with maximal coverage. These algorithms work on a syntactic representation of weighted fault models as fault automata. They are based on existing and novel optimization problems. Finally, we illustrate our approach by analyzing and comparing a number of test suites for a chat protocol.

    Regular Combinators for String Transformations

    We focus on (partial) functions that map input strings to a monoid, such as the set of integers with addition or the set of output strings with concatenation. The notion of regularity for such functions has been defined using two-way finite-state transducers, (one-way) cost register automata, and MSO-definable graph transformations. In this paper, we give an algebraic and machine-independent characterization of this class, analogous to the definition of regular languages by regular expressions. When the monoid is commutative, we prove that every regular function can be constructed from constant functions using the combinators of choice, split sum, and iterated sum, which are analogs of union, concatenation, and Kleene-*, respectively, but enforce unique (unambiguous) parsing. Our main result is for the general case of non-commutative monoids, which is of particular interest for capturing regular string-to-string transformations for document processing. We prove that the following additional combinators suffice for constructing all regular functions: (1) the left-additive versions of split sum and iterated sum, which allow transformations such as string reversal; (2) sum of functions, which allows transformations such as copying of strings; and (3) function composition, or alternatively, a new concept of chained sum, which allows output values from adjacent blocks to mix.
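    A few of the commutative-case combinators can be sketched directly: partial functions from strings into the integers with addition, built from constant functions by choice (defined only when exactly one branch is defined), split sum (defined only when the input splits uniquely into two parts on which both arguments are defined) and iterated sum. The representation below, and the brute-force handling of splits and decompositions, are simplifications made only for illustration.

```python
# Partial functions from strings to the additive monoid of integers,
# represented as Python functions returning an int or None (undefined).
# Names and the brute-force search over splits are purely illustrative.

def const(letter, value):
    """Defined only on the one-letter string `letter`, where it returns `value`."""
    return lambda w: value if w == letter else None


def choice(f, g):
    """Union with unambiguous parsing: defined only if exactly one of f, g is defined."""
    def h(w):
        a, b = f(w), g(w)
        if (a is None) == (b is None):   # both defined or both undefined
            return None
        return a if a is not None else b
    return h


def split_sum(f, g):
    """Analog of concatenation: defined only if w has a unique split w = u.v
    with f(u) and g(v) both defined; the result is f(u) + g(v)."""
    def h(w):
        results = [
            f(w[:i]) + g(w[i:])
            for i in range(len(w) + 1)
            if f(w[:i]) is not None and g(w[i:]) is not None
        ]
        return results[0] if len(results) == 1 else None
    return h


def iterated_sum(f):
    """Analog of Kleene-*: sum of f over a decomposition of w into blocks on
    which f is defined (simplified here: the first decomposition found is used,
    so unambiguity is not actually checked)."""
    def h(w):
        if w == "":
            return 0
        for i in range(1, len(w) + 1):
            head, rest = f(w[:i]), h(w[i:])
            if head is not None and rest is not None:
                return head + rest
        return None
    return h


# Example: count occurrences of 'a' and 'b' with weights 1 and 10.
count = iterated_sum(choice(const("a", 1), const("b", 10)))
print(count("abba"))   # 22
print(count("abc"))    # None: 'c' is not in the domain
```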