
    Turing Automata and Graph Machines

    Indexed monoidal algebras are introduced as an equivalent structure for self-dual compact closed categories, and a coherence theorem is proved for the category of such algebras. Turing automata and Turing graph machines are defined by generalizing the classical Turing machine concept, so that the collection of such machines becomes an indexed monoidal algebra. By analogy with the von Neumann data-flow computer architecture, Turing graph machines are proposed as potentially reversible low-level universal computational devices, and a truly reversible molecular-size hardware model is presented as an example.
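
    As background (standard textbook definitions, not this paper's axioms): a compact closed category equips every object A with a dual A* together with unit and counit maps satisfying the snake identities, and self-duality identifies A with A*.

```latex
\[
  \eta_A : I \to A \otimes A^{*},
  \qquad
  \varepsilon_A : A^{*} \otimes A \to I,
\]
\[
  (\mathrm{id}_A \otimes \varepsilon_A) \circ (\eta_A \otimes \mathrm{id}_A) = \mathrm{id}_A,
  \qquad
  (\varepsilon_A \otimes \mathrm{id}_{A^{*}}) \circ (\mathrm{id}_{A^{*}} \otimes \eta_A) = \mathrm{id}_{A^{*}},
\]
```

    up to the monoidal coherence isomorphisms; in the self-dual case A ≅ A*, so both structure maps can be taken with types I → A ⊗ A and A ⊗ A → I.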

    Verifying sequentially consistent memory using interface refinement


    On the Modeling and Verification of Collective and Cooperative Systems

    The formal description and verification of networks of cooperative and interacting agents is made difficult by the interplay of several different behavioral patterns, models of communication, and scalability issues. In this paper, we explore the functionalities and the expressiveness of a general-purpose process algebraic framework for the specification and model-checking-based analysis of collective and cooperative systems. The proposed syntactic and semantic schemes are general enough to be adapted with small modifications to heterogeneous application domains, such as crowdsourcing systems, trustworthy networks, and distributed ledger technologies.

    Dynamic production system identification for smart manufacturing systems

    This paper presents a methodology, called production system identification, to produce a model of a manufacturing system from logs of the system's operation. The model produced is intended to aid in making production scheduling decisions. Production system identification is similar to machine-learning methods of process mining in that they both use logs of operations. However, process mining falls short of addressing important requirements: it does not (1) account for infrequent exceptional events that may provide insight into system capabilities and reliability, (2) offer means to validate the model relative to an understanding of causes, or (3) update the model as the situation on the production floor changes. The paper describes a genetic programming (GP) methodology that uses Petri nets, probabilistic neural nets, and a causal model of production system dynamics to address these shortcomings. A coloured Petri net formalism appropriate to GP is developed and used to interpret the log. Interpreted logs provide a relation between Petri net states and exceptional system states that can be learned by means of a novel formulation of probabilistic neural nets (PNNs). A generalized stochastic Petri net and the PNNs are used to validate the GP-generated solutions. The methodology is evaluated with an example based on an automotive assembly system.
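
    The paper's actual methodology combines coloured Petri nets, PNNs, and generalized stochastic Petri nets; purely as an illustration of the kind of genetic-programming loop such an approach builds on, here is a minimal, hypothetical sketch. The callables random_model, fitness, mutate, and crossover stand in for the paper's Petri-net-specific operators and are not the authors' code.

```python
import random

# Minimal sketch of a GP-style search over candidate process models scored
# against an event log. random_model, fitness, mutate, and crossover are
# caller-supplied placeholders, not the paper's operators.
def evolve(log, random_model, fitness, mutate, crossover,
           pop_size=50, generations=100, elite=5):
    """Return the candidate model that best replays the event log."""
    population = [random_model() for _ in range(pop_size)]
    for _ in range(generations):
        # Rank candidates by how well they replay / explain the log.
        ranked = sorted(population, key=lambda m: fitness(m, log), reverse=True)
        survivors = ranked[:elite]  # elitism: keep the best models unchanged
        children = []
        while len(children) < pop_size - elite:
            # Pick two parents from the better half and recombine them.
            a, b = random.sample(ranked[:pop_size // 2], 2)
            children.append(mutate(crossover(a, b)))
        population = survivors + children
    return max(population, key=lambda m: fitness(m, log))
```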

    A complete quantitative deduction system for the bisimilarity distance on Markov chains

    In this paper we propose a complete axiomatization of the bisimilarity distance of Desharnais et al. for the class of finite labelled Markov chains. Our axiomatization is given in the style of a quantitative extension of equational logic recently proposed by Mardare, Panangaden, and Plotkin (LICS 2016) that uses equality relations t ≡_ε s indexed by rationals, expressing that "t is approximately equal to s up to an error ε". Notably, our quantitative deduction system extends in a natural way the equational system for probabilistic bisimilarity given by Stark and Smolka by introducing an axiom for dealing with the Kantorovich distance between probability distributions. The axiomatization is then used to propose a metric extension of a Kleene-style representation theorem for finite labelled Markov chains, which was proposed (in a more general coalgebraic fashion) by Silva et al. (Inf. Comput. 2011).
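
    As standard background (not the paper's axioms verbatim): a quantitative judgement t ≡_ε s asserts that the distance between t and s is at most ε, and the Kantorovich distance lifts a metric on states to probability distributions.

```latex
\[
  t \equiv_{\varepsilon} s
  \quad\text{reads as}\quad
  d(t, s) \le \varepsilon,
  \qquad \varepsilon \in \mathbb{Q}_{\ge 0},
\]
\[
  \mathcal{K}(d)(\mu, \nu)
  \;=\;
  \min_{\omega \in \Omega(\mu, \nu)} \sum_{x, y} d(x, y)\, \omega(x, y),
\]
```

    where Ω(μ, ν) is the set of couplings of μ and ν (joint distributions with marginals μ and ν); for finite distributions the minimum is attained.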

    Sound and complete axiomatizations of coalgebraic language equivalence

    Coalgebras provide a uniform framework to study dynamical systems, including several types of automata. In this paper, we make use of the coalgebraic view on systems to investigate, in a uniform way, under which conditions calculi that are sound and complete with respect to behavioral equivalence can be extended to a coarser coalgebraic language equivalence, which arises from a generalised powerset construction that determinises coalgebras. We show that soundness and completeness are established by proving that expressions modulo axioms of a calculus form the rational fixpoint of the given type functor. Our main result is that the rational fixpoint of the functor FT, where T is a monad describing the branching of the systems (e.g. non-determinism, weights, probability, etc.), has as a quotient the rational fixpoint of the "determinised" type functor F̄, a lifting of F to the category of T-algebras. We apply our framework to the concrete example of weighted automata, for which we present a new sound and complete calculus for weighted language equivalence. As a special case, we obtain non-deterministic automata, where we recover Rabinovich's sound and complete calculus for language equivalence.
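
    For the non-deterministic-automata special case mentioned at the end of the abstract, the generalised powerset construction specialises to the classical subset construction; the sketch below shows only that concrete instance (it is not the paper's coalgebraic formulation, and the function and parameter names are illustrative).

```python
from itertools import chain

# Classical subset construction: determinise an NFA whose transition function
# delta maps (state, letter) to a set of successor states.
# DFA states are frozensets of NFA states.
def determinise(initial_states, alphabet, delta, accepting):
    start = frozenset(initial_states)
    transitions, final = {}, set()
    todo, seen = [start], {start}
    while todo:
        subset = todo.pop()
        if subset & accepting:          # a subset accepts if it contains an accepting NFA state
            final.add(subset)
        for letter in alphabet:
            target = frozenset(chain.from_iterable(delta(q, letter) for q in subset))
            transitions[(subset, letter)] = target
            if target not in seen:
                seen.add(target)
                todo.append(target)
    return start, transitions, final
```

    Two NFA states accept the same language exactly when the DFA states reached from the corresponding singleton subsets do, which is what makes the coarser language equivalence accessible after determinisation.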

    Essays on Decision Making: Intertemporal Choice and Uncertainty

    Being labeled a social science, much of economics is about understanding human behavior, whether in the face of uncertainty, delayed payoffs through time, or strategic situations such as auctions, bargaining, and so on. This thesis is concerned with the first two, namely uncertainty and time preferences. The main focus of this thesis is what we can summarize with two broad titles: "irrationalities" in human behavior and an alternative perspective on "rational behavior". My claim requires a clarification of what is meant by rational or irrational behavior. In one of the early discussions of this topic, Richter (1966) defined a rational consumer as someone for whom there exists a total, reflexive, and transitive binary relation on the set of commodities such that his choice data consists of maximal elements of this binary relation. In this respect, Richter (1966) only imposed minimal consistency conditions on behavior for it to be labeled as rational. Although his setting does not involve any uncertainty or time dimension, analogues of these conditions exist for the models we consider here as well, so one can extend the rationality notion of Richter (1966) to our models too. Yet the essence of his approach to rationality is different from the one we take up in this thesis. Richter's minimalistic approach would leave little room for discussion of rational behavior, because almost all behavior would count as rational except for a few cleverly constructed counterexamples. Instead, we consider more widely accepted norms of rationality and analyze them in the framework of uncertainty and time preferences. These widely accepted norms of rationality are understood to be axioms that lead to decision rules describing people's behavior. In the case of decision making under risk and uncertainty, the most commonly used decision model is expected utility; in the case of dynamic decision making, it is the constant discounted utility model. Although there are models that combine both to explain decision making in dynamic stochastic settings, in this thesis we study them in isolation to assess the nature of the models in more detail.
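
    For reference, the two baseline models named here have the standard textbook forms below, with u a utility function and δ a constant per-period discount factor; these formulas are background, not results of the thesis.

```latex
\[
  EU(L) \;=\; \sum_{i=1}^{n} p_i \, u(x_i)
  \qquad\text{for a lottery } L = (x_1, p_1; \dots; x_n, p_n),
\]
\[
  U(c_0, c_1, \dots, c_T) \;=\; \sum_{t=0}^{T} \delta^{t} \, u(c_t),
  \qquad 0 < \delta \le 1.
\]
```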