    Probabilistic Bisimulation: Naturally on Distributions

    In contrast to the usual understanding of probabilistic systems as stochastic processes, these systems have recently also been regarded as transformers of probabilities. In this paper, we give a natural definition of strong bisimulation for probabilistic systems corresponding to this view, one that treats probability distributions as first-class citizens. Our definition applies in the same way to discrete systems as well as to systems with uncountable state and action spaces. Several examples demonstrate that our definition refines the understanding of behavioural equivalences of probabilistic systems. In particular, it solves a long-standing open problem concerning the representation of memoryless continuous time by memory-full continuous time. Finally, we give algorithms for computing this bisimulation not only for finite systems but also for classes of uncountably infinite ones.
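
    The "transformer of probabilities" view can be made concrete for finite systems: a distribution over states evolves by multiplication with a per-action stochastic matrix, and two distributions are compared through the observations they induce. The following is a minimal, bounded-depth Python sketch of that view; the matrices, the observation vector, and the depth cutoff are invented for illustration and are not the paper's algorithm.

```python
import itertools
import numpy as np

def equivalent_up_to_depth(P, obs, mu, nu, depth, tol=1e-9):
    """Check that distributions mu and nu induce the same observation
    after every action sequence of length <= depth. A bounded-depth
    proxy for distribution-based bisimulation, for illustration only."""
    actions = list(P)
    for k in range(depth + 1):
        for word in itertools.product(actions, repeat=k):
            m, n = mu, nu
            for a in word:                 # a distribution is *transformed*
                m, n = m @ P[a], n @ P[a]  # by each action's stochastic matrix
            if abs(m @ obs - n @ obs) > tol:
                return False
    return True

# Two distributions that differ pointwise on states yet are
# indistinguishable as transformers: every state moves to state 2 on "a".
P = {"a": np.array([[0.0, 0.0, 1.0],
                    [0.0, 0.0, 1.0],
                    [0.0, 0.0, 1.0]])}
obs = np.array([0.0, 0.0, 1.0])     # observe the probability of state 2
mu = np.array([1.0, 0.0, 0.0])
nu = np.array([0.3, 0.7, 0.0])
print(equivalent_up_to_depth(P, obs, mu, nu, depth=4))   # True
```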

    Mechanical Computing: The Computational Complexity of Physical Devices

    www.springerreference.com/docs/html/chapterdbid/60497.html
    - Mechanism: A machine or part of a machine that performs a particular task.
    - Computation: The use of a computer for calculation.
    - Computable: Capable of being worked out by calculation, especially using a computer.
    - Simulation: Used to denote both the modeling of a physical system by a computer and the modeling of the operation of a computer by a mechanical system; the difference will be clear from the context.
    Definition of the Subject: Mechanical devices for computation appear to have been largely displaced by the widespread use of microprocessor-based computers that now pervade almost all aspects of our lives. Nevertheless, mechanical devices for computation remain of interest for at least three reasons. (a) Historical: The use of mechanical devices for computation is of central importance in the historical study of technologies, with a history dating back thousands of years and with surprising applications even in relatively recent times. (b) Technical & Practical: The use of mechanical devices for computation persists and has not yet been completely displaced by microprocessor-based computers. Mechanical computers have found applications in various emerging micro-scale technologies that combine mechanical functions with computational and control functions not feasible by purely electronic processing. Mechanical computers have also been demonstrated at the molecular scale, and may provide unique capabilities at that scale as well. The physical designs for these modern micro- and molecular-scale mechanical computers may be based on the designs of the large-scale mechanical computers constructed in the past. (c) Impact of Physical Assumptions on Complexity of Motion Planning, Design, and Simulation: The study of computation done by mechanical devices is also of central importance in providing lower bounds on the computational resources, such as time and/or space, required to simulate a mechanical system.

    State-deterministic Finite Automata with Translucent Letters and Finite Automata with Nondeterministically Translucent Letters

    Deterministic and nondeterministic finite automata with translucent letters were introduced by Nagy and Otto more than a decade ago as cooperative distributed systems of a kind of stateless restarting automata with window size one. These finite-state machines have surprisingly large expressive power: they accept all commutative semi-linear languages and all rational trace languages, including various non-context-free languages. While the nondeterministic variant defines a language class with nice closure properties, the deterministic variant is weaker; it nevertheless contains all regular languages, some non-regular context-free languages such as the Dyck language, and also some languages that are not even context-free. In all these models, for each state the letters of the alphabet fall into one of the following categories: the automaton cannot see the letter (it is translucent); a transition is defined on the letter (possibly more than one transition in the nondeterministic case); or neither of the above (the automaton gets stuck on seeing this letter in the given state, and the computation is not accepting). State-deterministic automata are recent models in which the next state of the computation is determined by the structure of the automaton, independently of the processed letters. Our aim in this paper is twofold. On the one hand, we investigate state-deterministic finite automata with translucent letters; these are specially restricted deterministic finite automata with translucent letters. On the other hand, we present a novel model in which, for a given state, the set of translucent letters and the set of letters with defined transitions need not be disjoint. One can interpret this as giving the automaton, for each occurrence of such a letter, a nondeterministic choice to see it (and then erase it and make the transition) or not to see that occurrence at that time. With these semi-translucent letters the expressive power of the automata increases, i.e., a proper generalization of the previous models is obtained. Comment: In Proceedings AFL 2023, arXiv:2309.0112
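
    To make the mechanism above concrete, here is a minimal Python sketch of a finite automaton with (semi-)translucent letters. The transition tables, the example language, and the acceptance convention (accept when the remaining input can be skipped entirely in an accepting state) are assumptions of this sketch, not the paper's exact definitions.

```python
def accepts(trans, translucent, accepting, start, word):
    """trans[q][x] is the set of successor states on letter x;
    translucent[q] is the set of letters state q cannot see.
    A letter in both sets is semi-translucent: the automaton may
    either see this occurrence (erase it, move) or skip past it."""
    def run(q, w):
        for i, x in enumerate(w):
            if x in trans.get(q, {}):                   # visible: may act here
                rest = w[:i] + w[i + 1:]                # erase this occurrence
                if any(run(p, rest) for p in trans[q][x]):
                    return True
            if x not in translucent.get(q, set()):
                return False                            # not skippable: blocked
        return q in accepting                           # input fully skipped
    return run(start, word)

# Example: equal numbers of a's and b's, by repeatedly erasing one a
# (state 0, where b is translucent) and then one b (state 1, a translucent).
trans = {0: {"a": {1}}, 1: {"b": {0}}}
translucent = {0: {"b"}, 1: {"a"}}
print(accepts(trans, translucent, {0}, 0, "abba"))   # True
print(accepts(trans, translucent, {0}, 0, "aab"))    # False
```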

    Engineering failure analysis and design optimisation with HiP-HOPS

    The scale and complexity of computer-based safety-critical systems, like those used in the transport and manufacturing industries, pose significant challenges for failure analysis. Over the last decade, research has focused on automating this task. In one approach, predictive models of system failure are constructed from the topology of the system and local component failure models using a process of composition. An alternative approach employs model-checking of state automata to study the effects of failure and verify system safety properties. In this paper, we discuss these two approaches to failure analysis. We then focus on Hierarchically Performed Hazard Origin & Propagation Studies (HiP-HOPS) - one of the more advanced compositional approaches - and discuss its capabilities for automatic synthesis of fault trees, combinatorial Failure Modes and Effects Analyses, and reliability versus cost optimisation of systems via application of automatic model transformations. We summarise these contributions and demonstrate the application of HiP-HOPS on a simplified fuel oil system for a ship engine. In light of this example, we discuss strengths and limitations of the method in relation to other state-of-the-art techniques. In particular, because HiP-HOPS is deductive in nature, relating system failures back to their causes, it is less prone to combinatorial explosion and can more readily be iterated. For this reason, it enables exhaustive assessment of combinations of failures and design optimisation using computationally expensive meta-heuristics.
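
    The compositional synthesis idea can be illustrated in a few lines: each component declares how a deviation at its output arises from its own failure modes and/or deviations at its inputs, and expanding these local models along the system topology yields the system fault tree, here flattened directly into minimal cut sets. This Python sketch is only loosely inspired by the paper's fuel oil example; the event names and the model format are invented and are not HiP-HOPS syntax.

```python
# Local failure logic: for each output deviation, a list of alternatives
# (OR), each alternative a list of events that must all occur (AND).
local = {
    "no_fuel_at_engine": [["no_flow_from_pumps"], ["valve.stuck_closed"]],
    "no_flow_from_pumps": [["pump_a.fails", "pump_b.fails"]],  # redundant pair
}

def cut_sets(event):
    """Expand an output deviation into its minimal cut sets."""
    if event not in local:                   # basic event: a failure mode
        return [frozenset([event])]
    sets = []
    for alt in local[event]:                 # OR over alternatives
        partial = [frozenset()]
        for sub in alt:                      # AND within an alternative
            partial = [p | c for p in partial for c in cut_sets(sub)]
        sets.extend(partial)
    return [s for s in sets if not any(t < s for t in sets)]  # keep minimal

print(cut_sets("no_fuel_at_engine"))
# [frozenset({'pump_a.fails', 'pump_b.fails'}), frozenset({'valve.stuck_closed'})]
```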

    Capturing the dynamics of cellular automata, for the generation of synthetic Persian music, using conditional restricted Boltzmann machines

    In this paper the generative and feature-extracting powers of the family of Boltzmann machines are employed in an algorithmic music composition system. The Liquid Persian Music (LPM) system is an audio generator that uses cellular automata progressions as its creative core. LPM provides an infrastructure for creating novel Dastgāh-like Persian music. Pattern-matching rules extract features from the cellular automata sequences and populate the parameters of a Persian musical instrument synthesizer [1]. Applying restricted Boltzmann machines and conditional restricted Boltzmann machines, two members of the Boltzmann machine family, provides new ways of interpreting the patterns emanating from the cellular automata. Conditional restricted Boltzmann machines are employed in particular to capture the dynamics of the cellular automata.
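
    A conditional restricted Boltzmann machine models the current frame of a sequence with visible-unit and hidden-unit biases that are linear functions of the recent past, which is what lets it capture temporal dynamics. Below is a minimal numpy sketch of a CRBM trained with one step of contrastive divergence (CD-1) on elementary cellular automaton frames. The CA rule, layer sizes, history length, and learning rate are arbitrary stand-ins, not the LPM system's configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
sig = lambda x: 1.0 / (1.0 + np.exp(-x))

def ca_step(row, rule=110):
    """One step of an elementary cellular automaton."""
    l, r = np.roll(row, 1), np.roll(row, -1)
    return (rule >> (4 * l + 2 * row + r)) & 1

N, T, ORDER, H = 32, 400, 2, 16    # cells, steps, history length, hidden units
rows = [rng.integers(0, 2, N)]
for _ in range(T):
    rows.append(ca_step(rows[-1]))
X = np.array(rows, dtype=float)    # training sequence of CA frames

W = 0.01 * rng.standard_normal((N, H))            # visible-hidden weights
A = 0.01 * rng.standard_normal((ORDER * N, N))    # history -> visible bias
B = 0.01 * rng.standard_normal((ORDER * N, H))    # history -> hidden bias
a, b, lr = np.zeros(N), np.zeros(H), 0.05

for epoch in range(20):
    for t in range(ORDER, T + 1):
        hist = X[t - ORDER:t].ravel()              # conditioning window
        v0 = X[t]
        av, bh = a + hist @ A, b + hist @ B        # dynamic biases
        ph0 = sig(v0 @ W + bh)                     # positive phase
        h0 = (rng.random(H) < ph0).astype(float)
        v1 = sig(h0 @ W.T + av)                    # CD-1 reconstruction
        ph1 = sig(v1 @ W + bh)
        W += lr * (np.outer(v0, ph0) - np.outer(v1, ph1))
        A += lr * np.outer(hist, v0 - v1)
        B += lr * np.outer(hist, ph0 - ph1)
        a += lr * (v0 - v1)
        b += lr * (ph0 - ph1)

# Predict the frame following the last ORDER observed frames.
hist = X[-ORDER:].ravel()
h = (rng.random(H) < sig(X[-1] @ W + b + hist @ B)).astype(float)
pred = (sig(h @ W.T + a + hist @ A) > 0.5).astype(int)
```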

    Privacy in the Genomic Era

    Genome sequencing technology has advanced at a rapid pace and it is now possible to generate highly detailed genotypes inexpensively. The collection and analysis of such data has the potential to support various applications, including personalized medical services. While the benefits of the genomics revolution are trumpeted by the biomedical community, the increased availability of such data has major implications for personal privacy, notably because the genome has certain essential features, which include (but are not limited to) (i) an association with traits and certain diseases, (ii) identification capability (e.g., forensics), and (iii) revelation of family relationships. Moreover, direct-to-consumer DNA testing increases the likelihood that genome data will be made available in less regulated environments, such as the Internet and for-profit companies. The problem of genome data privacy thus resides at the crossroads of computer science, medicine, and public policy. While computer scientists have addressed data privacy for various data types, less attention has been dedicated to genomic data. Thus, the goal of this paper is to provide a systematization of knowledge for the computer science community. In doing so, we address some of the (sometimes erroneous) beliefs of this field and report on a survey we conducted about genome data privacy with biomedical specialists. Then, after characterizing the genome privacy problem, we review the state of the art regarding privacy attacks on genomic data and strategies for mitigating such attacks, contextualizing these attacks from the perspective of medicine and public policy. The paper concludes with an enumeration of the challenges for genome data privacy and presents a framework to systematize the analysis of threats and the design of countermeasures as the field moves forward.

    Non-determinism in the narrative structure of video games

    PhD thesis. At present, computer games are finite interactive systems. Even in their more experimental forms, the number of possible interactions between the player and NPCs (non-player characters), and among NPCs and the game world, is finite and governed by a deterministic system in which events can therefore be predicted. This implies that the story itself, seen as the series of events that unfold during gameplay, is a closed system that can be predicted a priori. This study looks beyond this limitation and identifies the elements needed for the emergence of a non-finite, emergent narrative structure. Two major contributions are offered through this research. The first is a clear categorization of the narrative structures embracing all video game production since the inception of the medium. In order to look for ways to generate a non-deterministic narrative in games, it is necessary to first gain a clear understanding of the narrative structures currently implemented and how they shape users' experience of the story. While many studies have examined the storytelling aspect, no attempt has been made to systematically distinguish among the different ways designers decide how stories are told in games. The second contribution is guided by the following research question: is it possible to incorporate non-determinism into the narrative structure of computer games? The hypothesis offered is that non-determinism can be incorporated by means of nonlinear dynamical systems in general and cellular automata in particular.
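
    The hypothesis can be made tangible with a toy sketch: a cellular automaton, a deterministic but effectively unpredictable dynamical system, drives branch selection in a story graph. The story graph, the events, and the use of rule 30 are invented here purely for illustration; they are not the thesis's system.

```python
import numpy as np

def rule30(row):
    """Elementary CA rule 30: left XOR (centre OR right)."""
    l, r = np.roll(row, 1), np.roll(row, -1)
    return l ^ (row | r)

story = {"start":  ["ambush", "parley"],
         "ambush": ["escape", "capture"],
         "parley": ["alliance", "betrayal"]}

row = np.zeros(31, dtype=int)
row[15] = 1                                      # single-seed initial condition
node = "start"
while node in story:
    row = rule30(row)
    branch = int(row.sum()) % len(story[node])   # CA state picks the branch
    node = story[node][branch]
    print(node)
```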

    Changing a semantics: opportunism or courage?

    The generalized models for higher-order logics introduced by Leon Henkin, and their multiple offspring over the years, have become a standard tool in many areas of logic. Even so, discussion has persisted about their technical status, and perhaps even their conceptual legitimacy. This paper gives a systematic view of generalized model techniques, discusses what they mean in mathematical and philosophical terms, and presents a few technical themes and results about their role in algebraic representation, calibrating provability, lowering complexity, understanding fixed-point logics, and achieving set-theoretic absoluteness. We also show how thinking about Henkin's approach to the semantics of logical systems in this generality can yield new results, dispelling the impression of ad-hocness. This paper is dedicated to Leon Henkin, a deep logician who has changed the way we all work, while also being an always open, modest, and encouraging colleague and friend. Comment: 27 pages. To appear in: The life and work of Leon Henkin: Essays on his contributions (Studies in Universal Logic), eds: Manzano, M., Sain, I. and Alonso, E., 201
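
    For orientation, here is the standard textbook formulation of a Henkin general model for monadic second-order logic; the paper works with many variations and generalizations of this idea, so this is a reference point rather than the paper's own definition.

```latex
% Henkin general models (standard textbook form, monadic second-order case).
A \emph{general model} is a pair $(M,\mathcal{D})$ where $M$ is a first-order
structure and $\mathcal{D}\subseteq\mathcal{P}(M)$ is the range of the
second-order quantifiers:
\[
  (M,\mathcal{D}) \models \exists X\,\varphi(X)
  \quad\Longleftrightarrow\quad
  \text{there is } A \in \mathcal{D} \text{ with } (M,\mathcal{D}) \models \varphi(A).
\]
Full (standard) semantics is the special case $\mathcal{D}=\mathcal{P}(M)$;
Henkin's completeness theorem requires $\mathcal{D}$ to be closed under
parametric definability:
\[
  \{\, a \in M : (M,\mathcal{D}) \models \psi(a,\bar b,\bar B) \,\} \in \mathcal{D}
  \qquad \text{for all formulas } \psi,\ \bar b \in M^{<\omega},\
  \bar B \in \mathcal{D}^{<\omega}.
\]
```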