Testing from a stochastic timed system with a fault model
In this paper we present a method for testing a system against a non-deterministic stochastic finite state machine. As usual, we assume that the functional behaviour of the system under test (SUT) is deterministic but we allow the timing to be non-deterministic. We extend the state counting method of deriving tests, adapting it to the presence of temporal requirements represented by means of random variables. The notion of conformance is introduced using an implementation relation that considers temporal aspects and the limitations imposed by a black-box framework. We propose an algorithm for generating a test suite that determines the conformance of a deterministic SUT with respect to a non-deterministic specification. We show how previous work on testing from stochastic systems can be encoded into the framework presented in this paper as an instantiation of our parameterized implementation relation. In this setting, we use a notion of conformance up to a given confidence level.
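The core black-box idea, checking that every response of a deterministic SUT is among the behaviours a non-deterministic specification allows, can be sketched as follows. This is an illustrative toy, not the paper's algorithm: the states, inputs, and the `conforms` helper are all hypothetical, and the temporal/stochastic aspects are omitted.

```python
# Hypothetical specification: maps (state, input) to the SET of allowed
# (output, next_state) pairs -- non-determinism is a set with >1 element.
SPEC = {
    ("s0", "req"): {("ack", "s1"), ("nack", "s0")},
    ("s1", "data"): {("ok", "s0")},
}

def sut(state, inp):
    """A deterministic toy SUT that always acknowledges requests."""
    table = {("s0", "req"): ("ack", "s1"), ("s1", "data"): ("ok", "s0")}
    return table[(state, inp)]

def conforms(spec, impl, trace, start="s0"):
    """Check that every SUT response along `trace` is allowed by the spec."""
    state = start
    for inp in trace:
        out, nxt = impl(state, inp)
        if (out, nxt) not in spec.get((state, inp), set()):
            return False          # SUT exhibited a behaviour the spec forbids
        state = nxt
    return True

print(conforms(SPEC, sut, ["req", "data", "req"]))   # True
```

A real stochastic-timing check would replace the exact membership test with a statistical test on observed delays, which is where the paper's confidence-level notion enters.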
Testing timed systems modeled by stream X-machines
Stream X-machines have been used to specify real systems that involve complex data structures. They are a variety of extended finite state machine in which a shared memory is used to represent communication between the components of a system. In this paper we introduce an extension of the Stream X-machines formalism in order to specify systems that present temporal requirements. We add time in two different ways. First, we consider that (output) actions take time to be performed. Second, our formalism allows timeouts to be specified. Timeouts represent the time a system can wait for the environment to react without changing its internal state. Since timeouts affect the set of available actions of the system, a relation focusing on the functional behavior of systems, that is, the actions that they can perform, must explicitly take the possible timeouts into account. In this paper we also propose a formal testing methodology that allows a system to be systematically tested with respect to a specification. Finally, we introduce a test derivation algorithm. Given a specification, the derived test suite is sound and complete, that is, a system under test successfully passes the test suite if and only if the system conforms to the specification.
The Origins of Computational Mechanics: A Brief Intellectual History and Several Clarifications
The principal goal of computational mechanics is to define pattern and
structure so that the organization of complex systems can be detected and
quantified. Computational mechanics developed from efforts in the 1970s and
early 1980s to identify strange attractors as the mechanism driving weak fluid
turbulence via the method of reconstructing attractor geometry from measurement
time series and in the mid-1980s to estimate equations of motion directly from
complex time series. In providing a mathematical and operational definition of
structure it addressed weaknesses of these early approaches to discovering
patterns in natural systems.
Since then, computational mechanics has led to a range of results from
theoretical physics and nonlinear mathematics to diverse applications---from
closed-form analysis of Markov and non-Markov stochastic processes that are
ergodic or nonergodic and their measures of information and intrinsic
computation to complex materials and deterministic chaos and intelligence in
Maxwellian demons to quantum compression of classical processes and the
evolution of computation and language.
This brief review clarifies several misunderstandings and addresses concerns
recently raised regarding early works in the field (1980s). We show that
misguided evaluations of the contributions of computational mechanics are
groundless and stem from a lack of familiarity with its basic goals and from a
failure to consider its historical context. For all practical purposes, its
modern methods and results largely supersede the early works. This not only
renders recent criticism moot and shows the solid ground on which computational
mechanics stands but, most importantly, shows the significant progress achieved
over three decades and points to the many intriguing and outstanding challenges
in understanding the computational nature of complex dynamic systems.
Comment: 11 pages, 123 citations; http://csc.ucdavis.edu/~cmg/compmech/pubs/cmr.ht
Extending stream X-machines to specify and test systems with timeouts
Stream X-machines are a kind of extended finite state machine used to specify real systems in which communication between the components is modeled using a shared memory. In this paper we introduce an extension of the Stream X-machines formalism in order to specify delays/timeouts. The time spent by a system waiting for the environment to react can affect the set of available outputs of the system. So, a relation focusing on functional aspects must explicitly take the possible timeouts into account. We also propose a formal testing methodology that allows a system to be systematically tested with respect to a specification. Finally, we introduce a test derivation algorithm. Given a specification, the derived test suite is sound and complete, that is, a system under test successfully passes the test suite if and only if the system conforms to the specification.
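How a timeout changes the set of available behaviours can be illustrated with a minimal sketch. This is not the paper's formalism (which also involves a shared memory); the states, timeout table, and `step` function are invented for illustration: each state optionally carries a timeout that silently moves the machine to a fallback state before the next input is processed.

```python
# Hypothetical machine: state "idle" times out to "sleep" after 5 time units.
TIMEOUTS = {"idle": (5, "sleep")}
TRANSITIONS = {
    ("idle", "press"): ("beep", "active"),
    ("sleep", "press"): ("wake", "idle"),
    ("active", "stop"): ("off", "idle"),
}

def step(state, inp, wait):
    """Fire the state's timeout if `wait` exceeds it, then apply the input."""
    limit = TIMEOUTS.get(state)
    if limit is not None and wait > limit[0]:
        state = limit[1]                     # timeout fires, producing no output
    out, nxt = TRANSITIONS[(state, inp)]
    return out, nxt

print(step("idle", "press", wait=2))   # ('beep', 'active')
print(step("idle", "press", wait=9))   # ('wake', 'idle') -- the timeout changed the response
```

The same input yields different outputs depending on how long the environment waited, which is why a conformance relation on functional behaviour must account for timeouts explicitly.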
Programmability of Chemical Reaction Networks
Motivated by the intriguing complexity of biochemical circuitry within individual cells, we study Stochastic Chemical Reaction Networks (SCRNs), a formal model that considers a set of chemical reactions acting on a finite number of molecules in a well-stirred solution according to standard chemical kinetics equations. SCRNs have been widely used for describing naturally occurring (bio)chemical systems, and with the advent of synthetic biology they have become a promising language for the design of artificial biochemical circuits. Our interest here is the computational power of SCRNs and how they relate to more conventional models of computation. We survey known connections and give new connections between SCRNs and Boolean Logic Circuits, Vector Addition Systems, Petri Nets, Gate Implementability, Primitive Recursive Functions, Register Machines, Fractran, and Turing Machines. A theme of these investigations is the thin line between decidable and undecidable questions about SCRN behavior.
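The "standard chemical kinetics" semantics of an SCRN is commonly simulated with Gillespie's stochastic simulation algorithm (direct method): reaction propensities follow mass-action kinetics, waiting times are exponential, and the next reaction is chosen proportionally to its propensity. A minimal sketch, with a toy network and rates chosen purely for illustration:

```python
import random

def gillespie(counts, reactions, t_end, seed=0):
    """Gillespie direct method. `reactions` is a list of
    (rate, consumed, produced), each a dict of species -> stoichiometry."""
    rng = random.Random(seed)
    t, counts = 0.0, dict(counts)
    while t < t_end:
        # mass-action propensity of each reaction
        props = []
        for rate, consumed, _ in reactions:
            a = rate
            for sp, n in consumed.items():
                for k in range(n):
                    a *= max(counts[sp] - k, 0)
            props.append(a)
        total = sum(props)
        if total == 0:
            break                         # no reaction can fire
        t += rng.expovariate(total)       # exponential waiting time
        r = rng.uniform(0, total)         # pick a reaction proportionally
        for (rate, consumed, produced), a in zip(reactions, props):
            if r < a:
                for sp, n in consumed.items():
                    counts[sp] -= n
                for sp, n in produced.items():
                    counts[sp] = counts.get(sp, 0) + n
                break
            r -= a
    return counts

# Toy network: A + B -> C at rate 1.0
final = gillespie({"A": 50, "B": 50, "C": 0},
                  [(1.0, {"A": 1, "B": 1}, {"C": 1})], t_end=10.0)
print(final)
```

Conservation laws of the network (here A + C and B + C are invariant) provide a quick sanity check on any such simulation.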
Lurching Toward Chernobyl: Dysfunctions of Real-Time Computation
Cognitive biological structures, social organizations, and computing machines operating in real time are subject to Rate Distortion Theorem constraints driven by the homology between information source uncertainty and free energy density. This exposes the unitary structure/environment system to a relentless entropic torrent compounded by sudden large deviations causing increased distortion between intent and impact, particularly as demands escalate. The phase transitions characteristic of information phenomena suggest that, rather than graceful decay under increasing load, these structures will undergo punctuated degradation akin to spontaneous symmetry breaking in physical systems. Rate distortion problems, which also affect internal structural dynamics, can become synergistic with limitations equivalent to the inattentional blindness of natural cognitive process. These mechanisms, and their interactions, are unlikely to scale well, so that, depending on architecture, enlarging the structure or its duties may lead to a crossover point at which added resources must be almost entirely devoted to ensuring system stability -- a form of allometric scaling familiar from biological examples. This suggests a critical need to tune architecture to problem type and system demand. A real-time computational structure and its environment are a unitary phenomenon, and environments are usually idiosyncratic. Thus the resulting path dependence in the development of pathology could often require an individualized approach to remediation more akin to an arduous psychiatric intervention than to the traditional engineering or medical quick fix. Failure to recognize the depth of these problems seems likely to produce a relentless chain of the Chernobyl-like failures that are necessary, but often insufficient, for remediation under our system.
Quantum Thermodynamics
Quantum thermodynamics is an emerging research field aiming to extend
standard thermodynamics and non-equilibrium statistical physics to ensembles of
sizes well below the thermodynamic limit, in non-equilibrium situations, and
with the full inclusion of quantum effects. Fuelled by experimental advances
and the potential of future nanoscale applications this research effort is
pursued by scientists with different backgrounds, including statistical
physics, many-body theory, mesoscopic physics and quantum information theory,
who bring various tools and methods to the field. A multitude of theoretical
questions are being addressed ranging from issues of thermalisation of quantum
systems and various definitions of "work", to the efficiency and power of
quantum engines. This overview provides a perspective on a selection of these
current trends accessible to postgraduate students and researchers alike.
Comment: 48 pages, improved and expanded several sections. Comments welcome
Strong and Weak Optimizations in Classical and Quantum Models of Stochastic Processes
Among the predictive hidden Markov models that describe a given stochastic
process, the {\epsilon}-machine is strongly minimal in that it minimizes every
R\'enyi-based memory measure. Quantum models can be smaller still. In contrast
with the {\epsilon}-machine's unique role in the classical setting, however,
among the class of processes described by pure-state hidden quantum Markov
models, there are those for which there does not exist any strongly minimal
model. Quantum memory optimization then depends on which memory measure best
matches a given problem circumstance.
Comment: 14 pages, 14 figures; http://csc.ucdavis.edu/~cmg/compmech/pubs/uemum.ht
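One of the memory measures the ε-machine minimizes is the statistical complexity C_μ, the Shannon entropy of the stationary distribution over causal states. A self-contained sketch for the Golden Mean process, a standard two-state ε-machine example (the transition probabilities below are the usual textbook choice, not taken from this paper):

```python
from math import log2

# Golden Mean process epsilon-machine (causal states A and B):
# from A: emit 0 w.p. 1/2 -> B, emit 1 w.p. 1/2 -> A; from B: emit 1 w.p. 1 -> A.
# T holds the state-to-state transition probabilities (outputs marginalized).
T = {("A", "A"): 0.5, ("A", "B"): 0.5, ("B", "A"): 1.0}

def stationary(T, states=("A", "B"), iters=1000):
    """Find the stationary distribution by power iteration."""
    pi = {s: 1.0 / len(states) for s in states}
    for _ in range(iters):
        pi = {s: sum(pi[r] * T.get((r, s), 0.0) for r in states) for s in states}
    return pi

pi = stationary(T)                       # converges to {A: 2/3, B: 1/3}
c_mu = -sum(p * log2(p) for p in pi.values() if p > 0)
print(round(c_mu, 3))                    # statistical complexity in bits, ~0.918
```

Here C_μ = log2(3) - 2/3 ≈ 0.918 bits; the quantum models discussed in the abstract can encode the same process in less memory, but no single quantum model need minimize every such measure at once.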
Computation in Finitary Stochastic and Quantum Processes
We introduce stochastic and quantum finite-state transducers as
computation-theoretic models of classical stochastic and quantum finitary
processes. Formal process languages, representing the distribution over a
process's behaviors, are recognized and generated by suitable specializations.
We characterize and compare deterministic and nondeterministic versions,
summarizing their relative computational power in a hierarchy of finitary
process languages. Quantum finite-state transducers and generators are a first
step toward a computation-theoretic analysis of individual, repeatedly measured
quantum dynamical systems. They are explored via several physical systems,
including an iterated beam splitter, an atom in a magnetic field, and atoms in
an ion trap--a special case of which implements the Deutsch quantum algorithm.
We show that these systems' behaviors, and so their information processing
capacity, depend sensitively on the measurement protocol.
Comment: 25 pages, 16 figures, 1 table; http://cse.ucdavis.edu/~cmg; numerous corrections and updates