Generic Trace Semantics via Coinduction
Trace semantics has been defined for various kinds of state-based systems,
notably with different forms of branching such as non-determinism vs.
probability. In this paper we claim to identify one underlying mathematical
structure behind these "trace semantics," namely coinduction in a Kleisli
category. This claim is based on our technical result that, under a suitably
order-enriched setting, a final coalgebra in a Kleisli category is given by an
initial algebra in the category Sets. Formerly the theory of coalgebras has
been employed mostly in Sets where coinduction yields a finer process semantics
of bisimilarity. Therefore this paper extends the application field of
coalgebras, providing a new instance of the principle "process semantics via
coinduction."
Comment: To appear in Logical Methods in Computer Science. 36 pages
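The stated result, that a final coalgebra in the Kleisli category is carried by an initial algebra in Sets, means concretely that finite trace sets arise as a least fixed point. The following minimal Python sketch illustrates this for nondeterministic automata (the powerset monad); the representation and all names are illustrative choices, not the paper's formalism:

```python
def traces(states, trans, finals, rounds):
    """Kleene iteration toward the least fixed point of
        tr(x) = {eps | x in finals}  union  {a.w | x -a-> y, w in tr(y)}.
    After k rounds, tr[x] contains every accepted trace of length < k.
    trans: dict mapping (state, symbol) to a set of successor states."""
    tr = {q: set() for q in states}
    for _ in range(rounds):
        new = {}
        for q in states:
            s = {()} if q in finals else set()
            for (p, a), succs in trans.items():
                if p == q:
                    s |= {(a,) + w for t in succs for w in tr[t]}
            new[q] = s
        tr = new
    return tr
```

For the two-state automaton 0 -a-> 1 with state 1 accepting, two rounds already yield the trace ('a',) for state 0, approximating the trace map from below.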
Minimization via duality
We show how to use duality theory to construct minimized versions of a wide class of automata. We work out three cases in detail: (a variant of) ordinary automata, weighted automata and probabilistic automata. The basic idea is that instead of constructing a maximal quotient we go to the dual and look for a minimal subalgebra and then return to the original category. Duality ensures that the minimal subobject becomes the maximally quotiented object.
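For ordinary automata, the best-known concrete instance of this dual-and-back recipe is Brzozowski's minimization algorithm: reversing the automaton plays the role of passing to the dual, and determinizing only the reachable part picks out a minimal subalgebra. A sketch under that reading, with representation and names of my own choosing:

```python
def reverse(trans, inits, finals):
    """Pass to the 'dual': flip every transition, swap initial and final."""
    rtrans = {}
    for (q, a), succs in trans.items():
        for t in succs:
            rtrans.setdefault((t, a), set()).add(q)
    return rtrans, set(finals), set(inits)

def determinize(trans, inits, finals, alphabet):
    """Subset construction restricted to reachable states
    (the 'minimal subalgebra' step)."""
    start = frozenset(inits)
    dtrans, dfinals, seen, todo = {}, set(), {start}, [start]
    while todo:
        S = todo.pop()
        if S & finals:
            dfinals.add(S)
        for a in alphabet:
            T = frozenset(t for q in S for t in trans.get((q, a), ()))
            dtrans[(S, a)] = {T}          # singleton set: a DFA transition
            if T not in seen:
                seen.add(T)
                todo.append(T)
    return dtrans, {start}, dfinals

def brzozowski(trans, inits, finals, alphabet):
    """Minimize: reverse, determinize, reverse, determinize."""
    for _ in range(2):
        trans, inits, finals = determinize(*reverse(trans, inits, finals),
                                           alphabet)
    return trans, inits, finals
```

On the three-state automaton 0 -a-> 1 -a-> 2 -a-> 2 with finals {1, 2} (language a+), the procedure returns the minimal two-state DFA.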
Coalgebraic Behavioral Metrics
We study different behavioral metrics, such as those arising from both
branching and linear-time semantics, in a coalgebraic setting. Given a
coalgebra for a functor F, we define a framework for deriving pseudometrics on
its carrier which measure the behavioral distance of states.
A crucial step is the lifting of the functor F on Set to a
functor on the category of pseudometric spaces.
We present two different approaches which can be viewed as generalizations of
the Kantorovich and Wasserstein pseudometrics for probability measures. We show
that the pseudometrics provided by the two approaches coincide on several
natural examples, but in general they differ.
If F has a final coalgebra, every lifting yields in a
canonical way a behavioral distance which is usually branching-time, i.e., it
generalizes bisimilarity. In order to model linear-time metrics (generalizing
trace equivalences), we show sufficient conditions for lifting distributive
laws and monads. These results enable us to employ the generalized powerset
construction.
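To make the lifting idea concrete: for the finite powerset functor (pure nondeterminism), both the Kantorovich-style and the Wasserstein-style liftings specialize to the classical Hausdorff distance between sets of states. A small self-contained illustration of that special case (function names are mine):

```python
def hausdorff(d, A, B):
    """Hausdorff lifting of a metric d to finite sets: the larger of the
    two maximal distances from a point of one set to the nearest point
    of the other."""
    if not A and not B:
        return 0.0
    if not A or not B:
        return float('inf')   # conventional distance to the empty set
    return max(
        max(min(d(a, b) for b in B) for a in A),
        max(min(d(a, b) for a in A) for b in B),
    )
```

For example, with d(x, y) = |x - y|, the sets {0, 1} and {0, 3} are at Hausdorff distance 2: the point 3 is 2 away from its nearest counterpart.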
Entropy, majorization and thermodynamics in general probabilistic theories
In this note we lay some groundwork for the resource theory of thermodynamics
in general probabilistic theories (GPTs). We consider theories satisfying a
purely convex abstraction of the spectral decomposition of density matrices:
that every state has a decomposition, with unique probabilities, into perfectly
distinguishable pure states. The spectral entropy, and analogues using other
Schur-concave functions, can be defined as the entropy of these probabilities.
We describe additional conditions under which the outcome probabilities of a
fine-grained measurement are majorized by those for a spectral measurement, and
therefore the "spectral entropy" is the measurement entropy (and therefore
concave). These conditions are (1) projectivity, which abstracts aspects of the
Lueders-von Neumann projection postulate in quantum theory, in particular that
every face of the state space is the positive part of the image of a certain
kind of projection operator called a filter; and (2) symmetry of transition
probabilities. The conjunction of these, as shown earlier by Araki, is
equivalent to a strong geometric property of the unnormalized state cone known
as perfection: that there is an inner product according to which every face of
the cone, including the cone itself, is self-dual. Using some assumptions about
the thermodynamic cost of certain processes that are partially motivated by our
postulates, especially projectivity, we extend von Neumann's argument that the
thermodynamic entropy of a quantum system is its spectral entropy to
generalized probabilistic systems satisfying spectrality.
Comment: In Proceedings QPL 2015, arXiv:1511.0118
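The majorization claim can be checked numerically for ordinary probability vectors: if the spectral outcome distribution majorizes a fine-grained one, then every Schur-concave function, such as Shannon entropy, takes a smaller or equal value on it. A small illustration of this standard fact (not the GPT formalism itself; the vectors below are made up):

```python
import math

def majorizes(p, q, tol=1e-12):
    """True iff p majorizes q: partial sums of p sorted in decreasing order
    dominate those of q (vectors are zero-padded to equal length, and the
    totals must agree)."""
    n = max(len(p), len(q))
    ps = sorted(p, reverse=True) + [0.0] * (n - len(p))
    qs = sorted(q, reverse=True) + [0.0] * (n - len(q))
    cp = cq = 0.0
    for a, b in zip(ps, qs):
        cp += a
        cq += b
        if cp < cq - tol:
            return False
    return abs(cp - cq) <= 1e-9

def shannon(p):
    """Shannon entropy in bits, a Schur-concave function."""
    return -sum(x * math.log2(x) for x in p if x > 0)
```

Here [0.7, 0.3] majorizes [0.5, 0.3, 0.2], and accordingly its entropy is strictly smaller, mirroring how the spectral entropy lower-bounds the measurement entropies.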
Site-bond representation and self-duality for totalistic probabilistic cellular automata
We study the one-dimensional two-state totalistic probabilistic cellular
automata (TPCA) having an absorbing state with long-range interactions, which
can be considered as a natural extension of the Domany-Kinzel model. We
establish the conditions for existence of a site-bond representation and
self-dual property. Moreover we present an expression of a set-to-set
connectedness between two sets, a matrix expression for a condition of the
self-duality, and a convergence theorem for the TPCA.
Comment: 11 pages, minor corrections, journal reference added
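For orientation, in one common presentation of the range-1 Domany-Kinzel model that the TPCA generalizes, each cell is updated from two neighbors, producing a 1 with probability p1 when exactly one neighbor is occupied and p2 when both are; the empty configuration is absorbing. A minimal simulation sketch of that rule on a ring (parameter and function names are mine):

```python
import random

def dk_step(config, p1, p2, rng):
    """One synchronous Domany-Kinzel update on a ring of 0/1 cells.
    P(new cell = 1) is 0, p1, or p2 when 0, 1, or 2 of the two neighbors
    are occupied; the all-zero configuration is therefore absorbing."""
    n = len(config)
    out = []
    for i in range(n):
        k = config[(i - 1) % n] + config[(i + 1) % n]
        prob = (0.0, p1, p2)[k]
        out.append(1 if rng.random() < prob else 0)
    return out
```

Iterating dk_step from a seeded configuration shows the survival-versus-extinction behavior that the site-bond representation analyzes.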
Probabilistic cellular automata, invariant measures, and perfect sampling
A probabilistic cellular automaton (PCA) can be viewed as a Markov chain. The
cells are updated synchronously and independently, according to a distribution
depending on a finite neighborhood. We investigate the ergodicity of this
Markov chain. A classical cellular automaton is a particular case of PCA. For a
1-dimensional cellular automaton, we prove that ergodicity is equivalent to
nilpotency, and is therefore undecidable. We then propose an efficient perfect
sampling algorithm for the invariant measure of an ergodic PCA. Our algorithm
does not assume any monotonicity property of the local rule. It is based on a
bounding process which is shown to be also a PCA. Last, we focus on the PCA
Majority, whose asymptotic behavior is unknown, and perform numerical
experiments using the perfect sampling procedure.
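As a concrete instance of the Markov-chain view, here is a sketch of the synchronous, independent local update for a noisy majority PCA on a ring. The rule and names are illustrative only; the paper's perfect-sampling algorithm additionally wraps such a chain in a bounding-process construction, which is not reproduced here:

```python
import random

def pca_step(config, local_rule, rng):
    """One PCA transition: every cell is resampled simultaneously and
    independently, from a distribution depending on its 3-cell neighborhood."""
    n = len(config)
    return [local_rule((config[(i - 1) % n], config[i], config[(i + 1) % n]),
                       rng)
            for i in range(n)]

def noisy_majority(neigh, rng, eps=0.1):
    """Majority of the 3-cell neighborhood, flipped with probability eps
    (an illustrative stand-in for a majority-type local rule)."""
    m = 1 if sum(neigh) >= 2 else 0
    return (1 - m) if rng.random() < eps else m
```

With eps = 0 this degenerates to the deterministic majority cellular automaton, recovering the statement that a classical cellular automaton is a particular case of PCA.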