Models of Type Theory Based on Moore Paths
This paper introduces a new family of models of intensional Martin-L\"of type
theory. We use constructive ordered algebra in toposes. Identity types in the
models are given by a notion of Moore path. By considering a particular gros
topos, we show that there is such a model that is non-truncated, i.e. contains
non-trivial structure at all dimensions. In other words, in this model a type
in a nested sequence of identity types can contain more than one element, no
matter how great the degree of nesting. Although inspired by existing
non-truncated models of type theory based on simplicial and cubical sets, the
notion of model presented here is notable for avoiding any form of Kan filling
condition in the semantics of types.

Comment: This is a revised and expanded version of a paper with the same name
that appeared in the proceedings of the 2nd International Conference on
Formal Structures for Computation and Deduction (FSCD 2017).
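For orientation, the topological prototype behind the models is worth recording (the paper itself works with a constructive, ordered-algebraic analogue internal to a topos). A Moore path in a space X is a path equipped with an arbitrary non-negative length:

```latex
\[
  \mathrm{Path}(x, y) \;=\;
  \bigl\{\, (r, \theta) \;\bigm|\; r \in [0, \infty),\;
  \theta \colon [0, r] \to X \ \text{continuous},\;
  \theta(0) = x,\; \theta(r) = y \,\bigr\}
\]
```

Because composition simply concatenates domains, the composite of paths of lengths r and s has length r + s, so composition is strictly associative and unital. This is what allows identity types to be interpreted by such paths without the reparametrization coherence (or Kan-filling conditions) that unit-length paths would require.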
Computation of epidemic final size distributions
We develop a new methodology for the efficient computation of epidemic final
size distributions for a broad class of Markovian models. We exploit a
particular representation of the stochastic epidemic process to derive a method
which is both computationally efficient and numerically stable. The algorithms
we present are also physically transparent and so allow us to extend this
method from the basic SIR model to a model with a phase-type infectious period
and another with waning immunity. The underlying theory is applicable to many
Markovian models where we wish to efficiently calculate hitting probabilities.

Comment: final published version
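For the basic SIR case, the flavour of such a hitting-probability computation can be sketched as follows. This is a minimal illustration under our own naming, not the paper's algorithm: since infections only decrease the susceptible count, each state (s, i) of the embedded jump chain is visited at most once, so the probabilities of hitting the absorbing states i = 0, and hence the final size distribution, can be propagated forward without solving any linear systems.

```python
import numpy as np

def sir_final_size_dist(N, i0, beta, gamma):
    """Final-size distribution of the Markovian SIR model, computed by
    forward propagation of jump-chain visit probabilities (each state
    (s, i) is visited at most once, so no linear solves are needed)."""
    s0 = N - i0
    # q[(s, i)] = probability that the jump chain ever visits state (s, i)
    q = {(s0, i0): 1.0}
    dist = np.zeros(s0 + 1)  # dist[z] = P(final size = z), where z = s0 - s
    for k in range(s0 + 1):            # k = number of infections so far
        s = s0 - k
        for i in range(i0 + k, 0, -1):
            p = q.get((s, i), 0.0)
            if p == 0.0:
                continue
            # competing exponential clocks: infection vs. recovery
            rate_inf = beta * s * i / N
            rate_rec = gamma * i
            p_inf = rate_inf / (rate_inf + rate_rec)
            if s > 0:
                q[(s - 1, i + 1)] = q.get((s - 1, i + 1), 0.0) + p * p_inf
            q[(s, i - 1)] = q.get((s, i - 1), 0.0) + p * (1.0 - p_inf)
        dist[k] = q.get((s, 0), 0.0)   # absorbed with k total infections
    return dist
```

For example, the probability of final size 0 is just the chance that the initial infective recovers before infecting anyone, gamma / (beta * s0 / N + gamma), which the recursion reproduces.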
Testing for Homogeneity in Mixture Models
Statistical models of unobserved heterogeneity are typically formalized as
mixtures of simple parametric models and interest naturally focuses on testing
for homogeneity versus general mixture alternatives. Many tests of this type
can be interpreted as C(α) tests, as in Neyman (1959), and shown to be
locally asymptotically optimal. These tests will be contrasted
with a new approach to likelihood ratio testing for general mixture models. The
latter tests are based on estimation of a general nonparametric mixing
distribution with the Kiefer and Wolfowitz (1956) maximum likelihood estimator.
Recent developments in convex optimization have dramatically improved upon
earlier EM methods for computation of these estimators, and recent results on
the large sample behavior of likelihood ratios involving such estimators yield
a tractable form of asymptotic inference. Improvements in computational
efficiency also facilitate the use of bootstrap methods to determine critical values
that are shown to work better than the asymptotic critical values in finite
samples. Consistency of the bootstrap procedure is also formally established.
We compare the performance of the two approaches, identifying circumstances in
which each is preferred.
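The Kiefer-Wolfowitz NPMLE restricted to a fixed grid of candidate support points can be approximated in a few lines; the sketch below is a deliberately naive EM iteration (the abstract's point is precisely that modern convex-optimization solvers do this far better), with all names and the Gaussian location-mixture setup chosen by us for illustration.

```python
import numpy as np

def npmle_em(x, grid, sigma=1.0, iters=500):
    """Fixed-grid EM approximation to the Kiefer-Wolfowitz NPMLE of the
    mixing distribution in a Gaussian location mixture. Each iteration
    reweights the grid points by their posterior responsibility."""
    x = np.asarray(x, float)
    # likelihood matrix: L[i, j] proportional to phi((x_i - grid_j) / sigma)
    L = np.exp(-0.5 * ((x[:, None] - grid[None, :]) / sigma) ** 2)
    w = np.full(grid.size, 1.0 / grid.size)    # uniform starting weights
    for _ in range(iters):
        dens = L @ w                           # mixture density at each x_i
        w *= (L / dens[:, None]).mean(axis=0)  # EM weight update
    return w
```

The update keeps the weights non-negative and summing to one, and the log-likelihood is nondecreasing along the iterations, but convergence to the NPMLE is slow; that slowness is what the convex-optimization reformulation addresses.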
Super-Exponential Solution in Markovian Supermarket Models: Framework and Challenge
Marcel F. Neuts opened a key door in numerical computation of stochastic
models by means of phase-type (PH) distributions and Markovian arrival
processes (MAPs). To celebrate his 75th birthday, this paper presents a more
general framework for Markovian supermarket models, including a system of
differential equations for the fraction measure and a system of nonlinear
equations for the fixed point. To build intuition for this framework, the
paper gives a detailed analysis of three important supermarket examples: M/G/1
type, GI/M/1 type and multiple choices; it explains how to derive the system of
differential equations by means of density-dependent jump Markov processes, and
shows that the fixed point may be simply super-exponential through solving the
system of nonlinear equations. Because supermarket models are a class of
complicated queueing systems to which standard queueing theory does not apply,
such a general framework is needed in their study to keep the focus on the
important research issues.
Along these lines, this paper develops matrix-analytic methods for Markovian
supermarket models. We hope this will open a new avenue in the performance
evaluation of supermarket models by means of matrix-analytic methods.

Comment: Randomized load balancing, supermarket model, matrix-analytic method,
super-exponential solution, density-dependent jump Markov process, Batch
Markovian Arrival Process (BMAP), phase-type (PH) distribution, fixed point
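For the simplest special case, the classical supermarket model in which each arrival samples d queues and joins the shortest, the fixed point of the mean-field equations is known in closed form, and the doubly exponential ("super-exponential") decay the abstract refers to can be checked directly. A toy verification under our naming, with lam the arrival rate per server and u_k the fraction of queues with at least k customers:

```python
import numpy as np

def fixed_point(lam, d, K=8):
    """Known fixed point of the mean-field equations for the classical
    supermarket model with d choices: u_k = lam**((d**k - 1)/(d - 1)),
    a doubly exponentially decaying tail."""
    return np.array([lam ** ((d ** k - 1) / (d - 1)) for k in range(K + 1)])

def residual(u, lam, d):
    """Residual of the fixed-point equations
    lam * (u_{k-1}**d - u_k**d) = u_k - u_{k+1}, for k = 1 .. K-1."""
    u = np.asarray(u, float)
    return lam * (u[:-2] ** d - u[1:-1] ** d) - (u[1:-1] - u[2:])
```

For d = 1 the formula collapses to the geometric tail lam**k of the M/M/1 queue; for d >= 2 the exponent grows like d**k, which is the super-exponential phenomenon.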
Quantum state transfer in spin chains with q-deformed interaction terms
We study the time evolution of a single spin excitation state in certain
linear spin chains, as a model for quantum communication. Some years ago it was
discovered that when the spin chain data (the nearest neighbour interaction
strengths and the magnetic field strengths) are related to the Jacobi matrix
entries of Krawtchouk polynomials or dual Hahn polynomials, so-called perfect
state transfer takes place. The extension of these ideas to other types of
discrete orthogonal polynomials did not lead to new models with perfect state
transfer, but did allow more insight into the general computation of the
correlation function. In the present paper, we extend the study to discrete
orthogonal polynomials of q-hypergeometric type. A remarkable result is a new
analytic model where perfect state transfer is achieved: this is when the spin
chain data are related to the Jacobi matrix of q-Krawtchouk polynomials. The
other cases studied here (affine q-Krawtchouk polynomials, quantum q-Krawtchouk
polynomials, dual q-Krawtchouk polynomials, q-Hahn polynomials, dual q-Hahn
polynomials and q-Racah polynomials) do not give rise to models with perfect
state transfer. However, the computation of the correlation function itself is
quite interesting, leading to advanced q-series manipulations.
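The original (q = 1) Krawtchouk case mentioned above is easy to verify numerically: with nearest-neighbour couplings J_i = sqrt(i(N - i))/2, the single-excitation Hamiltonian is the Jacobi matrix of the Krawtchouk polynomials, and the excitation transfers perfectly from site 1 to site N at time t = π (in units where the overall coupling constant is 1; the function names below are ours).

```python
import numpy as np

def krawtchouk_chain(N):
    """Single-excitation Hamiltonian of the Krawtchouk spin chain:
    a tridiagonal Jacobi matrix with couplings J_i = sqrt(i*(N-i))/2."""
    i = np.arange(1, N)
    J = np.sqrt(i * (N - i)) / 2.0
    return np.diag(J, 1) + np.diag(J, -1)

def transfer_amplitude(N, t):
    """Amplitude <N| exp(-i H t) |1> for transfer from site 1 to site N,
    via the eigendecomposition of the real symmetric Hamiltonian."""
    H = krawtchouk_chain(N)
    w, V = np.linalg.eigh(H)
    U = V @ np.diag(np.exp(-1j * w * t)) @ V.conj().T
    return U[-1, 0]
```

The linear eigenvalue spectrum of this Jacobi matrix is what makes all the phases realign at t = π; the q-deformed cases studied in the paper change the spectrum, and only the q-Krawtchouk case retains this property.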
Type-II Quantum Algorithms
We review and analyze the hybrid quantum-classical NMR computing methodology
referred to as Type-II quantum computing. We show that all such algorithms
considered so far within this paradigm are equivalent to some classical
lattice-Boltzmann scheme. We derive a necessary and sufficient constraint on
the unitary operator representing the quantum mechanical part of the
computation which ensures that the model reproduces the Boltzmann approximation
of a lattice-gas model satisfying semi-detailed balance. Models which do not
satisfy this constraint represent new lattice-Boltzmann schemes which cannot be
formulated as the average over some underlying lattice gas. We close the paper
with a discussion of the strengths, weaknesses and possible future directions
of Type-II quantum computing.

Comment: To appear in Physica
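To make the lattice-Boltzmann connection concrete, here is a minimal classical D1Q2 diffusion scheme (an illustrative sketch of ours, not a quantum algorithm): two populations stream in opposite directions and relax toward a local equilibrium. In the Type-II paradigm, this classical collision step at each site is replaced by a small quantum computation whose measured, ensemble-averaged output plays the same role.

```python
import numpy as np

def d1q2_diffusion(rho0, steps, omega=1.0):
    """Minimal classical D1Q2 lattice-Boltzmann diffusion scheme:
    collide (BGK relaxation toward f_eq = rho/2), then stream."""
    rho0 = np.asarray(rho0, float)
    f_plus = 0.5 * rho0.copy()   # right-moving population
    f_minus = 0.5 * rho0.copy()  # left-moving population
    for _ in range(steps):
        rho = f_plus + f_minus
        # BGK collision: relax toward the local equilibrium rho / 2
        f_plus += omega * (0.5 * rho - f_plus)
        f_minus += omega * (0.5 * rho - f_minus)
        # streaming step with periodic boundaries
        f_plus = np.roll(f_plus, 1)
        f_minus = np.roll(f_minus, -1)
    return f_plus + f_minus
```

The collision step conserves the local density and the streaming step only moves it, so total mass is conserved exactly; constraints of this kind (semi-detailed balance for the underlying lattice gas) are what the paper's unitarity condition characterizes.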