Central limit theorem for exponentially quasi-local statistics of spin models on Cayley graphs
Central limit theorems for linear statistics of lattice random fields
(including spin models) are usually proven under suitable mixing conditions or
quasi-associativity. Many interesting examples of spin models do not satisfy
mixing conditions, and on the other hand, it does not seem easy to prove central
limit theorems for local statistics via quasi-associativity. In this work, we
prove general central limit theorems for local statistics and exponentially
quasi-local statistics of spin models on discrete Cayley graphs with polynomial
growth. Further, we supplement these results by proving similar central limit
theorems for random fields on discrete Cayley graphs and taking values in a
countable space but under the stronger assumptions of α-mixing (for
local statistics) and exponential α-mixing (for exponentially
quasi-local statistics). All our central limit theorems assume a suitable
variance lower bound like many others in the literature. We illustrate our
general central limit theorem with specific examples of lattice spin models and
statistics arising in computational topology, statistical physics and random
networks. Examples of clustering spin models include quasi-associated spin
models with fast decaying covariances like the off-critical Ising model, level
sets of Gaussian random fields with fast decaying covariances like the massive
Gaussian free field and determinantal point processes with fast decaying
kernels. Examples of local statistics include intrinsic volumes, face counts,
component counts of random cubical complexes while exponentially quasi-local
statistics include nearest neighbour distances in spin models and Betti numbers
of sub-critical random cubical complexes.
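In schematic form, the central limit theorems above assert asymptotic normality of a (quasi-)local statistic over growing balls, subject to a variance lower bound; the notation here is generic, not the paper's:

```latex
% If S_n denotes the statistic computed over the ball B_n of radius n
% in the Cayley graph, and Var(S_n) >= c |B_n| for some c > 0, then
\frac{S_n - \mathbb{E}[S_n]}{\sqrt{\operatorname{Var}(S_n)}}
  \xrightarrow{d} \mathcal{N}(0,1), \qquad n \to \infty.
```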
Droplet mixer based on siphon-induced flow discretization and phase shifting
We present a novel mixing principle for centrifugal microfluidic platforms. Siphon structures are designed to disrupt continuous flows in a controlled manner into a sequence of discrete droplets, displaying individual volumes as low as 60 nL. When discrete volumes of different liquids are alternately issued into a common reservoir, a striation pattern of alternating liquid layers is obtained. In this manner diffusion distances are drastically decreased and a fast and homogeneous mixing is achieved. Efficient mixing is demonstrated for a range of liquid combinations of varying fluid properties such as aqueous inks or saline solutions and human plasma. Volumes of 5 µL have been mixed in less than 20 s to a high mixing quality. One-step dilutions of plasma in a standard phosphate buffer solution up to 1:5 are also demonstrated.
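The speedup from thinner striations follows from the quadratic scaling of diffusive mixing time with distance, roughly t ≈ L²/(2D); the diffusivity and length scales below are illustrative orders of magnitude, not values from the paper:

```python
# Diffusive mixing time scales as t ~ L^2 / (2D): halving the striation
# thickness quarters the mixing time. Illustrative values only.
D = 1e-9  # typical small-molecule diffusivity in water, m^2/s

def mix_time(L):
    """Characteristic diffusion time over distance L (metres)."""
    return L ** 2 / (2 * D)

t_bulk = mix_time(1e-3)       # ~1 mm diffusion distance, unstirred reservoir
t_striated = mix_time(25e-6)  # ~25 micron layers after flow discretization
print(t_bulk, t_striated, t_bulk / t_striated)
```

With these numbers the striated case mixes in well under a second while the bulk case takes minutes, which is the qualitative effect the abstract describes.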
Simulation of quantum walks and fast mixing with classical processes
We compare discrete-time quantum walks on graphs to their natural classical equivalents, which we argue are lifted Markov chains (LMCs), that is, classical Markov chains with added memory. We show that LMCs can simulate the mixing behavior of any quantum walk, under a commonly satisfied invariance condition. This allows us to answer an open question on how the graph topology ultimately bounds a quantum walk's mixing performance, and that of any stochastic local evolution. The results highlight that speedups in mixing and transport phenomena are not necessarily diagnostic of quantum effects, although superdiffusive spreading is more prominent with quantum walks. The general simulating LMC construction may lead to large memory, yet we show that for the main graphs under study (i.e., lattices) this memory can be brought down to the same size employed in the quantum walks proposed in the literature.
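A toy instance of the lifted-chain idea (not the paper's general construction; the graph and parameters are illustrative): on a line or cycle, a chain whose extra memory is its last direction of travel spreads near-ballistically, while the memoryless walk spreads only diffusively.

```python
import random

def spread(steps, flip_prob, trials, seed):
    """Mean |displacement| of a walk on the integer line whose lifted
    state is (position, direction): with probability flip_prob the
    direction reverses, otherwise the walker keeps going the same way.
    flip_prob = 0.5 recovers the ordinary memoryless random walk."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        pos, direction = 0, 1
        for _ in range(steps):
            if rng.random() < flip_prob:
                direction = -direction
            pos += direction
        total += abs(pos)
    return total / trials

lifted = spread(400, flip_prob=0.05, trials=200, seed=1)   # persistent walk
simple = spread(400, flip_prob=0.5, trials=200, seed=1)    # memoryless walk
print(lifted, simple)
```

The persistent (lifted) walk covers distance roughly linearly in time while the simple walk grows like the square root of time, mirroring the mixing speedup that added memory can buy.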
Fast MCMC sampling for Markov jump processes and extensions
Markov jump processes (or continuous-time Markov chains) are a simple and
important class of continuous-time dynamical systems. In this paper, we tackle
the problem of simulating from the posterior distribution over paths in these
models, given partial and noisy observations. Our approach is an auxiliary
variable Gibbs sampler, and is based on the idea of uniformization. This sets
up a Markov chain over paths by alternately sampling a finite set of virtual
jump times given the current path and then sampling a new path given the set of
extant and virtual jump times using a standard hidden Markov model forward
filtering-backward sampling algorithm. Our method is exact and does not involve
approximations like time-discretization. We demonstrate how our sampler extends
naturally to MJP-based models like Markov-modulated Poisson processes and
continuous-time Bayesian networks and show significant computational benefits
over state-of-the-art MCMC samplers for these models. (Accepted at the Journal of Machine Learning Research, JMLR.)
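As a minimal sketch of the uniformization idea the sampler builds on (forward simulation only, not the full auxiliary-variable Gibbs sweep; the two-state generator and rates below are made up for illustration):

```python
import random

def uniformized_mjp_path(Q, omega, T, x0, rng):
    """Simulate a Markov jump process with generator Q on [0, T] by
    uniformization: candidate jump times arrive as a Poisson(omega)
    process, and at each one the state moves according to the
    discrete-time kernel B = I + Q/omega. Self-transitions under B
    are 'virtual' jumps that leave the path unchanged."""
    n = len(Q)
    B = [[(1.0 if i == j else 0.0) + Q[i][j] / omega for j in range(n)]
         for i in range(n)]
    t, x = 0.0, x0
    path = [(0.0, x0)]
    while True:
        t += rng.expovariate(omega)        # next candidate jump time
        if t > T:
            break
        u, acc = rng.random(), 0.0
        for j in range(n):                 # sample next state from row B[x]
            acc += B[x][j]
            if u < acc:
                if j != x:                 # record only real jumps
                    path.append((t, j))
                x = j
                break
    return path

# Two-state chain: rate a for 0 -> 1, rate b for 1 -> 0.
a, b = 2.0, 1.0
Q = [[-a, a], [b, -b]]
rng = random.Random(0)
T = 2000.0
path = uniformized_mjp_path(Q, omega=2 * max(a, b), T=T, x0=0, rng=rng)

# Long-run fraction of time in state 0 should approach b/(a+b) = 1/3.
times = [t for t, _ in path] + [T]
states = [s for _, s in path]
occ0 = sum(times[i + 1] - times[i] for i in range(len(states)) if states[i] == 0)
print(occ0 / T)
```

The Gibbs sampler in the abstract runs this construction conditionally: it resamples the virtual jump times given the current path, then resamples the state sequence on the resulting grid with forward filtering-backward sampling.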
Bayesian spike inference from calcium imaging data
We present efficient Bayesian methods for extracting neuronal spiking
information from calcium imaging data. The goal of our methods is to sample
from the posterior distribution of spike trains and model parameters (baseline
concentration, spike amplitude, etc.) given noisy calcium imaging data. We
present discrete time algorithms where we sample the existence of a spike at
each time bin using Gibbs methods, as well as continuous time algorithms where
we sample over the number of spikes and their locations at an arbitrary
resolution using Metropolis-Hastings methods for point processes. We provide
Rao-Blackwellized extensions that (i) marginalize over several model parameters
and (ii) provide smooth estimates of the marginal spike posterior distribution
in continuous time. Our methods serve as complements to standard point
estimates and allow for quantification of uncertainty in estimating the
underlying spike train and model parameters.
- …