Parabolic Anderson model with a finite number of moving catalysts
We consider the parabolic Anderson model (PAM), which is given by the equation
∂u/∂t = κΔu + ξu with u: Z^d × [0,∞) → R, where κ ∈ (0,∞) is the diffusion
constant, Δ is the discrete Laplacian, and ξ: Z^d × [0,∞) → R is a space-time
random environment that drives the equation. The solution of this equation
describes the evolution of a "reactant" u under the influence of a "catalyst"
ξ. In the present paper we focus on the case where ξ is a system of n
independent simple random walks, each with step rate 2dρ and starting from the
origin. We study the \emph{annealed} Lyapunov exponents, i.e., the exponential
growth rates of the successive moments of u w.r.t.\ ξ, and show that these
exponents, as a function of the diffusion constant κ and the rate constant ρ,
behave differently depending on the dimension d. In particular, we give a
description of the intermittent behavior of the system in terms of the annealed
Lyapunov exponents, depicting how the total mass of u concentrates as t → ∞.
Our results are both a generalization and an extension of the work of
G\"artner and Heydenreich 2006, where only the case n = 1 was investigated.
Comment: In honour of J\"urgen G\"artner on the occasion of his 60th birthday,
25 pages. Updated version following the referee's comments
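For reference, the annealed Lyapunov exponents that this abstract (and several of the following ones) refers to are commonly defined in the PAM literature as follows, with u the solution of the equation and the expectation taken over the random environment:

```latex
\lambda_p \;=\; \lim_{t\to\infty} \frac{1}{pt}\,
  \log \mathbb{E}\bigl[u(0,t)^p\bigr],
\qquad p = 1, 2, 3, \dots
```

The system is called intermittent when these exponents are strictly increasing in p, which signals that the moments are dominated by rare, high peaks of u.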
Intermittency on catalysts: Voter model
In this paper we study intermittency for the parabolic Anderson equation
∂u/∂t = κΔu + γξu with u: Z^d × [0,∞) → R, where κ ∈ [0,∞) is
the diffusion constant, Δ is the discrete Laplacian,
γ ∈ (0,∞) is the coupling constant, and ξ: Z^d × [0,∞) → {0,1}
is a space-time random medium.
The solution of this equation describes the evolution of a ``reactant'' u
under the influence of a ``catalyst'' ξ. We focus on the case where ξ
is the voter model with opinions 0 and 1 that are updated according to a random
walk transition kernel, starting from either the Bernoulli measure ν_ρ
or the equilibrium measure μ_ρ, where ρ ∈ (0,1) is the density of
1's. We consider the annealed Lyapunov exponents, that is, the exponential
growth rates of the successive moments of u. We show that if the random walk
transition kernel has zero mean and finite variance, then these exponents are
trivial for 1 ≤ d ≤ 4, but display an interesting dependence on the
diffusion constant κ for d ≥ 5, with qualitatively different
behavior in different dimensions. In earlier work we considered the case where
ξ is a field of independent simple random walks in a Poisson equilibrium,
respectively, a symmetric exclusion process in a Bernoulli equilibrium, which
are both reversible dynamics. In the present work a main obstacle is the
nonreversibility of the voter model dynamics, since this precludes the
application of spectral techniques. The duality with coalescing random walks is
key to our analysis, and leads to a representation formula for the Lyapunov
exponents that allows for the application of large deviation estimates.
Comment: Published at http://dx.doi.org/10.1214/10-AOP535 in the Annals of
Probability (http://www.imstat.org/aop/) by the Institute of Mathematical
Statistics (http://www.imstat.org)
Two New Bounds on the Random-Edge Simplex Algorithm
We prove that the Random-Edge simplex algorithm requires an expected number
of at most 13n/sqrt(d) pivot steps on any simple d-polytope with n vertices.
This is the first nontrivial upper bound for general polytopes. We also
describe a refined analysis that potentially yields much better bounds for
specific classes of polytopes. As one application, we show that for
combinatorial d-cubes, the trivial upper bound of 2^d on the performance of
Random-Edge can asymptotically be improved by any desired polynomial factor in
d.
Comment: 10 pages
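The Random-Edge rule itself is easy to state: from the current vertex, pivot along an improving edge chosen uniformly at random. The sketch below runs it on the combinatorial d-cube with a generic linear objective. Note the assumption: with a linear objective every improving flip is permanent, so termination within d pivots is trivial; the hard instances behind the trivial 2^d bound mentioned in the abstract are general acyclic orientations of the cube, not linear programs.

```python
import random

def random_edge_pivots(d, c, rng):
    """Run the Random-Edge rule on the d-cube for the objective max c.x.

    Vertices are 0/1 vectors; each pivot flips one coordinate chosen
    uniformly at random among the improving ones.  Returns the number
    of pivots until no improving edge remains.
    """
    x = [rng.randint(0, 1) for _ in range(d)]
    pivots = 0
    while True:
        # Flipping coordinate i changes the objective by c[i]*(1 - 2*x[i]),
        # so the flip improves iff the sign of c[i] disagrees with x[i].
        improving = [i for i in range(d) if (c[i] > 0) != (x[i] == 1)]
        if not improving:
            return pivots
        x[rng.choice(improving)] ^= 1
        pivots += 1

rng = random.Random(0)
d = 10
c = [rng.uniform(0.5, 1.5) for _ in range(d)]      # generic positive objective
trials = [random_edge_pivots(d, c, rng) for _ in range(1000)]
avg = sum(trials) / len(trials)
# Each wrong coordinate is fixed exactly once, so every run takes at
# most d pivots; on average about d/2 coordinates start out wrong.
```

On abstract cube orientations, by contrast, Random-Edge can revisit regions of the cube, which is why nontrivial upper bounds are the interesting question.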
Quenched Lyapunov exponent for the parabolic Anderson model in a dynamic random environment
We continue our study of the parabolic Anderson equation ∂u/∂t = κΔu + γξu for the space-time field u: Z^d × [0,∞) → R, where κ ∈ [0,∞) is the diffusion constant, Δ is the discrete Laplacian, γ ∈ (0,∞) is the coupling constant, and ξ: Z^d × [0,∞) → R is a space-time random environment that drives the equation. The solution of this equation describes the evolution of a "reactant" u under the influence of a "catalyst" ξ, both living on Z^d. In earlier work we considered three choices for ξ: independent simple random walks, the symmetric exclusion process, and the symmetric voter model, all in equilibrium at a given density. We analyzed the annealed Lyapunov exponents, i.e., the exponential growth rates of the successive moments of u w.r.t. ξ, and showed that these exponents display an interesting dependence on the diffusion constant κ, with qualitatively different behavior in different dimensions d. In the present paper we focus on the quenched Lyapunov exponent, i.e., the exponential growth rate of u conditional on ξ. We first prove existence and derive some qualitative properties of the quenched Lyapunov exponent for a general ξ that is stationary and ergodic w.r.t. translations in Z^d and satisfies certain noisiness conditions. After that we focus on the three particular choices for ξ mentioned above and derive some more detailed properties. We close by formulating a number of open problems.
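In the standard notation of this literature, the quenched Lyapunov exponent studied here is the almost-sure growth rate of the solution conditional on the environment:

```latex
\lambda_0 \;=\; \lim_{t\to\infty} \frac{1}{t}\,\log u(0,t)
\qquad \xi\text{-a.s.},
```

and Jensen's inequality gives λ_0 ≤ λ_1, so the quenched exponent never exceeds the first annealed exponent; a strict gap is one signature of intermittency.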
Extending local features with contextual information in graph kernels
Graph kernels are usually defined in terms of simpler kernels over local
substructures of the original graphs. Different kernels consider different
types of substructures. However, in some cases they have similar predictive
performances, probably because the substructures can be interpreted as
approximations of the subgraphs they induce. In this paper, we propose to
associate to each feature a piece of information about the context in which the
feature appears in the graph. A substructure appearing in two different graphs
will match only if it appears with the same context in both graphs. We propose
a kernel based on this idea that considers trees as substructures, and where
the contexts are themselves features. The kernel is inspired by the framework
in [6], even though it does not fall within it. We give an efficient algorithm for computing
the kernel and show promising results on real-world graph classification
datasets.
Comment: To appear in ICONIP 201
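The paper's kernel uses trees as substructures; as a much-simplified sketch of the feature-plus-context idea, one can take a node's label as the feature and the multiset of its neighbours' labels as the context, and count exactly matching (feature, context) pairs across two graphs. The encoding below is illustrative, not the paper's construction.

```python
from collections import Counter

def contextual_features(adj, labels):
    """Map each node to a (feature, context) pair: the node's own label
    plus the sorted multiset of its neighbours' labels."""
    feats = Counter()
    for v, nbrs in adj.items():
        context = tuple(sorted(labels[u] for u in nbrs))
        feats[(labels[v], context)] += 1
    return feats

def contextual_kernel(g1, g2):
    """Kernel value = dot product of the two (feature, context) count
    vectors: a substructure matches only if its context matches too."""
    f1, f2 = contextual_features(*g1), contextual_features(*g2)
    return sum(f1[k] * f2[k] for k in f1.keys() & f2.keys())

# Two small labelled graphs: a path A-B-A and a triangle on A, B, A.
path = ({0: [1], 1: [0, 2], 2: [1]}, {0: "A", 1: "B", 2: "A"})
tri  = ({0: [1, 2], 1: [0, 2], 2: [0, 1]}, {0: "A", 1: "B", 2: "A"})
print(contextual_kernel(path, path), contextual_kernel(path, tri))  # prints: 5 1
```

The cross value drops to 1 because the A-labelled nodes, though identical as bare features, carry different neighbourhood contexts in the two graphs.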
Space-efficient Feature Maps for String Alignment Kernels
String kernels are attractive data analysis tools for analyzing string data.
Among them, alignment kernels are known for their high prediction accuracies in
string classifications when tested in combination with SVM in various
applications. However, alignment kernels have a crucial drawback in that they
scale poorly due to their quadratic computation complexity in the number of
input strings, which limits large-scale applications in practice. We address
this drawback by presenting the first approximation of string alignment kernels,
which we call space-efficient feature maps for edit distance with moves
(SFMEDM), by leveraging a metric embedding named edit sensitive parsing (ESP)
and feature maps (FMs) of random Fourier features (RFFs) for large-scale string
analyses. The original FMs for RFFs consume a huge amount of memory
proportional to the dimension d of input vectors and the dimension D of output
vectors, which prohibits their use in large-scale applications. We present novel
space-efficient feature maps (SFMs) of RFFs for a space reduction from O(dD) of
the original FMs to O(d) of SFMs with a theoretical guarantee with respect to
concentration bounds. We experimentally test SFMEDM on its ability to learn SVM
for large-scale string classifications with various massive string data, and we
demonstrate the superior performance of SFMEDM with respect to prediction
accuracy, scalability and computation efficiency.
Comment: Full version for ICDM'19 paper
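The paper's SFMs combine edit sensitive parsing with a specific RFF construction that is not reproduced here. The simplified Python sketch below illustrates only the space-saving idea: instead of materializing the D x d Gaussian weight matrix (O(dD) memory), each random row is regenerated on demand from a deterministic per-row seed, so only O(d) working memory is held at any time. All names and parameters are hypothetical.

```python
import math
import random

def rff_feature(x, D, sigma=1.0, seed=0):
    """Random-Fourier-feature map z with z(x).z(y) ~ exp(-|x-y|^2 / (2 sigma^2)).

    The usual implementation stores a D x d Gaussian weight matrix.
    Here each row w_j and offset b_j are regenerated on demand from a
    deterministic per-row seed, so the full matrix is never materialized.
    """
    d = len(x)
    z = []
    for j in range(D):
        rng = random.Random(seed * 1_000_003 + j)   # deterministic row stream
        w = [rng.gauss(0.0, 1.0 / sigma) for _ in range(d)]
        b = rng.uniform(0.0, 2.0 * math.pi)
        proj = sum(wi * xi for wi, xi in zip(w, x)) + b
        z.append(math.sqrt(2.0 / D) * math.cos(proj))
    return z

def rbf(x, y, sigma=1.0):
    """Exact Gaussian (RBF) kernel, for comparison."""
    sq = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-sq / (2.0 * sigma ** 2))

x, y = [0.2, 0.1, 0.4], [0.3, 0.0, 0.5]
zx, zy = rff_feature(x, 4096), rff_feature(y, 4096)
approx = sum(a * b for a, b in zip(zx, zy))
exact = rbf(x, y)
# approx converges to the exact kernel value as D grows.
```

Regenerating rows trades computation for memory; the same trick works for any feature map whose randomness is drawn i.i.d. per output dimension.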
A novel string representation and kernel function for the comparison of I/O access patterns
Parallel I/O access patterns act as fingerprints of a parallel program. In order to extract meaningful information from these patterns, they have to be represented appropriately. Because string objects can be easily compared using kernel methods, a conversion to a weighted string representation is proposed in this paper, together with a novel string kernel function called the Kast Spectrum Kernel. The similarity matrices, obtained after applying this kernel to a set of examples from a real application, were analyzed using Kernel Principal Component Analysis (Kernel PCA) and Hierarchical Clustering. The evaluation showed that 2 of the 4 I/O access pattern groups were completely identified, while the other 2 formed a single cluster due to the intrinsic similarity of their members. The proposed strategy can promisingly be applied to other similarity problems involving tree-like structured data.
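The Kast Spectrum Kernel itself is not specified in this abstract; a generic k-spectrum kernel (the dot product of k-mer count vectors) illustrates the general mechanism by which string-encoded access patterns can be compared. The 'R'/'W'/'S' encoding below is a hypothetical example, not the paper's representation.

```python
from collections import Counter

def spectrum_kernel(s, t, k=2):
    """k-spectrum kernel: dot product of the two k-mer count vectors."""
    cs = Counter(s[i:i + k] for i in range(len(s) - k + 1))
    ct = Counter(t[i:i + k] for i in range(len(t) - k + 1))
    return sum(cs[m] * ct[m] for m in cs.keys() & ct.keys())

# Hypothetical encoding of I/O traces: 'R' = read, 'W' = write, 'S' = seek.
a = "RRWSRRWS"   # periodic read-read-write-seek pattern
b = "RRWSRRWW"   # similar pattern, different ending
c = "SSSSWWWW"   # structurally different trace
print(spectrum_kernel(a, b), spectrum_kernel(a, c))  # prints: 11 0
```

Similar traces share many k-mers and score high; traces with disjoint local structure score zero, which is the property clustering methods such as Kernel PCA exploit.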
N-cadherin: A new player in neuronal polarity
Comment on: Gärtner A, et al. EMBO J 2012; 31:1893-90