1,028 research outputs found

    Parabolic Anderson model with a finite number of moving catalysts

    We consider the parabolic Anderson model (PAM), given by the equation $\partial u/\partial t = \kappa\Delta u + \xi u$ with $u\colon \mathbb{Z}^d\times [0,\infty)\to \mathbb{R}$, where $\kappa \in [0,\infty)$ is the diffusion constant, $\Delta$ is the discrete Laplacian, and $\xi\colon \mathbb{Z}^d\times [0,\infty)\to\mathbb{R}$ is a space-time random environment that drives the equation. The solution of this equation describes the evolution of a "reactant" $u$ under the influence of a "catalyst" $\xi$. In the present paper we focus on the case where $\xi$ is a system of $n$ independent simple random walks, each with step rate $2d\rho$ and starting from the origin. We study the \emph{annealed} Lyapunov exponents, i.e., the exponential growth rates of the successive moments of $u$ w.r.t. $\xi$, and show that these exponents, as a function of the diffusion constant $\kappa$ and the rate constant $\rho$, behave differently depending on the dimension $d$. In particular, we give a description of the intermittent behavior of the system in terms of the annealed Lyapunov exponents, depicting how the total mass of $u$ concentrates as $t\to\infty$. Our results are both a generalization and an extension of the work of G\"artner and Heydenreich (2006), where only the case $n=1$ was investigated.
    Comment: In honour of J\"urgen G\"artner on the occasion of his 60th birthday, 25 pages. Updated version following the referee's comments
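The PAM dynamics lend themselves to a quick numerical illustration. The following sketch is our own, not the paper's construction, and rests on simplifying assumptions: dimension $d=1$ on a torus rather than $\mathbb{Z}^d$, a single catalyst walk ($n=1$), forward-Euler time stepping, and a hypothetical helper name `simulate_pam`. It evolves the reactant $u$ under a catalyst field $\xi$ that equals 1 at the walker's site and 0 elsewhere:

```python
import random

def simulate_pam(n_sites=50, kappa=0.5, rho=1.0, dt=0.01, n_steps=1000, seed=0):
    """Forward-Euler sketch of du/dt = kappa*Laplacian(u) + xi*u on a 1-d torus,
    with xi(x, t) = 1 at the site of a single simple random walk (the catalyst)."""
    rng = random.Random(seed)
    u = [1.0] * n_sites          # flat initial condition u(., 0) = 1
    cat = 0                      # catalyst starts from the origin
    for _ in range(n_steps):
        # Catalyst jumps with step rate 2*d*rho (here d = 1).
        if rng.random() < 2 * rho * dt:
            cat = (cat + rng.choice([-1, 1])) % n_sites
        lap = [u[(x - 1) % n_sites] - 2 * u[x] + u[(x + 1) % n_sites]
               for x in range(n_sites)]
        u = [u[x] + dt * (kappa * lap[x] + (1.0 if x == cat else 0.0) * u[x])
             for x in range(n_sites)]
    return u

u = simulate_pam()
```

Since $\xi \ge 0$, the catalyst only injects mass, so the total mass of $u$ grows with time; intermittency concerns how unevenly that mass concentrates around the catalyst's path.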

    Intermittency on catalysts: Voter model

    In this paper we study intermittency for the parabolic Anderson equation $\partial u/\partial t=\kappa\Delta u+\gamma\xi u$ with $u:\mathbb{Z}^d\times[0,\infty)\to\mathbb{R}$, where $\kappa\in[0,\infty)$ is the diffusion constant, $\Delta$ is the discrete Laplacian, $\gamma\in(0,\infty)$ is the coupling constant, and $\xi:\mathbb{Z}^d\times[0,\infty)\to\mathbb{R}$ is a space-time random medium. The solution of this equation describes the evolution of a "reactant" $u$ under the influence of a "catalyst" $\xi$. We focus on the case where $\xi$ is the voter model with opinions 0 and 1 that are updated according to a random walk transition kernel, starting from either the Bernoulli measure $\nu_{\rho}$ or the equilibrium measure $\mu_{\rho}$, where $\rho\in(0,1)$ is the density of 1's. We consider the annealed Lyapunov exponents, that is, the exponential growth rates of the successive moments of $u$. We show that if the random walk transition kernel has zero mean and finite variance, then these exponents are trivial for $1\leq d\leq 4$, but display an interesting dependence on the diffusion constant $\kappa$ for $d\geq 5$, with qualitatively different behavior in different dimensions. In earlier work we considered the case where $\xi$ is a field of independent simple random walks in a Poisson equilibrium, respectively, a symmetric exclusion process in a Bernoulli equilibrium, both of which are reversible dynamics. In the present work a main obstacle is the nonreversibility of the voter model dynamics, since this precludes the application of spectral techniques. The duality with coalescing random walks is key to our analysis, and leads to a representation formula for the Lyapunov exponents that allows for the application of large deviation estimates.
    Comment: Published in the Annals of Probability (http://www.imstat.org/aop/) by the Institute of Mathematical Statistics (http://www.imstat.org); DOI: http://dx.doi.org/10.1214/10-AOP535

    Two New Bounds on the Random-Edge Simplex Algorithm

    We prove that the Random-Edge simplex algorithm requires an expected number of at most 13n/sqrt(d) pivot steps on any simple d-polytope with n vertices. This is the first nontrivial upper bound for general polytopes. We also describe a refined analysis that potentially yields much better bounds for specific classes of polytopes. As one application, we show that for combinatorial d-cubes, the trivial upper bound of 2^d on the performance of Random-Edge can asymptotically be improved by any desired polynomial factor in d.
    Comment: 10 pages
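The pivot rule itself is simple to state in code. The sketch below is our own illustration, not the paper's analysis: given the vertex-edge graph of a simple polytope and a generic linear objective, Random-Edge repeatedly moves along an improving edge chosen uniformly at random. The usage example runs it on the combinatorial 3-cube with the hypothetical generic weights (1, 2, 4):

```python
import random
from itertools import product

def random_edge(graph, obj, start, rng):
    """Run the Random-Edge pivot rule on a polytope graph.
    graph: dict vertex -> list of neighboring vertices (edges of the polytope)
    obj:   dict vertex -> objective value (assumed generic, i.e. all distinct)
    Returns (optimal vertex, number of pivot steps)."""
    v, pivots = start, 0
    while True:
        improving = [w for w in graph[v] if obj[w] > obj[v]]
        if not improving:
            return v, pivots           # no improving edge: v is optimal
        v = rng.choice(improving)      # uniform choice among improving edges
        pivots += 1

# Usage on the combinatorial 3-cube: vertices are bit-tuples, edges flip one bit.
verts = list(product([0, 1], repeat=3))
graph = {v: [v[:i] + (1 - v[i],) + v[i + 1:] for i in range(3)] for v in verts}
obj = {v: sum(c * b for c, b in zip((1, 2, 4), v)) for v in verts}
opt, pivots = random_edge(graph, obj, (0, 0, 0), random.Random(0))
```

On the cube with a separable objective every pivot fixes one coordinate for good, so this toy run terminates in exactly d steps; the difficulty the paper addresses is bounding the expectation on general simple polytopes.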

    Quenched Lyapunov exponent for the parabolic Anderson model in a dynamic random environment

    We continue our study of the parabolic Anderson equation $\partial u/\partial t = \kappa\Delta u + \gamma\xi u$ for the space-time field $u: \mathbb{Z}^d \times[0,\infty) \to \mathbb{R}$, where $\kappa \in [0,\infty)$ is the diffusion constant, $\Delta$ is the discrete Laplacian, $\gamma \in (0,\infty)$ is the coupling constant, and $\xi : \mathbb{Z}^d \times[0,\infty)\to\mathbb{R}$ is a space-time random environment that drives the equation. The solution of this equation describes the evolution of a "reactant" $u$ under the influence of a "catalyst" $\xi$, both living on $\mathbb{Z}^d$. In earlier work we considered three choices for $\xi$: independent simple random walks, the symmetric exclusion process, and the symmetric voter model, all in equilibrium at a given density. We analyzed the annealed Lyapunov exponents, i.e., the exponential growth rates of the successive moments of $u$ w.r.t. $\xi$, and showed that these exponents display an interesting dependence on the diffusion constant $\kappa$, with qualitatively different behavior in different dimensions $d$. In the present paper we focus on the quenched Lyapunov exponent, i.e., the exponential growth rate of $u$ conditional on $\xi$. We first prove existence and derive some qualitative properties of the quenched Lyapunov exponent for a general $\xi$ that is stationary and ergodic w.r.t. translations in $\mathbb{Z}^d$ and satisfies certain noisiness conditions. After that we focus on the three particular choices for $\xi$ mentioned above and derive some more detailed properties. We close by formulating a number of open problems.

    Extending local features with contextual information in graph kernels

    Graph kernels are usually defined in terms of simpler kernels over local substructures of the original graphs. Different kernels consider different types of substructures. However, in some cases they have similar predictive performances, probably because the substructures can be interpreted as approximations of the subgraphs they induce. In this paper, we propose to associate to each feature a piece of information about the context in which the feature appears in the graph. A substructure appearing in two different graphs will match only if it appears with the same context in both graphs. We propose a kernel based on this idea that considers trees as substructures, and where the contexts are features too. The kernel is inspired by the framework in [6], even though it does not fall within it. We give an efficient algorithm for computing the kernel and show promising results on real-world graph classification datasets.
    Comment: To appear in ICONIP 201

    Space-efficient Feature Maps for String Alignment Kernels

    String kernels are attractive data analysis tools for analyzing string data. Among them, alignment kernels are known for their high prediction accuracies in string classifications when tested in combination with SVM in various applications. However, alignment kernels have a crucial drawback in that they scale poorly due to their quadratic computation complexity in the number of input strings, which limits large-scale applications in practice. We address this drawback by presenting the first approximation for string alignment kernels, which we call space-efficient feature maps for edit distance with moves (SFMEDM), by leveraging a metric embedding named edit sensitive parsing (ESP) and feature maps (FMs) of random Fourier features (RFFs) for large-scale string analyses. The original FMs for RFFs consume a huge amount of memory proportional to the dimension d of input vectors and the dimension D of output vectors, which prohibits their large-scale application. We present novel space-efficient feature maps (SFMs) of RFFs that reduce the space from the O(dD) of the original FMs to O(d), with a theoretical guarantee with respect to concentration bounds. We experimentally test SFMEDM on its ability to learn SVM for large-scale string classifications with various massive string data, and we demonstrate the superior performance of SFMEDM with respect to prediction accuracy, scalability and computation efficiency.
    Comment: Full version for ICDM'19 paper
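For context, the original FMs of RFFs that SFMs compress can be sketched as follows. This is our own illustration under assumptions the abstract does not fix: the target is the Gaussian kernel $k(x,y)=\exp(-\|x-y\|^2/(2\sigma^2))$, and the helper names `make_rff` and `rff_map` are hypothetical. The stored weight matrix is what costs O(dD) memory:

```python
import math
import random

def make_rff(d, D, sigma=1.0, seed=0):
    """Draw the random weights of a D-dimensional RFF map for the Gaussian
    kernel exp(-||x - y||^2 / (2 sigma^2)) on R^d: w_j ~ N(0, I / sigma^2),
    b_j ~ Uniform[0, 2*pi]. Storing ws costs O(d*D) memory, the cost SFMs cut."""
    rng = random.Random(seed)
    ws = [[rng.gauss(0.0, 1.0 / sigma) for _ in range(d)] for _ in range(D)]
    bs = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(D)]
    return ws, bs

def rff_map(x, ws, bs):
    """phi(x)_j = sqrt(2/D) * cos(<w_j, x> + b_j); the inner product
    <phi(x), phi(y)> concentrates around k(x, y) as D grows."""
    D = len(ws)
    return [math.sqrt(2.0 / D) * math.cos(sum(wi * xi for wi, xi in zip(w, x)) + b)
            for w, b in zip(ws, bs)]

# Kernel approximation: inner product of feature maps vs. the exact kernel value.
ws, bs = make_rff(d=2, D=2000)
x, y = [0.0, 0.0], [0.5, 0.5]
approx = sum(a * b for a, b in zip(rff_map(x, ws, bs), rff_map(y, ws, bs)))
exact = math.exp(-0.25)   # ||x - y||^2 = 0.5 with sigma = 1
```

SFMEDM first embeds strings into vectors via ESP and then applies maps of this kind; the paper's contribution is generating the $w_j$ space-efficiently rather than storing them all.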

    N-cadherin: A new player in neuronal polarity

    Comment on: Gärtner A, et al. EMBO J 2012; 31:1893-90
