
    Field theories for stochastic processes

    This thesis is a collection of collaborative research work which uses field-theoretic techniques to approach three different areas of stochastic dynamics: branching processes; first-passage times of processes which are subject to both white and coloured noise; and numerical and analytical aspects of first-passage times in fractional Brownian motion. Chapter 1 (joint work with Rosalba Garcia Millan, Johannes Pausch, and Gunnar Pruessner, appeared in Phys. Rev. E 98 (6):062107) contains an analysis of non-spatial branching processes with arbitrary offspring distribution. Here our focus lies on the statistics of the number of particles in the system at any given time. We calculate a host of observables using Doi-Peliti field theory and find that close to criticality these observables no longer depend on the details of the offspring distribution and are thus universal. In Chapter 2 (joint work with Ignacio Bordeu, Saoirse Amarteifio, Rosalba Garcia Millan, Nanxin Wei, and Gunnar Pruessner, appeared in Sci. Rep. 9:15590) we study the number of sites visited by a branching random walk on general graphs. To do so, we introduce a field-theoretic tracing mechanism which keeps track of all sites already visited. We find the scaling laws of the moments of the distribution near the critical point. Chapter 3 (joint work with Gunnar Pruessner and Guillaume Salbreux, submitted, arXiv:2006.00116) provides an analysis of the first-passage time problem for stochastic processes subject to white and coloured noise. By way of a perturbation theory, I give a systematic and controlled expansion of the moment-generating function of first-passage times. In Chapter 4, we revise the tracing mechanism introduced earlier and use it to characterise three different extreme values: first-passage times, running maxima, and mean volume explored. By formulating these in field-theoretic language, we are able to derive new results for a class of non-Markovian stochastic processes. Chapters 5 and 6 are concerned with the first-passage time distribution of fractional Brownian motion. Chapter 5 (joint work with Kay Wiese, appeared in Phys. Rev. E 101 (4):043312) introduces a new algorithm to sample it efficiently. Chapter 6 (joint work with Maxence Arutkin and Kay Wiese, submitted, arXiv:1908.10801) gives a field-theoretically obtained perturbative result for the first-passage time distribution in the presence of linear and non-linear drift.
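    As a hedged illustration of the kind of process analysed in Chapter 1 (a plain Galton-Watson simulation, not the Doi-Peliti field-theory calculation), the Python sketch below estimates the mean and variance of the particle number for a geometric offspring distribution around criticality; the function name, the distribution choice, and all parameter values are illustrative assumptions rather than details taken from the thesis.

```python
import numpy as np

def simulate_population(offspring_mean, n_generations, n_runs, rng):
    """Estimate particle-number statistics of a Galton-Watson branching process.

    Offspring numbers are drawn from a geometric distribution on {0, 1, 2, ...}
    with the given mean; the process is critical when that mean equals 1.
    """
    sizes = np.empty(n_runs)
    p = 1.0 / (1.0 + offspring_mean)   # numpy's geometric lives on {1, 2, ...}
    for run in range(n_runs):
        population = 1
        for _ in range(n_generations):
            if population == 0:
                break
            # each particle branches independently; subtracting one per draw
            # shifts numpy's support {1, 2, ...} down to {0, 1, 2, ...}
            population = int(rng.geometric(p, size=population).sum()) - population
        sizes[run] = population
    return sizes.mean(), sizes.var()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    for mean_offspring in (0.95, 1.0, 1.05):   # sub-, near-, and super-critical
        mean, var = simulate_population(mean_offspring, 50, 10_000, rng)
        print(f"offspring mean {mean_offspring}: <N> = {mean:.3f}, Var(N) = {var:.1f}")
```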

    27th Annual European Symposium on Algorithms: ESA 2019, September 9-11, 2019, Munich/Garching, Germany


    LIPIcs, Volume 261, ICALP 2023, Complete Volume


    Random Walks and Bisections in Random Circulant Graphs

    Using number-theoretical tools, we prove two main results for random r-regular circulant graphs with n vertices, when n is sufficiently large and r is fixed. First, for any fixed ε > 0, prime n and L ≥ n^{1/r} (log n)^{1+1/r+ε}, walks of length at most L terminate at every vertex with asymptotically the same probability. Second, for any n, there is a polynomial-time algorithm to find a vertex bisector and an edge bisector, both of size less than n^{1−1/r+o(1)}. As circulant graphs are popular network topologies in distributed computing, we show that our results can be exploited for various information dissemination schemes. In particular, we provide lower bounds on the number of rounds required by any gossiping algorithm for any n. This settles an open question in an earlier work of the authors (2004) and shows that the generic gossiping algorithms of that work are nearly optimal.
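    As a hedged numerical illustration of the first result (an empirical check, not the number-theoretic proof), the Python sketch below builds a random directed r-regular circulant graph on a prime number of vertices, draws many walks of length exactly L taken from the stated bound, and measures how far the endpoint frequencies are from uniform; the simplification to walks of exactly length L, the function name, and the parameter values are my own assumptions.

```python
import numpy as np

def walk_endpoint_spread(n, r, n_walks, eps=0.1, seed=1):
    """Empirical endpoint distribution of length-L walks on a random directed
    r-regular circulant graph with n vertices (take n prime as in the theorem)."""
    rng = np.random.default_rng(seed)
    steps = rng.choice(np.arange(1, n), size=r, replace=False)       # random connection set
    L = int(np.ceil(n ** (1 / r) * np.log(n) ** (1 + 1 / r + eps)))  # length from the stated bound
    # a walk's endpoint depends only on how often each step was used,
    # so draw each walk's step counts from a multinomial distribution
    counts = rng.multinomial(L, np.full(r, 1.0 / r), size=n_walks)
    endpoints = (counts @ steps) % n
    freqs = np.bincount(endpoints, minlength=n) / n_walks
    tv_distance = 0.5 * np.abs(freqs - 1.0 / n).sum()
    return L, tv_distance

if __name__ == "__main__":
    L, tv = walk_endpoint_spread(n=1009, r=3, n_walks=500_000)
    print(f"walk length L = {L}, total-variation distance to uniform ≈ {tv:.3f}")
```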

    A Study on the Properties and Applications of Advanced MCMC Methods for Diffusion Models

    The aim of this thesis is to find ways to make advanced Markov Chain Monte Carlo (MCMC) algorithms more efficient. Our framework is relevant for target distributions defined as changes of measure from Gaussian laws; we use this definition because it provides the flexibility to apply our methods to a wider range of problems, including models driven by Stochastic Differential Equations (SDEs). The advanced MCMC algorithms presented in this thesis are well defined on the infinite-dimensional path space and exhibit superior properties in terms of computational complexity. Because they are well defined, these algorithms have mesh-free mixing properties and their convergence time does not deteriorate when the dimension of the path increases. The contributions we make in this thesis are in four areas. First, we present a new proof for the well-posedness of the advanced Hybrid Monte Carlo (HMC) algorithm; this proof allows us to verify the validity of the required assumptions for well-posedness in several practical applications. Second, by comparing analytically and with numerical examples the computational costs of different algorithms, we show that the advanced Random Walk Metropolis and the Metropolis-adjusted Langevin algorithm (MALA) have similar complexity when applied to 'long' diffusion paths, whereas the HMC algorithm is more efficient than both. Third, we demonstrate that the Golightly-Wilkinson transformation can be applied to a wider range of applications than the typically used Lamperti transformation when using HMC algorithms to sample from complex target distributions such as SDEs with general diffusion coefficients. Fourth, we implemented a novel joint update scheme to sample from a path observed with error, where the path itself was driven by a fractional Brownian motion (fBm) instead of a Wiener process. Here HMC's scaling properties proved desirable, since the non-Markovian properties of fBm made techniques like blocking overly expensive. We achieved this by a well-planned use of the Davies-Harte algorithm to provide the mapping between fBm and uncorrelated white noise, which we used to decouple the model parameters from the high-dimensional latent variables with which they are a priori entangled. Finally, we showed numerically that our proposed algorithm works efficiently and provided ample comparisons to corroborate this.
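    Since the abstract leans on the Davies-Harte algorithm as the map between uncorrelated white noise and fBm, the following Python sketch shows a simplified variant of the underlying circulant-embedding construction: it turns i.i.d. Gaussians into fractional Gaussian noise (unit-spaced fBm increments) via the FFT. The function and variable names are illustrative, and this is a minimal sketch of the standard construction rather than the thesis's implementation.

```python
import numpy as np

def fractional_gaussian_noise(n, hurst, rng):
    """Sample n fractional Gaussian noise values (unit-spaced fBm increments)
    by circulant embedding of the fGn covariance, as in Davies-Harte."""
    k = np.arange(n + 1)
    # autocovariance gamma(k) of fGn with Hurst exponent H
    gamma = 0.5 * ((k + 1.0) ** (2 * hurst) - 2.0 * k ** (2 * hurst)
                   + np.abs(k - 1.0) ** (2 * hurst))
    # first row of the 2n x 2n circulant matrix embedding the covariance
    row = np.concatenate([gamma, gamma[-2:0:-1]])
    eigenvalues = np.fft.fft(row).real
    if np.any(eigenvalues < 0):
        raise ValueError("circulant embedding is not non-negative definite")
    m = 2 * n
    # complex white noise; the real part of the transformed vector has the
    # target covariance when weighted by sqrt(eigenvalues / m)
    white = rng.standard_normal(m) + 1j * rng.standard_normal(m)
    sample = np.fft.fft(np.sqrt(eigenvalues / m) * white)
    return sample.real[:n]

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    increments = fractional_gaussian_noise(1024, hurst=0.7, rng=rng)
    fbm_path = np.concatenate([[0.0], np.cumsum(increments)])  # fBm at integer times
    print(fbm_path[:5])
```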

    Random walks, bisections and gossiping in circulant graphs

    Circulant graphs are regular graphs based on Cayley graphs defined on the Abelian group Zn. They are popular network topologies that arise in distributed computing. Using number-theoretical tools, we first prove two main results for random directed k-regular circulant graphs with n vertices, when n is sufficiently large and k is fixed. First, for any fixed ε > 0, n = p prime and L ≥ p^{1/k} (log p)^{1+1/k+ε}, walks of length at most L terminate at every vertex with asymptotically the same probability. Second, for any n, there is a polynomial-time algorithm that for almost all undirected 2r-regular circulant graphs finds a vertex bisector and an edge bisector, both of size less than n^{1−1/r+o(1)}. We then prove that the latter result also holds for all (rather than for almost all) 2r-regular circulant graphs with n = p, prime, vertices, while, in general, it does not hold for composite n. Using the bisection results, we provide lower bounds on the number of rounds required by any gossiping algorithm for any n. We introduce generic distributed algorithms to solve the gossip problem in any circulant graph. We illustrate the efficiency of these algorithms by giving nearly matching upper bounds on the number of rounds required by these algorithms in the vertex-disjoint and the edge-disjoint paths communication models in particular circulant graphs.
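    The bisectors in the second result are constructed number-theoretically in the paper; as a hedged illustration of what an edge bisector of a circulant graph is, the Python sketch below cuts C_n(S) into two arcs of consecutive vertices and counts the crossing edges. This naive arc cut is my own baseline, not the paper's construction, and its size (roughly twice the sum of the steps) generally does not reach the n^{1−1/r+o(1)} bound.

```python
import numpy as np

def arc_edge_cut(n, steps):
    """Edge cut obtained by splitting the undirected circulant graph C_n(steps)
    into two arcs of consecutive vertices; for distinct steps s < n/2, each
    undirected edge {i, (i + s) mod n} is visited exactly once below."""
    half = n // 2
    cut = 0
    for i in range(n):
        for s in steps:
            j = (i + s) % n
            if (i < half) != (j < half):   # edge crosses the arc boundary
                cut += 1
    return cut

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    n, r = 1009, 3
    steps = sorted(int(s) for s in rng.choice(np.arange(1, n // 2), size=r, replace=False))
    print(f"steps {steps}: naive arc cut = {arc_edge_cut(n, steps)} edges, "
          f"while n^(1 - 1/r) ≈ {n ** (1 - 1 / r):.0f}")
```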

    Élaboration d'une nouvelle métaheuristique pour le partitionnement de graphe : la méthode de fusion-fission. Application au découpage de l'espace aérien (Development of a new metaheuristic for graph partitioning: the fusion-fission method. Application to airspace partitioning)

    In this thesis, we study graph partitioning methods and apply them to the partitioning of airspace, as well as to other problems. Airspace is composed of bounded volumes, called control sectors, each under the responsibility of a controller. Each controller is qualified to work on a set of sectors, called a qualification zone. Sectors are also grouped into control centres, each of which encompasses at least one qualification zone. Within the framework of the Single European Sky, the European Commission has planned the creation of functional airspace blocks. The creation of these blocks between European countries will probably lead to a redesign of the current centres. This thesis proposes tools to support the design of a new partition of European airspace into centres and qualification zones. To this end, several methods are studied: classical partitioning methods, such as load-balancing, region growing, multilevel schemes, and Kernighan-Lin-type algorithms; metaheuristics, such as simulated annealing, ant colony algorithms, and evolutionary algorithms; and a new method that we have developed, fusion-fission. The latter yields the best partitions, in the sense of the cost function used, for the airspace partitioning problem. To diversify its applications, we have also adapted it to image segmentation and document clustering. Finally, the quality of this method has been evaluated on classical graph partitioning benchmarks and compared with competing methods. For several test problems it found partitions whose cost is the lowest obtained so far.
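    The fusion-fission method itself is not described in this abstract; as a hedged illustration of one of the classical baselines listed above, the following Python sketch performs a simulated-annealing bipartition of a small graph, minimising the edge cut with a penalty for imbalance. The example graph, the penalty weight, and the cooling schedule are illustrative assumptions, not details from the thesis.

```python
import math
import random

def simulated_annealing_bipartition(n_nodes, edges, n_steps=20_000, seed=4):
    """Bipartition nodes 0..n_nodes-1 to minimise cut size plus an imbalance penalty."""
    rng = random.Random(seed)
    side = [rng.randint(0, 1) for _ in range(n_nodes)]

    def cost(assignment):
        cut = sum(1 for u, v in edges if assignment[u] != assignment[v])
        imbalance = abs(sum(assignment) - n_nodes / 2)
        return cut + 2.0 * imbalance          # illustrative penalty weight

    current = cost(side)
    for step in range(n_steps):
        temperature = 1.0 * (1.0 - step / n_steps) + 1e-3   # linear cooling
        node = rng.randrange(n_nodes)
        side[node] ^= 1                        # propose flipping one node
        proposed = cost(side)
        if proposed > current and rng.random() >= math.exp((current - proposed) / temperature):
            side[node] ^= 1                    # reject: undo the flip
        else:
            current = proposed                 # accept (Metropolis rule)
    return side, current

if __name__ == "__main__":
    # a 12-node ring: any balanced cut into two arcs crosses exactly 2 edges
    edges = [(i, (i + 1) % 12) for i in range(12)]
    side, final_cost = simulated_annealing_bipartition(12, edges)
    print(side, final_cost)
```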