
    A new, efficient algorithm for the Forest Fire Model

    The Drossel-Schwabl Forest Fire Model is one of the best-studied models of non-conservative self-organised criticality. However, a new algorithm, which allows us to study the model on large statistical and spatial scales, shows that it lacks simple scaling; we thereby show that the considered model is not critical. This paper presents the algorithm and its parallel implementation in detail, together with large-scale numerical results for several observables. The algorithm can easily be adapted to related problems such as percolation. Comment: 38 pages, 28 figures, REVTeX 4, RMP style; v2 adds clarifications, corrections, and reference updates.
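
    For reference, the Drossel-Schwabl dynamics that the paper accelerates are simple to state. The following is a minimal, unoptimised Python sketch of the standard synchronous update on a square lattice; it is not the paper's algorithm, and the growth rate p and lightning rate f are illustrative choices only.

        import numpy as np
        from scipy.ndimage import label

        def dsffm_step(trees, p, f, rng):
            """One sweep of the standard Drossel-Schwabl forest fire model
            with instantaneous burning of the struck cluster."""
            # Tree growth: empty sites become occupied with probability p.
            trees = trees | ((~trees) & (rng.random(trees.shape) < p))
            # Lightning: each tree is struck with probability f; every cluster
            # containing a struck tree burns down completely.
            struck = trees & (rng.random(trees.shape) < f)
            if struck.any():
                clusters, _ = label(trees)  # 4-connected clusters of trees
                burning = np.isin(clusters, np.unique(clusters[struck]))
                trees = trees & ~burning
            return trees

        rng = np.random.default_rng(0)
        forest = np.zeros((256, 256), dtype=bool)
        for _ in range(10_000):
            forest = dsffm_step(forest, p=0.05, f=0.05 / 200, rng=rng)

    A naive implementation like this spends most of its time relabelling clusters on large lattices, which is the kind of overhead an efficient algorithm needs to avoid.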

    High Lundquist Number Simulations of Parker's Model of Coronal Heating: Scaling and Current Sheet Statistics Using Heterogeneous Computing Architectures

    Parker's model [Parker, Astrophys. J., 174, 499 (1972)] is one of the most discussed mechanisms for coronal heating and has generated much debate. We have recently obtained new scaling results for a 2D version of this problem suggesting that the heating rate becomes independent of resistivity in a statistical steady state [Ng and Bhattacharjee, Astrophys. J., 675, 899 (2008)]. Our numerical work has now been extended to 3D using high-resolution MHD simulations. Random photospheric footpoint motion is applied for a time much longer than the correlation time of the motion to obtain converged average coronal heating rates. Simulations are done for different values of the Lundquist number to determine scaling. In the high-Lundquist-number limit (S > 1000), the coronal heating rate obtained is consistent with a trend that is independent of the Lundquist number, as predicted by previous analysis and 2D simulations. We present a scaling analysis showing that when the dissipation time is comparable to or larger than the correlation time of the random footpoint motion, the heating rate tends to become independent of the Lundquist number, and the magnetic energy production is also reduced significantly. We also present a comprehensive reprogramming of our simulation code to run on NVIDIA graphics processing units using the Compute Unified Device Architecture (CUDA) and report code performance on several large-scale heterogeneous machines.
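
    The driving described above is random footpoint motion with a finite correlation time, applied for many correlation times. Purely as an illustration (this is not the authors' MHD or CUDA code), an Ornstein-Uhlenbeck process is one standard way to generate a driving velocity with a prescribed correlation time; tau_c and v_rms below are placeholder parameters.

        import numpy as np

        def ou_velocity(n_steps, dt, tau_c, v_rms, rng):
            """Ornstein-Uhlenbeck process: zero-mean random velocity whose
            autocorrelation decays as exp(-t / tau_c)."""
            v = np.zeros(n_steps)
            kick = v_rms * np.sqrt(2.0 * dt / tau_c)
            for i in range(1, n_steps):
                v[i] = v[i - 1] * (1.0 - dt / tau_c) + kick * rng.standard_normal()
            return v

        rng = np.random.default_rng(1)
        dt, tau_c = 0.01, 1.0
        v = ou_velocity(n_steps=200_000, dt=dt, tau_c=tau_c, v_rms=1.0, rng=rng)
        # Drive for many correlation times so that time-averaged quantities
        # (such as the heating rate) converge.
        print(f"rms velocity {v.std():.2f} over {200_000 * dt / tau_c:.0f} correlation times")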

    Field theories for stochastic processes

    This thesis is a collection of collaborative research work which uses field-theoretic techniques to approach three different areas of stochastic dynamics: branching processes, first-passage times of processes which are subject to both white and coloured noise, and numerical and analytical aspects of first-passage times in fractional Brownian Motion. Chapter 1 (joint work with Rosalba Garcia Millan, Johannes Pausch, and Gunnar Pruessner, appeared in Phys. Rev. E 98 (6):062107) contains an analysis of non-spatial branching processes with arbitrary offspring distribution. Here our focus lies on the statistics of the number of particles in the system at any given time. We calculate a host of observables using Doi-Peliti field theory and find that close to criticality these observables no longer depend on the details of the offspring distribution and are thus universal. In Chapter 2 (joint work with Ignacio Bordeu, Saoirse Amarteifio, Rosalba Garcia Millan, Nanxin Wei, and Gunnar Pruessner, appeared in Sci. Rep. 9:15590) we study the number of sites visited by a branching random walk on general graphs. To do so, we introduce a field-theoretic tracing mechanism which keeps track of all already visited sites. We find the scaling laws of the moments of the distribution near the critical point. Chapter 3 (joint work with Gunnar Pruessner and Guillaume Salbreux, submitted, arXiv:2006.00116) provides an analysis of the first-passage time problem for stochastic processes subject to white and coloured noise. By way of a perturbation theory, I give a systematic and controlled expansion of the moment generating function of first-passage times. In Chapter 4, we revise the tracing mechanism introduced earlier and use it to characterise three extreme-value observables: first-passage times, running maxima, and mean volume explored. By formulating these in field-theoretic language, we are able to derive new results for a class of non-Markovian stochastic processes. Chapters 5 and 6 are concerned with the first-passage time distribution of fractional Brownian Motion. Chapter 5 (joint work with Kay Wiese, appeared in Phys. Rev. E 101 (4):043312) introduces a new algorithm to sample these first-passage times efficiently. Chapter 6 (joint work with Maxence Arutkin and Kay Wiese, submitted, arXiv:1908.10801) gives a field-theoretically obtained perturbative result for the first-passage time distribution in the presence of linear and non-linear drift. Open Access
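
    As a concrete illustration of the object studied in Chapter 1 (this sketch is not taken from the thesis), the Python snippet below simulates a continuous-time branching process with an arbitrary offspring distribution and estimates the mean particle number over time; the critical binary offspring distribution and the branching rate are assumptions made for the demonstration.

        import numpy as np

        def branching_trajectory(t_max, rate, offspring_probs, rng):
            """Continuous-time branching process: each particle waits an
            exponential time, then is replaced by k offspring with
            probability offspring_probs[k]."""
            ks = np.arange(len(offspring_probs))
            t, n = 0.0, 1
            times, counts = [0.0], [1]
            while t < t_max and 0 < n < 10_000:
                t += rng.exponential(1.0 / (rate * n))  # time of next branching event
                n += rng.choice(ks, p=offspring_probs) - 1
                times.append(t)
                counts.append(n)
            grid = np.linspace(0.0, t_max, 200)
            return np.interp(grid, times, counts)  # particle count on a time grid

        rng = np.random.default_rng(2)
        # Critical binary branching: a particle dies or splits in two with equal probability.
        probs = [0.5, 0.0, 0.5]
        runs = np.array([branching_trajectory(50.0, 1.0, probs, rng) for _ in range(2000)])
        print("mean particle number at t ~ 0, 25, 50:", runs.mean(axis=0)[[0, 99, 199]])

    At criticality the mean stays near its initial value while fluctuations grow, which is the regime in which the universal behaviour discussed in Chapter 1 applies.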

    Perceptually Driven Simulation

    This dissertation describes, implements and analyzes a comprehensive system for perceptually driven virtual reality simulation, based on algorithms which dynamically adjust level of detail (LOD) for entity simulation in order to maximize simulation realism as perceived by the viewer. First we review related work in simulation LOD, and describe the weaknesses of the analogy that has traditionally been drawn between simulation LOD and graphical LOD. We describe the process of perceptual criticality modeling for quantitatively estimating the relative importance of different entities in maintaining perceived realism and predicting the consequences of LOD transitions on perceived realism. We present heuristic cognitive models of human perception, memory, and attention to perform this modeling. We then propose the LOD Trader, a framework for perceptually driven LOD selection and an online approximation algorithm for efficiently identifying useful LOD transitions. We then describe alibi generation, a method of retroactively elaborating a human agent's behavior to maintain its realism under prolonged scrutiny from the viewer, and discuss its integration into a heterogeneous perceptually driven simulation. We then present the Marketplace simulation system and describe how perceptually driven simulation techniques were used to maximize perceived realism, and evaluate their success in doing so. Finally, we summarize the dissertation work performed and its expected contributions to real-time modeling and simulation environments.
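
    The core idea of perceptually driven LOD selection is to trade estimated perceptual cost against simulation cost under a budget. The sketch below is a generic greedy illustration of that trade-off in Python, not the dissertation's LOD Trader algorithm; the entity names and scores are invented for the example.

        from dataclasses import dataclass

        @dataclass
        class Transition:
            entity: str
            cpu_saving: float    # simulation cost saved by dropping detail
            realism_cost: float  # estimated perceptual cost of the change

        def select_transitions(candidates, cpu_deficit):
            """Greedy sketch: take the transitions with the lowest perceptual
            cost per unit of CPU saved until the budget deficit is covered."""
            chosen, saved = [], 0.0
            for t in sorted(candidates, key=lambda t: t.realism_cost / t.cpu_saving):
                if saved >= cpu_deficit:
                    break
                chosen.append(t)
                saved += t.cpu_saving
            return chosen

        candidates = [
            Transition("pedestrian_far", cpu_saving=3.0, realism_cost=0.1),
            Transition("vendor_in_view", cpu_saving=5.0, realism_cost=2.0),
            Transition("crowd_offscreen", cpu_saving=8.0, realism_cost=0.2),
        ]
        print([t.entity for t in select_transitions(candidates, cpu_deficit=10.0)])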

    Criticality of large delay tolerant networks via directed continuum percolation in space-time

    Performance evaluation of floating content for context-aware applications

    Context-awareness is a peculiar characteristic of an expanding set of applications that make use of a combination of restricted spatio-temporal locality and mobile communications to deliver a variety of services. Opportunistic communications satisfy well the communication requirements of these applications, because they naturally incorporate context. Recently, an opportunistic communication paradigm called "Floating Content" was proposed to support infrastructure-less, distributed content sharing. It aims at ensuring the availability of data within a certain geographic area called the "anchor zone". In the literature, the focus has been on understanding the asymptotic properties of the floating lifetime, i.e., the duration of time for which content floats in the anchor zone. Our objective, instead, is to characterize the performance of context-aware applications using floating content as a communication service. First, we present a simple approximate analytical model for assessing the viability of floating content as a communication service for context-aware applications. We focus on the "success probability", which captures the likelihood for a user to receive the content when traversing the anchor zone; we apply our analysis to estimate the success probability for three representative categories of context-aware applications, and show how the system can be configured to achieve each application’s target. Second, we investigate the impact of different mobility models on the performance of context-aware applications that use floating content. In particular, we consider four different mobility models and, using extensive simulation experiments, we investigate the performance of three different categories of context-aware applications. By comparing the simulation results to the performance predictions of our previously proposed simple analytical model, we show that our model can provide useful performance predictions even for complex and realistic mobility models. Simulation results under different mobility models also confirm the viability of floating content as a communication service for a variety of context-aware applications. Finally, we investigate the performance of floating content in a real-world setting by developing and deploying an Android mobile application based on floating content in an office and a university campus environment. To the best of our knowledge, this is the first experimental evaluation of a floating content service in a real setting. Our results provide quite interesting indications for the viability and the implementation of applications using floating content in the considered environments. We also propose a novel simple analytical model that accounts for the peculiarities of the mobility patterns in such a real-world setting, and that can accurately predict the effectiveness of floating content for the implementation of context-aware applications in an office and a campus setting. Programa Oficial de Doctorado en Ingeniería Telemática. Presidente: Claudio E. Casetti.- Secretario: Carlos Jesús Bernardos Cano.- Vocal: Antonio García Marqué
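
    The success probability described above can also be approached with a toy Monte Carlo rather than an analytical model (the sketch below is not the thesis's model): randomly walking nodes are confined to a square region representing the anchor zone, the content spreads epidemically on proximity, and the final fraction of nodes holding the content serves as a crude proxy for the success probability. All parameters (node count, transmission range, speed, duration) are illustrative assumptions.

        import numpy as np

        def content_spread(n_nodes=60, half_side=500.0, tx_range=30.0, speed=1.4,
                           dt=1.0, t_sim=3600, seed=3):
            """Toy floating-content run: nodes random-walk inside the anchor
            zone, any node within tx_range of a content holder receives a
            copy, and the final fraction of holders is returned."""
            rng = np.random.default_rng(seed)
            pos = rng.uniform(-half_side, half_side, (n_nodes, 2))
            has_content = np.zeros(n_nodes, dtype=bool)
            has_content[0] = True  # content seeded at one node
            for _ in range(int(t_sim / dt)):
                pos = np.clip(pos + rng.normal(0.0, speed * dt, pos.shape),
                              -half_side, half_side)
                dist = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
                in_range_of_holder = (dist < tx_range) & has_content[None, :]
                has_content |= in_range_of_holder.any(axis=1)
            return has_content.mean()

        print("fraction of nodes reached:", content_spread())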

    Infrastructure-less D2D Communications through Opportunistic Networks

    Mención Internacional en el título de doctor. In recent years, we have experienced several social media blackouts, which have shown how much our daily experiences depend on high-quality communication services. Blackouts have occurred because of technical problems, natural disasters, hacker attacks or even deliberate censorship actions undertaken by governments. In all cases, the spontaneous reaction of people consisted in finding alternative channels and media so as to reach out to their contacts and share their experiences. Thus, it has clearly emerged that, although infrastructure-based networks, and cellular networks in particular, are well engineered and have been extremely successful so far, other paradigms should be explored to connect people. The most promising of today’s alternative paradigms is Device-to-Device (D2D) communication, because it allows for building networks almost freely, and because 5G standards are (for the first time) seriously addressing the possibility of using D2D communications. In this dissertation I look at opportunistic D2D networking, possibly operating in an infrastructure-less environment, and I investigate several schemes through modeling and simulation, deriving metrics that characterize their performance. In particular, I consider variations of the Floating Content (FC) paradigm, which was previously proposed in the technical literature. Using FC, it is possible to probabilistically store information over a given restricted local area of interest by opportunistically spreading it to mobile users while in the area. In more detail, a piece of information, which is injected in the area by delivering it to one or more of the mobile users, is opportunistically exchanged among mobile users whenever they come in proximity of one another, progressively reaching most (ideally all) users in the area and thus making the information dwell in the area of interest, like a sort of distributed storage. While previous works on FC almost exclusively concentrated on the communication component, in this dissertation I look at the storage and computing components of FC, as well as its capability of transferring information from one area of interest to another. I first present background work, including a brief review of my Master Thesis activity, devoted to the design, implementation and validation of a smartphone opportunistic information sharing application. The goal of the app was to collect experimental data that permitted a detailed analysis of the occurring events and a careful assessment of the performance of opportunistic information sharing services. Through experiments, I showed that many key assumptions commonly adopted in analytical and simulation works do not hold with current technologies. I also showed that the high density of devices and the enforcement of long transmission ranges for links at the edge might counter-intuitively impair performance. The insight obtained during my Master Thesis work was extremely useful in devising smart operating procedures for the opportunistic D2D communications considered in this dissertation. In the core of this dissertation, I first propose and study a set of schemes that explore and combine different information dissemination paradigms along with real user mobility and predictions, focused on the smart diffusion of content over disjoint areas of interest.
To analyze the viability of such schemes, I have implemented a Python simulator to evaluate the average availability and lifetime of a piece of information, as well as storage usage and network utilization metrics. Comparing the performance of these predictive schemes with state-of-the-art approaches, results demonstrate the need for smart usage of communication opportunities and storage. The proposed algorithms allow for an important reduction in network activity, decreasing the number of data exchanges by up to 92% and requiring up to 50% less on-device storage, while guaranteeing the dissemination of information with performance similar to legacy epidemic dissemination protocols. In a second step, I have worked on the analysis of the storage capacity of probabilistic distributed storage systems, developing a simple yet powerful information-theoretic analysis based on a mean field model of opportunistic information exchange. I have also extended the previous simulator to compare the numerical results generated by the analytical model with the predictions of realistic simulations under different setups, showing in this way the accuracy of the analytical approach and characterizing the properties of the system storage capacity. I conclude from analysis and simulation results that when the density of contents seeded in a floating system is larger than the maximum amount which can be sustained by the system in steady state, the mean content availability decreases and the stored information saturates due to the effects of resource contention. In the presence of static nodes, in a system with infinite host memory and in the mean field limit, there is no upper bound to the amount of injected contents which a floating system can sustain. However, as in the case with no static nodes, by increasing the injected information the amount of stored information eventually reaches a saturation value, which corresponds to the injected information at which the mean amount of time spent exchanging content during a contact equals the mean duration of a contact. As a final step of my dissertation, I have also explored by simulation the computing and learning capabilities of an infrastructure-less opportunistic communication, storage and computing system, considering an environment that hosts a distributed Machine Learning (ML) paradigm that uses observations collected in the area over which the FC system operates to infer properties of the area. Results show that the ML system can operate in two regimes, depending on the load of the FC scheme. At low FC load, the ML system in each node operates on observations collected by all users and opportunistically shared among nodes. At high FC load, especially when the data to be opportunistically exchanged becomes too large to be transmitted during the average contact time between nodes, the ML system can only exploit the observations endogenous to each user, which are much less numerous. As a result, I conclude that such setups are adequate to support general instances of distributed ML algorithms with continuous learning only under low to medium loads of the FC system. While the load of the FC system induces a sort of phase transition in the ML system performance, the effect of computing load is more progressive: when the computing capacity is not sufficient to train on all observations, some are skipped and performance progressively declines.
In summary, with respect to traditional studies of the FC opportunistic information diffusion paradigm, which only look at the communication component over one area of interest, I have considered three types of extensions by looking at the performance of FC: over several disjoint areas of interest; in terms of information storage capacity; and in terms of computing capacity that supports distributed learning. The three topics are treated respectively in Chapters 3 to 5. This work has been supported by IMDEA Networks Institute. Programa de Doctorado en Ingeniería Telemática por la Universidad Carlos III de Madrid. Presidente: Claudio Ettori Casetti.- Secretario: Antonio de la Oliva Delgado.- Vocal: Christoph Somme
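
The saturation effect described above (the stored information stops growing once the mean time spent exchanging content per contact reaches the mean contact duration) can be illustrated with a small numerical check. The exponential contact durations and the per-item transmission time below are assumptions for the illustration, not the thesis's mean field model.

    import numpy as np

    def mean_items_exchanged(injected_items, item_tx_time, mean_contact, rng, n=200_000):
        """During a contact of random (exponential) duration, a node can
        transfer at most floor(duration / item_tx_time) items, and never
        more than the number of items floating in the system."""
        durations = rng.exponential(mean_contact, n)
        transferable = np.floor(durations / item_tx_time)
        return np.minimum(transferable, injected_items).mean()

    rng = np.random.default_rng(4)
    for k in [1, 5, 10, 20, 50, 100]:
        m = mean_items_exchanged(k, item_tx_time=0.5, mean_contact=5.0, rng=rng)
        print(f"injected items: {k:3d}   mean items exchanged per contact: {m:5.2f}")

The printed values saturate near mean_contact / item_tx_time as the number of injected items grows, mirroring the saturation condition stated in the abstract.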