
    Optimizing experimental parameters for tracking of diffusing particles

    We describe how a single-particle tracking experiment should be designed in order for its recorded trajectories to contain the most information about a tracked particle's diffusion coefficient. The precision of estimators for the diffusion coefficient is affected by motion blur, limited photon statistics, and the length of the recorded time series. We demonstrate for a particle undergoing free diffusion that precision is negligibly affected by motion blur in typical experiments, while optimizing photon counts and the number of recorded frames is the key to precision. Building on these results, we describe for a wide range of experimental scenarios how to choose experimental parameters in order to optimize the precision. Generally, one should choose quantity over quality: experiments should be designed to maximize the number of frames recorded in a time series, even if this means lower information content in individual frames.
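    The "quantity over quality" conclusion is easy to check numerically. The sketch below (not the authors' estimators; free 1D diffusion, no motion blur or localization noise) estimates the diffusion coefficient from single-step displacements and shows that the spread of the estimate shrinks as more frames are recorded.

    ```python
    import random

    def simulate_track(D, dt, n_frames, seed=0):
        """Positions of a freely diffusing 1D particle sampled at interval dt."""
        rng = random.Random(seed)
        sigma = (2 * D * dt) ** 0.5          # std of one displacement
        x, track = 0.0, [0.0]
        for _ in range(n_frames - 1):
            x += rng.gauss(0.0, sigma)
            track.append(x)
        return track

    def estimate_D(track, dt):
        """MSD-based estimator: D ~ <dx^2> / (2 dt) for 1D free diffusion."""
        dx2 = [(b - a) ** 2 for a, b in zip(track, track[1:])]
        return sum(dx2) / (2 * dt * len(dx2))

    # Longer time series -> tighter estimates (quantity over quality).
    D_true, dt = 1.0, 0.05
    short = [estimate_D(simulate_track(D_true, dt, 20, seed=s), dt) for s in range(500)]
    long_ = [estimate_D(simulate_track(D_true, dt, 500, seed=s), dt) for s in range(500)]

    def spread(xs):
        m = sum(xs) / len(xs)
        return (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5

    print(spread(short), spread(long_))      # the 500-frame estimate is much tighter
    ```

    The relative error of this estimator scales roughly as the inverse square root of the number of recorded steps, which is the statistical core of the recommendation above.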

    Temporal Gillespie algorithm: Fast simulation of contagion processes on time-varying networks

    Stochastic simulations are one of the cornerstones of the analysis of dynamical processes on complex networks, and are often the only accessible way to explore their behavior. The development of fast algorithms is paramount to allow large-scale simulations. The Gillespie algorithm can be used for fast simulation of stochastic processes, and variants of it have been applied to simulate dynamical processes on static networks. However, its adaptation to temporal networks remains non-trivial. We here present a temporal Gillespie algorithm that solves this problem. Our method is applicable to general Poisson (constant-rate) processes on temporal networks, stochastically exact, and up to multiple orders of magnitude faster than traditional simulation schemes based on rejection sampling. We also show how it can be extended to simulate non-Markovian processes. The algorithm is easily applicable in practice, and as an illustration we detail how to simulate both Poissonian and non-Markovian models of epidemic spreading. Namely, we provide pseudocode and its implementation in C++ for simulating the paradigmatic Susceptible-Infected-Susceptible and Susceptible-Infected-Recovered models and a Susceptible-Infected-Recovered model with non-constant recovery rates. For empirical networks, the temporal Gillespie algorithm is here typically from 10 to 100 times faster than rejection sampling.
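    To make the idea concrete, here is a minimal Python sketch of a temporal Gillespie loop for the Poissonian SIS model (the paper provides pseudocode and C++; this is an independent, simplified rendering, and the snapshot network and parameters below are made up). A normalized waiting time is drawn once from Exp(1) and "spent" against the total event rate, which changes from snapshot to snapshot.

    ```python
    import math
    import random

    def temporal_gillespie_sis(snapshots, dt, beta, mu, infected, rng):
        """SIS on a temporal network given as a list of per-interval edge lists.

        beta: infection rate per S-I contact, mu: recovery rate per infected
        node. Returns the set of infected nodes after the last snapshot.
        """
        infected = set(infected)
        tau = -math.log(rng.random())            # normalized waiting time ~ Exp(1)
        for edges in snapshots:
            t_rem = dt                           # time left in this snapshot
            while True:
                si = [(u, v) for u, v in edges
                      if (u in infected) != (v in infected)]
                rate = beta * len(si) + mu * len(infected)
                if rate * t_rem < tau:           # no event before snapshot ends
                    tau -= rate * t_rem
                    break
                t_rem -= tau / rate              # advance to the event time
                if rng.random() < beta * len(si) / rate:
                    u, v = rng.choice(si)        # infection along an S-I contact
                    infected.add(u if v in infected else v)
                else:                            # recovery of an infected node
                    infected.discard(rng.choice(sorted(infected)))
                tau = -math.log(rng.random())    # draw the next waiting time
        return infected

    # Toy temporal network: three repeating snapshots on five nodes.
    rng = random.Random(1)
    snaps = [[(0, 1), (1, 2)], [(2, 3)], [(3, 4), (0, 2)]] * 20
    final = temporal_gillespie_sis(snaps, 1.0, 2.0, 0.0, {0}, rng)
    print(final)                                 # with mu = 0, node 0 stays infected
    ```

    The speed advantage over rejection sampling comes from drawing one exponential variate per event rather than one random number per timestep per possible transition.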

    How memory generates heterogeneous dynamics in temporal networks

    Empirical temporal networks display strong heterogeneities in their dynamics, which profoundly affect processes taking place on these networks, such as rumor and epidemic spreading. Despite the recent wealth of data on temporal networks, little work has been devoted to the understanding of how such heterogeneities can emerge from microscopic mechanisms at the level of nodes and links. Here we show that long-term memory effects are present in the creation and disappearance of links in empirical networks. We thus consider a simple generative modeling framework for temporal networks able to incorporate these memory mechanisms. This allows us to study separately the role of each of these mechanisms in the emergence of heterogeneous network dynamics. In particular, we show analytically and numerically how heterogeneous distributions of contact durations, of inter-contact durations and of numbers of contacts per link emerge. We also study the individual effect of heterogeneities on dynamical processes, such as the paradigmatic Susceptible-Infected epidemic spreading model. Our results confirm in particular the crucial role of the distributions of inter-contact durations and of the numbers of contacts per link.
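    A toy simulation illustrates how memory can turn exponential contact durations into heavy-tailed ones (this is an illustration of the general mechanism, not the paper's actual model or its fitted parameters): if a contact of current age n continues with probability n/(n+1), then P(duration >= n) = 1/n, a power law, whereas a memoryless continuation probability gives an exponential tail.

    ```python
    import random

    def duration(rng, memory):
        """Length of one contact: continuation probability depends on its age."""
        n = 1
        while True:
            p = n / (n + 1) if memory else 0.5   # memory: older contacts persist
            if rng.random() >= p:
                return n
            n += 1

    rng = random.Random(0)
    with_mem = [duration(rng, True) for _ in range(10000)]
    no_mem = [duration(rng, False) for _ in range(10000)]

    def tail(xs, t):
        return sum(x >= t for x in xs)

    # Heavy tail with memory (~ 10000/50 = 200 long contacts expected),
    # essentially no contacts of length >= 50 without memory.
    print(tail(with_mem, 50), tail(no_mem, 50))
    ```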

    Compensating for population sampling in simulations of epidemic spread on temporal contact networks

    Data describing human interactions often suffer from incomplete sampling of the underlying population. As a consequence, the study of contagion processes using data-driven models can lead to a severe underestimation of the epidemic risk. Here we present a systematic method to alleviate this issue and obtain a better estimate of the risk in the context of epidemic models informed by high-resolution time-resolved contact data. We consider several such data sets collected in various contexts and perform controlled resampling experiments. We show how the statistical information contained in the resampled data can be used to build a series of surrogate versions of the unknown contacts. We simulate epidemic processes on the resulting reconstructed data sets and show that it is possible to obtain good estimates of the outcome of simulations performed using the complete data set. We discuss limitations and potential improvements of our method.
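    The core intuition can be shown in a few lines (a deliberately simplified sketch, not the paper's surrogate-construction method; population size, participation rate and contact counts below are made up): contacts observed among a sampled subpopulation undercount total activity, but statistics measured inside the sample can be extrapolated to stand in for the unobserved contacts.

    ```python
    import random

    rng = random.Random(0)
    N = 100                                       # true population size
    contacts = [tuple(rng.sample(range(N), 2)) for _ in range(5000)]

    sampled = set(rng.sample(range(N), 60))       # 60% participation
    observed = [(u, v) for u, v in contacts
                if u in sampled and v in sampled]

    # Per-ordered-pair contact rate in the sample, extrapolated to all pairs.
    rate = len(observed) / (len(sampled) * (len(sampled) - 1))
    surrogate_total = rate * N * (N - 1)

    # Raw observed contacts badly undercount; the surrogate estimate is close
    # to the true total of 5000.
    print(len(observed), round(surrogate_total))
    ```

    Real surrogate construction must also reproduce the temporal and structural statistics of contacts, not just their total number, which is what the resampling experiments in the paper address.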

    Impact of spatially constrained sampling of temporal contact networks on the evaluation of the epidemic risk

    The ability to directly record human face-to-face interactions increasingly enables the development of detailed data-driven models for the spread of directly transmitted infectious diseases at the scale of individuals. Complete coverage of the contacts occurring in a population is however generally unattainable, due for instance to limited participation rates or experimental constraints in spatial coverage. Here, we study the impact of spatially constrained sampling on our ability to estimate the epidemic risk in a population using such detailed data-driven models. The epidemic risk is quantified by the epidemic threshold of the susceptible-infectious-recovered-susceptible model for the propagation of communicable diseases, i.e. the critical value of disease transmissibility above which the disease turns endemic. We verify for both synthetic and empirical data of human interactions that the use of incomplete data sets due to spatial sampling leads to the underestimation of the epidemic risk. The bias is however smaller than the one obtained by uniformly sampling the same fraction of contacts: it depends nonlinearly on the fraction of contacts that are recorded and becomes negligible if this fraction is large enough. Moreover, it depends on the interplay between the timescales of population and spreading dynamics.
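    A rough static analogue shows the direction of the bias (this simplification ignores the temporal structure that the paper treats carefully; the random contact matrix is made up): for SIS-like models on an aggregated contact matrix W, the epidemic threshold scales as the inverse spectral radius 1/rho(W), so removing the contacts of unobserved nodes lowers rho and inflates the apparent threshold, i.e. understates the risk.

    ```python
    import random

    def spectral_radius(W, iters=200):
        """Largest eigenvalue of a nonnegative matrix via power iteration."""
        n = len(W)
        v = [1.0] * n
        est = 0.0
        for _ in range(iters):
            w = [sum(W[i][j] * v[j] for j in range(n)) for i in range(n)]
            est = max(abs(x) for x in w)
            v = [x / est for x in w] if est else v
        return est

    rng = random.Random(0)
    n = 30
    W = [[0.0] * n for _ in range(n)]
    for _ in range(200):                          # random weighted contacts
        i, j = rng.sample(range(n), 2)
        W[i][j] += 1.0
        W[j][i] += 1.0

    # "Spatial" sampling: drop every contact of one third of the nodes.
    dropped = set(range(n // 3))
    Ws = [[0.0 if i in dropped or j in dropped else W[i][j] for j in range(n)]
          for i in range(n)]

    full = 1 / spectral_radius(W)                 # threshold proxy, full data
    sampled = 1 / spectral_radius(Ws)             # threshold proxy, sampled data
    print(sampled > full)                         # sampled data understates risk
    ```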

    Data on face-to-face contacts in an office building suggests a low-cost vaccination strategy based on community linkers

    Empirical data on contacts between individuals in social contexts play an important role in providing information for models describing human behavior and how epidemics spread in populations. Here, we analyze data on face-to-face contacts collected in an office building. The statistical properties of contacts are similar to other social situations, but important differences are observed in the contact network structure. In particular, the contact network is strongly shaped by the organization of the offices in departments, which has consequences for the design of accurate agent-based models of epidemic spread. We consider the contact network as a potential substrate for infectious disease spread and show that its sparsity tends to prevent outbreaks of rapidly spreading epidemics. Moreover, we define three typical behaviors according to the fraction f of links each individual shares outside their own department: residents, wanderers and linkers. Linkers (f ≈ 50%) act as bridges in the network and have large betweenness centralities. Thus, a vaccination strategy targeting linkers efficiently prevents large outbreaks. As such a behavior may be spotted a priori in the offices' organization or from surveys, without the full knowledge of the time-resolved contact network, this result may help the design of efficient, low-cost vaccination or social-distancing strategies.
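    The linker classification is simple to operationalize. The sketch below uses a small hypothetical two-department network (not the office data; the threshold of 0.4 on f is likewise an illustrative choice): it computes each node's fraction f of cross-department links, flags the linkers, and checks that removing them disconnects the departments.

    ```python
    from collections import defaultdict, deque

    dept = {n: 0 if n < 5 else 1 for n in range(10)}
    edges = [(0, 1), (1, 2), (2, 3), (3, 4), (0, 3),   # department 0
             (5, 6), (6, 7), (7, 8), (8, 9), (6, 9),   # department 1
             (4, 5), (4, 6)]                           # cross-department links

    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)

    # Fraction of each node's links that leave its own department.
    f = {n: sum(dept[m] != dept[n] for m in adj[n]) / len(adj[n]) for n in adj}
    linkers = {n for n, frac in f.items() if frac >= 0.4}   # f near 50%

    def n_components(nodes):
        """Number of connected components of the graph restricted to `nodes`."""
        seen, parts = set(), 0
        for s in nodes:
            if s in seen:
                continue
            parts += 1
            q = deque([s])
            seen.add(s)
            while q:
                for m in adj[q.popleft()]:
                    if m in nodes and m not in seen:
                        seen.add(m)
                        q.append(m)
        return parts

    rest = set(adj) - linkers
    print(sorted(linkers), n_components(rest))   # removing linkers splits the graph
    ```

    On real data the same f statistic can be estimated from surveys or office layout, which is why the strategy is low-cost.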

    Intracellular signaling by diffusion: can waves of hydrogen peroxide transmit intracellular information in plant cells?

    Amplitude- and frequency-modulated waves of Ca(2+) ions transmit information inside cells. Reactive Oxygen Species (ROS), specifically hydrogen peroxide, have been proposed to have a similar role in plant cells. We consider the feasibility of such an intracellular communication system in view of the physical and biochemical conditions in plant cells. As model system, we use a H(2)O(2) signal originating at the plasma membrane (PM) and spreading through the cytosol. We consider two maximally simple types of signals, isolated pulses and harmonic oscillations. First we consider the basic limits on such signals as regards signal origin, frequency, amplitude, and distance. Then we establish the impact of ROS-removing enzymes on the ability of H(2)O(2) to transmit signals. Finally, we consider to what extent cytoplasmic streaming distorts signals. This modeling allows us to predict the conditions under which diffusion-mediated signaling is possible. We show that purely diffusive transmission of intracellular information by H(2)O(2) over a distance of 1 μm (typical distance between organelles, which may function as relay stations) is possible at frequencies well above 1 Hz, which is the highest frequency observed experimentally. This allows both frequency and amplitude modulation of the signal. Signaling over a distance of 10 μm (typical distance between the PM and the nucleus) may be possible, but requires high signal amplitudes or, equivalently, a very low detection threshold. Furthermore, at this longer distance a high rate of enzymatic degradation is required to make signaling at frequencies above 0.1 Hz possible. In either case, cytoplasmic streaming does not seriously disturb signals. 
    We conclude that although purely diffusion-mediated signaling without relay stations is theoretically possible, it is unlikely to work in practice, since it requires a much faster enzymatic degradation and a much lower cellular background concentration of H(2)O(2) than observed experimentally.
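    The distance dependence can be sketched with the standard 1D reaction-diffusion attenuation formula (the numbers below are generic assumptions for illustration, not the paper's fitted values: D ~ 1e-9 m^2/s for a small molecule in cytosol, degradation rate k = 1/s). For a harmonic source c(0,t) = A exp(i 2 pi f t) at the membrane, the amplitude decays as exp(-Re(q) x) with q = sqrt((k + i 2 pi f) / D).

    ```python
    import cmath
    import math

    def attenuation(x, f, D=1e-9, k=1.0):
        """Surviving amplitude fraction of a harmonic signal at depth x.

        x: distance from the source (m), f: frequency (Hz),
        D: diffusion coefficient (m^2/s), k: degradation rate (1/s).
        """
        q = cmath.sqrt((k + 1j * 2 * math.pi * f) / D)
        return math.exp(-q.real * x)

    for x in (1e-6, 10e-6):                 # 1 um vs 10 um
        print(f"{x * 1e6:.0f} um: {attenuation(x, f=1.0):.3g}")
    ```

    With these assumed parameters a 1 Hz signal survives a 1 micrometer gap nearly intact but is substantially attenuated over 10 micrometers, consistent with the qualitative conclusion above that the longer distance demands high amplitudes or a low detection threshold.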

    Compression-based inference of network motif sets

    Physical and functional constraints on biological networks lead to complex topological patterns across multiple scales in their organization. A particular type of higher-order network feature that has received considerable interest is network motifs, defined as statistically regular subgraphs. These may implement fundamental logical and computational circuits and are referred to as "building blocks of complex networks". Their well-defined structures and small sizes also enable the testing of their functions in synthetic and natural biological experiments. The statistical inference of network motifs is however fraught with difficulties, from defining and sampling the right null model to accounting for the large number of possible motifs and their potential correlations in statistical testing. Here we develop a framework for motif mining based on lossless network compression using subgraph contractions. The minimum description length principle allows us to select the most significant set of motifs as well as other prominent network features in terms of their combined compression of the network. The approach inherently accounts for multiple testing and correlations between subgraphs and does not rely on a priori specification of an appropriate null model. This provides an alternative definition of motif significance which guarantees more robust statistical inference. Our approach overcomes the common problems in classic testing-based motif analysis. We apply our methodology to perform comparative connectomics by evaluating the compressibility and the circuit motifs of a range of synaptic-resolution neural connectomes.
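    A toy description-length calculation conveys the flavor of the MDL criterion (this is far simpler than the paper's actual coding scheme, and the network sizes and motif counts are made up): a plain edge list costs roughly 2 log2(n) bits per edge, while contracting repeated occurrences of a motif removes edges from the list at the one-time cost of describing the motif itself. A motif is "significant" when the trade is a net win.

    ```python
    import math

    def bits_plain(n_nodes, n_edges):
        """Naive cost of an edge list: two node ids per edge."""
        return 2 * math.log2(n_nodes) * n_edges

    def bits_with_motif(n_nodes, n_edges, k, motif_edges=3, motif_nodes=3):
        """Cost after contracting k disjoint occurrences of a small motif."""
        dictionary = 2 * math.log2(motif_nodes) * motif_edges  # motif stated once
        remaining = n_edges - k * motif_edges                  # contracted away
        return dictionary + 2 * math.log2(n_nodes) * remaining

    n, m, k = 1000, 5000, 200      # hypothetical network, 200 motif occurrences
    saved = bits_plain(n, m) - bits_with_motif(n, m, k)
    print(saved)                    # positive: the motif earns its place
    ```

    Because every candidate motif is judged by the same total code length, multiple testing and correlations between overlapping subgraphs are handled implicitly rather than via a separately specified null model.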

    Estimation of motility parameters from trajectory data: A condensate of our recent results

    Given a theoretical model for a self-propelled particle or micro-organism, how does one optimally determine the parameters of the model from experimental data in the form of a time-lapse recorded trajectory? For very long trajectories, one has very good statistics, and optimality may matter little. However, for biological micro-organisms, one may not control the duration of recordings, and then optimality can matter. This is especially the case if one is interested in individuality and hence cannot improve statistics by taking population averages over many trajectories. One can learn much about this problem by studying its simplest case, pure diffusion with no self-propulsion. This is also an interesting problem in its own right for the very same reasons: interest in individuality and short trajectories. We summarize our recent results on this latter issue here and speculate about the extent to which similar results may be obtained also for self-propelled particles.
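    For the pure-diffusion case, a covariance-based correction illustrates why estimator choice matters on real (noisy) data. The sketch below is a simplified CVE-style estimator, written independently of the authors' publications and ignoring motion blur: localization noise of width sigma inflates the naive MSD estimate by sigma^2/dt, and the negative covariance of successive measured displacements cancels exactly that bias.

    ```python
    import random

    def noisy_track(D, dt, sigma, n, rng):
        """1D diffusive trajectory observed with Gaussian localization error."""
        x, out = 0.0, []
        for _ in range(n):
            out.append(x + rng.gauss(0.0, sigma))     # measured position
            x += rng.gauss(0.0, (2 * D * dt) ** 0.5)  # true displacement
        return out

    def estimate(track, dt):
        d = [b - a for a, b in zip(track, track[1:])]
        msd = sum(v * v for v in d) / len(d)
        cov = sum(a * b for a, b in zip(d, d[1:])) / len(d[1:])
        naive = msd / (2 * dt)                # biased upward by sigma^2 / dt
        cve = msd / (2 * dt) + cov / dt       # covariance term cancels the bias
        return naive, cve

    rng = random.Random(0)
    D, dt, sigma = 1.0, 0.01, 0.2
    naive, cve = estimate(noisy_track(D, dt, sigma, 100000, rng), dt)
    print(naive, cve)    # naive lands near 5, the corrected estimate near 1
    ```

    On short trajectories the variance of such estimators, not just their bias, becomes the limiting factor, which is the regime the results summarized above address.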