
    Optimizing experimental parameters for tracking of diffusing particles

    We describe how a single-particle tracking experiment should be designed in order for its recorded trajectories to contain the most information about a tracked particle's diffusion coefficient. The precision of estimators for the diffusion coefficient is affected by motion blur, limited photon statistics, and the length of the recorded time series. We demonstrate for a particle undergoing free diffusion that precision is negligibly affected by motion blur in typical experiments, while optimizing photon counts and the number of recorded frames is the key to precision. Building on these results, we describe for a wide range of experimental scenarios how to choose experimental parameters in order to optimize the precision. Generally, one should choose quantity over quality: experiments should be designed to maximize the number of frames recorded in a time series, even if this means lower information content in individual frames.
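
    As a minimal illustration of the precision trade-off described above (not the authors' full analysis, which also treats motion blur), the sketch below simulates 1D free diffusion with localization noise and applies a covariance-based estimator of the diffusion coefficient; all parameter values are invented, and the point is simply that the spread of the estimate shrinks as more frames are recorded.

```python
import numpy as np

def simulate_track(n_frames, D=1.0, dt=0.05, sigma_loc=0.05, rng=None):
    """Positions of a freely diffusing 1D particle observed at n_frames time
    points, with Gaussian localization error sigma_loc (motion blur ignored)."""
    rng = rng or np.random.default_rng(0)
    true_pos = np.cumsum(rng.normal(0.0, np.sqrt(2 * D * dt), n_frames))
    return true_pos + rng.normal(0.0, sigma_loc, n_frames)

def cve_estimate(x, dt):
    """Covariance-based estimator of D, unbiased in the presence of static
    localization error: D = <dx_n^2>/(2 dt) + <dx_n dx_{n+1}>/dt."""
    dx = np.diff(x)
    return np.mean(dx ** 2) / (2 * dt) + np.mean(dx[:-1] * dx[1:]) / dt

# The spread (standard deviation) of the estimate over repeated tracks shrinks
# as the number of recorded frames grows -- the "quantity over quality" message.
rng = np.random.default_rng(1)
for n_frames in (20, 100, 500):
    estimates = [cve_estimate(simulate_track(n_frames, rng=rng), 0.05)
                 for _ in range(200)]
    print(n_frames, np.std(estimates))
```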

    Temporal Gillespie algorithm: Fast simulation of contagion processes on time-varying networks

    Stochastic simulations are one of the cornerstones of the analysis of dynamical processes on complex networks, and are often the only accessible way to explore their behavior. The development of fast algorithms is paramount to allow large-scale simulations. The Gillespie algorithm can be used for fast simulation of stochastic processes, and variants of it have been applied to simulate dynamical processes on static networks. However, its adaptation to temporal networks remains non-trivial. We here present a temporal Gillespie algorithm that solves this problem. Our method is applicable to general Poisson (constant-rate) processes on temporal networks, is stochastically exact, and is up to multiple orders of magnitude faster than traditional simulation schemes based on rejection sampling. We also show how it can be extended to simulate non-Markovian processes. The algorithm is easily applicable in practice, and as an illustration we detail how to simulate both Poissonian and non-Markovian models of epidemic spreading. Namely, we provide pseudocode and its implementation in C++ for simulating the paradigmatic Susceptible-Infected-Susceptible and Susceptible-Infected-Recovered models, as well as a Susceptible-Infected-Recovered model with non-constant recovery rates. For empirical networks, the temporal Gillespie algorithm is typically 10 to 100 times faster than rejection sampling.
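
    A minimal Python sketch of the general idea (a constant-rate SIR process driven through a sequence of contact snapshots) is given below. It is not the authors' C++ implementation; the snapshot representation and parameter names are illustrative. The key point is that a single exponential "normalized waiting time" is consumed across snapshots instead of testing every contact for transmission at every step.

```python
import random

def temporal_gillespie_sir(snapshots, dt, beta, mu, seed, rng=None):
    """Sketch of a temporal Gillespie SIR simulation.

    snapshots -- list of contact lists, one per frame; each contact is (u, v)
    dt        -- duration of a frame
    beta      -- transmission rate per susceptible-infectious contact
    mu        -- recovery rate per infectious node
    Returns the set of recovered nodes when the epidemic has died out or the
    recording ends.
    """
    rng = rng or random.Random(1)
    infectious, recovered = {seed}, set()

    tau = rng.expovariate(1.0)                 # normalized time to the next event
    for frame in snapshots:
        remaining = dt                         # time left in the current frame
        while infectious:
            # enumerate the events possible right now, with their rates
            events = [("rec", n, mu) for n in infectious]
            for u, v in frame:
                if u in infectious and v not in infectious and v not in recovered:
                    events.append(("inf", v, beta))
                if v in infectious and u not in infectious and u not in recovered:
                    events.append(("inf", u, beta))
            total_rate = sum(rate for _, _, rate in events)
            if tau > total_rate * remaining:   # next event lies beyond this frame
                tau -= total_rate * remaining
                break
            remaining -= tau / total_rate      # advance to the event time
            # pick one event with probability proportional to its rate
            x, acc = rng.uniform(0.0, total_rate), 0.0
            for kind, node, rate in events:
                acc += rate
                if x <= acc:
                    break
            if kind == "inf":
                infectious.add(node)
            else:
                infectious.remove(node)
                recovered.add(node)
            tau = rng.expovariate(1.0)         # draw the next waiting time
        if not infectious:
            break
    return recovered
```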

    How memory generates heterogeneous dynamics in temporal networks

    Empirical temporal networks display strong heterogeneities in their dynamics, which profoundly affect processes taking place on these networks, such as rumor and epidemic spreading. Despite the recent wealth of data on temporal networks, little work has been devoted to understanding how such heterogeneities can emerge from microscopic mechanisms at the level of nodes and links. Here we show that long-term memory effects are present in the creation and disappearance of links in empirical networks. We thus consider a simple generative modeling framework for temporal networks able to incorporate these memory mechanisms. This allows us to study separately the role of each of these mechanisms in the emergence of heterogeneous network dynamics. In particular, we show analytically and numerically how heterogeneous distributions of contact durations, of inter-contact durations and of numbers of contacts per link emerge. We also study the individual effect of these heterogeneities on dynamical processes, such as the paradigmatic Susceptible-Infected epidemic spreading model. Our results confirm in particular the crucial role of the distributions of inter-contact durations and of the numbers of contacts per link.
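
    As a toy illustration of how memory at the level of a single link can generate broad duration distributions, the sketch below assumes that the probability of switching the link's state decays with the time already spent in that state; the functional form p0/(1 + age) is an assumption made here for illustration, not the model used in the paper.

```python
import random

def simulate_link_with_memory(n_steps, p0=0.5, rng=None):
    """Single link alternating between contact (active) and inter-contact
    (inactive) periods.  The longer the link has been in its current state,
    the less likely it is to switch: p(switch) = p0 / (1 + age).  This kind of
    reinforcement produces heavy-tailed contact and inter-contact durations."""
    rng = rng or random.Random(0)
    active, age = False, 0
    contact_durations, intercontact_durations = [], []
    for _ in range(n_steps):
        if rng.random() < p0 / (1 + age):      # switch state, record duration
            (contact_durations if active else intercontact_durations).append(age + 1)
            active, age = not active, 0
        else:
            age += 1
    return contact_durations, intercontact_durations

durations, gaps = simulate_link_with_memory(100_000)
print(max(durations), max(gaps))   # occasional very long periods appear
```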

    Panel performance: Modelling variation in sensory profiling data by multiway analysis

    Sensory profiling data is essentially three-way data, where samples, attributes and assessors are the three dimensions of information. It is common practice to average over the assessors and focus the analysis on the relations between samples and sensory descriptors. However, since assessor reliability cannot be controlled in advance, post-hoc analysis of the assessors is needed to assess performance at the individual and at the panel level. For this purpose, multiway analysis is a very efficient data analysis method, as it provides information on samples, attributes and assessors simultaneously [1]. PARAllel FACtor (PARAFAC) analysis is one of the most used multiway methods in sensory analysis [2][3]. It is based on two basic assumptions: 1) there exist latent variables behind the identified sensory descriptors describing the variation among the products; 2) assessors have different sensitivities to these common latent variables. However, assessors may perceive the factors differently, so the assumption of “common latent variables” becomes questionable. This may happen when the panel is not well trained and/or the samples present subtle differences that are difficult to detect. In this work, a more flexible approach to the analysis of sensory data is presented. Specifically, the work proposes to use PARAFAC2 modelling [4], as it allows each assessor to have an individual, idiosyncratic perceptive model. The data were obtained from a descriptive sensory analysis of organic milk samples. Results show that PARAFAC2 is very useful to highlight disagreement in the panel on specific attributes and to detect outlying assessors. In addition, by using PARAFAC2 an improvement in the description of the samples is also achieved. On the other hand, PARAFAC is to be preferred to PARAFAC2 when good panel agreement is observed, since it provides more stable solutions and no further gain in information is obtained from PARAFAC2. Finally, the work proposes an index to measure the performance of each assessor based on individual sensitivity and reproducibility.
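
    A minimal sketch of fitting a PARAFAC model to a three-way sensory array is shown below, assuming the tensorly library; the array dimensions and the choice of two components are invented for illustration. PARAFAC2, which relaxes the common-loadings assumption discussed above, is available in the same library (tensorly.decomposition.parafac2).

```python
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac

# Hypothetical profiling data: 10 samples x 8 attributes x 12 assessors.
rng = np.random.default_rng(0)
X = tl.tensor(rng.random((10, 8, 12)))

# Fit a two-component PARAFAC model (two common latent sensory dimensions).
weights, (sample_scores, attribute_loadings, assessor_loadings) = parafac(X, rank=2)

# Assessor-mode loadings express each assessor's sensitivity to the common
# factors; assessors far from the rest of the panel are candidate outliers.
print(assessor_loadings)
```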

    Compensating for population sampling in simulations of epidemic spread on temporal contact networks

    Data describing human interactions often suffer from incomplete sampling of the underlying population. As a consequence, the study of contagion processes using data-driven models can lead to a severe underestimation of the epidemic risk. Here we present a systematic method to alleviate this issue and obtain a better estimation of the risk in the context of epidemic models informed by high-resolution, time-resolved contact data. We consider several such data sets collected in various contexts and perform controlled resampling experiments. We show how the statistical information contained in the resampled data can be used to build a series of surrogate versions of the unknown contacts. We simulate epidemic processes on the resulting reconstructed data sets and show that it is possible to obtain good estimates of the outcome of simulations performed using the complete data set. We discuss limitations and potential improvements of our method.
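
    The sketch below illustrates, under strong simplifications, the kind of surrogate-contact construction described above: unobserved individuals receive contact timelines copied from randomly chosen observed individuals of the same group. The data format and the copy-a-timeline rule are assumptions made here for illustration; the construction in the paper uses richer statistics of the resampled data.

```python
import random
from collections import defaultdict

def build_surrogates(observed_contacts, group_of, missing_nodes, rng=None):
    """observed_contacts -- list of (t, u, v) contact events among observed nodes
    group_of             -- dict node -> group label (e.g. class or department),
                            assumed known for the whole population
    missing_nodes        -- nodes that did not participate in the data collection
    Returns the observed contacts plus surrogate contacts for the missing nodes.
    """
    rng = rng or random.Random(0)
    observed_by_group = defaultdict(set)
    for _, u, v in observed_contacts:
        observed_by_group[group_of[u]].add(u)
        observed_by_group[group_of[v]].add(v)

    augmented = list(observed_contacts)
    for m in missing_nodes:
        # copy the timeline of a random observed individual from the same group
        template = rng.choice(sorted(observed_by_group[group_of[m]]))
        for t, u, v in observed_contacts:
            if u == template:
                augmented.append((t, m, v))
            elif v == template:
                augmented.append((t, u, m))
    return augmented
```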

    Technical Efficiency of the Danish Trawl fleet: Are the Industrial Vessels Better than Others?

    Technical efficiency in the Danish trawl fishery in the North Sea is estimated for 1997 and 1998 by a stochastic production frontier model. This model allows for noise when the frontier and the technical efficiency are estimated, which is a reasonable assumption for fisheries. The results show that the production frontier can be modelled by a translog function without time effects and a technical inefficiency function. The type of fishery (industrial or consumption), the size of the vessel (greater or less than 60 GRT) and the year give a good explanation for the inefficiency in the fleet. The average technical efficiency is estimated to be 0.82. On average, industrial vessels have a higher technical efficiency than human consumption vessels, and smaller vessels have higher technical efficiency than larger vessels. In sum, the analysis reveals that vessels larger than 60 GRT and fishing for industrial species are the most efficient. Keywords: technical efficiency, stochastic production frontier, Danish trawl fishery.
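
    The abstract does not spell out the exact specification; for reference, a generic translog stochastic production frontier with an inefficiency term takes the form below, where y_i is output, x_ij are inputs, v_i is symmetric noise and u_i >= 0 is inefficiency, with technical efficiency TE_i = exp(-u_i).

```latex
\ln y_i \;=\; \beta_0 \;+\; \sum_j \beta_j \ln x_{ij}
      \;+\; \tfrac{1}{2}\sum_j \sum_k \beta_{jk}\, \ln x_{ij}\, \ln x_{ik}
      \;+\; v_i \;-\; u_i,
\qquad v_i \sim N(0,\sigma_v^2),\quad u_i \ge 0,
\qquad \mathrm{TE}_i = e^{-u_i}.
```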

    Composition of volatile compounds in bovine milk heat treated by instant infusion pasteurization and correlation to sensory analysis

    Volatile compounds in skim milk and non-standardised milk subjected to instant infusion pasteurisation at 80°C, 100°C and 120°C were compared with raw milk, high-temperature short-time pasteurised milk and milk pasteurised at 85°C/30 s. The composition of volatile compounds differed between the infusion-pasteurised samples and the reference pasteurisations. The sensory properties of skim milk subjected to instant infusion pasteurisation were described by negative attributes, such as cardboard, sour and plastic flavours, which are not normally associated with fresh milk. Partial least squares modelling showed good correlation between the volatile compounds and the sensory properties, indicating the predictive and possibly causal importance of the volatile compounds for the sensory characteristics.
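
    A minimal sketch of the partial least squares step is given below, using scikit-learn's PLSRegression; the matrices are random stand-ins for the real volatile-compound concentrations and sensory scores, and the choice of two latent components is arbitrary.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Stand-in data: X holds volatile-compound measurements (samples x compounds),
# Y holds panel scores for the sensory attributes (samples x attributes).
rng = np.random.default_rng(0)
X = rng.random((18, 25))
Y = rng.random((18, 6))

pls = PLSRegression(n_components=2)
pls.fit(X, Y)

# With real data, the coefficient of determination quantifies how well the
# volatile profile predicts the sensory profile.
print(pls.score(X, Y))
```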

    High Redshift Standard Candles: Predicted Cosmological Constraints

    We investigate whether future measurements of high-redshift standard candles (HzSCs) will be a powerful probe of dark energy, when compared to other types of planned dark energy measurements. Active galactic nuclei and gamma-ray bursts have both been proposed as potential HzSC candidates. Due to their high luminosity, they can be used to probe unexplored regions in the expansion history of the universe. Information from these regions can help constrain the properties of dark energy and, in particular, whether it varies over time. We consider both linear and piecewise parameterizations of the dark energy equation of state, w(z), and assess the optimal redshift distribution a high-redshift standard-candle survey could take to constrain these models. The more general the form of the dark energy equation of state w(z) being tested, the more useful high-redshift standard candles become. For a linear parameterization of w(z), HzSCs give only small improvements over planned supernova and baryon acoustic oscillation measurements; a wide redshift range with many low-redshift points is optimal to constrain this linear model. However, to constrain a general, and thus potentially more informative, form of w(z), having many HzSCs can significantly improve limits on the nature of dark energy.
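
    To make the role of high-redshift distances concrete, the sketch below computes the distance modulus of a standard candle in a flat universe with a CPL-style dark energy equation of state w(z) = w0 + wa z/(1+z); this particular parameterization and the parameter values are assumptions made here for illustration, not the exact models analysed in the paper.

```python
import numpy as np
from scipy.integrate import quad

C_KM_S = 299792.458  # speed of light [km/s]

def E(z, Om, w0, wa):
    """Dimensionless Hubble rate H(z)/H0 for a flat universe with CPL dark energy."""
    de = (1 - Om) * (1 + z) ** (3 * (1 + w0 + wa)) * np.exp(-3 * wa * z / (1 + z))
    return np.sqrt(Om * (1 + z) ** 3 + de)

def distance_modulus(z, Om=0.3, w0=-1.0, wa=0.0, H0=70.0):
    """Distance modulus of a standard candle at redshift z."""
    dc, _ = quad(lambda zp: 1.0 / E(zp, Om, w0, wa), 0.0, z)  # comoving distance / (c/H0)
    dl_mpc = (1 + z) * (C_KM_S / H0) * dc                     # luminosity distance [Mpc]
    return 5 * np.log10(dl_mpc) + 25                          # mu = 5 log10(D_L / 10 pc)

# e.g. how much a candle at z = 4 separates w0 = -1.0 from w0 = -0.9:
print(distance_modulus(4.0, w0=-1.0) - distance_modulus(4.0, w0=-0.9))
```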

    Monitoring panel performance within and between sensory experiments by multi-way analysis

    In sensory analysis, a panel of trained assessors evaluates a set of samples according to specific sensory descriptors. The training improves the objectivity and reliability of the assessments. However, individual differences between assessors can remain after the training and should be taken into account in the analysis. Monitoring panel performance is therefore crucial for optimal sensory evaluations. The quality of the results is strongly dependent on the performance of each assessor and of the panel as a whole. The present work proposes to analyze panel performance both within single sensory evaluations and between consecutive evaluations. The basic idea is to use multi-way models to handle the three-way nature of the sensory data. Specifically, a PARAFAC model is used to investigate panel performance in a single experiment, an N-PLS model is used to test the predictive ability of the panel in each experiment, and a PARAFAC model is also used for monitoring panel performance across different experiments.
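
    As a simple companion check to the multi-way models described above (and not one of the paper's own indices), the sketch below correlates each assessor's sample-by-attribute scores with the panel-mean profile within one experiment; low correlation flags an assessor who departs from the panel consensus. The data dimensions are invented.

```python
import numpy as np

def assessor_agreement(X):
    """X -- three-way array of scores, shaped (samples, attributes, assessors).
    Returns, per assessor, the correlation of their scores with the panel mean."""
    panel_mean = X.mean(axis=2)                       # consensus profile
    agreement = []
    for a in range(X.shape[2]):
        r = np.corrcoef(X[:, :, a].ravel(), panel_mean.ravel())[0, 1]
        agreement.append(r)
    return np.array(agreement)

# Hypothetical experiment: 10 samples, 8 attributes, 12 assessors.
rng = np.random.default_rng(0)
scores = rng.random((10, 8, 12))
print(assessor_agreement(scores))
```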

    Impact of spatially constrained sampling of temporal contact networks on the evaluation of the epidemic risk

    The ability to directly record human face-to-face interactions increasingly enables the development of detailed data-driven models for the spread of directly transmitted infectious diseases at the scale of individuals. Complete coverage of the contacts occurring in a population is however generally unattainable, due for instance to limited participation rates or to experimental constraints in spatial coverage. Here, we study the impact of spatially constrained sampling on our ability to estimate the epidemic risk in a population using such detailed data-driven models. The epidemic risk is quantified by the epidemic threshold of the susceptible-infectious-recovered-susceptible model for the propagation of communicable diseases, i.e., the critical value of disease transmissibility above which the disease becomes endemic. We verify for both synthetic and empirical data on human interactions that the use of incomplete data sets due to spatial sampling leads to an underestimation of the epidemic risk. The bias is however smaller than the one obtained by uniformly sampling the same fraction of contacts: it depends nonlinearly on the fraction of contacts that are recorded and becomes negligible if this fraction is large enough. Moreover, it depends on the interplay between the timescales of the population and spreading dynamics.
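
    The two sampling schemes compared above can be mimicked on a contact list as in the sketch below; the (t, u, v, location) event format is an assumption made here for illustration. In a resampling experiment, epidemic simulations on the full list and on each sampled version are then compared to quantify the bias.

```python
import random

def spatial_sample(contacts, covered_locations):
    """Spatially constrained sampling: keep only contacts recorded in the
    locations that are covered by the measurement infrastructure."""
    return [c for c in contacts if c[3] in covered_locations]

def uniform_sample(contacts, fraction, rng=None):
    """Uniform sampling: keep each contact independently with probability
    `fraction`, irrespective of where it took place."""
    rng = rng or random.Random(0)
    return [c for c in contacts if rng.random() < fraction]
```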