
    Optimising node selection probabilities in multi-hop M/D/1 queuing networks to reduce latency of Tor

    In this paper, the expected cell latency for multi-hop M/D/1 queuing networks, where users choose nodes randomly according to some distribution, is derived. It is shown that the resulting optimisation surface is convex, and thus gradient-based methods can be used to find the optimal node assignment probabilities. This is applied to a typical snapshot of the Tor anonymity network at 50% usage, and leads to a reduction in expected cell latency from 11.7 ms using the original method of assigning node selection probabilities to 1.3 ms. It is also shown that even if the usage is not known exactly, the proposed method still leads to an improvement. This is the accepted manuscript version. The final version is available from IET at http://digital-library.theiet.org/content/journals/10.1049/el.2014.2136
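
    As a rough illustration of the approach (not the authors' implementation), the sketch below models each relay as an M/D/1 queue, writes the expected per-hop cell latency as a function of the node selection probabilities, and minimises it over the probability simplex. The per-node service rates, the 50%-of-capacity arrival rate, and the use of SLSQP are illustrative assumptions.

```python
# Minimal sketch (not the authors' implementation): each relay is an M/D/1 queue
# and we minimise the expected per-hop cell latency over the selection probabilities p.
# Service rates and the total arrival rate below are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize

mu = np.array([900.0, 600.0, 300.0, 150.0])   # assumed per-node service rates (cells/s)
Lam = 0.5 * mu.sum()                          # total arrival rate: "50% usage" of capacity

def expected_latency(p):
    """Mean per-hop delay: sum_i p_i * (1/mu_i + rho_i / (2 mu_i (1 - rho_i)))."""
    rho = Lam * np.asarray(p) / mu            # utilisation of each node
    if np.any(rho >= 1.0):                    # infeasible: some queue would be unstable
        return 1e6
    wait = rho / (2.0 * mu * (1.0 - rho))     # M/D/1 mean waiting time
    return float(np.sum(p * (1.0 / mu + wait)))

p0 = mu / mu.sum()                            # bandwidth-proportional selection (baseline)
res = minimize(expected_latency, p0, method="SLSQP",
               bounds=[(0.0, 1.0)] * len(mu),
               constraints=[{"type": "eq", "fun": lambda p: p.sum() - 1.0}])

print("baseline latency :", expected_latency(p0))
print("optimised latency:", expected_latency(res.x))
print("optimal p        :", np.round(res.x, 3))
```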

    Bayesian changepoint analysis for atomic force microscopy and soft material indentation

    Material indentation studies, in which a probe is brought into controlled physical contact with an experimental sample, have long been a primary means by which scientists characterize the mechanical properties of materials. More recently, the advent of atomic force microscopy, which operates on the same fundamental principle, has in turn revolutionized the nanoscale analysis of soft biomaterials such as cells and tissues. This paper addresses the inferential problems associated with material indentation and atomic force microscopy through a framework for the changepoint analysis of pre- and post-contact data that is applicable to experiments across a variety of physical scales. A hierarchical Bayesian model is proposed to account for experimentally observed changepoint smoothness constraints and measurement error variability, with efficient Monte Carlo methods developed and employed to realize inference via posterior sampling for parameters such as Young's modulus, a key quantifier of material stiffness. These results are the first to provide the materials science community with rigorous inference procedures and uncertainty quantification, via optimized and fully automated high-throughput algorithms, implemented as the publicly available software package BayesCP. To demonstrate the consistent accuracy and wide applicability of this approach, results are shown for a variety of data sets from both macro- and micro-materials experiments, including silicone, neurons, and red blood cells, conducted by the authors and others. Comment: 20 pages, 6 figures; submitted for publication
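
    The sketch below is a much-simplified illustration of a discrete changepoint posterior for a pre-/post-contact curve, not the hierarchical BayesCP model: the data are synthetic, the noise level is assumed known, the post-contact response is taken as linear, and its slope is profiled out by least squares rather than marginalised.

```python
# Simplified, illustrative sketch (not the BayesCP model): posterior over a single
# contact point for a curve that is flat before contact and rises linearly after it.
# The synthetic data, known noise level, and profiled slope are assumptions.
import numpy as np

rng = np.random.default_rng(0)
z = np.linspace(0.0, 1.0, 400)                 # probe position
tau_true, slope_true, sigma = 0.6, 5.0, 0.05
y = np.where(z < tau_true, 0.0, slope_true * (z - tau_true)) + rng.normal(0, sigma, z.size)

log_post = np.full(z.size, -np.inf)
for k in range(5, z.size - 5):                 # candidate changepoint indices
    post = z[k:] - z[k]
    slope = post @ y[k:] / (post @ post)       # least-squares slope after contact
    resid = np.concatenate([y[:k], y[k:] - slope * post])
    log_post[k] = -0.5 * np.sum(resid**2) / sigma**2   # Gaussian log-likelihood, flat prior

log_post -= log_post.max()
post_prob = np.exp(log_post)
post_prob /= post_prob.sum()

k_map = int(np.argmax(post_prob))
print("true contact point      :", tau_true)
print("MAP estimate            :", round(z[k_map], 3))
print("posterior spread (in z) :", round(float(np.sqrt(np.sum(post_prob * (z - z[k_map])**2))), 3))
```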

    Bayesian separation of spectral sources under non-negativity and full additivity constraints

    This paper addresses the problem of separating spectral sources which are linearly mixed with unknown proportions. The main difficulty of the problem is to ensure the full additivity (sum-to-one) of the mixing coefficients and the non-negativity of both the sources and the mixing coefficients. A Bayesian estimation approach based on Gamma priors was recently proposed to handle the non-negativity constraints in a linear mixture model. However, incorporating the full additivity constraint requires further developments. This paper studies a new hierarchical Bayesian model appropriate to the non-negativity and sum-to-one constraints associated with the regressors and regression coefficients of linear mixtures. The estimation of the unknown parameters of this model is performed using samples generated by an appropriate Gibbs sampler. The performance of the proposed algorithm is evaluated through simulations conducted on synthetic mixture models. The proposed approach is also applied to the processing of multicomponent chemical mixtures resulting from Raman spectroscopy. Comment: v4: minor grammatical changes; Signal Processing, 200
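
    As a hedged illustration of how the non-negativity and sum-to-one constraints on the mixing coefficients can be enforced, the sketch below runs a random-walk Metropolis sampler on a softmax reparameterisation with known source spectra and Gaussian noise; this is a deliberate simplification of the hierarchical model and Gibbs sampler described above, and the spectra, noise level, and prior are assumptions.

```python
# Minimal sketch (a simplification of the paper's hierarchical Gibbs scheme):
# estimate mixing coefficients c >= 0 with sum(c) = 1 for known source spectra S,
# via random-walk Metropolis on an unconstrained vector u with c = softmax(u).
import numpy as np

rng = np.random.default_rng(1)
wl = np.linspace(0, 1, 200)
S = np.stack([np.exp(-(wl - m)**2 / 0.005) for m in (0.3, 0.5, 0.8)])   # assumed known bands
c_true = np.array([0.2, 0.5, 0.3])
sigma = 0.02
y = c_true @ S + rng.normal(0, sigma, wl.size)                          # observed mixture

def softmax(u):
    e = np.exp(u - u.max())
    return e / e.sum()

def log_post(u):
    resid = y - softmax(u) @ S
    return -0.5 * np.sum(resid**2) / sigma**2 - 0.5 * np.sum(u**2)      # Gaussian prior on u

u = np.zeros(3)
samples, lp = [], log_post(u)
for it in range(20000):
    u_prop = u + 0.1 * rng.normal(size=3)                               # symmetric proposal
    lp_prop = log_post(u_prop)
    if np.log(rng.uniform()) < lp_prop - lp:                            # Metropolis accept
        u, lp = u_prop, lp_prop
    if it >= 5000:                                                      # discard burn-in
        samples.append(softmax(u))

est = np.mean(samples, axis=0)
print("true coefficients         :", c_true)
print("posterior mean (sums to 1):", np.round(est, 3))
```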

    INTEGRAL/SPI data segmentation to retrieve sources intensity variations

    Context. The INTEGRAL/SPI X/γ-ray spectrometer (20 keV–8 MeV) is an instrument for which recovering source intensity variations is not straightforward and can constitute a difficulty for data analysis. In most cases, determining the source intensity changes between exposures is largely based on a priori information. Aims. We propose techniques that help to overcome this difficulty and make this step more rational. In addition, the constructed “synthetic” light curves should permit us to obtain a sky model that describes the data better and optimizes the source signal-to-noise ratios. Methods. For this purpose, the time intensity variation of each source was modeled as a combination of piecewise segments of time during which a given source exhibits a constant intensity. To optimize the signal-to-noise ratios, the number of segments was minimized. We present a first method that takes advantage of time series that can be obtained from another instrument on-board the INTEGRAL observatory. A data segmentation algorithm was then used to synthesize the time series into segments. The second method no longer needs external light curves, but uses solely the SPI raw data. For this, we developed a specific algorithm that involves the SPI transfer function. Results. The time segmentation algorithms that were developed solve a difficulty inherent to the SPI instrument, namely the intensity variations of sources between exposures, and allow us to obtain more information about the sources’ behavior.
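
    The segmentation step can be illustrated with a generic penalised least-squares “optimal partitioning” dynamic programme that splits an intensity time series into piecewise-constant segments while keeping the number of segments small; this textbook scheme stands in for the SPI-specific algorithms (which involve the instrument transfer function), and the simulated light curve and penalty value are assumptions.

```python
# Generic sketch of piecewise-constant segmentation (optimal partitioning with a
# per-segment penalty), standing in for the SPI-specific algorithms in the paper.
# The simulated "light curve" and the penalty value are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)
rates = [5.0, 12.0, 7.0]                                   # true intensity per block
y = np.concatenate([rng.poisson(r, 60) for r in rates]).astype(float)
n = len(y)

# Segment cost = sum of squared deviations from the segment mean, via cumulative sums.
c1 = np.concatenate([[0.0], np.cumsum(y)])
c2 = np.concatenate([[0.0], np.cumsum(y**2)])
def cost(i, j):                                            # cost of segment y[i:j], j > i
    s, s2, m = c1[j] - c1[i], c2[j] - c2[i], j - i
    return s2 - s * s / m

beta = 2.0 * np.var(y) * np.log(n)                         # penalty per extra segment
F = np.full(n + 1, np.inf)
F[0] = -beta
last = np.zeros(n + 1, dtype=int)
for t in range(1, n + 1):
    cand = np.array([F[s] + cost(s, t) + beta for s in range(t)])
    last[t] = int(np.argmin(cand))
    F[t] = cand[last[t]]

# Backtrack the optimal segment boundaries (right edges).
cps, t = [], n
while t > 0:
    cps.append(t)
    t = last[t]
cps = cps[::-1]
print("estimated segment boundaries:", cps)                # true boundaries: 60, 120, 180
```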