
    Adaptive Delivery in Caching Networks

    The problem of content delivery in caching networks is investigated for scenarios where multiple users request identical files. Redundant user demands are likely when the file popularity distribution is highly non-uniform or the user demands are positively correlated. An adaptive method is proposed for the delivery of redundant demands in caching networks. Based on the redundancy pattern in the current demand vector, the proposed method decides between the transmission of uncoded messages or the coded messages of [1] for delivery. Moreover, a lower bound on the delivery rate of redundant requests is derived based on a cutset bound argument. The performance of the adaptive method is investigated through numerical examples of the delivery rate of several specific demand vectors, as well as the average delivery rate of a caching network with correlated requests. The adaptive method is shown to considerably reduce the gap between the non-adaptive delivery rate and the lower bound. In some specific cases, using the adaptive method, this gap shrinks by almost 50% for the average rate. Comment: 8 pages, 8 figures. Submitted to IEEE Transactions on Communications in 2015. A short version of this article was published as an IEEE Communications Letters paper with DOI: 10.1109/LCOMM.2016.255814.
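    The adaptive rule described above can be illustrated with a minimal sketch. The coded rate below is the standard coded-caching delivery rate of [1] (for K users, N files, cache size M); the uncoded rate assumes the uncached fraction of each distinct requested file is broadcast once. This is an illustrative toy, not the paper's exact scheme or its lower bound.

```python
def coded_rate(K, N, M):
    # Delivery rate of the coded scheme of [1] (Maddah-Ali-Niesen),
    # valid in the regime where K*M/N is an integer.
    return K * (1 - M / N) / (1 + K * M / N)

def uncoded_rate(n_distinct, N, M):
    # Broadcast the uncached (1 - M/N) fraction of each distinct file once.
    return n_distinct * (1 - M / N)

def adaptive_rate(demands, K, N, M):
    # Toy version of the adaptive choice: pick whichever delivery
    # mode gives the smaller rate for the current demand vector.
    n_distinct = len(set(demands))
    return min(coded_rate(K, N, M), uncoded_rate(n_distinct, N, M))
```

    For example, with K = 4 users, N = 4 files, and cache size M = 1, a fully redundant demand vector favors uncoded delivery, while all-distinct demands favor the coded scheme.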

    On p-Robust Saturation for hp-AFEM

    We consider the standard adaptive finite element loop SOLVE, ESTIMATE, MARK, REFINE, with ESTIMATE being implemented using the p-robust equilibrated flux estimator, and MARK being Dörfler marking. As a refinement strategy we employ p-refinement. We investigate the question by what amount the local polynomial degree on any marked patch has to be increased in order to achieve a p-independent error reduction. The resulting adaptive method can be turned into an instance-optimal hp-adaptive method by the addition of a coarsening routine.
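    The MARK step in the loop above can be sketched as follows. This is a generic implementation of Dörfler (bulk) marking, not the paper's specific p-refinement rule: it selects a minimal set of patches whose squared error indicators account for at least a fraction theta of the total squared error.

```python
def doerfler_mark(indicators, theta=0.5):
    # Sort patch indices by error indicator, largest first.
    order = sorted(range(len(indicators)), key=lambda i: -indicators[i])
    total = sum(e * e for e in indicators)
    marked, acc = [], 0.0
    for i in order:
        marked.append(i)
        acc += indicators[i] ** 2
        # Stop once the marked set carries a theta fraction of the error.
        if acc >= theta * total:
            break
    return marked
```

    In the hp-setting of the abstract, each marked patch would then have its local polynomial degree raised by a sufficiently large amount in the REFINE step.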

    An automatic adaptive method to combine summary statistics in approximate Bayesian computation

    To infer the parameters of mechanistic models with intractable likelihoods, techniques such as approximate Bayesian computation (ABC) are increasingly being adopted. One of the main disadvantages of ABC in practical situations, however, is that parameter inference must generally rely on summary statistics of the data. This is particularly the case for problems involving high-dimensional data, such as biological imaging experiments. However, some summary statistics contain more information about parameters of interest than others, and it is not always clear how to weight their contributions within the ABC framework. We address this problem by developing an automatic, adaptive algorithm that chooses weights for each summary statistic. Our algorithm aims to maximize the distance between the prior and the approximate posterior by automatically adapting the weights within the ABC distance function. Computationally, we use a nearest neighbour estimator of the distance between distributions. We justify the algorithm theoretically based on properties of the nearest neighbour distance estimator. To demonstrate the effectiveness of our algorithm, we apply it to a variety of test problems, including several stochastic models of biochemical reaction networks and a spatial model of diffusion, and compare our results with existing algorithms.
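    The role the weights play in the ABC distance can be sketched as below. This is a minimal rejection-ABC skeleton with a weighted Euclidean distance over summary statistics; the paper's adaptive weight-selection rule (maximizing the prior-to-posterior distance via a nearest neighbour estimator) is not reproduced here, and the function names are illustrative.

```python
def weighted_distance(s_obs, s_sim, weights):
    # Weighted Euclidean distance between observed and simulated summaries;
    # a larger weight makes the corresponding statistic more influential.
    return sum(w * (a - b) ** 2
               for w, a, b in zip(weights, s_obs, s_sim)) ** 0.5

def rejection_abc(s_obs, simulate, prior_sample, weights, eps, n=1000):
    # Keep prior draws whose simulated summaries land within eps
    # of the observed summaries under the weighted distance.
    accepted = []
    for _ in range(n):
        theta = prior_sample()
        if weighted_distance(s_obs, simulate(theta), weights) <= eps:
            accepted.append(theta)
    return accepted
```

    Setting a statistic's weight to zero removes it from the comparison entirely, which is why an automatic weighting scheme can matter so much in practice.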

    Numerical Methods for Singular Perturbation Problems

    Consider the two-point boundary value problem for a stiff system of ordinary differential equations. An adaptive method to solve these problems, even when turning points are present, is discussed.
