
    Averting HIV Infections in New York City: A Modeling Approach Estimating the Future Impact of Additional Behavioral and Biomedical HIV Prevention Strategies

    Background: New York City (NYC) remains an epicenter of the HIV epidemic in the United States. Given the variety of evidence-based HIV prevention strategies available and the significant resources required to implement each of them, comparative studies are needed to identify how to maximize the number of HIV cases prevented most economically. Methods: A new model of HIV disease transmission was developed, integrating information from a previously validated micro-simulation HIV disease progression model. Specification and parameterization of the model and its inputs, including the intervention portfolio, intervention effects and costs, were conducted through a collaborative process between the academic modeling team and the NYC Department of Health and Mental Hygiene. The model projects the impact of different prevention strategies, or portfolios of prevention strategies, on the HIV epidemic in NYC. Results: Ten unique interventions were able to provide a prevention benefit at an annual program cost of less than $360,000, the threshold for consideration as a cost-saving intervention (because of offsets by future HIV treatment costs averted). An optimized portfolio of these specific interventions could result in up to a 34% reduction in new HIV infections over the next 20 years. The cost per infection averted of the portfolio was estimated to be $106,378; the total cost was in excess of $2 billion (over the 20-year period, or approximately $100 million per year, on average). The cost savings from prevented infections were estimated at more than $5 billion (or approximately $250 million per year, on average). Conclusions: Optimal implementation of a portfolio of evidence-based interventions can have a substantial, favorable impact on the ongoing HIV epidemic in NYC and provide future cost savings despite significant initial costs. © 2013 Kessler et al.
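    The abstract's headline figures are simple ratios of one another. The short Python sketch below (a minimal illustration, not the authors' transmission model) shows how the reported totals, annual averages, and cost per infection averted relate; the implied number of infections averted is backed out from the reported figures and is purely illustrative.

    total_portfolio_cost = 2.0e9          # reported: in excess of $2 billion over 20 years
    horizon_years = 20
    cost_per_infection_averted = 106_378  # reported cost per infection averted
    treatment_savings = 5.0e9             # reported savings from prevented infections

    # Annual averages quoted in the abstract (~$100M cost, ~$250M savings per year)
    print(f"Average annual cost:    ${total_portfolio_cost / horizon_years:,.0f}")
    print(f"Average annual savings: ${treatment_savings / horizon_years:,.0f}")

    # Infections averted implied by total cost / cost per infection averted
    print(f"Implied infections averted: {total_portfolio_cost / cost_per_infection_averted:,.0f}")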

    Caching with Partial Adaptive Matching

    We study the caching problem when we are allowed to match each user to one of a subset of caches after its request is revealed. We focus on non-uniformly popular content, specifically when the file popularities obey a Zipf distribution. We study two extremal schemes, one focusing on coded server transmissions while ignoring matching capabilities, and the other focusing on adaptive matching while ignoring potential coding opportunities. We derive the rates achieved by these schemes and characterize the regimes in which one outperforms the other. We also compare them to information-theoretic outer bounds, and finally propose a hybrid scheme that generalizes ideas from the two schemes and performs at least as well as either of them in most memory regimes. Comment: 35 pages, 7 figures. Shorter versions have appeared in IEEE ISIT 2017 and IEEE ITW 201
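    As a concrete point of reference for the popularity model named above, the Python sketch below builds a normalized Zipf popularity profile; the function name, file count, and exponent are illustrative assumptions, not the paper's notation or code.

    import numpy as np

    def zipf_popularities(num_files: int, alpha: float) -> np.ndarray:
        # Normalized Zipf profile: the popularity of the file of rank i is
        # proportional to 1 / i**alpha (lower rank means more popular).
        ranks = np.arange(1, num_files + 1)
        weights = ranks ** (-alpha)
        return weights / weights.sum()

    # Example: 1,000 files with Zipf exponent 0.8; user requests would be
    # drawn from this distribution, so a few popular files dominate demand.
    p = zipf_popularities(num_files=1000, alpha=0.8)
    print(p[:5], p.sum())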

    Relieving the Wireless Infrastructure: When Opportunistic Networks Meet Guaranteed Delays

    Major wireless operators are nowadays facing network capacity issues in striving to meet the growing demands of mobile users. At the same time, 3G-enabled devices increasingly benefit from ad hoc radio connectivity (e.g., Wi-Fi). In this context of hybrid connectivity, we propose Push-and-Track, a content dissemination framework that harnesses ad hoc communication opportunities to minimize the load on the wireless infrastructure while guaranteeing tight delivery delays. It achieves this through a control loop that collects user-sent acknowledgements to determine if new copies need to be reinjected into the network through the 3G interface. Push-and-Track includes multiple strategies to determine how many copies of the content should be injected, when, and to whom. The short delay tolerance of common content, such as news or road traffic updates, makes this kind of content suitable for such a system. Based on a realistic large-scale vehicular dataset from the city of Bologna composed of more than 10,000 vehicles, we demonstrate that Push-and-Track consistently meets its delivery objectives while reducing the use of the 3G network by over 90%. Comment: Accepted at the IEEE WoWMoM 2011 conference
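    To make the acknowledgement-driven control loop more concrete, here is a minimal Python sketch of one decision round; the linear pacing target, the "panic" window near the deadline, and the batch size of reinjected copies are hypothetical stand-ins for the strategies the paper evaluates, not its actual policy.

    import random

    def push_and_track_round(users, have_content, deadline_s, elapsed_s,
                             panic_window_s=60, batch_size=5):
        # `have_content` holds the users whose acknowledgements have been
        # received. Returns the users to receive a fresh copy over 3G.
        expected = len(users) * min(1.0, elapsed_s / deadline_s)  # linear pacing target
        if deadline_s - elapsed_s <= panic_window_s:
            # Close to the deadline: push to everyone still missing the content
            # so the delivery-delay guarantee is met.
            return [u for u in users if u not in have_content]
        if len(have_content) < expected:
            # Behind schedule: reinject a few copies and let ad hoc (e.g. Wi-Fi)
            # dissemination spread them further.
            missing = [u for u in users if u not in have_content]
            return random.sample(missing, k=min(batch_size, len(missing)))
        return []  # on track: rely on opportunistic spreading

    # Hypothetical usage: 100 users, 10-minute deadline, 3 minutes elapsed,
    # 20 acknowledgements received so far.
    to_push = push_and_track_round(list(range(100)), set(range(20)),
                                   deadline_s=600, elapsed_s=180)
    print(len(to_push), "copies reinjected over 3G")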

    Impact of variance components on reliability of absolute quantification using digital PCR

    Background: Digital polymerase chain reaction (dPCR) is an increasingly popular technology for detecting and quantifying target nucleic acids. Its advertised strength is high-precision absolute quantification without needing reference curves. The standard data analytic approach follows a seemingly straightforward theoretical framework but ignores sources of variation in the data generating process. These stem from both technical and biological factors, where we distinguish features that are 1) hard-wired in the equipment, 2) user-dependent, and 3) provided by manufacturers but may be adapted by the user. The impact of the corresponding variance components on the accuracy and precision of target concentration estimators presented in the literature is studied through simulation. Results: We reveal how system-specific technical factors influence accuracy as well as precision of concentration estimates. We find that a well-chosen sample dilution level and modifiable settings such as the fluorescence cut-off for target copy detection have a substantial impact on reliability and can be adapted to the sample analysed in ways that matter. User-dependent technical variation, including pipette inaccuracy and specific sources of sample heterogeneity, leads to a steep increase in uncertainty of estimated concentrations. Users can discover this through replicate experiments and derived variance estimation. Finally, the detection performance can be improved by optimizing the fluorescence intensity cut point, as suboptimal thresholds reduce the accuracy of concentration estimates considerably. Conclusions: Like any other technology, dPCR is subject to variation induced by natural perturbations, systematic settings as well as user-dependent protocols. Corresponding uncertainty may be controlled with an adapted experimental design. Our findings point to modifiable key sources of uncertainty that form an important starting point for the development of guidelines on dPCR design and data analysis with correct precision bounds. Besides clever choices of sample dilution levels, experiment-specific tuning of machine settings can greatly improve results. Well-chosen data-driven fluorescence intensity thresholds in particular result in major improvements in target presence detection. We call on manufacturers to provide sufficiently detailed output data that allows users to maximize the potential of the method in their setting and obtain high precision and accuracy for their experiments.
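    For readers new to dPCR, the "seemingly straightforward theoretical framework" mentioned above is the standard Poisson-based estimator. The Python sketch below shows that textbook calculation with illustrative partition counts and an assumed droplet volume (example values, not from this paper); by design it ignores exactly the variance components the study quantifies.

    import math

    def dpcr_concentration(positive, total, partition_volume_ul):
        # Standard Poisson estimator: mean copies per partition is
        # lambda = -ln(1 - k/n); dividing by partition volume gives copies/µL.
        if positive >= total:
            raise ValueError("All partitions positive: concentration not estimable.")
        lam = -math.log(1.0 - positive / total)
        return lam / partition_volume_ul

    # Illustrative numbers: 7,500 positive partitions out of 20,000,
    # with an assumed droplet volume of 0.85 nL (= 0.85e-3 µL).
    conc = dpcr_concentration(positive=7500, total=20000, partition_volume_ul=0.85e-3)
    print(f"Estimated concentration: {conc:,.0f} copies/µL")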