1,902 research outputs found

    The Cost-Effectiveness of Early Access to HIV Services and Starting cART in the UK 1996–2008

    Get PDF
    To calculate the use, cost and cost-effectiveness of people living with HIV (PLHIV) receiving routine treatment and care before starting combination antiretroviral therapy (cART) and of PLHIV starting first-line 2NRTIs+NNRTI or 2NRTIs+PI(boosted), comparing PLHIV with CD4 ≤200 cells/mm³ and CD4 >200 cells/mm³. Few studies have calculated the use, cost and cost-effectiveness of routine treatment and care before starting cART, or of starting cART above and below CD4 200 cells/mm³. Use, costs and cost-effectiveness were calculated for PLHIV in routine pre-cART care and starting first-line cART, comparing CD4 ≤200 cells/mm³ with CD4 >200 cells/mm³ (2008 UK prices). cART-naïve patients with CD4 ≤200 cells/mm³ had an annual cost of £6,407 (95% CI £6,382 to £6,425) per person-year (PPY) compared with £2,758 (95% CI £2,752 to £2,761) PPY for those with CD4 >200 cells/mm³; the cost per life-year gained of pre-cART treatment and care for those with CD4 >200 cells/mm³ was £1,776 (cost-saving to £2,752). The annual cost of starting 2NRTIs+NNRTI or 2NRTIs+PI(boosted) with CD4 ≤200 cells/mm³ was £12,812 (95% CI £12,685 to £12,937), compared with £10,478 (95% CI £10,376 to £10,581) for PLHIV with CD4 >200 cells/mm³. The cost per additional life-year gained on first-line therapy for those with CD4 >200 cells/mm³ was £4,639 (£3,967 to £2,960). Starting to use HIV services before the CD4 count falls to ≤200 cells/mm³ is cost-effective and enables PLHIV to be monitored so that they start cART with CD4 >200 cells/mm³, which results in better outcomes and is itself cost-effective. However, 25% of PLHIV accessing services continue to present with CD4 ≤200 cells/mm³, which highlights the need to investigate the cost-effectiveness of testing and early treatment programs for key populations in the UK.
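
    The "cost per life-year gained" figures above are incremental cost-effectiveness ratios: the extra cost of one strategy over another divided by the extra life-years it yields. A minimal sketch of that arithmetic is shown below; the input figures are hypothetical placeholders, not the study's cohort data.

```python
# Minimal sketch of a cost-per-life-year-gained (incremental cost-effectiveness
# ratio) calculation. The example inputs are hypothetical placeholders, not
# figures taken from the study above.

def cost_per_life_year_gained(cost_new, cost_old, life_years_new, life_years_old):
    """Incremental cost divided by incremental life-years."""
    delta_cost = cost_new - cost_old
    delta_life_years = life_years_new - life_years_old
    if delta_cost <= 0 and delta_life_years > 0:
        return "cost-saving"  # cheaper and more effective: the new strategy dominates
    return delta_cost / delta_life_years

# Hypothetical lifetime costs (GBP) and life expectancies for two strategies.
print(cost_per_life_year_gained(120_000, 100_000, 14.0, 10.0))  # 5000.0 per life-year gained
```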

    Time-Energy Tradeoffs for Evacuation by Two Robots in the Wireless Model

    Full text link
    Two robots stand at the origin of the infinite line and are tasked with searching collaboratively for an exit at an unknown location on the line. They can travel at maximum speed $b$ and can change speed or direction at any time. The two robots can communicate with each other at any distance and at any time. The task is completed when the last robot arrives at the exit and evacuates. We study time-energy tradeoffs for the above evacuation problem. The evacuation time is the time it takes the last robot to reach the exit. The energy it takes for a robot to travel a distance $x$ at speed $s$ is measured as $xs^2$. The total and makespan evacuation energies are respectively the sum and maximum of the energy consumption of the two robots while executing the evacuation algorithm. Assuming that the maximum speed is $b$ and the evacuation time is at most $cd$, where $d$ is the distance of the exit from the origin, we study the problem of minimizing the total energy consumption of the robots. We prove that the problem is solvable only for $bc \geq 3$. For the case $bc = 3$ we give an optimal algorithm, and we give upper bounds on the energy for the case $bc > 3$. We also consider the problem of minimizing the evacuation time when the available energy is bounded by $\Delta$. Surprisingly, when $\Delta$ is a constant, independent of the distance $d$ of the exit from the origin, we prove that evacuation is possible in time $O(d^{3/2}\log d)$, and this is optimal up to a logarithmic factor. When $\Delta$ is linear in $d$, we give upper bounds on the evacuation time. Comment: This is the full version of the paper with the same title, which will appear in the proceedings of the 26th International Colloquium on Structural Information and Communication Complexity (SIROCCO'19), L'Aquila, Italy, July 1-4, 2019.
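
    The energy model above (energy = distance × speed²) and the total and makespan measures are straightforward to evaluate for any concrete pair of trajectories. A minimal sketch follows; the straight-to-the-exit trajectories in the example are an illustrative assumption that ignores the search for the unknown exit, not the paper's evacuation algorithm.

```python
# Minimal sketch of the energy model described above: a robot that travels a
# distance x at speed s spends energy x * s**2. A trajectory is a list of
# (distance, speed) segments. The trajectories below are hypothetical
# illustrations, not the optimal algorithm from the paper.

def segment_energy(distance, speed):
    return distance * speed ** 2

def robot_energy(trajectory):
    """Total energy of one robot over its (distance, speed) segments."""
    return sum(segment_energy(d, s) for d, s in trajectory)

def total_and_makespan_energy(trajectories):
    """Sum and maximum of the per-robot energies."""
    energies = [robot_energy(t) for t in trajectories]
    return sum(energies), max(energies)

# Example: exit at distance d = 4 from the origin (location assumed known here,
# so the search phase is ignored). Both robots walk straight to it at speed 1.
d = 4.0
robot_a = [(d, 1.0)]
robot_b = [(d, 1.0)]
print(total_and_makespan_energy([robot_a, robot_b]))  # (8.0, 4.0)
```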

    Conditional q-Entropies and Quantum Separability: A Numerical Exploration

    Full text link
    We revisit the relationship between quantum separability and the sign of the relative q-entropies of composite quantum systems. The q-entropies depend on the density matrix eigenvalues p_i through the quantity omega_q = sum_i p_i^q. Rényi's and Tsallis' measures constitute particular instances of these entropies. We perform a systematic numerical survey of the space of mixed states of two-qubit systems in order to determine, as a function of the degree of mixture, and for different values of the entropic parameter q, the volume in state space occupied by those states characterized by positive values of the relative entropy. Similar calculations are performed for qubit-qutrit systems and for composite systems described by Hilbert spaces of larger dimensionality. We pay particular attention to the limit case q → ∞. Our numerical results indicate that, as the dimensionalities of both subsystems increase, composite quantum systems tend, as far as their relative q-entropies are concerned, to behave in a classical way.
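
    A short sketch of the quantities involved is given below: it draws a random mixed state, takes the density-matrix eigenvalues, and evaluates omega_q together with the Rényi and Tsallis entropies built from it. The random-state generator is an illustrative choice, not necessarily the sampling scheme used in the survey.

```python
# Sketch of the quantities mentioned above: omega_q = sum_i p_i**q over the
# density-matrix eigenvalues p_i, with the Renyi and Tsallis entropies as the
# two standard measures built from it. The random-state generator (a
# Hilbert-Schmidt-style construction) is an illustrative assumption, not
# necessarily the sampling scheme used in the paper.
import numpy as np

def random_density_matrix(dim, rng=np.random.default_rng(0)):
    g = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    rho = g @ g.conj().T
    return rho / np.trace(rho).real

def omega_q(eigenvalues, q):
    p = eigenvalues[eigenvalues > 1e-12]   # drop numerical zeros
    return float(np.sum(p ** q))

def renyi_entropy(eigenvalues, q):
    return np.log(omega_q(eigenvalues, q)) / (1.0 - q)

def tsallis_entropy(eigenvalues, q):
    return (1.0 - omega_q(eigenvalues, q)) / (q - 1.0)

rho = random_density_matrix(4)             # a random two-qubit mixed state
p = np.linalg.eigvalsh(rho)
for q in (0.5, 2.0, 5.0):
    print(q, renyi_entropy(p, q), tsallis_entropy(p, q))
```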

    Snapshot Provisioning of Cloud Application Stacks to Face Traffic Surges

    No full text
    Traffic surges, like the Slashdot effect, occur when a web application is overloaded by a huge number of requests, potentially leading to unavailability. Unfortunately, such traffic variations are generally completely unplanned, of great amplitude, concentrated in a very short period, and followed by a delay of variable length before traffic returns to a normal regime. In this report, we introduce PeakForecast, an elastic middleware solution to detect and absorb traffic surges. In particular, from a trace of the queries received in the last seconds, minutes or hours, PeakForecast can detect whether the underlying system is facing a traffic surge, estimate the future traffic using a forecast model with acceptable precision, and then calculate the number of resources required to absorb the traffic still to come. We validate our solution with experimental results demonstrating that it provides instantaneous elasticity of resources for the traffic surges observed on the Japanese version of Wikipedia during the Fukushima Daiichi nuclear disaster in March 2011.
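
    The detect-forecast-provision loop described above can be sketched in a few lines. The double-exponential-smoothing forecast, the simple surge heuristic and the fixed per-server capacity below are illustrative assumptions, not PeakForecast's actual model.

```python
# Minimal sketch of a detect/forecast/provision loop in the spirit described
# above. The Holt smoothing forecast and the fixed per-server capacity are
# illustrative assumptions, not PeakForecast's actual model.
import math

def forecast_next(rates, alpha=0.5, beta=0.3):
    """Holt's linear (level + trend) smoothing over a series of request rates."""
    level, trend = float(rates[0]), 0.0
    for r in rates[1:]:
        last_level = level
        level = alpha * r + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
    return max(level + trend, 0.0)

def servers_needed(predicted_rate, capacity_per_server=500.0):
    """Resources required to absorb the predicted load."""
    return math.ceil(predicted_rate / capacity_per_server)

# Example: request rates (req/s) sampled over the last few minutes.
recent_rates = [300, 320, 400, 650, 1100, 1900]
baseline = sum(recent_rates[:-1]) / len(recent_rates[:-1])
surge_detected = recent_rates[-1] > 2 * baseline   # crude surge heuristic
predicted = forecast_next(recent_rates)
print(surge_detected, round(predicted), servers_needed(predicted))
```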

    New mobilities across the lifecourse: A framework for analysing demographically-linked drivers of migration

    Get PDF
    Taking the life course as the central concern, the authors set out a conceptual framework and define some key research questions for a programme of research that explores how the linked lives of mobile people are situated in time–space within the economic, social, and cultural structures of contemporary society. Drawing on methodologically innovative techniques, these perspectives can offer new insights into the changing nature and meanings of migration across the life course.

    Processing Images from the Zwicky Transient Facility

    Get PDF
    The Zwicky Transient Facility is a new robotic-observing program in which a newly engineered 600-MP digital camera with an unprecedentedly large field of view, 47 square degrees, will be installed on the 48-inch Samuel Oschin Telescope at the Palomar Observatory. The camera will generate $\sim 1$ petabyte of raw image data over three years of operations. In parallel, new hardware and software systems are being developed to process these data in real time and to build a long-term archive for the processed products. The first public release of archived products is planned for early 2019 and will include processed images and astronomical-source catalogs of the northern sky in the $g$ and $r$ bands. Source catalogs will be generated for the archive using two different methods: aperture photometry and point-spread-function fitting. Comment: 6 pages, 4 figures, submitted to RTSRE Proceedings (www.rtsre.org).
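
    Of the two catalog methods mentioned, aperture photometry is the simpler: sum the pixel values inside a circular aperture around a source and subtract a local background estimated from a surrounding annulus. The toy sketch below illustrates the idea; it is a generic example, not the ZTF pipeline's implementation.

```python
# Toy illustration of aperture photometry, one of the two catalog methods
# mentioned above: sum the pixel values inside a circular aperture around a
# source and subtract a background level estimated from a surrounding annulus.
# This is a generic sketch, not the ZTF pipeline's actual implementation.
import numpy as np

def aperture_flux(image, x0, y0, radius, bg_inner, bg_outer):
    yy, xx = np.indices(image.shape)
    r = np.hypot(xx - x0, yy - y0)
    aperture = r <= radius
    annulus = (r >= bg_inner) & (r <= bg_outer)
    background_per_pixel = np.median(image[annulus])
    return image[aperture].sum() - background_per_pixel * aperture.sum()

# Example: a synthetic 64x64 image with a flat noisy background and one fake source.
rng = np.random.default_rng(1)
img = rng.normal(loc=100.0, scale=1.0, size=(64, 64))
img[30:34, 30:34] += 250.0                      # a fake 4x4-pixel source
print(aperture_flux(img, x0=31.5, y0=31.5, radius=5, bg_inner=8, bg_outer=12))
```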