How should HIV resources be allocated? Lessons learnt from applying Optima HIV in 23 countries.
INTRODUCTION: With limited funds available, meeting global health targets requires countries to both mobilize and prioritize their health spending. Within this context, countries have recognized the importance of allocating funds for HIV as efficiently as possible to maximize impact. Over the past six years, the governments of 23 countries in Africa, Asia, Eastern Europe and Latin America have used the Optima HIV tool to estimate the optimal allocation of HIV resources. METHODS: Each study commenced with a request by the national government for technical assistance in conducting an HIV allocative efficiency study using Optima HIV. Each study team validated the required data, calibrated the Optima HIV epidemic model to produce HIV epidemic projections, agreed on cost functions for interventions, and used the model to calculate the allocation of available funds that best addresses national strategic plan targets. From a review and analysis of these 23 country studies, we extract common themes around the optimal allocation of HIV funding in different epidemiological contexts. RESULTS AND DISCUSSION: The optimal distribution of HIV resources depends on the amount of funding available and on the characteristics of each country's epidemic, response and targets. Universally, the modelling results indicated that scaling up treatment coverage is an efficient use of resources. There is scope for efficiency gains by targeting the HIV response towards the populations and geographical regions where HIV incidence is highest. Across a range of countries, the model results indicate that a more efficient allocation of HIV resources could reduce cumulative new HIV infections by an average of 18% over the years to 2020 and 25% over the years to 2030, along with an approximately 25% reduction in deaths over both timelines. However, in most countries this would still not be sufficient to meet the targets of the national strategic plan, with modelling results indicating that budget increases of up to 185% would be required. CONCLUSIONS: Greater epidemiological impact would be possible through better targeting of existing resources, but additional resources would still be required to meet targets. Allocative efficiency models have proven valuable in improving the HIV planning and budgeting process.
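The core of such an analysis is an optimization over nonlinear cost functions: each programme's epidemiological effect saturates with spending, and the question is how to split a fixed budget across programmes. The following is a minimal sketch of that idea with two hypothetical programmes; the saturating curves, parameter values, and programme names are illustrative assumptions, not Optima HIV's actual cost functions or API.

    import numpy as np
    from scipy.optimize import minimize

    # Toy saturating cost functions: epidemiological effect rises with
    # spending but with diminishing returns. All parameters are illustrative.
    def averted(spend, max_effect, scale):
        return max_effect * (1.0 - np.exp(-spend / scale))

    def neg_total_averted(x):
        treatment, prevention = x  # annual spend on two hypothetical programmes
        return -(averted(treatment, 5000, 2e6) + averted(prevention, 3000, 1e6))

    budget = 4e6  # total funds to allocate
    result = minimize(
        neg_total_averted,
        x0=[budget / 2, budget / 2],
        bounds=[(0, budget), (0, budget)],
        constraints=[{"type": "eq", "fun": lambda x: x.sum() - budget}],
    )
    print("optimal split:", result.x, "| infections averted:", -result.fun)

In the actual tool, the objective is supplied by the calibrated epidemic model, and the optimization runs over many programmes, populations, and years rather than two closed-form curves.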
Getting it right when budgets are tight: Using optimal expansion pathways to prioritize responses to concentrated and mixed HIV epidemics.
BACKGROUND: Prioritizing investments across health interventions is complicated by the nonlinear relationship between intervention coverage and epidemiological outcomes. It can be difficult for countries to know which interventions to prioritize for greatest epidemiological impact, particularly when budgets are uncertain. METHODS: We examined four case studies of HIV epidemics in diverse settings, each with different characteristics, based on public data available for Belarus, Peru, Togo, and Myanmar. The Optima HIV model and software package was used to estimate the optimal distribution of resources across interventions for a range of budget envelopes. We constructed "investment staircases", a useful tool for understanding investment priorities, and used them to estimate the best attainable cost-effectiveness of the response at each investment level. FINDINGS: We find that when budgets are very limited, the optimal HIV response consists of a small number of 'core' interventions. As budgets increase, those core interventions should first be scaled up and then new interventions introduced. We estimate that the cost-effectiveness of HIV programming decreases as investment levels increase, but that the overall cost-effectiveness remains below GDP per capita. SIGNIFICANCE: It is important for HIV programming to respond effectively to the overall level of funding available. The analytic tools presented here can help program planners understand the most cost-effective HIV responses and plan for an uncertain future.
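To make the "investment staircase" idea concrete, the sketch below re-runs a toy allocation optimization across a range of budget envelopes and records which programmes enter the optimal mix at each level. The programme names and effect curves are illustrative assumptions, not values from the four country case studies.

    import numpy as np
    from scipy.optimize import minimize

    # Illustrative saturating effect curves (max effect, spending scale)
    # for three hypothetical programmes.
    CURVES = {"ART": (5000, 2e6),
              "key-population prevention": (2000, 8e5),
              "testing": (1500, 5e5)}

    def total_effect(spend):
        return sum(m * (1.0 - np.exp(-s / k))
                   for s, (m, k) in zip(spend, CURVES.values()))

    def staircase(budgets):
        # For each budget envelope, find the optimal split and record it.
        n = len(CURVES)
        rows = []
        for b in budgets:
            res = minimize(lambda x: -total_effect(x), x0=[b / n] * n,
                           bounds=[(0, b)] * n,
                           constraints=[{"type": "eq",
                                         "fun": lambda x, b=b: sum(x) - b}])
            rows.append((b, dict(zip(CURVES, np.round(res.x))), -res.fun))
        return rows

    for budget, alloc, effect in staircase([1e6, 2e6, 4e6, 8e6]):
        print(f"budget {budget:.0e}: effect {effect:.0f}, allocation {alloc}")

Reading the output from the smallest budget upward reproduces the staircase logic: core programmes dominate at low budgets, and additional programmes are introduced only once the core is scaled up.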
Factors Affecting Human Force Perception and Performance in Haptic-Enabled Virtual Environments
Haptic technology enables computer users to touch and manipulate virtual objects in virtual environments (VEs). Like other human-in-the-loop applications, haptic applications involve close interaction between humans and computers, so human-factors studies are needed to characterize the limitations and capabilities of the user. This thesis establishes human-factors criteria to improve haptic applications such as perception-based haptic compression techniques and haptic-enabled computer-aided design (CAD).
Data compression plays a significant role in the transmission of haptic information because efficient use of the available bandwidth is a concern. Most lossy haptic compression techniques exploit the limitations of human force perception. Previous research has studied force perception when the user is in static interaction with a stationary object; this thesis focuses on cases where the user and the object are in relative motion. The limitations of force perception are quantified using psychophysical methods, and the effects of several factors, including user hand velocity and sensory adaptation, are investigated. The results indicate that fewer haptic details need to be calculated or transmitted when the user's hand is in motion.
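A common way to turn such perceptual thresholds into a compression scheme is the deadband approach: a force sample is transmitted only when it differs from the last transmitted sample by more than a just-noticeable difference, often modelled as a Weber fraction of the current force. The sketch below illustrates this idea; the 10% threshold is an illustrative assumption, and the thesis's contribution is precisely to measure how such thresholds change with hand velocity and sensory adaptation.

    import math

    def deadband_compress(samples, weber_fraction=0.1):
        # Transmit a force sample only if it differs from the last transmitted
        # one by more than a just-noticeable difference, modelled here as a
        # fixed Weber fraction of the last transmitted force.
        sent, last = [], None
        for t, force in samples:
            if last is None or abs(force - last) > weber_fraction * abs(last):
                sent.append((t, force))
                last = force
        return sent

    # A slowly varying force signal: most samples fall inside the deadband.
    signal = [(t, 1.0 + 0.3 * math.sin(t / 10.0)) for t in range(100)]
    print(len(deadband_compress(signal)), "of", len(signal), "samples transmitted")

Larger measured thresholds, for example when the hand is moving, would justify a wider deadband and hence fewer transmitted samples.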
In traditional CAD systems, users design virtual prototypes with a mouse, relying on vision alone, and curved surfaces are difficult to design because of the number, shape, and position of the curves involved. Adding haptics to CAD systems enables users to explore and manipulate virtual objects through the sense of touch. Human performance is also important in CAD environments: to maintain accuracy, active haptic manipulation of the user's response can be incorporated into CAD applications. This thesis investigates the effect of forces on the accuracy of movement in VEs. The results indicate that factors such as the base force intensity and force increments/decrements can be used to control users' movements in VEs; in other words, the users' hands can be pulled or pushed by increasing or decreasing the force without the users being aware of it.
The global Optima HIV allocative efficiency model: targeting resources in efforts to end AIDS.
BACKGROUND: To move towards ending AIDS by 2030, HIV resources should be allocated cost-effectively. We used the Optima HIV model to estimate how global HIV resources could be retargeted for greatest epidemiological effect and how many additional new infections could be averted by 2030. METHODS: We collated the standard data used in country modelling exercises (including demographic, epidemiological, behavioural, programmatic, and expenditure data) from Jan 1, 2000, to Dec 31, 2015, for 44 countries, capturing 80% of people living with HIV worldwide. These data were used to parameterise separate subnational and national models within the Optima HIV framework. To estimate optimal resource allocation at subnational, national, regional, and global levels, we used an adaptive stochastic descent optimisation algorithm in combination with the epidemic models and cost functions for each programme in each country. Optimal allocation analyses were done both with international HIV funds to each country held constant and with these funds redistributed between countries. FINDINGS: Without additional funding, if countries were to optimally allocate their HIV resources from 2016 to 2030, we estimate that an additional 7·4 million (uncertainty range 3·9 million-14·0 million) new infections could be averted, representing a 26% (uncertainty range 13-50%) incidence reduction. Redistribution of international funds between countries could avert a further 1·9 million infections, representing a 33% (uncertainty range 20-58%) incidence reduction overall. To reduce HIV incidence by 90% relative to 2010, we estimate that a more than three-fold increase in current annual funds would be necessary until 2030. The most common priorities for optimal resource reallocation are to scale up treatment and prevention programmes targeting the key populations at greatest risk in each setting. Prioritisation of other HIV programmes depends on the epidemiology and cost-effectiveness of service delivery in each setting, as well as on resource availability. INTERPRETATION: Further reductions in global HIV incidence are possible through improved targeting of international and national HIV resources. FUNDING: World Bank and Australian NHMRC.
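The adaptive stochastic descent idea referenced in the methods can be sketched as follows: candidate moves perturb one parameter at a time, and each parameter's step size grows after an improvement and shrinks otherwise. This is a simplified illustration of the general technique, not Optima HIV's implementation, which also adapts the probabilities of selecting each parameter and direction.

    import numpy as np

    def adaptive_stochastic_descent(f, x0, n_iters=500, grow=2.0, shrink=0.5):
        # Minimize f by perturbing one coordinate at a time. An improving
        # step is kept and that coordinate's step size grows; a failed step
        # is discarded and the step size shrinks.
        rng = np.random.default_rng(0)
        x = np.asarray(x0, dtype=float)
        steps = np.ones_like(x)
        fx = f(x)
        for _ in range(n_iters):
            i = rng.integers(len(x))
            candidate = x.copy()
            candidate[i] += rng.choice([-1.0, 1.0]) * steps[i]
            fc = f(candidate)
            if fc < fx:
                x, fx = candidate, fc
                steps[i] *= grow
            else:
                steps[i] *= shrink
        return x, fx

    # Toy usage: the minimum of this shifted quadratic is at (3, 3).
    x_best, f_best = adaptive_stochastic_descent(
        lambda x: float(((x - 3.0) ** 2).sum()), [0.0, 0.0])
    print(x_best, f_best)

The appeal of this family of methods for allocation problems is that each objective evaluation can be a full epidemic-model run, so derivative-free steps with adaptive scaling keep the number of expensive evaluations manageable.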
Monitoring and switching of first-line antiretroviral therapy in adult treatment cohorts in sub-Saharan Africa: Collaborative analysis
Background
HIV-1 viral load testing is recommended to monitor antiretroviral therapy (ART) but is not universally available. The aim of our study was to assess monitoring of first-line ART and switching to second-line ART in sub-Saharan Africa.
Methods
We did a collaborative analysis of cohort studies from 16 countries in east Africa, southern Africa, and west Africa that participate in the International epidemiology Databases to Evaluate AIDS (IeDEA). We included adults infected with HIV-1 who started combination ART between January, 2004, and January, 2013. We defined switching of ART as a change from a non-nucleoside reverse-transcriptase inhibitor (NNRTI)-based regimen to one including a protease inhibitor, with adjustment of one or more nucleoside reverse-transcriptase inhibitors (NRTIs). Virological and immunological failures were defined according to WHO criteria. We calculated cumulative probabilities of switching and hazard ratios with 95% CIs comparing routine viral load monitoring, targeted viral load monitoring, CD4 monitoring, and clinical monitoring, adjusting for programme and individual characteristics.
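As a rough illustration of this kind of time-to-event comparison, the sketch below fits a Cox proportional hazards model to a hypothetical cohort extract, with monitoring strategy dummy-coded against a CD4-monitoring reference. The lifelines package, the column names, and the toy data are assumptions for illustration only, not the study's actual analysis code.

    import pandas as pd
    from lifelines import CoxPHFitter

    # Hypothetical cohort extract: time to switch (years), switch indicator,
    # and monitoring strategy dummy-coded against a CD4-monitoring reference
    # (rows with all three dummies at 0).
    df = pd.DataFrame({
        "years_to_switch": [1.2, 3.4, 0.8, 2.1, 4.0, 1.5, 2.8, 0.6, 3.9, 1.1, 2.5, 4.2],
        "switched":        [1,   0,   1,   1,   0,   1,   0,   0,   1,   1,   1,   0],
        "routine_vl":      [1,   0,   1,   0,   0,   1,   0,   1,   0,   0,   0,   0],
        "targeted_vl":     [0,   1,   0,   1,   0,   0,   1,   0,   0,   1,   0,   0],
        "clinical_only":   [0,   0,   0,   0,   1,   0,   0,   0,   1,   0,   0,   0],
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="years_to_switch", event_col="switched")
    cph.print_summary()  # hazard ratios (exp(coef)) with 95% CIs

In the study itself, the model is additionally adjusted for programme and individual characteristics, and cumulative switching probabilities are estimated alongside the hazard ratios.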
Findings
Of 297 825 eligible patients, 10 352 (3%) switched to second-line ART during 782 412 person-years of follow-up. Compared with CD4 monitoring, hazard ratios for switching were 3·15 (95% CI 2·92–3·40) for routine viral load monitoring, 1·21 (1·13–1·30) for targeted viral load monitoring, and 0·49 (0·43–0·56) for clinical monitoring. Of 6450 patients with confirmed virological failure, 58·0% (95% CI 56·5–59·6) switched by 2 years, and of 15 892 patients with confirmed immunological failure, 19·3% (18·5–20·0) switched by 2 years. Of 10 352 patients who switched, evidence of treatment failure based on one CD4 count or viral load measurement ranged from 86 (32%) of 268 patients with clinical monitoring to 3754 (84%) of 4452 with targeted viral load monitoring. Median CD4 counts at switching were 215 cells per μL (IQR 117–335) with routine viral load monitoring, but were lower with other types of monitoring (range 114–133 cells per μL).
Interpretation
Overall, few patients switched to second-line ART, and in the absence of routine viral load monitoring switching happened late. Switching was more common and happened earlier after initiation of ART with targeted or routine viral load testing.