
    Predicting the risk and speed of drug resistance emerging in soil-transmitted helminths during preventive chemotherapy

    Control of soil-transmitted helminths relies heavily on regular large-scale deworming of high-risk groups (e.g., children) with benzimidazole derivatives. Although drug resistance has not yet been documented in human soil-transmitted helminths, regular deworming of cattle and sheep has led to widespread benzimidazole resistance in veterinary helminths. Here we predict the population dynamics of human soil-transmitted helminth infections and drug resistance during 20 years of regular preventive chemotherapy, using an individual-based model. With the current preventive chemotherapy strategy of mainly targeting children in schools, drug resistance may evolve in soil-transmitted helminths within a decade. More intense preventive chemotherapy strategies increase the prospects of soil-transmitted helminth elimination, but also increase the speed at which drug efficacy declines, especially when implementing community-based preventive chemotherapy (population-wide deworming). If preventive chemotherapy against soil-transmitted helminths has led to resistance during the last decade, we may not have detected it, as drug efficacy has not been monitored systematically, or has been monitored incorrectly. These findings highlight the need to develop and implement strategies to monitor and mitigate the evolution of benzimidazole resistance.

    Mapping onto EQ-5D for patients in poor health

    Background: A growing number of studies report mapping algorithms that predict EQ-5D utility values from disease-specific, non-preference-based measures. Yet many mapping algorithms have been found to systematically overpredict EQ-5D utility values for patients in poor health, and there are currently no guidelines on how to deal with this problem. This paper examines why overestimation of EQ-5D utility values occurs for patients in poor health, and explores possible solutions. Method: Three existing datasets are used to estimate mapping algorithms, and to assess existing mapping algorithms from the literature, mapping the cancer-specific EORTC QLQ-C30 and the arthritis-specific Health Assessment Questionnaire (HAQ) onto the EQ-5D. Separate mapping algorithms are estimated for poor health states, which are defined using cut-off points for the QLQ-C30 and HAQ determined through their association with EQ-5D values. Results: All mapping algorithms suffer from overprediction of utility values for patients in poor health. The large decrement for reporting 'extreme problems' in the EQ-5D tariff, few observations at the most severe level in any EQ-5D dimension, and many observations at the least severe level led to a bimodal distribution of EQ-5D index values, which is related to the overprediction of utility values for patients in poor health. Separate algorithms are proposed here to predict utility values for patients in poor health, selected using cut-off points for the HAQ-DI (> 2.0) and QLQ-C30 (< 45, averaged over the QLQ-C30 functioning scales). The separate QLQ-C30 algorithm performed better than existing mapping algorithms for predicting utility values for patients in poor health, but still did not accurately predict mean utility values. A separate HAQ algorithm could not be estimated due to data restrictions. Conclusion: Mapping algorithms overpredict utility values for patients in poor health but are nonetheless used in cost-effectiveness analyses. Guidelines can be developed on when the use of a mapping algorithm is inappropriate, for instance through the identification of cut-off points. Cut-off points on a disease-specific questionnaire can be identified through their association with the causes of overprediction; those found in this study represent severely impaired health. Specifying a separate mapping algorithm to predict utility values for individuals in poor health greatly reduces overprediction, but does not fully solve the problem.
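
The two-regime approach described above, routing patients past a severity cut-off to a separate mapping algorithm, can be sketched as follows. Only the HAQ-DI > 2.0 cut-off comes from the abstract; the coefficients are placeholders for illustration, not the algorithms estimated in the study.

```python
# Sketch: route patients past the severity cut-off to a separate
# "poor health" mapping algorithm. Coefficients are placeholders,
# not the study's estimated algorithms.
def predict_eq5d(haq_score):
    if haq_score > 2.0:                            # poor-health regime (HAQ-DI > 2.0)
        return 0.35 - 0.20 * (haq_score - 2.0)     # placeholder severe model
    return 0.90 - 0.25 * haq_score                 # placeholder standard model

print(round(predict_eq5d(1.0), 2))   # standard algorithm applies
print(round(predict_eq5d(3.0), 2))   # separate poor-health algorithm applies
```

The point of the split is that a single linear mapping cannot track the bimodal EQ-5D index distribution; fitting the severe tail separately reduces (but, per the abstract, does not eliminate) overprediction.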

    Balancing equity and efficiency in the Dutch basic benefits package using the principle of proportional shortfall

    Economic evaluations are increasingly used to inform decisions regarding the allocation of scarce health care resources. To systematically incorporate societal preferences into these evaluations, quality-adjusted life year gains could be weighted according to some equity principle, the most suitable of which is a matter of frequent debate. While many countries still struggle with equity concerns for priority setting in health care, the Netherlands has reached a broad consensus to use the concept of proportional shortfall. Our study evaluates the concept and its support in the Dutch health care context. We discuss arguments in the Netherlands for using proportional shortfall and difficulties in transitioning from principle to practice. In doing so, we address universal issues leading to a systematic consideration of equity concerns for priority setting in health care. The article thus has relevance to all countries struggling with the formalization of equity concerns for priority setting

    Peroxiredoxin 4, a novel circulating biomarker for oxidative stress and the risk of incident cardiovascular disease and all-cause mortality

    BACKGROUND: Oxidative stress has been suggested to play a key role in the development of cardiovascular disease (CVD). The aim of our study was to investigate the associations of serum peroxiredoxin 4 (Prx4), a hydrogen peroxide-degrading peroxidase, with incident CVD and all-cause mortality. We subsequently examined the incremental value of Prx4 for the risk prediction of CVD compared with the Framingham risk score (FRS). METHODS AND RESULTS: We performed Cox regression analyses in 8141 participants without history of CVD (aged 28 to 75 years; women 52.6%) from the Prevention of Renal and Vascular End-stage Disease (PREVEND) study in Groningen, The Netherlands. Serum Prx4 was measured by an immunoluminometric assay in baseline samples. Main outcomes were: (1) incident CVD events or CVD mortality and (2) all-cause mortality during a median follow-up of 10.5 years. In total, 708 participants (7.8%) developed CVD events or CVD mortality, and 517 participants (6.3%) died. Baseline serum Prx4 levels were significantly higher in participants with incident CVD events or CVD mortality and in those who died than in participants who remained free of outcomes (both P<0.001). In multivariable models with adjustment for Framingham risk factors, hazard ratios were 1.16 (95% CI 1.06 to 1.27, P<0.001) for incident CVD events or CVD mortality and 1.17 (95% CI 1.06 to 1.29, P=0.003) for all-cause mortality per doubling of Prx4 levels. After the addition of Prx4 to the FRS, the net reclassification improvement was 2.7% (P=0.01) using 10-year risk categories of CVD. CONCLUSIONS: Elevated serum Prx4 levels are associated with a significantly higher risk of incident CVD events or CVD mortality and all-cause mortality after adjustment for clinical risk factors. The addition of Prx4 to the FRS marginally improved risk prediction of future CVD
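
The "per doubling" hazard ratios reported above correspond to fitting the Cox model on a log2-transformed biomarker, so that a one-unit covariate increase means a doubling of the serum level. A minimal sketch of the arithmetic, where the coefficient is a hypothetical value back-calculated for illustration rather than taken from the study:

```python
import math

# With log2(Prx4) as the covariate, HR per doubling = exp(beta).
# beta here is a hypothetical coefficient chosen to reproduce an
# HR of about 1.16; it is not a published estimate.
beta = 0.148
hr_per_doubling = math.exp(beta)
print(round(hr_per_doubling, 2))  # ~1.16
```

The same transformation explains why a confidence interval "per doubling" is obtained by exponentiating the interval for beta on the log2 scale.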

    Predicting lymphatic filariasis transmission and elimination dynamics using a multi-model ensemble framework

    Mathematical models of parasite transmission provide powerful tools for assessing the impacts of interventions. Owing to complexity and uncertainty, no single model may capture all features of transmission and elimination dynamics. Multi-model ensemble modelling offers a framework to help overcome biases of single models. We report on the development of a first multi-model ensemble of three lymphatic filariasis (LF) models (EPIFIL, LYMFASIM, and TRANSFIL), and evaluate its predictive performance in comparison with that of the constituents using calibration and validation data from three case study sites, one each from the three major LF endemic regions: Africa, Southeast Asia and Papua New Guinea (PNG). We assessed the performance of the respective models for predicting the outcomes of annual MDA strategies for various baseline scenarios thought to exemplify the current endemic conditions in the three regions. The results show that the constructed multi-model ensemble outperformed the single models when evaluated across all sites. Single models that best fitted calibration data tended to do less well in simulating the out-of-sample, or validation, intervention data. Scenario modelling results demonstrate that the multi-model ensemble is able to compensate for variance between single models in order to produce more plausible predictions of intervention impacts. Our results highlight the value of an ensemble approach to modelling parasite control dynamics. However, its optimal use will require further methodological improvements as well as consideration of the organizational mechanisms required to ensure that modelling results and data are shared effectively between all stakeholders
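
The ensemble idea in the abstract can be sketched as a performance-weighted average of the constituent models' predictions. The inverse-error weighting scheme and all numbers below are illustrative assumptions, not the published ensemble method:

```python
# Sketch of a performance-weighted multi-model ensemble: each model
# supplies a prediction and a calibration error; weights are
# inverse-error, normalized to sum to one. Numbers are illustrative.
predictions = {"EPIFIL": 0.8, "LYMFASIM": 1.2, "TRANSFIL": 1.0}   # e.g. % mf prevalence
calib_error = {"EPIFIL": 0.20, "LYMFASIM": 0.10, "TRANSFIL": 0.25}

weights = {m: 1.0 / e for m, e in calib_error.items()}
total = sum(weights.values())
ensemble = sum(weights[m] * predictions[m] for m in predictions) / total
print(round(ensemble, 3))
```

Averaging in this way pulls the combined prediction toward the better-calibrated models while damping the idiosyncratic bias of any single one, which is the behaviour the abstract attributes to the ensemble.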

    Structural Uncertainty in Onchocerciasis Transmission Models Influences the Estimation of Elimination Thresholds and Selection of Age Groups for Seromonitoring

    Background: The World Health Organization recommends monitoring Ov16 serology in children aged <10 years for stopping mass ivermectin administration. Transmission models can help to identify the most informative age groups for serological monitoring and investigate the discriminatory power of serology-based elimination thresholds. Model predictions will depend on assumed age-exposure patterns and transmission efficiency at low infection levels. Methods: The individual-based transmission model EPIONCHO-IBM was used to assess: (i) the most informative age groups for serological monitoring, using receiver operating characteristic curves for different elimination thresholds under various age-dependent exposure assumptions, including those of ONCHOSIM (another widely used model); and (ii) the influence of within-human density-dependent parasite establishment (included in EPIONCHO-IBM but not in ONCHOSIM) on positive predictive values for different serological thresholds. Results: When assuming EPIONCHO-IBM exposure patterns, under-10s are the most informative age group for seromonitoring; when assuming ONCHOSIM's exposure patterns, 5–15-year-olds are the most informative (as published elsewhere). Omitting density-dependent parasite establishment results in more lenient seroprevalence thresholds, even for higher baseline infection prevalence and shorter treatment durations. Conclusions: Selecting appropriate seromonitoring age groups depends critically on age-dependent exposure patterns. The role of density dependence in elimination thresholds largely explains the differing EPIONCHO-IBM and ONCHOSIM elimination predictions

    Effectiveness of a triple-drug regimen for global elimination of lymphatic filariasis: a modelling study

    Background: Lymphatic filariasis is targeted for elimination as a public health problem by 2020. The principal approach used by current programmes is annual mass drug administration with two pairs of drugs with a good safety profile. However, one dose of a triple-drug regimen (ivermectin, diethylcarbamazine, and albendazole) has been shown to clear the transmissible stage of the helminth completely in treated individuals. The aim of this study was to use modelling to assess the potential value of mass drug administration with the triple-drug regimen for accelerating elimination of lymphatic filariasis in different epidemiological settings. Methods: We used three different transmission models to compare the number of rounds of mass drug administration needed to achieve a prevalence of microfilaraemia less than 1% with the triple-drug regimen and with current two-drug regimens. Findings: In settings with a low baseline prevalence of lymphatic filariasis (5%), the triple-drug regimen reduced the number of rounds of mass drug administration needed to reach the target prevalence by one or two rounds, compared with the two-drug regimen. For areas with higher baseline prevalence (10–40%), the triple-drug regimen strikingly reduced the number of rounds of mass drug administration needed, by about four or five, but only at moderate-to-high levels of population coverage (>65%) and if systematic non-adherence to mass drug administration was low. Interpretation: Simulation modelling suggests that the triple-drug regimen has the potential to accelerate the elimination of lymphatic filariasis if high population coverage of mass drug administration can be achieved and if systematic non-adherence to mass drug administration is low. Future work will reassess these estimates in light of more clinical trial data and examine the effect on individual countries' programmes
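
The round-counting comparison above can be illustrated with a deliberately simplified toy calculation, not any of the three transmission models: if each MDA round cuts microfilaraemia prevalence by a fixed fraction (coverage times per-round efficacy), the rounds needed to reach a target follow from geometric decline. All parameter values below are hypothetical.

```python
import math

# Toy geometric-decline model: each round multiplies prevalence by
# (1 - coverage * efficacy). Real models include transmission dynamics,
# so this only illustrates why higher per-round efficacy and coverage
# shrink the required number of rounds.
def rounds_needed(baseline, target, coverage, efficacy):
    per_round_reduction = coverage * efficacy
    return math.ceil(math.log(target / baseline) /
                     math.log(1.0 - per_round_reduction))

# Hypothetical numbers: 10% baseline, 1% target, 65% coverage
print(rounds_needed(0.10, 0.01, 0.65, 0.55))  # weaker (two-drug-like) assumption
print(rounds_needed(0.10, 0.01, 0.65, 0.95))  # stronger (triple-drug-like) assumption
```

Even this crude sketch reproduces the qualitative finding: the gap between regimens widens at higher baseline prevalence and collapses when coverage is low, since coverage multiplies into the per-round reduction for both regimens.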

    Mathematical modelling of lymphatic filariasis elimination programmes in India: Required duration of mass drug administration and post-treatment level of infection indicators

    Background: India has made great progress towards the elimination of lymphatic filariasis. By 2015, most endemic districts had completed at least five annual rounds of mass drug administration (MDA). The next challenge is to determine when MDA can be stopped. We performed a simulation study with the individual-based model LYMFASIM to help clarify this. Methods: We used a model variant for Indian settings. We considered different hypotheses on the detectability of antigenaemia (Ag) in relation to underlying adult worm burden, choosing the most likely hypothesis by comparing the model-predicted association between community-level microfilaraemia (Mf) and antigenaemia (Ag) prevalence levels to observed data (collated from the literature). Next, we estimated how long MDA must be continued to achieve elimination in different transmission settings, and what Mf and Ag prevalence may still remain 1 year after the last required MDA round. The robustness of key outcomes was assessed in a sensitivity analysis. Results: Our model matched observed data qualitatively well when we assumed an Ag detection rate of 50% for single-worm infections, increasing with the number of adult worms (modelled by relating detection to the presence of female worms). The required duration of annual MDA increased with higher baseline endemicity and lower coverage (varying between 2 and 12 rounds), while the residual infection remaining 1 year after the last required treatment declined with transmission intensity. For low and high transmission settings, the median residual infection levels were 1.0% and 0.4% (Mf prevalence in the population aged 5 years and above) and 3.5% and 2.0% (Ag prevalence in 6–7-year-old children). Conclusion: To achieve elimination in high transmission settings, MDA must be continued longer and infection levels must be reduced to lower levels than in low-endemic communities. Although our simulations were for Indian settings, qualitatively similar patterns are expected in other areas. This should be taken into account in decision algorithms defining whether MDA can be interrupted. Transmission assessment surveys should ideally be targeted to communities with the highest pre-control transmission levels, to minimize the risk of programme failure
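
One plausible formalization of the detectability assumption described above, a 50% Ag detection rate for single-worm infections that rises with worm burden, is per-worm independent detection; this is a sketch of that reading, not LYMFASIM's actual implementation:

```python
# Sketch: if each adult worm is detected independently with
# probability 0.5, antigenaemia positivity for n worms is
# 1 - 0.5**n. This matches 50% detection for a single-worm
# infection and rises with worm burden, as the abstract describes.
def p_antigen_positive(n_worms, per_worm=0.5):
    return 1.0 - (1.0 - per_worm) ** n_worms

for n in (1, 2, 4):
    print(n, p_antigen_positive(n))   # 0.5, 0.75, 0.9375
```

Under this reading, Ag prevalence saturates well before Mf prevalence does, which is why the relationship between community-level Mf and Ag prevalence is informative for choosing between detectability hypotheses.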