123 research outputs found

    Taking Sharper Pictures of Malaria with CAMERAs: Combined Antibodies to Measure Exposure Recency Assays.

    Antibodies directed against malaria parasites are easy and inexpensive to measure but remain an underused surveillance tool because of a lack of consensus on what to measure and how to interpret results. High-throughput screening of antibodies from well-characterized cohorts offers a means to substantially improve existing assays by rationally choosing the most informative sets of responses and analytical methods. Recent data suggest that high-resolution information on malaria exposure can be obtained from a small number of samples by measuring a handful of properly chosen antibody responses. In this review, we discuss how standardized multi-antibody assays can be developed and efficiently integrated into existing surveillance activities, with potential to greatly augment the breadth and quality of information available to direct and monitor malaria control and elimination efforts.

    Targeting vaccinations for the licensed dengue vaccine: considerations for serosurvey design

    Background: The CYD-TDV vaccine was unusual in that the recommended target population for vaccination was originally defined not only by age, but also by transmission setting as defined by seroprevalence. WHO originally recommended that countries consider vaccination against dengue with the CYD-TDV vaccine only in geographic settings where prior infection with any dengue serotype, as measured by seroprevalence, was >70% in the target age group. The vaccine was not recommended in settings where seroprevalence was <50%. Test-and-vaccinate strategies suggested following new analysis by Sanofi will still require age-stratified seroprevalence surveys to optimise age-group targeting. Here we address considerations for serosurvey design in the context of vaccination program planning. Methods: To explore how the design of seroprevalence surveys affects estimates of transmission intensity, 100 age-specific seroprevalence surveys were simulated using a beta-binomial distribution and a simple catalytic model for different combinations of age range, survey size, transmission setting, and test sensitivity/specificity. We then used a Metropolis-Hastings Markov chain Monte Carlo algorithm to estimate the force of infection from each simulated dataset. Results: Sampling from a wide age range led to more accurate estimates than merely increasing sample size in a narrow age range. This finding was consistent across all transmission settings. The optimum test sensitivity and specificity, given an imperfect test, differed by setting, with high sensitivity being important in high-transmission settings and high specificity important in low-transmission settings. Conclusions: When assessing vaccination suitability by seroprevalence surveys, countries should ensure an appropriate age range is sampled, considering epidemiological evidence about the local burden of disease.
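The simulate-then-estimate loop described in the Methods can be sketched in a few lines. This is a simplified illustration, not the study's code: it uses plain binomial sampling (the study used a beta-binomial with an imperfect test), a maximum-likelihood fit in place of the Metropolis-Hastings MCMC, and illustrative parameter values (ages 1-17, 50 samples per age, a true force of infection of 0.05/year).

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import binom

rng = np.random.default_rng(0)

# Hypothetical survey design (illustrative values, not the study's)
ages = np.arange(1, 18)          # sample a wide age range, 1-17 years
n_per_age = 50                   # samples per single year of age
true_foi = 0.05                  # true force of infection, per year

# Simple catalytic model: seroprevalence at age a is 1 - exp(-lambda * a)
true_prev = 1 - np.exp(-true_foi * ages)

# Simulate seropositive counts (binomial here; the study used a
# beta-binomial to allow overdispersion, plus test error)
positives = rng.binomial(n_per_age, true_prev)

# Recover the force of infection by maximum likelihood
def neg_log_lik(lam):
    p = 1 - np.exp(-lam * ages)
    return -binom.logpmf(positives, n_per_age, p).sum()

fit = minimize_scalar(neg_log_lik, bounds=(1e-4, 1.0), method="bounded")
print(f"estimated FOI: {fit.x:.3f} (true: {true_foi})")
```

Repeating this for different age ranges and sample sizes is how one can reproduce the paper's central comparison: widening the age range tightens the estimate more than adding samples within a narrow range.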

    Addressing the unmet needs in patients with type 2 inflammatory diseases: when quality of life can make a difference

    Background: Patients with asthma (AS), atopic dermatitis (AD), allergic rhinitis (AR), eosinophilic esophagitis (EoE), chronic rhinosinusitis with nasal polyps (CRSwNP), chronic urticaria (CU), non-steroidal anti-inflammatory drug-exacerbated respiratory disease (N-ERD), and certain phenotypes of chronic obstructive pulmonary disease (COPD), among others, have a common underlying pathogenesis known as Type 2 inflammation (T2i). These diseases often coexist with other T2i conditions and have a substantial impact on the quality of life (QoL) of patients. However, limited data on patients' experiences, perspectives, and current management of T2i diseases have been published thus far. Aims: This survey, promoted by the patient-driven T2i Network Project, aimed to identify the common drivers and challenges related to the QoL of patients with T2i diseases by putting the patient's perspective at the forefront and including it in the design of new care strategies. Methodology: An anonymous online survey was carried out through convenience sampling between May and June 2023. The survey was co-designed by members of different patient associations, healthcare professionals, and healthcare quality experts, implemented using EUSurvey, and distributed through eight patient associations from Spain. The survey consisted of 29 questions covering the participant's sociodemographic features and a series of self-reported multiple-choice or rating-scale questions, including diagnosis, QoL measures, disease severity, healthcare resource utilization, and quality of care. Results: The survey included 404 participants, members of eight patient associations, the majority of whom had moderate-to-severe self-reported disease severity (93%) and one or more coexisting pathologies related to T2i (59%). Patients with more than one pathology reported a significantly greater impact on QoL than those with only one pathology (p < .001). Participants with self-reported severe symptoms reported significantly worse QoL than those with mild-to-moderate severity (p < .001). More than half of the patients (56%) felt constantly bothered by the unpredictability of their illness caused by potential exposure to known or unknown disease triggers. The lack of coordination between specialists and primary care was also expressed as an area of dissatisfaction, with 52% of participants indicating a complete lack of coordination and 21% indicating average coordination. Conclusion: This article reports the initial findings of a patient-led initiative, which highlights the common QoL challenges faced by individuals with Type 2 inflammation-related diseases and emphasizes the importance of further clinical research to improve the management of this patient group. Given the significant impact on QoL, a multidisciplinary approach integrated into new healthcare protocols has the potential to improve patient management and QoL, shorten the time to diagnosis, and reduce healthcare resource utilization.

    The long-term safety, public health impact, and cost-effectiveness of routine vaccination with a recombinant, live-attenuated dengue vaccine (Dengvaxia): a model comparison study

    Background: Large Phase III trials across Asia and Latin America have recently demonstrated the efficacy of a recombinant, live-attenuated dengue vaccine (Dengvaxia) over the first 25 months following vaccination. Subsequent data collected in the longer-term follow-up phase, however, have raised concerns about a potential increase in the hospitalization risk of subsequent dengue infections, in particular among young, dengue-naïve vaccinees. We here report predictions from eight independent modelling groups on the long-term safety, public health impact, and cost-effectiveness of routine vaccination with Dengvaxia in a range of transmission settings, as characterised by seroprevalence levels among 9-year-olds (SP9). These predictions were conducted for the World Health Organization to inform their recommendations on optimal use of this vaccine. Methods and Findings: The models adopted, with small variations, a parsimonious vaccine mode of action that was able to reproduce quantitative features of the observed trial data. The adopted mode of action assumed that vaccination, similarly to natural infection, induces transient, heterologous protection and, further, establishes a long-lasting immunogenic memory, which determines the disease severity of subsequent infections. The default vaccination policy considered was routine vaccination of 9-year-old children in a three-dose schedule at 80% coverage. The outcomes examined were the impact of vaccination on infections, symptomatic dengue, hospitalised dengue, deaths, and cost-effectiveness over a 30-year postvaccination period. Case definitions were chosen in accordance with the Phase III trials.
    All models predicted that in settings with moderate to high dengue endemicity (SP9 ≥ 50%), the default vaccination policy would reduce the burden of dengue disease for the population by 6%–25% (all simulations: –3%–34%) and in high-transmission settings (SP9 ≥ 70%) by 13%–25% (all simulations: 10%–34%). These endemicity levels are representative of the participating sites in both Phase III trials. In contrast, in settings with low transmission intensity (SP9 ≤ 30%), the models predicted that vaccination could lead to a substantial increase in hospitalisation because of dengue. Modelling reduced vaccine coverage or the addition of catch-up campaigns showed that the impact of vaccination scaled approximately linearly with the number of people vaccinated. In assessing the optimal age of vaccination, we found that targeting older children could increase the net benefit of vaccination in settings with moderate transmission intensity (SP9 = 50%). Overall, vaccination was predicted to be potentially cost-effective in most endemic settings if priced competitively. The results are based on the assumption that the vaccine acts similarly to natural infection. This assumption is consistent with the available trial results but cannot be directly validated in the absence of additional data. Furthermore, uncertainties remain regarding the level of protection provided against disease versus infection and the rate at which vaccine-induced protection declines. Conclusions: Dengvaxia has the potential to reduce the burden of dengue disease in areas of moderate to high dengue endemicity. However, the potential risks of vaccination in areas with limited exposure to dengue, as well as the local costs and benefits of routine vaccination, are important considerations for the inclusion of Dengvaxia into existing immunisation programmes.
    These results were important inputs into WHO global policy for use of this licensed dengue vaccine. SF and MJ received funding from WHO and Gavi, the Vaccine Alliance, to conduct this work. LC is a paid employee at Sanofi Pasteur. GM and JK were funded by the University of Western Australia, with computing resources provided by the Pawsey Supercomputing Centre, which is funded by the Australian Government and the Government of Western Australia. MR is funded by a Royal Society University Research Fellowship. NF, ID and DJL received research funding from the UK Medical Research Council, the UK NIHR under the Health Protection Research Unit initiative, NIGMS under the MIDAS initiative, and the Bill and Melinda Gates Foundation. IRB and DATC were funded by MIDAS Center Grant NIH/NIGMS U54-GM088491 and the Bill and Melinda Gates Foundation. DATC was also supported by NIH/NIAID R01-AI114703. TJH, IL, and CABP were funded by a Dengue Vaccine Initiative Grant to IL, NIH/NIAID R37 AI32042. TJH, IL, and KK were funded by MIDAS Center Grant NIH/NIGMS U54 GM111274. All other authors received no specific funding to conduct this work. The funders had no role in the study design, data analyses, decision to publish or preparation of the manuscript.
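For intuition about the SP9 thresholds above, a constant-force-of-infection catalytic model with four serotypes gives a closed-form link between the per-serotype force of infection λ and seroprevalence among 9-year-olds: SP9 = 1 − exp(−4λ·9). This is a deliberate simplification of the transmission models compared in the study, shown only to make the thresholds concrete.

```python
import math

def sp9(per_serotype_foi: float, n_serotypes: int = 4, age: float = 9.0) -> float:
    """Any-serotype seroprevalence at a given age under a constant-FOI
    catalytic model; a simplification, not the study's transmission models."""
    return 1 - math.exp(-n_serotypes * per_serotype_foi * age)

def foi_for_sp9(target_sp9: float, n_serotypes: int = 4, age: float = 9.0) -> float:
    """Invert the relation: per-serotype FOI that yields a given SP9."""
    return -math.log(1 - target_sp9) / (n_serotypes * age)

# Moderate- and high-endemicity thresholds expressed as annual
# per-serotype infection risks under this simple model
print(round(foi_for_sp9(0.5), 4))  # SP9 = 50%
print(round(foi_for_sp9(0.7), 4))  # SP9 = 70%
```

Under this sketch, the SP9 = 50% moderate-endemicity threshold corresponds to roughly a 1.9% annual per-serotype risk of infection, and SP9 = 70% to roughly 3.3%.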

    Variability in dengue titer estimates from plaque reduction neutralization tests poses a challenge to epidemiological studies and vaccine development.

    BACKGROUND: Accurate determination of neutralization antibody titers supports epidemiological studies of dengue virus transmission and vaccine trials. Neutralization titers measured using the plaque reduction neutralization test (PRNT) are believed to provide a key measure of immunity to dengue viruses; however, the assay's variability is poorly understood, making it difficult to interpret the significance of any assay reading. In addition, there is limited standardization of the neutralization evaluation point or statistical model used to estimate titers across laboratories, with little understanding of the optimum approach. METHODOLOGY/PRINCIPAL FINDINGS: We used repeated assays on the same two pools of serum using five different viruses (2,319 assays) to characterize the variability in the technique under identical experimental conditions. We also assessed the performance of multiple statistical models to interpolate continuous values of neutralization titer from discrete measurements of serial dilutions. We found that the variance in plaque reductions for individual dilutions was 0.016, equivalent to a 95% confidence interval of 0.45-0.95 for an observed plaque reduction of 0.7. We identified PRNT75 as the optimum evaluation point, with a variance of 0.025 (log10 scale), indicating that a titer reading of 1:500 had a 95% confidence interval of 1:240-1:1000 (2.70±0.31 on a log10 scale). The choice of statistical model was not important for the calculation of relative titers; however, cloglog regression out-performed alternatives where absolute titers are of interest. Finally, we estimated that only 0.7% of assays would falsely detect a four-fold difference in titers between acute and convalescent sera where no true difference exists. CONCLUSIONS: Estimating and reporting assay uncertainty will aid the interpretation of individual titers. Laboratories should perform a small number of repeat assays to generate their own variability estimates. These could be used to calculate confidence intervals for all reported titers and allow benchmarking of assay performance.
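The reported log10-scale variance makes it straightforward to attach a confidence interval to any reported titer, as the conclusions suggest. A minimal sketch, assuming the study's PRNT75 variance of 0.025 and a normal error model on the log10 scale:

```python
import math

def titer_ci(titer: float, log10_variance: float = 0.025, z: float = 1.96):
    """95% CI for a reciprocal neutralization titer, given the assay's
    variance on the log10 scale (0.025 for PRNT75, per the study)."""
    log_t = math.log10(titer)
    half = z * math.sqrt(log10_variance)  # half-width on the log10 scale
    return 10 ** (log_t - half), 10 ** (log_t + half)

lo, hi = titer_ci(500)
# About 1:245 to 1:1021, consistent with the paper's rounded 1:240-1:1000
print(f"1:{lo:.0f} - 1:{hi:.0f}")
```

A laboratory substituting its own repeat-assay variance for the 0.025 default would obtain intervals benchmarked to its own performance, which is the workflow the authors propose.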

    Thalamic haemorrhage vs internal capsule-basal ganglia haemorrhage: clinical profile and predictors of in-hospital mortality

    Background: There is a paucity of clinical studies focused specifically on intracerebral haemorrhages of subcortical topography, a subject of interest to clinicians involved in stroke management. This single-centre, retrospective study was conducted with the following objectives: a) to describe the aetiological, clinical and prognostic characteristics of patients with thalamic haemorrhage as compared with those of patients with internal capsule-basal ganglia haemorrhage, and b) to identify predictors of in-hospital mortality in patients with thalamic haemorrhage. Methods: Forty-seven patients with thalamic haemorrhage were included in the "Sagrat Cor Hospital of Barcelona Stroke Registry" over a period of 17 years. Data from stroke patients are entered in the stroke registry following a standardized protocol with 161 items regarding demographics, risk factors, clinical features, laboratory and neuroimaging data, complications and outcome. The region of the intracranial haemorrhage was identified on computerized tomographic (CT) scans and/or magnetic resonance imaging (MRI) of the brain. Results: Thalamic haemorrhage accounted for 1.4% of all cases of stroke (n = 3420) and 13% of intracerebral haemorrhages (n = 364). Hypertension (53.2%), vascular malformations (6.4%), haematological conditions (4.3%) and anticoagulation (2.1%) were the main causes of thalamic haemorrhage. In-hospital mortality was 19% (n = 9). Sensory deficit, speech disturbances and lacunar syndrome were significantly associated with thalamic haemorrhage, whereas altered consciousness (odds ratio [OR] = 39.56), intraventricular involvement (OR = 24.74) and age (OR = 1.23) were independent predictors of in-hospital mortality. Conclusion: One in eight patients with acute intracerebral haemorrhage had a thalamic haematoma. Altered consciousness, intraventricular extension of the haematoma and advanced age were determinants of a poor early outcome.

    Risk profiles and one-year outcomes of patients with newly diagnosed atrial fibrillation in India: Insights from the GARFIELD-AF Registry.

    BACKGROUND: The Global Anticoagulant Registry in the FIELD-Atrial Fibrillation (GARFIELD-AF) is an ongoing prospective noninterventional registry, which is providing important information on the baseline characteristics, treatment patterns, and 1-year outcomes in patients with newly diagnosed non-valvular atrial fibrillation (NVAF). This report describes data from Indian patients recruited in this registry. METHODS AND RESULTS: A total of 52,014 patients with newly diagnosed AF were enrolled globally; of these, 1388 patients were recruited from 26 sites within India (2012-2016). In India, the mean age was 65.8 years at diagnosis of NVAF. Hypertension was the most prevalent risk factor for AF, present in 68.5% of patients from India and in 76.3% of patients globally (P < 0.001). Diabetes and coronary artery disease (CAD) were prevalent in 36.2% and 28.1% of patients as compared with global prevalence of 22.2% and 21.6%, respectively (P < 0.001 for both). Antiplatelet therapy was the most common antithrombotic treatment in India. With increasing stroke risk, however, patients were more likely to receive oral anticoagulant therapy [mainly vitamin K antagonist (VKA)], but average international normalized ratio (INR) was lower among Indian patients [median INR value 1.6 (interquartile range {IQR}: 1.3-2.3) versus 2.3 (IQR 1.8-2.8) (P < 0.001)]. Compared with other countries, patients from India had markedly higher rates of all-cause mortality [7.68 per 100 person-years (95% confidence interval 6.32-9.35) vs 4.34 (4.16-4.53), P < 0.0001], while rates of stroke/systemic embolism and major bleeding were lower after 1 year of follow-up. CONCLUSION: Compared to previously published registries from India, the GARFIELD-AF registry describes clinical profiles and outcomes in Indian patients with AF of a different etiology. The registry data show that compared to the rest of the world, Indian AF patients are younger in age and have more diabetes and CAD. 
Patients with a higher stroke risk are more likely to receive anticoagulation therapy with VKA but are underdosed compared with the global average in the GARFIELD-AF. CLINICAL TRIAL REGISTRATION-URL: http://www.clinicaltrials.gov. Unique identifier: NCT01090362