
    Seroprevalence of Hepatitis B, C and D in Vietnam: A systematic review and meta-analysis.

    Background: Vietnam has one of the greatest disease burdens from chronic viral hepatitis. Comprehensive prevalence data are essential to support its elimination as a public health threat. Methods: We searched Medline and Embase from 1990 to 2021 for seroprevalence data relating to Hepatitis B (HBV), C (HCV) and D (HDV) in Vietnam. We estimated pooled prevalence with a DerSimonian-Laird random-effects model and stratified study populations into i) low-risk, ii) high-risk exposure and iii) liver disease. We further estimated prevalence by decade and region, and rates of HIV coinfection. Findings: We analysed 72 studies, including 120 HBV, 114 HCV and 23 HDV study populations. Pooled HBV prevalence was low in blood donors (1.86% [1.82-1.90]) but high in antenatal populations (10.8% [10.1-11.6]) and adults in the general population (10.5% [10.0-11.0]). It was similar or modestly increased in groups at highest risk of exposure, suggesting the epidemic is largely driven by chronic infections acquired in childhood. HCV pooled prevalence in the general population was lower than historical estimates: 0.26% (0.09-0.51) have active infection defined by detectable antigen or HCV RNA. In contrast, there is an extremely high prevalence of active HCV infection in people who inject drugs (PWID) (57.8% [56.5-59.1]), which has persisted through the decades despite harm-reduction interventions. HDV appears mainly confined to high-risk groups. Interpretation: Blood safety has improved, but renewed focus on HBV vaccination at birth and targeted HCV screening and treatment of PWID are urgently required to meet elimination targets. Large cross-sectional studies are needed to better characterize HDV prevalence, but mass screening may not be warranted. Funding: This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors.
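The DerSimonian-Laird random-effects pooling described above can be illustrated with a short sketch. This is not the authors' code: the moment-estimator implementation and the study proportions below are assumptions made purely for illustration.

```python
# Minimal sketch of DerSimonian-Laird random-effects pooling of
# prevalence estimates. All study data below are hypothetical.

def dersimonian_laird(estimates, variances):
    """Return (pooled_estimate, tau2) under the DL moment estimator."""
    k = len(estimates)
    w = [1.0 / v for v in variances]                    # fixed-effect weights
    fixed = sum(wi * p for wi, p in zip(w, estimates)) / sum(w)
    q = sum(wi * (p - fixed) ** 2 for wi, p in zip(w, estimates))  # Cochran's Q
    denom = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / denom)              # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]        # random-effects weights
    pooled = sum(wi * p for wi, p in zip(w_re, estimates)) / sum(w_re)
    return pooled, tau2

# Hypothetical general-population prevalence proportions and sample sizes:
prev = [0.105, 0.098, 0.112, 0.089]
n = [500, 800, 350, 1200]
var = [p * (1 - p) / ni for p, ni in zip(prev, n)]
pooled, tau2 = dersimonian_laird(prev, var)
```

As heterogeneity grows, tau2 grows and the random-effects weights flatten towards equal weighting of studies, which is why random-effects pooled estimates can differ noticeably from fixed-effect ones.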

    Intrathecal Immunoglobulin for treatment of adult patients with tetanus: A randomized controlled 2x2 factorial trial

    Despite long-standing availability of an effective vaccine, tetanus remains a significant problem in many countries. Outcome depends on access to mechanical ventilation and intensive care facilities, and in settings where these are limited, mortality remains high. Administration of tetanus antitoxin by the intramuscular route is the recommended treatment for tetanus, but as the tetanus toxin acts within the central nervous system, it has been suggested that intrathecal administration of antitoxin may be beneficial. Previous studies have indicated benefit, but with the exception of one small trial no blinded studies have been performed. The objective of this study is to establish whether the addition of intrathecal tetanus antitoxin reduces the need for mechanical ventilation in patients with tetanus. Secondary objectives: to determine whether the addition of intrathecal tetanus antitoxin reduces autonomic nervous system dysfunction and length of hospital/intensive care unit stay; whether the addition of intrathecal tetanus antitoxin in the treatment of tetanus is safe and cost-effective; to provide data to inform recommendation of human rather than equine antitoxin. This study will enroll adult patients (≥16 years old) with tetanus admitted to the Hospital for Tropical Diseases, Ho Chi Minh City. The study is a 2x2 factorial blinded randomized controlled trial. Eligible patients will be randomized in a 1:1:1:1 manner to the four treatment arms (intrathecal treatment and human intramuscular treatment, intrathecal treatment and equine intramuscular treatment, sham procedure and human intramuscular treatment, sham procedure and equine intramuscular treatment). The primary outcome measure will be requirement for mechanical ventilation.
Secondary outcome measures: duration of hospital/intensive care unit stay, duration of mechanical ventilation, in-hospital and 240-day mortality and disability, new antibiotic prescription, incidence of ventilator-associated pneumonia and autonomic nervous system dysfunction, total dose of benzodiazepines and pipecuronium, and incidence of adverse events. Trial registration: ClinicalTrials.gov NCT02999815. Registration date: 21 December 2016.

    Transmission Selects for HIV-1 Strains of Intermediate Virulence: A Modelling Approach

    Recent data show that HIV-1 is characterised by variation in viral virulence factors that is heritable between infections, which suggests that viral virulence can be naturally selected at the population level. A trade-off between transmissibility and duration of infection appears to favour viruses of intermediate virulence. We developed a mathematical model to simulate the dynamics of putative viral genotypes that differ in their virulence. As a proxy for virulence, we use set-point viral load (SPVL), the steady concentration of viral particles in blood during asymptomatic infection. Mutation, the dependency of survival and transmissibility on SPVL, and host effects were incorporated into the model. The model was fitted to data to estimate unknown parameters and was found to fit existing data well. The maximum likelihood estimates of the parameters produced a model in which SPVL converged from any initial conditions to observed values within 100–150 years of the first emergence of HIV-1. We estimated 1) the host effect and 2) the extent to which the viral virulence genotype mutates from one infection to the next, and found a trade-off between these two parameters in explaining the variation in SPVL. The model confirms that evolution of virulence towards intermediate levels is sufficiently rapid for it to have happened in the early stages of the HIV epidemic, and confirms that existing viral loads are nearly optimal given the assumed constraints on evolution. The model provides a useful framework under which to examine the future evolution of HIV-1 virulence.

    Mild Cognitive Impairment as a risk factor for Parkinson's disease dementia

    Background: The International Parkinson and Movement Disorder Society criteria for mild cognitive impairment in Parkinson's disease were recently formulated. Objectives: The aim of this international study was to evaluate the predictive validity of the comprehensive (level II) version of these criteria by assessment of their contribution to the hazard of Parkinson's disease dementia. Methods: Individual patient data were selected from four separate studies on cognition in Parkinson's disease that provided information on demographics, motor examination, depression, neuropsychological examination suitable for application of level II criteria, and longitudinal follow-up for conversion to dementia. Survival analysis evaluated the predictive value of level II criteria for cognitive decline towards dementia, as expressed by the relative hazard of dementia. Results: A total of 467 patients were included. The analyses showed a clear contribution of impairment according to level II mild cognitive impairment criteria, age and severity of Parkinson's disease motor symptoms to the hazard of dementia. There was a trend of increasing hazard of dementia with declining neuropsychological performance. Conclusions: This is the first large international study evaluating the predictive validity of level II mild cognitive impairment criteria for Parkinson's disease. The results showed a clear and unique contribution of classification according to level II criteria to the hazard of Parkinson's disease dementia. This finding supports their predictive validity and shows that they contribute important new information on the hazard of dementia, beyond known demographic and Parkinson's disease-specific factors of influence. Funding: Michael J. Fox Foundation; Dutch Parkinson Foundation.

    Abundance of Early Functional HIV-Specific CD8+ T Cells Does Not Predict AIDS-Free Survival Time

    Background: T-cell immunity is thought to play an important role in controlling HIV infection, and is a main target for HIV vaccine development. HIV-specific central memory CD8+ and CD4+ T cells producing IFNγ and IL-2 have been associated with control of viremia and are therefore hypothesized to be truly protective and to determine subsequent clinical outcome. However, the cause-effect relationship between HIV-specific cellular immunity and disease progression is unknown. We investigated, in a large prospective cohort study involving 96 individuals of the Amsterdam Cohort Studies with a known date of seroconversion, whether the presence of cytokine-producing HIV-specific CD8+ T cells early in infection was associated with AIDS-free survival time. Methods and Findings: The number and percentage of IFNγ- and IL-2-producing CD8+ T cells were measured after in vitro stimulation with an overlapping Gag-peptide pool in T cells sampled approximately one year after seroconversion. Kaplan-Meier survival analysis and Cox proportional hazard models showed that frequencies of cytokine-producing Gag-specific CD8+ T cells (IFNγ, IL-2 or both) shortly after seroconversion were neither associated with time to AIDS nor with the rate of CD4+ T-cell decline. Conclusions: These data show that high numbers of functional HIV-specific CD8+ T cells can be found early in HIV infection, irrespective of subsequent clinical outcome. The fact that both progressors and long-term non-progressors have abundant T cell immunity of the specificity associated with low viral load shortly after seroconversion suggests that the more rapid loss of T cell immunity observed in progressors may be a consequence rather than a cause of disease progression.
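The Kaplan-Meier analysis referred to above can be sketched in a few lines. This is an illustrative product-limit estimator on made-up follow-up data, not the cohort's actual analysis code.

```python
# Product-limit (Kaplan-Meier) survival estimate. The follow-up
# times and event indicators below are hypothetical.

def kaplan_meier(times, events):
    """Return [(t, S(t))] at each event time; events: 1 = AIDS, 0 = censored."""
    data = sorted(zip(times, events))
    n = len(data)
    at_risk = n
    s = 1.0
    curve = []
    i = 0
    while i < n:
        t = data[i][0]
        d = 0          # events at time t
        removed = 0    # events + censorings at time t
        while i < n and data[i][0] == t:
            d += data[i][1]
            removed += 1
            i += 1
        if d > 0:
            s *= 1.0 - d / at_risk      # multiply by conditional survival
            curve.append((t, s))
        at_risk -= removed              # leave risk set after time t
    return curve

times = [1, 2, 3, 4]     # years from seroconversion (hypothetical)
events = [1, 0, 1, 1]    # 1 = progressed to AIDS, 0 = censored
curve = kaplan_meier(times, events)
```

Comparing such curves between groups (e.g. high vs low cytokine-producing T-cell frequencies) is what the log-rank test and the Cox model formalize.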

    Combination of inflammatory and vascular markers in the febrile phase of dengue is associated with more severe outcomes

    Background: Early identification of severe dengue patients is important for patient management and resource allocation. We investigated the association of 10 biomarkers (VCAM-1, SDC-1, Ang-2, IL-8, IP-10, IL-1RA, sCD163, sTREM-1, ferritin, CRP) with the development of severe/moderate dengue (S/MD). Methods: We performed a nested case-control study from a multi-country study. A total of 281 S/MD and 556 uncomplicated dengue cases were included. Results: On days 1–3 from symptom onset, higher levels of any biomarker increased the risk of developing S/MD. When assessed together, SDC-1 and IL-1RA were stable, while IP-10 changed the association from positive to negative; others showed weaker associations. The best combinations associated with S/MD comprised IL-1RA, Ang-2, IL-8, ferritin, IP-10, and SDC-1 for children, and SDC-1, IL-8, ferritin, sTREM-1, IL-1RA, IP-10, and sCD163 for adults. Conclusions: Our findings assist the development of biomarker panels for clinical use and could improve triage and risk prediction in dengue patients. Funding: This study was supported by the EU's Seventh Framework Programme (FP7-281803 IDAMS), the WHO, and the Bill and Melinda Gates Foundation.

    How to handle mortality when investigating length of hospital stay and time to clinical stability

    Background: Hospital length of stay (LOS) and time for a patient to reach clinical stability (TCS) have increasingly become important outcomes when investigating ways in which to combat Community Acquired Pneumonia (CAP). Difficulties arise when deciding how to handle in-hospital mortality. Ad-hoc approaches that are commonly used to handle time-to-event outcomes with mortality can give disparate results and provide conflicting conclusions based on the same data. To ensure compatibility among studies investigating these outcomes, this type of data should be handled in a consistent and appropriate fashion. Methods: Using both simulated data and data from the international Community Acquired Pneumonia Organization (CAPO) database, we evaluate two ad-hoc approaches for handling mortality when estimating the probability of hospital discharge and clinical stability: 1) restricting analysis to those patients who lived, and 2) assigning individuals who die the "worst" outcome (right-censoring them at the longest recorded LOS or TCS). Estimated probability distributions based on these approaches are compared with right-censoring the individuals who died at time of death (the complement of the Kaplan-Meier (KM) estimator), and treating death as a competing risk (the cumulative incidence estimator). Tests for differences in probability distributions based on the four methods are also contrasted. Results: The two ad-hoc approaches give different estimates of the probability of discharge and clinical stability. Analysis restricted to patients who survived is conceptually problematic, as estimation is conditioned on events that happen at a future time.
Estimation based on assigning those patients who died the worst outcome (longest LOS and TCS) coincides with the complement of the KM estimator based on the subdistribution hazard, which has been previously shown to be equivalent to the cumulative incidence estimator. However, in either case the time to in-hospital mortality is ignored, preventing simultaneous assessment of patient mortality in addition to LOS and/or TCS. The power to detect differences in underlying hazards of discharge between patient populations differs for test statistics based on the four approaches, and depends on the underlying hazard ratio of mortality between the patient groups. Conclusions: Treating death as a competing risk gives estimators which address the clinical questions of interest, and allows for simultaneous modelling of both in-hospital mortality and TCS/LOS. This article advocates treating mortality as a competing risk when investigating other time-related outcomes.
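The competing-risks estimator the abstract advocates can be sketched as follows. This is an illustrative Aalen-Johansen-style cumulative incidence calculation on made-up data, not the CAPO analysis itself; the cause codes and variable names are assumptions.

```python
# Cumulative incidence of one event type (e.g. discharge) in the
# presence of a competing risk (in-hospital death). Data are hypothetical.

def cumulative_incidence(times, causes, cause_of_interest):
    """Return [(t, CIF(t))]; causes: 0 = censored, other ints = event types."""
    data = sorted(zip(times, causes))
    n = len(data)
    at_risk = n
    surv = 1.0          # overall event-free survival just before t
    cif = 0.0
    curve = []
    i = 0
    while i < n:
        t = data[i][0]
        counts = {}
        while i < n and data[i][0] == t:
            counts[data[i][1]] = counts.get(data[i][1], 0) + 1
            i += 1
        d_all = sum(v for c, v in counts.items() if c != 0)
        d_cause = counts.get(cause_of_interest, 0)
        if d_cause > 0:
            cif += surv * d_cause / at_risk     # increment before updating surv
            curve.append((t, cif))
        if d_all > 0:
            surv *= 1.0 - d_all / at_risk       # any event removes from risk set
        at_risk -= sum(counts.values())
    return curve

los = [1, 2, 3, 4]       # days in hospital (hypothetical)
status = [1, 2, 1, 0]    # 1 = discharged, 2 = died in hospital, 0 = censored
discharge_cif = cumulative_incidence(los, status, 1)
death_cif = cumulative_incidence(los, status, 2)
```

Unlike censoring deaths at the longest observed stay, this estimator never lets the probability of discharge approach 1 when some patients die: the discharge and death curves, together with event-free survival, sum to 1 at every time point.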

    Viral Load Levels Measured at Set-Point Have Risen Over the Last Decade of the HIV Epidemic in the Netherlands

    HIV-1 RNA plasma concentration at viral set-point is associated not only with disease outcome but also with the transmission dynamics of HIV-1. We investigated whether plasma HIV-1 RNA concentration and CD4 cell count at viral set-point have changed over time in the HIV epidemic in the Netherlands. We selected 906 therapy-naïve patients with at least one plasma HIV-1 RNA concentration measured 9 to 27 months after estimated seroconversion. Changes in HIV-1 RNA and CD4 cell count at viral set-point over time were analysed using linear regression models. The ATHENA national observational cohort contributed all patients who seroconverted in or after 1996; the Amsterdam Cohort Studies (ACS) contributed seroconverters before 1996. The mean of the first HIV-1 RNA concentration measured 9-27 months after seroconversion was 4.30 log10 copies/ml (95% CI 4.17-4.42) for seroconverters from 1984 through 1995 (n = 163); 4.27 (4.16-4.37) for seroconverters 1996-2002 (n = 232), and 4.59 (4.52-4.66) for seroconverters 2003-2007 (n = 511). Compared to patients seroconverting between 2003-2007, the adjusted mean HIV-1 RNA concentration at set-point was 0.28 log10 copies/ml (95% CI 0.16-0.40; p<0.0001) and 0.26 (0.11-0.41; p = 0.0006) lower for those seroconverting between 1996-2002 and 1984-1995, respectively. Results were robust regardless of type of HIV-1 RNA assay, HIV-1 subtype, and interval between measurement and seroconversion. CD4 cell count at viral set-point declined over calendar time at approximately 5 cells/mm³/year. The HIV-1 RNA plasma concentration at viral set-point has increased over the last decade of the HIV epidemic in the Netherlands. This is accompanied by a decreasing CD4 cell count over the period 1984-2007 and may have implications for both the course of the HIV infection and the epidemic.
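The linear-regression trend analysis can be sketched with a one-covariate least-squares fit. The data below are synthetic, constructed only to mirror the reported ~5 cells/mm³/year CD4 decline, and are not actual cohort measurements.

```python
# Minimal one-covariate least-squares fit (calendar year -> CD4 count).
# Synthetic, noise-free data built to show an exact -5 cells/year slope.

def ols_fit(x, y):
    """Return (slope, intercept) minimising squared error of y ~ slope*x + intercept."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

years = list(range(1984, 2008))
cd4 = [600 - 5 * (yr - 1984) for yr in years]   # synthetic set-point CD4 counts
slope, intercept = ols_fit(years, cd4)
```

In the actual analysis the regression would be fitted to individual patient measurements with adjustment covariates, but the slope has the same interpretation: the average change in set-point CD4 count per calendar year.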

    Late cardiac events after childhood cancer: Methodological aspects of the pan-european study pancaresurfup

    Background and Aim: Childhood cancer survivors are at high risk of long-term adverse effects of cancer and its treatment, including cardiac events. The pan-European PanCareSurFup study determined the incidence and risk factors for cardiac events among childhood cancer survivors. The aim of this article is to describe the methodology of the cardiac cohort and nested case-control study within PanCareSurFup. Methods: Eight data providers in Europe participating in PanCareSurFup identified and validated symptomatic cardiac events in their cohorts of childhood cancer survivors. Data on symptomatic heart failure, ischemia, pericarditis, valvular disease and arrhythmia were collected and graded according to the Criteria for Adverse Events. Detailed treatment data, data on potential confounders, lifestyle-related risk factors and general health problems were collected. Results: The PanCareSurFup cardiac cohort consisted of 59,915 5-year childhood cancer survivors with malignancies diagnosed between 1940 and 2009 and classified according to the International Classification of Childhood Cancer 3. Different strategies were used to identify cardiac events, such as record linkage to population-, hospital- or regional-based databases, and patient- and general practitioner-based questionnaires. Conclusion: The cardiac study of the European collaborative research project PanCareSurFup will provide the largest cohort of 5-year childhood cancer survivors with systematically ascertained and validated data on symptomatic cardiac events. The results of this study can provide information to minimize the burden of cardiac events in childhood cancer survivors by tailoring the follow-up of those at high risk of cardiac adverse events, transferring this knowledge into evidence-based clinical practice guidelines, and providing a platform for future research studies in childhood cancer patients. © 2016 Feijen et al.
This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited