100 research outputs found

    SARS-CoV-2 immunity and vaccine strategies in people with HIV

    Current SARS-CoV-2 vaccines, based on the ancestral Wuhan strain, were developed rapidly to meet the needs of a devastating global pandemic. People living with HIV (PLWH) have been designated as a priority group for SARS-CoV-2 vaccination in most regions, and varying primary courses (2- or 3-dose schedules) and additional boosters are recommended depending on current CD4+ T cell count and/or detectable HIV viraemia. From the current published data, licensed vaccines are safe for PLWH and stimulate robust responses to vaccination in those well controlled on antiretroviral therapy and with high CD4+ T cell counts. Data on vaccine efficacy and immunogenicity remain, however, scarce in PLWH, especially in people with advanced disease. A greater concern is a potentially diminished immune response to the primary course and subsequent boosters, as well as an attenuated magnitude and durability of protective immune responses. A detailed understanding of the breadth and durability of humoral and T cell responses to vaccination, and of the boosting effects of natural immunity to SARS-CoV-2, in more diverse populations of PLWH with a spectrum of HIV-related immunosuppression is therefore critical. This article summarises focused studies of humoral and cellular responses to SARS-CoV-2 infection in PLWH and provides a comprehensive review of the emerging literature on SARS-CoV-2 vaccine responses. Emphasis is placed on the potential effect of HIV-related factors and the presence of co-morbidities in modulating responses to SARS-CoV-2 vaccination, and on the remaining challenges in defining the optimal vaccination strategy to elicit enduring responses against existing and emerging variants in PLWH.
    Lay Abstract: People living with human immunodeficiency virus (PLWH) appear to be at a higher risk (approximately 15%) of becoming more seriously unwell if they are infected with severe acute respiratory syndrome coronavirus-2 (SARS-CoV-2), the virus that causes COVID-19 disease, and at least twice as likely to die from COVID-19 as the rest of the population. SARS-CoV-2 vaccination and boosters are recommended for all PLWH. However, there is limited information about the protective immune responses to both vaccination and actual infection, the protection against serious COVID-19 disease, and whether the safety profile of the vaccines, which are very safe in the general population, differs in PLWH. Here we summarise findings from studies which looked specifically at vaccine-related immune responses in PLWH, and discuss factors, such as age, that are known to impact negatively on immune responses in the general population, to see whether their effect is worse in PLWH. A better understanding of these issues will help guide tailored vaccination and prevention strategies for PLWH.

    The association between serum biomarkers and disease outcome in influenza A(H1N1)pdm09 virus infection: results of two international observational cohort studies

    BACKGROUND Prospective studies establishing the temporal relationship between the degree of inflammation and human influenza disease progression are scarce. To assess predictors of disease progression among patients with influenza A(H1N1)pdm09 infection, 25 inflammatory biomarkers measured at enrollment were analyzed in two international observational cohort studies. METHODS Among patients with RT-PCR-confirmed influenza A(H1N1)pdm09 virus infection, odds ratios (ORs) estimated by logistic regression were used to summarize the associations of biomarkers measured at enrollment with worsened disease outcome or death after 14 days of follow-up for those seeking outpatient care (FLU 002) or after 60 days for those hospitalized with influenza complications (FLU 003). Biomarkers that were significantly associated with progression in both studies (p<0.05) or only in one (p<0.002 after Bonferroni correction) were identified. RESULTS In FLU 002, 28/528 (5.3%) outpatients had influenza A(H1N1)pdm09 virus infection that progressed to a study endpoint of complications, hospitalization, or death, whereas in FLU 003, 28/170 (16.5%) inpatients enrolled from the general ward and 21/39 (53.8%) inpatients enrolled directly from the ICU experienced disease progression. Higher levels of 12 of the 25 markers were significantly associated with subsequent disease progression. Of these, seven markers (IL-6, CD163, IL-10, LBP, IL-2, MCP-1, and IP-10), all with ORs for the 3rd versus 1st tertile of 2.5 or greater, were significant (p<0.05) in both outpatients and inpatients. In contrast, five markers (sICAM-1, IL-8, TNF-α, D-dimer, and sVCAM-1), all with ORs for the 3rd versus 1st tertile greater than 3.2, were significantly (p≤0.002) associated with disease progression among hospitalized patients only. CONCLUSIONS In patients presenting with varying severities of influenza A(H1N1)pdm09 virus infection, a baseline elevation in several biomarkers associated with inflammation, coagulation, or immune function strongly predicted a higher risk of disease progression. It is conceivable that interventions designed to abrogate these baseline elevations might affect disease outcome.
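
    As a rough illustration of the tertile-based odds ratios reported above, the sketch below computes an OR (with a 95% Wald interval) for progression in the 3rd versus 1st tertile of a baseline biomarker. The data, variable names, and counts are hypothetical, not taken from FLU 002/003.

    import numpy as np

    def tertile_odds_ratio(levels, progressed):
        """Odds ratio (with 95% Wald CI) for disease progression,
        comparing the 3rd vs the 1st tertile of a baseline biomarker."""
        levels = np.asarray(levels, dtype=float)
        progressed = np.asarray(progressed, dtype=bool)
        lo, hi = np.quantile(levels, [1 / 3, 2 / 3])
        t1 = levels <= lo          # 1st (reference) tertile
        t3 = levels > hi           # 3rd tertile
        # 2x2 table: progressed yes/no in 3rd vs 1st tertile
        a, b = progressed[t3].sum(), (~progressed[t3]).sum()
        c, d = progressed[t1].sum(), (~progressed[t1]).sum()
        odds_ratio = (a * d) / (b * c)
        se = np.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
        ci = np.exp(np.log(odds_ratio) + np.array([-1.96, 1.96]) * se)
        return odds_ratio, ci

    # Hypothetical example: 300 patients, higher IL-6 linked to progression.
    rng = np.random.default_rng(0)
    il6 = rng.lognormal(mean=1.0, sigma=0.8, size=300)
    prog = rng.random(300) < 0.05 + 0.15 * (il6 > np.quantile(il6, 2 / 3))
    print(tertile_odds_ratio(il6, prog))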

    Ecological and Genomic Attributes of Novel Bacterial Taxa That Thrive in Subsurface Soil Horizons.

    While most bacterial and archaeal taxa living in surface soils remain undescribed, this problem is exacerbated in deeper soils, owing to the unique oligotrophic conditions found in the subsurface. Additionally, previous studies of soil microbiomes have focused almost exclusively on surface soils, even though the microbes living in deeper soils also play critical roles in a wide range of biogeochemical processes. We examined soils collected from 20 distinct profiles across the United States to characterize the bacterial and archaeal communities that live in subsurface soils and to determine whether there are consistent changes in soil microbial communities with depth across a wide range of soil and environmental conditions. We found that bacterial and archaeal diversity generally decreased with depth, as did the degree of similarity of microbial communities to those found in surface horizons. We observed five phyla that consistently increased in relative abundance with depth across our soil profiles: Chloroflexi, Nitrospirae, Euryarchaeota, and candidate phyla GAL15 and Dormibacteraeota (formerly AD3). Leveraging the unusually high abundance of Dormibacteraeota at depth, we assembled genomes representative of this candidate phylum and identified traits that are likely to be beneficial in low-nutrient environments, including the synthesis and storage of carbohydrates, the potential to use carbon monoxide (CO) as a supplemental energy source, and the ability to form spores. Together these attributes likely allow members of the candidate phylum Dormibacteraeota to flourish in deeper soils and provide insight into the survival and growth strategies employed by the microbes that thrive in oligotrophic soil environments.
    IMPORTANCE: Soil profiles are rarely homogeneous. Resource availability and microbial abundances typically decrease with soil depth, but microbes found in deeper horizons are still important components of terrestrial ecosystems. By studying 20 soil profiles across the United States, we documented consistent changes in soil bacterial and archaeal communities with depth. Deeper soils harbored communities distinct from those of the more commonly studied surface horizons. Most notably, we found that the candidate phylum Dormibacteraeota (formerly AD3) was often dominant in subsurface soils, and we used genomes from uncultivated members of this group to identify why these taxa are able to thrive in such resource-limited environments. Simply digging deeper into soil can reveal a surprising number of novel microbes with unique adaptations to oligotrophic subsurface conditions.
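
    To make the depth trends concrete, here is a minimal sketch of the two summary statistics described above: Shannon diversity per sample and Bray-Curtis similarity to the surface horizon. The taxon-count table and depth labels are hypothetical, not data from the 20 profiles.

    import numpy as np
    from scipy.spatial.distance import braycurtis

    def shannon(counts):
        """Shannon diversity (natural log) from a vector of taxon counts."""
        p = np.asarray(counts, dtype=float)
        p = p[p > 0] / p.sum()
        return -(p * np.log(p)).sum()

    # Hypothetical taxon-count table: rows are soil horizons, columns are taxa.
    profile = {
        "0-10 cm":    np.array([40, 30, 15, 10, 5, 0]),
        "50-60 cm":   np.array([10, 12, 8, 30, 25, 15]),
        "100-110 cm": np.array([2, 3, 1, 20, 34, 40]),
    }

    surface = profile["0-10 cm"] / profile["0-10 cm"].sum()
    for depth, counts in profile.items():
        rel = counts / counts.sum()
        sim = 1 - braycurtis(surface, rel)  # similarity = 1 - dissimilarity
        print(f"{depth}: Shannon = {shannon(counts):.2f}, similarity to surface = {sim:.2f}")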

    Markers of Inflammation, Coagulation, and Renal Function Are Elevated in Adults with HIV Infection

    (See the article by Kalayjian et al, on pages 1796-1805, and the editorial commentary by Dubé and Sattler, on pages 1783-1785.) Background. Human immunodeficiency virus (HIV) replication and immune activation may increase inflammation and coagulation biomarkers. Limited data exist comparing such biomarkers in persons with and without HIV infection. Methods. For persons 45-76 years of age, levels of high-sensitivity C-reactive protein (hsCRP), interleukin (IL)-6, D-dimer, and cystatin C were compared in 494 HIV-infected individuals in the Strategies for Management of Antiretroviral Therapy (SMART) study and 5386 participants in the Multi-Ethnic Study of Atherosclerosis (MESA) study. For persons 33-44 years of age, hsCRP and IL-6 levels were compared in 287 participants in the SMART study and 3231 participants in the Coronary Artery Risk Development in Young Adults (CARDIA) study. Results. hsCRP and IL-6 levels were 55% (P<.001) and 62% (P<.001) higher among HIV-infected participants than among CARDIA study participants. Compared with levels noted in MESA study participants, hsCRP, IL-6, D-dimer, and cystatin C levels were 50%, 152%, 94%, and 27% higher, respectively (P<.001 for each), among HIV-infected participants. HIV-infected participants receiving antiretroviral therapy who had HIV RNA levels ≤400 copies/mL had levels of all biomarkers that were 21% to 60% higher (P<.001) than those in the general population. Conclusions. hsCRP, IL-6, D-dimer, and cystatin C levels are elevated in persons with HIV infection and remain so even after HIV RNA levels are suppressed with antiretroviral therapy. Additional research is needed on the pathophysiology of HIV-induced activation of inflammatory and coagulation pathways, to guide potential interventions.
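
    Percent differences such as "55% higher hsCRP" are typically derived from log-transformed biomarker levels. The sketch below shows one such calculation on simulated data; the distributions are invented, and the log-scale geometric-mean comparison is an assumption rather than the paper's exact model.

    import numpy as np

    def percent_higher(levels_a, levels_b):
        """Percent by which group A's geometric mean exceeds group B's,
        computed on the log scale as is common for skewed biomarkers."""
        diff = np.mean(np.log(levels_a)) - np.mean(np.log(levels_b))
        return 100 * (np.exp(diff) - 1)

    rng = np.random.default_rng(1)
    hscrp_hiv = rng.lognormal(mean=1.1, sigma=1.0, size=500)   # hypothetical HIV-infected group
    hscrp_gen = rng.lognormal(mean=0.7, sigma=1.0, size=5000)  # hypothetical general-population group
    print(f"hsCRP {percent_higher(hscrp_hiv, hscrp_gen):.0f}% higher")  # prints the elevation for this simulated data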

    Causes and Timing of Mortality and Morbidity Among Late Presenters Starting Antiretroviral Therapy in the REALITY Trial.

    BACKGROUND: In sub-Saharan Africa, 20%-25% of people starting antiretroviral therapy (ART) have severe immunosuppression; approximately 10% die within 3 months. In the Reduction of EArly mortaLITY (REALITY) randomized trial, a broad enhanced anti-infection prophylaxis bundle reduced mortality vs cotrimoxazole. We investigate the contribution and timing of different causes of mortality/morbidity. METHODS: Participants started ART with a CD4 count of fewer than 100 cells per μL. RESULTS: Enhanced prophylaxis reduced nonfatal/fatal tuberculosis and cryptococcosis. CONCLUSIONS: Enhanced prophylaxis reduced mortality from cryptococcosis and unknown causes, and reduced nonfatal tuberculosis and cryptococcosis. The high early incidence of fatal/nonfatal events highlights the need to start enhanced prophylaxis together with ART in advanced disease. CLINICAL TRIALS REGISTRATION: ISRCTN43622374.

    Toward optimal implementation of cancer prevention and control programs in public health: A study protocol on mis-implementation

    Background: Much of the cancer burden in the USA is preventable, through application of existing knowledge. State-level funders and public health practitioners are in ideal positions to affect programs and policies related to cancer control. Mis-implementation refers to ending effective programs and policies prematurely or continuing ineffective ones. Greater attention to mis-implementation should lead to use of effective interventions and more efficient expenditure of resources, which, in the long term, will lead to more positive cancer outcomes. Methods: This is a three-phase study that takes a comprehensive approach, leading to the elucidation of tactics for addressing mis-implementation. Phase 1: We assess the extent to which mis-implementation is occurring among state cancer control programs in public health. This initial phase will involve a survey of 800 practitioners representing all states. The programs represented will span the full continuum of cancer control, from primary prevention to survivorship. Phase 2: Using data from phase 1 to identify organizations in which mis-implementation is particularly high or low, the team will conduct eight comparative case studies to gain a richer understanding of mis-implementation and of contextual differences. These case studies will highlight lessons learned about mis-implementation and identify hypothesized drivers. Phase 3: Agent-based modeling will be used to identify dynamic interactions between individual capacity, organizational capacity, use of evidence, funding, and external factors driving mis-implementation. The team will then translate and disseminate findings from phases 1 to 3 to practitioners and practice-related stakeholders to support the reduction of mis-implementation. Discussion: This study is innovative and significant because it will (1) be the first to refine and further develop reliable and valid measures of mis-implementation of public health programs; (2) bring together a strong, transdisciplinary team with significant expertise in practice-based research; (3) use agent-based modeling to address cancer control implementation; and (4) use a participatory, evidence-based, stakeholder-driven approach that will identify key leverage points for addressing mis-implementation among state public health programs. This research is expected to provide replicable computational simulation models that can identify leverage points and public health system dynamics to reduce mis-implementation in cancer control and may be of interest to other health areas.
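
    As a toy illustration of the phase 3 approach, the sketch below simulates state program "agents" whose yearly decision to continue an effective program or end it prematurely depends on organizational capacity and funding stability. All class names, parameters, and decision rules are hypothetical, not the study's actual model.

    import random

    class ProgramAgent:
        """One state program deciding each year whether to continue."""
        def __init__(self, effective, capacity, funding):
            self.effective = effective   # is the program evidence-based?
            self.capacity = capacity     # organizational capacity, 0-1
            self.funding = funding       # funding stability, 0-1
            self.active = True

        def step(self):
            if not self.active:
                return
            # Hypothetical rule: weak capacity and unstable funding raise the
            # chance of ending an effective program (one form of mis-implementation).
            p_end = 0.05 + 0.4 * (1 - self.capacity) * (1 - self.funding)
            if self.effective and random.random() < p_end:
                self.active = False

    random.seed(42)
    agents = [ProgramAgent(True, random.random(), random.random()) for _ in range(800)]
    for year in range(5):
        for agent in agents:
            agent.step()
    ended_early = sum(not agent.active for agent in agents)
    print(f"Effective programs ended prematurely after 5 years: {ended_early}/800")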

    Therapeutic DNA vaccination of vertically HIV-infected children: Report of the first pediatric randomised trial (PEDVAC)

    Subjects: Twenty vertically HIV-infected children, 6–16 years of age, with stable viral load control and CD4+ values above 400 cells/mm³. Intervention: Ten subjects continued their ongoing antiretroviral treatment (ART, Group A) and 10 were immunized with an HIV-DNA vaccine in addition to their previous therapy (ART and vaccine, Group B). The genetic vaccine represented HIV-1 subtypes A, B and C, encoded Env, Rev, Gag and RT, and had no additional adjuvant. Immunizations took place at weeks 0, 4 and 12, with a boosting dose at week 36. Monitoring was performed until week 60 and extended to week 96. Results: Safety data showed good tolerance of the vaccine. Adherence to ART remained high and persistent during the study and did not differ significantly between controls and vaccinees. Neither group experienced virological failure or a decline of CD4+ counts from baseline. Transiently higher HIV-specific cellular immune responses were noted to Gag but not to other components of the vaccine. Lymphoproliferative responses to a virion antigen, HIV-1 MN, were higher in the vaccinees than in the controls (p = 0.047), whereas differences in reactivity to clade-specific Gag p24, RT or Env did not reach significance. Compared with baseline, the percentage of HIV-specific CD8+ lymphocytes releasing perforin in Group B was higher after the vaccination schedule had been completed (p = 0.031). No increased CD8+ perforin levels were observed in control Group A. Conclusions: The present study demonstrates the feasibility, safety and moderate immunogenicity of genetic vaccination in vertically HIV-infected children, paving the way for amplified immunotherapeutic approaches in the pediatric population. Trial registration: clinicaltrialsregister.eu 2007-002359-18; 2007-002359-18/I
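
    With only ten children per arm, between-group comparisons such as the lymphoproliferative response to HIV-1 MN (p = 0.047) are usually made with a nonparametric rank test. The sketch below shows such a comparison on invented stimulation-index values; both the data and the choice of the Mann-Whitney U test are assumptions for illustration, not the trial's reported analysis.

    from scipy.stats import mannwhitneyu

    # Hypothetical stimulation indices for 10 controls (ART only, Group A)
    # and 10 vaccinees (ART + HIV-DNA vaccine, Group B); not the trial's data.
    group_a = [1.2, 0.9, 1.5, 1.1, 2.0, 1.3, 0.8, 1.6, 1.0, 1.4]
    group_b = [2.1, 1.8, 3.0, 1.5, 2.6, 2.2, 1.9, 3.4, 1.7, 2.8]

    stat, p = mannwhitneyu(group_b, group_a, alternative="two-sided")
    print(f"Mann-Whitney U = {stat:.1f}, p = {p:.3f}")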

    Cost Analysis of Various Low Pathogenic Avian Influenza Surveillance Systems in the Dutch Egg Layer Sector

    Background: As low pathogenic avian influenza (LPAI) viruses can mutate into highly pathogenic viruses, the Dutch poultry sector implemented a surveillance system for LPAI based on blood samples. It has been suggested that egg yolk samples could be used instead of blood samples to survey egg layer farms. To support future decision making about AI surveillance, economic criteria are important. Therefore, a cost analysis was performed on systems that use either blood or eggs as the sampled material. Methodology/Principal Findings: The effectiveness of surveillance using egg or blood samples was evaluated using scenario tree models. An economic model was then developed that calculates the total costs of eight surveillance systems with equal effectiveness. The model considers costs for sampling, sample preparation, sample transport, testing, communication of test results, and the confirmation test on false-positive results. The surveillance systems varied in sampled material (eggs or blood), sampling location (farm or packing station), and location of sample preparation (laboratory or packing station). A hypothetical system in which eggs are sampled at the packing station and samples are prepared in a laboratory had the lowest total costs (€273,393 a year). Compared with this, a hypothetical system in which eggs are sampled at the farm and samples are prepared at a laboratory, and the currently implemented system in which blood is sampled at the farm and samples are prepared at a laboratory, have 6% and 39% higher costs, respectively.
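
    The core of the economic comparison is a yearly sum of per-sample cost components plus the cost of confirming false positives. The sketch below shows that structure with invented unit costs, sample numbers, and system names; the real model's figures and cost categories are in the paper, not reproduced here.

    # Hypothetical yearly cost model for one surveillance system design:
    # total cost = sum over cost components of (unit cost x number of samples)
    #              + false positives x confirmation-test cost.
    def total_cost(n_samples, unit_costs, n_false_positives, confirmation_cost):
        routine = sum(cost * n_samples for cost in unit_costs.values())
        return routine + n_false_positives * confirmation_cost

    unit_costs = {                 # cost per sample, in euros (made-up values)
        "sampling": 0.50,
        "sample_preparation": 0.80,
        "transport": 0.20,
        "testing": 3.00,
        "communication": 0.10,
    }

    # Compare two hypothetical designs that need different sample numbers
    # (and sampling costs) to reach the same surveillance sensitivity.
    eggs_at_packing_station = total_cost(50_000, unit_costs, 25, 400.0)
    blood_at_farm = total_cost(60_000, {**unit_costs, "sampling": 1.50}, 25, 400.0)
    print(f"Eggs at packing station: EUR {eggs_at_packing_station:,.0f}/year")
    print(f"Blood at farm:           EUR {blood_at_farm:,.0f}/year")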

    Effect of ready-to-use supplementary food on mortality in severely immunocompromised HIV-infected individuals in Africa initiating antiretroviral therapy (REALITY): an open-label, parallel-group, randomised controlled trial.

    BACKGROUND: In sub-Saharan Africa, severely immunocompromised HIV-infected individuals have a high risk of mortality during the first few months after starting antiretroviral therapy (ART). We hypothesise that universally providing ready-to-use supplementary food (RUSF) would increase early weight gain, thereby reducing early mortality compared with current guidelines recommending ready-to-use therapeutic food (RUTF) for severely malnourished individuals only. METHODS: We did a 2 × 2 × 2 factorial, open-label, parallel-group trial at inpatient and outpatient facilities in eight urban or periurban regional hospitals in Kenya, Malawi, Uganda, and Zimbabwe. Eligible participants were ART-naive adults and children aged at least 5 years with confirmed HIV infection and a CD4 cell count of fewer than 100 cells per μL, who were initiating ART at the facilities. We randomly assigned participants (1:1) to initiate ART either with (RUSF) or without (no-RUSF) 12 weeks of peanut-based RUSF containing 1000 kcal per day and micronutrients, given as two 92 g packets per day for adults and one packet (500 kcal per day) for children aged 5-12 years, regardless of nutritional status. In both groups, individuals received supplementation with RUTF only when severely malnourished (ie, on the basis of body-mass index [BMI] criteria). Mortality did not differ significantly between the groups. Through 48 weeks, adults and adolescents aged 13 years and older in the RUSF group had significantly greater gains in weight, BMI, and MUAC than the no-RUSF group (p=0·004, 0·004, and 0·03, respectively). The most common type of serious adverse event was specific infections, occurring in 90 (10%) of 897 participants assigned RUSF and 87 (10%) of 908 assigned no-RUSF. By week 48, 205 participants had serious adverse events in both groups (p=0·81), and 181 had grade 4 adverse events in the RUSF group compared with 172 in the no-RUSF group (p=0·45). INTERPRETATION: In severely immunocompromised HIV-infected individuals, providing RUSF universally at ART initiation, compared with providing RUTF to severely malnourished individuals only, improved short-term weight gain but not mortality. A change in policy to provide nutritional supplementation to all severely immunocompromised HIV-infected individuals starting ART is therefore not warranted at present. FUNDING: Joint Global Health Trials Scheme (UK Medical Research Council, UK Department for International Development, and Wellcome Trust). This study was funded by the Joint Global Health Trials Scheme (JGHTS) of the UK Department for International Development (DFID), the Wellcome Trust, and the UK Medical Research Council (MRC; G1100693). Additional funding support was provided by the PENTA foundation and core support to the MRC Clinical Trials Unit at University College London (London, UK; MC_UU_12023/23, MC_UU_12023/26). Cipla, Gilead Sciences, ViiV Healthcare/GlaxoSmithKline, and Merck Sharp & Dohme donated drugs for the study and ready-to-use supplementary food was purchased from Valid International. The MRC Clinical Trials Unit has received other funding from Tibotec and Gilead Sciences for data safety monitoring board membership and lectures. The Malawi–Liverpool–Wellcome Trust Clinical Research Programme, University of Malawi College of Medicine (101113/Z/13/Z), and the KEMRI/Wellcome Trust Research Programme, Kilifi (203077/Z/16/Z) are supported by strategic awards from the Wellcome Trust (UK).
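
    For readers unfamiliar with the 2 × 2 × 2 factorial design, the sketch below shows how each participant can be independently randomised to the three REALITY interventions (RUSF, adjunctive raltegravir, enhanced prophylaxis), so that each factor is compared across the whole trial population. The allocation code is a generic illustration with a made-up seeding scheme, not the trial's actual randomisation procedure.

    import random

    FACTORS = ["RUSF", "adjunctive raltegravir", "enhanced prophylaxis"]

    def randomise(participant_id, seed=2017):
        """Independently assign each factor with probability 1/2 (2x2x2 factorial)."""
        rng = random.Random(f"{seed}-{participant_id}")  # reproducible per participant
        return {factor: rng.random() < 0.5 for factor in FACTORS}

    # Each participant lands in one of 8 cells; a factor's effect is estimated
    # by comparing all participants randomised with vs without that factor.
    for pid in range(1, 4):
        print(pid, randomise(pid))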

    Enhanced prophylaxis with antiretroviral therapy for advanced HIV in Africa

    BACKGROUND In sub-Saharan Africa, among patients with advanced human immunodeficiency virus (HIV) infection, the rate of death from infection (including tuberculosis and cryptococcus) shortly after the initiation of antiretroviral therapy (ART) is approximately 10%. METHODS In this factorial open-label trial conducted in Uganda, Zimbabwe, Malawi, and Kenya, we enrolled HIV-infected adults and children 5 years of age or older who had not received previous ART and were starting ART with a CD4+ count of fewer than 100 cells per cubic millimeter. They underwent simultaneous randomization to receive enhanced antimicrobial prophylaxis or standard prophylaxis, adjunctive raltegravir or no raltegravir, and supplementary food or no supplementary food. Here, we report on the effects of enhanced antimicrobial prophylaxis, which consisted of continuous trimethoprim–sulfamethoxazole plus at least 12 weeks of isoniazid–pyridoxine (coformulated with trimethoprim–sulfamethoxazole in a single fixed-dose combination tablet), 12 weeks of fluconazole, 5 days of azithromycin, and a single dose of albendazole, as compared with standard prophylaxis (trimethoprim–sulfamethoxazole alone). The primary end point was 24-week mortality. RESULTS A total of 1805 patients (1733 adults and 72 children or adolescents) underwent randomization to receive either enhanced prophylaxis (906 patients) or standard prophylaxis (899 patients) and were followed for 48 weeks (loss to follow-up, 3.1%). The median baseline CD4+ count was 37 cells per cubic millimeter, but 854 patients (47.3%) were asymptomatic or mildly symptomatic. In the Kaplan–Meier analysis at 24 weeks, the rate of death with enhanced prophylaxis was lower than that with standard prophylaxis (80 patients [8.9%] vs. 108 [12.2%]; hazard ratio, 0.73; 95% confidence interval [CI], 0.55 to 0.98; P=0.03); 98 patients (11.0%) and 127 (14.4%), respectively, had died by 48 weeks (hazard ratio, 0.76; 95% CI, 0.58 to 0.99; P=0.04). Patients in the enhanced-prophylaxis group had significantly lower rates of tuberculosis (P=0.02), cryptococcal infection (P=0.01), oral or esophageal candidiasis (P=0.02), death of unknown cause (P=0.03), and new hospitalization (P=0.03). However, there was no significant between-group difference in the rate of severe bacterial infection (P=0.32). There were nonsignificantly lower rates of serious adverse events and grade 4 adverse events in the enhanced-prophylaxis group (P=0.08 and P=0.09, respectively). Rates of HIV viral suppression and adherence to ART were similar in the two groups. CONCLUSIONS Among HIV-infected patients with advanced immunosuppression, enhanced antimicrobial prophylaxis combined with ART resulted in reduced rates of death at both 24 weeks and 48 weeks without compromising viral suppression or increasing toxic effects.
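
    The 24-week mortality comparison above relies on Kaplan–Meier estimates. Below is a minimal product-limit estimator run on simulated follow-up times, just to show how an "8.9% vs. 12.2% at 24 weeks" figure is computed; the event times, weekly risks, and censoring scheme are invented, and the hazard ratio itself would come from a Cox model, which is not shown.

    import numpy as np

    def km_cumulative_mortality(time, died, at_week):
        """Kaplan-Meier cumulative mortality (1 - survival) at a given week.
        `time` is weeks of follow-up; `died` marks deaths (others are censored)."""
        time, died = np.asarray(time, float), np.asarray(died, bool)
        surv = 1.0
        for t in np.unique(time[died & (time <= at_week)]):
            at_risk = np.sum(time >= t)
            deaths = np.sum(died & (time == t))
            surv *= 1 - deaths / at_risk
        return 1 - surv

    # Hypothetical follow-up for 906 enhanced- vs 899 standard-prophylaxis patients.
    rng = np.random.default_rng(3)
    def simulate(n, weekly_death_risk):
        death_week = rng.geometric(weekly_death_risk, n)   # week of death if follow-up were unlimited
        time = np.minimum(death_week, 48)                  # administrative censoring at 48 weeks
        died = death_week <= 48
        return time, died

    t_e, d_e = simulate(906, 0.0039)
    t_s, d_s = simulate(899, 0.0054)
    print(f"24-week mortality: {km_cumulative_mortality(t_e, d_e, 24):.1%} (enhanced) "
          f"vs {km_cumulative_mortality(t_s, d_s, 24):.1%} (standard)")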