21 research outputs found

    Children’s Dietary Quality and Micronutrient Adequacy by Food Security in the Household and among Household Children

    Children’s food-security status has been described largely based on either the classification of food security in the household or among household children, but few studies have investigated the relationship between food security among household children and overall dietary quality. Our goal was to examine children’s dietary quality and micronutrient adequacy by food-security classification for the household and among household children. Data from 5540 children (2–17 years) from the National Health and Nutrition Examination Survey (NHANES) 2011–2014 were analyzed. Food-security status was assessed using the U.S. Household Food Security Survey Module and categorized into high, marginal, low, and very low food security for the household and among household children. Dietary quality and micronutrient adequacy were characterized by the Healthy Eating Index (HEI) 2015 and the Mean Adequacy Ratio (MAR; based on total nutrient intakes from diet and dietary supplements), respectively. The HEI 2015 scores did not substantially vary by either food-security classification, but the MAR was greater in high compared with very low food security both in households and among household children; a linear relationship was found only among household children. In general, very good agreement was observed between the two classifications, but the strength of agreement differed by children’s age, race/Hispanic origin, and family income. In conclusion, micronutrient adequacy, but not dietary quality, differed significantly by food-security status. While agreement between food security in the household and among household children is very good, classifying food security among household children may be more sensitive for detecting differences in nutrient exposure.
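The Mean Adequacy Ratio summarized above has a standard construction: each nutrient's adequacy ratio (intake divided by its recommended intake) is truncated at 1 so that a surplus of one nutrient cannot offset a shortfall in another, and the MAR is the mean of these truncated ratios. A minimal sketch, using hypothetical intakes and reference values (the study's actual nutrient set and recommendations are not given here):

```python
def nutrient_adequacy_ratio(intake, recommended):
    """NAR = intake / recommended intake, truncated at 1.0."""
    return min(intake / recommended, 1.0)

def mean_adequacy_ratio(intakes, recommendations):
    """MAR = mean of the truncated NARs, expressed here on a 0-100 scale."""
    nars = [nutrient_adequacy_ratio(i, r)
            for i, r in zip(intakes, recommendations)]
    return 100 * sum(nars) / len(nars)

# Hypothetical intakes for three nutrients, in the same units as their
# recommendations: NARs are 0.8, 1.0 (truncated), and 0.5.
print(mean_adequacy_ratio([80, 120, 30], [100, 100, 60]))  # ~76.7
```

Note that the truncation step is what makes the MAR a measure of adequacy rather than total intake: without it, doubling one nutrient could mask a deficiency in another.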

    Inclusive fitness theory and eusociality


    31st Annual Meeting and Associated Programs of the Society for Immunotherapy of Cancer (SITC 2016) : part two

    Background
    The immunological escape of tumors represents one of the main obstacles to the treatment of malignancies. The blockade of PD-1 or CTLA-4 receptors represented a milestone in the history of immunotherapy. However, immune checkpoint inhibitors seem to be effective in specific cohorts of patients. It has been proposed that their efficacy relies on the presence of an immunological response. Thus, we hypothesized that disruption of the PD-L1/PD-1 axis would synergize with our oncolytic vaccine platform PeptiCRAd.

    Methods
    We used murine B16OVA in vivo tumor models and flow cytometry analysis to investigate the immunological background.

    Results
    First, we found that high-burden B16OVA tumors were refractory to combination immunotherapy. However, with a more aggressive schedule, tumors with a lower burden were more susceptible to the combination of PeptiCRAd and PD-L1 blockade. The therapy significantly increased the median survival of mice (Fig. 7). Interestingly, the reduced growth of contralaterally injected B16F10 cells suggested the presence of a long-lasting immunological memory also against non-targeted antigens. Concerning the functional state of tumor-infiltrating lymphocytes (TILs), we found that all the immune therapies enhanced the percentage of activated (PD-1pos TIM-3neg) T lymphocytes and reduced the amount of exhausted (PD-1pos TIM-3pos) cells compared to placebo. As expected, we found that PeptiCRAd monotherapy could increase the number of antigen-specific CD8+ T cells compared to other treatments. However, only the combination with PD-L1 blockade significantly increased the ratio between activated and exhausted pentamer-positive cells (p = 0.0058), suggesting that by disrupting the PD-1/PD-L1 axis we could decrease the amount of dysfunctional antigen-specific T cells. We observed that the anatomical location deeply influenced the state of CD4+ and CD8+ T lymphocytes: TIM-3 expression was increased 2-fold on TILs compared to splenic and lymphoid T cells. In the CD8+ compartment, surface expression of PD-1 seemed to be restricted to the tumor microenvironment, while CD4+ T cells also expressed high levels of PD-1 in lymphoid organs. Interestingly, we found that PD-1 levels were significantly higher on CD8+ T cells than on CD4+ T cells in the tumor microenvironment (p < 0.0001).

    Conclusions
    We demonstrated that the efficacy of immune checkpoint inhibitors might be strongly enhanced by their combination with cancer vaccines. PeptiCRAd was able to increase the number of antigen-specific T cells, and PD-L1 blockade prevented their exhaustion, resulting in long-lasting immunological memory and increased median survival.
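The activated-versus-exhausted classification used above (activated = PD-1pos TIM-3neg; exhausted = PD-1pos TIM-3pos) can be illustrated with a minimal sketch operating on per-cell boolean marker calls from gated flow cytometry events. The function name and toy data are hypothetical, not the authors' analysis pipeline:

```python
def activated_exhausted_ratio(pd1_pos, tim3_pos):
    """Given parallel boolean lists of per-cell PD-1 and TIM-3 positivity,
    return (% activated, % exhausted, activated/exhausted ratio), where
    activated = PD-1+ TIM-3- and exhausted = PD-1+ TIM-3+."""
    n = len(pd1_pos)
    activated = sum(1 for p, t in zip(pd1_pos, tim3_pos) if p and not t)
    exhausted = sum(1 for p, t in zip(pd1_pos, tim3_pos) if p and t)
    ratio = activated / exhausted if exhausted else float("inf")
    return 100 * activated / n, 100 * exhausted / n, ratio

# Toy data for six cells: two activated (PD-1+ TIM-3-), two exhausted
# (PD-1+ TIM-3+), two neither.
cells_pd1  = [True, True, True, False, True, False]
cells_tim3 = [False, True, False, False, True, True]
print(activated_exhausted_ratio(cells_pd1, cells_tim3))
```

A ratio above 1 would indicate that activated cells outnumber exhausted ones within the PD-1-positive compartment, which is the quantity the abstract reports as improved by the combination therapy.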

    The evolving SARS-CoV-2 epidemic in Africa: Insights from rapidly expanding genomic surveillance

    INTRODUCTION
    Investment in Africa over the past year with regard to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) sequencing has led to a massive increase in the number of sequences, which, to date, exceeds 100,000 sequences generated to track the pandemic on the continent. These sequences have profoundly affected how public health officials in Africa have navigated the COVID-19 pandemic.

    RATIONALE
    We demonstrate how the first 100,000 SARS-CoV-2 sequences from Africa have helped monitor the epidemic on the continent, how genomic surveillance expanded over the course of the pandemic, and how we adapted our sequencing methods to deal with an evolving virus. Finally, we also examine how viral lineages have spread across the continent in a phylogeographic framework to gain insights into the underlying temporal and spatial transmission dynamics for several variants of concern (VOCs).

    RESULTS
    Our results indicate that the number of countries in Africa that can sequence the virus within their own borders is growing and that this is coupled with a shorter turnaround time from the time of sampling to sequence submission. Ongoing evolution necessitated the continual updating of primer sets, and, as a result, eight primer sets were designed in tandem with viral evolution and used to ensure effective sequencing of the virus. The pandemic unfolded through multiple waves of infection that were each driven by distinct genetic lineages, with B.1-like ancestral strains associated with the first pandemic wave of infections in 2020. Successive waves on the continent were fueled by different VOCs, with Alpha and Beta cocirculating in distinct spatial patterns during the second wave and Delta and Omicron affecting the whole continent during the third and fourth waves, respectively. Phylogeographic reconstruction points toward distinct differences in viral importation and exportation patterns associated with the Alpha, Beta, Delta, and Omicron variants and subvariants, when considering both Africa versus the rest of the world and viral dissemination within the continent. Our epidemiological and phylogenetic inferences therefore underscore the heterogeneous nature of the pandemic on the continent and highlight key insights and challenges, for instance, recognizing the limitations of low testing proportions. We also highlight the early warning capacity that genomic surveillance in Africa has had for the rest of the world with the detection of new lineages and variants, the most recent being the characterization of various Omicron subvariants.

    CONCLUSION
    Sustained investment for diagnostics and genomic surveillance in Africa is needed as the virus continues to evolve. This is important not only to help combat SARS-CoV-2 on the continent but also because it can be used as a platform to help address the many emerging and reemerging infectious disease threats in Africa. In particular, capacity building for local sequencing within countries or within the continent should be prioritized because this is generally associated with shorter turnaround times, providing the most benefit to local public health authorities tasked with pandemic response and mitigation and allowing for the fastest reaction to localized outbreaks. These investments are crucial for pandemic preparedness and response and will serve the health of the continent well into the 21st century.
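The turnaround-time metric highlighted in the results (days from sample collection to sequence submission) reduces to a simple date difference per sequence. A minimal sketch, assuming ISO-formatted dates and hypothetical record fields (this is not the consortium's actual pipeline):

```python
from datetime import date
from statistics import median

def turnaround_days(records):
    """Median number of days between sample collection and sequence
    submission, given records with ISO-date 'collected'/'submitted' fields."""
    return median(
        (date.fromisoformat(r["submitted"]) -
         date.fromisoformat(r["collected"])).days
        for r in records)

# Hypothetical records: per-sequence turnarounds are 19, 35, and 15 days.
records = [
    {"collected": "2021-06-01", "submitted": "2021-06-20"},
    {"collected": "2021-06-05", "submitted": "2021-07-10"},
    {"collected": "2021-06-10", "submitted": "2021-06-25"},
]
print(turnaround_days(records))  # median of 19, 35, 15 -> 19
```

The median is the usual choice here because submission delays are heavily right-skewed; a few long-delayed sequences would dominate a mean.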

    Antiretroviral Concentrations in Breast-Feeding Infants of Mothers Receiving Highly Active Antiretroviral Therapy

    There are limited data describing the concentrations of zidovudine, lamivudine, and nevirapine in nursing infants as a result of transfer via breast milk. The Kisumu Breastfeeding Study is a phase IIb open-label trial of prenatal, intrapartum, and postpartum maternal treatment with zidovudine, lamivudine, and nevirapine from 34 weeks of gestation to 6 months postpartum. In a pharmacokinetic substudy, maternal plasma, breast milk, and infant dried blood spots were collected for drug assay on the day of delivery and at 2, 6, 14, and 24 weeks after delivery. Sixty-seven mother-infant pairs were enrolled. The median concentrations in breast milk of zidovudine, lamivudine, and nevirapine during the study period were 14 ng/ml, 1,214 ng/ml, and 4,546 ng/ml, respectively. Zidovudine was not detectable in any infant plasma samples obtained after the day of delivery, while the median concentrations in infant plasma samples from postpartum weeks 2, 6, and 14 were 67 ng/ml, 32 ng/ml, and 24 ng/ml for lamivudine and 987 ng/ml, 1,032 ng/ml, and 734 ng/ml for nevirapine, respectively. Therefore, lamivudine and nevirapine, but not zidovudine, are transferred to infants via breast milk in biologically significant concentrations. The extent and effect of infant drug exposure via breast milk must be well understood in order to evaluate the benefits and risks of maternal antiretroviral use during lactation.