
    Identifying challenges and opportunities for improved nutrient management through the USDA's Dairy Agroecosystem Working Group

    Nutrient management is a priority on U.S. dairy farms, although specific concerns vary across regions and management systems. To elucidate challenges and opportunities for improving nutrient use efficiencies, the USDA's Dairy Agroecosystem Working Group investigated 10 case studies of confinement (including open-lot and freestall housing) and grazing operations in the seven major U.S. dairy-producing states. Simulation modeling was carried out using the Integrated Farm Systems Model over 25 years of historical weather data. Dairies with a preference for importing feed and exporting manure, common for the simulated dry-lot dairies of the arid West, had lower nutrient use efficiencies at the farm gate than freestall and tie-stall dairies in humid climates. Phosphorus (P) use efficiencies ranged from 33 to 82% of imported P, while nitrogen (N) use efficiencies were 25 to 50% of imported N. When viewed from a P budgeting perspective, environmental losses of P were generally negligible, especially from dry-lot dairies. Opportunities for greater P use efficiency reside primarily in increasing on-farm feed production and reducing excess P in diets. In contrast with P, environmental losses of N were 50 to 75% of annual farm N inputs. For dry-lot dairies, the greatest potential for N conservation is associated with ammonia (NH3) control from housing, whereas for freestall and tie-stall operations, N conservation opportunities vary with soil and manure management system. Given that fertilizer expenses are equivalent to 2 to 6% of annual farm profits, cost incentives do exist to improve nutrient use efficiencies. However, augmenting on-farm feed production represents an even greater opportunity, especially on large operations with high animal unit densities.
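The farm-gate use efficiencies quoted above are ratios of nutrients exported in farm products to nutrients imported onto the farm. A minimal sketch of that bookkeeping, using made-up flux values (the categories and numbers below are illustrative, not figures from the study):

```python
# Farm-gate nutrient use efficiency: nutrients exported in managed
# products (milk, meat, crops, manure sold) divided by nutrients
# imported (feed, fertilizer, bedding). All values are hypothetical.

def nutrient_use_efficiency(imports_kg, exports_kg):
    """Fraction of the imported nutrient leaving the farm in managed exports."""
    total_in = sum(imports_kg.values())
    total_out = sum(exports_kg.values())
    return total_out / total_in

p_imports = {"feed": 1800.0, "fertilizer": 400.0}   # kg P/yr (hypothetical)
p_exports = {"milk": 900.0, "manure_sold": 300.0}   # kg P/yr (hypothetical)

efficiency = nutrient_use_efficiency(p_imports, p_exports)
print(f"P use efficiency: {efficiency:.0%}")
```

The same ratio computed with N fluxes would sit lower, consistent with the abstract's point that N efficiencies (25 to 50%) trail P efficiencies (33 to 82%).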

    Investigation of SARS-CoV-2 faecal shedding in the community: a prospective household cohort study (COVID-LIV) in the UK

    Background
    SARS-CoV-2 is frequently shed in the stool of patients hospitalised with COVID-19. The extent of faecal shedding of SARS-CoV-2 among individuals in the community, and its potential to contribute to the spread of disease, is unknown.
    Methods
    In this prospective, observational cohort study among households in Liverpool, UK, participants underwent weekly nasal/throat swabbing to detect SARS-CoV-2 virus over a 12-week period from enrolment, starting July 2020. Participants who tested positive for SARS-CoV-2 were asked to provide a stool sample 3 and 14 days later. In addition, in October and November 2020, during a period of high community transmission, stool sampling was undertaken to determine the prevalence of SARS-CoV-2 faecal shedding among all study participants. SARS-CoV-2 RNA was detected using real-time PCR.
    Results
    A total of 434 participants from 176 households were enrolled. Eighteen participants (4.2%; 95% confidence interval [CI] 2.5–6.5%) tested positive for SARS-CoV-2 on nasal/throat swabs, and of these, 3/17 (18%; 95% CI 4–43%) had SARS-CoV-2 detected in stool. Two of the three participants demonstrated ongoing faecal shedding of SARS-CoV-2, without gastrointestinal symptoms, after testing negative for SARS-CoV-2 in respiratory samples. Among the 165/434 participants without SARS-CoV-2 infection who took part in the prevalence study, none had SARS-CoV-2 in stool. There was no demonstrable household transmission of SARS-CoV-2 among households containing a participant with faecal shedding.
    Conclusions
    Faecal shedding of SARS-CoV-2 occurred among community participants with confirmed SARS-CoV-2 infection. However, during a period of high community transmission, faecal shedding of SARS-CoV-2 was not detected among participants without SARS-CoV-2 infection. It is unlikely that the faecal-oral route plays a significant role in household and community transmission of SARS-CoV-2.
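The reported prevalence (18 of 434 positive, with a 95% CI of 2.5–6.5%) is a binomial proportion with its confidence interval. As a rough cross-check, a Wilson score interval reproduces figures close to those quoted; the paper itself may use a different method (e.g. an exact Clopper-Pearson interval), which differs slightly at these counts.

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Approximate 95% Wilson score confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = p + z**2 / (2 * n)
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return (centre - half) / denom, (centre + half) / denom

lo, hi = wilson_ci(18, 434)
print(f"18/434 = {18/434:.1%}, approx 95% CI {lo:.1%}-{hi:.1%}")
```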

    31st Annual Meeting and Associated Programs of the Society for Immunotherapy of Cancer (SITC 2016): part two

    Background
    The immunological escape of tumors represents one of the main obstacles to the treatment of malignancies. The blockade of PD-1 or CTLA-4 receptors represented a milestone in the history of immunotherapy. However, immune checkpoint inhibitors seem to be effective in specific cohorts of patients. It has been proposed that their efficacy relies on the presence of an immunological response. Thus, we hypothesized that disruption of the PD-L1/PD-1 axis would synergize with our oncolytic vaccine platform PeptiCRAd.
    Methods
    We used murine B16OVA in vivo tumor models and flow cytometry analysis to investigate the immunological background.
    Results
    First, we found that high-burden B16OVA tumors were refractory to combination immunotherapy. However, with a more aggressive schedule, tumors with a lower burden were more susceptible to the combination of PeptiCRAd and PD-L1 blockade. The therapy significantly increased the median survival of mice (Fig. 7). Interestingly, the reduced growth of contralaterally injected B16F10 cells suggested the presence of a long-lasting immunological memory also against non-targeted antigens. Concerning the functional state of tumor-infiltrating lymphocytes (TILs), we found that all the immune therapies enhanced the percentage of activated (PD-1pos TIM-3neg) T lymphocytes and reduced the amount of exhausted (PD-1pos TIM-3pos) cells compared to placebo. As expected, we found that PeptiCRAd monotherapy could increase the number of antigen-specific CD8+ T cells compared to other treatments. However, only the combination with PD-L1 blockade could significantly increase the ratio between activated and exhausted pentamer-positive cells (p = 0.0058), suggesting that by disrupting the PD-1/PD-L1 axis we could decrease the amount of dysfunctional antigen-specific T cells. We observed that the anatomical location deeply influenced the state of CD4+ and CD8+ T lymphocytes. In fact, TIM-3 expression was increased 2-fold on TILs compared to splenic and lymphoid T cells. In the CD8+ compartment, the expression of PD-1 on the surface seemed to be restricted to the tumor microenvironment, while CD4+ T cells had high expression of PD-1 also in lymphoid organs. Interestingly, we found that the levels of PD-1 were significantly higher on CD8+ T cells than on CD4+ T cells in the tumor microenvironment (p < 0.0001).
    Conclusions
    In conclusion, we demonstrated that the efficacy of immune checkpoint inhibitors might be strongly enhanced by their combination with cancer vaccines. PeptiCRAd was able to increase the number of antigen-specific T cells, and PD-L1 blockade prevented their exhaustion, resulting in long-lasting immunological memory and increased median survival.

    Duration of androgen deprivation therapy with postoperative radiotherapy for prostate cancer: a comparison of long-course versus short-course androgen deprivation therapy in the RADICALS-HD randomised trial

    Background
    Previous evidence supports androgen deprivation therapy (ADT) with primary radiotherapy as initial treatment for intermediate-risk and high-risk localised prostate cancer. However, the use and optimal duration of ADT with postoperative radiotherapy after radical prostatectomy remain uncertain.
    Methods
    RADICALS-HD was a randomised controlled trial of ADT duration within the RADICALS protocol. Here, we report on the comparison of short-course versus long-course ADT. Key eligibility criteria were an indication for radiotherapy after previous radical prostatectomy for prostate cancer, prostate-specific antigen less than 5 ng/mL, absence of metastatic disease, and written consent. Participants were randomly assigned (1:1) to add 6 months of ADT (short-course ADT) or 24 months of ADT (long-course ADT) to radiotherapy, using subcutaneous gonadotrophin-releasing hormone analogue (monthly in the short-course ADT group and 3-monthly in the long-course ADT group), daily oral bicalutamide monotherapy 150 mg, or monthly subcutaneous degarelix. Randomisation was done centrally through minimisation with a random element, stratified by Gleason score, positive margins, radiotherapy timing, planned radiotherapy schedule, and planned type of ADT, in a computerised system. The allocated treatment was not masked. The primary outcome measure was metastasis-free survival, defined as metastasis arising from prostate cancer or death from any cause. The comparison had more than 80% power with two-sided α of 5% to detect an absolute increase in 10-year metastasis-free survival from 75% to 81% (hazard ratio [HR] 0·72). Standard time-to-event analyses were used. Analyses followed the intention-to-treat principle. The trial is registered with the ISRCTN registry, ISRCTN40814031, and ClinicalTrials.gov, NCT00541047.
    Findings
    Between Jan 30, 2008, and July 7, 2015, 1523 patients (median age 65 years, IQR 60–69) were randomly assigned to receive short-course ADT (n=761) or long-course ADT (n=762) in addition to postoperative radiotherapy at 138 centres in Canada, Denmark, Ireland, and the UK. With a median follow-up of 8·9 years (IQR 7·0–10·0), 313 metastasis-free survival events were reported overall (174 in the short-course ADT group and 139 in the long-course ADT group; HR 0·773 [95% CI 0·612–0·975]; p=0·029). 10-year metastasis-free survival was 71·9% (95% CI 67·6–75·7) in the short-course ADT group and 78·1% (74·2–81·5) in the long-course ADT group. Toxicity of grade 3 or higher was reported for 105 (14%) of 753 participants in the short-course ADT group and 142 (19%) of 757 participants in the long-course ADT group (p=0·025), with no treatment-related deaths.
    Interpretation
    Compared with adding 6 months of ADT, adding 24 months of ADT improved metastasis-free survival in people receiving postoperative radiotherapy. For individuals who can accept the additional duration of adverse effects, long-course ADT should be offered with postoperative radiotherapy.
    Funding
    Cancer Research UK, UK Research and Innovation (formerly Medical Research Council), and Canadian Cancer Society.
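The design target (raising 10-year metastasis-free survival from 75% to 81%, quoted as HR 0·72) follows from the proportional-hazards relationship S1(t) = S0(t)^HR, i.e. HR = ln(S1)/ln(S0). A quick check of that conversion gives roughly 0.73; the quoted 0·72 presumably reflects slightly different rounding or design inputs.

```python
import math

def hr_from_survival(s_control, s_experimental):
    """Hazard ratio implied by two survival fractions at the same time
    point, assuming proportional hazards: HR = ln(S1) / ln(S0)."""
    return math.log(s_experimental) / math.log(s_control)

# Design assumption: raise 10-year metastasis-free survival from 75% to 81%
print(f"implied design HR: {hr_from_survival(0.75, 0.81):.3f}")
```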

    Adding 6 months of androgen deprivation therapy to postoperative radiotherapy for prostate cancer: a comparison of short-course versus no androgen deprivation therapy in the RADICALS-HD randomised controlled trial

    Background
    Previous evidence indicates that adjuvant, short-course androgen deprivation therapy (ADT) improves metastasis-free survival when given with primary radiotherapy for intermediate-risk and high-risk localised prostate cancer. However, the value of ADT with postoperative radiotherapy after radical prostatectomy is unclear.
    Methods
    RADICALS-HD was an international randomised controlled trial to test the efficacy of ADT used in combination with postoperative radiotherapy for prostate cancer. Key eligibility criteria were an indication for radiotherapy after radical prostatectomy for prostate cancer, prostate-specific antigen less than 5 ng/mL, absence of metastatic disease, and written consent. Participants were randomly assigned (1:1) to radiotherapy alone (no ADT) or radiotherapy with 6 months of ADT (short-course ADT), using monthly subcutaneous gonadotropin-releasing hormone analogue injections, daily oral bicalutamide monotherapy 150 mg, or monthly subcutaneous degarelix. Randomisation was done centrally through minimisation with a random element, stratified by Gleason score, positive margins, radiotherapy timing, planned radiotherapy schedule, and planned type of ADT, in a computerised system. The allocated treatment was not masked. The primary outcome measure was metastasis-free survival, defined as distant metastasis arising from prostate cancer or death from any cause. Standard survival analysis methods were used, accounting for randomisation stratification factors. The trial had 80% power with two-sided α of 5% to detect an absolute increase in 10-year metastasis-free survival from 80% to 86% (hazard ratio [HR] 0·67). Analyses followed the intention-to-treat principle. The trial is registered with the ISRCTN registry, ISRCTN40814031, and ClinicalTrials.gov, NCT00541047.
    Findings
    Between Nov 22, 2007, and June 29, 2015, 1480 patients (median age 66 years [IQR 61–69]) were randomly assigned to receive no ADT (n=737) or short-course ADT (n=743) in addition to postoperative radiotherapy at 121 centres in Canada, Denmark, Ireland, and the UK. With a median follow-up of 9·0 years (IQR 7·1–10·1), metastasis-free survival events were reported for 268 participants (142 in the no ADT group and 126 in the short-course ADT group; HR 0·886 [95% CI 0·688–1·140], p=0·35). 10-year metastasis-free survival was 79·2% (95% CI 75·4–82·5) in the no ADT group and 80·4% (76·6–83·6) in the short-course ADT group. Toxicity of grade 3 or higher was reported for 121 (17%) of 737 participants in the no ADT group and 100 (14%) of 743 in the short-course ADT group (p=0·15), with no treatment-related deaths.
    Interpretation
    Metastatic disease is uncommon following postoperative bed radiotherapy after radical prostatectomy. Adding 6 months of ADT to this radiotherapy did not improve metastasis-free survival compared with no ADT. These findings do not support the use of short-course ADT with postoperative radiotherapy in this patient population.

    Uncertainty in phosphorus fluxes and budgets across the US long-term agroecosystem research network

    Phosphorus (P) budgets can be useful tools for understanding nutrient cycling and quantifying the effectiveness of nutrient management planning and policies; however, uncertainties in agricultural nutrient budgets are not often quantitatively assessed. The objective of this study was to evaluate uncertainty in P fluxes (fertilizer/manure application, atmospheric deposition, irrigation, crop removal, surface runoff, and leachate) and the propagation of these uncertainties to annual P budgets. Data from 56 cropping systems in the P-FLUX database, which spans diverse rotations and landscapes across the United States and Canada, were evaluated. Results showed that across cropping systems, the average annual P budget was 22.4 kg P ha−1 (range = −32.7 to 340.6 kg P ha−1), with an average uncertainty of 13.1 kg P ha−1 (range = 1.0–87.1 kg P ha−1). Fertilizer/manure application and crop removal were the largest P fluxes across cropping systems and, as a result, accounted for the largest fractions of uncertainty in annual budgets (61% and 37%, respectively). The remaining fluxes individually accounted for <2% of the budget uncertainty. Uncertainties were large enough that determining whether P was increasing, decreasing, or not changing was inconclusive in 39% of the budgets evaluated. These findings indicate that more careful and/or direct measurements of inputs, outputs, and stocks are needed. Recommendations for minimizing uncertainty in P budgets were developed based on the results of the study. Quantifying, communicating, and constraining uncertainty in budgets among production systems and across multiple geographies is critical for engaging stakeholders, developing local and national strategies for P reduction, and informing policy.
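A P budget of this kind sums input fluxes minus output fluxes, and when flux uncertainties are independent, the budget uncertainty combines in quadrature, which is why the two largest fluxes dominate the total. A sketch of that propagation with illustrative numbers (the flux values and standard deviations below are hypothetical, not taken from P-FLUX):

```python
import math

# Annual P budget = sum of fluxes (inputs positive, outputs negative).
# Each entry is (value, uncertainty) in kg P/ha; all numbers hypothetical.
fluxes_kg_p_ha = {
    "fertilizer_manure": (+30.0, 8.0),
    "atmospheric_dep":   (+0.3,  0.1),
    "irrigation":        (+0.2,  0.1),
    "crop_removal":      (-20.0, 5.0),
    "runoff":            (-0.5,  0.2),
    "leachate":          (-0.1,  0.05),
}

budget = sum(v for v, _ in fluxes_kg_p_ha.values())
# Independent uncertainties add in quadrature
uncertainty = math.sqrt(sum(u**2 for _, u in fluxes_kg_p_ha.values()))
print(f"budget = {budget:+.1f} +/- {uncertainty:.1f} kg P/ha")

# Share of total variance per flux shows which measurements matter most
for name, (_, u) in fluxes_kg_p_ha.items():
    print(f"{name:18s} {u**2 / uncertainty**2:.0%} of variance")
```

With these made-up numbers, the fertilizer/manure and crop-removal terms account for nearly all of the variance, mirroring the study's finding that the two largest fluxes drive budget uncertainty.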