Evidence for the efficacy of pre-harvest agricultural practices in mitigating food-safety risks to fresh produce in North America
Consumption of contaminated produce remains a leading cause of foodborne illness. Increasingly, growers are altering agricultural practices and farm environments to manage food-safety hazards, but these changes often result in substantial economic, social, and environmental costs. Here, we present a comprehensive evidence synthesis evaluating the efficacy of soil, non-crop vegetation, animal, landscape, and irrigation water management strategies aimed at reducing produce-safety risk in North America. We systematically summarized findings from 78 peer-reviewed papers on the effect of 21 management practices on the prevalence, abundance, or survival of four foodborne pathogens (i.e., E. coli, Salmonella spp., Listeria spp., and Campylobacter spp.), resulting in 113 summaries. We then organized a 30-member expert panel, which used these summaries to evaluate the impact of each practice on food-safety outcomes. While more than half of the practices were too understudied to confidently evaluate their impact on food safety, the panel did identify several practices that were associated with reduced preharvest food-safety risks, including not using raw manure, separating crop and livestock production, and choosing low-risk irrigation sources. The panel also identified practices that appear ineffective at reducing food-safety risks, such as the removal of non-crop vegetation. Overall, these findings provide insights into the food-safety impacts of agricultural and land management practices that growers, auditors, and extension personnel can use to co-manage produce preharvest environments for food safety and other aims. Funding for this research was made possible by the Center for Produce Safety (Grant #2019CPS03), by the U.S. Department of Agriculture's (USDA) Agricultural Marketing Service through grant USDA-AMS-TM-SCBG-G-18-003, and by the USDA Agricultural Research Service, National Program 104: Food Safety (animal and plant products). Peer reviewed.
Risk factors associated with the prevalence of Shiga-toxin-producing Escherichia coli in manured soils on certified organic farms in four regions of the USA
Introduction: Biological soil amendments of animal origin (BSAAO), including untreated amendments, are often used to improve soil fertility and are particularly important in organic agriculture. However, application of untreated manure on cropland can potentially introduce foodborne pathogens into the soil and onto produce. Certified organic farms follow the USDA National Organic Program (NOP) standards, which stipulate a 90- or 120-day interval between application of untreated manure and crop harvest, depending on whether the edible portion of the crop directly contacts the soil. This time-interval metric is based on environmental factors and does not consider a multitude of factors that might affect the survival of the main pathogens of concern. The objective of this study was to assess predictors for the prevalence of Shiga-toxin-producing Escherichia coli (non-O157 STEC) in soils amended with untreated manure on USDA-NOP certified farms. Methods: A longitudinal, multi-regional study was conducted on 19 farms in four USA regions for two growing seasons (2017–2018). Untreated manure (cattle, horse, and poultry), soil, and irrigation water samples were collected and enrichment cultured for non-O157 STEC. Mixed effects logistic regression models were used to analyze the predictors of non-O157 STEC in the soil up to 180 days post-manure application. Results and discussion: Results show that farm management practices (previous use with livestock, presence of animal feces on the field, season of manure application) and soil characteristics (presence of generic E. coli in the soil, soil moisture, sodium) increased the odds of STEC-positive soil samples. Manure application method and snowfall decreased the odds of detecting STEC in the soil. Time-variant predictors (year and sampling day) also affected the presence of STEC. This study shows that a single metric, such as the time interval between application of untreated manure and crop harvest, may not be sufficient to reduce the food safety risks from untreated manure, and additional environmental and farm-management practices should also be considered. These findings are of particular importance because they provide multi-regional baseline data relating to current NOP wait-time standards. They can therefore contribute to the development of strategies to reduce the persistence of pathogens that may contaminate fresh produce, typically eaten raw, grown on NOP-certified farms using untreated manure.
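As a rough illustration of the kind of model described above, the sketch below fits a logistic regression for STEC detection in soil. The data file and variable names are hypothetical, and the farm random effect used in the study's mixed effects models is approximated here by a simple farm indicator; this is a minimal sketch, not the authors' analysis code.

# Illustrative sketch only, not the authors' analysis code.
# Each row of the (hypothetical) file is one soil sample with farm-level
# and sample-level covariates plus a binary STEC detection outcome.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

soil = pd.read_csv("soil_samples.csv")  # hypothetical input file

# Fixed-effects logistic regression; the study used mixed effects logistic
# regression with farm as a grouping factor, approximated here by C(farm).
fit = smf.logit(
    "stec_positive ~ C(farm) + C(application_season) + C(application_method)"
    " + generic_ecoli_present + soil_moisture + sodium + days_since_application",
    data=soil,
).fit()

print(fit.summary())        # coefficients on the log-odds scale
print(np.exp(fit.params))   # odds ratios, as reported in the abstract

An odds ratio above 1 for a covariate such as generic_ecoli_present would correspond to the abstract's finding that generic E. coli in soil increased the odds of an STEC-positive sample.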
Multiple novel prostate cancer susceptibility signals identified by fine-mapping of known risk loci among Europeans
Genome-wide association studies (GWAS) have identified numerous common prostate cancer (PrCa) susceptibility loci. We have fine-mapped 64 GWAS regions known at the conclusion of the iCOGS study using large-scale genotyping and imputation in 25 723 PrCa cases and 26 274 controls of European ancestry. We detected evidence for multiple independent signals at 16 regions, 12 of which contained additional newly identified significant associations. A single signal comprising a spectrum of correlated variation was observed at 39 regions, 35 of which are now described by a novel, more significantly associated lead SNP, while the originally reported variant remained the lead SNP in only four regions. We also confirmed two association signals in Europeans that had previously been reported only in East Asian GWAS. Based on statistical evidence and linkage disequilibrium (LD) structure, we have curated and narrowed down the list of the most likely candidate causal variants for each region. Functional annotation using data from ENCODE filtered for PrCa cell lines and eQTL analysis demonstrated significant enrichment for overlap with bio-features within this set. By incorporating the novel risk variants identified here alongside the refined data for existing association signals, we estimate that these loci now explain ∼38.9% of the familial relative risk of PrCa, an improvement of 8.9 percentage points over the previously reported GWAS tag SNPs. This suggests that a significant fraction of the heritability of PrCa may have been hidden during the discovery phase of GWAS, in particular due to the presence of multiple independent signals within the same region.
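As background to the familial relative risk (FRR) figures quoted above, the fraction of FRR explained by a set of independent risk SNPs is commonly estimated with the relations below. The abstract does not spell out the exact calculation used, so this is shown only as the standard approximation, with the overall first-degree FRR of prostate cancer, \(\lambda_0\), often taken to be about 2.

\[ \lambda_k = \frac{p_k r_k^2 + q_k}{(p_k r_k + q_k)^2}, \qquad \text{FRR explained} \approx \frac{\sum_k \ln \lambda_k}{\ln \lambda_0}, \]

where \(p_k\) is the risk-allele frequency, \(q_k = 1 - p_k\), and \(r_k\) the per-allele relative risk of SNP \(k\). On this reading, the quoted improvement of 8.9 percentage points implies that the previously reported tag SNPs explained roughly 30% of the FRR (38.9% minus 8.9%).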
Adding 6 months of androgen deprivation therapy to postoperative radiotherapy for prostate cancer: a comparison of short-course versus no androgen deprivation therapy in the RADICALS-HD randomised controlled trial
Background
Previous evidence indicates that adjuvant, short-course androgen deprivation therapy (ADT) improves metastasis-free survival when given with primary radiotherapy for intermediate-risk and high-risk localised prostate cancer. However, the value of ADT with postoperative radiotherapy after radical prostatectomy is unclear.
Methods
RADICALS-HD was an international randomised controlled trial to test the efficacy of ADT used in combination with postoperative radiotherapy for prostate cancer. Key eligibility criteria were indication for radiotherapy after radical prostatectomy for prostate cancer, prostate-specific antigen less than 5 ng/mL, absence of metastatic disease, and written consent. Participants were randomly assigned (1:1) to radiotherapy alone (no ADT) or radiotherapy with 6 months of ADT (short-course ADT), using monthly subcutaneous gonadotropin-releasing hormone analogue injections, daily oral bicalutamide monotherapy 150 mg, or monthly subcutaneous degarelix. Randomisation was done centrally through minimisation with a random element, stratified by Gleason score, positive margins, radiotherapy timing, planned radiotherapy schedule, and planned type of ADT, in a computerised system. The allocated treatment was not masked. The primary outcome measure was metastasis-free survival, defined as distant metastasis arising from prostate cancer or death from any cause. Standard survival analysis methods were used, accounting for randomisation stratification factors. The trial had 80% power with two-sided α of 5% to detect an absolute increase in 10-year metastasis-free survival from 80% to 86% (hazard ratio [HR] 0·67). Analyses followed the intention-to-treat principle. The trial is registered with the ISRCTN registry, ISRCTN40814031, and ClinicalTrials.gov, NCT00541047.
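For readers checking the design assumptions, the target hazard ratio follows from the stated 10-year survival proportions under a proportional-hazards approximation; this is a standard back-of-the-envelope relation, not a statement of the trial's exact sample-size method.

\[ \mathrm{HR} \approx \frac{\ln S_1(10)}{\ln S_0(10)} = \frac{\ln 0.86}{\ln 0.80} \approx 0.68, \]

consistent with the protocol's quoted HR of 0.67, where \(S_0(10)\) and \(S_1(10)\) are the assumed 10-year metastasis-free survival proportions without and with short-course ADT.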
Findings
Between Nov 22, 2007, and June 29, 2015, 1480 patients (median age 66 years [IQR 61–69]) were randomly assigned to receive no ADT (n=737) or short-course ADT (n=743) in addition to postoperative radiotherapy at 121 centres in Canada, Denmark, Ireland, and the UK. With a median follow-up of 9·0 years (IQR 7·1–10·1), metastasis-free survival events were reported for 268 participants (142 in the no ADT group and 126 in the short-course ADT group; HR 0·886 [95% CI 0·688–1·140], p=0·35). 10-year metastasis-free survival was 79·2% (95% CI 75·4–82·5) in the no ADT group and 80·4% (76·6–83·6) in the short-course ADT group. Toxicity of grade 3 or higher was reported for 121 (17%) of 737 participants in the no ADT group and 100 (14%) of 743 in the short-course ADT group (p=0·15), with no treatment-related deaths.
Interpretation
Metastatic disease is uncommon following postoperative bed radiotherapy after radical prostatectomy. Adding 6 months of ADT to this radiotherapy did not improve metastasis-free survival compared with no ADT. These findings do not support the use of short-course ADT with postoperative radiotherapy in this patient population.
Duration of androgen deprivation therapy with postoperative radiotherapy for prostate cancer: a comparison of long-course versus short-course androgen deprivation therapy in the RADICALS-HD randomised trial
Background
Previous evidence supports androgen deprivation therapy (ADT) with primary radiotherapy as initial treatment for intermediate-risk and high-risk localised prostate cancer. However, the use and optimal duration of ADT with postoperative radiotherapy after radical prostatectomy remains uncertain.
Methods
RADICALS-HD was a randomised controlled trial of ADT duration within the RADICALS protocol. Here, we report on the comparison of short-course versus long-course ADT. Key eligibility criteria were indication for radiotherapy after previous radical prostatectomy for prostate cancer, prostate-specific antigen less than 5 ng/mL, absence of metastatic disease, and written consent. Participants were randomly assigned (1:1) to add 6 months of ADT (short-course ADT) or 24 months of ADT (long-course ADT) to radiotherapy, using subcutaneous gonadotrophin-releasing hormone analogue (monthly in the short-course ADT group and 3-monthly in the long-course ADT group), daily oral bicalutamide monotherapy 150 mg, or monthly subcutaneous degarelix. Randomisation was done centrally through minimisation with a random element, stratified by Gleason score, positive margins, radiotherapy timing, planned radiotherapy schedule, and planned type of ADT, in a computerised system. The allocated treatment was not masked. The primary outcome measure was metastasis-free survival, defined as metastasis arising from prostate cancer or death from any cause. The comparison had more than 80% power with two-sided α of 5% to detect an absolute increase in 10-year metastasis-free survival from 75% to 81% (hazard ratio [HR] 0·72). Standard time-to-event analyses were used. Analyses followed the intention-to-treat principle. The trial is registered with the ISRCTN registry, ISRCTN40814031, and ClinicalTrials.gov, NCT00541047.
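As a sketch of how the stated power translates into a required number of events, the snippet below applies Schoenfeld's standard approximation for a two-arm log-rank comparison. It is a generic illustration under an assumed 1:1 allocation, not necessarily the trial's own sample-size calculation.

# Schoenfeld approximation for the number of events needed to detect a
# target hazard ratio with a log-rank test (illustrative, not trial code).
import math
from scipy.stats import norm

def events_required(hr, alpha=0.05, power=0.80, allocation=0.5):
    z_alpha = norm.ppf(1 - alpha / 2)   # two-sided significance level
    z_beta = norm.ppf(power)            # desired power
    return (z_alpha + z_beta) ** 2 / (allocation * (1 - allocation) * math.log(hr) ** 2)

print(round(events_required(0.72)))  # about 291 events for HR 0.72 at 80% power

The 313 metastasis-free survival events eventually reported below exceed this threshold, in line with the statement that the comparison had more than 80% power for an HR of 0.72.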
Findings
Between Jan 30, 2008, and July 7, 2015, 1523 patients (median age 65 years, IQR 60–69) were randomly assigned to receive short-course ADT (n=761) or long-course ADT (n=762) in addition to postoperative radiotherapy at 138 centres in Canada, Denmark, Ireland, and the UK. With a median follow-up of 8·9 years (7·0–10·0), 313 metastasis-free survival events were reported overall (174 in the short-course ADT group and 139 in the long-course ADT group; HR 0·773 [95% CI 0·612–0·975]; p=0·029). 10-year metastasis-free survival was 71·9% (95% CI 67·6–75·7) in the short-course ADT group and 78·1% (74·2–81·5) in the long-course ADT group. Toxicity of grade 3 or higher was reported for 105 (14%) of 753 participants in the short-course ADT group and 142 (19%) of 757 participants in the long-course ADT group (p=0·025), with no treatment-related deaths.
Interpretation
Compared with adding 6 months of ADT, adding 24 months of ADT improved metastasis-free survival in people receiving postoperative radiotherapy. For individuals who can accept the additional duration of adverse effects, long-course ADT should be offered with postoperative radiotherapy.
Funding
Cancer Research UK, UK Research and Innovation (formerly Medical Research Council), and Canadian Cancer Society
Defining the Impact of Family History on Detection of High-grade Prostate Cancer in a Large Multi-institutional Cohort
BACKGROUND
The risk of high-grade prostate cancer, given a family history of cancer, has been described in the general population, but not among men selected for prostate biopsy in an international cohort.
OBJECTIVE
To estimate the risk of high-grade prostate cancer on biopsy based on a family history of cancer.
DESIGN, SETTING, AND PARTICIPANTS
This is a multicenter study of men undergoing prostate biopsy from 2006 to 2019, including 12 sites in North America and Europe. All sites recorded first-degree prostate cancer family histories; four included more detailed data on the number of affected relatives, second-degree relatives with prostate cancer, and breast cancer family history.
OUTCOME MEASUREMENTS AND STATISTICAL ANALYSIS
Multivariable logistic regressions evaluated odds of high-grade (Gleason grade group ≥2) prostate cancer. Separate models were fit for family history definitions, including first- and second-degree prostate cancer and breast cancer family histories.
RESULTS AND LIMITATIONS
A first-degree prostate cancer family history was available for 15 799 men, with a more detailed family history for 4617 (median age 65 yr, both cohorts). Adjusted odds of high-grade prostate cancer were 1.77 times greater (95% confidence interval [CI] 1.57-2.00, p < 0.001, risk ratio [RR] = 1.40) with first-degree prostate cancer, 1.38 (95% CI 1.07-1.77, p = 0.011, RR = 1.22) for second-degree prostate cancer, and 1.30 (95% CI 1.01-1.67, p = 0.040, RR = 1.18) for first-degree breast cancer family histories. Interaction terms revealed that the effect of a family history did not differ based on prostate-specific antigen but differed based on age. This study is limited by missing data on race and prior negative biopsy.
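The results above report both odds ratios and risk ratios. The abstract does not state how the risk ratios were derived; one common conversion (Zhang and Yu) relates the two through the baseline risk \(p_0\) in the unexposed group and is shown here purely for orientation, as an assumption rather than the authors' stated method.

\[ \mathrm{RR} = \frac{\mathrm{OR}}{1 - p_0 + p_0 \cdot \mathrm{OR}} \]

For example, OR = 1.77 and RR = 1.40 are mutually consistent under this formula when \(p_0 \approx 0.34\), i.e., roughly a one-in-three baseline probability of high-grade cancer on biopsy.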
CONCLUSIONS
Men with indications for biopsy and a family history of prostate or breast cancer can be counseled that they have a moderately increased risk of high-grade prostate cancer, independent of other risk factors.
PATIENT SUMMARY
In a large international series of men selected for prostate biopsy, high-grade prostate cancer was found more often in men with a family history of prostate or breast cancer.
Survival and Persistence of Foodborne Pathogens in Manure-Amended Soils and Prevalence on Fresh Produce in Certified Organic Farms: A Multi-Regional Baseline Analysis
Biological soil amendments of animal origin (BSAAOs), including untreated (e.g., raw or aged manure, or incompletely composted manure) and treated animal products (e.g., compost), are used for crop production and as part of soil health management. Application of BSAAOs must be done cautiously, as raw manure commonly contains enteric foodborne pathogens that can potentially contaminate edible produce that may be consumed without cooking. USDA National Organic Program (NOP) certified production systems follow the 90- or 120-day interval standards between application of untreated BSAAOs and crop harvest, depending on whether the edible portions of the crops are in indirect or direct contact with the soil, respectively. This study was conducted to evaluate the survival of four foodborne pathogens in soils amended with BSAAOs and to examine the potential for bacterial transfer to fresh produce harvested from 19 USDA NOP certified organic farms in four states. Only 0.4% (2/527) of produce samples were positive for L. monocytogenes. Among the untreated manure and compost samples, 18.0% (42/233) were positive for at least one of the tested, culturable bacterial foodborne pathogens. The prevalence of non-O157 STEC and Salmonella in untreated manure was substantially greater than that of E. coli O157:H7 and L. monocytogenes. Of the 2,461 soil samples analyzed in this study, 12.9% (318) were positive for at least one pathogen. In soil amended with untreated manure, the prevalence of non-O157 STEC [7.7% (190)] and L. monocytogenes [5.0% (122)] was greater than that of Salmonella [1.1% (26)] or E. coli O157:H7 [0.04% (1)]. Foodborne pathogen prevalence in the soil peaked after manure application and decreased significantly 30 days post-application (dpa). However, non-O157 STEC and L. monocytogenes were recovered from soil samples after 90 and 120 dpa. Results indicate that produce contamination by the tested foodborne pathogens was infrequent, but these data should not be generalized beyond the specific wait-time regulations for organic crop production and the farms studied. Moreover, other sources of contamination (e.g., irrigation, wildlife, environmental conditions, cropping and management practices) should be considered. This study also provides multi-regional baseline data relating to current NOP application intervals and the development of potential risk-mitigation strategies to reduce pathogen persistence in soils amended with BSAAOs. These findings contribute to filling critical data gaps concerning the occurrence of fecal pathogens in NOP-certified farming systems used for production of fresh produce in different US regions.
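To put the headline prevalence figures above on a common footing, the short sketch below recomputes them with Wilson 95% confidence intervals. The counts come from the abstract, but the intervals are an added illustration and are not values reported by the authors.

# Prevalence estimates with Wilson 95% confidence intervals (illustrative).
from statsmodels.stats.proportion import proportion_confint

counts = [
    ("L. monocytogenes on produce", 2, 527),
    ("any tested pathogen in untreated manure or compost", 42, 233),
    ("any tested pathogen in amended soil", 318, 2461),
]
for label, positive, total in counts:
    low, high = proportion_confint(positive, total, alpha=0.05, method="wilson")
    print(f"{label}: {100 * positive / total:.1f}% "
          f"(95% CI {100 * low:.1f}-{100 * high:.1f}%)")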
Data_Sheet_2_Risk factors associated with the prevalence of Listeria monocytogenes in manured soils on certified organic farms in four regions of the United States.docx
Introduction: Biological soil amendments, including raw or untreated manure, are currently used to improve soil fertility, especially in organic operations that prohibit the use of synthetic fertilizers. However, addition of untreated manure may pose a risk of contamination of fresh produce by pathogens of public health significance, including Listeria monocytogenes. Organic growers follow United States Department of Agriculture (USDA) National Organic Program regulations for raw manure use, which stipulate that harvest should commence no earlier than 90 or 120 days post-application, depending on direct contact between the edible portion of the produce and the soil. To inform the protection that such time intervals provide, this study explored the farm-level risk factors associated with L. monocytogenes prevalence in USDA-certified organic farm soils amended with untreated manures. Methods: A longitudinal, multi-regional study was conducted on 19 farms in four states (California, Minnesota, Maine, and Maryland) over two growing seasons (2017 and 2018). Untreated manure, soil, irrigation water, and produce samples were collected and cultured for L. monocytogenes. Mixed effect logistic regression was used to investigate risk factors associated with L. monocytogenes prevalence in soil. Results and Discussion: Results showed that multiple factors influenced the odds of a soil-positive sample, including temporal [year (OR = 0.19), sampling day (OR = 0.09–0.48)] and weather-related [temperature range (OR = 0.48)] variables, manure characteristics [season of application (OR = 0.04, summer), presence of L. monocytogenes (OR = 2.89) and other pathogens in manure (OR = 5.24)], farm management factors [water source (OR = 2.73, mixed), number of year-round staff (OR = 0.02)], and soil characteristics [concentration of generic Escherichia coli (OR = 1.45), moisture (OR = 0.46), organic matter (OR = 7.30), nitrate (OR = 3.07), potassium (OR = 0.09), and calcium (OR = 2.48)]. This study highlights the complexity of L. monocytogenes prevalence in soil and contributes science-based metrics that may be used when determining risk-mitigation strategies for pathogen contamination.
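For readers less familiar with the modelling framework used here and in the companion STEC analysis, a random-intercept logistic model with farm as the grouping level can be written as below; the authors' exact covariate set and random-effect structure may differ, so this is only a schematic.

\[ \operatorname{logit}\,\Pr(Y_{ij} = 1) = \beta_0 + \boldsymbol{\beta}^{\top}\mathbf{x}_{ij} + u_i, \qquad u_i \sim \mathcal{N}(0, \sigma_u^2), \]

where \(Y_{ij}\) indicates L. monocytogenes detection in sample \(j\) from farm \(i\), \(\mathbf{x}_{ij}\) collects the covariates (season of application, soil moisture, water source, and so on), and each reported odds ratio is \(e^{\beta}\) for the corresponding covariate.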