Recommended from our members
Chemically Enhanced Treatment Wetland to Improve Water Quality and Mitigate Land Subsidence in the Sacramento‒San Joaquin Delta: Cost and Design Considerations
Water quality impairment and land surface subsidence threaten the viability of the Sacramento–San Joaquin Delta (Delta), a critical component of California’s water conveyance system. Current-day irrigation drainage through Delta island peat soils affects drinking water treatment and is linked to mercury transport, potentially posing both ecological and public health concerns. To cost-effectively treat agricultural drainage water from subsided Delta islands, reducing the export of drinking water quality constituents of concern and mitigating land subsidence through accretion, we studied hybrid coagulation-treatment wetland systems, termed Chemically Enhanced Treatment Wetlands (CETWs). We provide cost estimates and design recommendations to aid broader implementation of this technology. Over a 20-year horizon, using a Total Annualized Cost analysis, we estimate treatment costs of $747 per acre-foot (ac-ft) of water treated and $70 per kg of dissolved organic carbon (DOC) removed for a small 3-acre CETW system, depending upon source water DOC concentrations. For larger CETW systems scaled for island sizes of 3,500 to 14,000 acres, costs decrease to $239 per ac-ft of water treated and $14 per kg of DOC removed. We estimated the footprints of CETW systems to be approximately 3% of the area being treated for 4-day hydraulic retention time (HRT) systems, decreasing to less than 1% for 1-day HRT systems. CETWs ultimately address several of the Delta’s key internal issues while keeping water treatment costs competitive, on a per-carbon-removed basis, with other currently available treatment technologies at similar scales. CETWs offer a reliable system to reduce outgoing DOC and mercury loads, with the additional benefit of sediment accretion. System costs and treatment efficacy depend significantly on inflow source water conditions, land availability, and other practical matters. To keep costs low and removal efficacy high, wetland design features will need site-specific evaluation.
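The footprint figures above follow from the basic sizing relation for a flow-through wetland: required area equals inflow times hydraulic retention time divided by operating depth, so footprint scales linearly with HRT. A minimal sketch, using an illustrative unit drainage rate and operating depth (these values are assumptions for demonstration, not parameters from the study):

```python
# Hedged sketch: wetland footprint as a fraction of the drained area.
# Sizing relation: A_wetland = Q * HRT / depth, so footprint scales
# linearly with hydraulic retention time, consistent with the reported
# ~3% (4-day HRT) vs. <1% (1-day HRT) footprints.
# The drainage rate and depth below are illustrative assumptions.

def footprint_fraction(drainage_ft_per_day, hrt_days, wetland_depth_ft):
    """Wetland area / drained area for a given HRT."""
    # Per acre drained, daily flow is drainage_ft_per_day ac-ft/day;
    # required wetland volume is flow * HRT, and area = volume / depth.
    return drainage_ft_per_day * hrt_days / wetland_depth_ft

# Illustrative: ~0.01 ft/day unit drainage, 1.5 ft operating depth
for hrt in (1, 4):
    frac = footprint_fraction(0.01, hrt, 1.5)
    print(f"HRT {hrt} d -> footprint ~{frac:.1%} of drained area")
```

With these assumed inputs the 4-day HRT case lands near 3% and the 1-day case under 1%, matching the scaling reported in the abstract.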
Widespread exon skipping triggers degradation by nuclear RNA surveillance in fission yeast
Exon skipping is considered a principal mechanism by which eukaryotic cells expand their transcriptome and proteome repertoires, creating different splice variants with distinct cellular functions. Here we analyze RNA-seq data from 116 transcriptomes in fission yeast (Schizosaccharomyces pombe), covering multiple physiological conditions as well as transcriptional and RNA processing mutants. We applied brute-force algorithms to detect all possible exon-skipping events, which were widespread but rare compared to normal splicing events. Exon-skipping events increased in cells deficient for the nuclear exosome or the 5'-3' exonuclease Dhp1, and also at late stages of meiotic differentiation when nuclear-exosome transcripts decreased. The pervasive exon-skipping transcripts were stochastic, did not increase in specific physiological conditions, and were mostly present at less than one copy per cell, even in the absence of nuclear RNA surveillance and during late meiosis. These exon-skipping transcripts are therefore unlikely to be functional and may reflect splicing errors that are actively removed by nuclear RNA surveillance. The average splicing rate by exon skipping was ∼0.24% in wild type and ∼1.75% in nuclear exonuclease mutants. We also detected approximately 250 circular RNAs derived from single or multiple exons. These circular RNAs were rare and stochastic, although a few became stabilized during quiescence and in splicing mutants. Using an exhaustive search algorithm, we also uncovered thousands of previously unknown splice sites, indicating pervasive splicing; yet most of these splicing variants were cryptic and increased in nuclear degradation mutants. This study highlights widespread but low-frequency alternative or aberrant splicing events that are targeted by nuclear RNA surveillance.
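The brute-force idea described above can be illustrated with a toy sketch: an exon-skipping junction joins the donor site of exon i to the acceptor site of exon j with j > i + 1, bypassing at least one annotated internal exon. This is a simplified illustration of that enumeration, not the study's actual pipeline; the coordinates and data structures are hypothetical:

```python
# Hedged sketch of a brute-force exon-skipping scan: enumerate every
# donor/acceptor pair that bypasses >=1 annotated internal exon, then
# match observed splice junctions against those candidates.

def candidate_skipping_junctions(exons):
    """All donor/acceptor pairs that skip at least one exon.

    exons: list of (start, end) tuples, ordered along the transcript.
    """
    junctions = []
    for i in range(len(exons)):
        for j in range(i + 2, len(exons)):  # j > i+1 skips exon(s) in between
            junctions.append((exons[i][1], exons[j][0]))
    return junctions

def count_skipping_reads(read_junctions, exons):
    """Count observed splice junctions that match a skipping candidate."""
    candidates = set(candidate_skipping_junctions(exons))
    return sum(1 for jn in read_junctions if jn in candidates)

# Toy 3-exon gene: skipping exon 2 gives the junction (200, 500)
exons = [(100, 200), (300, 400), (500, 600)]
print(candidate_skipping_junctions(exons))  # [(200, 500)]
```

Because every non-adjacent exon pair is enumerated, the search is exhaustive in the same spirit as the abstract's "all possible exon-skipping events", at O(n²) candidates per gene.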
Adding 6 months of androgen deprivation therapy to postoperative radiotherapy for prostate cancer: a comparison of short-course versus no androgen deprivation therapy in the RADICALS-HD randomised controlled trial
Background
Previous evidence indicates that adjuvant, short-course androgen deprivation therapy (ADT) improves metastasis-free survival when given with primary radiotherapy for intermediate-risk and high-risk localised prostate cancer. However, the value of ADT with postoperative radiotherapy after radical prostatectomy is unclear.
Methods
RADICALS-HD was an international randomised controlled trial to test the efficacy of ADT used in combination with postoperative radiotherapy for prostate cancer. Key eligibility criteria were indication for radiotherapy after radical prostatectomy for prostate cancer, prostate-specific antigen less than 5 ng/mL, absence of metastatic disease, and written consent. Participants were randomly assigned (1:1) to radiotherapy alone (no ADT) or radiotherapy with 6 months of ADT (short-course ADT), using monthly subcutaneous gonadotropin-releasing hormone analogue injections, daily oral bicalutamide monotherapy 150 mg, or monthly subcutaneous degarelix. Randomisation was done centrally through minimisation with a random element, stratified by Gleason score, positive margins, radiotherapy timing, planned radiotherapy schedule, and planned type of ADT, in a computerised system. The allocated treatment was not masked. The primary outcome measure was metastasis-free survival, defined as distant metastasis arising from prostate cancer or death from any cause. Standard survival analysis methods were used, accounting for randomisation stratification factors. The trial had 80% power with two-sided α of 5% to detect an absolute increase in 10-year metastasis-free survival from 80% to 86% (hazard ratio [HR] 0·67). Analyses followed the intention-to-treat principle. The trial is registered with the ISRCTN registry, ISRCTN40814031, and ClinicalTrials.gov, NCT00541047.
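The design parameters quoted above (80% power, two-sided α of 5%, 10-year metastasis-free survival improving from 80% to 86%, HR 0.67) can be sanity-checked with the standard Schoenfeld events formula for a 1:1 trial. This is a simplified back-of-envelope check, not the trial's actual design calculation, which may have used different methods:

```python
# Hedged check of the stated design parameters using the Schoenfeld
# approximation: required events d = 4*(z_{a/2} + z_b)^2 / (ln HR)^2
# for 1:1 allocation. A sketch, not the trial's own calculation.
from math import log
from statistics import NormalDist

def hr_from_survival(s_control, s_experimental):
    """HR implied by two survival proportions under proportional hazards."""
    return log(s_experimental) / log(s_control)

def required_events(hr, alpha=0.05, power=0.80):
    """Events needed to detect hazard ratio `hr` with given power."""
    z = NormalDist().inv_cdf
    return 4 * (z(1 - alpha / 2) + z(power)) ** 2 / log(hr) ** 2

hr = hr_from_survival(0.80, 0.86)   # ~0.68, close to the stated HR 0.67
events = required_events(hr)        # roughly 200 events for 80% power
print(f"implied HR {hr:.2f}, ~{events:.0f} events needed")
```

Note that ln(0.86)/ln(0.80) ≈ 0.68, consistent with the HR of 0.67 quoted for the 80% → 86% survival improvement.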
Findings
Between Nov 22, 2007, and June 29, 2015, 1480 patients (median age 66 years [IQR 61–69]) were randomly assigned to receive no ADT (n=737) or short-course ADT (n=743) in addition to postoperative radiotherapy at 121 centres in Canada, Denmark, Ireland, and the UK. With a median follow-up of 9·0 years (IQR 7·1–10·1), metastasis-free survival events were reported for 268 participants (142 in the no ADT group and 126 in the short-course ADT group; HR 0·886 [95% CI 0·688–1·140], p=0·35). 10-year metastasis-free survival was 79·2% (95% CI 75·4–82·5) in the no ADT group and 80·4% (76·6–83·6) in the short-course ADT group. Toxicity of grade 3 or higher was reported for 121 (17%) of 737 participants in the no ADT group and 100 (14%) of 743 in the short-course ADT group (p=0·15), with no treatment-related deaths.
Interpretation
Metastatic disease is uncommon following postoperative prostate-bed radiotherapy after radical prostatectomy. Adding 6 months of ADT to this radiotherapy did not improve metastasis-free survival compared with no ADT. These findings do not support the use of short-course ADT with postoperative radiotherapy in this patient population.
Duration of androgen deprivation therapy with postoperative radiotherapy for prostate cancer: a comparison of long-course versus short-course androgen deprivation therapy in the RADICALS-HD randomised trial
Background
Previous evidence supports androgen deprivation therapy (ADT) with primary radiotherapy as initial treatment for intermediate-risk and high-risk localised prostate cancer. However, the use and optimal duration of ADT with postoperative radiotherapy after radical prostatectomy remains uncertain.
Methods
RADICALS-HD was a randomised controlled trial of ADT duration within the RADICALS protocol. Here, we report on the comparison of short-course versus long-course ADT. Key eligibility criteria were indication for radiotherapy after previous radical prostatectomy for prostate cancer, prostate-specific antigen less than 5 ng/mL, absence of metastatic disease, and written consent. Participants were randomly assigned (1:1) to add 6 months of ADT (short-course ADT) or 24 months of ADT (long-course ADT) to radiotherapy, using subcutaneous gonadotrophin-releasing hormone analogue (monthly in the short-course ADT group and 3-monthly in the long-course ADT group), daily oral bicalutamide monotherapy 150 mg, or monthly subcutaneous degarelix. Randomisation was done centrally through minimisation with a random element, stratified by Gleason score, positive margins, radiotherapy timing, planned radiotherapy schedule, and planned type of ADT, in a computerised system. The allocated treatment was not masked. The primary outcome measure was metastasis-free survival, defined as metastasis arising from prostate cancer or death from any cause. The comparison had more than 80% power with two-sided α of 5% to detect an absolute increase in 10-year metastasis-free survival from 75% to 81% (hazard ratio [HR] 0·72). Standard time-to-event analyses were used. Analyses followed the intention-to-treat principle. The trial is registered with the ISRCTN registry, ISRCTN40814031, and ClinicalTrials.gov, NCT00541047.
Findings
Between Jan 30, 2008, and July 7, 2015, 1523 patients (median age 65 years [IQR 60–69]) were randomly assigned to receive short-course ADT (n=761) or long-course ADT (n=762) in addition to postoperative radiotherapy at 138 centres in Canada, Denmark, Ireland, and the UK. With a median follow-up of 8·9 years (IQR 7·0–10·0), 313 metastasis-free survival events were reported overall (174 in the short-course ADT group and 139 in the long-course ADT group; HR 0·773 [95% CI 0·612–0·975]; p=0·029). 10-year metastasis-free survival was 71·9% (95% CI 67·6–75·7) in the short-course ADT group and 78·1% (74·2–81·5) in the long-course ADT group. Toxicity of grade 3 or higher was reported for 105 (14%) of 753 participants in the short-course ADT group and 142 (19%) of 757 participants in the long-course ADT group (p=0·025), with no treatment-related deaths.
Interpretation
Compared with adding 6 months of ADT, adding 24 months of ADT improved metastasis-free survival in people receiving postoperative radiotherapy. For individuals who can accept the additional duration of adverse effects, long-course ADT should be offered with postoperative radiotherapy.
Funding
Cancer Research UK, UK Research and Innovation (formerly Medical Research Council), and Canadian Cancer Society
Evolution of Arability and Land Use, Sacramento–San Joaquin Delta, California
We used available data to estimate changes in land use and wet, non-farmable, and marginally farmable (WNMF) areas in the Delta from 1984 to 2012, and developed a conceptual model for processes that affect the changes observed. We analyzed aerial photography, groundwater levels, land–surface elevation data, well and boring logs, and surface water elevations. We used estimates for sea level rise and future subsidence to assess future vulnerability for the development of WNMF areas. The cumulative WNMF area increased linearly about 10-fold, from about 274 hectares (ha) in 1984 to about 2,800 ha in 2012. Moreover, several islands have experienced land use changes associated with reduced ability to drain the land. These changes have occurred primarily in the western and central Delta, where organic soils have thinned, underlying mud deposits are thin, and drainage ditches have not been maintained. Subsidence is the key process that will contribute to future increased likelihood of WNMF areas by reducing the thickness of organic soils and increasing hydraulic gradients onto the islands. To a lesser extent, sea level rise will also contribute to increased seepage onto islands by increasing groundwater levels in the aquifer under the organic soil and tidal mud, and increasing the hydraulic gradient onto islands from adjacent channels. WNMF areas develop from increased seepage under levees, which is caused by flow paths changing as organic soil thickness has decreased. This process is exacerbated by thin tidal mud deposits. Based primarily on projected reduced organic soil thickness and land–surface elevations, we delineated an additional area of about 3,450 ha that will be vulnerable to reduced arability and increased wetness by 2050.
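The reported trend can be checked with simple arithmetic: a linear increase from about 274 ha in 1984 to about 2,800 ha in 2012 implies a growth rate of roughly 90 ha per year and a cumulative increase of just over 10-fold, consistent with the abstract's characterization. A minimal sketch:

```python
# Hedged arithmetic check of the reported WNMF trend: linear growth
# from ~274 ha (1984) to ~2,800 ha (2012).

def linear_rate(area_start_ha, area_end_ha, year_start, year_end):
    """Average annual change in area (ha/yr) under a linear trend."""
    return (area_end_ha - area_start_ha) / (year_end - year_start)

rate = linear_rate(274, 2800, 1984, 2012)   # ~90 ha/yr
fold_increase = 2800 / 274                  # ~10-fold
print(f"~{rate:.0f} ha/yr, {fold_increase:.1f}-fold increase")
```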
Agricultural managed aquifer recharge — water quality factors to consider
The resilience and productivity of California's agriculture is threatened by groundwater overdraft, reduction in aquifer water quality, increased land subsidence damage to infrastructure and an irreversible reduction in groundwater storage capacity. Intentionally flooding agricultural fields during winter — a practice referred to as agricultural managed aquifer recharge (AgMAR) — can help counteract overdraft. However, the potential for AgMAR to exacerbate nitrate/salt leaching and contamination of at-risk aquifers remains a critical concern. To quantify the risk of groundwater contamination with AgMAR, we took 30-foot-long soil cores at 12 sites in almond orchards, processing tomato fields and wine grape vineyards on low- and high-permeability soils, measured nitrate and total dissolved solids concentrations and calculated stored nitrate-N. Wine grape vineyards on permeable soils had the lowest nitrate leaching risk observed. However, almond orchards and tomato fields could be leveraged for AgMAR if dedicated recharge sites were established and clean surface water used for recharge. Historical land use, current nitrogen management and soil permeability class are the main factors to consider before implementing AgMAR.
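The "stored nitrate-N" calculation mentioned above is conventionally done by converting each depth increment's nitrate-N concentration to a mass per unit area using bulk density and layer thickness, then summing down the core. A hedged sketch of that bookkeeping; the bulk densities and concentrations below are illustrative assumptions, not values from the study:

```python
# Hedged sketch of stored nitrate-N from soil-core data:
# mass per area = concentration * bulk density * layer thickness,
# summed over depth increments. All input values are illustrative.

def stored_nitrate_n(layers):
    """Total nitrate-N (kg/ha) stored in a soil profile.

    layers: list of (thickness_m, bulk_density_kg_m3, nitrate_n_mg_per_kg)
    """
    total_kg_per_m2 = 0.0
    for thickness_m, bulk_density, conc_mg_per_kg in layers:
        # mg N per m2 = (mg N / kg soil) * (kg soil / m3) * m depth;
        # 1e-6 converts mg to kg.
        total_kg_per_m2 += conc_mg_per_kg * bulk_density * thickness_m * 1e-6
    return total_kg_per_m2 * 1e4  # kg/m2 -> kg/ha

# Illustrative 3-layer profile down a core (surface to depth)
profile = [(0.3, 1300, 12.0), (0.7, 1400, 8.0), (2.0, 1500, 4.0)]
print(f"{stored_nitrate_n(profile):.0f} kg nitrate-N per ha")
```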