Distinguishing Capillary Fringe Reflection in a GPR Profile for Precise Water Table Depth Estimation in a Boreal Podzolic Soil Field
Relative permittivity and soil moisture are highly correlated; therefore, the top boundary of saturated soil produces strong reflections in ground-penetrating radar (GPR) profiles. In shallow groundwater systems, the first dominant reflection conventionally comes from the capillary fringe, followed by the actual water table. The objective of this study was to calibrate and validate a site-specific relationship between the GPR-estimated depth to the capillary fringe (DCF) and the measured water table depth (WTDm). Common midpoint (CMP) GPR surveys were carried out to estimate the average radar velocity, and common offset (CO) surveys were carried out to map water table variability in the 2017 and 2018 growing seasons. GPR sampling-volume geometry, together with radar velocities in the different soil layers, was also considered to support the CMP estimates. The regression model (R2 = 0.9778) between DCF and WTDm, developed for the site in 2017, was validated using data from 2018. A regression analysis between DCF and WTDm for the two growing seasons suggested an average capillary height of 0.741 m (R2 = 0.911, n = 16), which is consistent with the existing literature for similar soil conditions. The described method should be developed further over several growing seasons to encompass wider water table variability.
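As an illustrative aside (not the paper's code): the depth of a GPR reflector follows from the two-way travel time and the CMP-derived velocity, and the site calibration amounts to adding the capillary height to the capillary-fringe depth. A minimal Python sketch, with hypothetical numbers except for the 0.741 m capillary height quoted above:

```python
# A minimal sketch (not the study's code) of the two quantities involved:
# reflector depth from two-way travel time, and a site calibration that
# maps GPR depth-to-capillary-fringe (DCF) to water table depth.

def reflector_depth(twt_ns: float, velocity_m_per_ns: float) -> float:
    """Depth of a reflector from two-way travel time (ns) and the
    average radar velocity estimated from a CMP survey (m/ns)."""
    return velocity_m_per_ns * twt_ns / 2.0

def water_table_depth(dcf_m: float, capillary_height_m: float = 0.741) -> float:
    """Site-calibrated water table depth: the capillary-fringe reflection
    sits above the water table by roughly the capillary height (0.741 m
    is the two-season average reported in the abstract)."""
    return dcf_m + capillary_height_m

# Example: a reflection at 28 ns with v = 0.06 m/ns (hypothetical values)
dcf = reflector_depth(28.0, 0.06)        # ~0.84 m to the capillary fringe
print(round(water_table_depth(dcf), 2))  # ~1.58 m to the water table
```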
Soil Moisture Mapping Using Multi-Frequency and Multi-Coil Electromagnetic Induction Sensors on Managed Podzols
Precision agriculture (PA) involves managing agricultural fields using spatially resolved information on soil properties, often derived from apparent electrical conductivity (ECa) measurements. While this approach is gaining much attention in agricultural management, farmed podzolic soils are under-represented in the relevant literature. This study: (i) established the relationship between ECa and soil moisture content (SMC) measured using time domain reflectometry (TDR); and (ii) evaluated SMC estimates derived from ECa measurements obtained with two electromagnetic induction (EMI) sensors, one multi-coil and one multi-frequency, against TDR-measured SMC. Measurements were taken on several plots at Pynn's Brook Research Station, Pasadena, Newfoundland, Canada. Mean ECa values were calculated for each sampling location in each plot. The linear regression models for SMC generated with the CMD-MINIEXPLORER were statistically significant, with a highest R2 of 0.79 and a lowest root mean square error (RMSE) of 0.015 m3 m−3, but those for the GEM-2 were not, with a lowest R2 of 0.17 and an RMSE of 0.045 m3 m−3; this difference reflects the different depths of investigation of the two EMI sensors. In validation of the SMC regression models for the two EMI sensors, the CMD-MINIEXPLORER performed best (R2 = 0.54; prediction RMSE = 0.031 m3 m−3). The results demonstrated that CMD-MINIEXPLORER measurements better predicted shallow SMC, while deeper SMC was better predicted by GEM-2 measurements. In addition, ECa measurements obtained with either multi-coil or multi-frequency sensors have the potential to be successfully employed for SMC mapping at the field scale.
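For readers unfamiliar with the calibration workflow, a minimal sketch (hypothetical data, not the study's) of fitting a linear ECa-to-SMC model and scoring it with the R2 and RMSE metrics used above:

```python
import numpy as np

# Fit a linear model from EMI apparent conductivity (ECa) to TDR-measured
# soil moisture (SMC), then score it with R2 and RMSE. Values are invented.
eca = np.array([5.2, 6.1, 7.4, 8.0, 9.3, 10.5])           # mS/m, location means
smc_tdr = np.array([0.12, 0.14, 0.17, 0.18, 0.21, 0.24])  # m3 m-3

slope, intercept = np.polyfit(eca, smc_tdr, 1)
smc_pred = slope * eca + intercept

ss_res = np.sum((smc_tdr - smc_pred) ** 2)
ss_tot = np.sum((smc_tdr - smc_tdr.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
rmse = np.sqrt(np.mean((smc_tdr - smc_pred) ** 2))
print(f"R2 = {r2:.2f}, RMSE = {rmse:.3f} m3 m-3")
```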
Recent warming across the North Atlantic region may be contributing to an expansion in barley cultivation
Although grass dominates most agricultural systems in the North Atlantic region (NAR), spring barley is the most important cereal and is used for animal feed and for food and drink products. Recent changes in climate have resulted in warmer conditions across the NAR, with major implications for crop production. In this paper, we investigate the thermal requirement of spring barley in the region and use the results to examine the effects of recent trends in temperature and rainfall on barley cultivation, based on 11 regional meteorological sites. At these sites, between 1975 and 2015, we found significant warming trends for several months of the cropping season and significant increasing trends in cropping season degree days (CSDD). In recent years, this has resulted in an increased proportion of years in which the estimated minimum thermal requirement for barley has been met at sites above about 60°N. However, annual variation in CSDD is large, and years in which it is insufficient still occur at these sites. While warming could potentially allow an earlier start and later end to the cropping season, it is likely that high rainfall at maritime sites, and low rainfall at continental sites, will limit the ability of growers to benefit from this. Warming is considered to have been one of the main factors contributing to the large expansion of the area of barley cultivated in Iceland since the 1990s.
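The CSDD bookkeeping is simple thermal-time accumulation. A minimal sketch, assuming a 0 °C base temperature (an assumption for illustration; the paper defines its own thermal requirement):

```python
# Degree-day accumulation over a cropping season: sum of daily mean
# temperature above a base. The 0 degC base is assumed here, not taken
# from the paper.

def degree_days(daily_mean_temps, base_c=0.0):
    """Sum of (daily mean temperature - base), counting only positive excess."""
    return sum(max(t - base_c, 0.0) for t in daily_mean_temps)

season = [4.0, 6.5, 9.0, 11.2, 12.8, 10.1, 7.3]  # hypothetical daily means, degC
print(degree_days(season))  # 60.9 degree days for this short example
```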
Corn-Soybean Intercropping Improved the Nutritional Quality of Forage Cultivated on Podzols in Boreal Climate
Intercropping systems could be a potential source of nutrient-rich forage production in cool climates on the podzolic soils common in boreal ecosystems. In this study, we evaluated the effects of corn–soybean intercropping (IC) on the nutritional quality of forage. Two silage corn varieties were cultivated as monocropping (MC) or were intercropped with three forage soybean varieties using a randomized complete block design. IC significantly increased crude protein (by 22%) and decreased acid detergent fiber (by 14%) and neutral detergent fiber (by 6%). Forage net energy, total digestible nutrients, ash, dry matter intake, digestible dry matter and relative feed value were also significantly increased (p ≤ 0.05) in the IC treatments compared to corn MC. Macro- and micronutrient contents were also higher under IC than under corn MC. Intercropping increased omega-3 fatty acid (FA) contents (by 67%) compared to corn MC. IC also increased the active microbial community in the plant root zone, which may contribute to the improvement in forage nutritional quality, because the active soil microbial community composition showed significant correlations with the soluble sugar, soluble protein and potassium contents of the forage. These results demonstrate that corn–soybean IC could be a suitable cropping system to increase the nutritional quality of forage cultivated on podzols in boreal climates. The resultant forage has the potential to be a source of high-value animal feed for livestock production in cool climate regions of the world.
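The link between the fiber fractions and relative feed value quoted above is conventionally computed with the standard forage-quality equations; the abstract does not state which equations the study used, so the sketch below assumes the conventional ones (digestible dry matter from ADF, intake from NDF, RFV from both):

```python
# Standard forage-quality equations (assumed, not confirmed as the study's):
# lower ADF and NDF, as reported for intercropping, raise relative feed value.

def relative_feed_value(adf_pct: float, ndf_pct: float) -> float:
    ddm = 88.9 - 0.779 * adf_pct   # digestible dry matter, % of DM
    dmi = 120.0 / ndf_pct          # dry matter intake, % of body weight
    return ddm * dmi / 1.29        # RFV (full-bloom alfalfa = 100)

# Hypothetical fiber values: the lower-fiber forage scores higher.
print(round(relative_feed_value(adf_pct=35.0, ndf_pct=55.0)))  # ~104
print(round(relative_feed_value(adf_pct=30.0, ndf_pct=52.0)))  # ~117
```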
Strong peak immunogenicity but rapid antibody waning following third vaccine dose in older residents of care homes
Third-dose coronavirus disease 2019 vaccines are being deployed widely, but their efficacy has not been adequately assessed in vulnerable older people, who exhibit suboptimal responses after the primary vaccination series. This observational study, carried out within the VIVALDI study in England, examined spike-specific immune responses in 341 staff and residents of long-term care facilities who received an mRNA vaccine after a two-dose primary series of BNT162b2 or ChAdOx1. Third-dose vaccination strongly increased antibody responses, with preferential relative enhancement in older people, and was required to elicit neutralization of Omicron. Cellular immune responses were also enhanced, with strong cross-reactive recognition of Omicron. However, antibody titers fell by 21–78% within 100 days of vaccination, and 27% of participants developed a breakthrough Omicron infection. These findings reveal strong immunogenicity of a third vaccine dose in one of the most vulnerable population groups and endorse an approach of widespread delivery across this population. Ongoing assessment will be required to determine the durability of immune protection.
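As illustrative arithmetic only (the study reports percentage falls, not half-lives): under an assumed first-order exponential decay, the quoted 21–78% fall over 100 days corresponds to antibody half-lives of roughly 46–294 days:

```python
import math

# Assuming exponential decay (an assumption, not the paper's analysis),
# a fractional fall over a window implies:
#   t_half = window * ln(2) / -ln(1 - fall)

def half_life_days(fractional_fall: float, window_days: float = 100.0) -> float:
    return window_days * math.log(2) / -math.log(1.0 - fractional_fall)

print(round(half_life_days(0.21)))  # ~294 days at the slow end of the range
print(round(half_life_days(0.78)))  # ~46 days at the fast end
```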
Mortality and pulmonary complications in patients undergoing surgery with perioperative SARS-CoV-2 infection: an international cohort study
Background: The impact of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) on postoperative recovery needs to be understood to inform clinical decision making during and after the COVID-19 pandemic. This study reports 30-day mortality and pulmonary complication rates in patients with perioperative SARS-CoV-2 infection. Methods: This international, multicentre, cohort study at 235 hospitals in 24 countries included all patients undergoing surgery who had SARS-CoV-2 infection confirmed within 7 days before or 30 days after surgery. The primary outcome measure was 30-day postoperative mortality and was assessed in all enrolled patients. The main secondary outcome measure was pulmonary complications, defined as pneumonia, acute respiratory distress syndrome, or unexpected postoperative ventilation. Findings: This analysis includes 1128 patients who had surgery between Jan 1 and March 31, 2020, of whom 835 (74·0%) had emergency surgery and 280 (24·8%) had elective surgery. SARS-CoV-2 infection was confirmed preoperatively in 294 (26·1%) patients. 30-day mortality was 23·8% (268 of 1128). Pulmonary complications occurred in 577 (51·2%) of 1128 patients; 30-day mortality in these patients was 38·0% (219 of 577), accounting for 81·7% (219 of 268) of all deaths. In adjusted analyses, 30-day mortality was associated with male sex (odds ratio 1·75 [95% CI 1·28–2·40], p<0·0001), age 70 years or older versus younger than 70 years (2·30 [1·65–3·22], p<0·0001), American Society of Anesthesiologists grades 3–5 versus grades 1–2 (2·35 [1·57–3·53], p<0·0001), malignant versus benign or obstetric diagnosis (1·55 [1·01–2·39], p=0·046), emergency versus elective surgery (1·67 [1·06–2·63], p=0·026), and major versus minor surgery (1·52 [1·01–2·31], p=0·047). Interpretation: Postoperative pulmonary complications occur in half of patients with perioperative SARS-CoV-2 infection and are associated with high mortality. Thresholds for surgery during the COVID-19 pandemic should be higher than during normal practice, particularly in men aged 70 years and older. Consideration should be given for postponing non-urgent procedures and promoting non-operative treatment to delay or avoid the need for surgery. Funding: National Institute for Health Research (NIHR), Association of Coloproctology of Great Britain and Ireland, Bowel and Cancer Research, Bowel Disease Research Foundation, Association of Upper Gastrointestinal Surgeons, British Association of Surgical Oncology, British Gynaecological Cancer Society, European Society of Coloproctology, NIHR Academy, Sarcoma UK, Vascular Society for Great Britain and Ireland, and Yorkshire Cancer Research.
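Adjusted odds ratios like those quoted in the findings are typically obtained by fitting a multivariable logistic regression and exponentiating the coefficients. A minimal sketch on synthetic data (not the study's analysis code or dataset):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic cohort: binary covariates and a mortality outcome generated
# from a known logistic model, purely to illustrate the adjusted-OR recipe.
rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "male": rng.integers(0, 2, n),
    "age70plus": rng.integers(0, 2, n),
    "emergency": rng.integers(0, 2, n),
})
logit_p = -2.0 + 0.6 * df.male + 0.8 * df.age70plus + 0.5 * df.emergency
df["died"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

# Fit the multivariable model; exponentiated coefficients are adjusted ORs.
fit = smf.logit("died ~ male + age70plus + emergency", data=df).fit(disp=False)
print(np.exp(fit.params))      # adjusted odds ratios
print(np.exp(fit.conf_int()))  # 95% CIs
```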
The P323L substitution in the SARS-CoV-2 polymerase (NSP12) confers a selective advantage during infection
Background
The mutational landscape of SARS-CoV-2 varies at the level of both the dominant viral genome sequence and the minor genomic variant population. During the COVID-19 pandemic, an early substitution in the genome was the D614G change in the spike protein, which was associated with increased transmissibility. Genomes with D614G are accompanied by a P323L substitution in the viral polymerase (NSP12). However, P323L is not thought to be under strong selective pressure.
Results
Investigation of P323L/D614G substitutions in the population shows that they emerged rapidly during the containment and early surge phases of the first wave. These substitutions emerge from minor genomic variants that become the dominant viral genome sequence. This was investigated in vivo and in vitro using SARS-CoV-2 with P323 and D614 in the dominant genome sequence and L323 and G614 in the minor variant population. During infection, L323 was rapidly selected into the dominant viral genome sequence, but G614 was not. Reverse genetics was used to create two viruses (P323 or L323) on the same genetic background. The L323 virus showed greater abundance of viral RNA and proteins and a smaller plaque morphology than P323.
Conclusions
These data suggest that P323L is an important contributor to the emergence of variants with transmission advantages. Sequence analysis of viral populations suggests it may be possible to predict the emergence of a new variant by tracking the frequency of minor variant genomes. The ability to predict an emerging SARS-CoV-2 variant in the global landscape may aid the evaluation of medical countermeasures and non-pharmaceutical interventions.
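A minimal sketch of the surveillance idea in the conclusions, with hypothetical read counts and thresholds: compute a variant's frequency in serial sequencing samples and flag a sustained rise past dominance:

```python
# Illustrative only: read counts and the 0.5 dominance threshold are
# hypothetical, not taken from the paper.

def variant_frequency(variant_reads: int, total_reads: int) -> float:
    return variant_reads / total_reads if total_reads else 0.0

# e.g. reads supporting L323 at polymerase position 323 in serial samples
series = [(12, 980), (55, 1010), (310, 990), (820, 1005)]
freqs = [variant_frequency(v, t) for v, t in series]

rising = all(a < b for a, b in zip(freqs, freqs[1:]))
if rising and freqs[-1] > 0.5:
    print("variant has moved from the minor population to the dominant sequence")
```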
Effect of angiotensin-converting enzyme inhibitor and angiotensin receptor blocker initiation on organ support-free days in patients hospitalized with COVID-19
IMPORTANCE Overactivation of the renin-angiotensin system (RAS) may contribute to poor clinical outcomes in patients with COVID-19.
OBJECTIVE To determine whether angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) initiation improves outcomes in patients hospitalized for COVID-19.
DESIGN, SETTING, AND PARTICIPANTS In an ongoing, adaptive platform randomized clinical trial, 721 critically ill and 58 non–critically ill hospitalized adults were randomized to receive an RAS inhibitor or control between March 16, 2021, and February 25, 2022, at 69 sites in 7 countries (final follow-up on June 1, 2022).
INTERVENTIONS Patients were randomized to receive open-label initiation of an ACE inhibitor (n = 257), ARB (n = 248), ARB in combination with DMX-200 (a chemokine receptor-2 inhibitor; n = 10), or no RAS inhibitor (control; n = 264) for up to 10 days.
MAIN OUTCOMES AND MEASURES The primary outcome was organ support–free days, a composite of hospital survival and days alive without cardiovascular or respiratory organ support through 21 days. The primary analysis was a bayesian cumulative logistic model. Odds ratios (ORs) greater than 1 represent improved outcomes.
RESULTS On February 25, 2022, enrollment was discontinued due to safety concerns. Among 679 critically ill patients with available primary outcome data, the median age was 56 years and 239 participants (35.2%) were women. Median (IQR) organ support–free days among critically ill patients was 10 (–1 to 16) in the ACE inhibitor group (n = 231), 8 (–1 to 17) in the ARB group (n = 217), and 12 (0 to 17) in the control group (n = 231) (median adjusted odds ratios of 0.77 [95% bayesian credible interval, 0.58-1.06] for improvement for ACE inhibitor and 0.76 [95% credible interval, 0.56-1.05] for ARB compared with control). The posterior probabilities that ACE inhibitors and ARBs worsened organ support–free days compared with control were 94.9% and 95.4%, respectively. Hospital survival occurred in 166 of 231 critically ill participants (71.9%) in the ACE inhibitor group, 152 of 217 (70.0%) in the ARB group, and 182 of 231 (78.8%) in the control group (posterior probabilities that ACE inhibitor and ARB worsened hospital survival compared with control were 95.3% and 98.1%, respectively).
CONCLUSIONS AND RELEVANCE In this trial, among critically ill adults with COVID-19, initiation of an ACE inhibitor or ARB did not improve, and likely worsened, clinical outcomes.
TRIAL REGISTRATION ClinicalTrials.gov Identifier: NCT0273570
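As a rough illustration (not the trial's bayesian cumulative logistic model), a normal approximation on the log-odds scale approximately recovers the posterior probability of harm quoted in the results from the median OR and its 95% credible interval:

```python
from math import log
from scipy.stats import norm

# Approximate the posterior of log(OR) as normal, with the 95% credible
# interval spanning +/- 1.96 SD. Since OR > 1 means benefit here, harm is
# P(log OR < 0). This is a sanity-check approximation, not the trial model.

def prob_harm(or_median: float, or_lo: float, or_hi: float) -> float:
    mu = log(or_median)
    sd = (log(or_hi) - log(or_lo)) / (2 * 1.96)
    return norm.cdf(0.0, loc=mu, scale=sd)

# ACE inhibitor arm: median OR 0.77 (0.58-1.06)
print(round(prob_harm(0.77, 0.58, 1.06), 3))  # ~0.955 vs the reported 94.9%
```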
Adding 6 months of androgen deprivation therapy to postoperative radiotherapy for prostate cancer: a comparison of short-course versus no androgen deprivation therapy in the RADICALS-HD randomised controlled trial
Background
Previous evidence indicates that adjuvant, short-course androgen deprivation therapy (ADT) improves metastasis-free survival when given with primary radiotherapy for intermediate-risk and high-risk localised prostate cancer. However, the value of ADT with postoperative radiotherapy after radical prostatectomy is unclear.
Methods
RADICALS-HD was an international randomised controlled trial to test the efficacy of ADT used in combination with postoperative radiotherapy for prostate cancer. Key eligibility criteria were indication for radiotherapy after radical prostatectomy for prostate cancer, prostate-specific antigen less than 5 ng/mL, absence of metastatic disease, and written consent. Participants were randomly assigned (1:1) to radiotherapy alone (no ADT) or radiotherapy with 6 months of ADT (short-course ADT), using monthly subcutaneous gonadotropin-releasing hormone analogue injections, daily oral bicalutamide monotherapy 150 mg, or monthly subcutaneous degarelix. Randomisation was done centrally through minimisation with a random element, stratified by Gleason score, positive margins, radiotherapy timing, planned radiotherapy schedule, and planned type of ADT, in a computerised system. The allocated treatment was not masked. The primary outcome measure was metastasis-free survival, defined as distant metastasis arising from prostate cancer or death from any cause. Standard survival analysis methods were used, accounting for randomisation stratification factors. The trial had 80% power with two-sided α of 5% to detect an absolute increase in 10-year metastasis-free survival from 80% to 86% (hazard ratio [HR] 0·67). Analyses followed the intention-to-treat principle. The trial is registered with the ISRCTN registry, ISRCTN40814031, and ClinicalTrials.gov, NCT00541047.
Findings
Between Nov 22, 2007, and June 29, 2015, 1480 patients (median age 66 years [IQR 61–69]) were randomly assigned to receive no ADT (n=737) or short-course ADT (n=743) in addition to postoperative radiotherapy at 121 centres in Canada, Denmark, Ireland, and the UK. With a median follow-up of 9·0 years (IQR 7·1–10·1), metastasis-free survival events were reported for 268 participants (142 in the no ADT group and 126 in the short-course ADT group; HR 0·886 [95% CI 0·688–1·140], p=0·35). 10-year metastasis-free survival was 79·2% (95% CI 75·4–82·5) in the no ADT group and 80·4% (76·6–83·6) in the short-course ADT group. Toxicity of grade 3 or higher was reported for 121 (17%) of 737 participants in the no ADT group and 100 (14%) of 743 in the short-course ADT group (p=0·15), with no treatment-related deaths.
Interpretation
Metastatic disease is uncommon following postoperative prostate-bed radiotherapy after radical prostatectomy. Adding 6 months of ADT to this radiotherapy did not improve metastasis-free survival compared with no ADT. These findings do not support the use of short-course ADT with postoperative radiotherapy in this patient population.
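As a worked check of the design target stated in the methods (standard proportional-hazards arithmetic, not trial code): moving 10-year metastasis-free survival from 80% to 86% corresponds to a hazard ratio of ln(0·86)/ln(0·80):

```python
from math import log

# Under proportional hazards, S2(t) = S1(t)**HR, so HR = ln(S2)/ln(S1).
hr_target = log(0.86) / log(0.80)
print(round(hr_target, 2))  # 0.68, consistent with the protocol's HR of 0.67
```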