Managing salinity for sustainability of irrigation in areas with shallow saline ground water
Irrigation will be required to meet the demands of the world population for food. Water will also be needed to meet the municipal, industrial, and environmental demands of the growing population. As a result, irrigation water supplies will be reduced, and irrigators will probably be forced to use degraded water as part of the supply, increasing the possibility of salinity build-up in the soil profile. Drainage will be required to manage the water needed for leaching to prevent soil salinisation. Drainage water containing salt and other contaminants creates a water quality problem for the receiving water body. The paper presents the results of three case studies that address the disposal of saline drainage water through reuse for supplemental irrigation, water table control, and changed design criteria for subsurface drainage as methods to reduce the drainage volume. The first study demonstrated that over 50% of the crop water requirement can be met with saline drainage water and that salinity in the soil profile can be managed so that yields are not adversely affected; this is not the case if the drainage water contains high levels of boron. The second study demonstrated that the water table can be manipulated effectively if the drainage system is properly installed. The third study showed the reduction in salt load achieved by implementing drainage control on deep drains or installing shallow drains. Together, these studies demonstrate that irrigated agriculture in arid and semi-arid areas is sustainable through improved management of the subsurface drainage system.
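As a rough illustration of the blending idea in the first case study (a minimal sketch, not the authors' model), the calculation below mixes saline drainage water with fresh supply, computes the applied-water salinity by conservative mass balance, and estimates the extra water needed for leaching with the widely used FAO-style approximation LR = ECiw / (5·ECe − ECiw). All numeric values are hypothetical.

```python
# Minimal sketch of supplementing irrigation with saline drainage water.
# All numbers are hypothetical examples, not data from the study.

def blended_ec(frac_drain, ec_drain, ec_fresh):
    """EC of the applied water under conservative (mass-balance) mixing, dS/m."""
    return frac_drain * ec_drain + (1.0 - frac_drain) * ec_fresh

def leaching_requirement(ec_iw, ec_e_threshold):
    """FAO-style approximation LR = ECiw / (5*ECe - ECiw),
    where ECe is the crop's salinity tolerance threshold."""
    return ec_iw / (5.0 * ec_e_threshold - ec_iw)

if __name__ == "__main__":
    crop_water_req_mm = 750.0        # seasonal crop water requirement (hypothetical)
    frac_drain = 0.5                 # 50% of the requirement met with drainage water
    ec_drain, ec_fresh = 8.0, 0.5    # dS/m, hypothetical supplies
    ec_e_threshold = 7.7             # e.g. a salt-tolerant crop such as cotton

    ec_iw = blended_ec(frac_drain, ec_drain, ec_fresh)
    lr = leaching_requirement(ec_iw, ec_e_threshold)
    gross_mm = crop_water_req_mm / (1.0 - lr)  # application including leaching fraction

    print(f"applied-water EC: {ec_iw:.2f} dS/m")
    print(f"leaching requirement: {lr:.2f}")
    print(f"gross application: {gross_mm:.0f} mm "
          f"(of which {gross_mm - crop_water_req_mm:.0f} mm for leaching)")
```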
The use of antibiotic-loaded bone cement and systemic antibiotic prophylactic use in 2,971,357 primary total knee arthroplasties from 2010 to 2020: an international register-based observational study among countries in Africa, Europe, North America, and Oceania
Background and purpose - Antibiotic-loaded bone cement (ALBC) and systemic antibiotic prophylaxis (SAP) have been used to reduce periprosthetic joint infection (PJI) rates. We investigated the use of ALBC and SAP in primary total knee arthroplasty (TKA). Patients and methods - This observational study is based on 2,971,357 primary TKAs reported in 2010-2020 to national/regional joint arthroplasty registries in Australia, Denmark, Finland, Germany, Italy, the Netherlands, New Zealand, Norway, Romania, South Africa, Sweden, Switzerland, the UK, and the USA. Aggregate-level data on trends and types of bone cement, antibiotic agents, and doses and duration of SAP were extracted from participating registries. Results - ALBC was used in 77% of the TKAs, with usage ranging from 100% in Norway to 31% in the USA. Palacos R+G was the most common (62%) ALBC type used. The primary antibiotic used in ALBC was gentamicin (94%). Use of ALBC in combination with SAP was common practice (77%). Cefazolin was the most common (32%) SAP agent. The doses and duration of SAP varied from a single preoperative dose as standard practice in Bolzano, Italy (98%) to 4 doses over 1 day in Norway (83% of the 40,709 TKAs reported to the Norwegian arthroplasty register). Conclusion - The proportion of ALBC usage in primary TKA varies internationally, with gentamicin being the most common antibiotic. ALBC in combination with SAP was common practice, with cefazolin the most common SAP agent. The type of ALBC and the type, dose, and duration of SAP varied among participating countries.
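Purely to illustrate how aggregate-level register data of this kind can be pooled (a sketch under assumed inputs, not the registries' data format or the study's analysis), the snippet below weights each registry's ALBC proportion by its TKA count. Only Norway's ~100% and the USA's ~31% ALBC figures and Norway's 40,709 TKAs come from the abstract; the remaining numbers are hypothetical.

```python
# Illustrative pooling of aggregate-level register data (mostly hypothetical numbers).

registries = [
    # (registry, primary TKAs reported, proportion cemented with ALBC)
    ("Norway", 40_709, 1.00),   # from the abstract
    ("USA", 1_000_000, 0.31),   # proportion from the abstract; count hypothetical
    ("Other", 500_000, 0.85),   # entirely hypothetical
]

total_tka = sum(n for _, n, _ in registries)
pooled_albc = sum(n * p for _, n, p in registries) / total_tka

print(f"total primary TKAs: {total_tka:,}")
print(f"pooled ALBC proportion: {pooled_albc:.1%}")
```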
Evaluating the Effects of SARS-CoV-2 Spike Mutation D614G on Transmissibility and Pathogenicity
Global dispersal and increasing frequency of the SARS-CoV-2 spike protein variant D614G are suggestive of a selective advantage but may also be due to a random founder effect. We investigate the hypothesis of positive selection of spike D614G in the United Kingdom using more than 25,000 whole-genome SARS-CoV-2 sequences. Despite the availability of a large dataset, well represented by both spike 614 variants, not all approaches showed a conclusive signal of positive selection. Population genetic analysis indicates that 614G increases in frequency relative to 614D in a manner consistent with a selective advantage. We do not find any indication that patients infected with the spike 614G variant have higher COVID-19 mortality or clinical severity, but 614G is associated with higher viral load and younger patient age. Significant differences in the growth and size of 614G phylogenetic clusters indicate a need for continued study of this variant.
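One standard population-genetic way to quantify the kind of frequency advantage described here (a generic sketch, not the authors' analysis pipeline) is to fit a logistic growth curve to the daily proportion of 614G among sampled genomes and read off a per-day growth advantage. The data below are simulated; sample sizes and parameters are hypothetical.

```python
# Generic logistic-growth fit to a variant's frequency over time
# (simulated data, not the UK sequence dataset analysed in the study).
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, s, t0):
    """Frequency of the focal variant at time t, growing with advantage s per day."""
    return 1.0 / (1.0 + np.exp(-s * (t - t0)))

rng = np.random.default_rng(0)
days = np.arange(0, 120)
true_s, true_t0 = 0.05, 60.0
n_sampled = 200  # genomes sequenced per day (hypothetical)
freq_obs = rng.binomial(n_sampled, logistic(days, true_s, true_t0)) / n_sampled

(s_hat, t0_hat), _ = curve_fit(logistic, days, freq_obs, p0=(0.01, 50.0))
print(f"estimated growth advantage per day: {s_hat:.3f} (simulated truth {true_s})")
```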
Using soil surface temperature to assess soil evaporation in a drip irrigated vineyard
Evaporation from the soil is an important part of the water balance of a crop when considering water use efficiency. In this paper, a non-intensive method based on a linear function of the soil surface temperature difference between a saturated and a drying soil is tested to estimate relative soil evaporation. The relative evaporation (RE) method of Ben-Asher et al. (1983) was calibrated using microlysimeters and thermal imaging. Soil surface temperature in a drip irrigated vineyard was then collected using infrared temperature sensors mounted on a quad bike on several days of the 2009–2010 season. Soil surface temperature in the vineyard ranged from 4.6 °C to 65.5 °C undervine and from 6.8 °C to 75.6 °C in the middle of the row. The difference between daily minima and maxima of soil surface temperature ranged from 20.2 °C to 59.7 °C in the inter-row and from 13.6 °C to 36.4 °C undervine. Relative evaporation averaged 54% of evaporation from a saturated soil in the inter-row and 97% undervine. Based upon the calculation of RE, the average daily amount of soil evaporation was between 0.64 mm and 1.83 mm undervine and between 0.69 mm and 2.52 mm in the inter-row. Soil evaporation both undervine and in the inter-row exhibited spatial variability across the vineyard; however, the undervine area showed less spatial variability than the inter-row area.
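As a minimal sketch of how a temperature-based relative-evaporation estimate of this general form can be applied (the actual calibration in the paper came from microlysimeters and thermal imaging; the slope and all inputs below are hypothetical), assume RE falls linearly with the difference between the drying soil's surface temperature and that of a saturated reference, then scale a saturated-soil evaporation rate by RE.

```python
# Illustrative temperature-based relative evaporation (RE) estimate.
# Assumes a linear relation RE = 1 - b * (T_drying - T_saturated),
# with the slope b a hypothetical calibration value; RE is clipped to [0, 1].

def relative_evaporation(t_drying_c, t_saturated_c, b=0.03):
    """RE as a linear function of the surface-temperature difference (deg C)."""
    re = 1.0 - b * (t_drying_c - t_saturated_c)
    return max(0.0, min(1.0, re))

def soil_evaporation_mm(re, e_saturated_mm):
    """Actual soil evaporation as RE times evaporation from a saturated soil."""
    return re * e_saturated_mm

if __name__ == "__main__":
    # Hypothetical inter-row reading: drying soil 20 deg C warmer than a wetted reference.
    re_inter_row = relative_evaporation(t_drying_c=45.0, t_saturated_c=25.0)
    print(f"inter-row RE: {re_inter_row:.2f}")
    print(f"inter-row E:  {soil_evaporation_mm(re_inter_row, e_saturated_mm=3.0):.2f} mm/day")
```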
Irrigation management to optimize controlled drainage in a semi-arid area
On the west side of the San Joaquin Valley, California, groundwater tables have risen after several decades of irrigation. A regional semi-permeable layer at 100 m depth (the Corcoran Clay), combined with over-irrigation and leaching, is the major cause of the groundwater rise. Subsurface drain systems were installed from the 1960s to the 1980s to remove excess water and maintain an aerated root zone. However, drainage water from these subsurface systems contained trace elements such as selenium at levels found to be toxic to fish and waterfowl. To maintain healthy levels of salt and selenium in the San Joaquin River, the natural drain out of the San Joaquin Valley, outflow of drainage water from farms was severely restricted or completely eliminated. Several on-farm management methods are being investigated to maintain agricultural production without off-farm drainage. One method is drainage water reuse through blending with irrigation water. Another is to reuse drainage water consecutively, where drainage water from one field is used as irrigation water for another field; progressively more salt-tolerant crops need to be grown along the reuse path, and salts can eventually be harvested using solar evaporators. The method described in this paper aims to reduce the volume of drainage water during the growing season by increasing shallow groundwater use by crops before it is drained from the field. Five years of crops were grown on two weighing lysimeters using drip irrigation: two years of cotton under high frequency drip irrigation (applications up to 10 times a day), followed by two years of safflower (an early season crop) and one year of alfalfa (a perennial) under low frequency drip irrigation (twice a week). One lysimeter maintained a shallow groundwater table at 1.0 m below the soil surface, while the other was freely drained at the bottom (3.0 m below the soil surface). High frequency irrigation requires more irrigation water over a season than low frequency irrigation in the presence of shallow groundwater, since low frequency irrigation induces more shallow groundwater use by crops. Groundwater use for cotton was measured as 8% of total seasonal crop water use, while measurements under safflower showed that 25% of seasonal crop water use came from groundwater. Measurements under alfalfa, in its first year of establishment, showed 15% of seasonal crop water use coming from groundwater. To maintain a sustainable system, leaching of salts needs to occur. Under the proposed irrigation/drainage management system, leaching would occur in the early growing season with winter precipitation, pre-plant irrigation, and the first irrigation of the growing season, when the water table can be maintained at shallower depths through restriction of the outflow of the subsurface drainage system (groundwater control).
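The groundwater-use percentages quoted above come from closing a lysimeter water balance; as a generic sketch (hypothetical seasonal totals, not the experiment's measurements), the groundwater contribution can be computed as the residual of crop evapotranspiration, drainage, and storage change minus irrigation and precipitation.

```python
# Generic seasonal water balance for a weighing lysimeter with a controlled
# water table (all totals in mm are hypothetical, not the study's data).

def groundwater_contribution(et_mm, irrigation_mm, precipitation_mm,
                             drainage_mm, delta_storage_mm):
    """Close the balance I + P + GW = ET + D + dS  =>  GW = ET + D + dS - I - P."""
    return et_mm + drainage_mm + delta_storage_mm - irrigation_mm - precipitation_mm

if __name__ == "__main__":
    et = 820.0           # seasonal crop evapotranspiration from the weighing lysimeter
    irrigation = 700.0
    precipitation = 60.0
    drainage = 0.0       # outflow restricted (controlled drainage)
    delta_storage = 5.0  # change in stored soil water over the season

    gw = groundwater_contribution(et, irrigation, precipitation, drainage, delta_storage)
    print(f"groundwater use: {gw:.0f} mm ({gw / et:.0%} of seasonal crop water use)")
```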