
    A Dual Isotope Protocol to Separate the Contributions to Phosphorus Content of Maize Shoots and Soil Phosphorus Fractions from Biosolids, Fertilizer and Soil

    Separation of the phosphorus (P) contributions from soil, fertilizer and biosolids to plants has not been possible without the aid of radioisotopes. Dual labelling of soil with 32P and fertilizer with 33P isotopes has been used to partition the sources of P in maize (Zea mays) shoots and in soil P pools. Biosolids containing 4.1% P that had been prepared using Fe and Al were applied to a Kurosol soil from Goulburn, NSW, Australia. The biosolids were applied at five rates up to 60 dry t/ha with and without P fertilizer. Phosphorus derived from fertilizer was determined directly with 33P and that from soil by 32P reverse dilution. Phosphorus derived from biosolids was estimated as the difference between total P and that derived from the soil plus fertilizer calculated from isotope data. Yield and P content of maize shoots increased linearly with the rate of biosolids application. The proportion of P in the plant derived from biosolids also increased with application rate, up to 88% for the soil receiving biosolids at 60 dry t/ha with no fertilizer. The corresponding value with fertilizer applied at 80 kg P/ha was 69%. The proportion of P in the maize shoots derived from soil and fertilizer decreased as biosolids application rate increased. Soil total P, bicarbonate-extractable P, Al-P, Fe-P and Ca-P increased with biosolids application rate. The increase in plant P uptake and in bicarbonate-extractable P in the soil shows that biosolids P provides a readily available source of P. A decrease in uptake of fertilizer and soil P with increasing biosolids application is attributed to the decrease in the proportion of P from these sources in the total pool of available P, rather than to immobilization of P by Fe and Al in the biosolids.
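The difference method described in this abstract reduces to simple arithmetic once the isotope data give the fertilizer- and soil-derived fractions. The sketch below is illustrative only; the function name and the example values are hypothetical, not taken from the study.

```python
def partition_plant_p(total_p, frac_from_fertilizer, frac_from_soil):
    """Partition plant P (e.g. mg per plant) among fertilizer, soil and biosolids.

    frac_from_fertilizer: fraction of plant P traced to fertilizer via 33P
    frac_from_soil: fraction of plant P traced to soil via 32P reverse dilution
    Biosolids-derived P is estimated by difference, as in the abstract.
    """
    p_fert = total_p * frac_from_fertilizer
    p_soil = total_p * frac_from_soil
    p_biosolids = total_p - p_fert - p_soil
    return p_fert, p_soil, p_biosolids

# Hypothetical example: 100 mg plant P, 12% from fertilizer, 19% from soil
p_fert, p_soil, p_bio = partition_plant_p(100.0, 0.12, 0.19)
# p_bio is the remainder (about 69 mg), attributed to biosolids by difference
```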

    Preferential phosphorus placement improves the productivity and competitiveness of tropical pasture legumes

    Extensive grazing systems often receive minimal fertiliser due to the risk associated with using relatively expensive inputs. Nevertheless, nutrient applications are known to improve pasture productivity, and the benefit of applying fertiliser is becoming more widely accepted. Two tropical pasture mixes (Digit/Desmanthus and Rhodes/Centro) were established in plastic boxes containing phosphorus (P) responsive soil to investigate shoot yield and P fertiliser recovery. The grasses and legumes were planted in separate rows, and three P treatments were applied along with the seed ('BOTH low-P' had 2 kg P ha−1 banded below both components, 'BOTH high-P' had 12 kg P ha−1 banded below both components, and 'LEGUME superhigh-P' had 12 kg P ha−1 banded below the legume only). The P applied below the legumes was labelled with a 32P radioisotope tracer. When P fertiliser was applied below both components, the grasses consistently out-yielded the legumes (avg. legume content = 29%). Preferential fertiliser application below the legumes increased the average legume content of the two pasture mixes to 66%. Legume tissue P derived from applied P fertiliser increased from 20% to 77% as the P application rate was increased. However, total recovery of applied P by the legumes was relatively low in each of the treatments (≤7% of applied P). These collective results demonstrate that preferential application of P fertiliser can benefit legume productivity, with applied P being a significant proportion of plant tissue P. Although only a small proportion of applied P was recovered within the seven-week growth period, it is expected that this fertiliser application at planting will remain beneficial for a large proportion of the growing season following pasture establishment.

    Fusarium Crown Rot Reduces Water Use and Causes Yield Penalties in Wheat under Adequate and above Average Water Availability

    The cereal disease Fusarium crown rot (FCR), caused by the fungal pathogen Fusarium pseudograminearum, is a major worldwide constraint to winter cereal production, especially in Australia's northern grains region (NGR) of NSW and Queensland. Conventionally, FCR-induced yield penalties are associated with semi-arid, water-limited conditions during flowering and grain-filling. In this study, yield penalties associated with FCR infection were found to be significant under both adequate and above average water conditions, which has implications for global wheat production in more favorable environments. This research was conducted to understand the impact of FCR on water availability, yield and grain quality in high-protein bread and durum wheat varieties in controlled-environment and replicated field experiments across three locations in the NGR over a two-year period. Under controlled conditions, FCR infection significantly decreased water use by 7.5%, with an associated yield reduction of 9.5%, irrespective of water treatment. Above average rainfall was experienced across all field experimental sites in both the 2020 and 2021 growing seasons. The field studies demonstrated a decrease in water use of up to 23% at some sites, and significant yield penalties across all cultivars of up to 18.4% under natural rainfall and 13.2% even with supplementary irrigation.

    Physical and chemical characteristics of feedlot pen substrate bedded with woodchip under wet climatic conditions

    Wet winter conditions can create animal welfare issues in feedlots if the pen surface becomes a deep, wet, penetrable substrate. Feedlot pens with a clay and gravel base (N = 30) bedded with 150 mm (W15) or 300 mm (W30) depth of woodchips were compared to a control treatment with no bedding over a 109-day feeding period, while irrigated to supplement natural rainfall. The pad substrate was measured for variables that would affect cattle comfort and the value of the substrate for composting. The penetrable depth of control pens was higher than in both woodchip-bedded treatments from week 2, and increased until the end of the experiment. Meanwhile, penetrable depth was steady for W30 throughout the experiment, and increased for W15 only after week 10. Moisture content of the pad was higher throughout the experiment in the control pens than in the woodchip-bedded pens. In the control pens, the force required to pull a cattle leg analogue out of the pen substrate was three times that required in the woodchip-bedded treatments. The W15 treatment increased C : N in the substrate to the upper limit of suitability for composting, and in W30, C : N was too high for composting after a 109-day feeding period. Overall, providing feedlot cattle with 150 or 300 mm of woodchip bedding during a 109-day feeding period improved the condition of the pad substrate for cattle comfort by reducing penetrable depth and moisture content of the substrate surface stratum, but composting value decreased in W30 over this feeding period.

    Impact of Fusarium Crown Rot on Root System Area and Links to Genetic Variation within Commercial Wheat Varieties

    Fusarium crown rot (FCR), caused by the fungal pathogen Fusarium pseudograminearum (Fp), is a major constraint to cereal production worldwide. The pathogen restricts the movement of solutes within the plant due to mycelial colonisation of vascular tissue. Yield loss and quality downgrades are exacerbated by this disease under water-stress conditions. Plant root systems are adaptive and can alter their architecture to optimise production in response to changes in environment and plant health. This plasticity of root systems typically favours resource acquisition, primarily of water and nutrients. This study examined the impact of FCR on the root system architecture of multiple commercial bread and durum wheat varieties. Root system growth was recorded in-crop in large transparent rhizoboxes, allowing visualization of root architecture over time. Furthermore, electrical resistivity tomography was used to quantify spatial root activity vertically down the soil profile. Results demonstrated a significant reduction in total root length and network area following inoculation with FCR. Electrical resistivity measurements indicated that the spatial pattern of water use for each cultivar was influenced differently by FCR infection over the growing season. Specifically, temporal water use correlated with the FCR tolerance of the varieties, making this investigation the first to link root architecture and water use as tolerance mechanisms to FCR infection. This research has implications for more targeted selection of FCR tolerance characteristics in breeding programs, along with improved variety-specific management in-crop.

    Aggressiveness of Phytophthora medicaginis on chickpea: Phenotyping method determines isolate ranking and host genotype-by-isolate interactions

    Phytophthora root rot of chickpea (Cicer arietinum), caused by Phytophthora medicaginis, is an important disease, with genetic resistance derived from C. arietinum × Cicer echinospermum crosses as the main disease management strategy. We evaluated pathogenic variation in P. medicaginis populations with the aim of improving phenotyping methods for disease resistance. We addressed the question of individual isolate aggressiveness across four different seedling-based phenotyping methods conducted in glasshouses and one field-based phenotyping method. Our results revealed that a seedling media surface inoculation method used on a susceptible C. arietinum variety and a moderately resistant C. arietinum × C. echinospermum backcross detected the greatest variability in aggressiveness among 37 P. medicaginis isolates. Evaluations of different components of resistance, using our different phenotyping methods, revealed that differential pathogen–isolate reactions occur with some phenotyping methods. We found support for our hypotheses that the level of aggressiveness of P. medicaginis isolates depends on the phenotyping method, and that phenotyping methods interact with both isolate and host genotype reactions. Our cup-based root inoculation method showed promise as a non-field-based phenotyping method, as it provided significant correlations with genotype–isolate rankings in the field experiment for a number of disease parameters.

    Improving quality of life through the routine use of the patient concerns inventory for head and neck cancer patients: main results of a cluster preference randomised controlled trial

    Funding: UK National Institute for Health Research (NIHR) under its Research for Patient Benefit (RfPB) Programme (Grant Reference Number PB-PG-0215-36047). Purpose: The patient concerns inventory (PCI) is a prompt list allowing head and neck cancer (HNC) patients to discuss issues that otherwise might be overlooked. This trial evaluated the effectiveness of using the PCI at routine outpatient clinics for one year after treatment on health-related quality of life (HRQOL). Methods: A pragmatic cluster preference randomised controlled trial with 15 consultants, 8 'using' and 7 'not using' the PCI intervention. Patients treated with curative intent (all sites, disease stages, treatments) were eligible. Results: Consultants saw a median (inter-quartile range) of 16 (13–26) patients, with 140 PCI and 148 control patients. Of the pre-specified outcomes, the 12-month results for the mean University of Washington Quality of Life (UW-QOLv4) social-emotional subscale score suggested a small clinical effect of intervention of 4.6 units (95% CI 0.2, 9.0), p = 0.04 after full adjustment for pre-stated case-mix. Results for UW-QOLv4 overall quality of life being less than good at 12 months (primary outcome) also favoured the PCI, with a risk ratio of 0.83 (95% CI 0.66, 1.06) and absolute risk difference of 4.8% (−2.9%, 12.9%), but without achieving statistical significance. Other non-a-priori analyses, including all 12 UW-QOL domains and at consultant level, also suggested better HRQOL with the PCI. Consultation times were unaffected and the number of items selected decreased over time. Conclusion: This novel trial supports the integration of the PCI approach into routine consultations as a simple low-cost means of benefiting HNC patients. It adds to a growing body of evidence supporting the use of patient prompt lists more generally.

    Improving quality of life through the routine use of the patient concerns inventory for head and neck cancer patients : baseline results in a cluster preference randomised controlled trial

    Funding: This paper presents independent research funded by the National Institute for Health Research (NIHR) under its Research for Patient Benefit (RfPB) Programme (Grant Reference Number PB-PG-0215-36047). Purpose: The main aim of this paper is to present baseline demographic and clinical characteristics and HRQOL in the two groups of the Patient Concerns Inventory (PCI) trial. The baseline PCI data will also be described. Methods: This is a pragmatic cluster preference randomised controlled trial with 15 consultant clusters from two sites either 'using' (n = 8) or 'not using' (n = 7) the PCI at a clinic for all of their trial patients. The PCI is a 56-item prompt list that helps patients raise concerns that otherwise might be missed. Eligible patients were head and neck cancer patients treated with curative intent (all sites, disease stages, treatments). Results: From 511 patients first identified as eligible when screening for the multi-disciplinary tumour board meetings, 288 attended a first routine outpatient baseline study clinic after completion of their treatment, a median (IQR) of 103 (71–162) days. At baseline, the two trial groups were similar in demographic and clinical characteristics as well as in HRQOL measures, apart from differences in tumour location, tumour staging and mode of treatment. These exceptions were cluster (consultant) related, with Maxillofacial and ENT consultants seeing different types of cases. Consultation times were similar, with PCI group consultations taking about 1 min longer on average (95% CI for the difference between means was from −0.7 to +2.2 min). Conclusion: Using the PCI in routine post-treatment head and neck cancer clinics does not lengthen consultations. Recruitment has finished but 12-month follow-up is still ongoing.

    Global age-sex-specific mortality, life expectancy, and population estimates in 204 countries and territories and 811 subnational locations, 1950–2021, and the impact of the COVID-19 pandemic: a comprehensive demographic analysis for the Global Burden of Disease Study 2021

    Background: Estimates of demographic metrics are crucial to assess levels and trends of population health outcomes. The profound impact of the COVID-19 pandemic on populations worldwide has underscored the need for timely estimates to understand this unprecedented event within the context of long-term population health trends. The Global Burden of Diseases, Injuries, and Risk Factors Study (GBD) 2021 provides new demographic estimates for 204 countries and territories and 811 additional subnational locations from 1950 to 2021, with a particular emphasis on changes in mortality and life expectancy that occurred during the 2020–21 COVID-19 pandemic period. Methods: 22 223 data sources from vital registration, sample registration, surveys, censuses, and other sources were used to estimate mortality, with a subset of these sources used exclusively to estimate excess mortality due to the COVID-19 pandemic. 2026 data sources were used for population estimation. Additional sources were used to estimate migration; the effects of the HIV epidemic; and demographic discontinuities due to conflicts, famines, natural disasters, and pandemics, which are used as inputs for estimating mortality and population. Spatiotemporal Gaussian process regression (ST-GPR) was used to generate under-5 mortality rates, which synthesised 30 763 location-years of vital registration and sample registration data, 1365 surveys and censuses, and 80 other sources. ST-GPR was also used to estimate adult mortality (between ages 15 and 59 years) based on information from 31 642 location-years of vital registration and sample registration data, 355 surveys and censuses, and 24 other sources. Estimates of child and adult mortality rates were then used to generate life tables with a relational model life table system. 
For countries with large HIV epidemics, life tables were adjusted using independent estimates of HIV-specific mortality generated via an epidemiological analysis of HIV prevalence surveys, antenatal clinic serosurveillance, and other data sources. Excess mortality due to the COVID-19 pandemic in 2020 and 2021 was determined by subtracting observed all-cause mortality (adjusted for late registration and mortality anomalies) from the mortality expected in the absence of the pandemic. Expected mortality was calculated based on historical trends using an ensemble of models. In location-years where all-cause mortality data were unavailable, we estimated excess mortality rates using a regression model with covariates pertaining to the pandemic. Population size was computed using a Bayesian hierarchical cohort component model. Life expectancy was calculated using age-specific mortality rates and standard demographic methods. Uncertainty intervals (UIs) were calculated for every metric using the 25th and 975th ordered values from a 1000-draw posterior distribution. Findings: Global all-cause mortality followed two distinct patterns over the study period: age-standardised mortality rates declined between 1950 and 2019 (a 62·8% [95% UI 60·5–65·1] decline), and increased during the COVID-19 pandemic period (2020–21; 5·1% [0·9–9·6] increase). In contrast with the overall reverse in mortality trends during the pandemic period, child mortality continued to decline, with 4·66 million (3·98–5·50) global deaths in children younger than 5 years in 2021 compared with 5·21 million (4·50–6·01) in 2019. An estimated 131 million (126–137) people died globally from all causes in 2020 and 2021 combined, of which 15·9 million (14·7–17·2) were due to the COVID-19 pandemic (measured by excess mortality, which includes deaths directly due to SARS-CoV-2 infection and those indirectly due to other social, economic, or behavioural changes associated with the pandemic). 
Excess mortality rates exceeded 150 deaths per 100 000 population during at least one year of the pandemic in 80 countries and territories, whereas 20 nations had a negative excess mortality rate in 2020 or 2021, indicating that all-cause mortality in these countries was lower during the pandemic than expected based on historical trends. Between 1950 and 2021, global life expectancy at birth increased by 22·7 years (20·8–24·8), from 49·0 years (46·7–51·3) to 71·7 years (70·9–72·5). Global life expectancy at birth declined by 1·6 years (1·0–2·2) between 2019 and 2021, reversing historical trends. An increase in life expectancy was observed in only 32 (15·7%) of 204 countries and territories between 2019 and 2021. The global population reached 7·89 billion (7·67–8·13) people in 2021, by which time the populations of 56 of 204 countries and territories had peaked and subsequently declined. The largest proportion of population growth between 2020 and 2021 was in sub-Saharan Africa (39·5% [28·4–52·7]) and south Asia (26·3% [9·0–44·7]). From 2000 to 2021, the ratio of the population aged 65 years and older to the population aged younger than 15 years increased in 188 (92·2%) of 204 nations. Interpretation: Global adult mortality rates markedly increased during the COVID-19 pandemic in 2020 and 2021, reversing past decreasing trends, while child mortality rates continued to decline, albeit more slowly than in earlier years. Although COVID-19 had a substantial impact on many demographic indicators during the first 2 years of the pandemic, overall global health progress over the 72 years evaluated has been profound, with considerable improvements in mortality and life expectancy. Additionally, we observed a deceleration of global population growth since 2017, despite steady or increasing growth in lower-income countries, combined with a continued global shift of population age structures towards older ages. 
These demographic changes will likely present future challenges to health systems, economies, and societies. The comprehensive demographic estimates reported here will enable researchers, policy makers, health practitioners, and other key stakeholders to better understand and address the profound changes that have occurred in the global health landscape following the first 2 years of the COVID-19 pandemic, and longer-term trends beyond the pandemic.
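Two computational steps from this abstract's Methods can be sketched in a few lines: excess mortality as observed minus expected all-cause deaths, and 95% uncertainty intervals taken as the 25th and 975th ordered values of a 1000-draw posterior distribution. The functions and the simulated draws below are an illustrative sketch, not the study's actual models.

```python
import random

def excess_mortality(observed, expected):
    """Excess deaths: observed all-cause deaths minus expected deaths
    (expected deaths come from counterfactual models of historical trends)."""
    return observed - expected

def uncertainty_interval(draws):
    """95% UI from the 25th and 975th ordered values of 1000 draws."""
    assert len(draws) == 1000
    ordered = sorted(draws)
    return ordered[24], ordered[974]  # 25th and 975th values, 1-indexed

# Hypothetical posterior draws of global pandemic excess deaths (millions)
random.seed(42)
draws = [random.gauss(15.9, 0.6) for _ in range(1000)]
lo, hi = uncertainty_interval(draws)  # roughly a 95% interval around 15.9
```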

    Global burden and strength of evidence for 88 risk factors in 204 countries and 811 subnational locations, 1990–2021: a systematic analysis for the Global Burden of Disease Study 2021

    Background: Understanding the health consequences associated with exposure to risk factors is necessary to inform public health policy and practice. To systematically quantify the contributions of risk factor exposures to specific health outcomes, the Global Burden of Diseases, Injuries, and Risk Factors Study (GBD) 2021 aims to provide comprehensive estimates of exposure levels, relative health risks, and attributable burden of disease for 88 risk factors in 204 countries and territories and 811 subnational locations, from 1990 to 2021. Methods: The GBD 2021 risk factor analysis used data from 54 561 total distinct sources to produce epidemiological estimates for 88 risk factors and their associated health outcomes for a total of 631 risk–outcome pairs. Pairs were included on the basis of data-driven determination of a risk–outcome association. Age-sex-location-year-specific estimates were generated at global, regional, and national levels. Our approach followed the comparative risk assessment framework predicated on a causal web of hierarchically organised, potentially combinative, modifiable risks. Relative risks (RRs) of a given outcome occurring as a function of risk factor exposure were estimated separately for each risk–outcome pair, and summary exposure values (SEVs), representing risk-weighted exposure prevalence, and theoretical minimum risk exposure levels (TMRELs) were estimated for each risk factor. These estimates were used to calculate the population attributable fraction (PAF; ie, the proportional change in health risk that would occur if exposure to a risk factor were reduced to the TMREL). The product of PAFs and disease burden associated with a given outcome, measured in disability-adjusted life-years (DALYs), yielded measures of attributable burden (ie, the proportion of total disease burden attributable to a particular risk factor or combination of risk factors). 
Adjustments for mediation were applied to account for relationships involving risk factors that act indirectly on outcomes via intermediate risks. Attributable burden estimates were stratified by Socio-demographic Index (SDI) quintile and presented as counts, age-standardised rates, and rankings. To complement estimates of RR and attributable burden, newly developed burden of proof risk function (BPRF) methods were applied to yield supplementary, conservative interpretations of risk–outcome associations based on the consistency of underlying evidence, accounting for unexplained heterogeneity between input data from different studies. Estimates reported represent the mean value across 500 draws from the estimate's distribution, with 95% uncertainty intervals (UIs) calculated as the 2·5th and 97·5th percentile values across the draws. Findings: Among the specific risk factors analysed for this study, particulate matter air pollution was the leading contributor to the global disease burden in 2021, contributing 8·0% (95% UI 6·7–9·4) of total DALYs, followed by high systolic blood pressure (SBP; 7·8% [6·4–9·2]), smoking (5·7% [4·7–6·8]), low birthweight and short gestation (5·6% [4·8–6·3]), and high fasting plasma glucose (FPG; 5·4% [4·8–6·0]). For younger demographics (ie, those aged 0–4 years and 5–14 years), risks such as low birthweight and short gestation and unsafe water, sanitation, and handwashing (WaSH) were among the leading risk factors, while for older age groups, metabolic risks such as high SBP, high body-mass index (BMI), high FPG, and high LDL cholesterol had a greater impact. 
From 2000 to 2021, there was an observable shift in global health challenges, marked by a decline in the number of all-age DALYs broadly attributable to behavioural risks (decrease of 20·7% [13·9–27·7]) and environmental and occupational risks (decrease of 22·0% [15·5–28·8]), coupled with a 49·4% (42·3–56·9) increase in DALYs attributable to metabolic risks, all reflecting ageing populations and changing lifestyles on a global scale. Age-standardised global DALY rates attributable to high BMI and high FPG rose considerably (15·7% [9·9–21·7] for high BMI and 7·9% [3·3–12·9] for high FPG) over this period, with exposure to these risks increasing annually at rates of 1·8% (1·6–1·9) for high BMI and 1·3% (1·1–1·5) for high FPG. By contrast, the global risk-attributable burden and exposure to many other risk factors declined, notably for risks such as child growth failure and unsafe water source, with age-standardised attributable DALYs decreasing by 71·5% (64·4–78·8) for child growth failure and 66·3% (60·2–72·0) for unsafe water source. We separated risk factors into three groups according to trajectory over time: those with a decreasing attributable burden, due largely to declining risk exposure (eg, diet high in trans-fat and household air pollution) but also to proportionally smaller child and youth populations (eg, child and maternal malnutrition); those for which the burden increased moderately in spite of declining risk exposure, due largely to population ageing (eg, smoking); and those for which the burden increased considerably due to both increasing risk exposure and population ageing (eg, ambient particulate matter air pollution, high BMI, high FPG, and high SBP). Interpretation: Substantial progress has been made in reducing the global disease burden attributable to a range of risk factors, particularly those related to maternal and child health, WaSH, and household air pollution. 
Maintaining efforts to minimise the impact of these risk factors, especially in low-SDI locations, is necessary to sustain progress. Successes in moderating the smoking-related burden by reducing risk exposure highlight the need to advance policies that reduce exposure to other leading risk factors such as ambient particulate matter air pollution and high SBP. Troubling increases in high FPG, high BMI, and other risk factors related to obesity and metabolic syndrome indicate an urgent need to identify and implement interventions.
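The abstract's core calculation, attributable burden as the product of a population attributable fraction (PAF) and total DALYs, can be illustrated with the classic single-exposure Levin formula. This is a deliberately simplified sketch with hypothetical numbers; the GBD comparative risk assessment generalises it to continuous exposures, TMRELs, and mediation adjustments.

```python
def levin_paf(prevalence, rr):
    """Levin's population attributable fraction for a single dichotomous
    exposure: the proportional reduction in burden expected if exposure
    were removed (a simplified stand-in for the GBD's TMREL-based PAF)."""
    return prevalence * (rr - 1.0) / (prevalence * (rr - 1.0) + 1.0)

def attributable_burden(paf, total_dalys):
    """Attributable burden = PAF x total disease burden (in DALYs)."""
    return paf * total_dalys

# Hypothetical numbers: 30% of the population exposed, relative risk 2.0,
# and 1 million DALYs from the outcome
paf = levin_paf(0.30, 2.0)                   # 0.3/1.3, about 0.23
dalys = attributable_burden(paf, 1_000_000)  # about 231 000 DALYs
```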