
    Application of artificial neural networks to the design of subsurface drainage systems in Libyan agricultural projects

    Study region The study draws on drainage design data for the Hammam agricultural project (HAP) and the Eshkeda agricultural project (EAP), located in the south of Libya, north of the Sahara Desert. The results of this study are applicable to other arid areas. Study focus This study aims to improve the prediction of saturated hydraulic conductivity (Ksat) to enhance the efficacy of drainage system design in data-poor areas. Artificial Neural Networks (ANNs) were developed to estimate Ksat and compared with empirical regression-type Pedotransfer Function (PTF) equations. Subsequently, the ANN- and PTF-estimated Ksat values were used in EnDrain software to design subsurface drainage systems, which were evaluated against designs using measured Ksat values. New hydrological insights Results showed that ANNs predicted Ksat more accurately than PTFs. Drainage designs based on PTF predictions resulted in (1) a deeper water level and (2) a higher drainage density, increasing costs. Drainage designs based on ANN predictions gave drain spacing and water table depth equivalent to those obtained using measured data. The results of this study indicate that ANNs can be developed using existing and under-utilised data sets and applied successfully to data-poor areas. As Ksat is time-consuming to measure, basing drainage designs on ANN predictions generated from alternative datasets will reduce the overall cost of drainage design, making it more accessible to farmers, planners, and decision-makers in least developed countries.
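The ANN approach described above can be sketched in miniature. The following is an illustrative toy, not the study's model: a single-hidden-layer network (NumPy, full-batch gradient descent) fitted to synthetic soil data, where the input features (sand and clay fractions, bulk density) and the Ksat relationship are invented for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 256
sand = rng.uniform(0.1, 0.9, n)     # sand fraction (invented)
clay = rng.uniform(0.05, 0.5, n)    # clay fraction (invented)
bd = rng.uniform(1.1, 1.7, n)       # bulk density, g/cm^3 (invented)
X = np.column_stack([sand, clay, bd])
# Toy "truth": Ksat rises with sand, falls with clay and bulk density
y = np.exp(2.0 * sand - 3.0 * clay - 1.5 * (bd - 1.4))

Xs = (X - X.mean(axis=0)) / X.std(axis=0)   # standardise inputs

# One hidden layer of 8 tanh units, trained by full-batch gradient descent
h = 8
W1 = rng.normal(0.0, 0.5, (3, h)); b1 = np.zeros(h)
W2 = rng.normal(0.0, 0.5, (h, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(3000):
    a = np.tanh(Xs @ W1 + b1)                  # hidden activations
    pred = (a @ W2 + b2).ravel()               # linear output layer
    gpred = (2.0 / n) * (pred - y)[:, None]    # d(MSE)/d(pred)
    gW2 = a.T @ gpred; gb2 = gpred.sum(axis=0)
    ga = (gpred @ W2.T) * (1.0 - a ** 2)       # backprop through tanh
    gW1 = Xs.T @ ga; gb1 = ga.sum(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

pred = (np.tanh(Xs @ W1 + b1) @ W2 + b2).ravel()
mse = float(np.mean((pred - y) ** 2))   # fit error on the synthetic data
```

In practice a network like this would be trained on measured Ksat from existing, under-utilised datasets and validated against held-out measurements before its predictions were fed into drainage design software.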

    Best management practices to alleviate deep-seated compaction in asparagus (Asparagus officinalis) interrows (UK)

    Field operations associated with UK asparagus production (re-ridging and intensive foot and vehicular trafficking of the wheelings) can result in severe deep-seated compaction in interrows, impacting crop health and productivity. In this project, we investigate the long-term efficacy of a range of Best Management Practices (BMPs) targeted at preventing or remediating soil compaction in asparagus (Asparagus officinalis L.) interrows as compared to Conventional practice. BMPs included (1) companion crops - Rye (Secale cereale L.) and Mustard (Sinapis alba L.), (2) interrow surface mulch applications (straw mulch and PAS 100 compost in combination with shallow soil disturbance (SSD)), (3) modifications of the conventional tillage practice (re-ridging (R) or not ridging (NR), and applying or not applying SSD) and (4) a zero-tillage option. In general, companion cropping had no effect on soil compaction or water infiltration rates as compared to the Conventional practice. Application and incorporation of straw mulch or PAS 100 compost, however, significantly reduced soil compaction of the interrows to >0.45 m, beyond the working depth of the subsoiler (0.25 m). Composts and mulches in combination with SSD significantly reduced deep-seated compaction of the interrows within 3 years of annual application. Further, the Conventional practice equivalent treatment (Bare soil No-SSD R) was associated with significantly higher penetration resistance (PR) values as compared to the zero-tillage treatment (Bare soil No-SSD NR). These findings show that the extremely high levels of deep-seated compaction in interrows associated with re-ridging and foot and vehicular traffic can be alleviated using surface mulches in combination with SSD.

    Long-term impacts of repeated cover cropping and cultivation approaches on subsoil physical properties

    The intensification of arable agriculture has resulted in an increase in vehicle wheel load and the intensity of field operations, which has increased the risk and incidence of degradation in physical properties of the uncultivated subsoil layer. Biopores generated by the long-term, repeated use of specific cover crops within an arable rotation have been suggested as an approach to improve subsoil physical properties. Therefore, this paper aimed to determine the impact of long-term repeated cover cropping, and the interaction of rotation treatments with different cultivation approaches, on subsoil physical properties. Data were collected at the NIAB ‘Sustainable Trial for Arable Rotations’ long-term rotation and cultivation field experiment established in 2006. Rotation treatments comprised a brassica cover crop alternated annually with winter wheat (ALTCC) compared to continuous winter wheat (CWW). Cultivation treatments comprised PLOUGH (250 mm depth), and non-inversion cultivation at 250 mm (DEEP) and 100 mm (SHALLOW) depths. Penetration resistance and volumetric soil moisture were collected at bi-monthly intervals during the 2018/19 growing season. Undisturbed soil cores were collected for laboratory analyses of soil water retention, water stable aggregates, root morphology digital scanning and biomass, and X-ray computed tomography (CT). Results showed that treatment ALTCC combined with SHALLOW resulted in lower penetration resistance and increased moisture in the subsoil. This increased subsoil moisture persisted later into the season compared to the control. SHALLOW increased subsoil water retention, improved subsoil root morphology and increased subsoil porosity. Benefits from treatment ALTCC were not observed where it was combined with higher intensity, deeper cultivation. Overall, the combination of treatments ALTCC with SHALLOW produced significant benefits to subsoil physical properties.

    Impacts of long-term application of best management practices on yields and root carbohydrate content in asparagus (Asparagus officinalis) (UK)

    Yield physiology of asparagus (Asparagus officinalis L.) is strongly influenced by biotic factors such as crown and root rot caused by Fusarium spp. and by abiotic conditions such as precipitation or temperatures, duration of each harvest, and field management practices. Asparagus yields are linked to the availability of soluble carbohydrates (CHO) in the storage root system which is considered a key factor in asparagus productivity. The aim of this study was to quantify the impacts of the long-term application of a range of potential Best Management Practices (BMPs) on yield and storage root carbohydrate content in green asparagus in a long-term field trial. The trial was established in 2016 with the asparagus ‘Gijnlim’ variety. Commercial yields were collected in 2018, 2019 and 2020. Root carbohydrate content was determined in 2019 and 2020. BMPs included (1) companion crops - Rye (Secale cereale L.), Mustard (Sinapis alba L.), (2) interrow surface mulch applications of either straw mulch or PAS 100 compost (Publicly available specification) in combination with shallow soil disturbance (SSD), (3) the conventional practice and modifications of the conventional tillage practice by applying SSD or not applying SSD and (4) a zero-tillage option. Annual re-ridging (R) and not ridging (NR) were applied to BMP options 1–3. SSD had no significant impact on asparagus yields while annual re-ridging negatively affected total yields of treatments with bare soil interrows, which were managed without SSD. Conventional practice was associated with a 22% yield reduction and ∼€4250 ha−1 annual loss in potential revenue as compared to the Zero-tillage treatment. Companion cropping with mustard did not have a significant impact on asparagus yields. Rye without annual re-ridging was however associated with yield reductions of > 20% as compared to the Conventional practice. 
PAS 100 compost applied in asparagus interrows (at 25 t ha−1 per year) in combination with SSD and without annual re-ridging resulted in yield improvements of 20%, 10% and 34% in 2018, 2019 and 2020, respectively, as compared to the Conventional practice. No correlation was observed between storage root soluble carbohydrate content and asparagus yields. The results of this study confirm that asparagus yield, and thus total farm income, can be significantly improved through implementation of several of the BMPs investigated. Funding: Agriculture and Horticulture Development Board (Projects FV450a and FV450b).

    Evaluating agroecological farming practices

    There are a range of definitions for agroecologically-related farming systems and practices. In brief, organic farming places strong restrictions on inputs, agroecological analyses often focus on principles, and regenerative farming typically emphasises the enhancement of soil health and the diversity of agricultural and wild species at a farm scale. Perhaps surprisingly, the role of agroecological systems in reducing net greenhouse gas emissions from food and farming is implicit rather than explicit. Despite some literature contrasting agroecological and technical approaches, many authors indicate that the desirability of farming practices should be determined by their impact at the appropriate scale. Sustainable intensification has been defined as maintaining or enhancing agricultural production while enhancing or maintaining the delivery of other ecosystem services. Approaches such as the Global Farm Metric and LEAF Marque Certification can support the integrated assessment of 12 groupings of attributes at a farm scale, covering inputs and outputs, and environmental and social impacts. In this report we reviewed the following 16 practices: crop rotations, conservation agriculture, cover crops, organic crop production, integrated pest management, the integration of livestock into crop systems, the integration of crops into livestock systems, field margin practices, pasture-fed livestock systems, multi-paddock grazing, organic livestock systems, tree crops, tree-intercropping, multistrata agroforestry and permaculture, silvopasture, and rewilding.

    The REFER (REFer for EchocaRdiogram) protocol: a prospective validation of a clinical decision rule, NT-proBNP, or their combination, in the diagnosis of heart failure in primary care. Rationale and design.

    BACKGROUND: Heart failure is a major cause of mortality and morbidity. As mortality rates are high, it is important that patients seen by general practitioners with symptoms suggestive of heart failure are identified quickly and treated appropriately. Identifying patients with heart failure, or deciding which patients need further tests, is a challenge. All patients with suspected heart failure should be diagnosed using objective tests such as echocardiography, but echocardiography is expensive, often delayed, and limited by a significant shortage of trained echocardiographers. Alternative approaches for diagnosing heart failure are currently limited. Clinical decision tools, which combine clinical signs, symptoms or patient characteristics, are designed to support clinical decision-making and are validated according to strict methodological procedures. The REFER Study aims to determine the accuracy and cost-effectiveness of our previously derived novel, simple clinical decision rule, a natriuretic peptide assay, or their combination, in the triage for referral for echocardiography of symptomatic adult patients who present in general practice with symptoms suggestive of heart failure. METHODS/DESIGN: This is a prospective, Phase II observational, diagnostic validation study of a clinical decision rule, natriuretic peptides or their combination, for diagnosing heart failure in primary care. Consecutive adult primary care patients aged 55 years or over presenting to their general practitioner with a chief complaint of recent new-onset shortness of breath, lethargy or peripheral ankle oedema of over 48 hours' duration, with no obvious recurrent, acute or self-limiting cause, will be enrolled. Our reference standard is based upon a three-step expert specialist consensus using echocardiography and clinical variables and tests.
DISCUSSION: Our clinical decision rule offers a potential solution to the diagnostic challenge of providing a timely and accurate diagnosis of heart failure in primary care. Study results will provide an evidence-base from which to develop heart failure care pathway recommendations and may be useful in standardising care. If demonstrated to be effective, the clinical decision rule will be of interest to researchers, policy makers and general practitioners worldwide. TRIAL REGISTRATION: ISRCTN17635379.

    Indicators of soil quality - Physical properties (SP1611). Final report to Defra

    The condition of soil determines its ability to carry out diverse and essential functions that support human health and wellbeing. These functions (or ecosystem goods and services) include producing food, storing water, carbon and nutrients, protecting our buried cultural heritage and providing a habitat for flora and fauna. Therefore, it is important to know the condition or quality of soil and how this changes over space and time in response to natural factors (such as changing weather patterns) or to land management practices. Meaningful soil quality indicators (SQIs), based on physical, biological or chemical soil properties are needed for the successful implementation of a soil monitoring programme in England and Wales. Soil monitoring can provide decision makers with important data to target, implement and evaluate policies aimed at safeguarding UK soil resources. Indeed, the absence of agreed and well-defined SQIs is likely to be a barrier to the development of soil protection policy and its subsequent implementation. This project assessed whether physical soil properties can be used to indicate the quality of soil in terms of its capacity to deliver ecosystem goods and services. The 22 direct (e.g. bulk density) and 4 indirect (e.g. catchment hydrograph) physical SQIs defined by Loveland and Thompson (2002) and subsequently evaluated by Merrington et al. (2006), were re-visited in the light of new scientific evidence, recent policy drivers and developments in sampling techniques and monitoring methodologies (Work Package 1). The culmination of these efforts resulted in 38 direct and 4 indirect soil physical properties being identified as potential SQIs. Based on the gathered evidence, a ‘logical sieve’ was used to assess the relative strengths, weaknesses and suitability of each potential physical SQI for national scale soil monitoring. 
Each soil physical property was scored in terms of: soil function (does the candidate SQI reflect all soil functions?); land use (does the candidate SQI apply to all land uses found nationally?); soil degradation (can the candidate SQI express soil degradation processes?); and whether the candidate SQI meets the challenge criteria used by Merrington et al. (2006). This approach enabled a consistent synthesis of available information and the semi-objective, semi-quantitative and transparent assessment of indicators against a series of scientific and technical criteria (Ritz et al., 2009; Black et al., 2008). The logical sieve was shown to be a flexible decision-support tool to assist a range of stakeholders with different agendas in formulating a prioritised list of potential physical SQIs. This was explored further by members of the soil science and soils policy community at a project workshop. By emphasising the current key policy-related soil functions (i.e. provisioning and regulating), the logical sieve was used to generate scores which were then ranked to identify the most qualified SQIs. The process selected 18 candidate physical SQIs. This list was further filtered to move from the ‘narrative’ to a more ‘numerical’ approach, in order to test the robustness of the candidate SQIs through statistical analysis and modelling (Work Package 2). The remaining 7 physical SQIs were: depth of soil; soil water retention characteristics; packing density; visual soil assessment / evaluation; rate of erosion; sealing; and aggregate stability. For these SQIs to be included in a robust national soil monitoring programme, we investigated the uncertainty in their measurement; the spatial and temporal variability in the indicator as given by observed distributions; and the expected rate of change in the indicator. Whilst a baseline is needed (i.e.
the current state of soil), it is the rate of change in soil properties and the implications of that change in terms of soil processes and functioning that are key to effective soil monitoring. Where empirical evidence was available, power analysis was used to understand the variability of indicators as given by the observed distributions. This process determines the ability to detect a particular change in the SQI at a particular confidence level, given the ‘noise’ or variability in the data (i.e. a particular power to detect a change of ‘X’ at a confidence level of ‘Y%’ would require ‘N’ samples). However, the evidence base for analysing the candidate SQIs is poor: data are limited in spatial and temporal extent for England and Wales, in terms of a) the degree (magnitude) of change in the SQI which significantly affects soil processes and functions (i.e. ‘meaningful change’), and b) the change in the SQI that is detectable (i.e. what sample size is needed to detect the meaningful signal from the variability or noise in the signal). This constrains the design and implementation of a scientifically and statistically rigorous and reliable soil monitoring programme. Evidence that is available suggests that what constitutes meaningful change will depend on soil type, current soil state, land use and the soil function under consideration. However, when we tested this by analysing detectable changes in packing density and soil depth (because data were available for these SQIs) over different land covers and soil types, no relationships were found. Schipper and Sparling (2000) identify the challenge: “a standardised methodology may not be appropriate to apply across contrasting soils and land uses. However, it is not practical to optimise sampling and analytical techniques for each soil and land use for extensive sampling on a national scale”. 
Despite the paucity of data, all seven SQIs have direct relevance to current and likely future soil and environmental policy, because they can be related (qualitatively) to soil processes, soil functions and the delivery of ecosystem goods and services. Even so, meaningful and detectable changes in physical SQIs may be out of step with any soil policy change, and it is not usually possible to link particular changes in SQIs to particular policy activities. This presents challenges in ascertaining trends that can feed into policy development or be used to gauge the effectiveness of soil protection policies (Work Package 3). Of the seven candidate physical SQIs identified, soil depth and surface sealing are regarded by many as indicators of soil quantity rather than quality. Visual soil evaluation (VSE) is currently not suited to soil monitoring in the strictest sense, as its semi-qualitative basis cannot be analysed statistically. Also, few data exist on how visual evaluation scores relate to soil functions. However, some studies have begun to investigate how VSE might be moved to a more quantified scale, and the method has some potential as a low-cost field technique to assess soil condition. Packing density requires data on bulk density and clay content, both of which are highly variable, so compounding the error term associated with this physical SQI. More evidence is needed to show how ‘meaningful’ change in aggregate stability affects soil processes and thus soil functions (for example, using the limited data available, an equivocal relationship was found with water regulation / runoff generation). The analysis of available data has given promising results regarding the prediction of soil water retention characteristics and packing density from relatively easy-to-measure soil properties (bulk density, texture and organic C) using pedotransfer functions.
Expanding the evidence base is possible with the development of rapid, cost-effective techniques such as NIR sensors to measure soil properties. Defra project SP1303 (Brazier et al., 2012) used power analyses to estimate the number of monitoring locations required to detect a statistically significant change in soil erosion rate on cultivated land. However, establishing what constitutes a meaningful change in erosion rates still requires data on the impacts of erosion on soil functions. Priority cannot be given amongst the seven SQIs, because the evidence base for each varies in its robustness and extent. Lack of data (including uncertainty in measurement and variability in observed distributions) applies to individual SQIs; attempts at integrating more than one SQI (including physical, biological and chemical SQIs) to improve associations between soil properties and processes / functions are only likely to propagate errors. We explored whether existing monitoring programmes can be adapted to incorporate additional measurement of physical SQIs, considering options where one or more of the candidate physical SQIs might be implemented into soil monitoring programmes (e.g. as a new national monitoring scheme; as part of the Countryside Survey; or as part of the National Soil Inventory). The challenge is to decide whether carrying out soil monitoring that is not statistically robust is still valuable in answering questions regarding current and future soil quality. The relationship between physical (and other) SQIs, soil processes and soil functions is complex, as is how this influences the delivery of ecosystem services. Important gaps remain even in the formulation of a conceptual model for these inter-relationships, let alone their quantification. There is also a question of whether individual quantitative SQIs can be related to ecosystem services, given the number of variables involved.
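The power-analysis logic described above (a particular power to detect a change of ‘X’ at a confidence level of ‘Y%’ requires ‘N’ samples) can be illustrated with the standard normal-approximation sample-size formula for a one-sample, two-sided test. The σ and Δ values below are illustrative placeholders, not figures taken from the report's datasets.

```python
import math
from statistics import NormalDist

def samples_needed(delta, sigma, alpha=0.05, power=0.8):
    """Samples needed to detect a mean change `delta` in an indicator
    whose variability ('noise') is `sigma`, at significance `alpha`
    and statistical power `power` (normal approximation)."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # ~1.96 for alpha = 0.05
    z_b = NormalDist().inv_cdf(power)           # ~0.84 for power = 0.8
    return math.ceil(((z_a + z_b) * sigma / delta) ** 2)

# Illustrative: detect a 0.05 g/cm^3 shift in packing density when the
# observed standard deviation is 0.15 g/cm^3
n = samples_needed(delta=0.05, sigma=0.15)
```

The formula makes the report's point concrete: as the meaningful change shrinks relative to the noise in the observed distribution, the required number of monitoring locations grows quadratically.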

    A DNA-barcode biodiversity standard analysis method (DNA-BSAM) reveals a large variance in the effect of a range of biological, chemical and physical soil management interventions at different sites, but location is one of the most important aspects determining the nature of agricultural soil microbiology

    There are significant knowledge gaps in our understanding of how to sustainably manage agricultural soils to preserve soil biodiversity. Here we evaluate and quantify the effects of agricultural management and location on soil microbiology, using DNA barcode sequence data from nine field trials in the United Kingdom that have consistently applied different soil management practices. We tested the basic hypothesis that various agricultural management interventions have a significant and greater effect on soil bacterial and fungal diversity than geographic location. The analysis of soil microbial DNA sequence data has to date lacked standardisation, which prevents meaningful comparisons across sites and studies. Therefore, to analyse these data and, crucially, to compare and quantify the size of any effects on soil bacterial and fungal biodiversity between sites, we developed and employed a post-sequencing DNA-barcode biodiversity standard analysis method (DNA-BSAM). The DNA-BSAM comprises a series of standardised bioinformatic steps for processing sequences but, more importantly, defines a standardised set of ecological indices and statistical tests. Use of the DNA-BSAM reveals the hypothesis was not strongly supported, primarily because: 1) there was a large variance in the effects of various management interventions at different sites, and 2) location had an equivalent or greater effect size than most management interventions for most metrics. Some dispersed sites imposed the same organic amendment interventions but showed different responses; this, combined with observations of strong differences in soil microbiomes by location, tentatively suggests that any effect of management may be contingent on location. This means it could be unreliable to extrapolate the findings of individual trials to others.
The widespread use of a standard approach will allow meaningful cross-comparisons between soil microbiome studies, and thus a substantial evidence base of the effects of land-use on soil microbiology to accumulate and inform soil management decisions. Funding: Agriculture and Horticulture Development Board (AHDB); British Beet Research Organisation (BBRO).
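As an illustration of the kind of standardised ecological index a pipeline such as DNA-BSAM computes per sample, the sketch below calculates Shannon diversity from a vector of taxon read counts. The counts are invented; the actual index set, formulas and statistical tests are those defined by the DNA-BSAM itself.

```python
import math

def shannon(counts):
    """Shannon diversity H' = -sum(p_i * ln p_i) over taxa with
    nonzero read counts in one sample."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# Invented read-count vectors for two samples with four taxa each
sample_a = [120, 30, 5, 5]     # skewed community -> lower diversity
sample_b = [40, 40, 40, 40]    # perfectly even -> maximal H' = ln(4)
```

Standardising which indices are computed, and how, is what makes values like these comparable across sites and studies.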

    MICE or NICE? An economic evaluation of clinical decision rules in the diagnosis of heart failure in primary care.

    BACKGROUND: Detection and treatment of heart failure (HF) can improve quality of life and reduce premature mortality. However, symptoms such as breathlessness are common in primary care, have a variety of causes, and not all patients require cardiac imaging. In systems where healthcare resources are limited, ensuring that patients who are likely to have HF undergo appropriate and timely investigation is vital. DESIGN: A decision tree was developed to assess the cost-effectiveness of using the MICE (Male, Infarction, Crepitations, Edema) decision rule compared to other diagnostic strategies to identify HF patients presenting to primary care. METHODS: Data from REFER (REFer for EchocaRdiogram), a HF diagnostic accuracy study, were used to determine which patients received the correct diagnosis decision. The model adopted a UK National Health Service (NHS) perspective. RESULTS: The current recommended National Institute for Health and Care Excellence (NICE) guideline for identifying patients with HF was the most cost-effective option, with a cost of £4400 per quality-adjusted life year (QALY) gained compared to a "do nothing" strategy. That is, patients presenting with symptoms suggestive of HF should be referred straight for echocardiography if they have a history of myocardial infarction or if their NT-proBNP level is ≥ 400 pg/ml. The MICE rule was more expensive and less effective than the other comparators. Base-case results were robust to sensitivity analyses. CONCLUSIONS: This represents the first cost-utility analysis comparing HF diagnostic strategies for symptomatic patients. Current guidelines in England were the most cost-effective option for identifying patients for confirmatory HF diagnosis. The low number of HF with reduced ejection fraction patients (12%) in the REFER patient population limited the benefits of early detection.
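The NICE-style referral criterion and the cost-per-QALY comparison described above reduce to two small calculations, sketched below. The referral thresholds follow the abstract (history of myocardial infarction, or NT-proBNP ≥ 400 pg/ml); the cost and QALY figures in the example are illustrative, not the study's model inputs.

```python
def refer_for_echo(history_of_mi: bool, nt_probnp_pg_ml: float) -> bool:
    """NICE-style triage as stated in the abstract: refer straight for
    echocardiography on prior MI or NT-proBNP >= 400 pg/ml."""
    return history_of_mi or nt_probnp_pg_ml >= 400.0

def icer(cost_new: float, qaly_new: float,
         cost_old: float, qaly_old: float) -> float:
    """Incremental cost-effectiveness ratio: extra cost per QALY
    gained by strategy 'new' over strategy 'old'."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Illustrative numbers only: a strategy costing £2200 more per patient
# and yielding 0.5 extra QALYs has an ICER of £4400 per QALY
example_icer = icer(cost_new=2700.0, qaly_new=0.75,
                    cost_old=500.0, qaly_old=0.25)
```

In a cost-utility model like the one described, each candidate triage strategy is evaluated this way against a comparator, and the option with an acceptable ICER at the chosen willingness-to-pay threshold is preferred.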