455 research outputs found

    Evaluation of seafloor infrastructure risk associated with submarine morphodynamics: Part 1 - Scoping


    Development of a Proximal Soil Sensing System for the Continuous Management of Acid Soil

    The notion that agriculturally productive land may be treated as a relatively homogeneous resource at the within-field scale is not sound. This assumption, and the subsequent uniform application of planting material, chemicals and/or tillage effort, may result in zones within a field being under- or over-treated. Arising from these are problems associated with the inefficient use of input resources, economically significant yield losses, excessive energy costs, gaseous or percolatory release of chemicals into the environment, unacceptable long-term retention of chemicals and a less-than-optimal growing environment. The environmental impact of crop production systems is substantial. In this millennium, three important issues for scientists and agrarian communities to address are the need to efficiently manage agricultural land for sustainable production, the maintenance of soil and water resources, and the environmental quality of agricultural land.

    Precision agriculture (PA) aims to identify soil and crop attribute variability, and manage it in an accurate and timely manner for near-optimal crop production. Unlike conventional agricultural management, where an averaged whole-field analytical result is employed for decision-making, management in PA is based on site-specific soil and crop information. That is, resource application and agronomic practices are matched with variation in soil attributes and crop requirements across a field or management unit. Conceptually, PA makes economic and environmental sense, optimising gross margins and minimising the environmental impact of crop production systems. Although the economic justification for PA can be readily calculated, concepts such as environmental containment and the safety of agrochemicals in soil are more difficult to estimate. However, it may be argued that if PA lessens the overall agrochemical load in agricultural and non-agricultural environments, then its value as a management system for agriculture increases substantially.

    Management using PA requires detailed information on the spatial and temporal variation in crop yield components, weeds, soil-borne pests and attributes of physical, chemical and biological soil fertility. However, detailed descriptions of fine-scale variation in soil properties have always been difficult and costly to obtain. Sensing and scanning technologies need to be developed to more efficiently and economically acquire accurate information on the extent and variability of soil attributes that affect crop growth and yield. The primary aim of this work is to conduct research towards the development of an 'on-the-go' proximal soil pH and lime requirement sensing system for real-time continuous management of acid soil. It is divided into four sections.

    Section one consists of two chapters; the first describes global and historical events that converged into the development of precision agriculture, while chapter two reviews statistical and geostatistical techniques used for the quantification of soil spatial variability, together with topics that are integral to the concept of precision agriculture. The review then focuses on technologies that are used for the complete enumeration of soil, namely remote and proximal sensing.

    Section two comprises three chapters that deal with sampling and mapping methods. Chapter three provides a general description of the environment in the experimental field, including the field site, topography, soil condition at the time of sampling, and the spatial variability of surface soil chemical properties. It also describes the methods of sampling and laboratory analyses. Chapter four discusses some of the implications of soil sampling on analytical results and presents a review that quantifies the accuracy, precision and cost of current laboratory techniques. The chapter also presents analytical results that show the loss of information in kriged maps of lime requirement resulting from decreases in sample size. The message of chapter four is that the evolution of precision agriculture calls for the development of 'on-the-go' proximal soil sensing systems to characterise soil spatial variability rapidly, economically, accurately and in a timely manner. Chapter five suggests that for sparsely sampled data the choice of spatial modelling and mapping techniques is important for reliable results and accurate representations of field soil variability. It assesses a number of geostatistical methodologies that may be used to model and map non-stationary soil data, in this instance soil pH and organic carbon. Intrinsic random functions of order k produced the most accurate and parsimonious predictions of all of the methods tested.

    Section three consists of two chapters whose theme is the sustainable and efficient management of acid agricultural soil. Chapter six discusses soil acidity: its causes, consequences and current management practices. It also reports the global extent of soil acidity and that which occurs in Australia. The chapter closes by proposing a real-time continuous management system for acid soil. Chapter seven reports results from experiments conducted towards the development of an 'on-the-go' proximal soil pH and lime requirement sensing system that may be used for the real-time continuous management of acid soil. Assessment of four potentiometric sensors showed that the pH Ion-Sensitive Field Effect Transistor (ISFET) was most suitable for inclusion in the proposed sensing system: it is accurate and precise, drift and hysteresis are low, and, most importantly, its response time is short. A design for the analytical system was presented based on flow injection analysis (FIA) and sequential injection analysis (SIA) concepts, and two different modes of operation were described. Kinetic experiments were conducted to characterise soil:0.01 M CaCl2 pH (pHCaCl2) and soil:lime requirement buffer (pH buffer) reactions. Modelling of the pH buffer reactions described their sequential, biphasic nature. A statistical methodology was devised to predict pH buffer measurements using only initial reaction measurements at 0.5 s, 1 s, 2 s and 3 s. The accuracy of the technique was 0.1 pH buffer units and the bias was low. Finally, the chapter describes a framework for the development of a prototype soil pH and lime requirement sensing system and the creative design of the system.

    The final section relates to the management of acid soil by liming. Chapter eight describes the development of empirical deterministic models for rapid predictions of lime requirement. The response surface models are based on soil:lime incubations, pH buffer measurements and the selection of target pH values. These models are more accurate and more practical than conventional techniques, and may be more suitably incorporated into the spatial decision-support system of the proposed real-time continuous system for the management of acid soil. Chapter nine presents a glasshouse liming experiment that was used to validate the lime requirement model derived in the previous chapter. It also presents soil property interactions and soil-plant relationships in acid and ameliorated soil, comparing the effects of no lime application, single-rate liming and variable-rate liming. Chapter ten presents a methodology for modelling crop yields in the presence of uncertainty. The local uncertainty about soil properties and the uncertainty about model parameters were accounted for by using indicator kriging and Latin Hypercube Sampling to propagate uncertainties through two regression functions: a yield response function and one that predicts the resultant pH after the application of lime. Under the assumptions and constraints of the analysis, single-rate liming was found to be the best management option.
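    The pH buffer prediction step lends itself to a compact illustration. Below is a minimal sketch, with entirely invented data and coefficients, of the kind of regression that maps early kinetic readings (0.5 s, 1 s, 2 s and 3 s) to an equilibrium pH buffer value; it is not the thesis's actual model, only a plausible reconstruction of the approach described.

        # Hypothetical sketch: predict equilibrium pH buffer from four early
        # ISFET readings via ordinary least squares. All data are simulated.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 50
        # Columns: readings at 0.5 s, 1 s, 2 s, 3 s (invented kinetics).
        early = 6.0 + 0.1 * rng.normal(0.0, 0.4, size=(n, 4)).cumsum(axis=1)
        ph_buffer = (5.5 + 0.8 * early[:, -1] - 0.2 * early[:, 0]
                     + rng.normal(0, 0.05, n))

        X = np.column_stack([np.ones(n), early])      # intercept + readings
        coef, *_ = np.linalg.lstsq(X, ph_buffer, rcond=None)

        # Predict for a new sample's first four readings (leading 1 = intercept).
        new = np.array([1.0, 6.10, 6.15, 6.22, 6.30])
        print("predicted pH buffer:", round(float(new @ coef), 2))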

    Developing Efficient Strategies For Global Sensitivity Analysis Of Complex Environmental Systems Models

    Complex Environmental Systems Models (CESMs) have been developed and applied as vital tools to tackle the ecological, water, food, and energy crises that humanity faces, and have been used widely to support decision-making about management of the quality and quantity of Earth's resources. CESMs are often controlled by many interacting and uncertain parameters, and typically integrate data from multiple sources at different spatio-temporal scales, which makes them highly complex. Global Sensitivity Analysis (GSA) techniques have proven promising for deepening our understanding of model complexity and of the interactions between various parameters, and for providing helpful recommendations for further model development and data acquisition. Aside from the complexity issue, the computationally expensive nature of CESMs precludes effective application of existing GSA techniques to quantifying the global influence of each parameter on the variability of CESM outputs, because a comprehensive sensitivity analysis often requires a very large number of model runs. There is therefore a need to break down this barrier by developing more efficient strategies for sensitivity analysis. The research undertaken in this dissertation focuses on alleviating the computational burden of GSA for computationally expensive CESMs by developing efficiency-increasing strategies for robust sensitivity analysis. This is accomplished by: (1) proposing an efficient sequential sampling strategy for robust sampling-based analysis of CESMs; (2) developing an automated parameter-grouping strategy for high-dimensional CESMs; (3) introducing a new robustness measure for convergence assessment of GSA methods; and (4) investigating time-saving strategies for handling simulation failures/crashes during the sensitivity analysis of computationally expensive CESMs. This dissertation provides a set of innovative numerical techniques that can be used in conjunction with any GSA algorithm and integrated into model building and systems analysis procedures in any field where models are used. A range of analytical test functions and environmental models of varying complexity and dimensionality are utilized across this research to test the performance of the proposed methods. These methods, which are embedded in the VARS-TOOL software package, can also provide information useful for diagnostic testing, parameter identifiability analysis, model simplification, model calibration, and experimental design. They can further be applied to a range of decision-making problems, such as characterizing the main causes of risk in the context of probabilistic risk assessment, and exploring CESMs' sensitivity to a wide range of plausible future changes (e.g., hydrometeorological conditions) in the context of scenario analysis.
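    As a concrete, generic illustration of what variance-based GSA computes, and of why model run counts grow with dimensionality, the sketch below estimates first-order Sobol indices for the Ishigami test function using the standard pick-freeze Monte Carlo scheme of Saltelli et al. (2010). This is a textbook method chosen for brevity, not the variogram-based VARS approach embedded in VARS-TOOL.

        # Generic first-order Sobol estimator on a cheap test function.
        # Cost is (d + 2) * n model runs, which motivates efficiency strategies.
        import numpy as np

        def ishigami(x, a=7.0, b=0.1):
            # Known analytical indices: S1 ~ 0.31, S2 ~ 0.44, S3 = 0.
            return (np.sin(x[:, 0]) + a * np.sin(x[:, 1]) ** 2
                    + b * x[:, 2] ** 4 * np.sin(x[:, 0]))

        rng = np.random.default_rng(42)
        d, n = 3, 100_000
        A = rng.uniform(-np.pi, np.pi, size=(n, d))
        B = rng.uniform(-np.pi, np.pi, size=(n, d))
        yA, yB = ishigami(A), ishigami(B)

        for i in range(d):
            ABi = A.copy()
            ABi[:, i] = B[:, i]                 # "freeze" all columns but i
            Si = np.mean(yB * (ishigami(ABi) - yA)) / yA.var()
            print(f"S_{i + 1} ~ {Si:.3f}")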

    Texture analysis and Its applications in biomedical imaging: a survey

    Texture analysis describes a variety of image analysis techniques that quantify the variation in intensity and pattern. This paper provides an overview of several texture analysis approaches, addressing the rationale supporting them, their advantages, drawbacks and applications. This survey's emphasis is on collecting and categorising over five decades of active research on texture analysis. Brief descriptions of different approaches are presented along with application examples. From a broad range of texture analysis applications, this survey's final focus is on biomedical image analysis. An up-to-date list of biological tissues and organs in which disorders produce texture changes that may be used to spot disease onset and progression is provided. Finally, the role of texture analysis methods as biomarkers of disease is summarised.
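    To make the idea concrete, the sketch below computes one of the classic texture descriptors such a survey covers: a grey-level co-occurrence matrix (GLCM) and two Haralick-style features derived from it. This is a minimal pure-NumPy illustration on synthetic data; production code would normally use a library implementation such as scikit-image's graycomatrix.

        # GLCM for one pixel offset, plus contrast and homogeneity features.
        import numpy as np

        def glcm(img, levels=8, dx=1, dy=0):
            """Joint probabilities of grey-level pairs at offset (dy, dx)."""
            q = (img * levels // (img.max() + 1)).astype(int)  # quantise
            P = np.zeros((levels, levels))
            h, w = q.shape
            for y in range(h - dy):
                for x in range(w - dx):
                    P[q[y, x], q[y + dy, x + dx]] += 1
            return P / P.sum()

        rng = np.random.default_rng(1)
        image = rng.integers(0, 256, size=(64, 64))   # synthetic 8-bit image

        P = glcm(image)
        i, j = np.indices(P.shape)
        contrast = np.sum(P * (i - j) ** 2)           # local intensity variation
        homogeneity = np.sum(P / (1.0 + np.abs(i - j)))
        print(f"contrast={contrast:.2f} homogeneity={homogeneity:.3f}")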

    Evaluating covariance-based geostatistical methods with bed-scale outcrop statistics conditioning for reproduction of intra-point bar facies architecture, Cretaceous Horseshoe Canyon Formation, Alberta, Canada

    Geostatistical characterization of petroleum reservoirs typically suffers from problems of sparse data, and modelers often draw key parameters from analogous outcrop, numerical, and experimental studies to improve predictions. While quantitative information (bed-scale statistical distributions) from outcrop studies is available, translating the data from outcrop to models and generating geologically realistic realizations with available geostatistical algorithms is often problematic. The overarching goal of this thesis is to test the capacity of covariance-based geostatistical methods to reproduce intra-point bar facies architecture while guiding those algorithms with bed-scale outcrop statistics from the Late Cretaceous Horseshoe Canyon Formation in southeastern Alberta. First, general facies architecture reproduction is tested with 2- and 3-facies synthetic and outcrop-based experiments with variable hard data, soft data weight, and soft data reliability. Next, 3-D sector models compare the performance of different geostatistical simulation methods: sequential/co-sequential indicator, plurigaussian, and nested truncated Gaussian. Findings show that despite integration of outcrop statistics, all conventional covariance-based geostatistical algorithms struggle to reproduce the complex facies architecture observed in outcrop. Specifically, problems arise with: 1) low-proportion facies and 2) a weak statistical relationship between hard data (measured sections) and soft data (probability models). Nested modeling partially mitigates low-proportion issues and performs better as a result.
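    For readers unfamiliar with the truncation idea behind two of the compared algorithms, the following is a deliberately simplified 1-D illustration with invented parameters: simulate a correlated Gaussian field, then truncate it at thresholds chosen to honour target facies proportions. Real plurigaussian or nested truncated-Gaussian workflows add rock-type rules and data conditioning that this toy omits.

        # Toy 1-D truncated-Gaussian facies simulation (unconditional).
        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(7)
        n, a = 200, 30.0                          # cells, variogram range
        h = np.abs(np.subtract.outer(np.arange(n, dtype=float),
                                     np.arange(n, dtype=float)))
        C = np.exp(-3.0 * h / a) + 1e-10 * np.eye(n)   # exponential covariance
        z = np.linalg.cholesky(C) @ rng.standard_normal(n)

        # Truncate into 3 facies with target proportions 0.2 / 0.5 / 0.3.
        thresholds = norm.ppf([0.2, 0.7])         # cumulative proportions
        facies = np.digitize(z, thresholds)       # codes 0, 1, 2
        print("realised proportions:", np.bincount(facies, minlength=3) / n)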

    An Agent-Based Variogram Modeller: Investigating Intelligent, Distributed-Component Geographical Information Systems

    Geo-Information Science (GIScience) is the field of study that addresses substantive questions concerning the handling, analysis and visualisation of spatial data. Geo-Information Systems (GIS), including software, data acquisition and organisational arrangements, are the key technologies underpinning GIScience. A GIS is normally tailored to the service it is supposed to perform. However, there is often the need to perform a function that is not supported by the GIS tool being used. The usual solution in these circumstances is to look for another tool that can provide the service, and often for an expert to use that tool. This is expensive, time consuming and certainly stressful for the geographical data analyst. On the other hand, GIS is often used in conjunction with other technologies to form a geocomputational environment. One of the complex tools in geocomputation is geostatistics. One of its functions is to provide the means to determine the extent of spatial dependencies within geographical data and processes. Spatial datasets are often large and complex. Currently, agent systems are being integrated into GIS to offer flexibility and allow better data analysis. The thesis examines the current application of agents within the GIS community, determining whether they are used to represent data or processes, or to act as a service. It then seeks to demonstrate the applicability of an agent-oriented paradigm as a service-based GIS, with the possibility of providing greater interoperability and reducing resource requirements (human and tools). In particular, analysis was undertaken to determine the need to introduce enhanced features to agents in order to maximise their effectiveness in GIS. This was achieved by addressing the complexity of software agent design and implementation for the GIS environment, and by suggesting possible solutions to encountered problems. Software agent characteristics and features (including the dynamic binding of plans to software agents in order to tackle the levels of complexity and range of contexts) were examined, alongside a discussion of current GIScience and the applications of agent technology to GIS: agents as entities, objects and processes. These concepts and their functionalities for GIS are then analysed and discussed. The extent of agent functionality, an analysis of the gaps, and the use of these technologies to express a distributed service providing an agent-based GIS framework are then presented. Thus, a general agent-based framework for GIS, and a novel agent-based architecture for a specific part of GIS, the variogram, were devised to examine the applicability of the agent-oriented paradigm to GIS. An examination of the current mechanisms for constructing variograms, and their underlying processes and functions, was undertaken, and these processes were then embedded into a novel agent architecture for GIS. Once a successful software agent implementation had been achieved, the corresponding tool was tested and validated, internally for code errors and externally to determine its functional requirements and whether it enhances the GIS process of dealing with data. Thereafter, it is compared with other known service-based GIS agents and its advantages and disadvantages analysed.
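    The computational core that such a variogram agent would wrap is small. Below is a minimal sketch, on synthetic data, of Matheron's method-of-moments estimator of the empirical semivariogram, the quantity any agent-based variogram service would have to compute before model fitting.

        # Empirical semivariogram (method-of-moments) for scattered 2-D data.
        import numpy as np

        def empirical_variogram(coords, values, n_bins=12, max_dist=None):
            """Return (lag-bin centres, mean semivariance per bin)."""
            d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
            g = 0.5 * (values[:, None] - values[None, :]) ** 2
            iu = np.triu_indices(len(values), k=1)      # each pair once
            d, g = d[iu], g[iu]
            if max_dist is None:
                max_dist = d.max() / 2                  # common rule of thumb
            edges = np.linspace(0.0, max_dist, n_bins + 1)
            idx = np.digitize(d, edges) - 1
            gamma = np.array([g[idx == b].mean() if np.any(idx == b) else np.nan
                              for b in range(n_bins)])
            return 0.5 * (edges[:-1] + edges[1:]), gamma

        rng = np.random.default_rng(3)
        pts = rng.uniform(0, 100, size=(300, 2))        # sample locations
        vals = np.sin(pts[:, 0] / 20) + 0.2 * rng.standard_normal(300)
        lags, gamma = empirical_variogram(pts, vals)
        print(np.round(gamma, 3))                       # rises, then levels off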

    Ricerche di Geomatica 2011

    This volume collects the articles that competed for the AUTeC 2011 Prize. The prize was established in 2005 and is awarded each year to a doctoral thesis judged particularly significant in the subject areas of the SSD ICAR/06 (Topography and Cartography) across the doctoral programmes active in Italy.

    Mineral resource evaluation of a platinum tailings resource: a case study

    A Research Report submitted in partial fulfilment of the requirements for the degree of Master of Science in Engineering (Mining) to the Faculty of Engineering and the Built Environment, University of the Witwatersrand, Johannesburg, South Africa, July 2017.

    The project investigated the application of geostatistical techniques in evaluating a mechanically deposited platinum tailings resource. The project was undertaken on one of the Anglo American Platinum tailings dams, the identity of which cannot be revealed under the terms of the confidentiality agreement in place. Remnant unrecovered minerals of economic potential still exist in tailings dams. These unrecovered minerals have led several mining companies to turn their attention to the economic potential that remains in tailings, making tailings a key strategic component of their resources and reserves. Geostatistics has been developed, thoroughly tested and improved to address the challenges of estimating in situ geological ore bodies. The main aim of this research project is to test whether these fundamental principles and theories of geostatistics are relevant and appropriate for evaluating man-made ore bodies, such as a platinum tailings dam, without any significant changes to the underlying principles or estimation algorithms. The findings on the Case Study tailings resource can be applied to the evaluation of other tailings dams, as well as other man-made structures with related characteristics, such as low-grade rock dumps and muck piles.

    A standard methodology was followed to evaluate the Case Study tailings resource. Drilling and sampling were conducted through sonic drilling, a dry drilling technique that is suitable for sampling unconsolidated particles such as tailings. Samples were then sent to the laboratory to establish the grade (concentration) of Platinum Group Metals (platinum, palladium and rhodium), gold and base metals (copper and nickel). Density was also measured and comprehensively analysed as one of the variables of interest in this research. Statistical analyses were performed on all variables of interest contained in the dam: Platinum (Pt), Palladium (Pd), Gold (Au), 3E (two PGMs plus gold), Copper (Cu), Nickel (Ni) and density. The underlying statistical distributions of all metals and density were found to be non-symmetrical and slightly positively skewed, although the skewness was established to be marginal. Differences between raw (untransformed) data averages and the log-normal estimates were analysed and found to be insignificant. Ordinary Kriging of untransformed data was therefore concluded to be the appropriate geostatistical technique for the Case Study tailings resource.

    Analysis of mineralisation continuity (variography), a prerequisite for geostatistical techniques such as the Ordinary Kriging applied to the Case Study tailings resource, was also performed. Reasonable and sufficient mineralisation continuity was established to exist in the Case Study tailings resource. Although characterised by a high nugget effect, the spatial correlations were established to be continuous, with ranges of influence well beyond 450 m for all variables. Anisotropic variograms were modelled for all variables and comprise nested structures with two to three spherical models. Resource estimation was conducted through Ordinary Kriging in Datamine. All seven variables were successfully interpolated into each cell of the 5 m x 5 m x 5 m block model.
    Rigorous validation of the resource model was performed to establish the quality and reliability of the estimation. The estimated resource model was analysed against the original borehole data through comparison of grade profiles, statistical analysis, QQ plots and histograms. The grade profiles were found to be similar between the boreholes (5 m composites) and the adjacent estimated cells. Furthermore, statistical analyses revealed minimal differences between the means of the estimated model and the original borehole data: the highest difference was 1.7%, realised on 3E, followed by 1.1% on density and gold (Au); the remaining variables (Pt, Pd, Cu and Ni) have differences below 1%. QQ plots and histograms were plotted from the resource model with 5 m x 5 m x 5 m cells and from the 5 m composited boreholes. Although these data sets are of different (slightly incompatible) supports, the intended purpose of comparing distributions was achieved. The QQ plots and histograms revealed approximately identically shaped distributions for the two data sets, with some minor deviations noticeable in the graphs of only two variables (Au and density) that are underlain by two populations. The validation process gave compelling assurance of the quality and reliability of the resource model produced. The Case Study tailings resource was therefore successfully estimated by Ordinary Kriging. The results achieved on the Case Study tailings dam demonstrate that geostatistical principles and theories can confidently be applied, in their current form, to any man-made tailings resource.
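    As an illustration of the estimation step, the sketch below solves the Ordinary Kriging system, with its Lagrange multiplier, for a single target location under an assumed single-structure spherical variogram. The variogram parameters, borehole layout and grades are invented for demonstration and are not the study's fitted nested models.

        # Ordinary Kriging of one point from scattered samples (illustrative).
        import numpy as np

        def spherical(h, nugget=0.3, sill=1.0, a=450.0):
            """Spherical semivariogram; a = range (cf. the ~450 m reported)."""
            h = np.asarray(h, dtype=float)
            g = nugget + (sill - nugget) * (1.5 * h / a - 0.5 * (h / a) ** 3)
            return np.where(h >= a, sill, np.where(h == 0.0, 0.0, g))

        def ordinary_krige(coords, values, target):
            n = len(values)
            d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
            # System: [[Gamma, 1], [1, 0]] @ [weights, mu] = [gamma0, 1]
            A = np.ones((n + 1, n + 1))
            A[:n, :n] = spherical(d)
            A[n, n] = 0.0
            b = np.ones(n + 1)
            b[:n] = spherical(np.linalg.norm(coords - target, axis=1))
            w = np.linalg.solve(A, b)[:n]               # kriging weights
            return float(w @ values)

        rng = np.random.default_rng(5)
        holes = rng.uniform(0, 500, size=(25, 2))       # collar positions (m)
        grade = 1.5 + 0.3 * rng.standard_normal(25)     # e.g. Pt grade (g/t)
        print("estimate:", round(ordinary_krige(holes, grade,
                                                np.array([250.0, 250.0])), 3))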

    Elucidating the performance of hybrid models for predicting extreme water flow events through variography and wavelet analyses

    Accurate prediction of extreme flow events is important for mitigating natural disasters such as flooding. We explore and refine two modelling approaches (both separately and in combination) that have been demonstrated to improve the prediction of daily peak flow events. These two approaches are, firstly, models that aggregate fine-resolution (sub-daily) simulated flow from a process-based model (PBM) to daily, and secondly, hybrid models that combine PBMs with statistical and machine learning methods. We propose the use of variography and wavelet analyses to evaluate these models across temporal scales. These exploratory methods are applied to both measured and modelled data in order to assess the performance of the latter in capturing variation, at different scales, of the former. We compare change points detected by the wavelet analysis (measured and modelled) with the extreme flow events identified in the measured data. We found that combining the two modelling approaches improves prediction at finer scales, but at coarser scales the advantages are less pronounced. Although aggregating fine-scale model outputs improved the partition of wavelet variation across scales, the autocorrelation in the signal is less well represented, as demonstrated by variography. We demonstrate that exploratory time-series analyses, using variograms and wavelets, provide a useful assessment of existing and newly proposed models, with respect to how they capture changes in flow variance at different scales and how this correlates with measured flow data, all in the context of extreme flow events.
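    Both exploratory tools the paper combines are easy to sketch. The following is an assumed, minimal reconstruction on a synthetic daily series, not the authors' code: a temporal semivariogram over short lags, and a Haar wavelet variance computed by repeated pairwise differencing and averaging, which partitions the signal's variance across dyadic scales.

        # Temporal variogram and Haar wavelet variance of a synthetic series.
        import numpy as np

        rng = np.random.default_rng(11)
        t = np.arange(1024)
        flow = np.sin(2 * np.pi * t / 365) + 0.3 * rng.standard_normal(t.size)

        # Semivariogram: gamma(h) = 0.5 * mean((z[t+h] - z[t])^2).
        lags = np.arange(1, 60)
        gamma = np.array([0.5 * np.mean((flow[h:] - flow[:-h]) ** 2)
                          for h in lags])

        # Haar wavelet variance per dyadic scale.
        wavelet_var, signal = [], flow.copy()
        while signal.size >= 2:
            pairs = signal[: signal.size // 2 * 2].reshape(-1, 2)
            detail = (pairs[:, 0] - pairs[:, 1]) / np.sqrt(2)
            wavelet_var.append(detail.var())            # variance at this scale
            signal = pairs.sum(axis=1) / np.sqrt(2)     # coarser approximation

        print("gamma(1), gamma(30):", gamma[0].round(3), gamma[29].round(3))
        print("wavelet variance, scales 1-5:", np.round(wavelet_var[:5], 3))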