1,004 research outputs found
Hydraulic head and groundwater 111Cd content interpolations using empirical Bayesian kriging (EBK) and geo-adaptive neuro-fuzzy inference system (geo-ANFIS)
In this study, hydraulic head and 111Cd interpolations based on the geo-adaptive neuro-fuzzy inference system (Geo-ANFIS) and empirical Bayesian kriging (EBK) were performed for the alluvium unit of Karabağlar Polje in Muğla, Turkey. Hydraulic head measurements and 111Cd analyses were done for 42 water wells during a snapshot campaign in April 2013. The main objective of this study was to compare Geo-ANFIS and EBK to interpolate the hydraulic head and 111Cd content of groundwater. Both models were applied to the same case study: the alluvium of Karabağlar Polje, which covers an area of 25 km² in the Muğla basin, in the southwest of Turkey. The ANFIS method (called ANFISXY) uses two reduced centred pre-processed inputs, the cartesian coordinates (XY). Geo-ANFIS was tested on 100 random subsets of 8 of the 42 data points, with the remaining data used to train and validate the models. ANFISXY and EBK were then used to interpolate the hydraulic head and heavy metal distribution, on a 50 m² grid covering the study area for ANFISXY and a 100 m² grid for EBK. Both EBK- and ANFISXY-simulated hydraulic head and 111Cd distributions exhibit realistic patterns, with RMSE < 9 m and RMSE < 8 μg/L, respectively. In conclusion, EBK can be considered a better interpolation method than ANFISXY for both parameters.
Keywords: ANFIS, EBK, interpolation, hydraulic head, metal, 111Cd, alluvium, Muğl
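The comparison in this abstract rests on hold-out validation: withhold a subset of the 42 wells, interpolate their values from the remaining wells, and score the result with RMSE. A minimal sketch of that procedure, using simple inverse-distance weighting as a stand-in interpolator (neither EBK nor ANFIS is reproduced here) and synthetic well data:

```python
import math
import random

def idw(x, y, wells, power=2.0):
    """Inverse-distance-weighted estimate at (x, y) from (xi, yi, value) wells."""
    num = den = 0.0
    for xi, yi, v in wells:
        d = math.hypot(x - xi, y - yi)
        if d == 0.0:
            return v  # exact hit on a well
        w = d ** -power
        num += w * v
        den += w
    return num / den

def holdout_rmse(wells, n_test=8, seed=0):
    """Withhold n_test wells, interpolate them from the rest, report RMSE."""
    rng = random.Random(seed)
    test = rng.sample(wells, n_test)
    train = [w for w in wells if w not in test]
    err2 = [(idw(x, y, train) - v) ** 2 for x, y, v in test]
    return math.sqrt(sum(err2) / n_test)

# 42 synthetic wells on a gently sloping head surface with noise
rng = random.Random(1)
wells = []
for _ in range(42):
    x, y = rng.uniform(0, 5000), rng.uniform(0, 5000)
    wells.append((x, y, 100.0 - 0.004 * x + rng.gauss(0, 1)))

print(f"hold-out RMSE: {holdout_rmse(wells):.2f} m")
```

Repeating `holdout_rmse` over many seeds approximates the 100-random-subset test described in the abstract.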
Quantification of uncertainty of geometallurgical variables for mine planning optimisation
Interest in geometallurgy has increased significantly over the past 15 years or
so because of the benefits it brings to mine planning and operation. Its use
and integration into design, planning and operation is becoming increasingly
critical especially in the context of declining ore grades and increasing mining
and processing costs.
This thesis, comprising four papers, offers methodologies and methods to
quantify geometallurgical uncertainty and enrich the block model with geometallurgical
variables, which contribute to improved optimisation of mining
operations. This enhanced block model is termed a geometallurgical block
model.
Bootstrapped non-linear regression models by projection pursuit were built
to predict grindability indices and recovery, and quantify model uncertainty.
These models are useful for populating the geometallurgical block model with
response attributes. New multi-objective optimisation formulations for block
caving mining were formulated and solved by a meta-heuristics solver focussing
on maximising the project revenue and, at the same time, minimising
several risk measures. A novel clustering method, which is able to use
both continuous and categorical attributes and incorporate expert knowledge,
was also developed for geometallurgical domaining which characterises the
deposit according to its metallurgical response. The concept of geometallurgical
dilution was formulated and used for optimising production scheduling in
an open-pit case study.
Thesis (Ph.D.) (Research by Publication) -- University of Adelaide, School of Civil, Environmental and Mining Engineering, 201
AUTOMATED INTERPRETATION OF THE BACKGROUND EEG USING FUZZY LOGIC
A new framework is described for managing uncertainty and for dealing with artefact
corruption, to introduce objectivity into the interpretation of the electroencephalogram
(EEG).
Conventionally, EEG interpretation is time consuming and subjective, and is known to
show significant inter- and intra-personnel variation. A need thus exists to automate the
interpretation of the EEG to provide a more consistent and efficient assessment.
However, automated analysis of EEGs by computers is complicated by two major
factors: the difficulty of adequately capturing, in machine form, the skills and subjective
expertise of the experienced electroencephalographer, and the lack of a reliable means of
dealing with the range of EEG artefacts (signal contamination). In this thesis, a new
framework is described which introduces objectivity in two important outcomes of
clinical evaluation of the EEG, namely, the clinical factual report and the clinical
'conclusion', by capturing the subjective expertise of the electroencephalographer and
dealing with the problem of artefact corruption.
The framework is separated into two stages to assist piecewise optimisation and to cater
for different requirements. The first stage, 'quantitative analysis', relies on novel digital
signal processing algorithms and cluster analysis techniques to reduce data and identify
and describe background activities in the EEG. To deal with artefact corruption, an
artefact removal strategy, based on new reliable techniques for artefact identification, is
used to ensure that only artefact-free activities are used in the analysis. The outcome is a
quantitative analysis, which efficiently describes the background activity in the record,
and can support future clinical investigations in neurophysiology. In clinical practice,
many of the EEG features are described by the clinicians in natural language terms, such
as very high, extremely irregular, somewhat abnormal etc. The second stage of the
framework, 'qualitative analysis', captures the subjectivity and linguistic uncertainty
expressed by the clinical experts, using novel, intelligent models, based on fuzzy logic, to
provide an analysis closely comparable to the clinical interpretation made in practice.
The outcome of this stage is an EEG report with qualitative descriptions to complement
the quantitative analysis.
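The qualitative stage described above maps quantitative EEG features to natural-language terms. A minimal sketch of such a fuzzy linguistic mapping, with hypothetical triangular membership functions over a normalised feature (the actual terms, features and shapes used in the thesis are not reproduced here):

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b over support (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical linguistic terms for a normalised EEG feature in [0, 1]
TERMS = {
    "low":       lambda x: tri(x, -0.01, 0.0, 0.35),
    "moderate":  lambda x: tri(x, 0.2, 0.5, 0.8),
    "very high": lambda x: tri(x, 0.65, 1.0, 1.01),
}

def describe(x):
    """Return the linguistic term with the highest membership for x."""
    return max(TERMS, key=lambda t: TERMS[t](x))

print(describe(0.12))  # prints "low"
```

In a full system the memberships would feed fuzzy inference rules rather than a simple maximum, but the term-assignment idea is the same.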
The system was evaluated using EEG records from 1 patient with Alzheimer's disease
and 2 age-matched normal controls for the factual report, and 3 patients with Alzheimer's
disease and 7 age-matched normal controls for the 'conclusion'. Good agreement was
found between factual reports produced by the system and factual reports produced by
qualified clinicians. Further, the 'conclusion' produced by the system achieved 100%
discrimination between the two subject groups. After a thorough evaluation, the system
should significantly aid the process of EEG interpretation and diagnosis.
Novel fuzzy techniques for modelling human decision making
Standard (type-1) fuzzy sets were introduced to resemble human reasoning in its use of approximate information and uncertainty to generate decisions. Since knowledge can be expressed in a more natural way by using fuzzy sets, many decision problems can be greatly simplified. However, standard type-1 fuzzy sets have limitations when it comes to modelling human decision making.
In many applications involving the modelling of human decision making (expert systems), the more traditional membership functions do not provide a wide enough choice for the system developer, who is therefore missing an opportunity to produce simpler or better systems. The use of complex non-convex membership functions in the context of human decision making systems was investigated. It was demonstrated that non-convex membership functions are plausible, reasonable membership functions in the sense originally intended by Zadeh.
All humans, including 'experts', exhibit variation in their decision making. To date, it has been an implicit assumption that expert systems, including fuzzy expert systems, should not exhibit such variation. Type-2 fuzzy sets feature membership functions that are themselves fuzzy sets. While type-2 fuzzy sets capture uncertainty by introducing a range of membership values associated with each value of the base variable, they do not capture the notion of variability. To overcome this limitation of type-2 fuzzy sets, Garibaldi previously proposed the term 'non-deterministic fuzzy reasoning', in which variability is introduced into the membership functions of a fuzzy system through random alterations to the parameters.
In this thesis, this notion is extended and formalised through the introduction of the non-stationary fuzzy set. The concept of random perturbations that can be used for generating these non-stationary fuzzy sets is proposed. The footprint of variation (FOV) is introduced to describe the area covering the range from the minimum to the maximum fuzzy sets which comprise the non-stationary fuzzy set (this is similar to the footprint of uncertainty of type-2 sets). Basic operators, i.e. union, intersection and complement, for non-stationary fuzzy sets are also proposed. Proofs that non-stationary fuzzy sets satisfy the set-theoretic laws are also given in this thesis.
It can be observed that, firstly, a non-stationary fuzzy set is a collection of type-1 fuzzy sets in which there is an explicit, defined relationship between the fuzzy sets. Specifically, each of the instantiations (individual type-1 sets) is derived by a perturbation of (making a small change to) a single underlying membership function. Secondly, a non-stationary fuzzy set does not have secondary membership functions or secondary membership grades. Hence, there is no 'direct' equivalent to the embedded sets of a type-2 fuzzy set. Lastly, the non-stationary inference process is quite different from type-2 inference, in that non-stationary inference is just repeated type-1 inference.
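The construction described above can be sketched directly: perturb one parameter of a single underlying membership function to obtain a collection of type-1 instantiations, then take their pointwise envelope as the footprint of variation. A minimal Python sketch with an assumed Gaussian membership function and centre perturbation:

```python
import math
import random

def gauss_mf(x, centre, sigma):
    """Gaussian type-1 membership function."""
    return math.exp(-0.5 * ((x - centre) / sigma) ** 2)

def perturbed_centres(centre, n, spread, seed=0):
    """n instantiations: small random perturbations of the centre parameter."""
    rng = random.Random(seed)
    return [centre + rng.gauss(0.0, spread) for _ in range(n)]

def footprint_of_variation(xs, centres, sigma):
    """Pointwise min/max envelope over all instantiations (the FOV)."""
    lo, hi = [], []
    for x in xs:
        vals = [gauss_mf(x, c, sigma) for c in centres]
        lo.append(min(vals))
        hi.append(max(vals))
    return lo, hi

xs = [i / 10.0 for i in range(101)]                # base variable over [0, 10]
centres = perturbed_centres(5.0, n=30, spread=0.2)  # 30 type-1 instantiations
lo, hi = footprint_of_variation(xs, centres, sigma=1.0)
```

Non-stationary inference would then run ordinary type-1 inference once per instantiation, producing a distribution of outputs rather than a single value.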
Several case studies have been carried out in this research. Experiments have been carried out to investigate the use of non-stationary fuzzy sets, and the relationship between non-stationary and interval type-2 fuzzy sets. The results from these experiments are compared with results produced by type-2 fuzzy systems. As an aside, experiments were carried out to investigate the effect of the number of tunable parameters on performance in type-1 and type-2 fuzzy systems. It was demonstrated that more tunable parameters can improve the performance of a non-singleton type-1 fuzzy system to be as good as or better than the equivalent type-2 fuzzy system.
Taken as a whole, the techniques presented in this thesis represent a valuable addition to the tools available to a model designer for constructing fuzzy models of human decision making
On weather and waves : applications to coastal engineering.
Ph.D. University of KwaZulu-Natal, Durban, 2015.
Shoreline erosion in response to extreme wave events can be severe. The reduction in
beach width leaves development within the hinterland exposed and vulnerable to future
wave attack. Wave climates are a fundamental driver of coastal erosion and changes to
wave height, direction and period can severely impact a coastline. These changes are
directly linked to changes within the principal drivers of wave climates, namely synoptic
scale atmospheric circulation. The links are complex and, if they can be clarified, they
can be used to provide insight into wave climates and improve the evaluation of future
climate scenarios. The coupling between atmospheric circulation and wave climates
provides a tool for risk assessment that is strongly based on fundamental physical
processes. This study is focused on exploring this relationship and its effect on coastal
vulnerability.
A statistical classification algorithm is utilized to explore the relationship between
synoptic scale circulation patterns and regional wave climates. The algorithm is fully
automated and discrete atmospheric patterns are derived through an optimization procedure.
It is driven to an optimal solution through statistical links between regional
wave climates and atmospheric circulation patterns (CPs). The classification is based
on the concept of fuzzy sets and differs from standard classification techniques. It employs
a "bottom-up" approach as the classes (or CPs) are derived through a procedure
that is guided by the wave climate. In contrast, existing classification techniques first
explore the atmospheric pressure space while links to the variable of interest are only
made post classification.
The east coast of South Africa was used as a case study. Wave data off the Durban
coastline were utilized to evaluate the drivers of the wave climate. A few dominant
patterns are shown to drive extreme wave events. Their persistence and strong high–low
coupling drive winds toward the coastline and result in extreme wave events.
The sensitivity of the algorithm to key input parameters such as the number of CP
classes and temporal resolution of the data was evaluated. The Shannon entropy is
introduced to measure the performance of the algorithm. This method benefits from incorporating the link between atmospheric CPs and the wave climate.
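As a rough illustration of the idea, Shannon entropy can score how well a set of CP classes explains the wave climate: the lower the entropy of wave conditions within each CP, the more informative the classification. A small sketch with hypothetical labels (the thesis's actual formulation is not reproduced here):

```python
import math
from collections import Counter

def shannon_entropy(labels):
    """Shannon entropy (in bits) of a sequence of class labels."""
    counts = Counter(labels)
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Hypothetical wave-state labels grouped by the CP active on each day;
# lower mean within-CP entropy means the CPs explain the wave climate better.
waves_by_cp = {
    "CP1": ["calm", "calm", "calm"],
    "CP2": ["storm", "storm"],
    "CP3": ["calm", "storm"],
}
h = sum(shannon_entropy(w) for w in waves_by_cp.values()) / len(waves_by_cp)
print(f"mean within-CP wave entropy: {h:.3f} bits")
```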
A new stochastic wave simulation technique was developed that is fundamentally
based on the CPs. This technique improves the realism of stochastic models while
retaining their simplicity and parsimony relative to process-based models. The simplicity
of the technique provides the framework to evaluate coastal vulnerability at
site specific locations. Furthermore the technique was extended to evaluate changes
in wave behaviour due to climate change effects.
Blind restoration of images with penalty-based decision making : a consensus approach
In this thesis we show a relationship between fuzzy decision making and image processing. Various applications for image noise reduction with consensus methodology are introduced.
A new approach is introduced to deal with non-stationary Gaussian noise and spatially non-stationary noise in MRI
Contributions to statistical machine learning algorithm
This thesis focuses on computational statistics, in particular the DEAR (differential equation associated regression) model; with that in mind, the journal papers are written as contributions to the statistical machine learning algorithm literature
A comparative study of time-series forecasting applied to stock market price
This thesis is a comparative study on forecasting New Zealand stock market daily closing prices by treating them as a time series. The methods used are the Box–Jenkins autoregressive integrated moving average (ARIMA) model, the Bayesian dynamic linear model and fuzzy neural networks. These methods are compared by using simple trading strategies, resulting in potentially profitable forecasting, especially through the fuzzy neural networks.
In addition, the final part of this thesis is a summary of, and comments on, the different methods that have been used by researchers to predict stock prices
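The walk-forward, one-step-ahead evaluation that underlies such a comparison can be sketched with two baseline forecasters (naive and moving-average; the ARIMA, Bayesian and fuzzy-neural models themselves are not reproduced here) on made-up closing prices:

```python
def naive_forecast(history):
    """Tomorrow's close equals today's close."""
    return history[-1]

def moving_average_forecast(history, window=5):
    """Tomorrow's close equals the mean of the last `window` closes."""
    tail = history[-window:]
    return sum(tail) / len(tail)

def walk_forward_rmse(prices, forecaster):
    """One-step-ahead walk-forward evaluation over the series."""
    errs = [(forecaster(prices[:t]) - prices[t]) ** 2
            for t in range(1, len(prices))]
    return (sum(errs) / len(errs)) ** 0.5

# made-up daily closing prices
prices = [100, 101, 103, 102, 105, 104, 106, 108, 107, 110]
for name, f in [("naive", naive_forecast),
                ("moving average", moving_average_forecast)]:
    print(f"{name}: RMSE = {walk_forward_rmse(prices, f):.3f}")
```

Any of the thesis's three models would slot in as another `forecaster`, and trading-strategy profit can replace RMSE as the comparison metric.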
Women in Artificial intelligence (AI)
This Special Issue, entitled "Women in Artificial Intelligence", includes 17 papers from leading women scientists. The papers cover a broad scope of research areas within Artificial Intelligence, including machine learning, perception, reasoning or planning, among others. The papers have applications to relevant fields, such as human health, finance, or education. It is worth noting that the Issue includes three papers that deal with different aspects of gender bias in Artificial Intelligence. All the papers have a woman as the first author. We can proudly say that these women are from countries worldwide, such as France, the Czech Republic, the United Kingdom, Australia, Bangladesh, Yemen, Romania, India, Cuba and Spain. In conclusion, apart from its intrinsic scientific value as a Special Issue combining interesting research works, this Special Issue intends to increase the visibility of women in AI, showing where they are, what they do, and how they contribute to developments in Artificial Intelligence from their different places, positions, research branches and application fields. We planned to issue this book on Ada Lovelace Day (11/10/2022), a date internationally dedicated to the first computer programmer, a woman who had to fight the gender difficulties of her times, in the nineteenth century. We also thank the publisher for making this possible, thus allowing this book to become a part of the international activities dedicated to celebrating the value of women in ICT all over the world. With this book, we want to pay homage to all the women who have contributed over the years to the field of AI
Fuzzy clustering with spatial correction and its application to geometallurgical domaining
Published online: 25 July 2018
This paper describes a proposed method for clustering attributes on the basis of their spatial variability and the uncertainty of cluster membership. The method is applied to geometallurgical domaining in mining applications. The main objective of geometallurgical clustering is to ensure consistent feed to a processing plant by minimising transitions between different types of feed coming from different domains (clusters). For this purpose, clusters should not only contain similar geometallurgical characteristics but also be located in as few contiguous and compact spatial locations as possible, so as to maximise the homogeneity of ore delivered to the plant. Most existing clustering methods applied to geometallurgy have two problems. Firstly, they are unable to differentiate subsets of attributes at the cluster level, and therefore cluster membership can only be assigned on the basis of exactly identical attributes, which may not be the case in practice. Secondly, as they do not take account of spatial relationships, they can produce clusters which may be spatially dispersed and/or overlapped. In the work described in this paper a new clustering method is introduced that integrates three distinct steps to ensure quality clustering. In the first step, fuzzy membership information is used to minimise compactness and maximise separation. In the second step, the best subsets of attributes are defined and applied for domaining purposes. These two steps are iterated to convergence. In the final step, a graph-based labelling method, which takes spatial constraints into account, is used to produce the final clusters. Three examples are presented to illustrate the application of the proposed method. These examples demonstrate that the proposed method can reveal useful relationships among geometallurgical attributes within a clear and compact spatial structure. The resulting clusters can be used directly in mine planning to optimise the ore feed to be delivered to the processing plant.
E. Sepúlveda, P. A. Dowd, C. X
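The first step above builds on fuzzy membership, as in classical fuzzy c-means. A minimal 1-D fuzzy c-means sketch on a hypothetical geometallurgical attribute (the paper's attribute-subset selection and graph-based spatial labelling steps are not reproduced here):

```python
def fuzzy_c_means(points, k, m=2.0, iters=100):
    """1-D fuzzy c-means: returns cluster centres and memberships u[i][j]."""
    pts_sorted = sorted(points)
    # spread the initial centres across the data range
    centres = [pts_sorted[i * (len(points) - 1) // (k - 1)] for i in range(k)]
    u = [[0.0] * k for _ in points]
    for _ in range(iters):
        # membership update: closer centres get higher (fuzzy) membership
        for i, p in enumerate(points):
            d = [abs(p - c) + 1e-12 for c in centres]
            for j in range(k):
                u[i][j] = 1.0 / sum((d[j] / dl) ** (2.0 / (m - 1.0)) for dl in d)
        # centre update: membership-weighted means
        for j in range(k):
            w = [u[i][j] ** m for i in range(len(points))]
            centres[j] = sum(wi * p for wi, p in zip(w, points)) / sum(w)
    return centres, u

# two well-separated groups of a single hypothetical attribute value
pts = [1.0, 1.2, 0.9, 1.1, 8.0, 8.3, 7.9, 8.1]
centres, u = fuzzy_c_means(pts, k=2)
```

The membership matrix `u` is exactly the "fuzzy membership information" that a spatially aware labelling step can then regularise into contiguous domains.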
- …