
    Dispersive flies optimisation and medical imaging

    One of the main sources of inspiration for techniques applicable to complex search-space and optimisation problems is nature. This paper introduces a new metaheuristic, Dispersive Flies Optimisation (DFO), inspired by the swarming behaviour of flies over food sources in nature. The simplicity of the algorithm facilitates the analysis of its behaviour. A series of experimental trials confirms the promising performance of the optimiser over a set of benchmarks, as well as its competitiveness against three other well-known population-based algorithms. The convergence-independent diversity of the DFO algorithm makes it a potentially suitable candidate for dynamically changing environments. In addition to diversity, the performance of the newly introduced algorithm is investigated using three performance measures, accuracy, efficiency and reliability, and the paper demonstrates that it outperforms the comparator algorithms. The proposed swarm intelligence algorithm is then used as a tool to identify microcalcifications in mammograms. The algorithm is adapted for this purpose and its performance is investigated by running its agents on sample mammograms whose status has been determined by experts. Two modes of the algorithm are introduced, each providing clinicians with a different set of outputs that highlight areas of interest deserving closer attention from those in charge of the patients' care.
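
    For orientation, the sketch below illustrates the kind of update rule DFO is built around: each fly moves relative to its best ring-topology neighbour and the swarm's best position, while a small disturbance probability restarts individual components to preserve diversity. The benchmark function, parameter values and function names are illustrative assumptions, not the paper's reference implementation.

```python
import numpy as np

def dfo_minimise(f, dim, bounds, n_flies=20, iterations=200, delta=0.001, seed=0):
    """Minimal Dispersive Flies Optimisation loop; parameter values are illustrative."""
    rng = np.random.default_rng(seed)
    low, high = bounds
    flies = rng.uniform(low, high, size=(n_flies, dim))

    for _ in range(iterations):
        fitness = np.array([f(x) for x in flies])
        swarm_best = flies[fitness.argmin()].copy()

        for i in range(n_flies):
            # The better of the two ring-topology neighbours guides the update.
            left, right = (i - 1) % n_flies, (i + 1) % n_flies
            neighbour = flies[left] if fitness[left] < fitness[right] else flies[right]

            for d in range(dim):
                if rng.random() < delta:
                    # Disturbance: restart this component to preserve swarm diversity.
                    flies[i, d] = rng.uniform(low, high)
                else:
                    flies[i, d] = neighbour[d] + rng.random() * (swarm_best[d] - flies[i, d])
            flies[i] = np.clip(flies[i], low, high)

    fitness = np.array([f(x) for x in flies])
    return flies[fitness.argmin()], fitness.min()

# Example: minimising the sphere benchmark in 10 dimensions.
best_x, best_f = dfo_minimise(lambda x: float(np.sum(x ** 2)), dim=10, bounds=(-5.0, 5.0))
print(best_f)
```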

    Analysis of physiological signals using machine learning methods

    Technological advances in data collection enable scientists to suggest novel approaches, such as Machine Learning algorithms, to process and make sense of this information. However, during collection, data loss and damage can occur for reasons such as faulty device sensors or miscommunication. In the context of time-series data such as multi-channel bio-signals, there is a possibility of losing a whole channel. In such cases, existing research suggests imputing the missing parts when the majority of the data is available. One way of understanding and classifying complex signals is by using deep neural networks. The hyper-parameters of such models have been optimised using the process of back propagation. Over time, improvements have been suggested to enhance this algorithm; however, an important drawback of back propagation is its sensitivity to noisy data. This thesis proposes two novel approaches to address the missing-data challenge and the drawbacks of back propagation. First, it suggests a gradient-free model to discover the optimal hyper-parameters of a deep neural network. The complexity of deep networks and their high-dimensional optimisation parameters make it challenging to find a suitable network structure and hyper-parameter configuration. This thesis proposes the use of a minimalist swarm optimiser, Dispersive Flies Optimisation (DFO), to enable the selected model to achieve better results than the traditional back propagation algorithm under certain conditions, such as a limited number of training samples. The DFO algorithm offers a robust search process for finding and determining hyper-parameter configurations. Second, it imputes whole missing bio-signals within a multi-channel sample. This approach comprises two experiments, namely the two-signal and five-signal imputation models. The first experiment implements and evaluates the performance of a model mapping bio-signals from A to B and vice versa; conceptually, this is an extension of transfer learning using Cycle Generative Adversarial Networks (CycleGANs). The second experiment suggests a mechanism for imputing missing signals in instances where multiple data channels are available for each sample. The capability to map to a target signal through multiple source domains achieves a more accurate estimate of the target domain. The results of the experiments indicate that in certain circumstances, such as having a limited number of samples, finding the optimal hyper-parameters of a neural network using gradient-free algorithms outperforms traditional gradient-based algorithms, leading to more accurate classification results. In addition, Generative Adversarial Networks can be used to impute the missing data channels in multi-channel bio-signals, and the generated data can be used for further analysis and classification tasks.
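
    As a rough illustration of the gradient-free hyper-parameter search described above, the sketch below encodes a candidate configuration as a vector in [0, 1]^3, decodes it into concrete hyper-parameters, and scores it by cross-validated accuracy. It assumes scikit-learn's MLPClassifier on synthetic data as a stand-in for the thesis's deep networks and bio-signals, and drives the search with a simple DFO-style population loop rather than the exact optimiser configuration used in the thesis.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

# Toy stand-in for the bio-signal classification data used in the thesis.
X, y = make_classification(n_samples=300, n_features=20, random_state=0)

def decode(v):
    """Map a continuous candidate in [0, 1]^3 to concrete hyper-parameters."""
    return {
        "hidden_layer_sizes": (int(8 + v[0] * 120),),   # 8..128 hidden units
        "alpha": 10 ** (-5 + v[1] * 4),                 # L2 penalty 1e-5..1e-1
        "learning_rate_init": 10 ** (-4 + v[2] * 3),    # 1e-4..1e-1
    }

def objective(v):
    """Negative cross-validated accuracy: lower is better."""
    model = MLPClassifier(max_iter=300, random_state=0, **decode(v))
    return -cross_val_score(model, X, y, cv=3).mean()

# Any gradient-free optimiser can now drive the search; here a tiny
# DFO-style loop over a small population of candidate configurations.
rng = np.random.default_rng(0)
pop = rng.uniform(0, 1, size=(8, 3))
for _ in range(5):
    scores = np.array([objective(v) for v in pop])
    best = pop[scores.argmin()].copy()
    for i in range(len(pop)):
        nb = pop[(i + 1) % len(pop)]
        pop[i] = np.clip(nb + rng.random(3) * (best - pop[i]), 0, 1)
        if rng.random() < 0.1:                          # disturbance step
            pop[i, rng.integers(3)] = rng.random()

print(decode(best), -scores.min())
```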

    Swarmic autopoiesis and computational creativity

    In this paper two swarm intelligence algorithms are used, the first leading the “attention” of the swarm and the second responsible for the tracing mechanism. The attention mechanism is coordinated by agents of Stochastic Diffusion Search, which selectively attend to areas of a digital canvas (with line drawings) that contain (sharper) corners. Once the swarm's attention is drawn to a line of interest with a sharp corner, the corresponding line segment is fed into the tracing algorithm, Dispersive Flies Optimisation, which “consumes” the input in order to generate a “swarmic sketch” of the input line. The sketching process is the result of the “flies” leaving traces of their movements on the digital canvas, which are then revisited repeatedly in an attempt to re-sketch the traces they left. This cyclic process is then introduced in the context of autopoiesis, where the philosophical aspects of the autopoietic artist are discussed. The autopoietic artist is described in two modalities: gluttonous and contented. In the Gluttonous Autopoietic Artist mode, iteratively focussing on areas of rich complexity as the decoding process of the input sketch unfolds leads to a progressively less complex structure, ultimately resulting in an empty canvas and thereby reifying the artwork's “death”. In the Contented Autopoietic Artist mode, by refocussing the autopoietic artist's reflections on “meaning” onto different constitutive elements, and modifying her reconstitution, different behaviours of autopoietic creativity can be induced; the autopoietic processes therefore become less likely to fade away and more open-ended in their creative endeavour.
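
    The attention stage rests on Stochastic Diffusion Search's test-and-diffuse cycle. The minimal sketch below runs that cycle on a synthetic “cornerness” map so that agents cluster over corner-rich regions; the map, agent count and scoring are illustrative assumptions, and the paper's coupling to the DFO tracing stage is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "cornerness" map standing in for a digital canvas of line drawings:
# a few bright spots mark sharp corners (illustrative data, not the paper's).
canvas = rng.random((100, 100)) * 0.2
corners = [(20, 70), (55, 15), (80, 85)]
for cy, cx in corners:
    canvas[cy - 2:cy + 3, cx - 2:cx + 3] = 1.0

n_agents, iterations = 200, 50
hyps = rng.integers(0, 100, size=(n_agents, 2))   # one (row, col) hypothesis per agent
active = np.zeros(n_agents, dtype=bool)

for _ in range(iterations):
    # Test phase: a partial, probabilistic evaluation of each hypothesis.
    active = rng.random(n_agents) < canvas[hyps[:, 0], hyps[:, 1]]
    # Diffusion phase: each inactive agent polls a random agent and either
    # copies an active agent's hypothesis or restarts at a random position.
    for i in np.where(~active)[0]:
        j = rng.integers(n_agents)
        hyps[i] = hyps[j] if active[j] else rng.integers(0, 100, size=2)

# Clusters of agents now indicate the corner-rich regions of the canvas.
for cy, cx in corners:
    near = np.sum(np.abs(hyps - [cy, cx]).max(axis=1) <= 5)
    print(f"agents near corner ({cy},{cx}): {near}")
```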

    Swarm-based identification of animation key points from 2D-medialness maps

    In this article we present the use of dispersive flies optimisation (DFO) for swarms of particles active on a medialness map, a 2D field representation of shape informed by perception studies. Optimising the swarm's activity permits efficient identification of shape-based keypoints to automatically annotate movement, and is capable of producing meaningful qualitative descriptions for animation applications. When taken together as a set, these keypoints represent the full body pose of a character in each processed frame. In addition, such keypoints can be used to embody the notion of the Line of Action (LoA), a well-known classic technique from the Disney studios used to capture the overall pose of a character before it is fleshed out. Keypoints along a medialness ridge are local peaks, which are efficiently localised using DFO-driven swarms. DFO is configured so that it does not need to scan every image pixel, and its agents tend to converge at these peaks. A series of experimental trials on different animation characters in movement sequences confirms the promising performance of the optimiser over a simpler, currently-in-use brute-force approach.
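
    The sketch below gives a flavour of how DFO-driven swarms can localise peaks of a 2D field without visiting every pixel: flies sample a synthetic stand-in for a medialness map at their current positions, and the number of map samples stays far below a full scan. The Gaussian-blob map and all parameter choices are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(2)
H, W = 200, 200

# Synthetic stand-in for a medialness map: a smooth field with a few peaks.
yy, xx = np.mgrid[0:H, 0:W]
medialness = sum(np.exp(-((yy - py) ** 2 + (xx - px) ** 2) / 250.0)
                 for py, px in [(40, 60), (120, 150), (170, 30)])

def sample(pos):
    """Read the map at a (possibly fractional) fly position."""
    r = int(np.clip(round(pos[0]), 0, H - 1))
    c = int(np.clip(round(pos[1]), 0, W - 1))
    return medialness[r, c]

n_flies, iters, delta = 30, 60, 0.01
flies = rng.uniform([0, 0], [H - 1, W - 1], size=(n_flies, 2))

for _ in range(iters):
    fit = np.array([sample(p) for p in flies])
    sbest = flies[fit.argmax()].copy()
    for i in range(n_flies):
        l, r = (i - 1) % n_flies, (i + 1) % n_flies
        nb = flies[l] if fit[l] > fit[r] else flies[r]
        for d in range(2):
            if rng.random() < delta:   # disturbance keeps diversity across peaks
                flies[i, d] = rng.uniform(0, [H - 1, W - 1][d])
            else:
                flies[i, d] = nb[d] + rng.random() * (sbest[d] - flies[i, d])
    flies = np.clip(flies, [0, 0], [H - 1, W - 1])

print(f"map samples: {n_flies * iters} vs full scan of {H * W} pixels")
print("strongest fly position:", np.round(sbest, 1))
```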

    Deep Neuroevolution: Training Deep Neural Networks for False Alarm Detection in Intensive Care Units

    We present a neuroevolution-based approach for training neural networks with genetic algorithms, applied to the problem of detecting false alarms in Intensive Care Units (ICUs) from physiological data. Typically, optimisation in neural networks is performed via backpropagation (BP) with stochastic gradient-based learning. Nevertheless, recent work has shown promising results using gradient-free, population-based genetic algorithms, suggesting that in certain cases gradient-based optimisation is not the best approach to follow. In this paper, we empirically show that utilising evolutionary and swarm intelligence algorithms can improve the performance of deep neural networks in problems such as the detection of false alarms in the ICU. More specifically, by deploying and adapting Dispersive Flies Optimisation (DFO), we present results that improve the state-of-the-art accuracy on the corresponding PhysioNet challenge while reducing the number of suppressed true alarms.
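
    As a toy illustration of neuroevolution, the sketch below encodes the weights of a small fixed-topology network as a flat genome and evolves a population of genomes by truncation selection, uniform crossover and Gaussian mutation, scoring each genome by classification accuracy on synthetic alarm-like data. The network size, genetic operators and data are assumptions and do not reflect the architecture or the PhysioNet data used in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic stand-in for ICU alarm features (true alarm vs false alarm labels).
X = rng.normal(size=(400, 10))
y = (X[:, 0] + 0.5 * X[:, 1] ** 2 > 0.5).astype(int)

def forward(w, X, hidden=8):
    """Tiny fixed-topology network; w is a flat genome of all weights."""
    W1 = w[:10 * hidden].reshape(10, hidden)
    b1 = w[10 * hidden:11 * hidden]
    W2 = w[11 * hidden:12 * hidden]
    b2 = w[-1]
    h = np.tanh(X @ W1 + b1)
    return (h @ W2 + b2) > 0

def fitness(w):
    return (forward(w, X) == y).mean()            # training classification accuracy

genome_len = 10 * 8 + 8 + 8 + 1
pop = rng.normal(size=(30, genome_len))

for _ in range(100):
    scores = np.array([fitness(w) for w in pop])
    parents = pop[np.argsort(scores)[-10:]]       # truncation selection
    children = []
    for _ in range(len(pop) - len(parents)):
        a, b = parents[rng.integers(10, size=2)]
        mask = rng.random(genome_len) < 0.5        # uniform crossover
        children.append(np.where(mask, a, b) + rng.normal(scale=0.1, size=genome_len))
    pop = np.vstack([parents, children])

print("best training accuracy:", max(fitness(w) for w in pop))
```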

    Improving forensic casework analysis and interpretation of gunshot residue (GSR) evidence

    There are two main challenges to gunshot residue (GSR) evidence. The first concerns analysis. The lack of screening techniques complicates the sampling and analysis of large areas or large numbers of exhibits. In addition, lead-free and heavy-metal-free ammunitions present limitations for the confirmatory detection technique, scanning electron microscopy/energy-dispersive X-ray analysis (SEM/EDX). A screening technique was developed to detect GSR components from all ammunition types: ion mobility spectrometry (IMS) was proven to allow sensitive and effective screening before proceeding to confirmatory analysis. Lead-free and heavy-metal-free ammunitions were examined and a technique was developed for detecting components in the organic portion of the residue. Liquid chromatography tandem mass spectrometry (LC-MS/MS) was extremely effective, detecting twenty-seven components; the technique is sensitive (to around 1 ppb), selective, rapid and cost-effective. The combination of IMS, SEM/EDX and LC-MS/MS, together with visual, physical and microscopic examination, is proposed as a complete protocol for GSR analysis from all ammunition types. The second challenge involves interpretation. Factors that lead to positive and negative findings must be considered and the weight of evidence assessed; both background data and the application of an interpretive framework have been inadequate. Background levels of GSR in the NSW general population and the NSW Police Force were studied, and the chances of random presence on a suspect and of contamination during the arrest and sampling process were determined. No GSR was detected on the hands of the NSW general population or of the sampled general-duties police officers. A moderate probability was demonstrated for low levels of GSR on the hands of crime scene investigators. GSR was detected on the hands of all forensic firearms examiners tested; however, their role limits their access to suspects and to items sampled for GSR, limiting the chance of contamination. Significantly, one high-risk area for contamination was identified: the tactical response officers. Background levels of GSR in the Australian Federal Police laboratories were compared before and after implementing contamination controls. The configuration of the original laboratory, along with the lack of controls, led to GSR being detected on almost every sample. The newer laboratory was extremely clean, with only one GSR particle detected, demonstrating the importance of effective contamination controls during sample collection and analysis. A statistical interpretive framework was then developed. The model utilises Bayesian networks to consider existing data relating to transfer and persistence, together with new data from this research, providing a more objective assessment and allowing broader application of the Bayesian framework.
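
    The interpretive framework mentioned above rests on Bayesian reasoning about the weight of GSR evidence. The fragment below shows only the basic likelihood-ratio update such a framework builds on; the probabilities are placeholder values, not figures from this research.

```python
# Minimal Bayesian update sketch for evidence interpretation; all probabilities
# below are illustrative placeholders, not values reported in the thesis.
p_gsr_given_fired = 0.80      # P(GSR on hands | suspect discharged a firearm)
p_gsr_given_not_fired = 0.01  # P(GSR on hands | no discharge), e.g. background/contamination

likelihood_ratio = p_gsr_given_fired / p_gsr_given_not_fired

prior_odds = 1.0              # neutral prior odds on the prosecution proposition
posterior_odds = prior_odds * likelihood_ratio
posterior_prob = posterior_odds / (1 + posterior_odds)

print(f"LR = {likelihood_ratio:.0f}, posterior probability = {posterior_prob:.3f}")
```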

    Sum of Three Cubes via Optimisation

    By first solving the equation x^3 + y^3 + z^3 = k with fixed k for z, and then considering the distance to the nearest integer of the result, we turn the sum of three cubes problem into an optimisation problem. We then apply three stochastic optimisation algorithms to this function in the case k = 2, where many solutions are known. The goal is to test the effectiveness of the method in searching for integer solutions. The algorithms are a modification of particle swarm optimisation and two implementations of simulated annealing. We compare their effectiveness as measured by the running times of the algorithms. To this end, we model the time data by assuming two underlying probability distributions, exponential and log-normal, and calculate some numerical characteristics for them. Finally, we evaluate the statistical distinguishability of our models with respect to the geodesic distance in the manifold with the corresponding Fisher information metric.
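
    The optimisation target described above can be written down directly: solve for z, then measure how far the result is from the nearest integer. The short sketch below is one way to express that objective; the stochastic optimisers compared in the paper are not reproduced here.

```python
import math

def objective(x, y, k=2):
    """Distance to the nearest integer of the real cube root z = (k - x^3 - y^3)^(1/3).

    A value of (numerically) zero means (x, y, round(z)) is a candidate integer
    solution of x^3 + y^3 + z^3 = k.
    """
    rhs = k - x ** 3 - y ** 3
    z = math.copysign(abs(rhs) ** (1.0 / 3.0), rhs)   # real cube root of either sign
    return abs(z - round(z)), round(z)

# Sanity check on a member of the known parametric family for k = 2:
# (1 + 6t^3)^3 + (1 - 6t^3)^3 + (-6t^2)^3 = 2, e.g. t = 1 gives (7, -5, -6).
score, z = objective(7, -5)
print(score, z)   # score is ~0 and z = -6, so (7, -5, -6) is recovered

# A stochastic optimiser (PSO, simulated annealing, ...) would then search
# integer (x, y) pairs for which this objective is (numerically) zero.
```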

    Evolutionary optimisation of beer organoleptic properties: a simulation framework

    Modern computational techniques offer new perspectives for the personalisation of food properties through the optimisation of their production process. This paper addresses the personalisation of beer properties, in the specific case of craft beers where the production process is more flexible. Furthermore, this work presents a solution-discovery method that could be suitable for more complex, industrial setups. An evolutionary computation technique was used to map brewers’ desired organoleptic properties to their constrained ingredients in order to design novel recipes tailored for specific brews. While several tools already exist that determine beer properties from predetermined quantities of ingredients, whether through the original mathematical and chemistry formulas or through machine learning models, this work investigates the inverse: an automated, quantitative ingredient-selection approach. The process, applied to this problem for the first time, was investigated in a number of simulations by “cloning” several commercial brands with diverse properties. Additional experiments demonstrated the system’s ability to deal with on-the-fly changes to users’ preferences during the optimisation process. The results of the experiments pave the way for the discovery of new recipes under varying preferences, thereby facilitating the personalisation and alternative high-fidelity reproduction of existing and new products.
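
    To make the ingredient-selection idea concrete, the sketch below scores a candidate recipe by predicting two properties (ABV and IBU) with crude textbook-style constants, measures the distance to a brewer's targets, and searches the ingredient space with a tiny evolutionary loop. The property model, constants, targets and bounds are illustrative assumptions, not the framework used in the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

VOLUME_L = 20.0                       # batch size (assumption)
TARGET = {"abv": 5.0, "ibu": 35.0}    # brewer's desired properties (illustrative)

def predict(malt_kg, hop_g):
    """Very rough property model with textbook-style constants; the paper's own
    property model (exact formulas or a learned one) would slot in here."""
    og = 1.0 + malt_kg * 230.0 / 1000.0 / VOLUME_L   # ~75% mash efficiency assumed
    fg = og - (og - 1.0) * 0.75                      # ~75% apparent attenuation assumed
    abv = (og - fg) * 131.25
    ibu = hop_g * 0.05 * 0.23 * 1000.0 / VOLUME_L    # 5% alpha acids, ~23% utilisation
    return abv, ibu

def fitness(recipe):
    abv, ibu = predict(*recipe)
    return abs(abv - TARGET["abv"]) + 0.1 * abs(ibu - TARGET["ibu"])

# Tiny evolutionary loop over (malt_kg, hop_g) within ingredient constraints.
low, high = np.array([2.0, 5.0]), np.array([8.0, 120.0])
pop = rng.uniform(low, high, size=(20, 2))
for _ in range(100):
    scores = np.array([fitness(r) for r in pop])
    parents = pop[np.argsort(scores)[:5]]
    pop = np.clip(parents[rng.integers(5, size=20)]
                  + rng.normal(scale=[0.2, 3.0], size=(20, 2)), low, high)

best = min(pop, key=fitness)
print("malt %.2f kg, hops %.0f g ->" % tuple(best), predict(*best))
```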