106 research outputs found

    "Going back to our roots": second generation biocomputing

    Researchers in the field of biocomputing have, for many years, successfully "harvested and exploited" the natural world for inspiration in developing systems that are robust, adaptable and capable of generating novel and even "creative" solutions to human-defined problems. However, in this position paper we argue that the time has now come for a reassessment of how we exploit biology to generate new computational systems. Previous solutions (the "first generation" of biocomputing techniques), whilst reasonably effective, are crude analogues of actual biological systems. We believe that a new, inherently inter-disciplinary approach is needed for the development of the emerging "second generation" of bio-inspired methods. This new modus operandi will require much closer interaction between the engineering and life sciences communities, as well as a bidirectional flow of concepts, applications and expertise. We support our argument by examining, in this new light, three existing areas of biocomputing (genetic programming, artificial immune systems and evolvable hardware), as well as an emerging area (natural genetic engineering) which may provide useful pointers as to the way forward. (Comment: Submitted to the International Journal of Unconventional Computing.)

    The application of evolutionary computation towards the characterization and classification of urothelium cell cultures

    This thesis presents a novel method for classifying and characterizing urothelial cell cultures. A cell-tracking system employing computer vision techniques was applied to day-long time-lapse videos of replicate normal human uroepithelial cell cultures exposed to different concentrations of adenosine triphosphate (ATP) and to a selective purinergic P2X antagonist (PPADS) as an inhibitor. Subsequent analysis, following feature extraction at both the culture and single-cell levels, demonstrated the ability of the approach to successfully classify the modulated classes of cells using evolutionary algorithms. Specifically, a Cartesian Genetic Program (CGP) network was evolved that identified average migration speed, in-contact angular velocity, cohesivity and average cell clump size as the principal features contributing to the separation of the cell classes. This approach provides an unbiased insight into modulated cell-class behaviours.
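
    As a rough illustration of the kind of features named above, the sketch below computes an average migration speed and an average cell-clump size from tracked centroids. The track format, contact radius, and exact feature definitions are assumptions made for this sketch and need not match the thesis's definitions.

```python
# Minimal sketch of per-culture feature extraction from tracked cell centroids.
# Assumed input format: {cell_id: [(x, y) per frame]} for tracks, and a list of
# (x, y) positions for a single frame. Definitions are illustrative only.
import math

def average_migration_speed(tracks, dt=1.0):
    """Mean displacement per frame across all cells (pixels / frame)."""
    speeds = []
    for positions in tracks.values():
        for (x0, y0), (x1, y1) in zip(positions, positions[1:]):
            speeds.append(math.hypot(x1 - x0, y1 - y0) / dt)
    return sum(speeds) / len(speeds) if speeds else 0.0

def average_clump_size(frame_positions, contact_radius=30.0):
    """Mean size of connected components of cells within contact_radius."""
    n = len(frame_positions)
    parent = list(range(n))            # union-find over in-contact cells
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    for i in range(n):
        for j in range(i + 1, n):
            (xi, yi), (xj, yj) = frame_positions[i], frame_positions[j]
            if math.hypot(xi - xj, yi - yj) <= contact_radius:
                parent[find(i)] = find(j)
    sizes = {}
    for i in range(n):
        r = find(i)
        sizes[r] = sizes.get(r, 0) + 1
    return sum(sizes.values()) / len(sizes) if sizes else 0.0

tracks = {0: [(0, 0), (3, 4), (6, 8)], 1: [(50, 50), (50, 53)]}
print(average_migration_speed(tracks))          # 4.0 px/frame
print(average_clump_size([(0, 0), (10, 0), (100, 100)]))  # one pair, one singleton
```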

    Reconsideration and extension of Cartesian genetic programming

    This dissertation analyzes fundamental concepts and dogmas of a graph-based genetic programming approach called Cartesian Genetic Programming (CGP) and introduces advanced genetic operators for CGP. The results of the experiments presented in this thesis lead to more knowledge about the algorithmic use of CGP and its underlying working mechanisms. CGP has mostly been used with one parametrization pattern, which has been prematurely generalized as the most efficient pattern for standard CGP and its variants. Several parametrization patterns are evaluated with more detailed and comprehensive experiments using meta-optimization. This thesis also presents a first runtime analysis of CGP: the time complexity of a simple (1+1)-CGP algorithm is analyzed on a simple mathematical problem and a simple Boolean function problem. In the subfield of genetic operators for CGP, new recombination and mutation techniques that work on the phenotypic level are presented, and their effectiveness is demonstrated on a broad set of popular benchmark problems. The role of recombination in particular is a big open question in the field of CGP, since the lack of an effective recombination operator has largely limited CGP to mutation-only use. Phenotypic exploration analysis is used to analyze the effects caused by the presented operators; this type of analysis also leads to new insights into the search behavior of CGP in continuous and discrete fitness spaces. Overall, the outcome of this thesis leads to a reconsideration of how CGP is used effectively and extends its adaptation of Darwin's and Lamarck's theories of biological evolution.
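
    To make the algorithm under analysis concrete, here is a minimal sketch of a (1+1)-CGP run on a small Boolean problem (2-input XOR built from AND/OR/NAND/NOR primitives). The genotype encoding, mutation rate and acceptance of neutral offspring follow common CGP practice; the exact parametrization studied in the dissertation may differ.

```python
import random

# AND, OR, NAND, NOR -- XOR is deliberately excluded so it must be evolved
FUNCS = [lambda a, b: a & b,
         lambda a, b: a | b,
         lambda a, b: 1 - (a & b),
         lambda a, b: 1 - (a | b)]
N_IN, N_NODES = 2, 10
TRUTH = [((a, b), a ^ b) for a in (0, 1) for b in (0, 1)]  # target: XOR

def random_gene(pos):
    # a node may read the primary inputs or the output of any earlier node
    return [random.randrange(len(FUNCS)),
            random.randrange(N_IN + pos),
            random.randrange(N_IN + pos)]

def evaluate(genome, out_idx, inputs):
    values = list(inputs)                      # indices 0..N_IN-1: inputs
    for f, i, j in genome:                     # node values appended in order
        values.append(FUNCS[f](values[i], values[j]))
    return values[out_idx]

def fitness(genome, out_idx):
    return sum(evaluate(genome, out_idx, x) == y for x, y in TRUTH)

genome = [random_gene(p) for p in range(N_NODES)]
out_idx = random.randrange(N_IN + N_NODES)
best = fitness(genome, out_idx)
for _ in range(20000):                         # (1+1): one parent, one child
    child = [g[:] for g in genome]
    c_out = out_idx
    for p in range(N_NODES):                   # resample each node w.p. 0.1
        if random.random() < 0.1:
            child[p] = random_gene(p)
    if random.random() < 0.1:                  # occasionally move the output
        c_out = random.randrange(N_IN + N_NODES)
    f = fitness(child, c_out)
    if f >= best:                              # accept ties: neutral drift
        genome, out_idx, best = child, c_out, f
print("best fitness:", best, "of", len(TRUTH))
```

    Accepting offspring of equal fitness (neutral drift) is the detail that makes mutation-only CGP effective in practice, since large parts of the genotype are inactive and can change silently.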

    UKACM Proceedings 2024


    Isogeometric Analysis for BIM-Based Design and Simulation of Sub-Rectangular Tunnel

    The design and analysis of segmental tunnel linings is today often based on empirical solutions with simplified assumptions. This work showcases the application of Isogeometric Analysis (IGA) for computationally efficient simulations of tunnel linings [1, 2]. In our past research, we developed a design-through-analysis procedure that consists of i) parametric modeling of the segmented tunnel lining; ii) development of an IGA computational framework; iii) reconstruction of the BIM lining model for IGA analysis; and iv) a simulation model for the lining comprising the reconstructed IGA model, contact interfaces between the joints, and a non-linear soil-structure interaction model based on the Variational Hyperstatic Reaction Method (VHRM) [3]. In this paper, we extend our method to the analysis of sub-rectangular tunnel linings and demonstrate its efficiency using the example of the Shanghai express tunnel. The advantage of our novel method is the flexibility in adapting the tunnel alignment with the help of NURBS/CAD technology. Owing to the high-order geometry definition, the mesh refinement procedure converges at a much faster rate, so the modelling effort and computational time are reduced significantly. Moreover, this approach captures the bending moment with better regularity. The combination with an existing BIM modelling approach via geometry reconstruction leads to a very efficient framework for tunnel lining analysis and design.
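
    The high-order geometry definition at the heart of IGA rests on NURBS basis functions. The sketch below evaluates B-spline basis functions via the Cox-de Boor recursion and a rational (NURBS) curve point, using the classic quarter-circle example; it illustrates the geometric machinery only, not the authors' analysis framework.

```python
# Cox-de Boor recursion for B-spline basis functions, plus a NURBS curve
# evaluation. Degree, knot vector and control net below are illustrative.
def bspline_basis(i, p, u, knots):
    """Value of the i-th B-spline basis function of degree p at parameter u."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = right = 0.0
    if knots[i + p] != knots[i]:
        left = ((u - knots[i]) / (knots[i + p] - knots[i])
                * bspline_basis(i, p - 1, u, knots))
    if knots[i + p + 1] != knots[i + 1]:
        right = ((knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1])
                 * bspline_basis(i + 1, p - 1, u, knots))
    return left + right

def nurbs_point(ctrl_pts, weights, p, knots, u):
    """Evaluate a NURBS curve point as a weighted rational combination."""
    num_x = num_y = den = 0.0
    for i, ((x, y), w) in enumerate(zip(ctrl_pts, weights)):
        b = bspline_basis(i, p, u, knots) * w
        num_x, num_y, den = num_x + b * x, num_y + b * y, den + b
    return num_x / den, num_y / den

# Quadratic NURBS quarter-circle: the standard example of a conic that
# polynomial finite elements can only approximate but NURBS represent exactly.
ctrl = [(1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
w = [1.0, 2 ** -0.5, 1.0]
knots = [0, 0, 0, 1, 1, 1]
print(nurbs_point(ctrl, w, 2, knots, 0.5))  # ~ (0.7071, 0.7071)
```

    This exact representation of conics is what lets IGA meshes refine without geometry error, which is one reason the refinement procedure converges faster than with facetted low-order meshes.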

    A Practical Investigation into Achieving Bio-Plausibility in Evo-Devo Neural Microcircuits Feasible in an FPGA

    Many researchers have conjectured, argued, or in some cases demonstrated, that bio-plausibility can bring about emergent properties such as adaptability, scalability, fault-tolerance, self-repair, reliability, and autonomy in bio-inspired intelligent systems. Evolutionary-developmental (evo-devo) spiking neural networks are a very bio-plausible mixture of such bio-inspired intelligent systems that has been proposed and studied by a few researchers. However, the general trend is that complexity, and thus computational cost, grows with the bio-plausibility of the system. FPGAs (Field-Programmable Gate Arrays) have proved to be flexible and cost-efficient hardware platforms for the research and development of such evo-devo systems. However, mapping a bio-plausible evo-devo spiking neural network to an FPGA is a daunting task full of constraints and trade-offs that make it, if not infeasible, very challenging. This thesis explores the challenges, trade-offs, constraints, practical issues, and possible approaches to achieving bio-plausibility in evolutionary developmental spiking neural microcircuits in an FPGA, through a practical investigation along with a series of case studies. In this study, system performance, cost, reliability, scalability, availability, and design and testing time and complexity are defined as measures of feasibility, while structural accuracy and consistency with current knowledge in biology serve as measures of bio-plausibility. The investigation of the challenges starts with hardware platform selection; the neuron, cortex, and evo-devo models, and the integration of these models into a whole bio-inspired intelligent system, are then examined one by one. For further practical investigation, a new PLAQIF Digital Neuron model, a novel Cortex model, and a new multicellular LGRN evo-devo model are designed, implemented and tested as case studies. The results and their implications for researchers, designers of such systems, and FPGA manufacturers are discussed and summarized in the form of general trends, trade-offs, suggestions, and recommendations.
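
    As an illustration of the general style of digital neuron model involved, the sketch below implements a fixed-point leaky integrate-and-fire update in which shifts replace multipliers, as is common in FPGA implementations. It is emphatically not the thesis's PLAQIF neuron model; the dynamics and parameters here are assumptions for illustration only.

```python
# Fixed-point leaky integrate-and-fire step of the kind often mapped to FPGA
# fabric: the leak uses an arithmetic shift so no hardware multiplier is
# needed. NOT the thesis's PLAQIF model; all constants are illustrative.
def lif_step(v, spike_in, weight, leak_shift=4, threshold=1 << 12, v_reset=0):
    """One clock tick: leak via shift, integrate weighted input, threshold."""
    v -= v >> leak_shift           # leak: v *= (1 - 2**-leak_shift)
    if spike_in:
        v += weight                # synaptic integration
    if v >= threshold:
        return v_reset, 1          # emit a spike and reset the membrane
    return v, 0

v, spikes = 0, []
for t in range(50):
    v, s = lif_step(v, spike_in=(t % 2 == 0), weight=700)
    spikes.append(s)
print("spike train:", "".join(map(str, spikes)))
```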

    Terrain Representation And Reasoning In Computer Generated Forces: A Survey Of Computer Generated Forces Systems And How They Represent And Reason About Terrain

    This report surveys computer systems used to produce realistic or intelligent behavior in autonomous entities within simulation systems. In particular, it is concerned with the data structures that computer generated forces systems use to represent terrain and with the algorithmic approaches those systems use to reason about terrain.
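
    As a sketch of one common terrain representation and reasoning pair in such systems, the code below stores terrain as a regular-grid heightmap and answers a line-of-sight (intervisibility) query by sampling along the ray. The grid, sampling step and eye height are illustrative assumptions, not taken from any of the surveyed systems.

```python
# Regular-grid heightmap terrain with a simple line-of-sight query, the kind
# of intervisibility reasoning CGF systems perform. Parameters are illustrative.
def line_of_sight(heightmap, a, b, eye=1.8, steps=50):
    """True if a straight ray from cell a to cell b clears the terrain.
    a, b are (row, col) grid cells; eye is observer/target height in metres."""
    (r0, c0), (r1, c1) = a, b
    z0 = heightmap[r0][c0] + eye
    z1 = heightmap[r1][c1] + eye
    for k in range(1, steps):
        t = k / steps
        r, c = r0 + t * (r1 - r0), c0 + t * (c1 - c0)
        ground = heightmap[int(round(r))][int(round(c))]  # nearest-cell sample
        ray_z = z0 + t * (z1 - z0)
        if ground > ray_z:
            return False              # terrain occludes the ray
    return True

# 5x5 heightmap with a ridge across the middle:
hm = [[0, 0, 0, 0, 0],
      [0, 0, 0, 0, 0],
      [0, 0, 9, 9, 9],
      [0, 0, 0, 0, 0],
      [0, 0, 0, 0, 0]]
print(line_of_sight(hm, (0, 2), (4, 2)))  # False: the ridge blocks the view
print(line_of_sight(hm, (0, 0), (4, 0)))  # True: flat ground
```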

    Classification of Resting-State fMRI using Evolutionary Algorithms: Towards a Brain Imaging Biomarker for Parkinson’s Disease

    It is commonly accepted that accurate early diagnosis and monitoring of neurodegenerative conditions is essential for effective disease management and the delivery of medication and treatment. This research develops automatic methods for detecting brain-imaging preclinical biomarkers for Parkinson’s disease (PD) through the novel application of evolutionary algorithms. A further novel element of this work is the use of evolutionary algorithms to both map and predict functional connectivity in patients using rs-fMRI data. Specifically, Cartesian Genetic Programming was used to classify dynamic causal modelling data as well as time-series data. The findings were validated using two other commonly used classification methods (Artificial Neural Networks and Support Vector Machines) and by employing k-fold cross-validation. Across the dynamic causal modelling and time-series analyses, findings revealed maximum accuracies of 75.21% for early-stage (prodromal) PD patients, who show no motor symptoms, versus healthy controls; 85.87% for PD patients versus prodromal PD patients; and 92.09% for PD patients versus healthy controls. Prodromal PD patients were classified against healthy controls with high accuracy; this is the key finding, since current methods of diagnosing prodromal PD have low reliability and low accuracy. Furthermore, Cartesian Genetic Programming provided classification accuracy comparable to Artificial Neural Networks and Support Vector Machines, while allowing the classifier to be decoded, in terms of which data inputs it actually uses, more easily than either. Hence, these findings underscore the relevance both of dynamic causal modelling analyses for classification and of Cartesian Genetic Programming as a novel classification tool for brain-imaging data, with medical implications for disease diagnosis, particularly in early stages 5-20 years prior to motor symptoms.
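
    The validation protocol described above might be sketched as follows, using scikit-learn's SVM and MLP as the two comparison classifiers; the evolved CGP classifier itself is not a standard library component, and synthetic data stands in here for the rs-fMRI connectivity features, so the printed accuracies are purely illustrative.

```python
# k-fold cross-validated comparison of two baseline classifiers.
# make_classification stands in for rs-fMRI connectivity features.
from sklearn.datasets import make_classification
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_classification(n_samples=120, n_features=30, random_state=0)
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
for name, clf in [("SVM", SVC(kernel="rbf")),
                  ("ANN", MLPClassifier(hidden_layer_sizes=(20,),
                                        max_iter=2000, random_state=0))]:
    # scale inside the pipeline so each fold is standardized independently
    scores = cross_val_score(make_pipeline(StandardScaler(), clf), X, y, cv=cv)
    print(f"{name}: mean accuracy {scores.mean():.3f} +/- {scores.std():.3f}")
```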

    Acta Cybernetica : Volume 25. Number 2.


    Intelligent data mining using artificial neural networks and genetic algorithms : techniques and applications

    Data Mining (DM) refers to the analysis of observational datasets to find relationships and to summarize the data in ways that are both understandable and useful. Many DM techniques exist. Compared with other DM techniques, Intelligent Systems (ISs) based approaches, which include Artificial Neural Networks (ANNs), fuzzy set theory, approximate reasoning, and derivative-free optimization methods such as Genetic Algorithms (GAs), are tolerant of imprecision, uncertainty, partial truth, and approximation, and provide flexible information-processing capability for handling real-life situations. This thesis is concerned with the design, implementation, testing and application of a novel IS-based DM technique. Its unique contribution lies in the implementation of a hybrid IS DM technique, the Genetic Neural Mathematical Method (GNMM), for solving novel practical problems, the detailed description of this technique, and the illustration of several applications solved by it. GNMM consists of three steps: (1) GA-based input variable selection, (2) Multi-Layer Perceptron (MLP) modelling, and (3) mathematical-programming-based rule extraction. In the first step, GAs are used to evolve an optimal set of MLP inputs. An adaptive method based on the average fitness of successive generations is used to adjust the mutation rate, and hence the exploration/exploitation balance; in addition, GNMM uses an elite group and appearance percentages to minimize the randomness associated with GAs. In the second step, MLP modelling serves as the core DM engine for classification/prediction tasks. An Independent Component Analysis (ICA) based weight-initialization algorithm determines optimal weights before training begins, and the Levenberg-Marquardt (LM) algorithm achieves a second-order speedup over conventional Back-Propagation (BP) training. In the third step, mathematical-programming-based rule extraction is used not only to identify the premises of multivariate polynomial rules, but also to explore features from the extracted rules based on the data samples associated with each rule. The methodology can therefore provide regression rules and features both in polyhedrons with data instances and in polyhedrons without them. A total of six datasets from the environmental and medical disciplines were used as case-study applications, involving prediction of the longitudinal dispersion coefficient, classification of electrocorticography (ECoG)/electroencephalogram (EEG) data, eye-bacteria Multisensor Data Fusion (MDF), and diabetes classification (denoted Data I through Data VI). GNMM was applied to all six datasets to explore its effectiveness, with a different emphasis for each: Data I and II give a detailed illustration of how GNMM works; Data III and IV show how to deal with difficult classification problems; Data V illustrates the averaging effect of GNMM; and Data VI concerns GA parameter selection and benchmarks GNMM against other IS DM techniques such as the Adaptive Neuro-Fuzzy Inference System (ANFIS), Evolving Fuzzy Neural Network (EFuNN), Fuzzy ARTMAP, and Cartesian Genetic Programming (CGP). In addition, datasets obtained from published works (i.e. Data II & III) or public domains (i.e. Data VI), for which previous results exist in the literature, were used to benchmark GNMM’s effectiveness. As a closely integrated system, GNMM has the merit of needing little human interaction: given some predefined parameters, such as the GA’s crossover probability and the shape of the ANNs’ activation functions, GNMM processes raw data until human-interpretable rules are extracted. This is an important practical feature, since users of a DM system often have little need to fully understand its internal components. Through the case-study applications, it has been shown that the GA-based variable-selection stage is capable of filtering out irrelevant and noisy variables, improving model accuracy, making the ANN structure less complex and easier to understand, and reducing computational complexity and memory requirements. Furthermore, rule extraction ensures that the MLP training results are easily understandable and transferable.
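
    GNMM's first step, GA-based input-variable selection with an adaptive mutation rate, elitism and crossover, might be sketched as below. The bit-string encoding, the specific adaptation rule (raise the mutation rate when average fitness stalls) and the toy fitness are assumptions paraphrasing the abstract, not the thesis's exact schedule.

```python
# GA-based input-variable selection: one bit per candidate MLP input.
# The fitness callable would in practice be cross-validated MLP accuracy;
# here a toy fitness keeps the sketch self-contained.
import random

def evolve_inputs(fitness, n_vars, pop_size=30, gens=40, p_mut=0.02, elite=2):
    pop = [[random.randint(0, 1) for _ in range(n_vars)]
           for _ in range(pop_size)]
    prev_avg = None
    for _ in range(gens):
        scored = sorted(pop, key=fitness, reverse=True)
        avg = sum(fitness(ind) for ind in pop) / pop_size
        # adaptive mutation: explore more when average fitness stops improving
        if prev_avg is not None:
            p_mut = (min(0.2, p_mut * 1.5) if avg <= prev_avg
                     else max(0.01, p_mut / 1.5))
        prev_avg = avg
        next_pop = [ind[:] for ind in scored[:elite]]        # elite group
        while len(next_pop) < pop_size:
            a, b = random.sample(scored[:pop_size // 2], 2)  # truncation select
            cut = random.randrange(1, n_vars)
            child = a[:cut] + b[cut:]                        # one-point crossover
            child = [1 - g if random.random() < p_mut else g for g in child]
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)

# Toy fitness: reward selecting the first 3 variables, penalise mask size.
best = evolve_inputs(lambda m: sum(m[:3]) - 0.1 * sum(m), n_vars=12)
print("selected inputs:", [i for i, g in enumerate(best) if g])
```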