298 research outputs found

    Evolving Artificial Neural Networks using Cartesian Genetic Programming

    NeuroEvolution is the application of Evolutionary Algorithms to the training of Artificial Neural Networks. NeuroEvolution is thought to possess many benefits over traditional training methods, including the ability to train recurrent network structures, the capability to adapt network topology, the ability to create heterogeneous networks of arbitrary transfer functions, and applicability to reinforcement as well as supervised learning tasks. This thesis presents a series of rigorous empirical investigations into many of these perceived advantages of NeuroEvolution. It is demonstrated that the ability to adapt network topology simultaneously with connection weights represents a significant advantage of many NeuroEvolutionary methods, as does the ability to create heterogeneous networks comprising a range of transfer functions. The thesis also investigates potential benefits and drawbacks of NeuroEvolution which have been largely overlooked in the literature, including the presence and role of genetic redundancy in NeuroEvolution's search and whether program bloat is a limitation. The investigations presented focus on a recently developed NeuroEvolution method based on Cartesian Genetic Programming. The thesis extends Cartesian Genetic Programming so that it can represent recurrent program structures, allowing the creation of recurrent Artificial Neural Networks. This newly developed extension, Recurrent Cartesian Genetic Programming, and its application to Artificial Neural Networks are demonstrated to be extremely competitive in the domain of series forecasting.
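
    Although the abstract gives no implementation detail, the core idea of encoding a neural network as a Cartesian Genetic Programming genotype can be illustrated with a short sketch: nodes in a feed-forward grid are decoded in index order, each computing a weighted sum of earlier values and applying a transfer function. The genotype layout, weight placement, and function set below are illustrative assumptions, not the thesis's exact encoding.

        import math

        # Transfer-function set (assumed; CGP-ANNs typically mix several).
        TRANSFER = {0: math.tanh, 1: lambda x: 1.0 / (1.0 + math.exp(-x))}

        def evaluate(genotype, outputs, inputs):
            """genotype: list of nodes, each (input_indices, weights, func_id).
            A node may only read program inputs or earlier nodes, which keeps
            the decoded graph acyclic (the feed-forward CGP case)."""
            values = list(inputs)                    # indices 0..n_inputs-1
            for in_idx, weights, func_id in genotype:
                s = sum(w * values[j] for j, w in zip(in_idx, weights))
                values.append(TRANSFER[func_id](s))
            return [values[o] for o in outputs]

        # Example: 2 inputs, 2 computational nodes, 1 output read from node 3.
        geno = [([0, 1], [0.5, -1.2], 0),   # node 2 = tanh(0.5*x0 - 1.2*x1)
                ([2, 0], [1.0, 0.3], 1)]    # node 3 = sigmoid(node2 + 0.3*x0)
        print(evaluate(geno, outputs=[3], inputs=[0.8, -0.4]))

    The recurrent extension described in the thesis relaxes the acyclicity constraint above, allowing connection genes to reference later nodes so the network can carry state between evaluations.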

    Analysis and modeling a distributed co-operative multi agent system for scaling-up business intelligence

    Modeling a distributed co-operative multi-agent system in the area of Business Intelligence is a relatively new topic. During the work carried out, a software system, the Integrated Intelligent Advisory Model (IIAM), has been developed, which is a personal finance portfolio ma…

    Evolving Fault Tolerant Robotic Controllers

    Fault tolerant control and evolutionary algorithms are two distinct research areas. With the development of artificial intelligence, however, evolutionary algorithms have demonstrated performance competitive with traditional approaches to optimisation tasks. The combination of fault tolerant control and evolutionary algorithms has therefore become a new research topic, in which controllers are evolved to achieve different fault tolerant control schemes. Most controller evolution work, however, optimises only controller parameters; structure optimisation based evolutionary algorithm approaches have not been investigated to the same extent. This thesis therefore investigates whether structure optimisation based evolutionary algorithm approaches can be applied to a robot sensor fault tolerant control scheme based on a phototaxis task, in addition to parameter optimisation alone, and explores whether optimising controller structure offers greater benefit than optimising controller parameters alone. The thesis presents a new multi-objective structure optimisation algorithm, Multi-objective Cartesian Genetic Programming, built from Cartesian Genetic Programming and the Non-dominated Sorting Genetic Algorithm II (NSGA-II), for NeuroEvolution based robotic controller optimisation. To address two problems that arose during the algorithm's development, the thesis investigates the benefit of genetic redundancy and of preserving neutral genetic drift, in order to resolve the random neighbour selection problem during the crowding-fill stage of survival selection, and examines how the hypervolume indicator can be employed to measure multi-objective optimisation performance and assess the convergence of Multi-objective Cartesian Genetic Programming. Furthermore, the thesis compares Multi-objective Cartesian Genetic Programming with NSGA-II in terms of evolutionary performance, and investigates how Multi-objective Cartesian Genetic Programming performs on a more difficult fault tolerant control scenario beyond the basic one, further demonstrating the benefit of a structure optimisation based evolutionary algorithm approach to robotic fault tolerant control.
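
    Since the abstract highlights the hypervolume indicator as the convergence measure, a minimal two-objective sketch may help; the reference point and the minimisation setting are assumptions, as the abstract does not specify the objectives.

        # Hypervolume of a 2-objective minimisation front: the area dominated
        # by the front and bounded by the reference point, computed with a
        # left-to-right sweep over the sorted points.
        def hypervolume_2d(front, ref):
            pts = sorted(p for p in front if p[0] < ref[0] and p[1] < ref[1])
            hv, prev_f2 = 0.0, ref[1]
            for f1, f2 in pts:
                if f2 < prev_f2:            # skip dominated points
                    hv += (ref[0] - f1) * (prev_f2 - f2)
                    prev_f2 = f2
            return hv

        front = [(0.2, 0.9), (0.5, 0.5), (0.8, 0.1)]
        print(hypervolume_2d(front, ref=(1.0, 1.0)))   # 0.36

    A larger hypervolume means the front both converges toward and spreads along the true Pareto front, which is why it can rank whole populations rather than single solutions.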

    Hybrid optimisation and formation of index tracking portfolio in TSE

    Asset allocation and portfolio optimisation are among the most important steps in an investor's decision-making process. In order to manage uncertainty and maximise returns, it is assumed that active investment is a zero-sum game. It is possible, however, that market inefficiencies could provide the necessary opportunities for investors to beat the market. In this study we examine a core-satellite approach to gaining higher returns than the market. The core component of the portfolio consists of an index-tracking portfolio formulated using a meta-heuristic genetic algorithm, allowing efficient search of the solution space for an optimal (or near-optimal) solution. The satellite component is made up of publicly traded actively managed funds, and the weights of each component are optimised using quadratic optimisation to maximise the returns of the resultant portfolio. In order to address uncertainty within the model variables, robustness is introduced into the objective function in the form of a risk-tolerance (degree of uncertainty) term. Treating robustness as a variable allows the resultant model to be assessed under worst-case circumstances and suitable levels of risk tolerance to be determined. Further attempts at adding robustness using an artificial neural network in an LSTM configuration were inconclusive, suggesting that LSTM networks were unable to make informative predictions of the future returns of the index because market efficiency renders historical data irrelevant and market movement is akin to a random walk. A framework is offered for the formation and optimisation of a hybrid multi-stage core-satellite portfolio which manages risk through robustness and passive investment while attempting to beat the market in terms of returns. Using daily returns data from the Tehran Stock Exchange over a four-year period, it is shown that the resultant core-satellite portfolio is able to beat the market considerably after training. Results indicate that the tracking ability of the portfolio is affected by the number of its constituents, that there is a specific time frame of 70 days after which the resultant portfolio needs to be reassessed and readjusted, and that implementing robustness as a degree-of-uncertainty variable within the objective function increases the correlation coefficient and reduces tracking error.
    Keywords: Index Funds, Index Tracking, Passive Portfolio Management, Robust Optimisation, Core Satellite Investment, Quadratic Optimisation, Genetic Algorithms, LSTM, Heuristic Neural Networks, Efficient Market Hypothesis, Modern Portfolio Theory, Portfolio Optimisation
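
    As an illustration of how the core component could be constructed, the sketch below evolves a small tracking subset with a genetic algorithm and scores it by tracking error against the index. The equal weighting, population size, mutation scheme, and synthetic returns are illustrative assumptions; the study's actual encoding and TSE data are not reproduced here.

        import numpy as np

        rng = np.random.default_rng(0)
        T, N, K = 250, 50, 10                      # days, universe, subset size
        returns = rng.normal(0.0, 0.01, (T, N))    # placeholder daily returns
        index = returns.mean(axis=1)               # placeholder index returns

        def tracking_error(subset):
            port = returns[:, subset].mean(axis=1)   # equal-weighted tracker
            return np.std(port - index)

        def mutate(subset):
            child = subset.copy()                    # swap one constituent
            child[rng.integers(K)] = rng.choice(np.setdiff1d(np.arange(N), child))
            return child

        pop = [rng.choice(N, K, replace=False) for _ in range(30)]
        for _ in range(100):                         # elitist GA loop
            pop.sort(key=tracking_error)
            pop = pop[:15] + [mutate(p) for p in pop[:15]]
        print("best tracking error:", tracking_error(min(pop, key=tracking_error)))

    In the study's second stage, the constituent weights and the core/satellite split would then be refined by quadratic optimisation rather than the equal weighting used above.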

    The Value of Technics: An Ontogenetic Approach to Money, Markets, and Networks

    This thesis investigates the impact of the digitalization of monetary and financial flows on the political-economic sphere in order to provide a novel perspective on the relations between economic and technological forces at the present global juncture. In the aftermath of the Global Financial Crisis and with the rise of the cryptoeconomy, an increasing number of scholars have highlighted the immanence of market logic to cultural and social life. At the same time, speculative practices have emerged that attempt to challenge the political economy through financial experiments. This dissertation complements these approaches by stressing the need to pair the critical study of finance with scholarship in the philosophy of technology that emphasizes the value immanent to technics and technology – i.e. the normative and genetic role of ubiquitous algorithmic networks in the organization of markets and socius. In order to explore these events, I propose an interdisciplinary theoretical framework informed largely by Gilbert Simondon’s philosophy of individuation and technics and the contemporary literature on the ontology of computation, supported by insights drawn from the history of finance and economic theory. This novel framework will provide the means to investigate the ontogenetic processes at work in the techno-cultural ecosystem following the digitalization of monetary and financial flows. Through an exploration of the fleeting materiality and multifaceted character of digital fiat money, the social power of algorithmic financial logic, and the new possibilities offered by the invention of the Bitcoin protocol, this research aims to challenge some of the bedrocks of the economic orthodoxy – economic and monetary value, liquidity, market rationality – in order to move beyond the overarching narrative of capitalism as a monolithic system. The thesis instead foregrounds the techno-historical contingencies that have led to the contemporary power formation. Furthermore, it argues that the ontogenetic character of algorithmic technology ushers in novel possibilities for the speculative engineering of alternative networks of value creation and distribution that have the potential to reverse the current balance of power.

    Using MapReduce Streaming for Distributed Life Simulation on the Cloud

    Distributed software simulations are indispensable in the study of large-scale life models but often require the use of technically complex lower-level distributed computing frameworks, such as MPI. We propose to overcome the complexity challenge by applying the emerging MapReduce (MR) model to distributed life simulations and by running such simulations on the cloud. Technically, we design optimized MR streaming algorithms for discrete and continuous versions of Conway’s Life according to a general MR streaming pattern. We chose Life because it is simple enough as a testbed for MR’s applicability to a-life simulations and general enough to make our results applicable to various lattice-based a-life models. We implement and empirically evaluate our algorithms’ performance on Amazon’s Elastic MR cloud. Our experiments demonstrate that a single MR optimization technique called strip partitioning can reduce the execution time of continuous life simulations by 64%. To the best of our knowledge, we are the first to propose and evaluate MR streaming algorithms for lattice-based simulations. Our algorithms can serve as prototypes in the development of novel MR simulation algorithms for large-scale lattice-based a-life models.
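
    The streaming pattern the authors describe can be sketched in the Hadoop Streaming style: a mapper and a reducer that read stdin and write stdout, chained once per generation. The per-line "x,y" live-cell format and the single-script layout are illustrative assumptions, not the paper's optimized strip-partitioned algorithms.

        #!/usr/bin/env python3
        import sys
        from collections import defaultdict

        def mapper():
            # For each live cell, emit a liveness marker for itself and a
            # count of 1 to each of its eight neighbours.
            for line in sys.stdin:
                x, y = map(int, line.split(","))
                print(f"{x},{y}\t0")
                for dx in (-1, 0, 1):
                    for dy in (-1, 0, 1):
                        if dx or dy:
                            print(f"{x+dx},{y+dy}\t1")

        def reducer():
            # A cell lives next generation with exactly 3 live neighbours,
            # or with 2 if it was already alive (its marker 0 was seen).
            counts, alive = defaultdict(int), set()
            for line in sys.stdin:
                key, val = line.rstrip("\n").split("\t")
                if val == "0":
                    alive.add(key)
                else:
                    counts[key] += 1
            for key, n in counts.items():
                if n == 3 or (n == 2 and key in alive):
                    print(key)

        if __name__ == "__main__":
            (mapper if sys.argv[1] == "map" else reducer)()

    One generation can be tested locally with "python3 life_mr.py map < cells.txt | sort | python3 life_mr.py reduce" (file names hypothetical); on Elastic MapReduce the same pair runs as the streaming job's mapper and reducer.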

    Volatility modeling and limit-order book analytics with high-frequency data

    The vast amount of information characterizing today’s high-frequency financial datasets poses both opportunities and challenges. Among the opportunities, existing methods can be employed to provide new insights into, and a better understanding of, the market’s complexity from different perspectives, while new methods, capable of fully exploiting all the information embedded in high-frequency datasets and addressing new issues, can be devised. The challenges are driven by data complexity: limit-order book datasets consist of hundreds of thousands of events, interacting with each other and affecting the event-flow dynamics. This dissertation aims at improving our understanding of the effective applicability of machine learning methods to mid-price movement prediction, of the nature of long-range autocorrelations in financial time series, and of the econometric modeling and forecasting of volatility dynamics in high-frequency settings. Our results show that simple machine learning methods can be successfully employed for mid-price forecasting; moreover, by adopting methods that rely on the natural tensor representation of financial time series, the inter-temporal connections captured by this convenient representation are shown to be relevant for the prediction of future mid-price movements. Furthermore, by using ultra-high-frequency order book data over a considerably long period, a quantitative characterization of the long-range autocorrelation is achieved by extracting the so-called scaling exponent. By jointly considering duration series of both inter- and cross-events, for different stocks, and separately for the bid and ask sides, long-range autocorrelations are found to be ubiquitous and qualitatively homogeneous. With respect to the scaling exponent, evidence of three cross-overs is found, and complex heterogeneous associations with a number of relevant economic variables are discussed. Lastly, the use of copulas as the main ingredient for modeling and forecasting realized measures of volatility is explored. The modeling background resembles, but generalizes, the well-known Heterogeneous Autoregressive (HAR) model. In-sample and out-of-sample analyses, based on several performance measures, statistical tests, and robustness checks, show forecasting improvements of copula-based modeling over the HAR benchmark.
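
    For reference, the HAR benchmark against which the copula models are compared regresses next-day realized volatility on daily, weekly, and monthly averages of past realized volatility; the notation below is the standard HAR-RV form, not necessarily the dissertation's exact specification:

        \[
        RV_{t+1} = \beta_0 + \beta_d \, RV_t + \beta_w \, \overline{RV}^{(w)}_t + \beta_m \, \overline{RV}^{(m)}_t + \varepsilon_{t+1},
        \qquad
        \overline{RV}^{(w)}_t = \frac{1}{5}\sum_{i=0}^{4} RV_{t-i},
        \quad
        \overline{RV}^{(m)}_t = \frac{1}{22}\sum_{i=0}^{21} RV_{t-i}.
        \]

    A copula-based generalization can, for instance, model the joint distribution of these daily, weekly, and monthly components rather than imposing the linear conditional mean above, which is one way the copula approach relaxes the HAR structure.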

    Experimental investigation and modelling of the heating value and elemental composition of biomass through artificial intelligence

    Abstract: Advances in artificial intelligence and blockchain technologies provide new predictive potential for the biomass energy value chain. However, for a prediction approach to stand in for experimental methodology, prediction accuracy must be high in order to develop high-fidelity, robust software that can serve as a tool in the decision-making process. The global standards for classification methods and energetic properties of biomass are still evolving, given the differing observations and results reported in the literature. Beyond this, a holistic understanding of the effect of particle size and geospatial factors on the physicochemical properties of biomass is needed to increase the uptake of bioenergy. This research therefore carried out an experimental investigation of selected bioresources and developed high-fidelity models, built on artificial intelligence, to classify biomass feedstocks and to predict the main elemental composition (carbon, hydrogen, and oxygen) on a dry basis and the heating value (MJ/kg) of biomass...
    Ph.D. (Mechanical Engineering Science)
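
    As a sketch of the kind of model described, the snippet below fits a regressor mapping dry-basis composition to heating value. The synthetic targets are generated from a Dulong-type correlation purely for illustration; the thesis's actual datasets, feature set, and model architecture are not given in the abstract.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(1)
        n = 200
        C = rng.uniform(40, 55, n)      # carbon, wt% dry basis (assumed range)
        H = rng.uniform(4, 8, n)        # hydrogen, wt% dry basis
        O = rng.uniform(30, 45, n)      # oxygen, wt% dry basis
        X = np.column_stack([C, H, O])
        # Dulong-type approximation of higher heating value, MJ/kg:
        y = 0.3383 * C + 1.443 * (H - O / 8) + rng.normal(0, 0.3, n)

        model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
        sample = np.array([[48.0, 6.0, 38.0]])   # hypothetical ultimate analysis
        print("predicted HHV (MJ/kg):", model.predict(sample)[0])

    With real ultimate-analysis data in place of the synthetic arrays, the same pipeline extends to classifying feedstocks and to predicting C, H, and O as outputs, as the thesis describes.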