
    Comprehensibility, Overfitting and Co-Evolution in Genetic Programming for Technical Trading Rules

    This thesis presents Genetic Programming methodologies for finding successful and understandable technical trading rules for financial markets. When applied to the S&P 500, the methods consistently beat the buy-and-hold strategy over a 12-year period, even after transaction costs; some of the methods discover rules that beat the S&P 500 at the 99% significance level. The work describes the use of a complexity-penalizing factor to avoid overfitting and to improve the comprehensibility of the rules produced by GP. The effect of this factor on returns in this domain is studied, and the results indicate that it increases the predictive ability of the rules. A restricted set of operators and domain knowledge are used to improve comprehensibility: arithmetic operators are eliminated, and a number of technical indicators beyond the widely used moving averages, such as trend lines and local maxima and minima, are added. A new evaluation function that tests for consistency of returns in addition to total returns is introduced. Different cooperative coevolutionary genetic programming strategies for improving returns are studied and the results analyzed. We find that paired-collaborator coevolution gives the best results.
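    To make the complexity-penalizing idea concrete, the following minimal Python sketch scores a candidate rule by its trading return net of transaction costs minus a size penalty. The tree encoding, the penalty weight and the cost figure are illustrative assumptions, not the thesis's exact fitness function.

```python
# Hypothetical sketch of a complexity-penalized fitness for a GP trading rule.
# The nested-tuple rule encoding and the penalty form are assumptions.

def rule_size(node):
    """Count nodes in a rule tree given as nested tuples, e.g. ('and', a, b)."""
    if not isinstance(node, tuple):
        return 1
    return 1 + sum(rule_size(child) for child in node[1:])

def fitness(rule, signals, prices, penalty=0.001, cost=0.002):
    """Return of the rule's trades minus a size penalty.

    `signals` is a list of booleans (the rule evaluated per day, assumed
    precomputed), `prices` the matching close prices, `cost` a per-trade
    transaction cost.
    """
    wealth, in_market = 1.0, False
    for today in range(1, len(prices)):
        if signals[today - 1]:                      # rule says "be long"
            if not in_market:
                wealth *= (1 - cost)                # pay cost on entry
                in_market = True
            wealth *= prices[today] / prices[today - 1]
        elif in_market:
            wealth *= (1 - cost)                    # pay cost on exit
            in_market = False
    return (wealth - 1.0) - penalty * rule_size(rule)
```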

    An integer programming approach for Balancing and Scheduling in Extended Manufacturing Environment

    In the fiercely competitive era induced by the expansion of open business archetypes, the managerial aspects of Extended Manufacturing Environments (EMEs) are attracting growing attention, and no possible operational improvement can be left unexplored. To enhance operational efficiency and capacity utilization, the balancing and scheduling problems of EMEs are therefore considered and an integer programme is proposed in this paper. The model is designed in a spreadsheet and solved with the What'sBest optimizer, and its capabilities are assessed on a test problem. The results demonstrate that the model is capable of defining optimized production schedules for EMEs. This study was conducted under FRGS project FRGS14-102-0343, funded by the Ministry of Higher Education (MOHE), Malaysia. The authors are grateful to MOHE and the Research Management Centre (RMC), International Islamic University Malaysia (IIUM), for their support.
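    The kind of assignment-style integer programme described above can be illustrated with a small model. The sketch below balances task load across two facilities using the PuLP modelling library; the data, variable names and load-balancing objective are assumptions for illustration, since the paper's model is built in a spreadsheet and solved with What'sBest.

```python
# Minimal illustrative integer programme for balancing work across facilities
# in an extended manufacturing environment. Data and formulation are assumed.
import pulp

tasks = {"T1": 4, "T2": 6, "T3": 3, "T4": 5}   # task -> processing time
facilities = ["F1", "F2"]

prob = pulp.LpProblem("eme_balancing", pulp.LpMinimize)

# x[t][f] = 1 if task t is assigned to facility f
x = pulp.LpVariable.dicts("x", (tasks, facilities), cat="Binary")
makespan = pulp.LpVariable("makespan", lowBound=0)

prob += makespan                                    # objective: balance load
for t in tasks:                                     # each task placed exactly once
    prob += pulp.lpSum(x[t][f] for f in facilities) == 1
for f in facilities:                                # facility load bounds makespan
    prob += pulp.lpSum(tasks[t] * x[t][f] for t in tasks) <= makespan

prob.solve()
for t in tasks:
    for f in facilities:
        if x[t][f].value() == 1:
            print(t, "->", f)
```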

    Continuous Process Improvement Implementation Framework Using Multi-Objective Genetic Algorithms and Discrete Event Simulation

    Purpose: Continuous process improvement is a hard problem, especially in high-variety/low-volume environments, due to the complex interrelationships between processes. The purpose of this paper is to address process improvement issues by simultaneously investigating the job sequencing and buffer size optimization problems.
    Design/methodology/approach: This paper proposes a continuous process improvement implementation framework using a modified genetic algorithm (GA) and discrete event simulation to achieve multi-objective optimization. The proposed combinatorial optimization module combines the job sequencing and buffer size optimization problems under a generic process improvement framework, where lead time and total inventory holding cost are the two optimization objectives. The approach uses discrete event simulation to mimic the manufacturing environment, the constraints imposed by the real environment and the different levels of variability associated with the resources.
    Findings: Compared to existing evolutionary algorithm-based methods, the proposed framework considers the interrelationship between succeeding and preceding processes and the variability that the job sequence and buffer size problems induce on each other. A computational analysis shows significant improvement from applying the proposed framework.
    Originality/value: A significant body of work exists in the areas of continuous process improvement, discrete event simulation and GAs, but little work has been found where GAs and discrete event simulation are used together to implement continuous process improvement as an iterative approach. In addition, the modified GA simultaneously addresses the job sequencing and buffer size optimization problems by considering their interrelationships and the effect of variability due to both on each other.
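    As a rough illustration of coupling a job sequence and buffer sizes in one candidate solution and scoring it by simulation, the sketch below uses a trivial stand-in simulator and a weighted-sum score inside a simple mutation loop. The encoding, the toy simulator and the scalarisation are assumptions, not the paper's modified GA or its discrete event model.

```python
# Illustrative encoding-and-evaluation sketch: a candidate couples a job
# sequence with buffer sizes and is scored by a (here, trivial) simulation
# returning lead time and inventory cost. All numbers are placeholders.
import random

JOBS = list(range(6))          # job indices
STATIONS = 3                   # number of buffered stations
MAX_BUFFER = 5

def random_candidate():
    seq = random.sample(JOBS, len(JOBS))
    buffers = [random.randint(1, MAX_BUFFER) for _ in range(STATIONS)]
    return seq, buffers

def simulate(seq, buffers):
    """Placeholder for the discrete event simulation: returns
    (lead_time, inventory_cost) via a crude proxy so the sketch runs."""
    lead_time = sum(abs(a - b) for a, b in zip(seq, sorted(seq))) + sum(buffers)
    inventory_cost = 2.0 * sum(buffers)
    return lead_time, inventory_cost

def score(candidate, w=0.5):
    lead_time, cost = simulate(*candidate)
    return w * lead_time + (1 - w) * cost   # weighted-sum scalarisation

# A simple mutate-and-select loop standing in for the modified GA.
population = [random_candidate() for _ in range(20)]
for _ in range(50):
    seq, buffers = min(population, key=score)
    child_seq = seq[:]
    i, j = random.sample(range(len(child_seq)), 2)
    child_seq[i], child_seq[j] = child_seq[j], child_seq[i]      # swap mutation
    child_buf = [min(MAX_BUFFER, max(1, b + random.choice((-1, 1)))) for b in buffers]
    population.append((child_seq, child_buf))
    population = sorted(population, key=score)[:20]

best = min(population, key=score)
print("best:", best, "score:", round(score(best), 2))
```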

    MT-EA4Cloud: A methodology for testing and optimising energy-aware cloud systems

    Currently, using conventional techniques to check and optimise the energy consumption of cloud systems is impractical, due to the massive computational resources required. An appropriate test suite focusing on the parts of the cloud to be tested must be efficiently synthesised and executed, and the correctness of the test results must be checked. Additionally, alternative cloud configurations that optimise the energy consumption of the cloud must be generated and analysed accordingly, which is challenging. To solve these issues we present MT-EA4Cloud, a formal approach to check the correctness – from an energy-aware point of view – of cloud systems and to optimise their energy consumption. To make the checking of energy consumption practical, MT-EA4Cloud combines metamorphic testing, evolutionary algorithms and simulation. Metamorphic testing makes it possible to formally model the underlying cloud infrastructure in the form of metamorphic relations. We use metamorphic testing to alleviate both the reliable test set problem, generating appropriate test suites focused on the features reflected in the metamorphic relations, and the oracle problem, using the metamorphic relations to check the generated results automatically. MT-EA4Cloud uses evolutionary algorithms to efficiently guide the search for optimising the energy consumption of cloud systems, which can be calculated using different cloud simulators. This work was supported by the Spanish MINECO/FEDER projects DArDOS, FAME and MASSIVE under Grants TIN2015-65845-C3-1-R, RTI2018-093608-B-C31 and RTI2018-095255-B-I00, and the Comunidad de Madrid project FORTE-CM under grant S2018/TCS-4314. The first author is also supported by the Universidad Complutense de Madrid Santander Universidades grant (CT17/17-CT18/17).
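    The oracle role of a metamorphic relation can be illustrated with a small sketch: a relation stating that adding a host for the same workload must not reduce total energy is checked against a toy energy model. Both the relation and the model are assumptions for illustration, not MT-EA4Cloud's actual relations or simulators.

```python
# Minimal sketch of a metamorphic relation acting as a test oracle for
# energy consumption. The energy model and the relation are assumed.

def simulated_energy(num_hosts, workload, idle_w=100.0, busy_w=250.0, hours=1.0):
    """Toy stand-in for a cloud simulator: energy (Wh) to run `workload`
    host-hours of work on `num_hosts` machines over `hours` hours."""
    busy_hours = min(workload, num_hosts * hours)
    idle_hours = num_hosts * hours - busy_hours
    return busy_w * busy_hours + idle_w * idle_hours

def check_relation(base_hosts, workload):
    """Follow-up test: same workload, one extra host. The relation is
    violated if the follow-up reports less energy than the source test."""
    source = simulated_energy(base_hosts, workload)
    follow_up = simulated_energy(base_hosts + 1, workload)
    return follow_up >= source

assert check_relation(base_hosts=4, workload=3.0)
```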

    Holistic, data-driven, service and supply chain optimisation: linked optimisation.

    The intensity of competition and technological advancement in the business environment have made companies collaborate and cooperate as a means of survival. This creates a chain of companies and business components with unified business objectives. However, managing the decision-making processes (such as scheduling, ordering, delivering and allocating) across the various business components while maintaining a holistic objective is a huge business challenge, as these operations are complex and dynamic: the overall chain of business processes is widely distributed across all the supply chain participants, so no individual collaborator has a complete overview of the processes. Increasingly, such decisions are automated and strongly supported by optimisation algorithms for manufacturing optimisation, B2B ordering, financial trading, transportation scheduling and allocation. However, most of these algorithms do not incorporate the complexity associated with interacting decision-making systems such as supply chains. It is well known that decisions made at one point in a supply chain can have significant consequences that ripple through linked production and transportation systems. Recently, global shocks to supply chains (COVID-19, climate change, the blockage of the Suez Canal) have demonstrated the importance of these interdependencies, and the need to create supply chains that are more resilient and have a significantly reduced impact on the environment. Such interacting decision-making systems need to be considered through an optimisation process, yet the interactions between them are not modelled. We therefore believe that modelling such interactions is an opportunity to provide computational extensions to current optimisation paradigms. This research study aims to develop a general framework for formulating and solving holistic, data-driven optimisation problems in service and supply chains. The research achieves this aim and contributes to scholarship in four ways. Firstly, it considers the complexities of supply chain problems from a linked-problem perspective, which leads to a formalism for characterising linked optimisation problems as a model for supply chains. Secondly, it adopts a method for creating a linked optimisation problem benchmark by linking existing classical benchmark sets, using a mix of classical optimisation problems, typically relating to supply chain decisions, to describe different modes of linkage in linked optimisation problems. Thirdly, several techniques for linking fragmented supply chain data have been proposed in the literature to identify data relationships; this thesis explores some of these techniques and combines them in specific ways to improve the data discovery process. Lastly, many state-of-the-art algorithms have been used in the literature to tackle supply chain problems; this research investigates these state-of-the-art optimisation algorithms and then designs suitable algorithmic approaches, inspired by the existing algorithms and the nature of the problem linkages, to address different problem linkages in supply chains.
    Considering the research findings and future perspectives, the study demonstrates the suitability of the algorithms to different linked structures involving two sub-problems, which suggests further investigation of issues such as the suitability of algorithms on more complex structures, benchmark methodologies, holistic goals and evaluation, process mining, game theory and dependency analysis.
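    A minimal sketch of what a linked evaluation could look like: the output of one toy sub-problem (what to produce under a capacity limit) feeds a second (how to pack the produced items into trucks), and candidates are judged on the combined objective. The two sub-problems and the greedy packing rule are assumptions for illustration, not the thesis's formalism or benchmark.

```python
# Illustrative "linked" evaluation: a production selection sub-problem whose
# output parameterises a shipping sub-problem. All data are placeholders.
from itertools import combinations

ITEMS = {"A": (3, 10), "B": (5, 14), "C": (4, 9), "D": (6, 18)}  # weight, value
CAPACITY = 10            # production capacity (knapsack-style sub-problem)
TRUCK_LIMIT = 7          # per-truck weight limit (packing-style sub-problem)

def production_value(selection):
    return sum(ITEMS[i][1] for i in selection)

def shipping_cost(selection, cost_per_truck=4):
    """Greedy first-fit packing of the produced items into trucks."""
    trucks = []
    for item in sorted(selection, key=lambda i: -ITEMS[i][0]):
        for truck in trucks:
            if sum(ITEMS[i][0] for i in truck) + ITEMS[item][0] <= TRUCK_LIMIT:
                truck.append(item)
                break
        else:
            trucks.append([item])
    return cost_per_truck * len(trucks)

# Exhaustively evaluate feasible selections on the combined (linked) objective.
best = max(
    (sel for r in range(len(ITEMS) + 1) for sel in combinations(ITEMS, r)
     if sum(ITEMS[i][0] for i in sel) <= CAPACITY),
    key=lambda sel: production_value(sel) - shipping_cost(sel),
)
print("produce:", best, "net value:", production_value(best) - shipping_cost(best))
```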