
    Applying the big bang-big crunch metaheuristic to large-sized operational problems

    In this study, we investigate the capability of the big bang-big crunch metaheuristic (BBBC) for managing operational problems, including combinatorial optimization problems. The BBBC is inspired by the big bang-big crunch theory of the evolution of the universe in physics and astronomy. Its two main phases are the big bang and the big crunch: the big bang phase creates a population of random initial solutions, while the big crunch phase shrinks these solutions into one elite solution represented by a centre of mass. This study examines the BBBC's effectiveness on assignment and scheduling problems, enhancing it with an elite pool of diverse, high-quality solutions; a simple descent heuristic as a local search method; implicit recombination; Euclidean distance; dynamic population size; and elitism strategies. These strategies provide a balanced search over a diverse, good-quality population. The investigation compares the proposed BBBC with similar metaheuristics on three classes of combinatorial optimization problems, namely quadratic assignment, bin packing, and job shop scheduling, where the incorporated strategies have a strong impact on the BBBC's performance. Experiments showed that the BBBC maintains a good balance between diversity and quality, produces high-quality solutions, and outperforms comparable metaheuristics (e.g., swarm intelligence and evolutionary algorithms) reported in the literature.
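
    As a rough illustration of the two phases described above, the sketch below implements a plain BBBC loop for a continuous test function; the fitness weighting, shrinking radius, and parameter values are generic textbook choices, not the enhanced variant studied in the paper.

```python
import numpy as np

def bbbc_minimize(f, dim, bounds, pop_size=50, iterations=200, seed=0):
    """Minimal big bang-big crunch sketch for minimizing f over a box.

    Big bang: scatter candidates around the current centre of mass.
    Big crunch: contract them into a fitness-weighted centre.
    Parameter names and values are illustrative, not the paper's.
    """
    rng = np.random.default_rng(seed)
    low, high = bounds
    pop = rng.uniform(low, high, size=(pop_size, dim))   # initial big bang
    best_x, best_f = None, np.inf
    for k in range(1, iterations + 1):
        fitness = np.array([f(x) for x in pop])
        i = fitness.argmin()
        if fitness[i] < best_f:
            best_x, best_f = pop[i].copy(), fitness[i]
        # Big crunch: fitness-weighted centre of mass (weights ~ 1/fitness).
        w = 1.0 / (fitness - fitness.min() + 1e-12)
        centre = (w[:, None] * pop).sum(axis=0) / w.sum()
        # Big bang: re-scatter around the centre, radius shrinking with k.
        radius = (high - low) * rng.standard_normal((pop_size, dim)) / k
        pop = np.clip(centre + radius, low, high)
    return best_x, best_f

# Example: minimize the sphere function in 5 dimensions.
x, fx = bbbc_minimize(lambda v: float((v ** 2).sum()), dim=5, bounds=(-5.0, 5.0))
```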

    Enabling Field Force Operational Sustainability: A Big Bang-Big Crunch Type-2 Fuzzy Logic System for Goal-Driven Simulation

    Business operational sustainability requires creating economic value, building healthy ecosystems, and developing strong communities. Hence, there is a need to develop solutions which can safeguard companies' business sustainability. Different solutions can have different costs and deliver different benefits, so they need to be evaluated before they are implemented. In reality, companies need to achieve certain targets according to their plans and strategies. Goal-Driven Simulation (GDS) is an approach that allows evaluating solutions before implementing them in real life, while focusing on achieving the desired targets. This paper presents a GDS approach based on an interval type-2 fuzzy logic system (IT2FLS) optimized by the big bang-big crunch (BB-BC) algorithm, with application to field force allocation in the telecommunications sector. The obtained results show the suitability of the proposed approach for modelling unexpected factors to protect business sustainability in the telecommunications field force allocation domain.

    A manufacturing system energy-efficient optimisation model for maintenance production workforce size determination using integrated fuzzy logic and quality function deployment approach

    In maintenance systems, the current approach to workforce analysis entails the use of metrics that focus exclusively on workforce cost and productivity. This method omits the "green" concept, which principally hinges on energy-efficient manufacturing, and also ignores production-maintenance integration. The approach is not accurate and cannot be heavily relied upon for sound maintenance decisions. Consequently, a comprehensive, scientifically motivated, cost-effective and environmentally conscious approach is needed. With this in view, the current study departs from the traditional approach by employing a combined fuzzy logic and quality function deployment approach interacting with three metaheuristics (colliding bodies optimisation, big bang-big crunch and particle swarm optimisation) for optimisation. The workforce size parameters are determined by maximising the workforce's earned value as well as electric power efficiency, subject to various real-life constraints. The efficacy and robustness of the model are tested with data from an aluminium products manufacturing system operating in a developing country. The results obtained indicate that the proposed colliding bodies optimisation framework is effective in comparison with the other techniques. This implies that the proposed methodology potentially offers substantial benefits in conserving energy, thus aiding environmental preservation and reducing energy costs. The principal novelty of the paper is the uniquely new method of quantifying the energy-savings contribution of the maintenance workforce.
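
    To make the structure of such a model concrete, the sketch below poses an illustrative penalised objective for workforce size that combines an earned-value term with an energy-efficiency term; all coefficients, constraint limits, and the brute-force scan standing in for the three metaheuristics are invented for illustration and are not the paper's model.

```python
def workforce_objective(n_workers, wage=120.0, budget=6000.0,
                        base_load_kw=40.0, kw_per_worker=1.8,
                        value_per_worker=310.0, penalty=1e6):
    """Illustrative penalised objective for maintenance workforce sizing.

    Combines an earned-value term with an energy-efficiency term and
    penalises budget violations; all coefficients are made-up stand-ins.
    """
    earned_value = value_per_worker * n_workers
    labour_cost = wage * n_workers
    power_kw = base_load_kw + kw_per_worker * n_workers
    efficiency = earned_value / power_kw          # value produced per kW drawn
    score = 0.5 * earned_value / budget + 0.5 * efficiency / 10.0
    if labour_cost > budget:                      # real-life budget constraint
        score -= penalty
    return score

# Any of the three metaheuristics could search this objective; here a
# brute-force scan over a small integer range stands in for them.
best_n = max(range(1, 51), key=workforce_objective)
```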

    A comprehensive survey on cultural algorithms

    Peer reviewed postprint.

    Susceptible exposed infectious recovered-machine learning for COVID-19 prediction in Saudi Arabia

    Susceptible exposed infectious recovered (SEIR) is among the epidemiological models used in forecasting the spread of disease in large populations, and it is a fitting model for coronavirus disease (COVID-19) spread prediction. However, in its original form, SEIR cannot measure the impact of lockdowns. Therefore, in the SEIR equation system used in this study, a variable was included to evaluate the impact of varying levels of social distancing on the transmission of COVID-19. Additionally, we applied artificial intelligence using a deep neural network machine learning (ML) technique. This improved SEIR model was applied to the initial spread data for Saudi Arabia available up to June 25th, 2021. The study shows that, without a lockdown, around 3.1 million people in Saudi Arabia could have been infected at the peak of the spread, which lasts about three months beginning from the lockdown date (March 21st). On the other hand, the Kingdom's partial lockdown policy was estimated to cut the number of infections to 0.5 million over nine months. The data show that stricter lockdowns may successfully flatten the COVID-19 curve in Saudi Arabia. We successfully predicted the COVID-19 epidemic's peaks and sizes using our modified deep neural network (DNN) and SEIR model.
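
    A minimal sketch of an SEIR integration with a social-distancing multiplier on the transmission rate is shown below; the rates, population size, and the simple Euler stepping are illustrative assumptions, not the study's fitted parameters, and the DNN component is omitted.

```python
def seir_with_distancing(beta=0.55, sigma=1 / 5.5, gamma=1 / 7.0,
                         alpha=0.4, N=35_000_000, I0=100,
                         days=270, dt=1.0):
    """Minimal SEIR sketch with a social-distancing multiplier.

    alpha scales the transmission rate beta (alpha = 1 means no lockdown,
    smaller alpha means stricter distancing). All rates and the population
    size are illustrative, not the study's fitted values.
    """
    S, E, I, R = N - I0, 0.0, float(I0), 0.0
    history = []
    for _ in range(int(days / dt)):
        new_exposed = alpha * beta * S * I / N * dt
        new_infectious = sigma * E * dt
        new_recovered = gamma * I * dt
        S -= new_exposed
        E += new_exposed - new_infectious
        I += new_infectious - new_recovered
        R += new_recovered
        history.append(I)
    return max(history)  # peak number of simultaneously infectious people

peak_no_lockdown = seir_with_distancing(alpha=1.0)
peak_partial_lockdown = seir_with_distancing(alpha=0.4)
```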

    A novel hybrid backtracking search optimization algorithm for continuous function optimization

    Stochastic optimization algorithms provide a robust and efficient approach for solving complex real-world problems. Backtracking Search Optimization Algorithm (BSA) is a new stochastic evolutionary algorithm, and the aim of this paper is to introduce a hybrid approach combining BSA and quadratic approximation (QA), called HBSA, for solving unconstrained non-linear, non-differentiable optimization problems. To validate the proposed method, the results are compared with five state-of-the-art particle swarm optimization (PSO) variants in terms of the numerical quality of the solutions. A sensitivity analysis of the BSA control parameter (F) is also performed.
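
    The quadratic approximation operator mentioned here typically fits a parabola through three candidate solutions and jumps to its vertex; the sketch below shows that step applied coordinate-wise to the three best individuals of a population. How the published HBSA interleaves this with BSA's mutation and crossover is not stated in the abstract, so the qa_refine wrapper is a hypothetical placement.

```python
import numpy as np

def quadratic_approximation_point(x1, x2, x3, f1, f2, f3, eps=1e-12):
    """Vertex of the parabola through (x1,f1), (x2,f2), (x3,f3), per coordinate.

    Returns x2 unchanged in coordinates where the denominator is (near) zero,
    i.e. where the three points give no usable curvature.
    """
    num = (x2**2 - x3**2) * f1 + (x3**2 - x1**2) * f2 + (x1**2 - x2**2) * f3
    den = (x2 - x3) * f1 + (x3 - x1) * f2 + (x1 - x2) * f3
    safe_den = np.where(np.abs(den) > eps, den, 1.0)
    vertex = 0.5 * num / safe_den
    return np.where(np.abs(den) > eps, vertex, x2)

def qa_refine(population, fitness, objective):
    """Hypothetical hybrid step: build a trial point from the three best
    individuals and keep it if it improves on the current best."""
    order = np.argsort(fitness)
    x1, x2, x3 = population[order[:3]]
    f1, f2, f3 = fitness[order[:3]]
    trial = quadratic_approximation_point(x1, x2, x3, f1, f2, f3)
    if objective(trial) < f1:
        population[order[0]] = trial
    return population
```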

    Enhanced grey wolf optimisation algorithm for feature selection in anomaly detection

    Anomaly detection deals with the identification of items that do not conform to an expected pattern or to other items present in a dataset. The performance of the different mechanisms used to perform anomaly detection depends heavily on the set of features used. Thus, not all features in the dataset should be used in the classification process, since some features may degrade classifier performance. Feature selection (FS) is a good mechanism that reduces the dimensionality of high-dimensional datasets by deleting irrelevant features. The Modified Binary Grey Wolf Optimiser (MBGWO) is a modern metaheuristic algorithm that has successfully been used for FS in anomaly detection. However, the MBGWO has several issues in finding a good quality solution. This study therefore proposes an enhanced binary grey wolf optimiser (EBGWO) algorithm for FS in anomaly detection to overcome these issues. The first modification enhances the initial population of the MBGWO using a heuristic-based Ant Colony Optimisation algorithm. The second modification develops a new position update mechanism using the Bat Algorithm movement. The third modification improves the control parameter of the MBGWO algorithm using indicators from the search process to refine the solution. The EBGWO algorithm was evaluated on NSL-KDD and six benchmark datasets from the University of California, Irvine (UCI) repository against ten benchmark metaheuristic algorithms. Experimental results of the EBGWO algorithm on the NSL-KDD dataset, in terms of the number of selected features and classification accuracy, are superior to those of the other benchmark optimisation algorithms. Moreover, experiments on the six UCI datasets showed that the EBGWO algorithm is superior to the benchmark algorithms in terms of classification accuracy and second best in terms of the number of selected features. The proposed EBGWO algorithm can be used for FS in anomaly detection tasks involving datasets of any size from various application domains.
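
    For orientation, the sketch below shows a baseline binary grey wolf optimiser for feature selection using a sigmoid transfer function; it deliberately omits the three EBGWO modifications (ACO-based initialisation, Bat-style movement, adaptive control parameter) and uses generic parameter values and a toy fitness.

```python
import numpy as np

def binary_gwo(fitness, n_features, n_wolves=20, iterations=100, seed=0):
    """Minimal binary grey wolf optimiser sketch for feature selection.

    Each wolf is a 0/1 mask over features; the continuous GWO update is
    squashed through a sigmoid and re-binarised. This is the baseline
    scheme, not the EBGWO modifications described in the abstract.
    """
    rng = np.random.default_rng(seed)
    wolves = rng.integers(0, 2, size=(n_wolves, n_features)).astype(float)
    for t in range(iterations):
        scores = np.array([fitness(w.astype(int)) for w in wolves])
        order = np.argsort(scores)                 # smaller fitness is better
        alpha, beta, delta = wolves[order[:3]]
        a = 2.0 - 2.0 * t / iterations             # linearly decreasing coefficient
        new = np.empty_like(wolves)
        for i, w in enumerate(wolves):
            x = np.zeros(n_features)
            for leader in (alpha, beta, delta):
                A = 2 * a * rng.random(n_features) - a
                C = 2 * rng.random(n_features)
                x += leader - A * np.abs(C * leader - w)
            x /= 3.0
            prob = 1.0 / (1.0 + np.exp(-10.0 * (x - 0.5)))   # sigmoid transfer
            new[i] = (rng.random(n_features) < prob).astype(float)
        wolves = new
    best = min(wolves, key=lambda w: fitness(w.astype(int)))
    return best.astype(int)

# Toy usage: prefer masks that select the first five of ten features.
target = np.array([1] * 5 + [0] * 5)
mask = binary_gwo(lambda m: int(np.sum(m != target)), n_features=10)
```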

    Ant Colony Optimization

    Ant Colony Optimization (ACO) is the best example of how studies aimed at understanding and modeling the behavior of ants and other social insects can provide inspiration for the development of computational algorithms for the solution of difficult mathematical problems. Introduced by Marco Dorigo in his PhD thesis (1992) and initially applied to the travelling salesman problem, the ACO field has experienced tremendous growth, standing today as an important nature-inspired stochastic metaheuristic for hard optimization problems. This book presents state-of-the-art ACO methods and is divided into two parts: (I) Techniques, which includes parallel implementations, and (II) Applications, where recent contributions of ACO to diverse fields, such as traffic congestion and control, structural optimization, manufacturing, and genomics, are presented.
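
    As a minimal, self-contained example of the scheme described above, the sketch below implements a basic Ant System for the travelling salesman problem; parameter defaults are illustrative, not the settings from Dorigo's thesis.

```python
import numpy as np

def ant_system_tsp(dist, n_ants=20, iterations=100, alpha=1.0, beta=3.0,
                   rho=0.5, q=1.0, seed=0):
    """Minimal Ant System sketch for the TSP.

    dist is a symmetric numpy distance matrix with zero diagonal. Ants build
    tours city by city with probabilities proportional to
    pheromone**alpha * (1/distance)**beta; pheromone evaporates at rate rho
    and is reinforced in proportion to tour quality.
    """
    rng = np.random.default_rng(seed)
    n = len(dist)
    tau = np.ones((n, n))                      # pheromone trails
    eta = 1.0 / (dist + np.eye(n))             # heuristic visibility (avoid /0 on diagonal)
    best_tour, best_len = None, np.inf
    for _ in range(iterations):
        tours = []
        for _ in range(n_ants):
            tour = [rng.integers(n)]
            while len(tour) < n:
                i = tour[-1]
                mask = np.ones(n, bool)
                mask[tour] = False
                w = (tau[i, mask] ** alpha) * (eta[i, mask] ** beta)
                nxt = rng.choice(np.flatnonzero(mask), p=w / w.sum())
                tour.append(int(nxt))
            length = sum(dist[tour[k], tour[(k + 1) % n]] for k in range(n))
            tours.append((tour, length))
            if length < best_len:
                best_tour, best_len = tour, length
        tau *= (1.0 - rho)                     # evaporation
        for tour, length in tours:             # pheromone deposit
            for k in range(n):
                tau[tour[k], tour[(k + 1) % n]] += q / length
    return best_tour, best_len

# Toy usage on a random symmetric distance matrix.
pts = np.random.default_rng(1).random((12, 2))
dist = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
tour, length = ant_system_tsp(dist)
```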