
    Bat Algorithm: Literature Review and Applications

    The bat algorithm (BA) is a bio-inspired algorithm developed by Yang in 2010, and it has been found to be very efficient; as a result, the literature on it has expanded significantly in the last three years. This paper provides a timely review of the bat algorithm and its new variants. A wide range of diverse applications and case studies are also reviewed and summarized briefly, and further research topics are discussed. (Comment: 10 pages.)
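    The review itself does not reproduce the update rules, but the standard bat algorithm of Yang (2010) can be summarized compactly. The sketch below is a minimal Python rendering of those frequency, velocity and position updates on a toy sphere function; the parameter values and the objective are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of the standard bat algorithm (Yang, 2010), assuming a toy
# sphere objective and illustrative parameter values.
import numpy as np

def bat_algorithm(obj, dim=10, n_bats=20, n_iter=500,
                  f_min=0.0, f_max=2.0, alpha=0.9, gamma=0.9,
                  lower=-5.0, upper=5.0, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(lower, upper, (n_bats, dim))   # bat positions
    v = np.zeros((n_bats, dim))                    # bat velocities
    loudness = np.ones(n_bats)                     # A_i
    pulse0 = rng.uniform(0.0, 1.0, n_bats)         # target pulse rates r_i^0
    pulse = np.zeros(n_bats)                       # r_i^t grows toward r_i^0
    fitness = np.array([obj(xi) for xi in x])
    best = x[fitness.argmin()].copy()
    best_fit = fitness.min()

    for t in range(1, n_iter + 1):
        for i in range(n_bats):
            # frequency, velocity and position updates
            freq = f_min + (f_max - f_min) * rng.random()
            v[i] += (x[i] - best) * freq
            cand = np.clip(x[i] + v[i], lower, upper)
            # occasional local random walk around the current best solution
            if rng.random() > pulse[i]:
                cand = np.clip(best + 0.01 * loudness.mean()
                               * rng.normal(size=dim), lower, upper)
            f_cand = obj(cand)
            # accept improving moves with probability tied to loudness,
            # then quieten the bat and raise its pulse rate
            if f_cand < fitness[i] and rng.random() < loudness[i]:
                x[i], fitness[i] = cand, f_cand
                loudness[i] *= alpha
                pulse[i] = pulse0[i] * (1.0 - np.exp(-gamma * t))
            if f_cand < best_fit:
                best, best_fit = cand.copy(), f_cand
    return best, best_fit

if __name__ == "__main__":
    sphere = lambda z: float(np.sum(z ** 2))
    _, val = bat_algorithm(sphere)
    print("best sphere value found:", val)
```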

    Intelligent instance selection techniques for support vector machine speed optimization with application to e-fraud detection.

    Doctor of Philosophy in Computer Science. University of KwaZulu-Natal, Durban, 2017. Decision-making is a very important aspect of many businesses. Wrong decisions carry grievous penalties, including financial loss, damage to company reputation and reduced company productivity. Hence, it is vitally important that managers make the right decisions. Machine Learning (ML) simplifies the process of decision-making: it helps to discover useful patterns in historical data, which can be used for meaningful decision-making. The ability to make strategic and meaningful decisions depends on the reliability of data. Currently, many organizations are overwhelmed with vast amounts of data, and unfortunately, ML algorithms cannot effectively handle large datasets. This thesis therefore proposes seven filter-based and five wrapper-based intelligent instance selection techniques for optimizing the speed and predictive accuracy of ML algorithms, with a particular focus on the Support Vector Machine (SVM). The thesis also proposes a novel fitness function for instance selection. The primary difference between the filter-based and wrapper-based techniques lies in their method of selection: the filter-based techniques use the proposed fitness function for selection, while the wrapper-based techniques use the SVM algorithm itself for selection. The proposed techniques are obtained by fusing the SVM algorithm with the following nature-inspired algorithms: the flower pollination algorithm, the social spider algorithm, the firefly algorithm, the cuckoo search algorithm and the bat algorithm. In addition, two of the filter-based techniques are boundary detection algorithms, inspired by edge detection in image processing and edge selection in ant colony optimization. Two different sets of experiments were performed to evaluate the proposed techniques (wrapper-based and filter-based). All experiments were performed on four datasets containing three popular e-fraud types: credit card fraud, email spam and phishing email. In addition, experiments were performed on 20 datasets provided by the well-known UCI data repository. The results show that the proposed filter-based techniques markedly improved SVM training speed on 100% (24 out of 24) of the datasets used for evaluation, without significantly affecting SVM classification quality. Moreover, the experimental results show that the wrapper-based techniques consistently improved SVM predictive accuracy on 78% (18 out of 23) of the datasets used for evaluation and simultaneously improved SVM training speed in all cases. Furthermore, two different statistical tests were conducted to further validate the credibility of the results: Friedman's test and Holm's post-hoc test. The statistical test results reveal that the proposed filter-based and wrapper-based techniques are significantly faster than standard SVM and some existing instance selection techniques in all cases. The statistical test results also reveal that the Cuckoo Search Instance Selection Algorithm outperforms all the other proposed techniques in terms of speed. Overall, the proposed techniques have proven to be fast and accurate ML-based e-fraud detection techniques, with improved training speed, predictive accuracy and storage reduction.
In real-life applications that require a classifier to be trained very quickly for speedy classification of new target concepts, such as video surveillance and intrusion detection systems, the filter-based techniques provide the best solutions, while the wrapper-based techniques are better suited for applications, such as email filters, that are very sensitive to slight changes in predictive accuracy.
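    As an illustration of the wrapper-based idea described above, the sketch below searches a binary mask over training instances and scores each candidate subset by the validation accuracy of an SVM trained on it. The search loop is a simple bit-flip hill climber standing in for the thesis's nature-inspired optimizers; the synthetic dataset, parameter values and helper names are assumptions for illustration only.

```python
# Sketch of wrapper-style instance selection: candidate instance subsets are
# scored by the validation accuracy of an SVM trained on them. A plain
# bit-flip hill climber stands in for the nature-inspired optimizers; the
# synthetic data and all parameter values are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

def subset_fitness(mask, X_tr, y_tr, X_val, y_val):
    # reject degenerate subsets with too few points or a single class
    if mask.sum() < 2 or len(np.unique(y_tr[mask])) < 2:
        return 0.0
    clf = SVC(kernel="rbf", gamma="scale").fit(X_tr[mask], y_tr[mask])
    return clf.score(X_val, y_val)

def wrapper_instance_selection(X_tr, y_tr, X_val, y_val,
                               n_iter=200, flip_frac=0.05, seed=0):
    rng = np.random.default_rng(seed)
    mask = rng.random(len(X_tr)) < 0.5             # start from a random subset
    best_fit = subset_fitness(mask, X_tr, y_tr, X_val, y_val)
    for _ in range(n_iter):
        cand = mask.copy()
        flips = rng.random(len(cand)) < flip_frac  # flip a few membership bits
        cand[flips] = ~cand[flips]
        fit = subset_fitness(cand, X_tr, y_tr, X_val, y_val)
        if fit >= best_fit:                        # accept non-worsening moves
            mask, best_fit = cand, fit
    return mask, best_fit

X, y = make_classification(n_samples=600, n_features=20, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)
mask, acc = wrapper_instance_selection(X_tr, y_tr, X_val, y_val)
print(f"kept {mask.sum()} of {len(mask)} training instances, "
      f"validation accuracy {acc:.3f}")
```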

    Lévy-Flight Krill Herd Algorithm

    To improve the performance of the krill herd (KH) algorithm, this paper proposes a Lévy-flight krill herd (LKH) algorithm for solving optimization tasks within limited computing time. The improvement is the addition of a new local Lévy-flight (LLF) operator during the krill-updating process, which improves the algorithm's efficiency and reliability in coping with global numerical optimization problems. The LLF operator encourages exploitation and makes the krill individuals search the space more carefully toward the end of the search. An elitism scheme is also applied to retain the best krill during the update process. Fourteen standard benchmark functions are used to verify the effect of these improvements, and the results illustrate that, in most cases, the performance of this novel metaheuristic LKH method is superior to, or at least highly competitive with, the standard KH and other population-based optimization methods. In particular, the new method accelerates global convergence toward the true global optimum while preserving the main features of the basic KH.
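    The heavy-tailed steps at the heart of such an operator can be drawn with Mantegna's method. The sketch below shows a Lévy-flight local perturbation of this kind; it illustrates the ingredient rather than the exact LLF update rule of the LKH paper, and the exponent and scale are assumed values.

```python
# Hedged sketch of the Levy-flight ingredient: Mantegna's method for drawing
# heavy-tailed step lengths, applied as a local perturbation around a
# solution. Not the exact LKH operator; beta and scale are assumed values.
import math
import numpy as np

def levy_steps(dim, beta=1.5, rng=None):
    if rng is None:
        rng = np.random.default_rng()
    sigma_u = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
               (math.gamma((1 + beta) / 2) * beta
                * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma_u, dim)
    v = rng.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)   # heavy-tailed step in each dimension

def levy_local_move(x, best, scale=0.01, rng=None):
    # mostly small refinements near x, with occasional long jumps relative to
    # the current best -- the exploit-with-rare-escape behaviour that
    # Levy flights provide
    if rng is None:
        rng = np.random.default_rng()
    return x + scale * levy_steps(len(x), rng=rng) * (x - best)

rng = np.random.default_rng(0)
x = rng.uniform(-5, 5, 10)
best = np.zeros(10)
print(levy_local_move(x, best, rng=rng))
```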

    Applied Metaheuristic Computing

    For decades, Applied Metaheuristic Computing (AMC) has been a prevailing optimization technique for tackling perplexing engineering and business problems such as scheduling, routing, ordering, bin packing, assignment and facility layout planning, among others. This is partly because classic exact methods are constrained by prior assumptions, and partly because heuristics are problem-dependent and lack generalization. AMC, by contrast, guides the course of low-level heuristics to search beyond the local optimality that limits traditional computation methods. This topic series has collected quality papers proposing cutting-edge methodologies and innovative applications that drive the advances of AMC.

    Essays on Optimization and Modeling Methods for Reliability and Reliability Growth

    This research proposes novel solution techniques in the realm of reliability and reliability growth. We first consider a redundancy allocation problem to design a system that maximizes the reliability of a complex series-parallel system comprised of components with deterministic reliability. We propose a new meta-heuristic, inspired by the behavior of bats hunting prey, to find component allocations and redundancy levels that provide optimal or near-optimal system reliability. Each component alternative has an associated cost and weight, and the system is constrained by cost and weight factors. We allow for component mixing within a subsystem, with a pre-defined maximum level of component redundancy per subsystem, which adds to problem complexity and prevents an optimal solution from being derived analytically. The second problem of interest involves how we model a system's reliability growth as it undergoes testing and how we minimize deviation from planned growth. We propose a grey model, GM(1,1), for modeling reliability growth on complex systems when failure data are sparse. The GM(1,1) model's performance is benchmarked against the Army Materiel Systems Analysis Activity (AMSAA) model, the standard within the reliability growth modeling community. For continuous and discrete (one-shot) testing, the GM(1,1) model shows itself to be superior to the AMSAA model when modeling reliability growth with small failure data sets. Finally, to ensure the reliability growth planning curve is followed as closely as possible, we determine the best level of corrective action to employ on a discovered failure mode, with corrective action levels allowed to vary based upon the amount of resources allocated for failure mode improvement. We propose a Markov Decision Process (MDP) approach to handle the stochasticity of failure data and the corresponding system reliability estimate. By minimizing a weighted deviation from the planning curve, systems will ideally meet the reliability milestones specified by the planning curve, while simultaneously avoiding system over-development and unnecessary resource expenditure for over-correction of failure modes.
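    A minimal sketch of the GM(1,1) grey forecasting model of the kind benchmarked against the AMSAA model is given below: the two grey parameters are fitted by least squares on the accumulated series and then used to forecast future values. The failure-count series in the example is made up purely for illustration.

```python
# Minimal GM(1,1) grey forecasting sketch: fit the development coefficient
# and grey input by least squares on the accumulated series, then forecast.
# The observed failure counts below are hypothetical.
import numpy as np

def gm11_fit_predict(x0, n_forecast=3):
    x0 = np.asarray(x0, dtype=float)
    n = len(x0)
    x1 = np.cumsum(x0)                               # accumulated (AGO) series
    z1 = 0.5 * (x1[1:] + x1[:-1])                    # background values
    B = np.column_stack([-z1, np.ones(n - 1)])
    Y = x0[1:]
    a, b = np.linalg.lstsq(B, Y, rcond=None)[0]      # development coeff., grey input
    k = np.arange(n + n_forecast)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    x0_hat = np.concatenate([[x0[0]], np.diff(x1_hat)])  # inverse AGO
    return x0_hat[:n], x0_hat[n:]                    # fitted values, forecasts

observed = [12, 10, 9, 9, 7, 6]      # hypothetical failures per test phase
fitted, forecast = gm11_fit_predict(observed)
print("fitted:  ", np.round(fitted, 2))
print("forecast:", np.round(forecast, 2))
```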

    Pertanika Journal of Science & Technology


    Shortest Route at Dynamic Location with Node Combination-Dijkstra Algorithm

    Abstract— Online transportation has become a basic need of the general public, supporting everyday activities such as commuting to work or school and traveling to tourist sites. Transportation services compete to provide the best service so that consumers feel comfortable using them; one key part of that service is finding the shortest route when picking up a customer or delivering them to a destination. The node combination method minimizes memory usage and performs better than A* and ant colony optimization for Dijkstra-style shortest-route search, but it cannot store the history of nodes already traversed; as a result, the plain node combination algorithm is well suited to finding the shortest distance but not the shortest route. This paper modifies the node combination algorithm to solve the problem of finding the shortest route for dynamic locations obtained from the transport fleet, displaying the nodes along the shortest path, and implements it in a geographic information system as a map to make the system easier to use. Keywords— Shortest Path, Dijkstra Algorithm, Node Combination, Dynamic Location
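    For concreteness, the sketch below is a baseline Dijkstra shortest-route computation with a binary heap on a small made-up graph; the node-combination modification proposed in the paper is not reproduced here.

```python
# Baseline Dijkstra shortest-path sketch with a binary heap. The graph is a
# made-up example; the node-combination variant is not implemented here.
import heapq

def dijkstra(graph, source):
    # graph: {node: [(neighbour, weight), ...]}
    dist = {source: 0.0}
    prev = {}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                      # stale queue entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    return dist, prev

def route(prev, target):
    # walk predecessor links back to the source
    path = [target]
    while path[-1] in prev:
        path.append(prev[path[-1]])
    return list(reversed(path))

graph = {"A": [("B", 4), ("C", 2)], "C": [("B", 1), ("D", 7)],
         "B": [("D", 3)], "D": []}
dist, prev = dijkstra(graph, "A")
print(dist["D"], route(prev, "D"))   # 6.0 ['A', 'C', 'B', 'D']
```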