17 research outputs found

    A Big-Bang Big-Crunch Type-2 Fuzzy Logic-based System for Malaria Epidemic Prediction in Ethiopia

    Get PDF
    ABSTRACT - Malaria is a life-threatening disease caused by Plasmodium parasite infection, with huge medical, economic, and social impact. Malaria has been a serious public health problem in Ethiopia since 1959, although its morbidity and mortality have declined since 2001. Various studies have predicted Malaria epidemics using mathematical and statistical regression approaches; however, these had no learning capabilities. In this paper, we present a type-2 fuzzy logic-based system for Malaria epidemic prediction (MEP) in Ethiopia, optimized by the Big-Bang Big-Crunch (BBBC) approach to maximize the model's accuracy and interpretability in predicting future occurrences of Malaria. We compared the proposed BBBC-optimized type-2 fuzzy logic-based system against its type-1 counterpart (T1FLS), a non-optimized T2FLS, ANFIS, and ANN. The results show that the optimized T2FLS provides a more interpretable model that predicts the future occurrence of Malaria one to three months ahead with optimal accuracy. This helps answer the question of when and where preparations must be made to prevent and control Malaria epidemics, since the rules generated by our system were able to explain the situations and intensity of the input factors contributing to Malaria epidemics and outbreaks.

    Symbiotic Organisms Search Algorithm: theory, recent advances and applications

    Get PDF
    The symbiotic organisms search algorithm is a very promising recent metaheuristic algorithm. It has received a plethora of attention across numerical optimization research as well as engineering design practice, and has since undergone several modifications, either as hybridizations or as other improved variants of the original algorithm. However, despite these remarkable achievements and the rapidly expanding body of literature on the symbiotic organisms search algorithm within its short history in the field of swarm intelligence optimization techniques, there has been no collective and comprehensive study of the success of its various implementations. As a way forward, this paper provides an overview of research on symbiotic organisms search algorithms from inception to the time of writing, detailing various application scenarios with variant and hybrid implementations, and suggesting future research directions.
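
For readers unfamiliar with the algorithm the overview surveys, the three symbiotic phases of the original SOS (mutualism, commensalism, parasitism) can be sketched as follows. This is an illustrative 1-D minimizer written for this listing, not code from any surveyed paper, and the function name is our own.

```python
import random

def sos_minimize(f, lb, ub, pop_size=20, iters=100, seed=0):
    """Minimal sketch of Symbiotic Organisms Search (mutualism,
    commensalism, parasitism phases) for a 1-D objective f on [lb, ub]."""
    rng = random.Random(seed)
    pop = [rng.uniform(lb, ub) for _ in range(pop_size)]
    clip = lambda x: min(max(x, lb), ub)
    for _ in range(iters):
        best = min(pop, key=f)
        for i in range(pop_size):
            j = rng.randrange(pop_size)
            # Mutualism: i and j both move toward the best via a mutual vector.
            mutual = (pop[i] + pop[j]) / 2
            bf1, bf2 = rng.choice([1, 2]), rng.choice([1, 2])
            xi = clip(pop[i] + rng.random() * (best - mutual * bf1))
            xj = clip(pop[j] + rng.random() * (best - mutual * bf2))
            if f(xi) < f(pop[i]): pop[i] = xi
            if f(xj) < f(pop[j]): pop[j] = xj
            # Commensalism: i benefits from interacting with j; j is unaffected.
            xi = clip(pop[i] + rng.uniform(-1, 1) * (best - pop[j]))
            if f(xi) < f(pop[i]): pop[i] = xi
            # Parasitism: a (possibly mutated) copy of i tries to replace a random host.
            parasite = clip(pop[i] if rng.random() < 0.5 else rng.uniform(lb, ub))
            k = rng.randrange(pop_size)
            if f(parasite) < f(pop[k]): pop[k] = parasite
    return min(pop, key=f)
```

Greedy acceptance in every phase means the population fitness never worsens, which is one reason the algorithm needs no tunable control parameters beyond population size and iteration count.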

    BAS-ADAM: an ADAM based approach to improve the performance of beetle antennae search optimizer

    Get PDF
    In this paper, we propose enhancements to the Beetle Antennae Search (BAS) algorithm, called BAS-ADAM, to smooth its convergence behavior and avoid trapping in local minima for highly non-convex objective functions. We achieve this by adaptively adjusting the step-size in each iteration using the adaptive moment estimation (ADAM) update rule. The proposed algorithm also increases the convergence rate in narrow valleys. A key feature of the ADAM update rule is its ability to adjust the step-size for each dimension separately instead of using a single shared step-size. Since ADAM is traditionally used with gradient-based optimization algorithms, we first propose a gradient estimation model that does not require differentiating the objective function. As a result, the algorithm demonstrates excellent performance and a fast convergence rate in searching for the optima of non-convex functions. The efficiency of the proposed algorithm was tested on three different benchmark problems, including the training of a high-dimensional neural network, and its performance is compared with the particle swarm optimizer (PSO) and the original BAS algorithm.
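
The idea described above can be sketched minimally by combining antenna-style probing with the ADAM update. This is our own hedged reconstruction, not the authors' implementation; the gradient model shown (a central difference along a random unit "antenna" direction) is only one plausible choice, and all names are illustrative.

```python
import math, random

def bas_adam(f, x0, d=0.1, lr=0.1, iters=200,
             beta1=0.9, beta2=0.999, eps=1e-8, seed=0):
    """Sketch of a BAS-style optimizer with an ADAM step-size rule.
    The gradient of f is estimated from two antenna probes (central
    difference along a random unit direction), so f need not be
    differentiable analytically."""
    rng = random.Random(seed)
    x = list(x0)
    n = len(x)
    m = [0.0] * n  # first-moment (mean) estimate, per dimension
    v = [0.0] * n  # second-moment (uncentered variance) estimate
    for t in range(1, iters + 1):
        # Random unit direction for the two "antennae".
        b = [rng.gauss(0, 1) for _ in range(n)]
        norm = math.sqrt(sum(bi * bi for bi in b)) or 1.0
        b = [bi / norm for bi in b]
        xr = [xi + d * bi for xi, bi in zip(x, b)]
        xl = [xi - d * bi for xi, bi in zip(x, b)]
        diff = (f(xr) - f(xl)) / (2 * d)     # directional derivative estimate
        g = [diff * bi for bi in b]          # projected back per dimension
        for i in range(n):
            m[i] = beta1 * m[i] + (1 - beta1) * g[i]
            v[i] = beta2 * v[i] + (1 - beta2) * g[i] ** 2
            mh = m[i] / (1 - beta1 ** t)     # bias-corrected moments
            vh = v[i] / (1 - beta2 ** t)
            # ADAM: the step-size adapts separately in each dimension.
            x[i] -= lr * mh / (math.sqrt(vh) + eps)
    return x
```

The per-dimension division by `sqrt(vh)` is what lets the step shrink in steep directions and grow in flat ones, which is the behavior the abstract credits for faster progress along narrow valleys.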

    Safe experimentation dynamics algorithm for data-driven PID controller of a class of underactuated systems

    Get PDF
    In recent decades, various control strategies for underactuated mechanical systems (UMS) have been widely reported, most derived from the systems' models. Due to unmodeled dynamics, there is a significant gap between control theory and its actual applications, which makes model-based controllers difficult to apply. In recent years, control researchers have been turning to data-driven control to eliminate this gap: its control performance is independent of the plant model's accuracy, because the controller design is founded only on input-output (I/O) data measured from the actual plant. In industry, the proportional-integral-derivative (PID) controller is the most widely implemented control method because of its simplicity, understandability, and reliability for industrial purposes. So far, the tuning methods used for data-driven PID control of underactuated systems are mostly based on multi-agent optimization, which requires substantial computation time and is therefore impractical for on-line tuning; a tuning strategy that requires less computation time is thus needed. Previously, stochastic approximation based methods such as the norm-limited simultaneous perturbation stochastic approximation (NL-SPSA) and global NL-SPSA (G-NL-SPSA) have shown successful results as tools for data-driven PID tuning. Notably, the SPSA- and GSPSA-based methods only produce the design parameter at the final iteration, whereas a method with a memory feature could retain a better design parameter found during the tuning process. Hence, a memory-based optimization tool has good potential to retain the optimal design parameter during PID tuning. This also addresses the limitation of existing memory-based algorithms such as random search (RS) and simulated annealing (SA), which currently produce less control accuracy due to the local-minimum problem. Motivated by the limitations of current methods, there is an advantage to using safe experimentation dynamics (SED) as an optimization tool. SED offers a memory-based feature and performs effectively with less computation time across a range of optimization problems, even for high-dimensional parameter tuning. Moreover, the SED algorithm has fewer design parameters to set and is independent of the gain sequence in the tuning process. Previously, the SED algorithm had been applied to a wind-farm control scheme to optimize total power production, but had yet to be applied to PID tuning; it is therefore worthwhile to study its effectiveness there. In this study, the efficiency of the proposed approach is tested by applying PID controller tuning to a slosh control system, a double-pendulum-type overhead crane (DPTOC) control system, and a multi-input-multi-output (MIMO) crane control system. Performance was evaluated using numerical examples in terms of tracking performance and control input energy. Thirty trials were performed to evaluate the SED, norm-limited SPSA (NL-SPSA), global norm-limited SPSA (G-NL-SPSA), and RS algorithms in each example. When the pre-stated termination condition is met, each method is evaluated through statistical analysis of the objective function, the total norm of the error, and the total norm of the input. Then, the rise time, settling time, and percentage overshoot of the single best trial out of the 30 were observed for each method. For the DPTOC control system, we also present examples with disturbance, where the performance comparison is made only between the SED-based and G-NL-SPSA-based methods. In addition, the average percentage improvement of the control objective over the 30 trials was observed for each method.
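
The memory feature the abstract emphasises can be illustrated with a minimal SED-style tuning loop. The parameter names (`k_g` for the keep probability, `k_sigma` for the perturbation size) and the cost function in the usage example are our own illustrative choices, not the thesis's exact formulation.

```python
import random

def sed_tune(objective, theta0, k_g=0.6, k_sigma=0.5, iters=500, seed=0):
    """Sketch of Safe Experimentation Dynamics for data-driven PID tuning.
    One parameter at a time is perturbed; the best-so-far parameter vector
    is kept in memory, so the final answer never regresses below the best
    design parameter seen during the tuning process."""
    rng = random.Random(seed)
    best = list(theta0)
    best_cost = objective(best)
    for _ in range(iters):
        trial = list(best)
        i = rng.randrange(len(trial))
        if rng.random() > k_g:  # experiment with probability 1 - k_g
            trial[i] = best[i] + k_sigma * rng.uniform(-1, 1)
        cost = objective(trial)  # in data-driven tuning: one plant experiment
        if cost < best_cost:     # memory feature: retain only improvements
            best, best_cost = trial, cost
    return best, best_cost
```

In the data-driven setting each `objective` call stands for one I/O experiment on the actual plant (e.g. tracking error plus input energy for candidate PID gains `[kp, ki, kd]`); only one evaluation per iteration is needed, which is the source of the low computation time claimed above.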

    A Hybrid Chimp Optimization Algorithm and Generalized Normal Distribution Algorithm with Opposition-Based Learning Strategy for Solving Data Clustering Problems

    Full text link
    This paper is concerned with data clustering: separating clusters based on the connectivity principle to categorize similar and dissimilar data into different groups. Although classical clustering algorithms such as K-means are efficient, they often become trapped in local optima and converge slowly on high-dimensional problems. To address these issues, many successful meta-heuristic optimization algorithms and intelligence-based methods have been introduced to attain the optimal solution in a reasonable time; they are designed to escape local optima by allowing flexible movements or random behaviors. In this study, we conceptualize a powerful approach using three main components: the Chimp Optimization Algorithm (ChOA), the Generalized Normal Distribution Algorithm (GNDA), and the Opposition-Based Learning (OBL) method. Firstly, two versions of ChOA with two different independent-group strategies and seven chaotic maps, entitled ChOA(I) and ChOA(II), are presented to achieve the best possible result for data clustering. Secondly, a novel combination of the ChOA and GNDA algorithms with the OBL strategy is devised to address the major shortcomings of the original algorithms. Lastly, the proposed ChOAGNDA method is a Selective Opposition (SO) algorithm based on ChOA and GNDA, applicable to large and complex real-world optimization problems, particularly data clustering. The results are evaluated against seven popular meta-heuristic optimization algorithms and eight recent state-of-the-art clustering techniques. Experimental results illustrate that the proposed work significantly outperforms existing methods in minimizing the Sum of Intra-Cluster Distances (SICD), obtaining the lowest Error Rate (ER), accelerating convergence, and finding optimal cluster centers.
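
The OBL component combined with ChOA and GNDA above can be sketched in isolation. This is a generic opposition-based initialization, shown only to make the strategy concrete; it is not the paper's full ChOAGNDA or Selective Opposition method, and the function name is our own.

```python
import random

def obl_init(f, lb, ub, dim, pop_size, seed=0):
    """Sketch of Opposition-Based Learning (OBL) initialization: for each
    random candidate x, also evaluate its opposite point lb + ub - x and
    keep the fitter of the two. This doubles the chance of starting near
    a good region at the cost of one extra evaluation per candidate."""
    rng = random.Random(seed)
    pop = []
    for _ in range(pop_size):
        x = [rng.uniform(lb, ub) for _ in range(dim)]
        x_opp = [lb + ub - xi for xi in x]  # opposite candidate
        pop.append(min(x, x_opp, key=f))    # greedy choice between the pair
    return pop
```

The same opposite-point test can also be applied during iterations (as in selective-opposition variants) to pull stagnating individuals out of a poor half of the search space.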

    Role of Metaheuristics in Optimizing Microgrids Operating and Management Issues: A Comprehensive Review

    Get PDF
    The increased interest in renewable-based microgrids imposes several challenges, such as source integration, power quality, and operating cost. Dealing with these problems requires solving nonlinear optimization problems with multiple linear or nonlinear constraints and continuous or discrete variables, whose large search spaces make finding the optimal or sub-optimal solution difficult. These problems include optimal power flow in the microgrid, the best possible configurations, and the accuracy of models within the microgrid. Metaheuristic optimization algorithms are increasingly proposed in the literature for microgrid applications to solve such optimization problems. This paper thoroughly reviews some significant issues surrounding microgrid operation and their solution using metaheuristic optimization algorithms. It first provides a collection of fundamental principles and concepts describing metaheuristic optimization algorithms. Then, the most significant metaheuristic optimization algorithms published in recent years in the context of microgrid applications are investigated and analyzed. Finally, the application of metaheuristic optimization algorithms to specific microgrid issues is reviewed, with examples of the algorithms used. These issues include unit commitment, economic dispatch, optimal power flow, distribution system reconfiguration, transmission network expansion and distribution system planning, load and generation forecasting, maintenance scheduling, and maximum power point tracking for renewable sources.

    An Evolutionary Pentagon Support Vector Finder Method

    Get PDF
    Dealing with big data requires effective algorithms; effectiveness that depends, among other things, on the ability to remove outliers from the data set, especially in classification problems. To this end, support vector finder algorithms have been created to retain only the most important data in the data pool. Nevertheless, existing algorithms used for this purpose, such as Fuzzy C-Means (FCM), suffer from imprecise setting of the initial cluster centers. In this paper, we avoid these shortcomings and aim to find and remove unnecessary data in order to speed up the final classification task without losing vital samples and without harming final accuracy. To this end, we present a unique approach for finding support vectors, named the evolutionary Pentagon Support Vector (PSV) finder method. The originality of this research lies in combining geometrical computations and evolutionary algorithms to build a more effective system, with the advantage of higher accuracy on some data sets. The proposed method is tested on seven benchmark data sets, and the results are compared to those obtained by classifying the original data (classification before and after PSV) under the same conditions. The testing returned promising results.

    The need for fuzzy AI

    Get PDF
    Artificial intelligence (AI) is once again a topic of huge interest for computer scientists around the world. While advances in the capability of machines are being made at an incredible rate, there is also increasing focus on the need for computerised systems to be able to explain their decisions, at least to some degree. It is also clear that data and knowledge in the real world are characterised by uncertainty. Fuzzy systems can provide decision support that both handles uncertainty and has explicit representations of uncertain knowledge and inference processes. However, it is not yet clear how any decision support system, including those featuring fuzzy methods, should be evaluated to determine whether its use is permitted. This paper presents a conceptual framework of indistinguishability as the key component of the evaluation of computerised decision support systems. Case studies are presented in which human expert performance is clearly demonstrated to be less than perfect, together with techniques that may enable fuzzy systems to emulate human-level performance, including its variability. In conclusion, this paper argues for the need for 'fuzzy AI' in two senses: (i) the need for fuzzy methodologies (in the technical sense of Zadeh's fuzzy sets and systems) as knowledge-based systems to represent and reason with uncertainty; and (ii) the need for fuzziness (in the non-technical sense), with an acceptance of imperfect performance, in evaluating AI systems.

    Analyzing evolution of rare events through social media data

    Get PDF
    Recently, some researchers have attempted to find a relationship between the evolution of rare events and the temporal-spatial patterns of social media activities. Their studies verify that the relationship exists in both the time and spatial domains. However, few of those studies can accurately deduce the time point at which social media activities are most highly affected by a rare event, because producing an accurate temporal pattern of social media during the evolution of a rare event is very difficult. This work expands the current studies along three directions. Firstly, we focus on the intensity of information volume and propose an innovative clustering-algorithm-based data processing method to characterize the evolution of a rare event by analyzing social media data. Secondly, novel feature extraction and fuzzy logic-based classification methods are proposed to distinguish and classify event-related and unrelated messages. Lastly, since many messages do not have ground truth, we run four existing ground-truth inference algorithms to deduce the ground truth and compare their performance. An Adaptive Majority Voting (Adaptive MV) method is then proposed and compared with two of the existing algorithms on a set of manually labeled social media data. Our case studies focus on Hurricane Sandy in 2012 and Hurricane Maria in 2017, with Twitter data collected around them used to verify the effectiveness of the proposed methods. Firstly, the results of the proposed data processing method not only verify that a rare event and social media activities are strongly correlated, but also reveal a time difference between them, which helps in investigating the temporal pattern of social media activities. Secondly, the fuzzy logic-based feature extraction and classification methods are effective in identifying event-related and unrelated messages. Lastly, the Adaptive MV method deduces the ground truth well and performs better on datasets with noisy labels than the other two methods, Positive Label Frequency Threshold and Majority Voting.
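
Of the ground-truth inference methods named above, plain Majority Voting is simple enough to sketch. The Adaptive MV method's adaptation scheme is not detailed in the abstract, so only the baseline it is compared against is shown here; the label values in the example are illustrative.

```python
from collections import Counter

def majority_vote(labels_per_item):
    """Plain Majority Voting baseline for ground-truth inference: each
    item's inferred label is the most frequent label its annotators gave.
    Ties are broken by first occurrence (Counter preserves insertion order)."""
    return [Counter(labels).most_common(1)[0][0] for labels in labels_per_item]

# Two tweets, each labeled by three annotators as event-related or not.
inferred = majority_vote([
    ["related", "related", "unrelated"],
    ["unrelated", "unrelated", "related"],
])
```

A weighted or adaptive variant would replace the raw counts with per-annotator reliability weights learned from agreement statistics, which is the general direction an Adaptive MV method takes.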