3,421 research outputs found
A Hybrid Chimp Optimization Algorithm and Generalized Normal Distribution Algorithm with Opposition-Based Learning Strategy for Solving Data Clustering Problems
This paper is concerned with data clustering to separate clusters based on
the connectivity principle for categorizing similar and dissimilar data into
different groups. Although classical clustering algorithms such as K-means are
efficient techniques, they often become trapped in local optima and converge
slowly on high-dimensional problems. To address these issues,
many successful meta-heuristic optimization algorithms and intelligence-based
methods have been introduced to attain the optimal solution in a reasonable
time. They are designed to escape from a local optimum problem by allowing
flexible movements or random behaviors. In this study, we attempt to
conceptualize a powerful approach using the three main components: Chimp
Optimization Algorithm (ChOA), Generalized Normal Distribution Algorithm
(GNDA), and Opposition-Based Learning (OBL) method. Firstly, two versions of
ChOA with two different independent groups' strategies and seven chaotic maps,
entitled ChOA(I) and ChOA(II), are presented to achieve the best possible
result for data clustering purposes. Secondly, a novel combination of ChOA and
GNDA algorithms with the OBL strategy is devised to solve the major
shortcomings of the original algorithms. Lastly, the proposed ChOAGNDA method
is a Selective Opposition (SO) algorithm based on ChOA and GNDA, which can be
used to tackle large and complex real-world optimization problems, particularly
data clustering applications. The results are evaluated against seven popular
meta-heuristic optimization algorithms and eight recent state-of-the-art
clustering techniques. Experimental results illustrate that the proposed work
significantly outperforms other existing methods in terms of the achievement in
minimizing the Sum of Intra-Cluster Distances (SICD), obtaining the lowest
Error Rate (ER), accelerating the convergence speed, and finding the optimal
cluster centers.
Comment: 48 pages, 14 Tables, 12 Figures
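The Opposition-Based Learning (OBL) strategy named above has a simple core idea: for every candidate solution x in [lb, ub], also evaluate its opposite lb + ub − x and keep whichever half of the combined pool is fitter. A minimal sketch of that idea follows; the function names and the pairing with a generic objective are illustrative assumptions, not the paper's ChOAGNDA implementation:

```python
import numpy as np

def opposition_based_init(pop_size, dim, lb, ub, rng=None):
    """Draw a random population and its opposite points.

    OBL core rule: the opposite of x in [lb, ub] is lb + ub - x.
    (Illustrative sketch, not the paper's exact procedure.)
    """
    rng = np.random.default_rng(rng)
    pop = lb + (ub - lb) * rng.random((pop_size, dim))
    opp = lb + ub - pop                      # opposite points
    return pop, opp

def keep_fitter(pop, opp, f):
    """Merge population and opposites; keep the pop_size fittest (minimization)."""
    merged = np.vstack([pop, opp])
    fitness = np.apply_along_axis(f, 1, merged)
    idx = np.argsort(fitness)[:len(pop)]
    return merged[idx]
```

Selective Opposition variants, as in the proposed method, typically apply this opposition step only to a chosen subset of candidates rather than to the whole population.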
Using the modified k-mean algorithm with an improved teaching-learning-based optimization algorithm for feedforward neural network training
In this paper we propose a novel procedure for training a feedforward neural network. Once a suitable network structure has been determined for a problem, the accuracy of the network's outputs depends on choosing an appropriate method for finding the best weights, i.e. an appropriate training algorithm. A training algorithm that starts from a good initial point is several steps closer to reaching the global optimum. We therefore present an optimization strategy for selecting the initial population and determining the optimal weights, with the aim of minimizing the neural network error. Teaching-learning-based optimization (TLBO) requires fewer parameters than other evolutionary algorithms, so it is easier to implement. We have improved this algorithm to increase its efficiency and to balance global and local search: the improved teaching-learning-based optimization (ITLBO) algorithm adds the concept of a neighborhood to the basic algorithm, which strengthens its global search ability. Using an initial population that includes the best cluster centers obtained with the modified k-means algorithm also helps the algorithm reach the global optimum. The results are promising, close to optimal, and better than those of the other approaches against which we compared the proposed algorithm.
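The basic TLBO teacher phase that ITLBO builds on can be sketched briefly: each learner moves toward the best solution (the teacher), scaled by a teaching factor TF in {1, 2}, and an improvement is kept greedily. This is a minimal sketch of standard TLBO, not the paper's ITLBO neighborhood variant:

```python
import numpy as np

def tlbo_teacher_phase(pop, fitness, f, rng=None):
    """One TLBO teacher phase (minimization).

    Each learner i is moved by r * (teacher - TF * mean) and the move is
    accepted only if it improves fitness (greedy selection). Sketch of the
    basic algorithm; the paper's ITLBO adds a neighborhood concept on top.
    """
    rng = np.random.default_rng(rng)
    teacher = pop[np.argmin(fitness)]        # best current solution
    mean = pop.mean(axis=0)                  # class mean
    for i in range(len(pop)):
        tf = rng.integers(1, 3)              # teaching factor: 1 or 2
        r = rng.random(pop.shape[1])
        cand = pop[i] + r * (teacher - tf * mean)
        fc = f(cand)
        if fc < fitness[i]:                  # greedy acceptance
            pop[i], fitness[i] = cand, fc
    return pop, fitness
```

Because acceptance is greedy, no learner's fitness can worsen within a phase, which is what makes a well-chosen initial population (here, k-means cluster centers) pay off throughout the run.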
Modeling Financial Time Series with Artificial Neural Networks
Financial time series convey the decisions and actions of a population of human actors over time. Econometric and regressive models have been developed over the past decades for analyzing these time series. More recently, biologically inspired artificial neural network models have been shown to overcome some of the main challenges of traditional techniques by better exploiting the non-linear, non-stationary, and oscillatory nature of noisy, chaotic human interactions. This review paper explores the options, benefits, and weaknesses of the various forms of artificial neural networks as compared with regression techniques in the field of financial time series analysis.
CELEST, a National Science Foundation Science of Learning Center (SBE-0354378); SyNAPSE program of the Defense Advanced Research Projects Agency (HR001109-03-0001)
An optimized deep learning model for optical character recognition applications
Convolutional neural networks (CNN) are among the most widely used neural networks in various applications, including deep learning. In recent years, the continuing extension of CNN into increasingly complicated domains has made its training process more difficult, so researchers have adopted optimized hybrid algorithms to address this problem. In this work, a novel chaotic black hole algorithm-based approach was created for training CNN, optimizing its performance by avoiding entrapment in local minima. The logistic chaotic map was used to initialize the population instead of the uniform distribution. The proposed training algorithm was developed on a specific benchmark problem for optical character recognition applications, and the method was evaluated in terms of computational accuracy, convergence analysis, and cost.
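The chaotic initialization mentioned above replaces uniform sampling with iterates of the logistic map x_{n+1} = mu * x_n * (1 − x_n), which for mu = 4 is fully chaotic on (0, 1). A minimal sketch, with the function name, seed value, and linear scaling to [lb, ub] as illustrative assumptions:

```python
import numpy as np

def logistic_chaotic_init(pop_size, dim, lb, ub, mu=4.0, seed=0.7):
    """Population initialization from the logistic chaotic map.

    Iterates x <- mu * x * (1 - x) to fill the population, then scales
    the (0, 1) values linearly to [lb, ub]. The seed must lie in (0, 1)
    and avoid fixed points such as 0.5 or 0.75. (Illustrative sketch.)
    """
    vals = np.empty(pop_size * dim)
    x = seed
    for i in range(vals.size):
        x = mu * x * (1.0 - x)               # logistic map iterate
        vals[i] = x
    return lb + (ub - lb) * vals.reshape(pop_size, dim)
```

Compared with uniform random draws, chaotic sequences are deterministic yet non-repeating and tend to cover the search space without clumping, which is the usual motivation for this substitution.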
A hybrid Grey Wolf optimizer with multi-population differential evolution for global optimization problems
The field of optimization concerns solving optimization problems using optimization algorithms; studying this research field therefore requires studying both the problems and the algorithms. In this paper, a hybrid optimization algorithm based on differential evolution (DE) and the grey wolf optimizer (GWO) is proposed. The proposed algorithm, called “MDE-GWONM”, improves on the original versions in terms of the balance between exploration and exploitation. The results of implementing MDE-GWONM on nine benchmark test functions show that its performance is superior to that of other state-of-the-art optimization algorithms.
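The GWO half of such a hybrid is driven by one characteristic update: every wolf moves toward the average of three guesses built from the alpha, beta, and delta wolves (the three fittest), with a control parameter `a` decaying from 2 to 0 over the run. Below is a sketch of a single standalone GWO step under those standard equations; it does not include the multi-population DE part of MDE-GWONM:

```python
import numpy as np

def gwo_step(pop, fitness, a, rng=None):
    """One Grey Wolf Optimizer position update (minimization).

    For each wolf x and each leader L in {alpha, beta, delta}:
        A = 2*a*r1 - a,  C = 2*r2,  D = |C*L - x|,  guess = L - A*D
    The new position is the mean of the three guesses. Sketch of plain
    GWO only; the paper's MDE-GWONM additionally runs DE subpopulations.
    """
    rng = np.random.default_rng(rng)
    order = np.argsort(fitness)
    leaders = pop[order[:3]]                 # alpha, beta, delta
    new_pop = np.empty_like(pop)
    for i, x in enumerate(pop):
        guesses = []
        for leader in leaders:
            A = 2 * a * rng.random(x.shape) - a
            C = 2 * rng.random(x.shape)
            D = np.abs(C * leader - x)       # encircling distance
            guesses.append(leader - A * D)
        new_pop[i] = np.mean(guesses, axis=0)
    return new_pop
```

Large `a` early in the run makes |A| > 1 likely, pushing wolves away from the leaders (exploration); as `a` shrinks, the pack converges on them (exploitation), which is exactly the balance the hybrid aims to tune.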
Current Studies and Applications of Krill Herd and Gravitational Search Algorithms in Healthcare
Nature-Inspired Computing, or NIC for short, is a relatively young field that
tries to discover fresh methods of computing by studying how natural
phenomena function in order to solve complicated problems in many contexts. As
a consequence, ground-breaking research has been conducted in a variety of
domains, including artificial immune systems, neural networks, swarm
intelligence, and evolutionary computing. NIC techniques are used in the
domains of biology, physics, engineering, economics, and management.
Meta-heuristic algorithms are successful, efficient, and resilient in
real-world classification, optimization, forecasting, and clustering, as well
as in engineering and science problems. Two active NIC paradigms are the
gravitational search algorithm and the krill herd algorithm. This publication
gives a worldwide and historical review of the use of the Krill Herd Algorithm
(KH) and the Gravitational Search Algorithm (GSA) in medicine and healthcare.
Comprehensive surveys have been conducted on several nature-inspired
algorithms, including KH and GSA, and the various versions of the KH and GSA
algorithms and their applications in healthcare are thoroughly reviewed in the
present article. Nonetheless, no survey research on KH and GSA specifically in
the healthcare field had previously been undertaken. As a result, this work
conducts a thorough review of KH and GSA to assist researchers in applying
them in diverse domains or hybridizing them with other popular algorithms. It
also provides an in-depth examination of KH and GSA in terms of application,
modification, and hybridization. The goal of the study is to offer a viewpoint
on GSA and KH, particularly for academics interested in investigating the
capabilities and performance of these algorithms in the healthcare and medical
domains.
Comment: 35 pages