Firefly Algorithm: Recent Advances and Applications
Nature-inspired metaheuristic algorithms, especially those based on swarm
intelligence, have attracted much attention in the last ten years. The firefly
algorithm appeared about five years ago, and its literature has since expanded
dramatically with diverse applications. In this paper, we will briefly review
the fundamentals of firefly algorithm together with a selection of recent
publications. Then, we discuss the optimality associated with balancing
exploration and exploitation, which is essential for all metaheuristic
algorithms. By comparing with the intermittent search strategy, we conclude that
metaheuristics such as the firefly algorithm are better than the optimal
intermittent search strategy. We also analyse the algorithms and their implications
for higher-dimensional optimization problems.
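For readers unfamiliar with the fundamentals mentioned above, the sketch below shows the firefly update rule as it is usually presented (attractiveness decaying as exp(-gamma r^2) plus a random step that supplies exploration); the objective, bounds, and parameter values are illustrative defaults, not taken from the paper.

```python
import numpy as np

def firefly_minimize(f, dim, n_fireflies=25, n_iter=100,
                     beta0=1.0, gamma=1.0, alpha=0.2, bounds=(-5.0, 5.0)):
    """Minimal sketch of the standard firefly algorithm.

    Brightness is taken as -f(x); each firefly moves toward every brighter
    one with attractiveness beta0 * exp(-gamma * r^2), plus a random step
    alpha * eps that provides the exploration component discussed above.
    """
    rng = np.random.default_rng(0)
    lo, hi = bounds
    X = rng.uniform(lo, hi, size=(n_fireflies, dim))
    fitness = np.array([f(x) for x in X])

    for _ in range(n_iter):
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if fitness[j] < fitness[i]:          # firefly j is brighter
                    r2 = np.sum((X[i] - X[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)
                    X[i] += beta * (X[j] - X[i]) + alpha * rng.normal(size=dim)
                    X[i] = np.clip(X[i], lo, hi)
                    fitness[i] = f(X[i])
        alpha *= 0.97                                # gradually damp the random walk
    best = np.argmin(fitness)
    return X[best], fitness[best]

# Usage: minimize a simple sphere function in 5 dimensions.
x_best, f_best = firefly_minimize(lambda x: np.sum(x ** 2), dim=5)
```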
A Metaheuristic Adaptive Cubature Based Algorithm to Find Bayesian Optimal Designs for Nonlinear Models
Finding Bayesian optimal designs for nonlinear models is a difficult task because the optimality criterion typically requires us to evaluate complex integrals before we perform a constrained optimization. We propose a hybridized method where we combine an adaptive multidimensional integration algorithm and a metaheuristic algorithm called imperialist competitive algorithm to find Bayesian optimal designs. We apply our numerical method to a few challenging design problems to demonstrate its efficiency. They include finding D-optimal designs for an item response model commonly used in education, Bayesian optimal designs for survival models, and Bayesian optimal designs for a four-parameter sigmoid Emax dose-response model. Supplementary materials for this article are available online and they contain an R package for implementing the proposed algorithm and codes for reproducing all the results in this paper.
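The abstract above describes combining numerical integration of the design criterion with a metaheuristic search. The sketch below illustrates that general structure under stated assumptions: a simple two-parameter logistic model, plain Monte Carlo over prior draws in place of the paper's adaptive cubature, and SciPy's differential evolution standing in for the imperialist competitive algorithm; none of these choices are taken from the paper or its R package.

```python
import numpy as np
from scipy.optimize import differential_evolution

def info_matrix(design_pts, weights, theta):
    """Fisher information of a two-parameter logistic model at parameters theta.

    The model P(y=1|x) = 1/(1+exp(-(a + b*x))) is only an illustrative
    stand-in for the nonlinear models treated in the paper.
    """
    a, b = theta
    M = np.zeros((2, 2))
    for x, w in zip(design_pts, weights):
        p = 1.0 / (1.0 + np.exp(-(a + b * x)))
        g = np.array([1.0, x])                 # gradient of the linear predictor
        M += w * p * (1.0 - p) * np.outer(g, g)
    return M

def bayesian_d_criterion(design, theta_draws):
    """Monte Carlo approximation of E_prior[log det M(design, theta)].

    The paper uses adaptive multidimensional cubature for this expectation;
    plain Monte Carlo over prior draws is used only to keep the sketch short.
    """
    k = len(design) // 2
    pts, w = design[:k], np.abs(design[k:])
    w = w / w.sum()                            # normalise design weights
    vals = [np.linalg.slogdet(info_matrix(pts, w, th))[1] for th in theta_draws]
    return np.mean(vals)

rng = np.random.default_rng(1)
prior_draws = rng.normal(loc=[0.0, 1.0], scale=[0.5, 0.3], size=(200, 2))

# Maximize the criterion; differential evolution stands in here for the
# imperialist competitive algorithm used in the paper.
k = 3                                          # number of support points
res = differential_evolution(
    lambda d: -bayesian_d_criterion(d, prior_draws),
    bounds=[(-4, 4)] * k + [(0.05, 1.0)] * k,
    seed=1, maxiter=50)
print(res.x[:k], res.x[k:] / res.x[k:].sum())
```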
Index Information Algorithm with Local Tuning for Solving Multidimensional Global Optimization Problems with Multiextremal Constraints
Multidimensional optimization problems where the objective function and the
constraints are multiextremal non-differentiable Lipschitz functions (with
unknown Lipschitz constants) and the feasible region is a finite collection of
robust nonconvex subregions are considered. Both the objective function and the
constraints may be partially defined. To solve such problems, an algorithm is
proposed that uses Peano space-filling curves and the index scheme to reduce
the original problem to a one-dimensional Hölder one. Local tuning on the
behaviour of the objective function and constraints is used during the work of
the global optimization procedure in order to accelerate the search. The method
neither uses penalty coefficients nor additional variables. Convergence
conditions are established. Numerical experiments confirm the good performance
of the technique.
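The sketch below illustrates the index scheme referred to above, assuming the usual formulation in which constraints are checked one at a time and the objective is evaluated only at feasible points; the Peano-curve reduction to one dimension is assumed to have been applied already and is not shown.

```python
def index_evaluation(y, constraints, objective):
    """Index scheme sketch: evaluate constraints g_1..g_m one at a time.

    Returns (nu, z), where nu is the index of the first violated constraint
    (1-based) and z its value, or nu = m + 1 with z = objective(y) when the
    point is feasible.  Later constraints and the objective are never touched
    once a violation is found, which is what allows partially defined
    problems to be handled without penalty coefficients.
    """
    for nu, g in enumerate(constraints, start=1):
        z = g(y)
        if z > 0:                      # first violated constraint stops evaluation
            return nu, z
    return len(constraints) + 1, objective(y)

# Usage on a one-dimensional reduced problem (illustrative constraints only).
constraints = [lambda y: 0.25 - y,            # requires y >= 0.25
               lambda y: y - 0.9]             # requires y <= 0.9
objective = lambda y: (y - 0.5) ** 2
print(index_evaluation(0.1, constraints, objective))   # violates constraint 1
print(index_evaluation(0.6, constraints, objective))   # feasible point
```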
Deterministic global optimization using space-filling curves and multiple estimates of Lipschitz and Holder constants
In this paper, the global optimization problem $\min_{y \in S} F(y)$ with $S$
being a hyperinterval in $\mathbb{R}^N$ and $F(y)$ satisfying the Lipschitz condition
with an unknown Lipschitz constant is considered. It is supposed that the
function can be multiextremal, non-differentiable, and given as a
`black-box'. To attack the problem, a new global optimization algorithm based
on the following two ideas is proposed and studied both theoretically and
numerically. First, the new algorithm uses numerical approximations to
space-filling curves to reduce the original Lipschitz multi-dimensional problem
to a univariate one satisfying the Hölder condition. Second, the algorithm
at each iteration applies a new geometric technique working with a number of
possible Hölder constants chosen from a set of values varying from zero to
infinity, so that ideas introduced in the popular DIRECT method can be
used in the Hölder global optimization. Convergence conditions of the
resulting deterministic global optimization method are established. Numerical
experiments carried out on several hundred test functions show quite a
promising performance of the new algorithm in comparison with its direct
competitors.
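The sketch below illustrates the second idea under simplifying assumptions: the Hölder lower bound on a subinterval of the reduced one-dimensional problem, and a DIRECT-style selection that sweeps a finite grid of Hölder constants in place of "all values from zero to infinity". The interval values and the grid are illustrative only, not taken from the paper.

```python
import numpy as np

def holder_lower_bound(f_center, half_len, H, N):
    """Hölder lower bound on a subinterval of the reduced 1-D problem:
    f(c) - H * (half interval length)^(1/N), with N the original dimension."""
    return f_center - H * half_len ** (1.0 / N)

def select_intervals(centers_f, half_lens, N, H_grid):
    """DIRECT-like selection sketch: instead of fixing one Hölder constant,
    sweep a grid of constants and keep every interval that gives the smallest
    lower bound for at least one constant.  These intervals are the candidates
    for the next subdivision."""
    selected = set()
    for H in H_grid:
        bounds = [holder_lower_bound(fc, hl, H, N)
                  for fc, hl in zip(centers_f, half_lens)]
        selected.add(int(np.argmin(bounds)))
    return sorted(selected)

# Usage: three intervals on the reduced axis, original dimension N = 2.
centers_f = [1.3, 0.9, 1.05]         # objective values at interval centres
half_lens = [0.25, 0.05, 0.2]        # half lengths of the subintervals
print(select_intervals(centers_f, half_lens, N=2,
                       H_grid=np.logspace(-3, 3, 13)))
```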
Community structure in directed networks
We consider the problem of finding communities or modules in directed
networks. The most common approach to this problem in the previous literature
has been simply to ignore edge direction and apply methods developed for
community discovery in undirected networks, but this approach discards
potentially useful information contained in the edge directions. Here we show
how the widely used benefit function known as modularity can be generalized in
a principled fashion to incorporate the information contained in edge
directions. This in turn allows us to find communities by maximizing the
modularity over possible divisions of a network, which we do using an algorithm
based on the eigenvectors of the corresponding modularity matrix. This method
is shown to give demonstrably better results than previous methods on a variety
of test networks, both real and computer-generated.
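A minimal sketch of the directed generalization of modularity described above, assuming the convention that A[i, j] = 1 when there is an edge from vertex j to vertex i; only the evaluation of Q for a given division is shown, not the eigenvector-based maximization step.

```python
import numpy as np

def directed_modularity(A, communities):
    """Directed modularity sketch:
    Q = (1/m) * sum_ij [A_ij - k_i_in * k_j_out / m] * delta(c_i, c_j),
    where m is the total number of edges and, unlike the undirected case,
    edges are not counted twice."""
    A = np.asarray(A, dtype=float)
    m = A.sum()                                   # total number of edges
    k_in = A.sum(axis=1)                          # in-degrees  (row sums)
    k_out = A.sum(axis=0)                         # out-degrees (column sums)
    B = A - np.outer(k_in, k_out) / m             # directed modularity matrix
    same = np.equal.outer(communities, communities)
    return (B * same).sum() / m

# Usage: two 3-node directed cycles joined by a single edge.
A = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (2, 0), (3, 4), (4, 5), (5, 3), (0, 3)]:
    A[j, i] = 1                                   # edge i -> j stored as A[j, i]
print(directed_modularity(A, communities=[0, 0, 0, 1, 1, 1]))
```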