The Quasi-Newton Method for the Composite Multiobjective Optimization Problems
In this paper, we introduce several new quasi-Newton methods for composite multiobjective optimization problems (in short, CMOP) with Armijo line search. These multiobjective versions of quasi-Newton methods include the BFGS quasi-Newton method, the self-scaling BFGS quasi-Newton method, and the Huang BFGS quasi-Newton method. Under suitable conditions, we show that each accumulation point of the sequence generated by these algorithms, if it exists, is both a Pareto stationary point and a Pareto optimal point of (CMOP).
Comment: 16 pages. arXiv admin note: text overlap with arXiv:2108.0012
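The abstract does not reproduce the update formulas. As a rough illustration only (not the authors' algorithm), a standard BFGS Hessian-approximation update can be paired with a backtracking rule that accepts a step only when the Armijo condition holds for every objective; the helper names below are hypothetical:

```python
import numpy as np

def bfgs_update(B, s, y):
    """Standard BFGS update of the Hessian approximation B,
    with s = x_new - x_old and y = grad_new - grad_old."""
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)

def armijo_all_objectives(fs, grads, x, d, c=1e-4, beta=0.5, t=1.0):
    """Backtracking step accepted only when *every* objective satisfies
    the Armijo sufficient-decrease condition (fs/grads are callables)."""
    while not all(f(x + t * d) <= f(x) + c * t * (g(x) @ d)
                  for f, g in zip(fs, grads)):
        t *= beta
    return t
```

The update preserves the secant condition B_{k+1} s = y; a self-scaling variant would rescale B before applying the same formula.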
Metaheuristic design of feedforward neural networks: a review of two decades of research
Over the past two decades, feedforward neural network (FNN) optimization has been a key interest among researchers and practitioners across multiple disciplines. FNN optimization is often viewed from various perspectives: the optimization of weights, network architecture, activation nodes, learning parameters, learning environment, etc. Researchers have adopted such different viewpoints mainly to improve the FNN's generalization ability. Gradient-descent algorithms such as backpropagation have been widely applied to optimize FNNs, and their success is evident from the FNN's application to numerous real-world problems. However, due to the limitations of gradient-based optimization methods, metaheuristic algorithms, including evolutionary algorithms, swarm intelligence, etc., are still being widely explored by researchers aiming to obtain a generalized FNN for a given problem. This article attempts to summarize a broad spectrum of FNN optimization methodologies, including conventional and metaheuristic approaches. It also tries to connect various research directions that have emerged from FNN optimization practices, such as evolving neural networks (NNs), cooperative coevolution NNs, complex-valued NNs, deep learning, extreme learning machines, quantum NNs, etc. Additionally, it provides interesting research challenges for future research to cope with the present information processing era.
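As a toy illustration of the metaheuristic viewpoint (not an example from the article), a (1+1) evolution strategy can train the weights of a tiny FNN with no gradients at all; the network and task here are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny 2-2-1 feedforward network for XOR; all weights in one flat vector.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

def forward(w, X):
    W1, b1 = w[:4].reshape(2, 2), w[4:6]
    W2, b2 = w[6:8], w[8]
    return np.tanh(X @ W1 + b1) @ W2 + b2

def loss(w):
    return np.mean((forward(w, X) - y) ** 2)

# (1+1) evolution strategy: mutate, keep the child only if it improves.
w = rng.standard_normal(9)
best = init = loss(w)
for _ in range(2000):
    cand = w + 0.3 * rng.standard_normal(9)
    if (c := loss(cand)) < best:
        w, best = cand, c
```

Population-based variants (genetic algorithms, particle swarms) follow the same select-by-fitness pattern but maintain many candidate weight vectors at once.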
Comparison of Geometric Optimization Methods with Multiobjective Genetic Algorithms for Solving Integrated Optimal Design Problems
In this paper, system design methodologies for optimizing heterogeneous power devices in electrical engineering are investigated. The concept of Integrated Optimal Design (IOD) is presented and a simplified but typical example is given. It consists of finding Pareto-optimal configurations for the motor drive of an electric vehicle. For that purpose, a geometric optimization method (i.e., the Hooke and Jeeves minimization procedure) associated with a weighted objective sum and a multiobjective genetic algorithm (i.e., the NSGA-II) are compared. Several performance issues are discussed, such as the accuracy in determining Pareto-optimal configurations and the capability to spread these solutions well across the objective space.
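For concreteness, a minimal Hooke and Jeeves pattern search applied to a weighted sum of two convex objectives (a simplified sketch with made-up objectives, not the paper's motor-drive model) looks like:

```python
import numpy as np

def hooke_jeeves(f, x0, step=0.5, eps=1e-6, shrink=0.5):
    """Minimal Hooke-Jeeves pattern search: exploratory coordinate moves
    plus a pattern jump; the step shrinks when no move improves f."""
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    while step > eps:
        # Exploratory move around the current base point.
        trial, ftrial = x.copy(), fx
        for i in range(len(x)):
            for delta in (step, -step):
                cand = trial.copy()
                cand[i] += delta
                if (fc := f(cand)) < ftrial:
                    trial, ftrial = cand, fc
                    break
        if ftrial < fx:
            # Pattern move: jump further in the improving direction.
            pattern = trial + (trial - x)
            x, fx = trial, ftrial
            if (fp := f(pattern)) < fx:
                x, fx = pattern, fp
        else:
            step *= shrink
    return x, fx

# Weighted-sum scalarization of two objectives yields one Pareto point
# per weight; sweeping w in (0, 1) traces an approximate front.
f1 = lambda x: (x[0] - 1) ** 2 + x[1] ** 2
f2 = lambda x: x[0] ** 2 + (x[1] - 1) ** 2
w = 0.5
x_star, _ = hooke_jeeves(lambda x: w * f1(x) + (1 - w) * f2(x), [0.0, 0.0])
```

The known weakness the paper's comparison probes is visible here: the weighted sum needs one full search per weight and cannot reach non-convex parts of a Pareto front, whereas NSGA-II evolves the whole front in a single population.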
A dynamic gradient approach to Pareto optimization with nonsmooth convex objective functions
In a general Hilbert framework, we consider continuous gradient-like dynamical systems for constrained multiobjective optimization involving nonsmooth convex objective functions. Our approach follows a previous work that considered the case of convex differentiable objective functions. Based on the Yosida regularization of the subdifferential operators involved in the system, we obtain the existence of strong global trajectories. We prove a descent property for each objective function and the convergence of trajectories to weak Pareto minima. This approach provides a dynamical endogenous weighting of the objective functions. Applications are given to cooperative games, inverse problems, and numerical multiobjective optimization.
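For reference, the standard Moreau–Yosida constructions behind this regularization (textbook definitions for a convex function f on a Hilbert space, not reproduced from the paper) are:

```latex
f_\lambda(x) \;=\; \min_{u \in \mathcal{H}} \Big\{ f(u) + \tfrac{1}{2\lambda}\,\|x - u\|^2 \Big\},
\qquad
\nabla f_\lambda(x) \;=\; \frac{x - \operatorname{prox}_{\lambda f}(x)}{\lambda},
```

where \(\operatorname{prox}_{\lambda f}\) is the proximal mapping. The Yosida approximation \(\nabla f_\lambda\) of the subdifferential \(\partial f\) is single-valued and Lipschitz continuous with constant \(1/\lambda\), which is what makes the regularized dynamical system well posed even when the original objectives are nonsmooth.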
Two nonmonotone multiobjective memory gradient algorithms
In this paper, two types of nonmonotone memory gradient algorithms for solving unconstrained multiobjective optimization problems are introduced. Under suitable conditions, we show the convergence of the full sequence generated by the proposed algorithms to a weak Pareto optimal point.
Barzilai-Borwein Descent Methods for Multiobjective Optimization Problems with Variable Trade-off Metrics
The imbalances and conditioning of the objective functions influence the
performance of first-order methods for multiobjective optimization problems
(MOPs). The latter is related to the metric selected in the direction-finding
subproblems. Unlike single-objective optimization problems, capturing the
curvature of all objective functions with a single Hessian matrix is
impossible. On the other hand, second-order methods for MOPs use different
metrics for objectives in direction-finding subproblems, leading to a high
per-iteration cost. To balance per-iteration cost and better curvature
exploration, we propose a Barzilai-Borwein descent method with variable metrics
(BBDMO_VM). In the direction-finding subproblems, we employ a variable metric
to explore the curvature of all objectives. Subsequently, Barzilai-Borwein's
method relative to the variable metric is applied to tune objectives, which
mitigates the effect of imbalances. We investigate the convergence behaviour of
the BBDMO_VM, confirming fast linear convergence for well-conditioned problems
relative to the variable metric. In particular, we establish linear convergence
for problems that involve some linear objectives. These convergence results
emphasize the importance of metric selection, motivating us to approximate the
trade-off of Hessian matrices to better capture the geometry of the problem.
Comparative numerical results confirm the efficiency of the proposed method,
even when applied to large-scale and ill-conditioned problems.
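The abstract does not spell out the step-size formulas. The classical single-objective Barzilai–Borwein rules that BBDMO_VM builds on can be sketched as follows (illustrative only; the paper's method replaces the Euclidean inner products with a variable metric and handles several objectives):

```python
import numpy as np

def bb_step(s, y, which="bb1"):
    """Classical Barzilai-Borwein step sizes, with s = x_k - x_{k-1}
    and y = g_k - g_{k-1} (gradient differences)."""
    return (s @ s) / (s @ y) if which == "bb1" else (s @ y) / (y @ y)

# BB1 gradient descent on a badly scaled quadratic f(x) = 0.5 x^T A x.
A = np.diag([1.0, 100.0])            # condition number 100
grad = lambda x: A @ x
x_prev = np.array([1.0, 1.0])
x = x_prev - 0.01 * grad(x_prev)     # one plain gradient step to initialize
for _ in range(50):
    g = grad(x)
    if np.linalg.norm(g) < 1e-10:    # converged; avoid a 0/0 step
        break
    s, y = x - x_prev, g - grad(x_prev)
    x_prev, x = x, x - bb_step(s, y) * g
```

Both rules solve a one-dimensional secant problem, so each iteration costs one gradient evaluation yet captures curvature information, which is the per-iteration-cost advantage the abstract contrasts with full second-order methods.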
Variable Metric Method for Unconstrained Multiobjective Optimization Problems
In this paper, we propose a variable metric method for unconstrained
multiobjective optimization problems (MOPs). First, a sequence of points is
generated using different positive definite matrices in the generic framework.
It is proved that accumulation points of the sequence are Pareto critical
points. Then, without convexity assumption, strong convergence is established
for the proposed method. Moreover, we use a common matrix to approximate the
Hessian matrices of all objective functions, along which a new nonmonotone
line search technique is proposed to achieve a local superlinear convergence
rate. Finally, several numerical results demonstrate the effectiveness of the
proposed method.