Convergence rate of the (1+1)-evolution strategy on locally strongly convex functions with Lipschitz continuous gradient and their monotonic transformations
Authors
Youhei Akimoto
Kazuto Fukuchi
Daiki Morinaga
Jun Sakuma
Publication date
24 April 2023
Publisher
arXiv
Abstract
Evolution strategy (ES) is one of the promising classes of algorithms for black-box continuous optimization. Despite its broad success in applications, theoretical analysis of its convergence speed has been limited to convex quadratic functions and their monotonic transformations. In this study, an upper bound and a lower bound on the rate of linear convergence of the (1+1)-ES on locally $L$-strongly convex functions with $U$-Lipschitz continuous gradient are derived as $\exp\left(-\Omega_{d\to\infty}\left(\frac{L}{d\cdot U}\right)\right)$ and $\exp\left(-\frac{1}{d}\right)$, respectively. Notably, no prior knowledge of the mathematical properties of the objective function, such as the Lipschitz constant, is given to the algorithm, whereas the existing analyses of derivative-free optimization algorithms require it.

Comment: 15 pages
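For readers unfamiliar with the algorithm the abstract analyzes, the following is a minimal sketch of a (1+1)-ES, assuming the classic 1/5-success rule for step-size adaptation (the paper's analyzed variant and its constants may differ), applied to a strongly convex quadratic test function chosen here purely for illustration:

```python
import numpy as np

def one_plus_one_es(f, x0, sigma0=1.0, iters=2000, seed=0):
    """(1+1)-ES: sample one Gaussian offspring per iteration, keep it only
    if it improves (elitist selection), and adapt the step size sigma with
    the 1/5-success rule (an assumption; the paper may use another rule)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    sigma, d = sigma0, x.size
    for _ in range(iters):
        y = x + sigma * rng.standard_normal(d)  # Gaussian mutation
        fy = f(y)
        if fy <= fx:                      # success: accept the offspring
            x, fx = y, fy
            sigma *= np.exp(0.8 / d)      # expand step size on success
        else:
            sigma *= np.exp(-0.2 / d)     # shrink on failure (4:1 ratio
                                          # targets a 1/5 success rate)
    return x, fx

# Illustrative objective: 0.5*(x1^2 + 4*x2^2), strongly convex with
# L = 1 and gradient Lipschitz constant U = 4.
f = lambda x: 0.5 * (x[0]**2 + 4.0 * x[1]**2)
x_best, f_best = one_plus_one_es(f, x0=[3.0, -2.0])
```

Since the algorithm compares only function values, any strictly monotonic transformation of `f` leaves its trajectory unchanged, which is why the paper's bounds extend to such transformations.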
oai:arXiv.org:2209.12467
Last updated on 20/11/2022