The role of Walsh structure and ordinal linkage in the optimisation of pseudo-Boolean functions under monotonicity invariance.
Optimisation heuristics rely on implicit or explicit assumptions about the structure of the black-box fitness function they optimise. A review of the literature shows that an understanding of structure and linkage is helpful to the design and analysis of heuristics. The aim of this thesis is to investigate the role that problem structure plays in heuristic optimisation. Many heuristics use ordinal operators, which are invariant under monotonic transformations of the fitness function. In this thesis we develop a classification of pseudo-Boolean functions based on rank-invariance. This approach classifies functions which are monotonic transformations of one another as equivalent, and so partitions an infinite set of functions into a finite set of classes. Reasoning about heuristics composed of ordinal operators is, by construction, invariant over these classes. We perform a complete analysis of 2-bit and 3-bit pseudo-Boolean functions. We use Walsh analysis to define concepts of necessary, unnecessary, and conditionally necessary interactions, and of Walsh families. This helps to make precise some existing ideas in the literature, such as benign interactions. Many algorithms are invariant under the classes we define, which allows us to examine the difficulty of pseudo-Boolean functions in terms of function classes. We analyse a range of ordinal selection operators for an estimation of distribution algorithm (EDA). Using a concept of directed ordinal linkage, we define precedence networks and precedence profiles to represent key algorithmic steps and their interdependency in terms of problem structure. The precedence profiles provide a measure of problem difficulty that links function structure to the algorithmic steps needed for optimisation. This work develops insight into the relationship between function structure and problem difficulty for optimisation, which may be used to direct the development of novel algorithms. Concepts of structure are also used to construct easy and hard problems for a hill-climber.
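To make the Walsh-analysis and rank-invariance ideas above concrete, here is a small illustrative sketch (not taken from the thesis): it computes the Walsh coefficients of a pseudo-Boolean function on {0,1}^n and checks whether two functions are rank-equivalent, i.e. monotonic transformations of one another.

```python
# Illustrative sketch (not the thesis code): Walsh coefficients of a
# pseudo-Boolean function, plus a brute-force check of rank-equivalence,
# i.e. whether two functions induce the same ordering of bit strings.
from itertools import product

def walsh_coefficients(f, n):
    """Return {a: w_a} with w_a = 2^-n * sum_x f(x) * (-1)^(a . x)."""
    xs = list(product((0, 1), repeat=n))
    coeffs = {}
    for a in xs:
        total = sum(f(x) * (-1) ** sum(ai * xi for ai, xi in zip(a, x)) for x in xs)
        coeffs[a] = total / 2 ** n
    return coeffs

def rank_equivalent(f, g, n):
    """True if f and g induce the same weak order on {0,1}^n."""
    xs = list(product((0, 1), repeat=n))
    return all((f(x) < f(y)) == (g(x) < g(y)) for x in xs for y in xs)

# g = 2*f + 1 is a monotonic transformation of f, so the two functions fall in
# the same rank-invariance class even though their Walsh coefficients differ.
f = lambda x: x[0] + x[1] - 2 * x[0] * x[1]   # 2-bit XOR-like function
g = lambda x: 2 * f(x) + 1
print(walsh_coefficients(f, 2))
print(rank_equivalent(f, g, 2))               # True
```

The point of the toy example is exactly the distinction the classification exploits: functions with different Walsh coefficients can still be equivalent for any heuristic built from ordinal operators.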
Using Recurrent Neural Networks to Optimize Dynamical Decoupling for Quantum Memory
We utilize machine learning models based on recurrent neural networks to optimize dynamical decoupling (DD) sequences. DD is a relatively simple technique for suppressing errors in quantum memory for certain noise models. In numerical simulations, we show that with minimal use of prior knowledge and starting from random sequences, the models are able to improve over time and eventually output DD sequences that perform better than the well-known DD families. Furthermore, our algorithm is easy to implement in experiments to find solutions tailored to the specific hardware, as it treats the figure of merit as a black box.
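The core loop described here, an RNN proposing pulse sequences while a black-box figure of merit scores them, can be sketched as follows. This is a hedged illustration under simple assumptions, not the authors' implementation: toy_figure_of_merit is a hypothetical stand-in for a simulated memory fidelity, and a plain REINFORCE update stands in for whatever training procedure the paper actually uses.

```python
# Minimal sketch (assumptions, not the paper's code): a GRU policy samples
# dynamical-decoupling pulse sequences, a black-box figure of merit scores
# them, and a REINFORCE-style update improves the policy over time.
import torch
import torch.nn as nn

PULSES = ["I", "X", "Y", "Z"]          # single-qubit pulse alphabet
SEQ_LEN, HIDDEN = 8, 32

def toy_figure_of_merit(seq):
    # Hypothetical stand-in reward: favours sequences that alternate X and Y
    # (reminiscent of the XY4 family); a real study would call a noise simulator.
    target = (["X", "Y"] * SEQ_LEN)[:SEQ_LEN]
    return sum(p == t for p, t in zip(seq, target)) / SEQ_LEN

class PulsePolicy(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(len(PULSES), HIDDEN)
        self.rnn = nn.GRU(HIDDEN, HIDDEN, batch_first=True)
        self.head = nn.Linear(HIDDEN, len(PULSES))

    def sample(self):
        """Autoregressively sample one sequence; return pulses and log-prob."""
        tok = torch.zeros(1, 1, dtype=torch.long)   # start from pulse "I"
        h, logp, seq = None, 0.0, []
        for _ in range(SEQ_LEN):
            out, h = self.rnn(self.embed(tok), h)
            dist = torch.distributions.Categorical(logits=self.head(out[:, -1]))
            choice = dist.sample()
            logp = logp + dist.log_prob(choice)
            tok = choice.unsqueeze(0)
            seq.append(PULSES[choice.item()])
        return seq, logp

policy = PulsePolicy()
opt = torch.optim.Adam(policy.parameters(), lr=1e-2)
baseline = 0.0
for step in range(200):
    seq, logp = policy.sample()
    reward = toy_figure_of_merit(seq)              # black-box evaluation
    baseline = 0.9 * baseline + 0.1 * reward       # running-mean baseline
    loss = -(reward - baseline) * logp             # REINFORCE objective
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Because the reward is only ever queried, not differentiated, the same loop applies whether the figure of merit comes from a simulator or from hardware measurements.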
Consensus theories: an oriented survey
This article surveys seven directions of consensus theories: Arrowian results, federation consensus rules, metric consensus rules, tournament solutions, restricted domains, abstract consensus theories, and algorithmic and complexity issues. This survey is oriented in the sense that it is mainly – but not exclusively – concentrated on the most significant results obtained, sometimes with other researchers, by a team of French researchers who are or were full or associate members of the Centre d'Analyse et de Mathématique Sociale (CAMS).
Keywords: consensus theories; Arrowian results; aggregation rules; metric consensus rules; median; tournament solutions; restricted domains; lower valuations; median semilattice; complexity
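As a concrete instance of one of the listed directions, metric consensus rules, the following sketch (not from the survey) computes a median (Kemeny) ranking by brute force, minimising the total Kendall tau distance to a profile of linear orders.

```python
# Illustrative sketch of a metric consensus rule: the median (Kemeny) ranking
# minimises the summed Kendall tau distance to a profile of linear orders.
# Brute force over permutations, so only viable for a handful of items.
from itertools import permutations

def kendall_tau(r1, r2):
    """Number of item pairs ordered differently by the two rankings."""
    pos1 = {x: i for i, x in enumerate(r1)}
    pos2 = {x: i for i, x in enumerate(r2)}
    items = list(r1)
    return sum(
        (pos1[a] < pos1[b]) != (pos2[a] < pos2[b])
        for i, a in enumerate(items) for b in items[i + 1:]
    )

def kemeny_median(profile):
    """Return a ranking minimising the total distance to all voters."""
    items = profile[0]
    return min(permutations(items),
               key=lambda r: sum(kendall_tau(r, v) for v in profile))

profile = [("a", "b", "c"), ("a", "c", "b"), ("b", "a", "c")]
print(kemeny_median(profile))   # ('a', 'b', 'c')
```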
Data-Driven Robust Optimization
The last decade witnessed an explosion in the availability of data for operations research applications. Motivated by this growing availability, we propose a novel schema for utilizing data to design uncertainty sets for robust optimization using statistical hypothesis tests. The approach is flexible and widely applicable, and robust optimization problems built from our new sets are computationally tractable, both theoretically and practically. Furthermore, optimal solutions to these problems enjoy a strong, finite-sample probabilistic guarantee. We describe concrete procedures for choosing an appropriate set for a given application and for applying our approach to multiple uncertain constraints. Computational evidence in portfolio management and queuing confirms that our data-driven sets significantly outperform traditional robust optimization techniques whenever data is available.
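The overall recipe, calibrating an uncertainty set from data and then solving the resulting tractable robust counterpart, can be illustrated with a toy portfolio example. This is a simplified sketch and not the paper's construction: the chi-square calibration of the ellipsoid radius is a stand-in for the hypothesis-test procedures the paper describes, and the data are synthetic.

```python
# Sketch of the general data-driven robust optimization idea (not the paper's
# exact construction): fit an ellipsoidal uncertainty set for mean returns
# from data, with a radius chosen from a chi-square quantile as a stand-in
# for a hypothesis-test calibration, then solve the robust portfolio problem.
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(0)
returns = rng.normal(loc=[0.05, 0.07, 0.06], scale=0.02, size=(250, 3))  # toy data

n_obs, n_assets = returns.shape
mu_hat = returns.mean(axis=0)
sigma_hat = np.cov(returns, rowvar=False)

# Uncertainty set: { mu : (mu - mu_hat)' S^{-1} (mu - mu_hat) <= r^2 }, with r
# calibrated so the set is roughly a 90% confidence region for the true mean.
radius = np.sqrt(stats.chi2.ppf(0.90, df=n_assets) / n_obs)

def worst_case_return(x):
    # Worst case of mu' x over the ellipsoid has this closed form.
    return mu_hat @ x - radius * np.sqrt(x @ sigma_hat @ x)

res = optimize.minimize(
    lambda x: -worst_case_return(x),
    x0=np.full(n_assets, 1.0 / n_assets),
    bounds=[(0.0, 1.0)] * n_assets,
    constraints=[{"type": "eq", "fun": lambda x: x.sum() - 1.0}],
)
print("robust weights:", np.round(res.x, 3))
```

The robust counterpart stays tractable here because the worst case over the ellipsoid reduces to a single concave term; richer hypothesis-test-based sets lead to different but still tractable reformulations.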