Explaining Adaptation in Genetic Algorithms With Uniform Crossover: The Hyperclimbing Hypothesis
The hyperclimbing hypothesis is a hypothetical explanation for adaptation in
genetic algorithms with uniform crossover (UGAs). Hyperclimbing is an
intuitive, general-purpose, non-local search heuristic applicable to discrete
product spaces with rugged or stochastic cost functions. The strength of this
heuristic lies in its insusceptibility to local optima when the cost function is
deterministic, and its tolerance for noise when the cost function is
stochastic. Hyperclimbing works by decimating a search space, i.e. by
iteratively fixing the values of small numbers of variables. The hyperclimbing
hypothesis holds that UGAs work by implementing efficient hyperclimbing. Proof
of concept for this hypothesis comes from the use of a novel analytic technique
involving the exploitation of algorithmic symmetry. We have also obtained
experimental results that show that a simple tweak inspired by the
hyperclimbing hypothesis dramatically improves the performance of a UGA on
large, random instances of MAX-3SAT and the Sherrington-Kirkpatrick spin glass
problem.
Comment: 22 pages, 5 figures
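To make the decimation idea concrete, here is a minimal Python sketch of a hyperclimbing-style loop; it illustrates the general heuristic described above, not the UGA studied in the paper, and the subset size and sample counts are illustrative assumptions. Small subsets of variables are fixed to whichever setting gives the best cost averaged over random completions of the remaining free variables.

```python
import itertools
import random

def hyperclimb(cost, n, subset_size=2, samples=50, rng=random):
    """Sketch of a hyperclimbing-style decimation loop.

    `cost` maps a full 0/1 assignment of length n to a real number and may be
    noisy. Variables are fixed a few at a time: for each small subset, the
    setting with the lowest cost averaged over random completions of the
    free variables is kept (decimation).
    """
    fixed = {}                                   # index -> fixed bit
    free = list(range(n))
    while free:
        subset = rng.sample(free, min(subset_size, len(free)))
        best_setting, best_avg = None, float("inf")
        for setting in itertools.product([0, 1], repeat=len(subset)):
            total = 0.0
            for _ in range(samples):
                x = [rng.randint(0, 1) for _ in range(n)]   # random completion
                for i, b in fixed.items():
                    x[i] = b
                for i, b in zip(subset, setting):
                    x[i] = b
                total += cost(x)
            avg = total / samples
            if avg < best_avg:
                best_avg, best_setting = avg, setting
        for i, b in zip(subset, best_setting):   # fix the winning setting
            fixed[i] = b
            free.remove(i)
    return [fixed[i] for i in range(n)]
```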
Approach Tolerance in the Assemblies of Evolutionary Hybrid Prototypes
A new alternative is proposed to the traditional "one shot" prototype (manufactured in
one piece with one process): the hybrid rapid prototype. Its aim is to sharply reduce the
time and cost of developing new products and to increase reactivity during development.
The part is decomposed into several components that can be changed quickly and
manufactured with the process best suited to each.
The main objective of the presented method is to propose a feasible technological
assembly between the different components of the part that respects the technological and
topological functions and the initial tolerance.
Using a graph representation, fuzzy logic, and a tolerance point of view, entities are
associated with a CIA (Assembly Identity Card) in accordance with the evolutionary and
manufacturing analysis. The work is illustrated with an industrial tooling for plastic
injection.
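The abstract does not specify data structures, so the following Python sketch is purely illustrative: a hypothetical "Assembly Identity Card" is modelled as a list of candidate joining technologies with fuzzy suitability scores, from which a card-based selection keeps only candidates that still meet the initial tolerance.

```python
def select_assembly(candidates, required_tolerance_mm):
    """Illustrative selection step (not the paper's method): among candidate
    joining technologies that meet the required tolerance, return the one
    with the highest fuzzy suitability score."""
    feasible = [c for c in candidates if c["tolerance_mm"] <= required_tolerance_mm]
    return max(feasible, key=lambda c: c["suitability"]) if feasible else None

# Hypothetical "Assembly Identity Card" entries for one interface of the part.
candidates = [
    {"name": "screwed joint", "tolerance_mm": 0.10, "suitability": 0.8},
    {"name": "glued joint",   "tolerance_mm": 0.05, "suitability": 0.6},
    {"name": "snap fit",      "tolerance_mm": 0.20, "suitability": 0.9},
]
print(select_assembly(candidates, required_tolerance_mm=0.10))
```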
Fourier Based Fast Multipole Method for the Helmholtz Equation
The fast multipole method (FMM) has had great success in reducing the
computational complexity of solving the boundary integral form of the Helmholtz
equation. We present a formulation of the Helmholtz FMM that uses Fourier basis
functions rather than spherical harmonics. By modifying the transfer function
in the precomputation stage of the FMM, the interpolation operators become
straightforward applications of fast Fourier transforms, which accelerates the
time-critical stages of the algorithm while retaining the diagonality of the
transfer function and simplifying the error analysis. Using Fourier analysis,
constructive algorithms are derived to a
priori determine an integration quadrature for a given error tolerance. Sharp
error bounds are derived and verified numerically. Various optimizations are
considered to reduce the number of quadrature points and reduce the cost of
computing the transfer function.
Comment: 24 pages, 13 figures
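The key computational point is that interpolation in a Fourier basis reduces to FFTs. A generic NumPy sketch of that idea follows; it shows spectral zero-padding interpolation of periodic, band-limited samples, not the paper's specific operators.

```python
import numpy as np

def fourier_interpolate(samples, m):
    """Resample periodic, band-limited data onto a finer grid of m points
    (m >= len(samples)) by zero-padding its discrete Fourier spectrum."""
    n = len(samples)
    spectrum = np.fft.fft(samples)
    padded = np.zeros(m, dtype=complex)
    half = n // 2
    padded[:half] = spectrum[:half]              # non-negative frequencies
    padded[-(n - half):] = spectrum[half:]       # negative frequencies
    return np.fft.ifft(padded) * (m / n)         # rescale for ifft normalization

# Example: refine 16 samples of a smooth periodic function to 64 points.
t = np.linspace(0, 2 * np.pi, 16, endpoint=False)
coarse = np.cos(3 * t) + 0.5 * np.sin(t)
fine = fourier_interpolate(coarse, 64)
t_fine = np.linspace(0, 2 * np.pi, 64, endpoint=False)
print(np.max(np.abs(fine.real - (np.cos(3 * t_fine) + 0.5 * np.sin(t_fine)))))
```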
Overcommitment in Cloud Services -- Bin packing with Chance Constraints
This paper considers a traditional problem of resource allocation, scheduling
jobs on machines. One such recent application is cloud computing, where jobs
arrive in an online fashion with capacity requirements and need to be
immediately scheduled on physical machines in data centers. It is often
observed that the requested capacities are not fully utilized, hence offering
an opportunity to employ an overcommitment policy, i.e., selling resources
beyond capacity. Setting the right overcommitment level can induce a
significant cost reduction for the cloud provider, while only inducing a very
low risk of violating capacity constraints. We introduce and study a model that
quantifies the value of overcommitment by modeling the problem as a bin packing
with chance constraints. We then propose an alternative formulation that
transforms each chance constraint into a submodular function. We show that our
model captures the risk pooling effect and can guide scheduling and
overcommitment decisions. We also develop a family of online algorithms that
are intuitive, easy to implement, and come within a constant factor of the
optimum. Finally, we calibrate our model using realistic workload data, and
test our approach in a practical setting. Our analysis and experiments
illustrate the benefit of overcommitment in cloud services, and suggest a cost
reduction of 1.5% to 17%, depending on the provider's risk tolerance.
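As a hedged illustration of a chance constraint on a single machine (the paper's exact formulation and workload data are not reproduced here), the following Python sketch accepts a new job only if the probability of exceeding capacity stays below a risk level epsilon, treating each job's actual usage as an independent Gaussian with an assumed mean and variance.

```python
from math import sqrt
from statistics import NormalDist

def fits_with_overcommitment(machine_jobs, new_job, capacity, epsilon=0.01):
    """Illustrative Gaussian chance-constraint check: accept the new job if
    P(total actual utilization > capacity) <= epsilon on this machine."""
    jobs = machine_jobs + [new_job]
    mean = sum(j["mean"] for j in jobs)
    std = sqrt(sum(j["var"] for j in jobs))
    z = NormalDist().inv_cdf(1 - epsilon)        # quantile for the risk level
    return mean + z * std <= capacity

# Jobs typically use far less than their requested capacity, so more of them
# can be accepted than a reservation-based packing would allow.
running = [{"mean": 0.20, "var": 0.005}, {"mean": 0.25, "var": 0.005}]
print(fits_with_overcommitment(running, {"mean": 0.20, "var": 0.005}, capacity=1.0))
```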
