ZOOpt: Toolbox for Derivative-Free Optimization
Recent advances in derivative-free optimization allow the global optima of sophisticated functions to be approximated efficiently, including functions with many local optima and non-differentiable or discontinuous functions. This article describes the ZOOpt toolbox (https://github.com/eyounx/ZOOpt), which provides efficient derivative-free solvers and is designed to be easy to use. ZOOpt offers a Python package for single-thread optimization and a lightweight distributed version, built with the help of the Julia language, for functions described in Python. The toolbox particularly focuses on optimization problems in machine learning, addressing high-dimensional, noisy, and large-scale problems, and is being maintained as a ready-to-use tool for real-world machine learning tasks.
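For illustration, the sketch below shows how a minimal single-thread ZOOpt run might look. The class names (Dimension, Objective, Parameter, Opt) and the budget parameter follow the ZOOpt project's documentation, but the exact signatures and the sphere test function are assumptions of this example rather than content of the abstract.

```python
# Minimal single-thread usage sketch for ZOOpt (class names per the project's
# documentation; treat exact signatures and the test function as assumptions).
import numpy as np
from zoopt import Dimension, Objective, Parameter, Opt

def sphere(solution):
    """Objective to minimize: a simple multi-dimensional sphere function."""
    x = np.array(solution.get_x())
    return float(np.sum(x ** 2))

dim_size = 100
# Each coordinate is continuous (True) and searched within [-1, 1].
dim = Dimension(dim_size, [[-1, 1]] * dim_size, [True] * dim_size)
objective = Objective(sphere, dim)

# The budget is the number of objective evaluations the solver may spend.
solution = Opt.min(objective, Parameter(budget=100 * dim_size))
print(solution.get_x(), solution.get_value())
```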
Space Shuffle: A Scalable, Flexible, and High-Bandwidth Data Center Network
Data center applications require the network to be scalable and
bandwidth-rich. Current data center network architectures often use rigid
topologies to increase network bandwidth. A major limitation is that they can
hardly support incremental network growth. Recent work proposes to use random
interconnects to provide growth flexibility. However, routing on a random
topology suffers from control and data plane scalability problems, because
routing decisions require global information and forwarding state cannot be
aggregated. In this paper we design a novel flexible data center network
architecture, Space Shuffle (S2), which applies greedy routing on multiple ring
spaces to achieve high throughput, scalability, and flexibility. The proposed
greedy routing protocol of S2 effectively exploits the path diversity of
densely connected topologies and enables key-based routing. Extensive
experimental studies show that S2 provides high bisectional bandwidth and
throughput, near-optimal routing path lengths, extremely small forwarding
state, fairness among concurrent data flows, and resiliency to network
failures.
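To make the greedy-routing idea concrete, the sketch below shows a hypothetical next-hop selection over multiple ring spaces: each switch holds one coordinate per circular space, and a packet is forwarded to the neighbor closest to the destination under a minimum-circular-distance metric. The metric, the function names, and the stall check are simplifications chosen for illustration, not the authors' exact S2 protocol.

```python
# Conceptual sketch (not the exact S2 protocol): greedy forwarding over
# multiple circular coordinate spaces, each a ring of unit circumference.

def circular_distance(a, b):
    """Distance between two points on a unit-circumference ring."""
    d = abs(a - b)
    return min(d, 1.0 - d)

def multi_space_distance(coords_u, coords_v):
    """Minimum circular distance over all ring spaces (an assumption of this
    sketch; the paper defines the actual routing metric)."""
    return min(circular_distance(a, b) for a, b in zip(coords_u, coords_v))

def greedy_next_hop(current, neighbors, dest_coords, coords):
    """Pick the neighbor that makes the most progress toward the destination.

    current     -- identifier of the current switch
    neighbors   -- iterable of neighbor switch identifiers
    dest_coords -- destination coordinates, one per ring space
    coords      -- mapping: switch id -> tuple of coordinates
    """
    best = min(neighbors,
               key=lambda n: multi_space_distance(coords[n], dest_coords))
    # Forward only if the best neighbor is strictly closer than the current
    # switch; otherwise greedy progress has stalled at a local minimum.
    if multi_space_distance(coords[best], dest_coords) < \
            multi_space_distance(coords[current], dest_coords):
        return best
    return None
```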
The Sampling-and-Learning Framework: A Statistical View of Evolutionary Algorithms
Evolutionary algorithms (EAs), a large class of general-purpose optimization
algorithms inspired by natural phenomena, are widely used in industrial
optimization and often show excellent performance. This paper presents an
attempt to reveal their general power from a statistical view. By summarizing
a large range of EAs into the sampling-and-learning framework, we show that
the framework directly admits a general analysis of the
framework, we show that the framework directly admits a general analysis on the
probable-absolute-approximate (PAA) query complexity. We particularly focus on
the framework with the learning subroutine being restricted as a binary
classification, which results in the sampling-and-classification (SAC)
algorithms. With the help of the learning theory, we obtain a general upper
bound on the PAA query complexity of SAC algorithms. We further compare SAC
algorithms with the uniform search in different situations. Under the
error-target independence condition, we show that SAC algorithms can achieve
polynomial speedup to the uniform search, but not super-polynomial speedup.
Under the one-side-error condition, we show that super-polynomial speedup can
be achieved. This work only touches the surface of the framework. Its power
under other conditions is still open
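To illustrate the sampling-and-classification pattern, the sketch below implements a simplified SAC-style loop: in each round it samples solutions, labels the better ones as positive, learns a crude model of the positive region, and draws the next samples mostly from that region. The bounding-box "classifier", the exploration split, and all parameter choices are assumptions made for demonstration; they are not the paper's algorithm or analysis.

```python
# Simplified sampling-and-classification (SAC) style loop. The "classifier"
# here is just the bounding box of the best samples, a stand-in for the
# binary-classification subroutine described in the abstract.
import numpy as np

def sac_minimize(f, dim, lower, upper, rounds=20, batch=100, rng=None):
    rng = rng or np.random.default_rng(0)
    best_x, best_y = None, float("inf")
    # Start by treating the whole box [lower, upper]^dim as the positive region.
    region_lo = np.full(dim, lower, dtype=float)
    region_hi = np.full(dim, upper, dtype=float)

    for _ in range(rounds):
        # Sample: half from the learned region, half uniformly (exploration).
        xs_region = rng.uniform(region_lo, region_hi, size=(batch // 2, dim))
        xs_uniform = rng.uniform(lower, upper, size=(batch - batch // 2, dim))
        xs = np.vstack([xs_region, xs_uniform])
        ys = np.array([f(x) for x in xs])

        # Track the best solution queried so far.
        i = int(np.argmin(ys))
        if ys[i] < best_y:
            best_x, best_y = xs[i].copy(), float(ys[i])

        # "Learn": label the top 20% of samples positive and take their
        # bounding box as the next positive region.
        top = xs[np.argsort(ys)[: max(1, batch // 5)]]
        region_lo = top.min(axis=0)
        region_hi = top.max(axis=0)

    return best_x, best_y

# Example: minimize a sphere function over [-1, 1]^10.
x_star, y_star = sac_minimize(lambda x: float(np.sum(x ** 2)),
                              dim=10, lower=-1.0, upper=1.0)
```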