18 research outputs found
Fully Distributed And Mixed Symmetric Diagonal Dominant Solvers For Large Scale Optimization
Over the past twenty years, we have witnessed an unprecedented growth in data, inaugurating the
so-called Big Data Epoch. Throughout these years, the exponential growth in the power of computer
chips forecasted by Moore's Law has allowed us to keep pace with this growing flood of data.
However, due to the physical limitations on the size of transistors, we have already reached the
computational limits of the traditional microprocessor architecture. Therefore, we either need
conceptually new computers or distributed models of computation that allow processors to solve
Big Data problems in a collaborative manner.
The purpose of this thesis is to show that decentralized optimization is capable of addressing our
growing computational demands by exploiting the power of coordinated data processing. In particular,
we propose an exact distributed Newton method for two important challenges in large-scale
optimization: Network Flow and Empirical Risk Minimization.
The key observation behind our method is that the Hessians of the dual functions corresponding
to the aforementioned problems are symmetric diagonally dominant. Consequently, one can calculate
the Newton direction by solving symmetric diagonal dominant (SDD) systems in a
decentralized fashion.
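As an illustration of why SDD structure arises, consider a network flow problem with separable edge costs: the dual Hessian takes the form A W A^T, where A is the node-edge incidence matrix and W is a positive diagonal matrix, which makes it a weighted graph Laplacian and hence SDD. A minimal numpy sketch (the graph and weights below are invented for illustration, not taken from the thesis):

```python
import numpy as np

# incidence matrix of a 4-node cycle (edges: 0-1, 1-2, 2-3, 3-0)
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
A = np.zeros((4, len(edges)))
for j, (u, v) in enumerate(edges):
    A[u, j], A[v, j] = 1.0, -1.0

W = np.diag([0.5, 1.0, 2.0, 0.25])   # example positive edge weights
H = A @ W @ A.T                       # dual-Hessian-shaped matrix: a weighted Laplacian

# SDD check: symmetric, and each diagonal entry dominates its off-diagonal row sum
assert np.allclose(H, H.T)
assert all(H[i, i] >= np.sum(np.abs(H[i])) - H[i, i] - 1e-12 for i in range(4))
```

Any solver for such systems therefore doubles as a solver for the Newton direction of these dual problems.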
We first propose a fully distributed SDD solver based on a recursive approximation of SDD matrix
inverses with a collection of specifically structured distributed matrices. To improve the precision of
the algorithm, we then apply Richardson preconditioning, arriving at an efficient algorithm capable
of approximating the solution of an SDD system to arbitrary precision.
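The thesis's distributed preconditioner construction is not reproduced here, but the underlying Richardson iteration is easy to sketch. The following minimal numpy version uses a simple Jacobi (diagonal) preconditioner and a small shifted path-graph Laplacian, both of which are stand-in assumptions for illustration:

```python
import numpy as np

def richardson_solve(L, b, tol=1e-8, max_iter=20000):
    """Jacobi-preconditioned Richardson iteration for a strictly SDD system L x = b."""
    d_inv = 1.0 / np.diag(L)          # diagonal (Jacobi) preconditioner
    x = np.zeros_like(b)
    for _ in range(max_iter):
        r = b - L @ x                 # current residual
        if np.linalg.norm(r) < tol:
            break
        x = x + d_inv * r             # preconditioned Richardson step
    return x

# example SDD system: path-graph Laplacian plus a small diagonal shift
adj = np.diag(np.ones(4), 1) + np.diag(np.ones(4), -1)
L = np.diag(adj.sum(axis=1)) - adj + 0.1 * np.eye(5)
b = np.ones(5)
x = richardson_solve(L, b)
assert np.allclose(L @ x, b, atol=1e-6)
```

The shift makes the matrix strictly diagonally dominant, which guarantees the Jacobi-preconditioned iteration converges; tightening `tol` yields any desired accuracy, at the cost of more iterations.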
Our second fully distributed SDD solver significantly improves on the computational performance of
the first algorithm by utilizing Chebyshev polynomials to approximate the SDD matrix
inverse. The particular choice of Chebyshev polynomials is motivated by their extremal properties
and their three-term recursive relation.
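The extremal property bounds the error after k steps by the best degree-k polynomial on the spectrum interval, while the three-term recurrence means each step needs only one multiplication by the matrix. A standard (non-distributed) Chebyshev iteration, sketched in numpy under the assumption that bounds on the spectrum are available; the test system is the same invented shifted path-graph Laplacian:

```python
import numpy as np

def chebyshev_solve(A, b, lmin, lmax, tol=1e-8, max_iter=500):
    """Chebyshev iteration for A x = b, given bounds [lmin, lmax] on A's spectrum."""
    theta = (lmax + lmin) / 2.0       # center of the spectrum interval
    delta = (lmax - lmin) / 2.0       # half-width of the interval
    x = np.zeros_like(b)
    r = b - A @ x
    alpha = 1.0 / theta
    p = r.copy()
    for k in range(max_iter):
        x = x + alpha * p
        r = r - alpha * (A @ p)
        if np.linalg.norm(r) < tol:
            break
        beta = (delta * alpha / 2.0) ** 2
        if k == 0:
            beta *= 2.0               # the first Chebyshev step is special
        alpha = 1.0 / (theta - beta / alpha)
        p = r + beta * p
    return x

# example SDD system: path-graph Laplacian plus a small diagonal shift
adj = np.diag(np.ones(4), 1) + np.diag(np.ones(4), -1)
L = np.diag(adj.sum(axis=1)) - adj + 0.1 * np.eye(5)
b = np.ones(5)
lo, hi = np.linalg.eigvalsh(L)[[0, -1]]
x = chebyshev_solve(L, b, lo, hi)
assert np.allclose(L @ x, b, atol=1e-6)
```

Compared with the Richardson iteration, the error here decays at a rate governed by the square root of the condition number rather than the condition number itself.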
We then explore mixed strategies for solving SDD systems by slightly relaxing the decentralization
requirements. Roughly speaking, by allowing one computer to aggregate some particular information
from all the others, one can gain quite surprising computational benefits. The key idea is to
construct a spectral sparsifier of the underlying graph of computers by using local communication
between them.
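One classical centralized route to a spectral sparsifier is effective-resistance sampling (Spielman-Srivastava); the thesis's construction via local communication is different, but the following numpy sketch conveys the object being built. The graph, sample count, and spot-check below are all invented for illustration (on this tiny dense graph the sample count is chosen for accuracy rather than actual sparsity):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20

def laplacian(n, edge_list):
    """Weighted graph Laplacian from (u, v, w) triples."""
    L = np.zeros((n, n))
    for u, v, w in edge_list:
        L[u, u] += w; L[v, v] += w
        L[u, v] -= w; L[v, u] -= w
    return L

# a small dense weighted graph (invented for illustration)
W = np.triu(rng.uniform(0.5, 1.5, (n, n)), 1)
edges = [(i, j, W[i, j]) for i in range(n) for j in range(i + 1, n)]
L = laplacian(n, edges)
Lpinv = np.linalg.pinv(L)

# effective resistance of edge (u, v): (e_u - e_v)^T L^+ (e_u - e_v)
probs = np.array([w * (Lpinv[u, u] + Lpinv[v, v] - 2 * Lpinv[u, v])
                  for u, v, w in edges])
probs /= probs.sum()

# sample edges with probability proportional to w_e * R_eff(e), reweighting
# the kept edges so the sparsifier is unbiased in expectation
q = 2000
counts = rng.multinomial(q, probs)
H = laplacian(n, [(u, v, w * c / (q * p))
                  for (u, v, w), c, p in zip(edges, counts, probs) if c > 0])

# spot-check the spectral approximation on vectors orthogonal to the all-ones nullspace
for _ in range(5):
    x = rng.standard_normal(n)
    x -= x.mean()
    ratio = (x @ H @ x) / (x @ L @ x)
    assert 0.5 < ratio < 2.0
```

Because the sparsifier preserves all quadratic forms up to a constant factor, solving against it gives a good preconditioner for the original Laplacian.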
Finally, we apply these solvers for calculating the Newton direction for the dual function of Network
Flow and Empirical Risk Minimization. On the theoretical side, we establish a quadratic convergence
rate for our algorithms, surpassing all existing techniques. On the empirical side, we verify their
superior performance in an extensive set of numerical simulations.
Fast, Accurate Second Order Methods for Network Optimization
Dual descent methods are commonly used to solve network flow optimization
problems, since their implementation can be distributed over the network. These
algorithms, however, often exhibit slow convergence rates. Approximate Newton
methods which compute descent directions locally have been proposed as
alternatives to accelerate the convergence rates of conventional dual descent.
The effectiveness of these methods, however, is limited by the accuracy of such
approximations. In this paper, we propose an efficient and accurate distributed
second order method for network flow problems. The proposed approach utilizes
the sparsity pattern of the dual Hessian to approximate the Newton
direction using a novel distributed solver for symmetric diagonally dominant
linear equations. Our solver is based on a distributed implementation of a
recent parallel solver of Spielman and Peng (2014). We analyze the properties
of the proposed algorithm and show that, similar to conventional Newton
methods, superlinear convergence within a neighborhood of the optimal value
is attained. We finally demonstrate the effectiveness of the approach in a set
of experiments on randomly generated networks.
Comment: arXiv admin note: text overlap with arXiv:1502.0315
BOiLS: Bayesian Optimisation for Logic Synthesis
Optimising the quality-of-results (QoR) of circuits during logic synthesis is a formidable challenge necessitating the exploration of exponentially sized search spaces. While expert-designed operations aid in uncovering effective sequences, the increasing complexity of logic circuits favours automated procedures. To enable efficient and scalable solvers, we propose BOiLS, the first algorithm adapting Bayesian optimisation to navigate the space of synthesis operations. BOiLS requires no human intervention and trades off exploration versus exploitation through novel Gaussian process kernels and trust-region constrained acquisitions. In a set of experiments on EPFL benchmarks, we demonstrate BOiLS's superior performance compared to the state of the art in terms of both sample efficiency and QoR values.
Are we Forgetting about Compositional Optimisers in Bayesian Optimisation?
Bayesian optimisation presents a sample-efficient methodology for global optimisation. Within this framework, a crucial performance-determining subroutine is the maximisation of the acquisition function, a task complicated by the fact that acquisition functions tend to be non-convex and thus nontrivial to optimise. In this paper, we undertake a comprehensive empirical study of approaches to maximise the acquisition function. Additionally, by deriving novel, yet mathematically equivalent, compositional forms for popular acquisition functions, we recast the maximisation task as a compositional optimisation problem, allowing us to benefit from the extensive literature in this field. We highlight the empirical advantages of the compositional approach to acquisition function maximisation across 3958 individual experiments comprising synthetic optimisation tasks as well as tasks from Bayesmark. Given the generality of the acquisition function maximisation subroutine, we posit that the adoption of compositional optimisers has the potential to yield performance improvements across all domains in which Bayesian optimisation is currently being applied. An open-source implementation is made available at https://github.com/huawei-noah/noah-research/tree/CompBO/BO/HEBO/CompBO