Warmstarting of Model-based Algorithm Configuration
The performance of many hard combinatorial problem solvers depends strongly
on their parameter settings, and since manual parameter tuning is both tedious
and suboptimal, the AI community has recently developed several algorithm
configuration (AC) methods to automatically address this problem. While all
existing AC methods start the configuration process of an algorithm A from
scratch for each new type of benchmark instances, here we propose to exploit
information about A's performance on previous benchmarks in order to warmstart
its configuration on new types of benchmarks. We introduce two complementary
ways in which we can exploit this information to warmstart AC methods based on
a predictive model. Experiments for optimizing a very flexible modern SAT
solver on twelve different instance sets show that our methods often yield
substantial speedups over existing AC methods (up to 165-fold) and can also
find substantially better configurations given the same compute budget.
Comment: Preprint of AAAI'18 paper
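One natural way to warmstart a model-based configurator, consistent with the idea the abstract describes, is to seed its initial design with the best configurations found on previous benchmarks instead of starting from random ones. The sketch below is a toy illustration under stated assumptions, not the paper's actual method: the configuration space, the `evaluate` function (a stand-in for running solver A on a benchmark set), and all names are hypothetical.

```python
import random

def evaluate(config, instance_set):
    # Hypothetical cost model: a quadratic bowl whose optimum shifts per
    # instance set. Stands in for actually running the solver on benchmarks.
    opt = instance_set["optimum"]
    return sum((config[k] - opt[k]) ** 2 for k in config)

def warmstarted_init(prev_runs, new_set, n_random=3, seed=0):
    """Build the initial design for a model-based configurator:
    the best configuration observed on each previous benchmark,
    plus a few random configurations for coverage of the space."""
    rng = random.Random(seed)
    # Pick the lowest-cost configuration from each previous benchmark's runs.
    warm = [min(runs, key=lambda r: r[1])[0] for runs in prev_runs]
    rand = [{"x": rng.uniform(-5, 5), "y": rng.uniform(-5, 5)}
            for _ in range(n_random)]
    # Evaluate the whole initial design on the new instance set; these
    # (config, cost) pairs would seed the configurator's predictive model.
    return [(c, evaluate(c, new_set)) for c in warm + rand]
```

If the new benchmark resembles an old one, the warm configurations give the surrogate model informative low-cost observations from the very first iteration, which is the intuition behind the reported speedups.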
Bayesian Optimisation for Heuristic Configuration in Automated Theorem Proving
Modern theorem provers such as Vampire utilise premise selection algorithms to control the proof search explosion. Premise selection heuristics often employ an array of continuous and discrete parameters. The quality of recommended premises varies depending on the parameter assignment. In this work, we introduce a principled probabilistic framework for optimisation of a premise selection algorithm. We present results using Sumo Inference Engine (SInE) and the Archive of Formal Proofs (AFP) as a case study. Our approach can be used to optimise heuristics on large theories in a minimal number of steps.
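A minimal sketch of the kind of probabilistic framework the abstract describes: Bayesian optimisation of a single continuous heuristic parameter using a Gaussian-process surrogate and the expected-improvement acquisition function. Everything here is a hedged illustration: `heuristic_cost` and the parameter range are hypothetical stand-ins for running the prover with a given premise-selection setting, and the GP/EI components are textbook ingredients, not details taken from the paper.

```python
import math
import numpy as np

def rbf(a, b, ls=0.5):
    # Squared-exponential kernel between two 1-D arrays of parameter values.
    d = a.reshape(-1, 1) - b.reshape(1, -1)
    return np.exp(-0.5 * (d / ls) ** 2)

def gp_posterior(X, y, Xs, noise=1e-4):
    # GP posterior mean and stddev at query points Xs given observations (X, y).
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks, Kss = rbf(X, Xs), rbf(Xs, Xs)
    Kinv = np.linalg.inv(K)
    mu = Ks.T @ Kinv @ y
    var = np.diag(Kss - Ks.T @ Kinv @ Ks)
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sigma, best):
    # EI for minimisation, using the standard closed form.
    phi = lambda z: math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)
    Phi = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2)))
    out = np.zeros_like(mu)
    for i, (m, s) in enumerate(zip(mu, sigma)):
        if s > 0:
            z = (best - m) / s
            out[i] = (best - m) * Phi(z) + s * phi(z)
    return out

def heuristic_cost(tol):
    # Hypothetical proxy for proof-search cost as a function of a continuous
    # premise-selection tolerance; stands in for running the prover.
    return (tol - 1.8) ** 2 + 0.1 * math.sin(5 * tol)

grid = np.linspace(0.5, 5.0, 200)               # candidate parameter values
X = np.array([0.5, 2.5, 5.0])                   # small initial design
y = np.array([heuristic_cost(t) for t in X])
for _ in range(10):                             # BO loop: fit, acquire, evaluate
    mu, sigma = gp_posterior(X, y, grid)
    nxt = grid[np.argmax(expected_improvement(mu, sigma, y.min()))]
    X = np.append(X, nxt)
    y = np.append(y, heuristic_cost(nxt))
print("best tolerance found:", round(float(X[np.argmin(y)]), 3))
```

In practice the framework must also handle discrete parameters (e.g. via one-hot encodings or tree-based surrogates), which is part of what makes principled optimisation of premise-selection heuristics non-trivial.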