
    On Rules and Parameter Free Systems in Bounded Arithmetic

    We present model-theoretic techniques to obtain conservation results for first-order bounded arithmetic theories, based on a hierarchical version of the well-known notion of an existentially closed model.
    Ministerio de Educación y Ciencia MTM2005-0865

    Existentially Closed Models and Conservation Results in Bounded Arithmetic

    We develop model-theoretic techniques to obtain conservation results for first-order Bounded Arithmetic theories, based on a hierarchical version of the well-known notion of an existentially closed model. We focus on Buss's classical theories S^i_2 and T^i_2 and prove that they are ∀Σ^b_i-conservative over their inference-rule counterparts, and ∃∀Σ^b_i-conservative over their parameter-free versions. A similar analysis of the Σ^b_i-replacement scheme is also developed. The proof method is essentially the same for all the schemes we deal with, and it shows that these conservation results between schemes and inference rules do not depend on the specific combinatorial or arithmetical content of those schemes. We show that similar conservation results can be derived, in a very general setting, for every scheme enjoying certain syntactical (or logical) properties common to both the induction and replacement schemes. Hence, the previous conservation results for induction and replacement can also be obtained as corollaries of these more general results.
    Ministerio de Educación y Ciencia MTM2005-08658
    Junta de Andalucía TIC-13
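
    (For reference, the notion used here is the standard one, spelled out as a sketch rather than quoted from the paper: a theory T is Γ-conservative over a theory T′ when every sentence φ ∈ Γ provable in T is already provable in T′, i.e. T ⊢ φ ⟹ T′ ⊢ φ for all φ ∈ Γ. Thus the first result above says that every ∀Σ^b_i-sentence provable in S^i_2 or T^i_2 is provable from the corresponding inference rule.)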

    Lipschitz Optimisation for Lipschitz Interpolation

    Techniques known as Nonlinear Set Membership prediction, Kinky Inference or Lipschitz Interpolation are fast and numerically robust approaches to nonparametric machine learning that have been proposed for use in system identification and learning-based control. They exploit presupposed Lipschitz properties to compute inferences over unobserved function values. Unfortunately, most of these approaches rely on exact knowledge of the input-space metric as well as of the Lipschitz constant, and existing techniques for estimating the Lipschitz constant from data are either not robust to noise or appear ad hoc, and are typically decoupled from the ultimate learning and prediction task. To overcome these limitations, we propose an approach that optimises the parameters of the presupposed metric by minimising prediction errors on a validation set. To avoid poor performance due to local minima, we exploit Lipschitz properties of the optimisation objective itself to ensure global optimisation success. The resulting approach is a new flexible method for nonparametric black-box learning. We provide experimental evidence of the competitiveness of our approach on artificial as well as real data.
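
    As a rough illustration of the kind of predictor these techniques share, below is a minimal sketch of Lipschitz interpolation in Python. It assumes a known Lipschitz constant L and a weighted Euclidean metric; the function name, the weight parameter w and the toy data are illustrative assumptions, and the metric-parameter optimisation proposed in the abstract is deliberately omitted.

    import numpy as np

    def lipschitz_predict(x_query, X, y, L, w=None):
        """Predict f(x_query) from samples (X, y) of an L-Lipschitz function.

        The prediction is the midpoint of the tightest bounds implied by the
        Lipschitz property:
            ceiling(x) = min_i ( y_i + L * d(x, x_i) )
            floor(x)   = max_i ( y_i - L * d(x, x_i) )
        """
        w = np.ones(X.shape[1]) if w is None else w           # metric weights (assumed free parameters)
        d = np.sqrt((((X - x_query) ** 2) * w).sum(axis=1))   # weighted Euclidean distances to the samples
        upper = np.min(y + L * d)                             # lowest Lipschitz ceiling at x_query
        lower = np.max(y - L * d)                             # highest Lipschitz floor at x_query
        return 0.5 * (upper + lower)

    # Toy usage: noiseless samples of |x|, which is 1-Lipschitz.
    X = np.array([[-1.0], [0.0], [2.0]])
    y = np.abs(X).ravel()
    print(lipschitz_predict(np.array([0.5]), X, y, L=1.0))    # prints 0.5

    In a setting like the one described in the abstract, the metric weights w (and possibly L) would be the parameters tuned by minimising validation-set prediction error, exploiting the Lipschitz continuity of that error in the parameters to optimise globally.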