Theorem proving support in programming language semantics
We describe several views of the semantics of a simple programming language
as formal documents in the calculus of inductive constructions that can be
verified by the Coq proof system. Covered aspects are natural semantics,
denotational semantics, axiomatic semantics, and abstract interpretation.
Descriptions as recursive functions are also provided whenever suitable, thus
yielding a verification condition generator and a static analyser that can be
run inside the theorem prover for use in reflective proofs. Extraction of an
interpreter from the denotational semantics is also described. All different
aspects are formally proved sound with respect to the natural semantics
specification.
Comment: Proposed for publication in the volume in memory of Gilles Kahn.
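As a rough illustration of the style of formal document described here (a sketch in Lean 4 rather than the paper's actual Coq development, with a toy expression language assumed for brevity): the natural semantics appears as an inductive relation, the same semantics appears as a recursive function that can run inside the prover, and a soundness lemma links the two.

```lean
-- Sketch only: a toy expression language, not the paper's simple imperative language.
inductive Expr where
  | const : Nat → Expr
  | plus  : Expr → Expr → Expr

-- Natural (big-step) semantics as an inductive evaluation relation.
inductive Eval : Expr → Nat → Prop where
  | const : ∀ n, Eval (.const n) n
  | plus  : ∀ e₁ e₂ n₁ n₂, Eval e₁ n₁ → Eval e₂ n₂ → Eval (.plus e₁ e₂) (n₁ + n₂)

-- The same semantics described as a recursive function, usable in reflective proofs.
def eval : Expr → Nat
  | .const n    => n
  | .plus e₁ e₂ => eval e₁ + eval e₂

-- The functional description is sound with respect to the natural semantics.
theorem eval_sound (e : Expr) : Eval e (eval e) := by
  induction e with
  | const n =>
      simp only [eval]
      exact Eval.const n
  | plus e₁ e₂ ih₁ ih₂ =>
      simp only [eval]
      exact Eval.plus _ _ _ _ ih₁ ih₂
```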
Inductive reasoning and Kolmogorov complexity
Reasoning to obtain the “truth” about reality, from external data, is an important, controversial, and complicated issue in man's effort to understand nature. (Yet, today, we try to make machines do this.) There have been old useful principles, new exciting models, and intricate theories scattered in vastly different areas including philosophy of science, statistics, computer science, and psychology. We focus on inductive reasoning in correspondence with ideas of R. J. Solomonoff. While his proposals result in perfect procedures, they involve the noncomputable notion of Kolmogorov complexity. In this paper we develop the thesis that Solomonoff's method is fundamental in the sense that many other induction principles can be viewed as particular ways to obtain computable approximations to it. We demonstrate this explicitly in the cases of Gold's paradigm for inductive inference, Rissanen's minimum description length (MDL) principle, Fisher's maximum likelihood principle, and Jaynes' maximum entropy principle. We present several new theorems and derivations to this effect. We also delimit what can be learned and what cannot be learned in terms of Kolmogorov complexity, and we describe an experiment in machine learning of handwritten characters. We also give an application of Kolmogorov complexity in Valiant-style learning, where we want to learn a concept probably approximately correct, in feasible time and with a feasible number of examples.
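As a hedged illustration of the "computable approximation" thesis (a sketch with notation assumed for this listing, not an excerpt from the paper), the MDL principle can be read as replacing the noncomputable two-part Kolmogorov decomposition of the data with code lengths taken from a fixed computable model class:

```latex
% Sketch: D is the observed data, H ranges over a computable hypothesis class
% \mathcal{H}, and L(.) denotes code length in bits under chosen encodings.
\[
  H_{\mathrm{MDL}} = \arg\min_{H \in \mathcal{H}} \bigl[\, L(H) + L(D \mid H) \,\bigr],
\]
% a computable stand-in for the ideal (noncomputable) two-part description,
% which satisfies
\[
  K(D) \le K(H) + K(D \mid H) + O(1).
\]
```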
Epistemic virtues, metavirtues, and computational complexity
I argue that considerations about computational complexity show that all finite agents need characteristics like those that have been called epistemic virtues. The necessity of these virtues follows in part from the nonexistence of shortcuts, or efficient ways of finding shortcuts, to cognitively expensive routines. It follows that agents must possess the capacities – metavirtues – of developing in advance the cognitive virtues they will need when time and memory are at a premium.
Recursive Neural Networks Can Learn Logical Semantics
Tree-structured recursive neural networks (TreeRNNs) for sentence meaning
have been successful for many applications, but it remains an open question
whether the fixed-length representations that they learn can support tasks as
demanding as logical deduction. We pursue this question by evaluating whether
two such models, plain TreeRNNs and tree-structured neural tensor networks
(TreeRNTNs), can correctly learn to identify logical relationships such as
entailment and contradiction using these representations. In our first set of
experiments, we generate artificial data from a logical grammar and use it to
evaluate the models' ability to learn to handle basic relational reasoning,
recursive structures, and quantification. We then evaluate the models on the
more natural SICK challenge data. Both models perform competitively on the SICK
data and generalize well in all three experiments on simulated data, suggesting
that they can learn suitable representations for logical inference in natural
language.
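To make the contrast concrete, here is a minimal sketch (NumPy, with assumed dimensions and untrained random parameters; not the authors' implementation) of the two composition functions being compared: the plain TreeRNN layer, and the extra bilinear tensor term that distinguishes the TreeRNTN.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 16                                              # dimension of every node vector

W = rng.normal(scale=0.1, size=(d, 2 * d))          # TreeRNN composition matrix
b = np.zeros(d)                                     # bias
T = rng.normal(scale=0.1, size=(d, 2 * d, 2 * d))   # TreeRNTN tensor

def compose_rnn(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Plain TreeRNN: combine two child vectors into one parent vector."""
    children = np.concatenate([left, right])
    return np.tanh(W @ children + b)

def compose_rntn(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """TreeRNTN: add a bilinear (tensor) interaction between the children."""
    children = np.concatenate([left, right])
    bilinear = np.einsum('i,kij,j->k', children, T, children)
    return np.tanh(W @ children + b + bilinear)

# Composing up a tiny parse tree ((a b) c): the fixed-length root vector is
# what a downstream classifier would use to predict entailment or contradiction.
a, b_, c = (rng.normal(size=d) for _ in range(3))
root = compose_rnn(compose_rnn(a, b_), c)
print(root.shape)   # (16,)
```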
Kolmogorov Complexity in perspective. Part II: Classification, Information Processing and Duality
We survey diverse approaches to the notion of information: from Shannon
entropy to Kolmogorov complexity. Two of the main applications of Kolmogorov
complexity are presented: randomness and classification. The survey is divided
into two parts, published in the same volume. Part II is dedicated to the relation
between logic and information systems, within the scope of Kolmogorov
algorithmic information theory. We present a recent application of Kolmogorov
complexity: classification using compression, an idea given a provocative
implementation by authors such as Bennett, Vitanyi and Cilibrasi. This stresses
how Kolmogorov complexity, besides being a foundation to randomness, is also
related to classification. Another approach to classification is also
considered: the so-called "Google classification". It uses another original and
attractive idea which is connected to classification using compression and
to Kolmogorov complexity from a conceptual point of view. We present and unify
these different approaches to classification in terms of Bottom-Up versus
Top-Down operational modes, of which we point out the fundamental principles and
the underlying duality. We look at the way these two dual modes are used in
different approaches to information systems, particularly the relational model
for databases introduced by Codd in the 1970s. This allows us to point out diverse
forms of a fundamental duality. These operational modes are also reinterpreted
in the context of the comprehension schema of axiomatic set theory ZF. This
leads us to develop how Kolmogorov complexity is linked to intensionality,
abstraction, classification and information systems.
Comment: 43 pages.
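A hedged sketch of the compression-based classification idea surveyed above, in the style of the normalized compression distance of Cilibrasi and Vitanyi (the code below is illustrative, using zlib as the stand-in compressor, and is not taken from the survey):

```python
import zlib

def c(data: bytes) -> int:
    """Compressed length, used as a computable stand-in for K(data)."""
    return len(zlib.compress(data, level=9))

def ncd(x: bytes, y: bytes) -> float:
    """NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y))."""
    cx, cy, cxy = c(x), c(y), c(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

# Toy usage: objects whose NCD is small share more (compressible) structure,
# which is what a compression-based classifier exploits.
s1 = b"the quick brown fox jumps over the lazy dog " * 20
s2 = b"the quick brown fox jumps over the lazy cat " * 20
s3 = b"colourless green ideas sleep furiously tonight " * 20
print(ncd(s1, s2))  # expected to be noticeably smaller ...
print(ncd(s1, s3))  # ... than this, so s1 and s2 would be grouped together
```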