A Lambda-Calculus Foundation for Universal Probabilistic Programming
We develop the operational semantics of an untyped probabilistic
lambda-calculus with continuous distributions, as a foundation for universal
probabilistic programming languages such as Church, Anglican, and Venture. Our
first contribution is to adapt the classic operational semantics of
lambda-calculus to a continuous setting by creating a measure space on terms
and defining step-indexed approximations. We prove equivalence of big-step and
small-step formulations of this distribution-based semantics. To move closer to
inference techniques, we also define the sampling-based semantics of a term as
a function from a trace of random samples to a value. We show that the
distribution induced by integrating over all traces equals the
distribution-based semantics. Our second contribution is to formalize the
implementation technique of trace Markov chain Monte Carlo (MCMC) for our
calculus and to show its correctness. A key step is defining sufficient
conditions for the distribution induced by trace MCMC to converge to the
distribution-based semantics. To the best of our knowledge, this is the first
rigorous correctness proof for trace MCMC for a higher-order functional
language.
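To make the sampling-based semantics concrete, here is a minimal Python sketch of a toy evaluator that maps a term and a trace of uniform draws to a value. The term encoding and the name eval_trace are illustrative stand-ins, not the paper's formalization, and Monte Carlo averaging over fresh traces stands in for the integral over all traces.

```python
import random

# Terms of a toy untyped probabilistic calculus, encoded as tuples:
# ('const', c) | ('var', x) | ('lam', x, body) | ('app', f, a) | ('sample',)
# 'sample' consumes the next uniform draw from the trace.

def eval_trace(term, env, trace):
    """Sampling-based semantics: map a term and a trace of random
    samples to a value, consuming the trace left to right."""
    tag = term[0]
    if tag == 'const':
        return term[1], trace
    if tag == 'var':
        return env[term[1]], trace
    if tag == 'lam':
        return ('closure', term[1], term[2], env), trace
    if tag == 'sample':
        return trace[0], trace[1:]          # consume one random sample
    if tag == 'app':                        # call-by-value application
        f, trace = eval_trace(term[1], env, trace)
        a, trace = eval_trace(term[2], env, trace)
        _, x, body, cenv = f
        return eval_trace(body, dict(cenv, **{x: a}), trace)
    raise ValueError(f'unknown term: {tag}')

# Integrating over all traces, approximated here by Monte Carlo averaging
# over fresh traces, recovers the distribution-based semantics.
term = ('app', ('lam', 'x', ('var', 'x')), ('sample',))
draws = [eval_trace(term, {}, [random.random()])[0] for _ in range(10_000)]
print(sum(draws) / len(draws))              # ~0.5, the mean of Uniform(0, 1)
```

Averaging many single-trace evaluations approximates the distribution-based semantics, which is exactly the equivalence the paper establishes.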
A lambda calculus for quantum computation with classical control
The objective of this paper is to develop a functional programming language
for quantum computers. We develop a lambda calculus for the classical control
model, following the first author's work on quantum flow-charts. We define a
call-by-value operational semantics, and we give a type system using affine
intuitionistic linear logic. The main results of this paper are the safety
properties of the language and the development of a type inference algorithm.
Comment: 15 pages, submitted to TLCA'05. Note: this is essentially the work
done during the first author's master's degree; his thesis can be found on his
webpage. Modifications: almost everything reformulated; recursion removed,
since the way it was stated did not satisfy Lemma 11; type inference algorithm
added; example of an implementation of quantum teleportation added.
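As a concrete illustration of the affine discipline such a type system enforces, the following Python sketch checks that no variable is used more than once, which is what statically rules out duplicating (cloning) a quantum value. The term encoding and the name affine_ok are hypothetical stand-ins, not the paper's type inference algorithm.

```python
# Affine use check: in an affine calculus a (quantum) variable may be
# used at most once, so a term that duplicates a qubit is rejected
# statically. Term encoding mirrors the sketch above; all names here
# are illustrative, not the paper's.

def affine_ok(term, used=None):
    """Return True iff no variable occurs more than once in the term."""
    used = set() if used is None else used
    tag = term[0]
    if tag == 'var':
        if term[1] in used:
            return False                 # second use: would clone the qubit
        used.add(term[1])
        return True
    if tag == 'lam':
        return affine_ok(term[2], used)
    if tag == 'app':                     # both subterms share one budget
        return affine_ok(term[1], used) and affine_ok(term[2], used)
    return True                          # constants, gates, etc.

print(affine_ok(('lam', 'q', ('var', 'q'))))                         # True
print(affine_ok(('lam', 'q', ('app', ('var', 'q'), ('var', 'q')))))  # False
```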
Kolmogorov Complexity in perspective. Part II: Classification, Information Processing and Duality
We survey diverse approaches to the notion of information: from Shannon
entropy to Kolmogorov complexity. Two of the main applications of Kolmogorov
complexity are presented: randomness and classification. The survey is divided
into two parts, published in the same volume. Part II is dedicated to the
relation between logic and information systems, within the scope of Kolmogorov
algorithmic information theory. We present a recent application of Kolmogorov
complexity: classification using compression, an idea provocatively
implemented by authors such as Bennett, Vitanyi, and Cilibrasi. This stresses
how Kolmogorov complexity, besides being a foundation for randomness, is also
related to classification. Another approach to classification is also
considered: the so-called "Google classification". It rests on another
original and attractive idea, connected to compression-based classification
and to Kolmogorov complexity from a conceptual point of view. We present and
unify these different approaches to classification in terms of Bottom-Up
versus Top-Down operational modes, whose fundamental principles and underlying
duality we point out. We look at the way these two dual modes are used in
different approaches to information systems, particularly the relational model
for databases introduced by Codd in the 1970s. This allows us to point out
diverse forms of a fundamental duality. These operational modes are also
reinterpreted in the context of the comprehension schema of axiomatic set
theory ZF. This leads us to show how Kolmogorov complexity is linked to
intensionality, abstraction, classification, and information systems.
Comment: 43 pages
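The compression-based classification the abstract refers to can be made concrete with the Normalized Compression Distance of Cilibrasi and Vitanyi, where a real compressor's output length approximates the incomputable Kolmogorov complexity. A minimal Python sketch follows, with a made-up corpus; the "Google classification" replaces compressed sizes with web page counts in the same formula.

```python
import zlib

# Classification by compression: approximate the incomputable Kolmogorov
# complexity C(x) by the length of a real compressor's output, then
# compare objects with the Normalized Compression Distance
#   NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)).

def C(x: bytes) -> int:
    return len(zlib.compress(x, 9))      # zlib as a stand-in compressor

def ncd(x: bytes, y: bytes) -> float:
    cx, cy = C(x), C(y)
    return (C(x + y) - min(cx, cy)) / max(cx, cy)

# Objects with small NCD share compressible structure and get grouped
# together, with no domain-specific features required. Corpus is made up.
a = b"the cat sat on the mat " * 20
b = b"the cat sat on the rug " * 20
c = b"int main(void) { return 0; } " * 20
print(ncd(a, b))   # small: near-duplicate texts
print(ncd(a, c))   # larger: unrelated texts
```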
Applying Formal Methods to Networking: Theory, Techniques and Applications
Despite its great importance, modern network infrastructure is remarkable for
the lack of rigor in its engineering. The Internet, which began as a research
experiment, was never designed to handle the users and applications it hosts
today. The lack of formalization of the Internet architecture meant limited
abstractions and modularity, especially for the control and management planes,
so every new need required a new protocol built from scratch. This led to an
unwieldy, ossified Internet architecture resistant to any attempts at formal
verification, and an Internet culture where expediency and pragmatism are
favored over formal correctness. Fortunately, recent work in the space of
clean-slate Internet design, especially the software-defined networking (SDN)
paradigm, offers the Internet community another chance to develop the right
kind of architecture and abstractions. This has also led to a great resurgence
of interest in applying formal methods to the specification, verification, and
synthesis of networking protocols and applications. In this paper, we present
a self-contained tutorial on the formidable body of work in formal methods and
a survey of its applications to networking.
Comment: 30 pages, submitted to IEEE Communications Surveys and Tutorials