Full abstraction for probabilistic PCF
We present a probabilistic version of PCF, a well-known simply typed
universal functional language. The type hierarchy is based on a single ground
type of natural numbers. Although the language is globally call-by-name, we
allow call-by-value evaluation for ground-type arguments in order to provide
the language with a suitable algorithmic expressiveness. We describe a
denotational semantics based on probabilistic coherence spaces, a model of
classical Linear Logic developed in previous works. We prove an adequacy and an
equational full abstraction theorem showing that equality in the model
coincides with a natural notion of observational equivalence.
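To fix intuitions, here is a minimal Python sketch of a probabilistic-PCF-like fragment over the natural-number ground type, with a fair-coin constant. The constructor names are illustrative rather than the paper's syntax, and sampling only approximates the distribution a denotational model assigns:

```python
import random
from dataclasses import dataclass

# A tiny probabilistic-PCF-like fragment (illustrative names, not the
# paper's syntax): a nat ground type, a fair coin, and call-by-name
# application via thunks.

@dataclass
class Num: n: int                      # numeral at ground type nat

@dataclass
class Coin: pass                       # reduces to 0 or 1, each with prob 1/2

@dataclass
class Succ: t: object

@dataclass
class Ifz: cond: object; z: object; s: object   # if cond = 0 then z else s

@dataclass
class Var: name: str

@dataclass
class Lam: name: str; body: object

@dataclass
class App: f: object; arg: object

def sample(t, env=None):
    """One probabilistic run of a term; repeated runs approximate the
    distribution the denotational semantics assigns to it."""
    env = env or {}
    if isinstance(t, Num):  return t.n
    if isinstance(t, Coin): return random.randint(0, 1)
    if isinstance(t, Succ): return sample(t.t, env) + 1
    if isinstance(t, Ifz):
        return sample(t.z if sample(t.cond, env) == 0 else t.s, env)
    if isinstance(t, Var):  return sample(*env[t.name])   # re-run the thunk
    if isinstance(t, Lam):  return (t, env)               # a closure
    if isinstance(t, App):
        f, fenv = sample(t.f, env)
        # call-by-name: the argument is passed as an unevaluated thunk, so a
        # probabilistic argument is re-sampled at every use; this is the
        # behaviour a call-by-value rule for ground-type arguments avoids
        return sample(f.body, {**fenv, f.name: (t.arg, env)})
    raise TypeError(t)

# if coin = 0 then 0 else succ 0: sampling it many times gives mean ~ 0.5
term = Ifz(Coin(), Num(0), Succ(Num(0)))
print(sum(sample(term) for _ in range(10_000)) / 10_000)
```

Note that under call-by-name a probabilistic argument is re-sampled at each use; allowing call-by-value for ground-type arguments lets a program sample once and reuse the outcome, which is one way to read the abstract's point about algorithmic expressiveness.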
Kolmogorov Complexity in perspective. Part II: Classification, Information Processing and Duality
We survey diverse approaches to the notion of information: from Shannon
entropy to Kolmogorov complexity. Two of the main applications of Kolmogorov
complexity are presented: randomness and classification. The survey is divided
into two parts published in the same volume. Part II is dedicated to the
relation between logic and information systems, within the scope of Kolmogorov
algorithmic information theory. We present a recent application of Kolmogorov
complexity: classification using compression, an idea given provocative
implementations by authors such as Bennett, Vitanyi and Cilibrasi. This stresses
how Kolmogorov complexity, besides being a foundation to randomness, is also
related to classification. Another approach to classification is also
considered: the so-called "Google classification". It uses another original and
attractive idea which is connected, from a conceptual point of view, to
classification using compression and to Kolmogorov complexity. We present and unify
these different approaches to classification in terms of Bottom-Up versus
Top-Down operational modes, whose fundamental principles and underlying
duality we point out. We look at the way these two dual modes are used in
different approaches to information systems, particularly the relational model
for databases introduced by Codd in the 1970s. This allows us to point out
diverse forms of a fundamental duality. These operational modes are also
reinterpreted in the context of the comprehension schema of axiomatic set
theory ZF. This leads us to develop how Kolmogorov complexity is linked to
intensionality, abstraction, classification and information systems.
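The classification-using-compression idea credited to Bennett, Vitanyi and Cilibrasi is commonly realized as the Normalized Compression Distance, which substitutes the output length of a real compressor for the uncomputable Kolmogorov complexity C(x). A minimal sketch, assuming zlib as the stand-in compressor:

```python
import zlib

def clen(data: bytes) -> int:
    """Length of the compressed data: a computable stand-in for K(x)."""
    return len(zlib.compress(data, 9))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized Compression Distance (Cilibrasi-Vitanyi):
    NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)).
    Values near 0 suggest the objects share most of their information."""
    cx, cy, cxy = clen(x), clen(y), clen(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

# Objects cluster by whatever shared structure the compressor can exploit:
t1 = b"the quick brown fox jumps over the lazy dog " * 20
t2 = b"the quick brown fox leaps over the lazy dog " * 20
t3 = bytes(range(256)) * 4
print(ncd(t1, t2))  # small: the two texts share almost all their structure
print(ncd(t1, t3))  # larger: little shared structure
```

Clustering the pairwise NCD matrix of a corpus is the parameter-free classification method the abstract alludes to; the "Google classification" replaces compressed lengths with search-engine hit counts in an analogous formula.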
A dependent nominal type theory
Nominal abstract syntax is an approach to representing names and binding
pioneered by Gabbay and Pitts. So far nominal techniques have mostly been
studied using classical logic or model theory, not type theory. Nominal
extensions to simple, dependent and ML-like polymorphic languages have been
studied, but decidability and normalization results have only been established
for simple nominal type theories. We present an LF-style dependent type theory
extended with name-abstraction types, prove soundness and decidability of
beta-eta-equivalence checking, discuss adequacy and canonical forms via an
example, and discuss extensions such as dependently-typed recursion and
induction principles.
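The nominal primitives behind name-abstraction types can be illustrated at the level of untyped syntax: the basic operation is name swapping, and alpha-equivalence of two abstractions is checked by swapping a fresh name into both bodies. A minimal Python sketch of that idea (the paper's dependent type theory is well beyond this fragment):

```python
import itertools

# Terms: ('var', a) | ('app', t, u) | ('abs', a, t), names as strings.
# Fresh names; assumes user-chosen names never look like "n<digits>".
fresh = (f"n{i}" for i in itertools.count())

def swap(a, b, t):
    """Apply the transposition (a b) to every name in t, bound or free.
    Swapping, not substitution, is the primitive of nominal syntax."""
    tag = t[0]
    if tag == 'var':
        c = t[1]
        return ('var', b if c == a else a if c == b else c)
    if tag == 'app':
        return ('app', swap(a, b, t[1]), swap(a, b, t[2]))
    c = t[1]
    return ('abs', b if c == a else a if c == b else c, swap(a, b, t[2]))

def alpha_eq(t, u):
    """[a]t =alpha [b]u iff (a c).t = (b c).u for a fresh name c."""
    if t[0] != u[0]:
        return False
    if t[0] == 'var':
        return t[1] == u[1]
    if t[0] == 'app':
        return alpha_eq(t[1], u[1]) and alpha_eq(t[2], u[2])
    c = next(fresh)                    # fresh for both terms by construction
    return alpha_eq(swap(t[1], c, t[2]), swap(u[1], c, u[2]))

# \x. x y and \z. z y are alpha-equivalent; \x. x and \x. y are not.
print(alpha_eq(('abs', 'x', ('app', ('var', 'x'), ('var', 'y'))),
               ('abs', 'z', ('app', ('var', 'z'), ('var', 'y')))))  # True
print(alpha_eq(('abs', 'x', ('var', 'x')),
               ('abs', 'x', ('var', 'y'))))                         # False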
Logic programming in the context of multiparadigm programming: the Oz experience
Oz is a multiparadigm language that supports logic programming as one of its
major paradigms. A multiparadigm language is designed to support different
programming paradigms (logic, functional, constraint, object-oriented,
sequential, concurrent, etc.) with equal ease. This article has two goals: to
give a tutorial of logic programming in Oz and to show how logic programming
fits naturally into the wider context of multiparadigm programming. Our
experience shows that there are two classes of problems, which we call
algorithmic and search problems, for which logic programming can help formulate
practical solutions. Algorithmic problems have known efficient algorithms.
Search problems do not have known efficient algorithms but can be solved with
search. The Oz support for logic programming targets these two problem classes
specifically, using the concepts needed for each. This is in contrast to the
Prolog approach, which targets both classes with a single set of concepts,
resulting in less than optimal support for each class. To explain the essential
difference between algorithmic and search programs, we define the Oz execution
model. This model subsumes both concurrent logic programming
(committed-choice-style) and search-based logic programming (Prolog-style).
Instead of Horn clause syntax, Oz has a simple, fully compositional,
higher-order syntax that accommodates the abilities of the language. We
conclude with lessons learned from this work, a brief history of Oz, and many
entry points into the Oz literature.
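The algorithmic/search distinction can be made concrete with a small example. Oz reifies search as encapsulated, first-class search engines; in the Python sketch below, offered only as an analogy for that idea, a generator plays the engine's role and yields the solutions of a relation on demand:

```python
def queens(n):
    """Enumerate n-queens solutions: a search problem in the abstract's
    sense, with no known efficient algorithm, solved by backtracking.
    The generator encapsulates the search the way an Oz search engine
    does: the caller pulls solutions lazily instead of committing to one."""
    def place(cols):
        row = len(cols)
        if row == n:
            yield tuple(cols)
            return
        for col in range(n):
            # keep the new queen off every occupied column and diagonal
            if all(col != c and abs(col - c) != row - r
                   for r, c in enumerate(cols)):
                yield from place(cols + [col])
    yield from place([])

solutions = queens(8)
print(next(solutions))                 # first solution, computed on demand
print(sum(1 for _ in solutions) + 1)   # 92 solutions in total
```

An algorithmic problem, by contrast, would be written as an ordinary deterministic function; the abstract's point is that Oz provides separate concepts suited to each class rather than one mechanism for both.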
A knowledge-based approach to VLSI-design in an open CAD-environment
A knowledge-based approach is suggested to assist a designer in the increasingly complex task of generating VLSI chips from abstract, high-level specifications of the system. The complexity of designing VLSI circuits has reached a level where computer-based assistance has become indispensable. Not all design tasks allow for algorithmic solutions. AI techniques can be used to support the designer with computer-aided tools for tasks not suited to algorithmic approaches. The approach described in this paper is based upon the underlying characteristics of VLSI design processes in general, comprising all stages of the design. A universal model is presented, accompanied by a recording method for the acquisition of design knowledge, both strategic and task-specific, in terms of the design actions involved and their effects on the design itself. This method is illustrated by a simple design example: the implementation of the logical EXOR component. Finally, suggestions are made for obtaining a universally usable architecture of a knowledge-based system for VLSI design.
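The recording method described here, capturing design actions together with their effects on the design, can be pictured as an append-only session log that a knowledge-based tool later queries for strategic and task-specific knowledge. The Python sketch below is hypothetical; every name in it is invented for illustration, since the abstract does not specify the representation:

```python
from dataclasses import dataclass, field

@dataclass
class DesignAction:
    task: str        # e.g. "logic synthesis", "floorplanning" (invented labels)
    action: str      # what the designer did
    effect: str      # the observed effect on the design state

@dataclass
class DesignLog:
    """Append-only record of a design session; replaying or querying it is
    one way a knowledge-based tool could acquire design knowledge."""
    steps: list = field(default_factory=list)

    def record(self, task, action, effect):
        self.steps.append(DesignAction(task, action, effect))

    def by_task(self, task):
        return [s for s in self.steps if s.task == task]

# The EXOR example from the abstract, as such a log might capture it:
log = DesignLog()
log.record("specification",
           "define EXOR as (a AND NOT b) OR (NOT a AND b)",
           "behavioural spec fixed")
log.record("logic synthesis",
           "map EXOR onto a NAND-only netlist",
           "4-gate implementation")
print(len(log.by_task("logic synthesis")))  # 1
```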
On Equivalence and Canonical Forms in the LF Type Theory
Decidability of definitional equality and conversion of terms into canonical
form play a central role in the meta-theory of a type-theoretic logical
framework. Most studies of definitional equality are based on a confluent,
strongly-normalizing notion of reduction. Coquand has considered a different
approach, directly proving the correctness of a practical equivalence algorithm
based on the shape of terms. Neither approach appears to scale well to richer
languages with unit types or subtyping, and neither directly addresses the
problem of conversion to canonical form.
In this paper we present a new, type-directed equivalence algorithm for the
LF type theory that overcomes the weaknesses of previous approaches. The
algorithm is practical, scales to richer languages, and yields a new notion of
canonical form sufficient for adequate encodings of logical systems. The
algorithm is proved complete by a Kripke-style logical relations argument
similar to that suggested by Coquand. Crucially, both the algorithm itself and
the logical relations rely only on the shapes of types, ignoring dependencies
on terms.
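The shape-directed idea can be illustrated on a simply typed fragment: at function type the algorithm compares eta-expansions, and at base type it compares weak head normal forms structurally. The Python sketch below makes those simplifications explicit (a real LF checker first erases term dependencies from types, which this skips):

```python
import itertools

# Types: ('base',) | ('arrow', A, B).
# Terms: ('var', x) | ('lam', x, t) | ('app', t, u).
fresh = (f"_x{i}" for i in itertools.count())

def subst(t, x, u):
    """Replace x by u in t. Naive (not capture-avoiding); adequate here
    because we only ever substitute fresh variables in the example."""
    if t[0] == 'var':
        return u if t[1] == x else t
    if t[0] == 'lam':
        return t if t[1] == x else ('lam', t[1], subst(t[2], x, u))
    return ('app', subst(t[1], x, u), subst(t[2], x, u))

def whnf(t):
    """Weak head normal form: contract head beta-redexes only."""
    if t[0] == 'app':
        f = whnf(t[1])
        if f[0] == 'lam':
            return whnf(subst(f[2], f[1], t[2]))
        return ('app', f, t[2])
    return t

def equiv(t, u, ty):
    """Type-directed: at arrow type, eta-expand both sides with the same
    fresh variable; at base type, compare weak head normal forms."""
    if ty[0] == 'arrow':
        x = ('var', next(fresh))
        return equiv(('app', t, x), ('app', u, x), ty[2])
    t, u = whnf(t), whnf(u)
    if t[0] != u[0]:
        return False
    if t[0] == 'var':
        return t[1] == u[1]
    if t[0] == 'app':
        # structural comparison of neutral terms; argument types are not
        # tracked in this sketch, so arguments are compared at base type
        return equiv(t[1], u[1], ('base',)) and equiv(t[2], u[2], ('base',))
    return False

# eta: \f. f == \f. \x. f x at type (base -> base) -> (base -> base)
A = ('base',)
T = ('arrow', ('arrow', A, A), ('arrow', A, A))
print(equiv(('lam', 'f', ('var', 'f')),
            ('lam', 'f', ('lam', 'x', ('app', ('var', 'f'), ('var', 'x')))),
            T))  # True
```

Nothing in `equiv` ever inspects a term inside a type, which is the sketch-level analogue of the paper's observation that the algorithm and the logical relations can rely on type shapes alone.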
GoFFish: A Sub-Graph Centric Framework for Large-Scale Graph Analytics
Large scale graph processing is a major research area for Big Data
exploration. Vertex centric programming models like Pregel are gaining traction
due to their simple abstraction, which naturally allows for scalable execution
on distributed systems. However, there are limitations to this approach which
cause vertex centric algorithms to under-perform due to a poor
compute-to-communication ratio and slow convergence over iterative supersteps.
In this paper we introduce GoFFish, a scalable sub-graph centric framework
co-designed with distributed persistent graph storage for large-scale graph
analytics on commodity clusters. We introduce a sub-graph centric programming
abstraction that combines the scalability of a vertex centric approach with the
flexibility of shared memory sub-graph computation. We map Connected
Components, SSSP and PageRank algorithms to this model to illustrate its
flexibility. Further, we empirically analyze GoFFish using several real world
graphs and demonstrate its significant performance improvement, orders of
magnitude in some cases, compared to Apache Giraph, the leading open source
vertex centric implementation.
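To make the sub-graph centric abstraction concrete (this sketch is not GoFFish's API), the Python below runs connected components the way such a framework might: each partition processes its entire subgraph at shared-memory speed within one superstep, and only labels on partition-cut edges are reconciled between supersteps:

```python
def connected_components(partitions, edges):
    """partitions: list of vertex sets; edges: iterable of (u, v) pairs.
    Labels converge to the minimum vertex id in each component."""
    label = {v: v for part in partitions for v in part}
    while True:                          # each outer pass = one superstep
        # local phase: each partition solves its whole subgraph to a fixed
        # point in memory, the flexibility a sub-graph centric model adds
        for part in partitions:
            stable = False
            while not stable:
                stable = True
                for u, v in edges:
                    if u in part and v in part and label[u] != label[v]:
                        label[u] = label[v] = min(label[u], label[v])
                        stable = False
        # communication phase: reconcile labels across partition-cut edges
        # (message exchange in a real distributed run; direct here)
        changed = False
        for u, v in edges:
            if label[u] != label[v]:
                label[u] = label[v] = min(label[u], label[v])
                changed = True
        if not changed:                  # no cross-partition delta: converged
            return label

# Two partitions bridged by a single cut edge converge in a few supersteps:
parts = [{0, 1, 2}, {3, 4, 5}]
print(connected_components(parts, [(0, 1), (1, 2), (3, 4), (4, 5), (2, 3)]))
```

Because each local phase runs to a fixed point, a label can cross an entire partition in a single superstep rather than one hop per superstep, which is where the convergence advantage over vertex centric models comes from.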