Extending the Calculus of Constructions with Tarski's fix-point theorem
We propose to use Tarski's least fixpoint theorem as a basis to define
recursive functions in the calculus of inductive constructions. This widens the
class of functions that can be modeled in type-theory-based theorem proving
tools to potentially non-terminating functions. This is only possible if we
extend the logical framework by adding the axioms that correspond to classical
logic. We claim that the extended framework makes it possible to reason about
terminating and non-terminating computations and we show that common facilities
of the calculus of inductive constructions, such as program extraction, can be
extended to also handle the new functions.
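As a rough illustration of the least-fixpoint view of recursion (not the paper's Coq construction), the following Python sketch computes McCarthy's 91 function as the least fixpoint of its defining functional by Kleene iteration over finite partial functions; the functional F and the domain bound 200 are purely illustrative choices.

```python
def F(f):
    """One application of the functional whose least fixpoint is McCarthy's 91
    function; f is a dict acting as a finite partial function."""
    g = {}
    for n in range(200):                 # illustrative finite fragment of the domain
        if n > 100:
            g[n] = n - 10
        else:
            inner = f.get(n + 11)        # f(n + 11), if already defined
            if inner is not None and inner in f:
                g[n] = f[inner]          # f(f(n + 11))
    return g

# Kleene iteration: start from the nowhere-defined function and apply F until
# nothing new gets defined; the result approximates the least fixpoint.
f = {}
while (g := F(f)) != f:
    f = g
print(f[0], f[50], f[100])               # 91 91 91
```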
Inductive and Coinductive Components of Corecursive Functions in Coq
In Constructive Type Theory, recursive and corecursive definitions are
subject to syntactic restrictions which guarantee termination for recursive
functions and productivity for corecursive functions. However, many terminating
and productive functions do not pass the syntactic tests. Bove proposed in her
thesis an elegant reformulation of the method of accessibility predicates that
widens the range of terminating recursive functions formalisable in
Constructive Type Theory. In this paper, we pursue the same goal for productive
corecursive functions. Notably, our method of formalisation of coinductive
definitions of productive functions in Coq requires not only the use of ad-hoc
predicates, but also a systematic algorithm that separates the inductive and
coinductive parts of functions. Comment: In Coalgebraic Methods in Computer Science (2008).
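The inductive/coinductive split can be pictured outside Coq with a stream filter, a standard example of a corecursive function that is not syntactically guarded; the Python generator below is only an analogy for the idea, not the paper's formalisation.

```python
from itertools import count

def filter_stream(pred, stream):
    """Corecursive filter on an infinite stream.
    The outer loop is the coinductive part: it keeps producing elements forever.
    The inner search is the inductive part: each output needs a finite search,
    which is exactly the productivity obligation (it holds only if pred is
    satisfied infinitely often in the input stream)."""
    while True:
        x = next(stream)
        while not pred(x):
            x = next(stream)
        yield x

evens = filter_stream(lambda n: n % 2 == 0, count(0))
print([next(evens) for _ in range(5)])   # [0, 2, 4, 6, 8]
```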
Computer theorem proving in math
We give an overview of issues surrounding computer-verified theorem proving
in the standard pure-mathematical context. This is based on my talk at the PQR
conference (Brussels, June 2003).
Anticipatory Semantic Processes
Why do anticipatory processes correspond to the cognitive abilities of living systems? To be adapted to an environment, behaviors need at least i) internal representations of events occurring in the external environment; and ii) internal anticipations of events that may occur in the external environment. Interactions of these two opposite but complementary cognitive properties lead to various patterns of experimental data on semantic processing.
How can dynamic semantic processes be investigated? Experimental studies in cognitive psychology offer several advantages, such as: i) control of the semantic environment, for example words embedded in sentences; ii) methodological tools allowing the observation of anticipations and of adapted oculomotor behavior during reading; and iii) the analysis of different anticipatory processes within the theoretical framework of semantic processing.
What are the different types of semantic anticipations? Experimental data show that semantic anticipatory processes involve i) the coding in memory of sequences of words occurring in textual environments; ii) the anticipation of possible future words from currently perceived words; and iii) the selection of anticipated words as a function of the sequences of perceived words, achieved by anticipatory activations and inhibitory selection processes.
How can anticipatory semantic processes be modeled? Localist or distributed neural network models can account for some types of semantic processes, anticipatory or not. Attractor neural networks coding temporal sequences are presented as good candidates for modeling anticipatory semantic processes, owing to specific properties of the human brain such as i) auto-associative memory; ii) learning and memorization of sequences of patterns; and iii) anticipation of memorized patterns from previously perceived patterns.
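A toy numpy sketch of the kind of attractor network described above (illustrative only; the sizes, the gain lam, and the synchronous update rule are assumptions, not the authors' model): symmetric Hebbian weights make each pattern an attractor, while an asymmetric term pushes the state toward the next pattern of the memorized sequence, i.e. anticipates it.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 5                                  # neurons, patterns in the sequence
xi = rng.choice([-1, 1], size=(P, N))          # random binary patterns

W_auto = xi.T @ xi / N                         # auto-associative (symmetric) weights
W_seq = xi[1:].T @ xi[:-1] / N                 # sequence weights: pattern k -> pattern k+1

def step(s, lam=2.0):
    """Synchronous update: the symmetric part stabilises the current pattern,
    the asymmetric part (gain lam) anticipates the next one."""
    return np.sign(W_auto @ s + lam * (W_seq @ s))

s = xi[0].astype(float)
for t in range(6):
    s = step(s)
    print(t, np.round(xi @ s / N, 2))          # overlap with each stored pattern
```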
The union of unit balls has quadratic complexity, even if they all contain the origin
We provide a lower bound construction showing that the union of unit balls in
three-dimensional space has quadratic complexity, even if they all contain the
origin. This settles a conjecture of Sharir. Comment: 5 pages, 5 figures.
Evaluation of the quantitative prediction of a trend reversal on the Japanese stock market in 1999
In January 1999, the authors published a quantitative prediction that the
Nikkei index should recover from its 14-year low in January 1999 and reach
a year later. The purpose of the present paper is to evaluate
the performance of this specific prediction as well as the underlying model:
the forecast, performed at a time when the Nikkei was at its lowest (as we can
now judge in hindsight), has correctly captured the change of trend as well as
the quantitative evolution of the Nikkei index since its inception. As the
change of trend from sluggish to recovery was estimated quite unlikely by many
observers at that time, a Bayesian analysis shows that a skeptical (resp.
neutral) Bayesian sees her prior belief in our model amplified into a posterior
belief 19 times larger (resp. reaching the 95% level). Comment: 6 pages including 2 figures.
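The abstract's two figures are consistent with a single Bayes factor of roughly 19 in favour of the model; the snippet below just works through that reading (the value 19 and the example priors are inferred here, not quoted from the paper).

```python
def posterior(prior, bayes_factor):
    """Posterior probability after multiplying the prior odds by a Bayes factor."""
    odds = bayes_factor * prior / (1 - prior)
    return odds / (1 + odds)

print(posterior(0.5, 19))     # "neutral" prior 0.5 -> 0.95
print(posterior(0.001, 19))   # small "skeptical" prior -> about 0.019, roughly 19x larger
```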
On-line list colouring of random graphs
In this paper, the on-line list colouring of binomial random graphs G(n,p) is
studied. We show that the on-line choice number of G(n,p) is asymptotically
almost surely asymptotic to the chromatic number of G(n,p), provided that the
average degree d = p(n-1) tends to infinity faster than
(log log n)^(1/3) (log n)^2 n^(2/3). For sparser graphs, we are slightly less successful; we show that
if d>(log n)^(2+epsilon) for some epsilon>0, then the on-line choice number is
larger than the chromatic number by at most a multiplicative factor of C, where
C in [2,4], depending on the range of d. Also, for d=O(1), the on-line choice
number is larger than the chromatic number by at most a multiplicative constant
factor.
Improved Incremental Randomized Delaunay Triangulation
We propose a new data structure to compute the Delaunay triangulation of a
set of points in the plane. It combines good worst case complexity, fast
behavior on real data, and small memory occupation.
The location structure is organized into several levels. The lowest level
just consists of the triangulation; then, each level contains the triangulation
of a small sample of the level below. Point location is done by marching in a
triangulation to determine the nearest neighbor of the query at that level,
then the march restarts from that neighbor at the level below. Using a small
sample (3%) allows a small memory occupation; the march and the use of the
nearest neighbor to change levels quickly locate the query. Comment: 19 pages, 7 figures. Proc. 14th Annu. ACM Sympos. Comput. Geom., 106--115, 1998.
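A minimal sketch of the hierarchy idea, using scipy's batch Delaunay triangulation instead of the paper's incremental structure (the 3% sampling rate comes from the abstract; the function names, level cut-off and nearest-neighbour query below are illustrative): each level is a random sample of the one below, and a query walks greedily in the Delaunay graph of each level, starting from the answer found at the coarser level.

```python
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(1)

def build_hierarchy(points, keep=0.03, min_size=32):
    """Nested random samples; level 0 is the full point set, each higher level
    keeps roughly 3% of the level below. Every level gets its own triangulation
    (recomputed here for simplicity; the paper maintains them incrementally)."""
    levels = [np.arange(len(points))]
    while len(levels[-1]) * keep >= min_size:
        mask = rng.random(len(levels[-1])) < keep
        levels.append(levels[-1][mask])
    return levels, [Delaunay(points[idx]) for idx in levels]

def nearest(points, levels, tris, q):
    """Nearest neighbour of q: brute force at the coarsest level, then at each
    finer level walk in the Delaunay graph toward q, starting from the coarser
    level's answer, until no neighbour is closer."""
    top = levels[-1]
    hint = top[np.argmin(np.linalg.norm(points[top] - q, axis=1))]
    for lvl in range(len(levels) - 2, -1, -1):
        idx, tri = levels[lvl], tris[lvl]
        indptr, nbrs = tri.vertex_neighbor_vertices
        cur = int(np.searchsorted(idx, hint))        # hint's index inside this level
        d_cur = np.linalg.norm(points[idx[cur]] - q)
        improved = True
        while improved:
            improved = False
            for nb in nbrs[indptr[cur]:indptr[cur + 1]]:
                d = np.linalg.norm(points[idx[nb]] - q)
                if d < d_cur:
                    cur, d_cur, improved = int(nb), d, True
                    break
        hint = idx[cur]
    return int(hint)

pts = rng.random((5000, 2))
levels, tris = build_hierarchy(pts)
q = np.array([0.3, 0.7])
# should agree with the brute-force nearest neighbour
print(nearest(pts, levels, tris, q), int(np.argmin(np.linalg.norm(pts - q, axis=1))))
```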
Continued Fraction Expansion of Real Roots of Polynomial Systems
We present a new algorithm for isolating the real roots of a system of
multivariate polynomials, given in the monomial basis. It is inspired by
existing subdivision methods in the Bernstein basis; it can be seen as a
generalization of the univariate continued fraction algorithm or, alternatively,
as a full analogue of Bernstein subdivision in the monomial basis. The
representation of the subdivided domains is done through homographies, which
allows us to use only integer arithmetic and to treat efficiently unbounded
regions. We use univariate bounding functions, projection and preconditioning
techniques to reduce the domain of search. The resulting boxes have optimized
rational coordinates, corresponding to the first terms of the continued
fraction expansion of the real roots. An extension of Vincent's theorem to
multivariate polynomials is proved and used for the termination of the
algorithm. New complexity bounds are provided for a simplified version of the
algorithm. Examples computed with a preliminary C++ implementation illustrate
the approach. Comment: 10 pages.
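For orientation, here is a univariate sketch of the continued fraction (Vincent-style) isolation that the paper generalizes; it assumes a squarefree integer polynomial with a nonzero constant term, only isolates the positive roots, and omits the lower-bound shifts and preconditioning the paper uses for efficiency.

```python
from fractions import Fraction

def sign_variations(p):
    """Number of sign changes in the coefficient sequence (Descartes' rule)."""
    s = [c for c in p if c != 0]
    return sum(1 for a, b in zip(s, s[1:]) if a * b < 0)

def shift1(p):
    """Coefficients of p(x + 1); p is listed from the constant term upward."""
    q, n = list(p), len(p) - 1
    for i in range(n):
        for j in range(n - 1, i - 1, -1):
            q[j] += q[j + 1]
    return q

def invert(p):
    """Coefficients of (x + 1)^deg(p) * p(1 / (x + 1))."""
    return shift1(list(reversed(p)))

def isolate_positive(p, M=(1, 0, 0, 1)):
    """Isolate the positive real roots of a squarefree integer polynomial p with
    p(0) != 0.  M = (a, b, c, d) is the homography (a*x + b)/(c*x + d) mapping
    the current variable back to the original one.  Returns exact roots and open
    isolating intervals; hi = None encodes +infinity."""
    a, b, c, d = M
    v = sign_variations(p)
    if v == 0:
        return []
    if v == 1:                                   # exactly one root in the image of (0, inf)
        lo, hi = Fraction(b, d), (Fraction(a, c) if c else None)
        if hi is not None and hi < lo:
            lo, hi = hi, lo
        return [("interval", lo, hi)]
    out = []
    # split the image of (0, inf) at the image of 1, i.e. at (a + b)/(c + d)
    p_right = shift1(p)                          # covers (1, inf)
    if p_right[0] == 0:                          # the split point itself is a root
        out.append(("root", Fraction(a + b, c + d)))
        while p_right[0] == 0:
            p_right = p_right[1:]
    out += isolate_positive(p_right, (a, a + b, c, c + d))
    p_left = invert(p)                           # covers (0, 1)
    while p_left and p_left[0] == 0:             # same split-point root, already reported
        p_left = p_left[1:]
    out += isolate_positive(p_left, (b, a + b, d, c + d))
    return out

# (x - 1)(x - 2)(x - 3) = x^3 - 6x^2 + 11x - 6, coefficients constant-first:
# reports the exact roots 1 and 2 and an open interval isolating the root 3.
print(isolate_positive([-6, 11, -6, 1]))
```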
Adaptive modeling of shallow fully nonlinear gravity waves
This paper presents an extended version of the celebrated Serre-Green-Naghdi
(SGN) system. This extension is based on the well-known Bona-Smith-Nwogu trick
which aims to improve the linear dispersion properties. We show that in the
fully nonlinear setting it results in a modification of the vertical
acceleration. Although this technique is well known, the effect of this
modification on the nonlinear properties of the model is not clear. The first goal of this study is
to shed some light on the properties of solitary waves, as the most important
class of nonlinear permanent solutions. Then, we propose a simple adaptive
strategy to choose the optimal value of the free parameter at every instant of
time. This strategy is validated by comparing the model prediction with the
reference solutions of the full Euler equations and its classical counterpart.
Numerical simulations show that the new adaptive model provides a much better
accuracy for the same computational complexity. Comment: 19 pages, 8 figures, 2 tables, 45 references. Some typos were corrected. The author's other papers can be downloaded at http://www.denys-dutykh.com