Metallicity determination in gas-rich galaxies with semiempirical methods
A study of the precision of the semiempirical methods used to determine
chemical abundances in gas-rich galaxies is carried out. To this end, the
oxygen abundances of a total of 438 galaxies were determined using the
electron temperature, the and the P methods. The new calibration of the P
method gives the smallest dispersion in the low- and high-metallicity regions,
while the best numbers in the turnaround region are given by the method. We
also find that the dispersion correlates with the metallicity. Finally, all
the semiempirical methods studied here are quite insensitive to metallicity,
with a value of dex for more than 50% of the total sample.
\keywords{ISM: abundances; (ISM): H {\sc ii} regions}
Comment: 26 pages, 9 figures and 2 tables. To appear in AJ, January 200
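The comparison above pits strong-line calibrations against direct, electron-temperature-based abundances. As an illustrative sketch only (the calibration coefficients and function names below are placeholders, not the paper's actual fits), the shape of such a comparison can be written as:

```python
# Illustrative sketch: coefficients are hypothetical placeholders,
# NOT the paper's actual P-method calibration.
import statistics

def excitation_P(R3, R2):
    """Excitation parameter P = R3 / (R2 + R3), with R2 and R3 the usual
    [O II] and [O III] line ratios relative to H-beta (hypothetical inputs)."""
    return R3 / (R2 + R3)

def oh_strong_line(R2, R3, a=9.0, b=-0.8):
    """Hypothetical strong-line estimate of 12 + log(O/H),
    here a simple linear function of the excitation parameter P."""
    return a + b * excitation_P(R3, R2)

def dispersion(estimates, te_abundances):
    """Scatter of a strong-line method against the T_e-based ('direct')
    abundances -- the quantity the study compares across methods."""
    residuals = [e - t for e, t in zip(estimates, te_abundances)]
    return statistics.pstdev(residuals)
```

A method with zero residual scatter against the direct abundances would have `dispersion(...) == 0.0`; the study's point is that the scatter differs by metallicity regime.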
A Purely Functional Computer Algebra System Embedded in Haskell
We demonstrate how methods in Functional Programming can be used to implement
a computer algebra system. As a proof-of-concept, we present the
computational-algebra package. It is a computer algebra system implemented as
an embedded domain-specific language in Haskell, a purely functional
programming language. Utilising methods in functional programming and prominent
features of Haskell, this library achieves safety, composability, and
correctness at the same time. To demonstrate the advantages of our approach, we
have implemented advanced Gr\"{o}bner basis algorithms, such as Faug\`{e}re's
and , in a composable way.
Comment: 16 pages. Accepted to CASC 201
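The package itself is a Haskell EDSL. As a language-neutral analogy for the embedding idea (host-language operators carrying algebraic meaning, so algebra expressions are ordinary programs), here is a hedged Python sketch of univariate polynomial arithmetic via operator overloading; it is not the computational-algebra package's API:

```python
# Toy embedded algebra DSL via operator overloading -- an analogy for the
# EDSL approach, not the computational-algebra package's actual API.
class Poly:
    def __init__(self, coeffs):
        # map exponent -> coefficient, dropping zeros for a canonical form
        self.c = {e: k for e, k in coeffs.items() if k != 0}

    def __add__(self, other):
        out = dict(self.c)
        for e, k in other.c.items():
            out[e] = out.get(e, 0) + k
        return Poly(out)

    def __mul__(self, other):
        out = {}
        for e1, k1 in self.c.items():
            for e2, k2 in other.c.items():
                out[e1 + e2] = out.get(e1 + e2, 0) + k1 * k2
        return Poly(out)

    def __eq__(self, other):
        return self.c == other.c

x = Poly({1: 1})     # the indeterminate
one = Poly({0: 1})   # the constant 1
```

In the embedded setting, `(x + one) * (x + one)` is simultaneously a host-language expression and an algebra computation; the Haskell version adds the type-level safety the abstract emphasises.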
Bohrification of operator algebras and quantum logic
Following Birkhoff and von Neumann, quantum logic has traditionally been
based on the lattice of closed linear subspaces of some Hilbert space, or, more
generally, on the lattice of projections in a von Neumann algebra A.
Unfortunately, the logical interpretation of these lattices is impaired by
their nondistributivity and by various other problems. We show that a possible
resolution of these difficulties, suggested by the ideas of Bohr, emerges if
instead of single projections one considers elementary propositions to be
families of projections indexed by a partially ordered set C(A) of appropriate
commutative subalgebras of A. In fact, to achieve both maximal generality and
ease of use within topos theory, we assume that A is a so-called Rickart
C*-algebra and that C(A) consists of all unital commutative Rickart
C*-subalgebras of A. Such families of projections form a Heyting algebra in a
natural way, so that the associated propositional logic is intuitionistic:
distributivity is recovered at the expense of the law of the excluded middle.
Subsequently, generalizing an earlier computation for n-by-n matrices, we
prove that the Heyting algebra thus associated to A arises as a basis for the
internal Gelfand spectrum (in the sense of Banaschewski-Mulvey) of the
"Bohrification" of A, which is a commutative Rickart C*-algebra in the topos of
functors from C(A) to the category of sets. We explain the relationship of this
construction to partial Boolean algebras and Bruns-Lakser completions. Finally,
we establish a connection between probability measures on the lattice of
projections on a Hilbert space H and probability valuations on the internal
Gelfand spectrum of A for A = B(H).
Comment: 31 pages
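The logical trade-off in the abstract -- distributivity recovered, excluded middle lost -- can be checked by brute force on the simplest nontrivial Heyting algebra. The following sketch uses the three-element chain 0 < 1 < 2 (a toy example, unrelated to any particular operator algebra):

```python
# In a Heyting algebra, distributivity holds but the law of the excluded
# middle can fail.  Toy check on the three-element chain 0 < 1 < 2.
H = [0, 1, 2]
TOP, BOT = 2, 0
meet, join = min, max

def imp(a, b):
    # Heyting implication on a chain: a -> b is top when a <= b, else b
    return TOP if a <= b else b

def neg(a):
    return imp(a, BOT)

distributive = all(
    meet(a, join(b, c)) == join(meet(a, b), meet(a, c))
    for a in H for b in H for c in H
)
lem_holds = all(join(a, neg(a)) == TOP for a in H)
```

Here `distributive` is true while `lem_holds` is false: the middle element 1 has negation 0, so 1 ∨ ¬1 = 1 ≠ ⊤, exactly the intuitionistic behaviour described above.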
A new foundational crisis in mathematics, is it really happening?
The article reconsiders the position of the foundations of mathematics after
the discovery of HoTT. The discussion that this discovery has generated among
mathematicians, philosophers, and computer scientists might indicate a new
crisis in the foundations of mathematics. By examining the mathematical facts
behind HoTT and their relation to the existing foundations, we conclude that
the present crisis is not a genuine one. We reiterate a
pluralist vision of the foundations of mathematics. The article contains a
short survey of the mathematical and historical background needed to understand
the main tenets of the foundational issues.
Comment: Final version
Forcing-based cut-elimination for Gentzen-style intuitionistic sequent calculus
We give a simple intuitionistic completeness proof of Kripke semantics for intuitionistic logic with implication and universal quantification with respect to cut-free intuitionistic sequent calculus. The Kripke semantics is ``simplified'' in the sense that the domain remains constant. The proof has been formalised in the Coq proof assistant, and by combining soundness with completeness we obtain an executable cut-elimination procedure. The proof extends easily to the case of the absurdity connective, using Kripke models with exploding nodes à la Veldman.
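The forcing relation at the heart of this Kripke semantics is easy to state for the implicational fragment. Below is a toy Python checker (not the paper's Coq development) over a two-world model with a monotone valuation; it witnesses the intuitionistic failure of Peirce's law:

```python
# Toy forcing checker for the implicational fragment of Kripke semantics
# (illustrative only; not the paper's Coq formalisation).
# Formulas: an atom is a string; ('->', A, B) is an implication.
# Worlds: 0 <= 1; the valuation must be monotone along <=.
ge = {0: [0, 1], 1: [1]}            # worlds reachable from each world
val = {'a': {1}, 'b': set()}        # atom 'a' becomes true only at world 1

def forces(w, f):
    if isinstance(f, str):
        return w in val[f]
    _, A, B = f
    # w forces A -> B iff every future world forcing A also forces B
    return all(not forces(v, A) or forces(v, B) for v in ge[w])

peirce = ('->', ('->', ('->', 'a', 'b'), 'a'), 'a')
```

World 0 forces `a -> a` but not Peirce's law, matching the fact that the classical tautology is underivable in the cut-free intuitionistic calculus.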
A Focused Sequent Calculus Framework for Proof Search in Pure Type Systems
Basic proof-search tactics in logic and type theory can be seen as the
root-first applications of rules in an appropriate sequent calculus, preferably
without the redundancies generated by permutation of rules. This paper
addresses the issues of defining such sequent calculi for Pure Type Systems
(PTS, which were originally presented in natural deduction style) and then
organizing their rules for effective proof-search. We introduce the idea of
Pure Type Sequent Calculus with meta-variables (PTSCalpha), by enriching the
syntax of a permutation-free sequent calculus for propositional logic due to
Herbelin, which is strongly related to natural deduction and already well
adapted to proof-search. The operational semantics is adapted from Herbelin's
and is defined by a system of local rewrite rules as in cut-elimination, using
explicit substitutions. We prove confluence for this system. Restricting our
attention to PTSC, a type system for the ground terms of this system, we obtain
the Subject Reduction property and show that each PTSC is logically equivalent
to its corresponding PTS, and the former is strongly normalising iff the latter
is. We show how to make the logical rules of PTSC into a syntax-directed system
PS for proof-search, by incorporating the conversion rules as in
syntax-directed presentations of the PTS rules for type-checking. Finally, we
consider how to use the explicitly scoped meta-variables of PTSCalpha to
represent partial proof-terms, and use them to analyse interactive proof
construction. This sets up a framework PE in which we are able to study
proof-search strategies, type inhabitant enumeration and (higher-order)
unification.
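The idea of root-first proof search in a permutation-free sequent calculus can be seen in miniature on the implicational fragment of propositional intuitionistic logic. The sketch below follows the contraction-free style of Dyckhoff's LJT, a far simpler relative of the PTSC calculi above (it is not the paper's system):

```python
# Tiny root-first prover for implicational intuitionistic logic in the
# contraction-free style of Dyckhoff's LJT -- an illustrative relative of
# the PTSC calculi discussed above, not the paper's system.
# Formulas: atoms are strings; ('->', A, B) is an implication.
def prove(ctx, goal):
    if isinstance(goal, tuple):                  # right rule for ->
        return prove(ctx | {goal[1]}, goal[2])
    if goal in ctx:                              # axiom (atomic goal)
        return True
    for f in ctx:                                # left rules, applied root-first
        if not isinstance(f, tuple):
            continue
        _, a, b = f
        rest = ctx - {f}
        if isinstance(a, str):
            # atomic premise: usable only when the atom is already present
            if a in ctx and prove(rest | {b}, goal):
                return True
        else:
            # nested premise (c -> d) -> b: Dyckhoff's terminating split
            _, c, d = a
            if prove(rest | {('->', d, b)}, a) and prove(rest | {b}, goal):
                return True
    return False

def provable(f):
    return prove(frozenset(), f)
```

Because every rule strictly decreases a measure on the sequent, the search terminates without loop-checking; e.g. `a -> ((a -> b) -> b)` is found provable while Peirce's law is not.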
Estimation of the length of interactions in arena game semantics
We estimate the maximal length of interactions between strategies in HO/N
game semantics, in the spirit of the work by Schwichtenberg and Beckmann for
the length of reduction in the simply typed lambda-calculus. Because of the
operational content of game semantics, the bounds presented here also apply to
head linear reduction on lambda-terms and to the execution of programs by
abstract machines (PAM/KAM), including in the presence of computational effects
such as non-determinism or ground type references. The proof proceeds by
extracting from the games model a combinatorial rewriting rule on trees of
natural numbers, which can then be analyzed independently of game semantics or
lambda-calculus.
Comment: Foundations of Software Science and Computational Structures, 14th International Conference, FOSSACS 2011, Saarbr\"ucken, Germany (2011)
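The Krivine abstract machine (KAM) that the bounds above apply to is small enough to sketch directly. The following is an illustrative Python version with a step counter standing in for "length of execution" (not the paper's formal setup); terms use de Bruijn indices:

```python
# Minimal Krivine abstract machine (KAM) with a step counter -- an
# illustrative sketch, not the paper's formal setup.
# Terms (de Bruijn): ('var', n), ('lam', body), ('app', f, a).
def kam(term):
    env, stack, steps = [], [], 0    # environment and argument stack of closures
    while True:
        steps += 1
        tag = term[0]
        if tag == 'app':             # push the argument as a closure
            stack.append((term[2], env))
            term = term[1]
        elif tag == 'lam' and stack: # bind the top closure, enter the body
            env = [stack.pop()] + env
            term = term[1]
        elif tag == 'var':           # jump to the closure bound to the variable
            term, env = env[term[1]]
        else:                        # lambda, empty stack: head normal form
            return term, steps

identity = ('lam', ('var', 0))
```

Running `kam(('app', identity, identity))` reaches the head normal form `identity` in four machine steps; bounding such step counts via the game-semantic interaction length is precisely the abstract's concern.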
Collection Principles in Dependent Type Theory
We introduce logic-enriched intuitionistic type theories, which extend intuitionistic dependent type theories with primitive judgements to express logic. By adding type-theoretic rules that correspond to the collection axiom schemes of the constructive set theory CZF, we obtain a generalisation of the type-theoretic interpretation of CZF. Suitable logic-enriched type theories also allow the study of reinterpretations of logic. We end the paper with an application to the double-negation interpretation.
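For reference, the set-theoretic scheme that the type-theoretic collection rules mirror is, in one standard formulation, CZF's Strong Collection:

```latex
% Strong Collection, one of the CZF collection schemes (standard formulation):
\forall a\,\bigl[\,\forall x \in a\, \exists y\, \varphi(x,y)
  \;\rightarrow\;
  \exists b\,\bigl(\forall x \in a\, \exists y \in b\, \varphi(x,y)
  \;\wedge\;
  \forall y \in b\, \exists x \in a\, \varphi(x,y)\bigr)\bigr]
```

The second conjunct, bounding `b` back by `a`, is what distinguishes collection from replacement and is the part the primitive logical judgements must capture.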
A Symmetric Approach to Compilation and Decompilation
Just as specializing a source interpreter can achieve compilation from a source language to a target language, we observe that specializing a target interpreter can achieve compilation from the target language to the source language. In both cases, the key issue is the choice of whether to perform an evaluation or to emit code that represents this evaluation. We substantiate this observation by specializing two source interpreters and two target interpreters. We first consider a source language of arithmetic expressions and a target language for a stack machine, and then the lambda-calculus and the SECD-machine language. In each case, we prove that the target-to-source compiler is a left inverse of the source-to-target compiler, i.e., it is a decompiler. In the context of partial evaluation, compilation by source-interpreter specialization is classically referred to as a Futamura projection. By symmetry, it seems logical to refer to decompilation by target-interpreter specialization as a Futamura embedding.
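The symmetry can be seen concretely on the paper's first pair of languages. In this hedged miniature (an illustrative sketch, not the paper's partial-evaluation machinery), the compiler is a source interpreter that always chooses "emit code", and the decompiler is the target interpreter run over a symbolic stack, so executing stack code rebuilds source syntax; the left-inverse property is then checkable directly:

```python
# Miniature version of the compile/decompile symmetry (illustrative sketch,
# not the paper's partial-evaluation setup).
# Source: ('lit', n) | ('add', e1, e2).   Target: push/add stack code.
def compile_expr(e):
    """Source interpreter with 'emit code' chosen at every step."""
    if e[0] == 'lit':
        return [('push', e[1])]
    return compile_expr(e[1]) + compile_expr(e[2]) + [('add',)]

def decompile(code):
    """Target interpreter run over a *symbolic* stack: executing the code
    rebuilds source syntax instead of computing numbers."""
    stack = []
    for ins in code:
        if ins[0] == 'push':
            stack.append(('lit', ins[1]))
        else:
            b, a = stack.pop(), stack.pop()
            stack.append(('add', a, b))
    return stack.pop()

expr = ('add', ('lit', 1), ('add', ('lit', 2), ('lit', 3)))
```

Here `decompile(compile_expr(e)) == e` for every source expression `e`, which is exactly the left-inverse (decompiler) property the paper proves for its interpreter pairs.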