Dialectica Interpretation with Marked Counterexamples
Goedel's functional "Dialectica" interpretation can be used to extract
functional programs from non-constructive proofs in arithmetic by employing two
sorts of higher-order witnessing terms: positive realisers and negative
counterexamples. In the original interpretation, decidability of atoms is
required to compute the correct counterexample from a set of candidates. When
combined with recursion, this choice needs to be made at every step of the
extracted program; in some special cases, however, the decision on negative
witnesses can be computed only once. We present a variant of the
interpretation in which the time complexity of extracted programs can be
improved by marking the chosen witness and thus avoiding recomputation. The
achieved effect is similar to using an abortive control operator to interpret
computational content of non-constructive principles.
Comment: In Proceedings CL&C 2010, arXiv:1101.520
Analysis of methods for extraction of programs from non-constructive proofs
The present thesis compares two computational interpretations of non-constructive proofs: refined A-translation and Gödel's functional "Dialectica" interpretation. The behaviour of the extraction methods is evaluated in the light of several case studies, where the resulting programs are analysed and compared. It is argued that the two interpretations correspond to specific backtracking implementations and that programs obtained via the refined A-translation tend to be simpler, faster and more readable than programs obtained via Gödel's interpretation.
Three layers of optimisation are suggested in order to produce faster and more readable programs. First, it is shown that syntactic repetition of subterms can be reduced by using let-constructions instead of meta substitutions, thus obtaining a near-linear size bound on extracted terms. The second improvement allows declaring syntactically computational parts of the proof as irrelevant, which can be used to remove redundant parameters, possibly improving the efficiency of the program. Finally, a special case of induction is identified for which a more efficient recursive extracted term can be defined. It is shown that the outcome of case distinctions can be memoised, which can result in an exponential improvement of the average time complexity of the extracted program.
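The thesis abstracts do not give the extracted programs themselves, so the following is only a rough, hypothetical illustration (in Python, with invented function names) of why memoising the outcome of a case distinction in a branching recursion can yield the exponential speed-up the abstract mentions:

```python
from functools import lru_cache

def decide(n):
    # Stand-in for an expensive but stable case distinction
    # (the "decision on negative witnesses" of the abstract above).
    return n % 3 == 0

def extracted_naive(n):
    """Binary recursion: the case distinction is re-evaluated in every
    duplicated subcall, giving an exponential call tree."""
    if n <= 1:
        return 1
    branch = decide(n)
    left = extracted_naive(n - 1)
    right = extracted_naive(n - 2)
    return left + right if branch else max(left, right)

@lru_cache(maxsize=None)
def extracted_memo(n):
    """Memoising each call (and with it the case distinction) collapses
    the exponential call tree to a linear number of evaluations."""
    if n <= 1:
        return 1
    branch = decide(n)
    left = extracted_memo(n - 1)
    right = extracted_memo(n - 2)
    return left + right if branch else max(left, right)
```

Both functions compute the same value; only the number of times `decide` runs differs, which is the point of the optimisation.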
On the Herbrand content of LK
We present a structural representation of the Herbrand content of LK-proofs
with cuts of complexity prenex Sigma-2/Pi-2. The representation takes the form
of a typed non-deterministic tree grammar of order 2 which generates a finite
language of first-order terms that appear in the Herbrand expansions obtained
through cut-elimination. In particular, for every Gentzen-style reduction
between LK-proofs we study the induced grammars and classify the cases in which
language equality and inclusion hold.
Comment: In Proceedings CL&C 2016, arXiv:1606.0582
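As a much simplified sketch of the central idea (the paper's grammars are typed, non-deterministic, and of order 2, none of which is modelled here), a tree grammar with a non-cyclic set of productions generates a finite language of first-order terms; the grammar and symbols below are invented for illustration:

```python
from itertools import product

# productions: nonterminal -> list of alternatives; each alternative is a
# function symbol applied to argument nonterminals (empty list = constant)
grammar = {
    "A": [("f", ["B", "B"]), ("c", [])],
    "B": [("g", ["C"]), ("d", [])],
    "C": [("e", [])],
}

def language(nt):
    """Enumerate all first-order terms derivable from nonterminal nt."""
    terms = set()
    for (fun, args) in grammar[nt]:
        # non-determinism: every combination of subterm derivations
        for choice in product(*(language(a) for a in args)):
            terms.add(fun + ("(" + ",".join(choice) + ")" if choice else ""))
    return terms
```

For this grammar, `language("A")` contains the five terms `c`, `f(d,d)`, `f(d,g(e))`, `f(g(e),d)`, and `f(g(e),g(e))` — a finite term language of the kind that, in the paper's setting, collects the instances appearing in Herbrand expansions.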
A Mathematical Model of Quantum Computer by Both Arithmetic and Set Theory
A practical viewpoint links reality, representation, and language to calculation through the concept of the Turing (1936) machine, the mathematical model of our computers. After Gödel's incompleteness theorems (1931) and the unsolvability of the so-called halting problem (Turing 1936; Church 1936) for a classical Turing machine, one of the simplest hypotheses to suggest is completeness for a pair of such machines. This is consistent with the provability of completeness by means of two independent Peano arithmetics, discussed in Section I.
Many modifications of Turing machines, including quantum ones, are examined in Section II with respect to the halting problem and completeness, and the model of two independent Turing machines seems to generalize them.
That pair can then be postulated as the formal definition of reality, being complete unlike either machine standalone, which remains incomplete without its complementary counterpart. Representation is formally defined as a one-to-one mapping between the two Turing machines, and the set of all those mappings can be considered a "language", thereby including metaphors as mappings other than representations. Section III investigates that formal relation of "reality", "representation", and "language" modeled by (at least two) Turing machines. The independence of (two) Turing machines is interpreted by means of game theory, and especially of the Nash equilibrium, in Section IV.
Choice, and information as the quantity of choices, are involved. That approach seems to be equivalent to the one based on set theory and the concept of actual infinity in mathematics, and it allows for practical implementations.
Representation and Reality by Language: How to make a home quantum computer?
A set theory model of reality, representation and language based on the relation of completeness and incompleteness is explored. The problem of completeness of mathematics is linked to its counterpart in quantum mechanics. That model includes two Peano arithmetics or Turing machines independent of each other. The complex Hilbert space underlying quantum mechanics as the base of its mathematical formalism is interpreted as a generalization of Peano arithmetic: it is a doubled infinite set of doubled Peano arithmetics having a remarkable symmetry with respect to the axiom of choice. The quantity of information is interpreted as the number of elementary choices (bits). Quantum information is seen as the generalization of information to infinite sets or series. The equivalence of that model to a quantum computer is demonstrated. The condition for the Turing machines to be independent of each other is reduced to the state of Nash equilibrium between them. Two related models of language are deduced: language as a game in the sense of game theory, and language as an ontology of metaphors (all mappings which are not one-to-one, i.e. not representations of reality in a formal sense).
Belief and Credence: Why the Attitude-Type Matters
In this paper, I argue that the relationship between belief and credence is a central question in epistemology. This is because the belief-credence relationship has significant implications for a number of current epistemological issues. I focus on five controversies: permissivism, disagreement, pragmatic encroachment, doxastic voluntarism, and the relationship between doxastic attitudes and prudential rationality. I argue that each debate is constrained in particular ways, depending on whether the relevant attitude is belief or credence. This means that epistemologists should pay attention to whether they are framing questions in terms of belief or in terms of credence, and that the success or failure of a reductionist project in the belief-credence realm has significant implications for epistemology generally.
Mathematical Logic: Proof theory, Constructive Mathematics
The workshop “Mathematical Logic: Proof Theory, Constructive Mathematics” was centered around proof-theoretic aspects of current mathematics, constructive mathematics and logical aspects of computational complexity.
Une Dialectica matérialiste
In this thesis, we give a computational interpretation to Gödel's Dialectica translation, in a fashion inspired by classical realizability. In particular, it can be shown that the Dialectica translation manipulates stacks of the Krivine machine as first-class objects and that the main effect at work lies in the accumulation of those stacks at each variable use. The original translation suffers from a handful of defects due to hacks used by Gödel to work around historical limitations. Once these defects are solved, the translation naturally extends to much more expressive settings such as dependent type theory. A few variants are studied thanks to the linear decomposition, and relationships with other translations such as forcing and CPS are scrutinized.
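To make concrete the stacks the abstract refers to, here is a minimal call-by-name Krivine abstract machine (a standard construction, sketched in Python with de Bruijn indices; it is not taken from the thesis itself). The stack holds closures of pending arguments: application pushes, abstraction pops and binds, and a variable use re-enters the closure stored in the environment.

```python
# Terms: int = de Bruijn variable, ("lam", body), ("app", fun, arg).
# A closure is a (term, env) pair; env and stack are tuples of closures.

def krivine(term, env=(), stack=()):
    """Reduce a closed lambda term to weak head normal form."""
    while True:
        if isinstance(term, int):          # variable: enter its closure
            term, env = env[term]
        elif term[0] == "app":             # push the argument closure
            stack = ((term[2], env),) + stack
            term = term[1]
        elif term[0] == "lam":
            if not stack:                  # no pending arguments: done
                return (term, env)
            closure, stack = stack[0], stack[1:]
            env = (closure,) + env         # pop and bind under the lambda
            term = term[1]
```

For example, `krivine(("app", ("lam", 0), ("lam", 0)))` reduces the identity applied to the identity and returns the closure `(("lam", 0), ())`. The thesis's observation is that Dialectica manipulates such stacks as first-class data rather than as a hidden machine component.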
Causarum Investigatio and the Two Bell's Theorems of John Bell
"Bell's theorem" can refer to two different theorems that John Bell proved,
the first in 1964 and the second in 1976. His 1964 theorem is the
incompatibility of quantum phenomena with the joint assumptions of Locality and
Predetermination. His 1976 theorem is their incompatibility with the single
property of Local Causality. This is contrary to Bell's own later assertions,
that his 1964 theorem began with the assumption of Local Causality, even if not
by that name. Although the two Bell's theorems are logically equivalent, their
assumptions are not. Hence, the earlier and later theorems suggest quite
different conclusions, embraced by operationalists and realists, respectively.
The key issue is whether Locality or Local Causality is the appropriate notion
emanating from Relativistic Causality, and this rests on one's basic notion of
causation. For operationalists the appropriate notion is what is here called
the Principle of Agent-Causation, while for realists it is Reichenbach's
Principle of common cause. By breaking down the latter into even more basic
Postulates, it is possible to obtain a version of Bell's theorem in which each
camp could reject one assumption, happy that the remaining assumptions reflect
its weltanschauung. Formulating Bell's theorem in terms of causation is
fruitful not just for attempting to reconcile the two camps, but also for
better describing the ontology of different quantum interpretations and for
more deeply understanding the implications of Bell's marvellous work.
Comment: 24 pages. Prepared for proceedings of the "Quantum [Un]speakables II" conference (Vienna, 2014), to be published by Springer
Process as a world transaction
Transaction is process closure: for a transaction is the limiting process of process itself. In the process world view, the universe is the ultimate (intensional) transaction of all its extensional limiting processes, which we call reality. ANPA’s PROGRAM UNIVERSE is a computational model which can be explored empirically in commercial database transactions, where there has been a wealth of real-world activity over the last 40 years. Process category theory demonstrates formally the fundamental distinctions between the classical model of a transaction, as in PROGRAM UNIVERSE, and physical reality. The paper concludes with a short technical summary for those who do not wish to read all the detail.