Unknowable Truths: The Incompleteness Theorems and the Rise of Modernism
This thesis evaluates current methodologies in the history of mathematics and explores ways in which historiographical methods could be successfully implemented in the field. Traditional approaches to the history of mathematics often lack either an accurate portrayal of the social and cultural influences of the time or an effective treatment of the mathematics discussed. This paper applies a holistic methodology in a case study of Kurt Gödel's influential work in logic during the interwar period and the parallel rise of intellectual modernism. In doing so, it discusses the proofs of Gödel's Completeness and Incompleteness theorems, as well as Gödel's philosophical interests and influences of the time. To explore the intersection of these worlds, practices are borrowed from the fields of intellectual history and the history of science and technology to better analyze the effects of society and culture on the minds of mathematicians like Gödel and on their work.
Freedom, Anarchy and Conformism in Academic Research
In this paper I attempt to make a case for promoting the courage of rebels within the citadels of orthodoxy in academic research environments. Wicksell in Macroeconomics, Brouwer in the Foundations of Mathematics, Turing in Computability Theory, Sraffa in the Theories of Value and Distribution are, in my own fields of research, paradigmatic examples of rebels, adventurers and non-conformists of the highest caliber in scientific research within University environments. In what sense, and how, can such rebels, adventurers and non-conformists be fostered in the current University research environment dominated by the cult of 'picking winners'? This is the motivational question lying behind the historical outlines of the work of Brouwer, Hilbert, Bishop, Veronese, Gödel, Turing and Sraffa that I describe in this paper. The debate between freedom in research and teaching, and the naked imposition of 'correct' thinking on potential dissenters of the mind, is of serious concern in this age of austerity of material facilities. It is a debate that has occupied some of the finest minds working at the deepest levels of foundational issues in mathematics, metamathematics and economic theory. By making some of the issues explicit, I hope it is possible to encourage dissenters to remain courageous in the face of current dogmas.
Keywords: non-conformist research, economic theory, mathematical economics, 'Hilbert's Dogma', Hilbert's Program, computability theory
Remarks on Wittgenstein, Gödel, Chaitin, Incompleteness, Impossibility and the Psychological Basis of Science and Mathematics
It is commonly thought that such topics as Impossibility, Incompleteness, Paraconsistency, Undecidability, Randomness, Computability, Paradox, Uncertainty and the Limits of Reason are disparate scientific physical or mathematical issues having little or nothing in common. I suggest that they are largely standard philosophical problems (i.e., language games) which were resolved by Wittgenstein over 80 years ago.
Wittgenstein also demonstrated the fatal error in regarding mathematics or language or our behavior in general as a unitary coherent logical "system," rather than as a motley of pieces assembled by the random processes of natural selection. "Gödel shows us an unclarity in the concept of 'mathematics', which is indicated by the fact that mathematics is taken to be a system," and we can say (contra nearly everyone) that this is all that Gödel and Chaitin show. Wittgenstein commented many times that "truth" in math means axioms or the theorems derived from axioms, and "false" means that one made a mistake in using the definitions; this is utterly different from empirical matters, where one applies a test. Wittgenstein often noted that to be acceptable as mathematics in the usual sense, a result must be usable in other proofs and must have real-world applications, but neither is the case with Gödel's Incompleteness. Since it cannot be proved in a consistent system (here Peano Arithmetic, but a much wider arena for Chaitin), it cannot be used in proofs and, unlike all the "rest" of PA, it cannot be used in the real world either. As Rodych notes, "… Wittgenstein holds that a formal calculus is only a mathematical calculus (i.e., a mathematical language-game) if it has an extra-systemic application in a system of contingent propositions (e.g., in ordinary counting and measuring or in physics) …" Another way to say this is that one needs a warrant to apply our normal use of words like "proof", "proposition", "true", "incomplete", "number", and "mathematics" to a result in the tangle of games created with "numbers" and "plus" and "minus" signs etc., and with "Incompleteness" this warrant is lacking. Rodych sums it up admirably: "On Wittgenstein's account, there is no such thing as an incomplete mathematical calculus because 'in mathematics, everything is algorithm [and syntax] and nothing is meaning [semantics]…'"
I make some brief remarks which note the similarities of these "mathematical" issues to economics, physics, game theory, and decision theory.
Those wishing further comments on philosophy and science from a Wittgensteinian two systems of thought viewpoint may consult my other writings -- Talking Monkeys--Philosophy, Psychology, Science, Religion and Politics on a Doomed Planet--Articles and Reviews 2006-2019 3rd ed (2019), The Logical Structure of Philosophy, Psychology, Mind and Language in Ludwig Wittgenstein and John Searle 2nd ed (2019), Suicide by Democracy 4th ed (2019), The Logical Structure of Human Behavior (2019), The Logical Structure of Consciousness (2019), Understanding the Connections between Science, Philosophy, Psychology, Religion, Politics, and Economics and Suicidal Utopian Delusions in the 21st Century 5th ed (2019), Remarks on Impossibility, Incompleteness, Paraconsistency, Undecidability, Randomness, Computability, Paradox, Uncertainty and the Limits of Reason in Chaitin, Wittgenstein, Hofstadter, Wolpert, Doria, da Costa, Godel, Searle, Rodych, Berto, Floyd, Moyal-Sharrock and Yanofsky (2019), and The Logical Structure of Philosophy, Psychology, Sociology, Anthropology, Religion, Politics, Economics, Literature and History (2019)
Strong Types for Direct Logic
This article follows on the introductory article "Direct Logic for Intelligent Applications" [Hewitt 2017a]. Strong Types enable new mathematical theorems to be proved, including the Formal Consistency of Mathematics. Also, Strong Types are extremely important in Direct Logic because they block all known paradoxes [Cantini and Bruni 2017]. Blocking known paradoxes makes Direct Logic safer for use in Intelligent Applications by preventing security holes.
Inconsistency Robustness is the performance of information systems with pervasively inconsistent information. The Inconsistency Robustness of the community of professional mathematicians is their performance repeatedly repairing contradictions over the centuries. In the Inconsistency Robustness paradigm, deriving contradictions has been a progressive development and not a "game stopper." Contradictions can be helpful instead of being something to be "swept under the rug" by denying their existence, which has been repeatedly attempted by authoritarian theoreticians (beginning with some Pythagoreans). Such denial has delayed mathematical development. This article reports how considerations of Inconsistency Robustness have recently influenced the foundations of mathematics for Computer Science, continuing a tradition developing the sociological basis for foundations.
Mathematics here means the common foundation of all classical mathematical theories, from Euclid to the mathematics used to prove Fermat's Last Theorem [McLarty 2010]. Direct Logic provides categorical axiomatizations of the Natural Numbers, Real Numbers, Ordinal Numbers, Set Theory, and the Lambda Calculus, meaning that, up to a unique isomorphism, there is only one model that satisfies the respective axioms. Good evidence for the consistency of Classical Direct Logic derives from how it blocks the known paradoxes of classical mathematics. Humans have spent millennia devising paradoxes for classical mathematics.
Having a powerful system like Direct Logic is important in computer science because computers must be able to formalize all logical inferences (including inferences about their own inference processes) without requiring recourse to human intervention. Any inconsistency in Classical Direct Logic would be a potential security hole because it could be used to cause computer systems to adopt invalid conclusions.
After [Church 1934], logicians faced the following dilemma:
• 1st order theories cannot be powerful lest they fall into inconsistency because of Church's Paradox.
• 2nd order theories contravene the philosophical doctrine that theorems must be computationally enumerable.
The above issues can be addressed by requiring Mathematics to be strongly typed, so that:
• Mathematics self proves that it is "open" in the sense that theorems are not computationally enumerable.
• Mathematics self proves that it is formally consistent.
• Strong mathematical theories for Natural Numbers, Ordinals, Set Theory, the Lambda Calculus, Actors, etc. are inferentially decidable, meaning that every true proposition is provable and every proposition is either provable or disprovable. Furthermore, theorems of these theories are not enumerable by a provably total procedure.
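For contrast, the classical results that these claims run against can be stated in standard form; the following are the textbook Gödel theorems, not part of Direct Logic itself:

```latex
% First incompleteness theorem: for any consistent, computably axiomatized
% theory $T$ extending Peano Arithmetic, there is a sentence $G_T$ with
T \nvdash G_T \qquad\text{and}\qquad T \nvdash \lnot G_T .
% Second incompleteness theorem: such a theory cannot prove its own
% consistency statement,
T \nvdash \mathrm{Con}(T).
```

The claim that Mathematics "self proves" its formal consistency is thus precisely the point at which Direct Logic departs from the first-order, computably axiomatized setting in which Gödel's second theorem applies.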
The formal failure and social success of logic
Is formal logic a failure? It may be, if we accept the context-independent limits imposed by Russell, Frege, and others. In response to difficulties arising from such limitations, I present a Toulmin-esque social recontextualization of formal logic. The results of my project provide a positive view of formal logic as a success while simultaneously reaffirming the social and contextual concerns of argumentation theorists, critical thinking scholars, and rhetoricians.
The Quantum Strategy of Completeness: On the Self-Foundation of Mathematics
Gentzen's approach by transfinite induction and that of intuitionist Heyting arithmetic to completeness and the self-foundation of mathematics are compared and opposed to the Gödel incompleteness results for Peano arithmetic. Quantum mechanics involves infinity via Hilbert space, but it is finitist, as is any experimental science. The absence of hidden variables in it, interpretable as its completeness, should resurrect Hilbert's finitism at the cost of a relevant modification of the latter, already hinted at by intuitionism and Gentzen's approaches to completeness. This paper investigates both the conditions and the philosophical background necessary for that modification. The main conclusion is that the concept of infinity underlying contemporary mathematics cannot be reduced to a single Peano arithmetic, but rather to at least two arithmetics independent of each other. Intuitionism, quantum mechanics, Gentzen's approaches to completeness, and even Hilbert's finitism can be unified from that viewpoint. Mathematics may found itself by way of finitism complemented by choice. The concept of information as the quantity of choices underlies that viewpoint. Quantum mechanics, interpretable in terms of information and quantum information, is inseparable from mathematics and its foundation.
Computational and Biological Analogies for Understanding Fine-Tuned Parameters in Physics
In this philosophical paper, we explore computational and biological analogies to address the fine-tuning problem in cosmology. We first clarify what it means for physical constants or initial conditions to be fine-tuned. We review important distinctions, such as that between dimensionless and dimensional physical constants, and the classification of constants proposed by Levy-Leblond. Then we explore how two great analogies, computational and biological, can give new insights into our problem. This paper includes a preliminary study to examine the two analogies. Importantly, analogies are both useful and fundamental cognitive tools, but they can also be misused or misinterpreted. The idea that our universe might be modelled as a computational entity is analysed, and we discuss the distinction between physical laws and initial conditions using algorithmic information theory. Smolin introduced the theory of "Cosmological Natural Selection" with a biological analogy in mind. We examine an extension of this analogy involving intelligent life, and discuss if and how this extension could be legitimated.
Keywords: origin of the universe, fine-tuning, physical constants, initial conditions, computational universe, biological universe, role of intelligent life, cosmological natural selection, cosmological artificial selection, artificial cosmogenesis.
Comment: 25 pages, Foundations of Science, in press
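The algorithmic-information-theoretic distinction mentioned in the abstract (laws as short programs, initial conditions as incompressible data) can be illustrated with a toy sketch. The compression proxy below is my own illustration of the general idea, not a method from the paper:

```python
import random
import zlib

# Toy proxy for algorithmic information content: the length of a compressed
# representation. A "law-like" string (generated by a short rule) compresses
# well; "initial-condition-like" random data does not. zlib compression is
# only a crude upper bound on Kolmogorov complexity, used here for intuition.

law_like = ("01" * 500).encode()  # produced by a tiny program: repeat "01"

random.seed(0)
condition_like = bytes(random.getrandbits(8) for _ in range(1000))  # noise

c_law = len(zlib.compress(law_like))
c_rand = len(zlib.compress(condition_like))

# The rule-generated string compresses far better than the random one.
print(c_law < c_rand)
```

True Kolmogorov complexity is uncomputable, so any such comparison rests on a particular compressor; the qualitative gap between rule-generated and random data is what the analogy in the paper turns on.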
- …