
    Matter as Information. Quantum Information as Matter

    Quantum information is discussed as the universal substance of the world. It is interpreted as the generalization of classical information that encompasses both finite and transfinite ordinal numbers. Conversely, any wave function, and thus any state of any quantum system, is just one value of quantum information. Information and its generalization, quantum information, are treated as quantities of elementary choices, whose units are respectively the bit and the qubit. The course of time is what generates choices by itself, and thus, in the final analysis, quantum information and every item in the world. The course of time necessarily generates choices in the following way: the future is absolutely unorderable in principle, while the past is always well-ordered and therefore unchangeable. The present, as the mediation between them, requires the well-ordering theorem, which is equivalent to the axiom of choice. The latter guarantees a choice even among the elements of an infinite set, which is the case for quantum information. Concrete and abstract objects share information as their common base, which is quantum for the former and classical for the latter. The general quantities of matter in physics, mass and energy, can be considered particular cases of quantum information. The link between choice and abstraction in set theory allows “Hume’s principle” to be interpreted in terms of quantum mechanics as the equivalence of “many” and “much” underlying quantum information. Quantum information as the universal substance of the world calls for the unity of physics and mathematics, rather than that of concrete and abstract objects, and thus, in the final analysis, for a form of quantum neo-Pythagoreanism.
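    As a one-line formal gloss (an addition to this listing, not part of the abstract): a bit is a single binary choice, whereas a qubit ranges over a continuum of superposed states, so that each wave function is indeed one "value" of quantum information:

        b \in \{0, 1\}
        \qquad\text{vs.}\qquad
        |\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
        \quad \alpha, \beta \in \mathbb{C},\ |\alpha|^2 + |\beta|^2 = 1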

    G-Complexity, Quantum Computation and Anticipatory Processes


    Novelty And Surprises In Complex Adaptive System (CAS) Dynamics: A Computational Theory of Actor Innovation

    The work of John von Neumann in the 1940s on self-reproducing machines as models for biological systems and self-organized complexity provides the computational legacy for CAS. Following this, the major hypothesis emanating from Wolfram (1984), Langton (1992, 1994), Kauffman (1993) and Casti (1994) is that the sine qua non of complex adaptive systems is their capacity to produce novelty or 'surprises', the so-called Type IV innovation-based, structure-changing dynamics of the Wolfram-Chomsky schema. The Wolfram-Chomsky schema postulates that varying the computational capabilities of agents generates different system-wide dynamics: finite automata produce Type I dynamics with unique limit points or homogeneity; push-down automata produce Type II dynamics with limit cycles; linear bounded automata generate Type III chaotic trajectories with strange attractors. The significance of this schema is its postulate that only agents with the full powers of Turing machines capable of simulating other Turing machines (what Wolfram calls computational universality) can produce the Type IV irregular, innovation-based, structure-changing dynamics associated with the three main natural exponents of CAS: evolutionary biology, immunology and capitalist growth. Langton (1990, 1992) identifies the above complexity classes for dynamical systems with the halting problem of Turing machines and famously calls the phase transition, the domain on which novel objects emerge, 'life at the edge of chaos'. This paper develops the formal foundations for the emergence of novelty or innovation. Remarkably, following Binmore (1987), who first introduced to game theory the requisite dose of mechanism with players modelled as Turing machines using the Gödel (1931) logic involving the Liar, or the pure logic of opposition, we will see that only agents qua universal Turing machines, which can make self-referential calculations of hostile objectives, can bring about adaptive novelty or strategic innovation.
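    As an illustrative aside (not from the paper), the dynamical classes invoked by the Wolfram-Chomsky schema can be observed concretely in elementary cellular automata. The minimal Python sketch below uses three standard rules: rule 254 settles into a homogeneous pattern (Type I), rule 30 is a textbook chaotic rule (Type III), and rule 110, which is known to be computationally universal, shows the persistent localized structures characteristic of Type IV.

        # Minimal sketch of Wolfram's dynamical classes via elementary CAs.
        def step(cells, rule):
            """One synchronous update of an elementary CA on a ring."""
            n = len(cells)
            return [
                (rule >> (4 * cells[(i - 1) % n] + 2 * cells[i] + cells[(i + 1) % n])) & 1
                for i in range(n)
            ]

        def run(rule, width=64, steps=32):
            cells = [0] * width
            cells[width // 2] = 1  # single seed cell
            for _ in range(steps):
                print("".join("#" if c else "." for c in cells))
                cells = step(cells, rule)

        run(254)  # Type I: fills to a uniform, unchanging state
        run(30)   # Type III: irregular, chaotic triangle
        run(110)  # Type IV: interacting localized structures ("gliders")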

    A Computable Economist’s Perspective on Computational Complexity

    A computable economist's view of the world of computational complexity theory is described. This means the model of computation underpinning theories of computational complexity plays a central role. The emergence of computational complexity theories from diverse traditions is emphasised. The unification that emerged in the modern era was codified by means of the notions of efficiency of computation, non-deterministic computation, completeness, reducibility and verifiability; the latter three concepts had their origins in what may be called 'Post's Program of Research for Higher Recursion Theory'. Approximations, computations and constructions are also emphasised. The recent model of real computation, as a basis for studying computational complexity over the domain of the reals, is also presented and discussed, albeit critically. A brief sceptical section on algorithmic complexity theory is included in an appendix.
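    As an illustrative aside (not from the paper), the verifiability notion mentioned above is the asymmetry at the heart of completeness and reducibility: finding a solution may require search, but checking a proposed certificate is fast. A minimal Python sketch for CNF satisfiability:

        # A CNF formula is a list of clauses; a clause is a list of literals,
        # where +i means variable i is true and -i means it is false.
        Formula = list[list[int]]

        def verify(formula: Formula, assignment: dict[int, bool]) -> bool:
            """Check in one linear pass that the assignment satisfies every clause."""
            return all(
                any(assignment[abs(lit)] == (lit > 0) for lit in clause)
                for clause in formula
            )

        # (x1 or not x2) and (x2 or x3)
        phi = [[1, -2], [2, 3]]
        print(verify(phi, {1: True, 2: False, 3: True}))    # True
        print(verify(phi, {1: False, 2: False, 3: False}))  # False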

    Topics in Programming Languages, a Philosophical Analysis through the case of Prolog

    Programming languages seldom find proper anchorage in the philosophy of logic, language and science. What is more, the philosophy of language seems restricted to natural languages and linguistics, and even the philosophy of logic is rarely framed in terms of programming languages. The logic programming paradigm and Prolog are thus the most adequate paradigm and programming language with which to work on this subject, combining natural language processing and linguistics, logic programming, and a construction methodology for both algorithms and procedures, within an overall declarative, philosophical outlook. Moreover, the dimension of the Fifth Generation Computer Systems project related to strong AI, in which Prolog took a major role, and its historical frame within the crucial dialectic between procedural and declarative paradigms, and between structuralist and empiricist biases, serve in exemplary form to treat the philosophy of logic, language and science in the contemporary age as well. In recounting Prolog's philosophical, mechanical and algorithmic harbingers, various routes open up, and we shall exemplify some here. The mechanical-computational background explored by Pascal, Leibniz, Boole, Jacquard, Babbage and Konrad Zuse, up to the ACE (Alan Turing) and the EDVAC (von Neumann), offers the backbone of computer architecture; studied thoroughly in parallel, the work of Turing, Church, Gödel, Kleene, von Neumann, Shannon and others on computability permits us to interpret the evolving realm of programming languages. The line from the lambda calculus to the Algol family, the declarative-procedural split between the C language and Prolog, and the ensuing branching, explosion and further delimitation of programming languages are thereupon inspected so as to relate them to the syntax, semantics and philosophical élan of logic programming and Prolog.
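    As an illustrative aside (not from the paper), the declarative reading that distinguishes Prolog can be glossed in Python, kept here for consistency with the other examples. The Prolog rule grandparent(X, Z) :- parent(X, Y), parent(Y, Z). states what a grandparent is, not how to compute one; the sketch below mimics that reading as a relational join, with the toy parent facts being hypothetical.

        # Facts: a binary relation parent(X, Y), stored as a set of pairs.
        parent = {("alice", "bob"), ("bob", "carol"), ("bob", "dave")}

        def grandparent():
            """Enumerate all (X, Z) such that parent(X, Y) and parent(Y, Z)."""
            return {(x, z) for (x, y1) in parent for (y2, z) in parent if y1 == y2}

        print(grandparent())  # {('alice', 'carol'), ('alice', 'dave')} (set order may vary)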