
    The Lost Melody Phenomenon

    A typical phenomenon for machine models of transfinite computations is the existence of so-called lost melodies, i.e. real numbers x such that the characteristic function of the set {x} is computable while x itself is not (a real having the first property is called recognizable). This was first observed by J. D. Hamkins and A. Lewis for infinite time Turing machines, then demonstrated by P. Koepke and the author for ITRMs. We prove that, for the unresetting infinite time register machines introduced by P. Koepke, recognizability equals computability, i.e. the lost melody phenomenon does not occur. Then, we give an overview of our results on the behaviour of recognizable reals for ITRMs. We show that there are no lost melodies for ordinal Turing machines or ordinal register machines without parameters, and that this is, under the assumption that 0^{\sharp} exists, independent of ZFC. Then, we introduce the notions of resetting and unresetting α-register machines and give some information on the question for which of these machines there are lost melodies.
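The notions used in the abstract above can be stated formally. The following is a standard textbook-style formulation (not taken from the paper itself), in which "computable" refers to whichever transfinite machine model is under discussion:

```latex
\begin{itemize}
  \item A real $x \in {}^{\omega}2$ is \emph{computable} if some program of the
        given machine model computes $x$ as a function $\omega \to 2$.
  \item $x$ is \emph{recognizable} if some program computes the characteristic
        function of $\{x\}$, i.e.\ halts on every real $y$ given as input
        (e.g.\ as oracle) and correctly decides whether $y = x$.
  \item $x$ is a \emph{lost melody} if $x$ is recognizable but not computable:
        the machine can verify the melody when it hears it, yet cannot produce
        it on its own.
\end{itemize}
```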

    ON THE FOUNDATIONS OF COMPUTABILITY THEORY

    The principal motivation for this work is the observation that there are significant deficiencies in the foundations of conventional computability theory. This thesis examines the problems with conventional computability theory, including its failure to address discrepancies between theory and practice in computer science, semantic confusion in terminology, and limitations in the scope of conventional computing models. In light of these difficulties, fundamental notions are re-examined and revised definitions of key concepts such as “computer,” “computable,” and “computing power” are provided. A detailed analysis is conducted to determine desirable semantics and scope of applicability of foundational notions. The credibility of the revised definitions is ascertained by demonstrating their ability to address identified problems with conventional definitions. Their practical utility is established through application to examples. Other related issues, including hidden complexity in computations, subtleties related to encodings, and the cardinalities of sets involved in computing, are examined. A resource-based meta-model for characterizing computing model properties is introduced. The proposed definitions are presented as a starting point for an alternate foundation for computability theory. However, formulation of the particular concepts under discussion is not the sole purpose of the thesis. The underlying objective of this research is to open discourse on alternate foundations of computability theory and to inspire re-examination of fundamental notions.

    What Do Paraconsistent, Undecidable, Random, Computable and Incomplete mean? A Review of Godel's Way: Exploits into an undecidable world by Gregory Chaitin, Francisco A Doria, Newton C.A. da Costa 160p (2012) (review revised 2019)

    In ‘Godel’s Way’ three eminent scientists discuss issues such as undecidability, incompleteness, randomness, computability and paraconsistency. I approach these issues from the Wittgensteinian viewpoint that there are two basic kinds of issue, which have completely different solutions. There are the scientific or empirical issues, which are facts about the world that need to be investigated observationally, and the philosophical issues of how language can be used intelligibly (which include certain questions in mathematics and logic), which need to be decided by looking at how we actually use words in particular contexts. When we get clear about which language game we are playing, these topics are seen to be ordinary scientific and mathematical questions like any others. Wittgenstein’s insights have seldom been equaled and never surpassed, and they are as pertinent today as they were 80 years ago when he dictated the Blue and Brown Books. In spite of its failings—really a series of notes rather than a finished book—this is a unique source of the work of these three famous scholars, who have been working at the bleeding edges of physics, math and philosophy for over half a century. Da Costa and Doria are cited by Wolpert (see below or my articles on Wolpert and my review of Yanofsky’s ‘The Outer Limits of Reason’) since they wrote on universal computation, and among his many accomplishments, Da Costa is a pioneer in paraconsistency. Those wishing a comprehensive, up-to-date framework for human behavior from the modern two systems view may consult my book ‘The Logical Structure of Philosophy, Psychology, Mind and Language in Ludwig Wittgenstein and John Searle’ 2nd ed (2019). Those interested in more of my writings may see ‘Talking Monkeys--Philosophy, Psychology, Science, Religion and Politics on a Doomed Planet--Articles and Reviews 2006-2019’ 3rd ed (2019), ‘The Logical Structure of Human Behavior’ (2019), and ‘Suicidal Utopian Delusions in the 21st Century’ 4th ed (2019).

    Families of automata characterizing context-sensitive languages

    In the hierarchy of infinite graph families, rational graphs are defined by rational transducers with labelled final states. This paper proves that their traces are precisely the context-sensitive languages, and that this result remains true for synchronized rational graphs.
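For readers unfamiliar with the class being characterized: a canonical example (not from the paper itself) of a context-sensitive language that is not context-free is {aⁿbⁿcⁿ : n ≥ 1}, whose membership test is easy to state directly:

```python
def is_anbncn(s: str) -> bool:
    """Decide membership in {a^n b^n c^n : n >= 1}, a canonical
    context-sensitive language that is not context-free."""
    n = len(s) // 3
    if n == 0 or len(s) != 3 * n:
        return False
    # The string must be exactly n a's, then n b's, then n c's.
    return s == "a" * n + "b" * n + "c" * n

# is_anbncn("aabbcc") is True; is_anbncn("aabbc") is False
```

The paper's result says that exactly the languages of this class (and no more) arise as traces of rational graphs.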

    Adapting to Computer Science

    Although I am not an engineer who adapted himself to computer science but a mathematician who did so, I am familiar enough with the development, concepts, and activities of this new discipline to venture an opinion on what must be adapted to in it. Computer and Information Science is known as Informatics on the European continent. It was born as a distinct discipline barely a generation ago. As a fresh young discipline, it is an effervescent mixture of formal theory, empirical applications, and pragmatic design. Mathematics was just such an effervescent mixture in western culture from the renaissance to the middle of the twentieth century. It was then that the dynamic effect of high-speed, electronic, general-purpose computers accelerated the generalization of the meaning of the word "computation." This caused early computer science to recruit not only mathematicians but also philosophers (especially logicians), linguists, psychologists, even economists, as well as physicists and a variety of engineers. Thus we are, perforce, discussing the changes and adaptations of individuals to disciplines, and especially of people in one discipline to another. As we all know, the very word "discipline" indicates that there is an initial special effort by an individual to force himself or herself to change. The change involves adaptation of one's perceptions to a special way of viewing certain aspects of the world, and also of one's behavior in order to produce special results. For example, we are familiar with the enormous prosthetic devices that physicists have added to their natural sensors and perceptors in order to perceive minute particles and to smash atoms in order to do so (at, we might add, enormous expense, and enormous stretching of computational activity). We are also familiar with the enormously intricate prosthetic devices mathematicians added to their computational effectors: the general symbol manipulators called computers.

    Gods of Transhumanism

    The purpose of the article is to identify the religious factor in the teaching of transhumanism, to determine its role in the ideology of this current of thought, and to identify the possible limits of technology's interference in human nature. Theoretical basis. The methodological basis of the article is the idea of transhumanism. Originality. In the foreseeable future, robots will be able to pass the Turing test, become "electronic personalities" and gain political rights, although the question of the possibility of machine consciousness and self-awareness remains open. In robots, people are creating assistants with whom, given the starting conditions, they will almost certainly lose the evolutionary competition. To compete successfully with robots, people will have to change, ceasing to be people in the classical sense. Changing the nature of man will require the emergence of a new – posthuman – anthropology. Conclusions. Against the background of the scientific discoveries, technical breakthroughs and everyday improvements of recent decades, an anthropological revolution has taken shape, one that makes it possible to set the task of creating inhumanly intelligent creatures and of changing human nature, up to discussing options for artificial immortality. The history of man ends and the history of the posthuman begins. We can no longer turn off this path; however, it is in our power to preserve our human qualities in the posthuman future. The theme of the soul has reasserted itself, but from a different perspective – as the theme of consciousness and self-awareness. It has become relevant again in connection with the development of computer and cloud technologies, artificial intelligence technologies, and so on. If a machine ever becomes a "man", can a man then become a "machine"? However, even if such a hypothetical possibility were to become reality, we could not speak of any form of individual immortality or of the continuation of existence in a different physical form. A digital copy of the soul will still remain a copy, and I see no fundamental possibility of isolating a substrate-independent mind from the human body. Immortality itself is needed not so much to quiet someone's fears or encourage someone's hopes as for the final resolution of a religious question. However, the gods hold the keys to heaven firmly and are unlikely to admit our modified descendants there.