
    Topics in Programming Languages, a Philosophical Analysis through the case of Prolog

    Programming languages seldom find proper anchorage in the philosophy of logic, language and science. What is more, philosophy of language seems to be restricted to natural languages and linguistics, and even philosophy of logic is rarely framed in terms of programming-language topics. The logic programming paradigm and Prolog are thus the most adequate paradigm and programming language with which to work on this subject, combining natural language processing and linguistics, logic programming and constriction methodology on both algorithms and procedures, in an overall philosophizing declarative status. Not only this, but the dimension of the Fifth Generation Computer Systems project related to strong AI, wherein Prolog took a major role, and its historical frame in the crucial dialectic between procedural and declarative paradigms, and between structuralist and empiricist biases, serve, in exemplary form, to address the philosophy of logic, language and science in the contemporary age as well. In recounting Prolog's philosophical, mechanical and algorithmic harbingers, the opportunity is open to various routes, of which we exemplify some here. The mechanical-computational background explored by Pascal, Leibniz, Boole, Jacquard, Babbage and Konrad Zuse, up to the ACE (Alan Turing) and the EDVAC (von Neumann), offers the backbone of computer architecture; in parallel, the work of Turing, Church, Gödel, Kleene, von Neumann, Shannon and others on computability, studied thoroughly and in detail, permits us to interpret the evolving realm of programming languages. The line from the lambda calculus to the Algol family, the declarative and procedural split between the C language and Prolog, and the ensuing branching, explosion and further delimitation of programming languages are thereupon inspected so as to relate them to the proper syntax, semantics and philosophical élan of logic programming and Prolog.
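
    Since the abstract turns on the declarative/procedural split that Prolog exemplifies, a minimal illustrative sketch may help. It is written in Python rather than Prolog, it is not drawn from the thesis, and the family-tree facts and the names parent, ancestor and ancestor_procedural are invented for illustration. The first function reads a pair of Horn clauses declaratively ("X is an ancestor of Y if..."); the second prescribes, step by step, how to traverse the tree.

        # Declarative reading: state the relation; the evaluator finds the answer.
        parent = {("tom", "bob"), ("bob", "ann"), ("ann", "liz")}

        def ancestor(x, y):
            """Backward chaining over the clauses:
            ancestor(X, Y) :- parent(X, Y).
            ancestor(X, Y) :- parent(X, Z), ancestor(Z, Y)."""
            if (x, y) in parent:
                return True
            return any(ancestor(z, y) for (p, z) in parent if p == x)

        # Procedural reading: spell out *how* to traverse the tree.
        def ancestor_procedural(x, y):
            frontier = [x]
            while frontier:
                current = frontier.pop()
                for (p, c) in parent:
                    if p == current:
                        if c == y:
                            return True
                        frontier.append(c)
            return False

        print(ancestor("tom", "liz"))             # True
        print(ancestor_procedural("tom", "liz"))  # True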

    Natural Communication

    In Natural Communication, the author criticizes the current paradigm of specific goal orientation in the complexity sciences. His model of "natural communication" encapsulates modern theoretical concepts from mathematics and physics, in particular category theory and quantum theory. The author is convinced that only by looking to the past is it possible to establish continuity and coherence in the complexity sciences.

    Proceedings of the 1968 Summer Institute on Symbolic Mathematical Computation

    Investigating symbolic mathematical computation using the PL/I FORMAC batch system and the Scope FORMAC interactive system.

    The physical cosmology of Alfred North Whitehead

    Throughout the history of philosophy, cosmological theories have always deservedly enjoyed a position of special prominence. Of all recent cosmologies, or philosophies of Nature, perhaps the most comprehensive and satisfactory is that offered by Alfred North Whitehead. Whitehead, always both mathematician and philosopher, enjoyed a full career as mathematician at Cambridge and London Universities before answering an invitation from Harvard University to a chair in philosophy there. His interests invariably carried him to the forefront of the advance, and his more technical mathematical works bore the imprint of a philosopher. His philosophy carried the marks of its birth in mathematics and the physical sciences.

    Although his Treatise on Universal Algebra (1898) won him an enviable reputation, it was his collaboration with Bertrand Russell in the first decade of the twentieth century on Principia Mathematica which proved his pioneering genius. In the middle of this decade, Whitehead offered to the Royal Society of London a memoir entitled "On Mathematical Concepts of the Material World." This memoir, which fell into oblivion, employed the symbolic technique of Principia Mathematica in solving a problem of fundamental importance to cosmological theory. Given a set of entities and a relation between those entities, Whitehead attempted to show the whole of Euclidean geometry to be an expression of the properties of the field of that relation. Certain extraneous relations served to associate the axioms with the material world of the physicists, of which Whitehead offered seven alternative concepts.

    The first three volumes of Principia Mathematica had been published, and Whitehead had begun his work on the fourth, which was to have been concerned with the application of symbolic reasoning to the foundations of geometry and the problem of space. But by this time the scientific world had been captivated by the publication of the special and general theories of relativity by Einstein. These novelties naturally attracted Whitehead, who wrote several essays on the presuppositions of relativity. Whitehead was convinced that the principle and the method introduced by Einstein constituted a revolution in physical science, but found Einstein's explanation faulty.

    A series of three important volumes on Nature introduced the philosophy of Nature as conceived by Whitehead, using his own interpretation of the meaning of the new relativity. A powerful method of analysis, called the Method of Extensive Abstraction and having as its purpose the definition of spatial and temporal entities so as to avoid a circularity of reasoning, was born in this period. The third of the volumes was devoted entirely to the development of his own theory of relativity, to which the philosophically more satisfactory interpretation of relativity could be readily applied. From his original presuppositions Whitehead offered four alternative relativity theories, one of which coincided with Einstein's, and two of which were attempts at a unified field theory. The fourth, a theory of gravitation, used a physical element, the "impetus," instead of an infinitesimal metric element, as Einstein had done. This theory proved to be empirically less satisfactory than Einstein's. But Professor George Temple generalized this fourth theory by using a space-time of positive uniform curvature, and results more satisfactory empirically than Einstein's followed. The philosophical advantages of Whitehead's relativity were retained. This result seems to invite a more careful consideration of Temple's generalization of Whitehead's relativity than it has so far received.

    But by this time Whitehead's speculations, which took as their restricted field the area of nature in which mind was irrelevant, began to concentrate on the enlarged field of cosmological theory in its points of contact with metaphysics. The most important discovery he believed he had made was that, in this enlarged area, all the more special physical and extensive properties of nature were dependent for their existence upon process. Now in his sixties, Whitehead accepted Harvard's invitation to a chair in philosophy. Within a very few years he returned to the United Kingdom to deliver the Gifford Lectures at the University of Edinburgh, in which the implications of adopting process as the central principle in the universe were systematically presented.

    One outstanding feature of these lectures has unfortunately been ignored; it is a major and original suggestion of this thesis that the categoreal scheme of Process and Reality is really the axiomatic scheme of "On Mathematical Concepts of the Material World" generalized at the metaphysical level. An attempt at the application of the symbolic method to the axioms (the categories of explanation and obligation) is made here. Thus the generalized problem in Process and Reality becomes: "Given a set of ontological existents and the operation of creativity, what axioms regarding the operation of creativity will have as their result that the more specialized discoveries of the humanities and the sciences follow from the properties of those entities forming the field of creativity?"

    These lectures, although they offered a comprehensive metaphysical system justifying the operation of physical field theories, suffered under the misfortune that they were given at just the time when the quantum mechanics revolution was precipitated in the physical sciences. From the point of view of quantum mechanics, therefore, the philosophy of organism does not supply a satisfactory cosmology within which it can operate. This is especially unfortunate in view of his possibly superior physical theory of relativity; possible points of expansion to allow for quantum mechanics are indicated, although they do violence to the base of the philosophy of organism. As the chief exemplification of the metaphysical principles, Whitehead postulated a brilliantly conceived metaphysical God who was important in physical cosmology. It is suggested that this metaphysical God is, nevertheless, inadequate to satisfy the demands of the religious conscience.

    Despite the originality of most of the elements introduced by Whitehead, a full understanding of his meaning and an appreciation of his novelties is possible only by referring his writings to their proper settings. Thus, the philosophy of organism is explained against the background of the process philosophies of Bergson, Alexander, and Morgan. Because of its many similarities to the Timaeus in the setting of the cosmological problem and the essentials of its solution, a special chapter is devoted to the correspondence between the two. Whitehead's relativity and philosophy of Nature require an understanding of the development of the theory of relativity, the world-models of the relativistic cosmologies, and the attempts at a unified field theory. Similarly, the memoir of 1905 is described against a more general background, setting forth a broad picture of the state of geometry, physical science, and philosophy at the turn of the century.

    As a final reflection, certain presuppositions at the base of Whitehead's philosophy of organism are investigated and evaluated. The points believed by the present writer to be especially vulnerable in the philosophy of organism are exposed. An experiment in suggesting the prospectus of an alternative system, which might avoid the difficulties and incorporate the advantages of the philosophy of organism, is made with the warning that it is no more than a suggestion. Throughout the thesis, certain dominant strains of Whitehead's thinking can be detected: the importance in his mind of the axiomatic-deductive method in the sciences; the realization that prevalent habits of thinking need to be altered by new discoveries, but are resisted; the conviction that the sciences must be ontologically centered; the faith in field theories; and the conviction that cosmology must be the search for the forms in the facts, to designate the more outstanding convictions.

    Elastodynamics of Failure in a Continuum

    A general treatment of the elastodynamics of failure in a prestressed elastic continuum is given, with particular emphasis on the geophysical aspects of the problem. The principal purpose of the study is to provide a physical model of the earthquake phenomenon, which yields an explicit description of the radiation field in terms of source parameters. The Green's tensor solution to the equations of motion in a medium with moving boundaries is developed. Using this representation theorem, and its specialization to the scalar case by means of potentials, it is shown that material failure in a continuum can be treated equivalently as a boundary value problem or as an initial value problem. The initial value representation is shown to be preferable for geophysical purposes, and the general solution for a growing and propagating rupture zone is given. The energy balance of the phenomenon is discussed with particular emphasis on the physical source of the radiated energy. It is also argued that the flow of energy is the controlling factor for the propagation and growth of a failure zone. Failure should then be viewed as a generalized phase change of the medium. The theory is applied to the simple case of a growing and propagating spherical failure zone. The model is investigated in detail both analytically and numerically. The analysis is performed in the frequency domain and the radiation fields are given in the form of multipolar expansions. The necessary theorems for the manipulation of such expansions for seismological purposes are proved, and their use discussed on the basis of simple examples. The more realistic ellipsoidal failure zone is investigated. The static problem of an arbitrary ellipsoidal inclusion under homogeneous stress of arbitrary orientation is solved. It is then shown how the analytical solution can be combined with numerical techniques to yield more realistic models. The conclusion is that this general approach yields a very flexible model which can be adapted to a wide variety of physical circumstances. In spite of the simplicity of the model, the predicted radiation field is rather complex; it is discussed as a function of source parameters, and scaling laws are derived which ease the interpretation of observed spectra. Preliminary results in the time domain are also shown. It is concluded that the model can be compared favorably both with the observations, and with results obtained from purely numerical models
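
    For orientation, the textbook form of the elastodynamic representation theorem for a displacement discontinuity [u_i] across a rupture surface Sigma, the kind of Green's-tensor representation the abstract refers to (though not necessarily the exact expression derived in the thesis), can be written in LaTeX notation as

        u_n(\mathbf{x},t) \;=\; \int_{-\infty}^{\infty} d\tau \int_{\Sigma} \big[u_i(\boldsymbol{\xi},\tau)\big]\, c_{ijpq}\, \nu_j\, \frac{\partial G_{np}(\mathbf{x},t-\tau;\boldsymbol{\xi},0)}{\partial \xi_q}\, d\Sigma ,

    where c_{ijpq} are the elastic moduli, \nu_j is the normal to the surface, and G_{np} is the Green's tensor.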

    Social work with airport passengers

    Social work at the airport consists in offering social services to passengers. The main methodological position is that passengers are under stress, which is characterized by a particular set of features in appearance and behavior. In such circumstances a passenger's actions attract attention. Only a person whom the passenger trusts can help him, whether with documents or psychologically.

    The Principles of Mathematics

    Published in 1903, this book was the first comprehensive treatise on the logical foundations of mathematics written in English. It sets forth, as far as possible without mathematical and logical symbolism, the grounds in favour of the view that mathematics and logic are identical. It proposes simply that what is commonly called mathematics consists merely of later deductions from logical premises. It provided the thesis for which _Principia Mathematica_ provided the detailed proof, and introduced the work of Frege to a wider audience. In addition to the new introduction by John Slater, this edition contains Russell's introduction to the 1937 edition, in which he defends his position against his formalist and intuitionist critics.

    Algorithms, abstraction and implementation: a massively multilevel theory of strong equivalence of complex systems

    This thesis puts forward a formal theory of levels and algorithms to provide a foundation for those terms as they are used in much of cognitive science and computer science. Abstraction with respect to concreteness is distinguished from abstraction with respect to detail, resulting in three levels of concreteness and a large number of algorithmic levels, which are levels of detail and the primary focus of the theory. An algorithm or ideal machine is a set of sequences of states defining a particular level of detail. Rather than one fundamental ideal machine to describe the behaviour of a complex system, there are many possible ideal machines, extending Turing's approach to reflect the multiplicity of system descriptions required to express more than weak input-output equivalence of systems. Cognitive science is concerned with stronger equivalence; e.g., do two models go through the same states at some level of description? The state-based definition of algorithms serves as a basis for such strong equivalence and facilitates formal renditions of abstraction and implementation as relations between algorithms. It is possible to prove within the new framework whether or not one given algorithm is a valid implementation of another, or whether two unequal algorithms have a common abstraction, for example. Some implications of the theory are discussed, notably a characterisation of connectionist versus classical models
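
    The state-based definition of algorithms lends itself to a small computational illustration. The following sketch is not the thesis's formalism: it assumes finite traces, a total mapping from concrete to abstract states, and the convention that consecutive repeats of an abstract state collapse (so several concrete steps may realise one abstract step), and the names abstract_trace, implements and the counter example are invented for illustration.

        # An "algorithm" is modelled as a set of state sequences (traces).

        def abstract_trace(trace, mapping):
            """Map each concrete state to its abstract state, collapsing
            consecutive repeats so several concrete steps may realise one
            abstract step."""
            out = []
            for state in trace:
                a = mapping[state]
                if not out or out[-1] != a:
                    out.append(a)
            return tuple(out)

        def implements(concrete, abstract, mapping):
            """True if every concrete trace abstracts to some trace of the
            abstract algorithm."""
            return all(abstract_trace(t, mapping) in abstract for t in concrete)

        # Hypothetical example: a half-step counter implementing a whole-step counter.
        abstract_alg = {(0, 1, 2)}
        concrete_alg = {(0.0, 0.5, 1.0, 1.5, 2.0)}
        mapping = {0.0: 0, 0.5: 0, 1.0: 1, 1.5: 1, 2.0: 2}

        print(implements(concrete_alg, abstract_alg, mapping))  # True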

    The Significance of Evidence-based Reasoning in Mathematics, Mathematics Education, Philosophy, and the Natural Sciences

    In this multi-disciplinary investigation we show how an evidence-based perspective of quantification, in terms of algorithmic verifiability and algorithmic computability, admits evidence-based definitions of well-definedness and effective computability, which yield two unarguably constructive interpretations of first-order Peano Arithmetic (PA) over the structure N of the natural numbers that are complementary, not contradictory. The first yields the weak, standard interpretation of PA over N, which is well-defined with respect to assignments of algorithmically verifiable Tarskian truth values to the formulas of PA under the interpretation. The second yields a strong, finitary interpretation of PA over N, which is well-defined with respect to assignments of algorithmically computable Tarskian truth values to the formulas of PA under the interpretation. We situate our investigation within a broad analysis of quantification vis-à-vis:

    * Hilbert's epsilon-calculus
    * Gödel's omega-consistency
    * the Law of the Excluded Middle
    * Hilbert's omega-rule
    * an algorithmic omega-rule
    * Gentzen's Rule of Infinite Induction
    * Rosser's Rule C
    * Markov's Principle
    * the Church-Turing Thesis
    * Aristotle's particularisation
    * Wittgenstein's perspective of constructive mathematics
    * an evidence-based perspective of quantification.

    By showing how these are formally inter-related, we highlight the fragility of both the persisting, theistic, classical/Platonic interpretation of quantification grounded in Hilbert's epsilon-calculus, and the persisting, atheistic, constructive/Intuitionistic interpretation of quantification rooted in Brouwer's belief that the Law of the Excluded Middle is non-finitary. We then consider some consequences for mathematics, mathematics education, philosophy, and the natural sciences of an agnostic, evidence-based, finitary interpretation of quantification that challenges classical paradigms in all these disciplines.
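
    Roughly, and in a paraphrase of my own rather than the paper's wording, the two notions that drive the complementary interpretations differ in the order of their quantifiers. For a number-theoretic formula F(x) over N, in LaTeX notation:

        \text{algorithmically verifiable: } (\forall n)(\exists\, \mathrm{AL}_{F,n})\ \big[\mathrm{AL}_{F,n}\ \text{decides each of}\ F(1),\dots,F(n)\big]

        \text{algorithmically computable: } (\exists\, \mathrm{AL}_{F})(\forall n)\ \big[\mathrm{AL}_{F}\ \text{decides}\ F(n)\big]

    On this reading, computability entails verifiability simply by the quantifier shift, but not, in general, conversely, which is what allows the two interpretations of PA to come apart.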