
    A better life through information technology? The techno-theological eschatology of posthuman speculative science

    This is the pre-peer-reviewed version of the article, published in Zygon 41(2), pp. 267-288, which has been published in final form at http://www3.interscience.wiley.com/journal/118588124/issue. The depiction of human identity in the pop-science futurology of engineer and inventor Ray Kurzweil, the speculative robotics of Carnegie Mellon roboticist Hans Moravec, and the physics of Tulane University mathematics professor Frank Tipler elevates technology, especially information technology, to a point of ultimate significance. For these three figures, information technology offers the potential means by which the problem of human and cosmic finitude can be rectified. Although Moravec's vision of intelligent robots, Kurzweil's hope for imminent human immortality, and Tipler's description of human-like von Neumann probes colonising the very material fabric of the universe may all appear to be nothing more than science-fictional musings, they raise genuine questions about the relationship between science, technology, and religion as regards issues of personal and cosmic eschatology. In an attempt to correct what I see as the 'cybernetic-totalism' inherent in these 'techno-theologies', I argue for a theology of technology, one that interprets technology hermeneutically and grounds human creativity in the broader context of divine creative activity.

    The effect of multiple internal representations on context rich instruction

    This paper presents n-coding, a theoretical model of multiple internal mental representations. The n-coding construct is developed from a review of cognitive and imaging studies suggesting the independence of information processing along different modalities: verbal, visual, kinesthetic, social, etc. A study testing the effectiveness of the n-coding construct in an algebra-based mechanics course is presented. Four sections differing in the level of n-coding opportunities were compared. Besides a traditional-instruction section used as a control group, each of the remaining three treatment sections was given context-rich problems following the 'cooperative group problem solving' approach, with the sections differing in the level of n-coding opportunities designed into their laboratory environment. To measure the effectiveness of the construct, problem-solving skills were assessed, as was conceptual learning using the Force Concept Inventory. In addition, a number of new measures taking into account students' confidence in concepts were developed to complete the picture of student learning. Results suggest that using the developed n-coding construct to design context-rich environments can generate learning gains in problem solving, conceptual knowledge, and concept-confidence. Comment: Submitted to the American Journal of Physics.
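
    The abstract cites conceptual learning measured with the Force Concept Inventory (FCI) plus new concept-confidence measures that are not spelled out here. As a rough illustration only, the Python sketch below computes the Hake normalized gain commonly reported for FCI data and one hypothetical confidence-weighted score; neither is taken from the paper itself.

        # Illustration only: the paper's concept-confidence measures are not given
        # in this abstract. Shown here are (a) the Hake normalized gain commonly
        # used to report Force Concept Inventory results and (b) a hypothetical
        # confidence-weighted score, not the authors' actual metric.

        def normalized_gain(pre_pct, post_pct):
            """Hake gain g = (post - pre) / (100 - pre), scores given in percent."""
            if pre_pct >= 100.0:
                return 0.0                      # no room left to gain
            return (post_pct - pre_pct) / (100.0 - pre_pct)

        def confidence_weighted_score(correct, confidence):
            """Hypothetical measure: average signed confidence, +c for a correct
            answer and -c for an incorrect one; ranges over [-1, 1]."""
            signed = [c if ok else -c for ok, c in zip(correct, confidence)]
            return sum(signed) / len(signed)

        if __name__ == "__main__":
            print(normalized_gain(45.0, 70.0))                                      # ~0.45
            print(confidence_weighted_score([True, True, False], [0.9, 0.6, 0.8]))  # ~0.23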

    Linear Decision and Learning Models

    This memorandum is a first draft of an essay on the simplest "learning" process. Comments are invited. Subsequent sections will treat, among other things: the "stimulus-sampling" model of Estes; relations between Perceptron-type error reinforcement and Bayesian-type correlation reinforcement; and some other statistical methods viewed in the same way.
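
    The memorandum's subject, the simplest Perceptron-style learning processes, can be illustrated with the classical error-correction rule for a single linear threshold unit. The sketch below is the generic textbook rule applied to made-up data, not a reconstruction of the memo's own models or notation.

        # Minimal sketch of Perceptron-style error-correction learning for a single
        # linear threshold unit; a generic textbook rule, not the memo's formulation.

        def train_perceptron(samples, labels, lr=1.0, epochs=25):
            """samples: list of feature tuples; labels: +1 or -1. Returns (weights, bias)."""
            w = [0.0] * len(samples[0])
            b = 0.0
            for _ in range(epochs):
                for x, y in zip(samples, labels):
                    activation = sum(wi * xi for wi, xi in zip(w, x)) + b
                    pred = 1 if activation > 0 else -1
                    if pred != y:              # reinforce the weights only on error
                        w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                        b += lr * y
            return w, b

        if __name__ == "__main__":
            # Toy, linearly separable data: label +1 only when both inputs are 1 (AND).
            X = [(0, 0), (1, 0), (0, 1), (1, 1)]
            y = [-1, -1, -1, 1]
            print(train_perceptron(X, y))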

    Artificial Intelligence


    Perceptrons: an introduction to computational geometry


    Book Review: Marvin L. Minsky and Seymour A. Papert. Perceptrons: An Introduction to. . .

    Introduction to Computational Geometry, Expanded Edition. Cambridge, MA: MIT Press, 1988. 292 pp. Reviewed by Jordan B. Pollack. The Authors are professors at the Massachusetts Institute of Technology, Minsky in Electrical Engineering and Papert in Applied Mathematics. Minsky's major research interests are artificial intelligence (AI) and theories of computation, and Papert's are cybernetics and child development. The Reviewer received his Ph.D. in Computer Science from the University of Illinois in 1987 and is currently an Assistant Professor of Computer and Information Science at the Ohio State University. His research involves the use of connectionist models in traditional AI tasks. Introduction: By 1969, the date of the publication of Perceptrons, AI was not operating in an ivory-tower vacuum. Money was at stake. And while this pressured the field into a preference for short-term achievement, it also put a premium on claims that the sp
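
    The review excerpt above breaks off before reaching the book's technical content, whose best-known results concern what a single-layer perceptron cannot compute, parity (XOR) being the standard example. The sketch below is a brute-force illustration of that limitation over a small grid of weights, not material from the book or the review.

        import itertools

        # XOR is not linearly separable: no weights w1, w2 and bias b satisfy
        # (w1*x1 + w2*x2 + b > 0) == XOR(x1, x2) on all four inputs in {0,1}^2.
        # This search only checks a finite grid, but it fails everywhere,
        # illustrating the single-layer limitation analysed in Perceptrons.

        def realizes_xor(w1, w2, b):
            return all((w1 * x1 + w2 * x2 + b > 0) == bool(x1 ^ x2)
                       for x1, x2 in itertools.product((0, 1), repeat=2))

        grid = [i / 4 for i in range(-8, 9)]   # candidate values in [-2, 2], step 0.25
        found = any(realizes_xor(w1, w2, b)
                    for w1, w2, b in itertools.product(grid, repeat=3))
        print("threshold unit computing XOR found:", found)   # prints: False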

    The halting problem for Turing machines
