236 research outputs found

    Computational aerodynamics and artificial intelligence

    The general principles of artificial intelligence are reviewed and speculations are made concerning how knowledge-based systems can accelerate the process of acquiring new knowledge in aerodynamics, how computational fluid dynamics may use expert systems, and how expert systems may speed the design and development process. In addition, the anatomy of an idealized expert system called AERODYNAMICIST is discussed. Resource requirements for using artificial intelligence in computational fluid dynamics and aerodynamics are examined. Three main conclusions are presented. First, there are two related aspects of computational aerodynamics: reasoning and calculating. Second, a substantial portion of reasoning can be achieved with artificial intelligence. It offers the opportunity of using computers as reasoning machines to set the stage for efficient calculating. Third, expert systems are likely to be new assets of institutions involved in aeronautics for various tasks of computational aerodynamics.
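
    As a rough illustration of the "reasoning before calculating" idea, the sketch below encodes a few if-then rules for routing a flow problem to a suitable calculation. It is a minimal Python example written for this summary; the rules, thresholds, and solver names are hypothetical and are not taken from the AERODYNAMICIST design.

        # Minimal rule-based sketch: pick a flow model from facts about the problem.
        # All rules, thresholds, and solver names here are illustrative assumptions.
        def recommend_solver(mach: float, reynolds: float, viscous: bool) -> str:
            """Apply simple if-then expertise before any expensive calculation."""
            if mach < 0.3 and not viscous:
                return "incompressible potential-flow solver"
            if mach < 0.8 and viscous and reynolds > 1e6:
                return "RANS solver with a turbulence model"
            if mach >= 0.8:
                return "compressible Euler/Navier-Stokes solver"
            return "full Navier-Stokes solver (no cheaper model applies)"

        print(recommend_solver(mach=0.75, reynolds=5e6, viscous=True))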

    A survey of functional programming language principles

    Research in the area of functional programming languages has intensified in the 8 years since John Backus' Turing Award Lecture on the topic was published. The purpose of this paper is to present a survey of the ideas of functional programming languages. The paper assumes the reader is comfortable with mathematics and has knowledge of the basic principles of traditional programming languages, but does not assume any prior knowledge of the ideas of functional languages. A simple functional language is defined and used to illustrate the basic ideas. Topics discussed include the reasons for developing functional languages, methods of expressing concurrency, the algebra of functional programming languages, program transformation techniques, and implementations of functional languages. Existing functional languages are also mentioned. The paper concludes with the author's opinions as to the future of functional languages. An annotated bibliography on the subject is also included.
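
    To give a concrete flavour of the functional style and of the algebraic program transformations the survey covers, here is a small Python sketch (Python stands in for the paper's example language): a computation written as a composition of filter, map, and fold, and an equivalent fused single-pass form justified by a simple algebraic identity.

        from functools import reduce

        # Functional style: the program is a composition of functions,
        # with no assignment to mutable state.
        def square(x):
            return x * x

        def is_even(x):
            return x % 2 == 0

        xs = range(1, 11)

        # Two logical passes: select the even numbers, square them, then sum.
        evens_squared = [square(x) for x in xs if is_even(x)]
        two_pass = reduce(lambda acc, y: acc + y, evens_squared, 0)

        # A filter/map/fold fusion law justifies doing the same work in one pass.
        one_pass = reduce(lambda acc, x: acc + square(x) if is_even(x) else acc, xs, 0)

        assert two_pass == one_pass == 220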

    The MUSE Machine -- an Architecture for Structured Data Flow Computation

    Computers employing some degree of data flow organisation are now well established as providing a possible vehicle for concurrent computation. Although data-driven computation frees the architecture from the constraints of the single program counter, processor and global memory, inherent in the classic von Neumann computer, there can still be problems with the unconstrained generation of fresh result tokens if a pure data flow approach is adopted. The advantages of allowing serial processing for those parts of a program which are inherently serial, and of permitting a demand-driven, as well as data-driven, mode of operation are identified and described. The MUSE machine described here is a structured architecture supporting both serial and parallel processing which allows the abstract structure of a program to be mapped onto the machine in a logical way.
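
    The difference between data-driven and demand-driven operation can be illustrated, very loosely and not as a model of the MUSE hardware, with eager versus lazy evaluation in Python: the eager producer emits every result token as soon as its inputs exist, while the lazy producer only computes a token when a consumer demands it.

        import itertools

        # Data-driven flavour: every result token is produced as soon as its
        # inputs are available, whether or not anything consumes it.
        def squares_eager(inputs):
            return [x * x for x in inputs]         # all tokens materialised at once

        # Demand-driven flavour: a token is computed only when asked for,
        # so even an unbounded input stream causes no flood of fresh tokens.
        def squares_lazy(inputs):
            for x in inputs:
                yield x * x                        # one token per demand

        eager = squares_eager(range(5))            # five tokens
        lazy = squares_lazy(itertools.count())     # nothing computed yet
        first_three = list(itertools.islice(lazy, 3))
        print(eager, first_three)                  # [0, 1, 4, 9, 16] [0, 1, 4]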

    The development and use of variables in mathematics and computer science

    There are a wide variety of uses of variables in mathematics which we cope with in practice through conventions and tacit assumptions. Experience with computers has made us articulate, criticise and develop these assumptions much more carefully. Historically the term 'variable quantity' was introduced in the context of describing and calculating changing quantities which corresponded to phenomena in the observable world (e.g. the velocity of fluxion of a body moving under the inverse square law). The evolution of the concept has divorced it from these routes of reference and required us to establish the formal apparatus of interpretation and valuation. While the changes considered are highly structured this may be satisfactory, but computing power invites us to cope with change in vastly more complex, unstructured situations such as in simulation of 'real world' processes. We relate this challenge to the distinctive differences in the use of variables in mathematics and practical computing, and we develop a general framework in which all uses of variables can be described in a unified way.
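
    One way to make the mathematics/computing distinction concrete is sketched below in Python (an illustration for this summary, not the framework the paper develops): a mathematical variable acquires a value only through a valuation supplied by the formal apparatus of interpretation, whereas a programming variable names storage whose contents change over time.

        # Mathematical reading: a variable is given meaning by a valuation,
        # an environment mapping names to values; the expression never changes.
        def evaluate(expr, valuation):
            """expr is a nested tuple such as ('+', 'x', ('*', 'y', 'y'))."""
            if isinstance(expr, str):
                return valuation[expr]
            if isinstance(expr, (int, float)):
                return expr
            op, left, right = expr
            a, b = evaluate(left, valuation), evaluate(right, valuation)
            return a + b if op == '+' else a * b

        expr = ('+', 'x', ('*', 'y', 'y'))
        print(evaluate(expr, {'x': 1, 'y': 3}))    # 10 under this valuation
        print(evaluate(expr, {'x': 2, 'y': 0}))    # 2 under another valuation

        # Programming reading: a variable names a cell whose value varies in time.
        x = 1
        x = x + 5      # the "same" variable now denotes a different value
        print(x)       # 6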

    An exercise in transformational programming: Backtracking and Branch-and-Bound

    We present a formal derivation of program schemes that are usually called Backtracking programs and Branch-and-Bound programs. The derivation consists of a series of transformation steps, specifically algebraic manipulations, on the initial specification until the desired programs are obtained. The well-known notions of linear recursion and tail recursion are extended, for structures, to elementwise linear recursion and elementwise tail recursion; and a transformation between them is derived too.
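
    As a rough Python rendering of the two program schemes (an informal sketch, not the paper's derived elementwise-recursive forms): backtracking extends a partial solution and undoes choices that fail, while branch-and-bound additionally prunes any branch whose bound cannot beat the best complete solution found so far.

        # Generic backtracking scheme: grow a partial solution, undoing failed choices.
        def backtrack(partial, candidates, is_complete, extensions):
            if is_complete(partial):
                yield list(partial)
                return
            for c in extensions(partial, candidates):
                partial.append(c)
                yield from backtrack(partial, candidates, is_complete, extensions)
                partial.pop()                      # undo the choice and try the next one

        # Example: the 6 permutations of [1, 2, 3] generated by the scheme.
        perms = list(backtrack([], [1, 2, 3],
                               is_complete=lambda p: len(p) == 3,
                               extensions=lambda p, cs: [c for c in cs if c not in p]))
        assert len(perms) == 6

        # Branch-and-bound scheme: prune branches whose optimistic bound cannot
        # improve on the best value found so far (maximisation version).
        def branch_and_bound(root, children, value, bound):
            best_node, best_value = root, value(root)
            stack = [root]
            while stack:
                node = stack.pop()
                if value(node) > best_value:
                    best_node, best_value = node, value(node)
                stack.extend(c for c in children(node) if bound(c) > best_value)
            return best_node, best_value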

    Extended logic-plus-functional programming

    Extensions of logic and functional programming are integrated in RELFUN. Its valued clauses comprise Horn clauses ('true'-valued) and clauses with a distinguished 'foot' premise (returning arbitrary values). Both the logic and functional components permit LISP-like varying-arity and higher-order operators. The DATAFUN sublanguage of the functional component is shown to be preferable to relational encodings of functions in DATALOG. RELFUN permits non-ground, non-deterministic functions, hence certain functions can be inverted using an 'is'-primitive generalizing that of PROLOG. For function nestings a strict call-by-value strategy is employed. The reduction of these extensions to a relational sublanguage is discussed and their WAM compilation is sketched. Three examples ('serialise', 'wang', and 'eval') demonstrate the relational/functional style in use. The list expressions of RELFUN's LISP implementation are presented in an extended PROLOG-like syntax.
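
    The contrast between a relational encoding of a function and a directly functional one, and the idea of running a function "backwards", can be sketched in ordinary Python (illustrative only; this is neither RELFUN nor DATALOG syntax, and RELFUN's 'is'-style inversion is far more general than the explicit enumeration shown here):

        # Functional encoding: append maps two lists to their concatenation.
        def append_fn(xs, ys):
            return xs + ys

        # Relational encoding: append as a predicate over triples, which can be
        # "run backwards" to enumerate every split of a given result list,
        # i.e. to invert the function non-deterministically.
        def append_rel(zs):
            for i in range(len(zs) + 1):
                yield zs[:i], zs[i:]        # all (xs, ys) with xs + ys == zs

        print(append_fn([1, 2], [3]))           # [1, 2, 3]
        print(list(append_rel([1, 2, 3])))      # the four splittings of [1, 2, 3]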

    Highly concurrent vs. control flow computing models

    Typescript (photocopy)

    A scalable multi-core architecture with heterogeneous memory structures for Dynamic Neuromorphic Asynchronous Processors (DYNAPs)

    Neuromorphic computing systems comprise networks of neurons that use asynchronous events for both computation and communication. This type of representation offers several advantages in terms of bandwidth and power consumption in neuromorphic electronic systems. However, managing the traffic of asynchronous events in large scale systems is a daunting task, both in terms of circuit complexity and memory requirements. Here we present a novel routing methodology that employs both hierarchical and mesh routing strategies and combines heterogeneous memory structures for minimizing both memory requirements and latency, while maximizing programming flexibility to support a wide range of event-based neural network architectures, through parameter configuration. We validated the proposed scheme in a prototype multi-core neuromorphic processor chip that employs hybrid analog/digital circuits for emulating synapse and neuron dynamics together with asynchronous digital circuits for managing the address-event traffic. We present a theoretical analysis of the proposed connectivity scheme, describe the methods and circuits used to implement such scheme, and characterize the prototype chip. Finally, we demonstrate the use of the neuromorphic processor with a convolutional neural network for the real-time classification of visual symbols being flashed to a dynamic vision sensor (DVS) at high speed.
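
    The general idea of combining mesh routing between cores with tag-based (CAM-like) matching inside a core can be sketched as a toy Python model; the packet format, field widths, table sizes, and routing rules of the actual DYNAP chip are not reproduced here, so every detail below is an illustrative assumption.

        # Toy sketch of hierarchical/mesh address-event routing (illustrative only).
        def mesh_hops(src_core, dst_core):
            """Dimension-ordered hops on a 2-D mesh of cores: x direction, then y."""
            (sx, sy), (dx, dy) = src_core, dst_core
            hops = ['E' if dx > sx else 'W'] * abs(dx - sx)
            hops += ['N' if dy > sy else 'S'] * abs(dy - sy)
            return hops

        def deliver(event_tag, cam):
            """Tag match inside the destination core: every synapse whose programmed
            tag equals the incoming event's tag receives the spike."""
            return [syn for syn, tag in cam.items() if tag == event_tag]

        core_cam = {'syn0': 7, 'syn1': 3, 'syn2': 7}   # per-core local memory
        print(mesh_hops((0, 0), (2, 1)))               # ['E', 'E', 'N']
        print(deliver(7, core_cam))                    # ['syn0', 'syn2']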