
    Cognitive architectures as Lakatosian research programmes: two case studies

    Cognitive architectures - task-general theories of the structure and function of the complete cognitive system - are sometimes argued to be more akin to frameworks or belief systems than scientific theories. The argument stems from the apparent non-falsifiability of existing cognitive architectures. Newell was aware of this criticism and argued that architectures should be viewed not as theories subject to Popperian falsification, but rather as Lakatosian research programmes based on cumulative growth. Newell's argument is undermined, however, by his failure to demonstrate that the development of Soar, his own candidate architecture, adhered to Lakatosian principles. This paper presents detailed case studies of the development of two cognitive architectures, Soar and ACT-R, from a Lakatosian perspective. It is demonstrated that both are broadly Lakatosian, but that in both cases there have been theoretical progressions that, by Lakatosian criteria, are pseudo-scientific. Thus, Newell's defense of Soar as a scientific rather than pseudo-scientific theory is not supported in practice. The ACT series of architectures has fewer pseudo-scientific progressions than Soar, but it too is vulnerable to accusations of pseudo-science. From this analysis, it is argued that successive versions of theories of the human cognitive architecture must explicitly address five questions in order to maintain scientific credibility.

    Instructional strategies and tactics for the design of introductory computer programming courses in high school

    This article offers an examination of instructional strategies and tactics for the design of introductory computer programming courses in high school. We distinguish the Expert, Spiral and Reading approach as groups of instructional strategies that mainly differ in their general design plan to control students' processing load. In order, they emphasize top-down program design, incremental learning, and program modification and amplification. In contrast, tactics are specific design plans that prescribe methods to reach desired learning outcomes under given circumstances. Based on ACT* (Anderson, 1983) and relevant research, we distinguish between declarative and procedural instruction and present six tactics which can be used both to design courses and to evaluate strategies. Three tactics for declarative instruction involve concrete computer models, programming plans and design diagrams; three tactics for procedural instruction involve worked-out examples, practice of basic cognitive skills and task variation. In our evaluation of groups of instructional strategies, the Reading approach has been found to be superior to the Expert and Spiral approaches.

    Declarative Specification

    Deriving formal specifications from informal requirements is extremely difficult, since one has to overcome the conceptual gap between an application domain and the domain of formal specification methods. To reduce this gap we introduce application-specific specification languages, i.e., graphical and textual notations that can be unambiguously mapped to formal specifications in a logic language. We describe a number of realised approaches based on this idea, and evaluate them with respect to their domain specificity vs. generality.
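    To make the idea of an unambiguous mapping concrete, the following minimal sketch (an editorial illustration, not one of the realised approaches described in the paper; the notation, pattern and function names are invented) translates a toy application-specific textual notation into first-order logic formulas.

```python
# A minimal sketch (hypothetical notation and names) of mapping an
# application-specific textual notation to formulas in a logic language.
import re

# Toy domain-specific notation: "every <A> has a <B>"
PATTERN = re.compile(r"every (\w+) has a (\w+)")

def to_logic(requirement: str) -> str:
    """Translate one requirement sentence into a first-order logic formula."""
    match = PATTERN.fullmatch(requirement.strip().lower())
    if match is None:
        raise ValueError(f"not in the supported notation: {requirement!r}")
    a, b = match.groups()
    # Unambiguous mapping: the same sentence always yields the same formula.
    return f"forall X: {a}(X) -> exists Y: {b}(Y) & has(X, Y)"

if __name__ == "__main__":
    print(to_logic("Every order has a customer"))
    # forall X: order(X) -> exists Y: customer(Y) & has(X, Y)
```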

    Experience, expertise and expert-performance research in public accounting

    Bibliography: p. [22-25]

    S-Net for multi-memory multicores

    Copyright ACM, 2010. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in Proceedings of the 5th ACM SIGPLAN Workshop on Declarative Aspects of Multicore Programming: http://doi.acm.org/10.1145/1708046.1708054
    S-Net is a declarative coordination language and component technology aimed at modern multi-core/many-core architectures and systems-on-chip. It builds on the concept of stream processing to structure dynamically evolving networks of communicating asynchronous components. Components themselves are implemented using a conventional language suitable for the application domain. This two-level software architecture maintains a familiar sequential development environment for large parts of an application and offers a high-level declarative approach to component coordination. In this paper we present a conservative language extension for the placement of components and component networks in a multi-memory environment, i.e. architectures that associate individual compute cores or groups thereof with private memories. We describe a novel distributed runtime system layer that complements our existing multithreaded runtime system for shared memory multicores. Particular emphasis is put on efficient management of data communication. Last but not least, we present preliminary experimental data.
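    The core idea of coordinating asynchronous components in a stream network and placing them in separate address spaces can be illustrated loosely; the sketch below is not S-Net syntax or its runtime system, just a Python/multiprocessing analogue in which each stage runs in its own process standing in for a private-memory node.

```python
# Illustrative analogue only: asynchronous stream components wired into a
# pipeline, each "placed" in its own process (its own address space).
from multiprocessing import Process, Queue

def component(transform, inbox: Queue, outbox: Queue) -> None:
    """A stream component: read records, transform them, emit results."""
    while True:
        record = inbox.get()
        if record is None:          # end-of-stream marker
            outbox.put(None)
            return
        outbox.put(transform(record))

def square(x):
    return x * x

def negate(x):
    return -x

if __name__ == "__main__":
    a, b, c = Queue(), Queue(), Queue()
    stages = [
        Process(target=component, args=(square, a, b)),
        Process(target=component, args=(negate, b, c)),
    ]
    for p in stages:
        p.start()
    for item in [1, 2, 3, None]:
        a.put(item)
    while (result := c.get()) is not None:
        print(result)               # -1, -4, -9
    for p in stages:
        p.join()
```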

    Phoneme Recognition Using Acoustic Events

    This paper presents a new approach to phoneme recognition using non-sequential sub-phoneme units. These units are called acoustic events and are phonologically meaningful as well as recognizable from speech signals. Acoustic events form a phonologically incomplete representation as compared to distinctive features. This problem may partly be overcome by incorporating phonological constraints. Currently, 24 binary events describing manner and place of articulation, vowel quality and voicing are used to recognize all German phonemes. Phoneme recognition in this paradigm consists of two steps: after the acoustic events have been determined from the speech signal, a phonological parser is used to generate syllable and phoneme hypotheses from the event lattice. Results obtained on a speaker-dependent corpus are presented. Comment: 4 pages, to appear at ICSLP'94, PostScript version (compressed and uuencoded).
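    The second step, turning detected binary events into phoneme hypotheses, can be sketched in miniature as below. The event inventory, phoneme set and scoring rule are invented for illustration and are not the paper's 24 German events or its phonological parser.

```python
# Hypothetical sketch: matching detected binary acoustic events against
# phoneme definitions to rank phoneme hypotheses for one lattice segment.
PHONEME_EVENTS = {
    "m": {"nasal", "labial", "voiced"},
    "s": {"fricative", "alveolar"},
    "a": {"vocalic", "open", "voiced"},
}

def score(detected: set[str], expected: set[str]) -> float:
    """Jaccard overlap between detected events and a phoneme's expected events."""
    return len(detected & expected) / len(detected | expected)

def hypotheses(detected: set[str], threshold: float = 0.5) -> list[tuple[str, float]]:
    """Rank phoneme hypotheses whose event overlap exceeds the threshold."""
    ranked = [(p, score(detected, ev)) for p, ev in PHONEME_EVENTS.items()]
    return sorted([h for h in ranked if h[1] >= threshold],
                  key=lambda h: h[1], reverse=True)

if __name__ == "__main__":
    print(hypotheses({"nasal", "labial", "voiced"}))   # [('m', 1.0)]
    print(hypotheses({"fricative", "voiced"}))         # no hypothesis clears the threshold
```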

    Data Provenance Inference in Logic Programming: Reducing Effort of Instance-driven Debugging

    Data provenance allows scientists in different domains to validate their models and algorithms and to find anomalies and unexpected behaviors. In previous works, we described on-the-fly interpretation of (Python) scripts to build a workflow provenance graph automatically and then infer fine-grained provenance information based on the workflow provenance graph and the availability of data. To broaden the scope of our approach and demonstrate its viability, in this paper we extend it beyond procedural languages, to be used for purely declarative languages such as logic programming under the stable model semantics. For experiments and validation, we use the Answer Set Programming solver oClingo, which makes it possible to formulate and solve stream reasoning problems in a purely declarative fashion. We demonstrate how the benefits of provenance inference over explicit provenance still hold in a declarative setting, and we briefly discuss the potential impact for declarative programming, in particular for instance-driven debugging of the model in declarative problem solving.
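    The general notion of building a workflow provenance graph while a (Python) script runs can be sketched as follows; this is an editorial illustration with invented names, not the authors' interpreter or its graph model.

```python
# Minimal sketch: each traced call becomes a provenance node recording which
# inputs it used and which output it generated.
import functools
import itertools

PROVENANCE: list[dict] = []          # flat record of the workflow provenance graph
_ids = itertools.count()

def traced(func):
    """Record inputs, output and the producing activity for every call."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        result = func(*args, **kwargs)
        PROVENANCE.append({
            "id": next(_ids),
            "activity": func.__name__,
            "used": [repr(a) for a in args] + [f"{k}={v!r}" for k, v in kwargs.items()],
            "generated": repr(result),
        })
        return result
    return wrapper

@traced
def clean(values):
    return [v for v in values if v is not None]

@traced
def mean(values):
    return sum(values) / len(values)

if __name__ == "__main__":
    mean(clean([1.0, None, 3.0]))
    for node in PROVENANCE:          # inspect which inputs led to which outputs
        print(node)
```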