
    The Role of Consciousness in Memory

    Conscious events interact with memory systems in learning, rehearsal, and retrieval (Ebbinghaus 1885/1964; Tulving 1985). Here we present hypotheses that arise from the IDA computational model (Franklin, Kelemen and McCauley 1998; Franklin 2001b) of global workspace theory (Baars 1988, 2002). Our primary tool for this exploration is a flexible cognitive cycle employed by the IDA computational model and hypothesized to be a basic element of human cognitive processing. Since cognitive cycles are hypothesized to occur five to ten times a second and include interaction between conscious contents and several of the memory systems, they provide the means for an exceptionally fine-grained analysis of various cognitive tasks. We apply this tool to the small effect size of subliminal learning compared to supraliminal learning, to process dissociation, to implicit learning, to recognition vs. recall, and to the availability heuristic in recall. The IDA model elucidates the role of consciousness in the updating of perceptual memory, transient episodic memory, and procedural memory. In most cases, memory is hypothesized to interact with conscious events for its normal functioning. The methodology of the paper is unusual in that the hypotheses and explanations presented are derived from an empirically based, but broad and qualitative, computational model of human cognition.
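    A minimal sketch of what one such cognitive cycle might look like in code, assuming a toy agent with perceptual and episodic memories; the class and function names below are our own illustrative choices, not the IDA implementation.

```python
# Toy sketch of one "cognitive cycle" (perceive -> broadcast -> memory update),
# loosely following the phases described in the abstract. This is an
# illustration, not the IDA model; every name here is hypothetical.

from dataclasses import dataclass, field

@dataclass
class Workspace:
    percepts: list = field(default_factory=list)
    conscious_content: object = None

def perceive(stimulus, perceptual_memory):
    """Interpret the stimulus using what perceptual memory already stores about it."""
    return perceptual_memory.get(stimulus, f"unfamiliar {stimulus}")

def compete_for_consciousness(percepts):
    """Pick the most salient percept; in this toy, simply the most recent one."""
    return percepts[-1] if percepts else None

def cognitive_cycle(stimulus, perceptual_memory, episodic_memory, workspace):
    percept = perceive(stimulus, perceptual_memory)
    workspace.percepts.append(percept)

    # Conscious broadcast: the winning content becomes globally available.
    workspace.conscious_content = compete_for_consciousness(workspace.percepts)

    # Memory updates are driven by the conscious broadcast (the abstract's
    # hypothesis for perceptual and transient episodic memory).
    perceptual_memory[stimulus] = f"familiar {stimulus}"
    episodic_memory.append(workspace.conscious_content)

    # Action selection would close the cycle (omitted in this sketch).
    return workspace.conscious_content

if __name__ == "__main__":
    pm, em, ws = {}, [], Workspace()
    for s in ["bell", "bell", "light"]:   # ~5-10 such cycles per second in the theory
        print(cognitive_cycle(s, pm, em, ws))
```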

    A Quantitative Neural Coding Model of Sensory Memory

    The coding mechanism of sensory memory at the neuron scale is one of the most important questions in neuroscience. We put forward a quantitative neural network model that is self-organized, self-similar, and self-adaptive, like an ecosystem governed by Darwinian theory. According to this model, neural coding is a many-to-one mapping from objects to neurons, and the whole cerebrum is a real-time statistical Turing machine with powerful representing and learning ability. The model can reconcile several important disputes, such as temporal coding versus rate-based coding, grandmother cells versus population coding, and decay theory versus interference theory. It also provides explanations for key questions such as memory consolidation, episodic memory, consciousness, and sentiment. The philosophical significance of the model is discussed at the end.
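    A rough sketch of the many-to-one, self-adaptive coding idea, rendered as plain winner-take-all competitive assignment; the network size, learning rate, and update rule are illustrative assumptions, not the authors' model.

```python
# Toy many-to-one mapping from object feature vectors to neurons, in the
# spirit of the abstract's coding claim. This is ordinary winner-take-all
# competitive assignment with a small adaptive update, not the paper's model.

import numpy as np

rng = np.random.default_rng(0)

n_neurons, n_features = 4, 8
weights = rng.normal(size=(n_neurons, n_features))   # each row: one neuron's preferred pattern

def encode(obj):
    """Map an object (feature vector) to the single best-matching neuron."""
    similarities = weights @ obj
    winner = int(np.argmax(similarities))
    # Self-adaptive step: nudge the winning neuron toward the object it now codes for.
    weights[winner] += 0.1 * (obj - weights[winner])
    return winner

objects = rng.normal(size=(10, n_features))
codes = [encode(o) for o in objects]
print(codes)   # several distinct objects may share one neuron: many-to-one coding
```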

    A Theory of the Acquisition of Episodic Memory

    Case-based reasoning (CBR) has been viewed by many as just a methodology for building systems, but the foundations of CBR are psychological theories. Dynamic Memory (Schank, 1982) was the first attempt to describe a theory of learning in computers and people, based on particular forms of data structures and processes that are nowadays widely used, in a variety of forms, in CBR. In addition to being useful for system building, CBR provides a way of discussing a range of issues concerned with cognition. This focus on the practical uses of CBR has deflected attention from the need to develop the underlying theory further. In particular, the issue of knowledge acquisition is not adequately handled by the existing theory. This paper discusses this theoretical weakness and then proposes an enhanced model of learning which is compatible with the CBR paradigm.
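    For readers unfamiliar with the paradigm, a minimal retrieve/reuse/retain loop, the generic CBR cycle that the abstract builds on, might look as follows; the case format and similarity measure are illustrative assumptions, not the paper's enhanced learning model.

```python
# Minimal case-based reasoning loop (retrieve / reuse / retain). The case
# format and similarity function are illustrative choices only.

def similarity(a, b):
    """Fraction of shared attribute values between two problem descriptions."""
    keys = set(a) | set(b)
    return sum(a.get(k) == b.get(k) for k in keys) / len(keys)

def retrieve(case_base, problem):
    """Find the stored case whose problem is most similar to the new problem."""
    return max(case_base, key=lambda c: similarity(c["problem"], problem))

def solve(case_base, problem):
    best = retrieve(case_base, problem)
    solution = best["solution"]                                       # reuse (no adaptation in this toy)
    case_base.append({"problem": problem, "solution": solution})      # retain: the episodic store grows
    return solution

case_base = [
    {"problem": {"engine": "won't start", "battery": "dead"}, "solution": "charge battery"},
    {"problem": {"engine": "overheats", "coolant": "low"}, "solution": "refill coolant"},
]
print(solve(case_base, {"engine": "won't start", "lights": "dim"}))   # -> charge battery
```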

    The GIST of Concepts

    A unified general theory of human concept learning, based on the idea that humans detect invariance patterns in categorical stimuli as a necessary precursor to concept formation, is proposed and tested. In GIST (generalized invariance structure theory), invariants are detected via a perturbation mechanism of dimension suppression referred to as dimensional binding. Structural information acquired by this process is stored as a compound memory trace termed an ideotype. Ideotypes inform the subsystems that are responsible for learnability judgments, rule formation, and other types of concept representations. We show that GIST is more general (e.g., it works on continuous, semi-continuous, and binary stimuli) and makes much more accurate predictions than the leading models of concept learning difficulty, such as those based on a complexity reduction principle (e.g., number of mental models, structural invariance, algebraic complexity, and minimal description length) and those based on selective attention and similarity (GCM, ALCOVE, and SUSTAIN). GIST unifies these two key aspects of concept learning and categorization. Empirical evidence from three experiments corroborates the predictions made by the theory and its core model, which we propose as a candidate law of human conceptual behavior.
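    One simplified reading of invariance detection by dimension perturbation, for binary stimuli only, is sketched below; this is not the GIST model itself, and the example category is an assumption chosen for illustration.

```python
# Rough sketch of invariance detection by single-dimension perturbation for
# binary stimuli: flip one dimension of each category member and check whether
# it stays in the category. A simplified reading of "dimension suppression",
# not the authors' GIST model or its parameters.

from itertools import product

def invariance_profile(category, n_dims):
    """For each dimension, the fraction of category members that remain in the
    category when that dimension's value is flipped."""
    profile = []
    for d in range(n_dims):
        stays = sum(
            tuple(v if i != d else 1 - v for i, v in enumerate(item)) in category
            for item in category
        )
        profile.append(stays / len(category))
    return profile

# Example: three binary dimensions; category = "first dimension equals 1".
category = {s for s in product((0, 1), repeat=3) if s[0] == 1}
print(invariance_profile(category, 3))   # [0.0, 1.0, 1.0]: dimensions 2 and 3 are invariant (irrelevant)
```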

    Cognitive load theory, spacing effect, and working memory resources depletion: implications for instructional design

    In the classroom, student learning is affected by multiple factors that influence information processing. Working memory, with its limited capacity and duration, plays a key role in a learner's ability to process information and is therefore critical for student performance. Cognitive load theory, based on human cognitive architecture, focuses on the instructional implications of the relations between working memory and the learner's knowledge base in long-term memory. The ultimate goal of this theory is to generate effective instructional methods that allow managing students' working memory load to optimize their learning, indicating the relations between the form of instructional design and the function of instructional design. This chapter considers recent additions to the theory based on working memory resource depletion, which occurs after exerting significant cognitive effort and reverses after a rest period. The discussed implications for instructional design include optimal sequencing of learning and assessment tasks using spaced and massed practice tasks, and immediate and delayed tests.
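    A toy rendering of the depletion-and-recovery idea, assuming a single resource drained by effortful tasks and restored during rest; all parameter values are illustrative, not estimates from the chapter.

```python
# Toy illustration of working memory resource depletion: effortful tasks drain
# a resource, rest restores it, so spaced practice leaves more resource
# available per task than massed practice. All numbers are made up.

def run_schedule(schedule, drain=0.25, recovery=0.15):
    resource, available_per_task = 1.0, []
    for slot in schedule:
        if slot == "task":
            available_per_task.append(resource)          # resource available when the task starts
            resource = max(0.0, resource - drain)        # effort depletes the resource
        else:                                            # rest slot
            resource = min(1.0, resource + recovery)     # rest partially restores it
    return sum(available_per_task) / len(available_per_task)

massed = ["task"] * 4 + ["rest"] * 4
spaced = ["task", "rest"] * 4
print(f"mean resource during massed practice: {run_schedule(massed):.2f}")   # ~0.62
print(f"mean resource during spaced practice: {run_schedule(spaced):.2f}")   # ~0.85
```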

    Slides with English text that are explained in Persian

    The common pattern of presentation in the Iranian medical community is lengthy English text on slides that are presented orally in Farsi, both in conferences and in classrooms. In this paper, we aim to further explore this phenomenon based on a theory in the domain of cognitive science called cognitive load theory (CLT). According to Atkinson and Shiffrin's model, introduced in 1968, human memory consists of three parts: sensory memory, working memory, and long-term memory. Information first enters the sensory memory, and if it receives adequate attention and reaches the level of consciousness, it enters the working memory, which, unlike the other two memories, i.e., sensory and long-term memory, has a limited capacity (1). Interestingly, working memory has two separate and independent channels for processing visual and auditory information, each with a limited and predetermined capacity (dual-channel theory). As a result, the speed of learning in humans is restricted (2). In 1988, Sweller proposed a theory of learning called CLT, which merged three key components of the cognitive structure, i.e., memory systems, learning processes, and the types of cognitive load imposed on the working memory. According to this theory, because of the limited capacity of the working memory, any factor that imposes an excessive load on this memory will disrupt the learning process (2). Three types of load are introduced: 1. Intrinsic load is related to the task. The more complex the information that must be processed by the working memory, the greater the load imposed. – Cont
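    A toy sketch of the dual-channel, limited-capacity assumption, with made-up capacities and item labels, might look like this; it is not drawn from the paper.

```python
# Toy sketch of dual-channel, limited-capacity working memory: visual and
# auditory items are handled by separate channels, each with a small fixed
# capacity, so piling material into one channel loses items. Capacities and
# item labels are illustrative assumptions only.

CAPACITY = 4  # small fixed per-channel capacity

def process(items):
    channels = {"visual": [], "auditory": []}
    dropped = []
    for modality, content in items:
        if len(channels[modality]) < CAPACITY:
            channels[modality].append(content)
        else:
            dropped.append(content)      # channel overloaded: item is lost
    return channels, dropped

# Lengthy on-slide text loads only the visual channel...
text_heavy = [("visual", f"sentence {i}") for i in range(6)] + [("auditory", "spoken summary")]
# ...whereas splitting material across channels uses both capacities.
balanced = [("visual", f"keyword {i}") for i in range(3)] + [("auditory", f"narration {i}") for i in range(3)]

for name, items in [("text-heavy slide", text_heavy), ("balanced slide", balanced)]:
    kept, dropped = process(items)
    print(name, "- dropped:", dropped)
```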

    Computing fuzzy rough approximations in large scale information systems

    Rough set theory is a popular and powerful machine learning tool. It is especially suitable for dealing with information systems that exhibit inconsistencies, i.e. objects that have the same values for the conditional attributes but a different value for the decision attribute. In line with the emerging granular computing paradigm, rough set theory groups objects together based on the indiscernibility of their attribute values. Fuzzy rough set theory extends rough set theory to data with continuous attributes, and detects degrees of inconsistency in the data. Key to this is turning the indiscernibility relation into a gradual relation, acknowledging that objects can be similar to a certain extent. In very large datasets with millions of objects, computing the gradual indiscernibility relation (or, in other words, the soft granules) is very demanding, both in terms of runtime and in terms of memory. It is, however, required for the computation of the lower and upper approximations of concepts in the fuzzy rough set analysis pipeline. Current non-distributed implementations in R are limited by memory capacity. For example, we found that a state-of-the-art non-distributed implementation in R could not handle 30,000 rows and 10 attributes on a node with 62GB of memory. This is clearly insufficient to scale fuzzy rough set analysis to massive datasets. In this paper we present a parallel and distributed solution based on the Message Passing Interface (MPI) to compute fuzzy rough approximations in very large information systems. Our results show that our parallel approach scales with problem size to information systems with millions of objects. To the best of our knowledge, no other parallel and distributed solutions have been proposed so far in the literature for this problem.
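    As a point of reference, a small, non-distributed sketch of the fuzzy rough lower and upper approximations, using a linear per-attribute similarity, the Lukasiewicz implicator, and its t-norm; this follows the standard textbook formulation, not the authors' MPI implementation, and the tiny dataset is an illustrative assumption.

```python
# Small, non-distributed sketch of fuzzy rough lower/upper approximations with
# a per-attribute linear similarity, the Lukasiewicz implicator
# I(a, b) = min(1, 1 - a + b), and the Lukasiewicz t-norm T(a, b) = max(0, a + b - 1).
# Textbook formulation for illustration only; the dataset below is made up.

import numpy as np

X = np.array([[0.1, 0.9],
              [0.2, 0.8],
              [0.9, 0.1],
              [0.8, 0.2]])          # conditional attributes, already scaled to [0, 1]
y = np.array([0, 0, 1, 1])          # decision attribute

def similarity(X):
    """Gradual indiscernibility: mean per-attribute similarity 1 - |a_i - b_i|."""
    diffs = np.abs(X[:, None, :] - X[None, :, :])
    return (1.0 - diffs).mean(axis=2)

def approximations(X, y, target_class):
    R = similarity(X)
    A = (y == target_class).astype(float)                            # crisp decision class as a fuzzy set
    lower = np.min(np.minimum(1.0, 1.0 - R + A[None, :]), axis=1)    # inf over y of I(R(x, y), A(y))
    upper = np.max(np.maximum(0.0, R + A[None, :] - 1.0), axis=1)    # sup over y of T(R(x, y), A(y))
    return lower, upper

lower, upper = approximations(X, y, target_class=0)
print("lower:", lower.round(2))     # membership of each object in the lower approximation of class 0
print("upper:", upper.round(2))
```

    The quadratic similarity matrix in this sketch is exactly the object that becomes prohibitive at millions of rows, which is the memory and runtime bottleneck the paper's distributed MPI approach is designed to remove.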