21 research outputs found

    Comparison and Mapping Facilitate Relation Discovery and Predication

    Relational concepts play a central role in human perception and cognition, but little is known about how they are acquired. For example, how do we come to understand that physical force is a higher-order multiplicative relation between mass and acceleration, or that two circles are the same-shape in the same way that two squares are? A recent model of relational learning, DORA (Discovery of Relations by Analogy; Doumas, Hummel & Sandhofer, 2008), predicts that comparison and analogical mapping play a central role in the discovery and predication of novel higher-order relations. We report two experiments testing and confirming this prediction.
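
    As a purely illustrative rendering, not drawn from the paper itself, the force example can be written as an ordinary equation alongside its relational reading; "product" is simply a label for the multiplicative relation, not notation from DORA:

    \[
      F = m \cdot a
      \qquad\Longleftrightarrow\qquad
      \mathrm{force} = \mathrm{product}(\mathrm{mass},\ \mathrm{acceleration})
    \]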

    A mechanism for the cortical computation of hierarchical linguistic structure

    Biological systems often detect species-specific signals in the environment. In humans, speech and language are species-specific signals of fundamental biological importance. To detect the linguistic signal, human brains must form hierarchical representations from a sequence of perceptual inputs distributed in time. What mechanism underlies this ability? One hypothesis is that the brain repurposed an available neurobiological mechanism when hierarchical linguistic representation became an efficient solution to a computational problem posed to the organism. Under such an account, a single mechanism must have the capacity to perform multiple, functionally related computations, e.g., detect the linguistic signal and perform other cognitive functions, while, ideally, oscillating like the human brain. We show that a computational model of analogy, built for an entirely different purpose—learning relational reasoning—processes sentences, represents their meaning, and, crucially, exhibits oscillatory activation patterns resembling cortical signals elicited by the same stimuli. Such redundancy in the cortical and machine signals is indicative of formal and mechanistic alignment between representational structure building and “cortical” oscillations. By inductive inference, this synergy suggests that the cortical signal reflects structure generation, just as the machine signal does. A single mechanism—using time to encode information across a layered network—generates the kind of (de)compositional representational hierarchy that is crucial for human language and offers a mechanistic linking hypothesis between linguistic representation and cortical computation.
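
    What follows is a minimal, hypothetical sketch in Python, not the published model: the toy sentence, the unit names, and every numeric constant are invented for illustration. It captures only the core idea stated above, that a layered network can use time to carry structure: each role-filler binding gets its own firing window, and the summed unit activity rises and falls once per binding, giving an oscillation-like trace.

    import math

    # Hypothetical role-filler bindings for a toy sentence ("dog chases cat").
    bindings = [
        ("agent",   "dog"),
        ("action",  "chases"),
        ("patient", "cat"),
    ]

    STEPS  = 25          # time steps simulated per binding
    DT     = 0.01        # length of one time step, in seconds
    WINDOW = STEPS * DT  # time devoted to each binding

    def unit_activation(t_in_window):
        """Smooth on/off envelope for a unit that fires inside its window."""
        return math.sin(math.pi * t_in_window / WINDOW) ** 2

    trace = []
    t = 0.0
    for role, filler in bindings:
        # The role unit and the filler unit fire in slightly offset phases of
        # the same window, so the binding is carried by *when* they fire, not
        # by any dedicated conjunction unit.
        for i in range(STEPS):
            tw = i * DT
            role_act   = unit_activation(tw)
            filler_act = unit_activation(max(0.0, tw - 0.05))  # filler lags the role
            trace.append((t + tw, role, filler, role_act + filler_act))
        t += WINDOW

    # Summed activity rises and falls once per binding: a crude rhythm whose
    # period tracks the structure being built, which is the sense in which an
    # activation pattern like this can "oscillate".
    for time_pt, role, filler, act in trace[::5]:
        print(f"t={time_pt:5.2f}s  {role:>7}-{filler:<7}  activity={act:.2f}")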

    Order of presentation effects in learning color categories

    Learning color words involves learning a system of mappings
