
    Are Numerical Symbols Fundamental to Neural Computation?

    Abstract: Neuroclassicism is the view that cognition is computation and that core mental processes, such as perception, memory, and reasoning, are products of digital computations realized in neural tissue. Cognitive psychologist C. R. Gallistel uses this classical framework to argue that all cognitive information processing is based on symbolic operations performed over quantitative values (i.e., numbers) stored in the brain, much like a digital computer. Assuming this hypothesis, he investigates how the brain stores quantitative information (i.e., the numerical symbols involved in neural computation). He claims that it is more plausible that memories for numbers are stored within molecular mechanisms inside the neuron than within specific patterns of cell connectivity (the substrate for memory storage assumed by the traditional Hebbian plastic-synapse model). In this paper, I dissect and critique Gallistel's argument, which I find to be undermined by the findings of contemporary neuroscience.

    A primacy code for odor identity

    Humans can identify visual objects independently of view angle and lighting, words independently of volume and pitch, and smells independently of concentration. The computational principles underlying invariant object recognition remain mostly unknown. Here we propose that, in olfaction, a small and relatively stable set composed of the earliest activated receptors forms a code for concentration-invariant odor identity. One prediction of this "primacy coding" scheme is that decisions based on odor identity can be made solely using early odor-evoked neural activity. Using an optogenetic masking paradigm, we define the sensory integration time necessary for odor identification and demonstrate that animals can use information occurring <100 ms after inhalation onset to identify odors. Using multi-electrode array recordings of odor responses in the olfactory bulb, we find that concentration-invariant units respond earliest and at latencies that are within this behaviorally defined time window. We propose a computational model demonstrating how such a code can be read by neural circuits of the olfactory system.
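
    The core idea can be illustrated with a minimal sketch: if activation latency decreases monotonically with receptor drive (affinity times concentration), then scaling concentration shifts all latencies without reordering them, so the set of the k earliest-activated receptors stays the same. The latency model, affinities, and choice of k below are illustrative assumptions, not values from the paper.

```python
# Toy illustration of primacy coding: odor identity read out as the
# set of the first k receptors to respond.
import numpy as np

rng = np.random.default_rng(0)
n_receptors = 50
affinity = rng.lognormal(mean=0.0, sigma=1.0, size=n_receptors)  # odor-specific (assumed)

def primacy_set(concentration, k=10):
    """Indices of the k earliest-activated receptors.

    Assumed latency model: latency falls with receptor drive
    (affinity * concentration); stronger drive -> earlier spike.
    """
    drive = affinity * concentration
    latency = 1.0 / drive                  # toy monotone latency model
    return frozenset(np.argsort(latency)[:k])

low = primacy_set(concentration=0.1)
high = primacy_set(concentration=10.0)
print("primacy set unchanged over a 100-fold concentration change:", low == high)
```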

    Computing with Action Potentials

    Most computational engineering based loosely on biology uses continuous variables to represent neural activity. Yet most neurons communicate with action potentials. The engineering view is equivalent to using a rate code for representing information and for computing. An increasing number of examples are being discovered in which biology may not be using rate codes. Information can be represented using the timing of action potentials, and efficiently computed within this representation. The "analog match" problem of odour identification is a simple problem which can be efficiently solved using action potential timing and an underlying rhythm. By using adapting units to effect a fundamental change of representation of a problem, we map the recognition of words (having uniform time-warp) in connected speech onto the same analog match problem. We describe the architecture and preliminary results of such a recognition system. Using the fast events of biology in conjunction with an underlying rhythm is one way to overcome the limits of an event-driven view of computation. When the intrinsic hardware is much faster than the time scale of change of inputs, this approach can greatly increase the effective computation per unit time on a given quantity of hardware.
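
    One reading of the timing-based "analog match" idea is sketched below: each analog input is converted to a spike time measured against a shared rhythm (here, the log of its value), and a stored template is a set of matching delays. If the input equals the template up to a common scale factor, all delayed spikes arrive in coincidence. The log/phase encoding and the variance-based coincidence measure are my own simplifications, not code from the paper.

```python
# Hedged sketch of analog match via spike timing and a common rhythm.
import numpy as np

def spike_times(x):
    """Earlier spikes for stronger inputs, on a logarithmic scale."""
    return -np.log(x)

def match_score(x, template):
    """Coincidence of delayed spikes; ~0 variance means a perfect analog match."""
    delays = np.log(template)              # stored delays tuned to the template
    arrivals = spike_times(x) + delays     # = -log(x / template)
    return np.var(arrivals)

template = np.array([1.0, 2.0, 5.0, 0.5])
print(match_score(3.0 * template, template))                   # ~0: same pattern, scaled intensity
print(match_score(np.array([1.0, 1.0, 1.0, 1.0]), template))   # > 0: different pattern
```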

    Computing with Time: From Neural Networks to Wireless Networks

    This article advocates a new computing paradigm, called computing with time, that is capable of efficiently performing a certain class of computation, namely, searching in parallel for the value closest to a given parameter. It shares some features with the idea of computing with action potentials proposed by Hopfield, which originated in the field of artificial neural networks. The basic idea of computing with time is captured in a novel distributed algorithm based on broadcast communication, called the Lecture Hall Algorithm, which can compute the minimum among n positive numbers using only O(1) messages. When applied to a wireless communication network, the Lecture Hall Algorithm leads to an interesting routing protocol having several desirable features that are not achieved by intentional design.
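
    A plausible reading of the delay-and-broadcast idea behind an O(1)-message minimum is sketched below: every node schedules a broadcast after a delay proportional to its own value, the first broadcast announces the minimum, and all other nodes hear it before their own timers fire and stay silent. The event-queue simulation is my own toy model under these assumptions, not the paper's protocol.

```python
# Toy simulation of a delay-and-broadcast minimum: one message regardless of n.
import heapq

def lecture_hall_min(values, delay_per_unit=1.0):
    # Each node i schedules a broadcast at a time proportional to values[i].
    events = [(v * delay_per_unit, i, v) for i, v in enumerate(values)]
    heapq.heapify(events)

    _, _, minimum = heapq.heappop(events)   # earliest timer fires first
    messages_sent = 1                        # the single broadcast all nodes hear
    # Every other node hears this broadcast before its own (later) timer fires,
    # so it cancels its scheduled message and no further traffic is generated.
    return minimum, messages_sent

values = [7.2, 3.5, 9.1, 4.8, 3.6]
print(lecture_hall_min(values))              # (3.5, 1)
```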