
    The cognitive neuroscience of visual working memory

    Visual working memory allows us to temporarily maintain and manipulate visual information in order to solve a task. The study of the brain mechanisms underlying this function began more than half a century ago, with Scoville and Milner’s (1957) seminal discoveries with amnesic patients. This timely collection of papers brings together diverse perspectives on the cognitive neuroscience of visual working memory from multiple fields that have traditionally been fairly disjointed: human neuroimaging, electrophysiological, behavioural and animal lesion studies, investigating both the developing and the adult brain.

    Walk of life: How brain state, spatial, and social context affect neural processing in rat perirhinal cortex, hippocampus, and sensory cortices

    In this thesis I report the results of two in vivo electrophysiology experiments I conducted during my PhD. In the first experiment, called Touch and See, we developed a new, expanded micro-recording device which allowed us to record from four areas of the rat brain simultaneously: primary visual cortex, barrel cortex, perirhinal cortex, and the CA1 subfield of the hippocampus. In this experiment we investigated how neurons communicate with each other, both within and between areas, and during different brain states: active, quiet, and sleeping. We found that: 1) Gamma oscillations are important for inter-areal communication, but intra-areal communication is mediated by slower brain rhythms. 2) Different types of excitatory and inhibitory neurons are activated during different parts of the gamma cycle. 3) Excitatory and inhibitory neurons are differentially involved in short- and long-range communication. 4) Neurons that were functionally coupled during execution of the task remain coupled even during post-task rest. 5) Finally, we report a new function of the perirhinal cortex in coding spatial segments of the environment. The second experiment was named the Rat Robot project. In this project we investigated whether place fields of rat hippocampal neurons can be used not only to track the position of the animal itself, but also to keep track of the position of other moving agents. We found that even though there was no overlap between the firing fields of place cells for the rat and the robot, movements of the robot did modulate the activity of rat place cells.

    Utilizing microstimulation and local field potentials in the primary somatosensory and motor cortex

    Brain-computer interfaces (BCIs) have advanced considerably, from simple target detection based on recordings from a single neuron to accomplishments like accurately controlling a computer cursor with neural activity from hundreds of neurons, or providing instruction directly to the brain via microstimulation. However, as BCIs continue to evolve, so do the challenges they face. Most BCIs rely on visual feedback, requiring sustained visual attention to use the device. As the role of BCIs expands beyond cursors moving on a computer screen to robotic hands picking up objects, there is an increased need for an effective way to provide quick feedback independent of vision. Another challenge is utilizing all the signals available to produce the best possible decoding of movement. Local field potentials (LFPs) can be recorded at the same time as multi-unit activity (MUA) from multielectrode arrays, but little is known about what kind of information they possess, especially in relation to MUA. To tackle these issues, we performed the following experiments. First, we examined the effectiveness of alternative forms of feedback applicable to BCIs: tactile stimuli delivered on the skin surface and microstimulation applied directly to the brain via the somatosensory cortex. To gauge effectiveness, we used a paradigm that captured a fundamental element of feedback: the ability to react to a stimulus while already in action. By measuring the response time to that stimulus, we were able to compare how well each modality could perform as a feedback stimulus. Second, we used regression and mutual information analyses to study how MUA, low-frequency LFP (15-40 Hz, LFPL), and high-frequency LFP (100-300 Hz, LFPH) encoded reaching movements. The representation of the kinematic parameters direction, speed, velocity, and position was quantified and compared across these signals so that they can be better applied in decoding models.
Lastly, the results from these experiments could not have been accurately obtained without keeping careful account of the mechanical lags involved. Each of the stimuli affecting behavior had onset lags which, in some cases, varied greatly from trial to trial and could easily have distorted timing effects if not accounted for. Special adaptations were constructed to precisely pinpoint display, system, and device onset lags.
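
A rough sketch of how the two LFP bands described above might be separated from a raw trace. The band edges (15-40 Hz, 100-300 Hz) come from the abstract; the sampling rate, filter order, and zero-phase Butterworth design are illustrative choices, not the thesis's actual pipeline:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def split_lfp_bands(lfp, fs):
    """Band-pass a raw LFP trace into low-frequency (15-40 Hz)
    and high-frequency (100-300 Hz) components."""
    def bandpass(x, lo, hi):
        # 4th-order Butterworth, applied forward and backward (zero phase)
        b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
        return filtfilt(b, a, x)
    return bandpass(lfp, 15, 40), bandpass(lfp, 100, 300)

fs = 1000.0                           # assumed sampling rate in Hz
t = np.arange(0, 2.0, 1 / fs)
# synthetic LFP: a 25 Hz component, a weaker 150 Hz component, plus noise
lfp = np.sin(2 * np.pi * 25 * t) + 0.5 * np.sin(2 * np.pi * 150 * t)
lfp += 0.1 * np.random.default_rng(0).standard_normal(t.size)

lfp_low, lfp_high = split_lfp_bands(lfp, fs)
```

Each band then preserves only the rhythm that falls inside its passband, so the 25 Hz component survives in `lfp_low` and the 150 Hz component in `lfp_high`.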

    Neuronal correlates of tactile working memory in rat barrel cortex and prefrontal cortex

    The neuronal mechanisms of parametric working memory – the short-term storage of graded stimuli to guide behavior – are not fully elucidated. We have designed a working memory task in which rats compare two sequential vibrations, S1 and S2, delivered to their whiskers (Fassihi et al., 2014). Vibrations are a series of velocities sampled from a zero-mean normal distribution. Rats must judge which stimulus had the greater velocity standard deviation, σ (e.g. σ1 > σ2 turn left, σ1 < σ2 turn right). A critical operation in this task is to hold S1 information in working memory for subsequent comparison. In earlier work we uncovered this cognitive capacity in rats (Fassihi et al., 2014), an ability previously ascribed only to primates. Where in the brain is such a memory kept, and what is the nature of its representation? To address these questions, we performed simultaneous multi-electrode recordings from barrel cortex – the entryway of whisker sensory information into neocortex – and the prelimbic area of medial prefrontal cortex (mPFC), which is involved in higher-order cognitive functioning in rodents. During the presentation of S1 and S2, a majority of neurons in barrel cortex encoded the ongoing stimulus by monotonically modulating their firing rate as a function of σ; i.e. 42% increased and 11% decreased their firing rate for progressively larger σ values. During the 2-second delay interval between the two stimuli, neuronal populations in barrel cortex kept a graded representation of S1 in their firing rate: 30% at early delay and 15% at the end. In mPFC, neurons expressed diverse coding characteristics, yet more than one-fourth of them varied their discharge rate according to the ongoing stimulus. Interestingly, a similar proportion carried the stimulus signal up to the early parts of the delay period. A smaller but considerable proportion (10%) kept the memory until the end of the delay interval.
We implemented novel information theoretic measures to quantify the stimulus and decision signals in neuronal responses at different stages of the task. By these measures, a decision signal was present in barrel cortex neurons during the S2 period and during the post-stimulus delay, when the animal needed to postpone its action. Medial PFC units also represented the animal's choice, but later in the trial compared to barrel cortex; decision signals started to build up in this area after the termination of S2. We implemented a regularized linear discriminant algorithm (RDA) to decode stimulus and decision signals from the population activity of barrel cortex and mPFC neurons. The RDA outperformed individual clusters and standard linear discriminant analysis (LDA). The stimulus and the animal's decision could be extracted from population activity simply by linearly weighting the responses of neuronal clusters. The population signal was present even in epochs of the trial where no single cluster was informative. We predicted that coherent oscillations between brain areas might optimize the flow of information within the networks engaged by this task. Therefore, we quantified the phase synchronization of local field potentials in barrel cortex and mPFC. The two signals were coherent in the theta range during S1 and S2 and, interestingly, prior to S1. We interpret the pre-stimulus coherence as reflecting top-down preparatory and expectation mechanisms. We showed, for the first time to our knowledge, the neuronal correlates of parametric working memory in rodents. The existence of both positive and negative codes in barrel cortex, besides the representation of stimulus memory and decision signals, suggests that multiple functions might be folded into single modules. The mPFC also appears to be part of the parametric working memory and decision-making network in rats.
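
The population-decoding comparison above can be sketched in a few lines. Everything here is simulated (trial counts, cluster counts, and effect size are invented), and scikit-learn's shrinkage-regularized LDA stands in for the RDA used in the thesis, whose exact regularization may differ:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_trials, n_clusters = 200, 60           # hypothetical trial/cluster counts
labels = rng.integers(0, 2, n_trials)    # 0: sigma1 > sigma2, 1: sigma1 < sigma2
# simulated firing-rate population vectors with a weak class-dependent shift
rates = rng.standard_normal((n_trials, n_clusters)) + 0.4 * labels[:, None]

# plain LDA vs. a shrinkage-regularized variant (a stand-in for the RDA)
plain = LinearDiscriminantAnalysis(solver="lsqr")
regularized = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")

acc_plain = cross_val_score(plain, rates, labels, cv=5).mean()
acc_reg = cross_val_score(regularized, rates, labels, cv=5).mean()
```

Shrinkage pulls the estimated covariance toward a diagonal target, which is what makes regularized discriminant decoders more stable than plain LDA when the number of neuronal clusters approaches the number of trials.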

    Contextual signals in visual cortex: How sounds, state, and task setting shape how we see

    What we see is not always what we get. Even though the light that hits the retina might convey the same images, how visual information is processed and what we eventually do with it depend on many contextual factors. In this thesis, we show in a series of experiments how the sensory processing of the same visual input in the visual cortex of mice is affected by internal state, movements, other senses, and the task being performed. First, we found that recurrent activity originating within higher visual areas modulates activity in the primary visual cortex (V1) and selectively amplifies weak compared to strong sensory-evoked responses. Second, visual stimuli evoked similar early activity in V1, but later activity strongly depended on whether mice were trained to report the visual stimuli, and on the specific task. Specifically, adding a second modality to the task demands extended the temporal window during which V1 was causally involved in visual perception. Third, we report that not only visual stimuli but also sounds led to strong responses in V1, composed of distinct auditory-related and motor-related activity. Finally, we studied the role of the posterior parietal cortex in an audiovisual change detection task. Despite extensive single-neuron and population-level encoding of task-relevant visual and auditory stimuli, as well as of upcoming behavioral responses, optogenetic inactivation did not affect task performance. Whereas these contextual factors have previously been studied in isolation, we obtain a more integrated understanding of how factors beyond visual information determine what we actually see.

    Neural Network Dynamics of Visual Processing in the Higher-Order Visual System

    Vision is one of the most important human senses, facilitating rich interaction with the external environment. For example, optimal spatial localization and subsequent motor contact with a specific physical object amongst others requires a combination of visual attention, discrimination, and sensory-motor coordination. The mammalian brain has evolved to elegantly solve this problem of transforming visual input into an efficient motor output to interact with an object of interest. The frontal and parietal cortices are two higher-order brain areas (i.e. they process information beyond simple sensory transformations) that are intimately involved in assessing how an animal's internal state or prior experiences should influence cognitive-behavioral output. It is well known that activity within each region, and functional interactions between both regions, are correlated with visual attention, decision-making, and memory performance. It is therefore not surprising that impairment in the fronto-parietal circuit is often observed in many psychiatric disorders. Network- and circuit-level fronto-parietal involvement in sensory-based behavior is well studied; however, comparatively less is known about how single-neuron activity in each of these areas can give rise to such macroscopic activity. The goal of the studies in this dissertation is to address this gap in knowledge through simultaneous recordings of cellular and population activity during sensory processing and behavioral paradigms. Together, the combined narrative builds on several themes in neuroscience: variability of single-cell function, population-level encoding of stimulus properties, and state- and context-dependent neural dynamics.

    Examining high level neural representations of cluttered scenes

    Humans and other primates can rapidly categorize objects even when they are embedded in complex visual scenes (Thorpe et al., 1996; Fabre-Thorpe et al., 1998). Studies by Serre et al. (2007) have shown that the ability of humans to detect animals in brief presentations of natural images decreases as the size of the target animal decreases and the amount of clutter increases, and additionally, that a feedforward computational model of the ventral visual system, originally developed to account for physiological properties of neurons, shows a similar pattern of performance. Motivated by these studies, we recorded single- and multi-unit neural spiking activity from macaque superior temporal sulcus (STS) and anterior inferior temporal cortex (AIT) as a monkey passively viewed images of natural scenes. The stimuli consisted of 600 images of animals in natural scenes and 600 images of natural scenes without animals in them, captured at four different viewing distances; they were the same images used by Serre et al. to allow for a direct comparison between human psychophysics, computational models, and neural data. To analyze the data, we applied population "readout" techniques (Hung et al., 2005; Meyers et al., 2008) to decode from the neural activity whether an image contained an animal or not. The decoding results showed a similar pattern of degraded decoding performance with increasing clutter as was seen in the human psychophysics and computational model results. However, overall the decoding accuracies from the neural data were lower than those seen in the computational model, and the latencies of information in IT were long (~125 ms) relative to behavioral measures obtained from primates in other studies.
Additional tests also showed that the responses of the model units were not capturing several properties of the neural responses, and that detecting animals in cluttered scenes using simple model units based on V1 cells worked almost as well as using more complex model units that were designed to model the responses of IT neurons. While these results suggest that AIT might not be the primary brain region involved in this form of rapid categorization, additional studies are needed before drawing strong conclusions.
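
As a toy illustration of such a population "readout" (not the authors' actual pipeline): simulate a pseudo-population whose animal-related signal shrinks as clutter grows, then train a cross-validated linear classifier to report animal presence. All counts and effect sizes are invented, and logistic regression stands in for whichever classifier the cited readout papers used:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n_images, n_units = 240, 80               # hypothetical image and unit counts
has_animal = rng.integers(0, 2, n_images)
clutter = rng.integers(0, 4, n_images)    # stand-in for the four viewing distances
signal = 0.8 / (1 + clutter)              # more clutter -> weaker class signal
responses = rng.standard_normal((n_images, n_units)) + (signal * has_animal)[:, None]

# cross-validated linear readout of animal presence from the pseudo-population
readout = LogisticRegression(max_iter=1000)
acc = cross_val_score(readout, responses, has_animal, cv=5).mean()
```

Repeating the fit separately at each clutter level would reproduce the qualitative result in the abstract: decoding accuracy degrades as the signal available per unit shrinks.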