20 research outputs found

    Neural network mechanisms of working memory interference

    Our ability to memorize is at the core of our cognitive abilities. How could we effectively make decisions without considering memories of previous experiences? Broadly, our memories can be divided into two categories: long-term and short-term memories. Short-term memory is also called working memory, and throughout this thesis I use both terms interchangeably. As the names suggest, long-term memory is the memory you use to retain concepts for a long time, such as your name or age, while short-term memory is the system you engage while choosing between different wines at the liquor store. As your attention jumps from one bottle to another, you need to hold the characteristics of previous bottles in memory to pick your favourite. By the time you pick it, you might remember the prices or grape varieties of the other bottles, but you are likely to forget all of those details an hour later at home, when opening the wine in front of your guests. The overall goal of this thesis is to study the neural mechanisms that underlie working memory interference, as reflected in quantitative, systematic behavioral biases. Ultimately, the goal of each chapter, even when focused exclusively on behavioral experiments, is to pin down plausible neural mechanisms that can produce specific behavioral and neurophysiological findings. To this end, we use the bump-attractor model as our working hypothesis, which we often contrast with the synaptic working memory model. The work performed during this thesis is described here in three main chapters, encapsulating five broad goals. In Chapter 4.1, we aim to test behavioral predictions of a bump-attractor network when used to store multiple items (1). Moreover, we connect two such networks, aiming to model feature binding through selectivity synchronization (2). In Chapter 4.2, we aim to clarify the mechanisms of working memory interference from previous memories (3), the so-called serial biases. These biases provide an excellent opportunity to contrast activity-based and activity-silent mechanisms, because both have been proposed as the underlying cause of these biases. In Chapter 4.3, armed with the same techniques used to seek evidence for activity-silent mechanisms, we test a prediction of the bump-attractor model with short-term plasticity (4). Finally, in light of the results from aim 4 and simple computer simulations, we reinterpret previous studies claiming evidence for activity-silent mechanisms (5).
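
    A minimal rate-based sketch of the bump-attractor idea referred to above is given below; the network size, connectivity profile, transfer function and all parameter values are illustrative assumptions, not those of the models used in the thesis. A transient cue creates a localized "bump" of activity whose position persists through the delay and serves as the memory readout; a saturating transfer function is used here simply to keep the toy simulation bounded.

        import numpy as np

        # Minimal rate-based ring ("bump") attractor: N units labelled by preferred
        # angle; local excitation plus uniform inhibition lets a localized activity
        # bump persist after a transient cue is removed. All values are illustrative.
        N = 256
        theta = np.linspace(-np.pi, np.pi, N, endpoint=False)

        # Ring connectivity: Gaussian local excitation minus uniform inhibition.
        A_exc, B_inh, sigma = 20.0, 5.0, 0.5
        d = np.angle(np.exp(1j * (theta[:, None] - theta[None, :])))  # wrapped angular distance
        W = (A_exc * np.exp(-d**2 / (2 * sigma**2)) - B_inh) / N

        tau, dt = 0.02, 0.001            # membrane time constant and integration step (s)
        r = np.zeros(N)                  # firing rates

        def f(x):
            """Saturating (sigmoidal) rate transfer function."""
            return 1.0 / (1.0 + np.exp(-(x - 1.0) / 0.2))

        cue_angle = 0.5                  # remembered feature, in radians
        for step in range(int(3.0 / dt)):
            t = step * dt
            # The external cue is present only during the first 250 ms.
            I_ext = 2.0 * np.exp(-(theta - cue_angle) ** 2 / 0.2) if t < 0.25 else 0.0
            r += dt / tau * (-r + f(W @ r + I_ext))

        # During the delay the bump persists; its centre is the memory readout.
        decoded = np.angle(np.sum(r * np.exp(1j * theta)))
        print(f"cue = {cue_angle:.2f} rad, decoded after delay = {decoded:.2f} rad")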

    Population analysis of neural data -- developments in statistical methods and related computational models

    A key goal of neuroscience is to understand how the remarkable computational abilities of our brain emerge from interconnected neuronal populations. Recently, advances in technologies for recording neural activity have increased the number of simultaneously recorded neurons by orders of magnitude, and these technologies are becoming more widely adopted. At the same time, massive increases in computational power and improved algorithms have enabled advanced statistical analyses of neural population activity and promoted our understanding of population coding. Nevertheless, many open questions remain when it comes to analyzing and interpreting neural recordings. There are two major parts to this study. First, we consider an issue of increasing importance: many in vivo recordings are now made by calcium-dependent fluorescence imaging, which only indirectly reports neural activity. We compare measurements of extracellular single units with fluorescence changes extracted from single neurons (often used as a proxy for spike rates), both recorded from cortical neural populations of behaving mice. We perform identical analyses at the single-cell and population levels and compare the results, uncovering a number of differences, or biases. We propose a phenomenological model that transforms spike trains into synthetic imaging data and test whether this transformation explains the biases found. We discover that the slow temporal dynamics of calcium imaging obscure rapid changes in neuronal selectivity and disperse dynamic features in time. As a result, spike rate modulation that is locked to temporally localized events can appear as a more sequence-like pattern of activity in the imaging data. In addition, calcium imaging is more sensitive to increases than to decreases in spike rate, leading to biased estimates of neural selectivity. These biases need to be considered when interpreting calcium imaging data. The second part of this work embarks on a challenging yet fruitful study of latent variable analysis of simultaneously recorded neural activity in a decision-making task. To connect the neural dynamics in different stages of a decision-making task, we developed a time-varying latent dynamical systems model that uncovers neural dynamics shared by neurons in a local decision-making circuit. The shared neural activity supports the dynamics of choice generation and memory in a fashion akin to drift-diffusion models, and robustly maintains a decision signal in the post-decision period. Importantly, we find that error trials follow dynamics similar to those of correct trials, but are separated from them in shared neural activity space, providing a more accurate early decoded estimate of an animal's success or failure on a given trial. Overall, the shared neural activity dynamics can predict multiple measures of behavioral variability, including performance, reaction time, and trial correctness, and are therefore a useful summary of the neural representation. Such an approach can be readily applied to study complex dynamics in other neural systems. In summary, this dissertation represents an important step towards developing model-based analyses of neuronal dynamics and understanding population codes in large-scale neural data.
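
    The phenomenological spikes-to-imaging transformation described above can be sketched roughly as follows; the exponential calcium kernel, Hill-type saturation and all parameter values are generic assumptions for illustration, not the model fitted in this work.

        import numpy as np

        rng = np.random.default_rng(0)

        def spikes_to_fluorescence(spikes, dt=0.01, tau_decay=0.5, tau_rise=0.05,
                                   kd=2.0, hill=1.5, noise_sd=0.05):
            """Turn a binned spike-count vector into a synthetic dF/F trace.

            A common phenomenological recipe, used here only as an illustration:
            1. convolve the spikes with a rise/decay calcium kernel,
            2. pass the calcium signal through a saturating Hill nonlinearity,
            3. add Gaussian measurement noise.
            """
            t = np.arange(0, 5 * tau_decay, dt)
            kernel = np.exp(-t / tau_decay) - np.exp(-t / tau_rise)
            kernel /= kernel.max()

            calcium = np.convolve(spikes, kernel)[: len(spikes)]
            dff = calcium**hill / (kd**hill + calcium**hill)      # saturation
            return dff + rng.normal(0.0, noise_sd, size=len(spikes))

        # Example: a brief burst of spikes produces a slow, saturating transient.
        dt = 0.01
        spikes = np.zeros(1000)
        spikes[200:205] = 1          # 5 spikes within 50 ms
        trace = spikes_to_fluorescence(spikes, dt=dt)
        print(f"peak dF/F ~ {trace.max():.2f}, at t ~ {trace.argmax() * dt:.2f} s")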

    Cognitive and Neural Map Representations in Schizophrenia

    An ability to build structured cognitive maps of the world may lie at the heart of understanding cognitive features of schizophrenia. In rodents, cognitive map representations are supported by sequential hippocampal place cell reactivations during rest (offline), known as replay. These events occur in the context of local high frequency ripple oscillations, and whole-brain default mode network (DMN) activation. Genetic mouse models of schizophrenia also report replay and ripple abnormalities. Here, I investigate the behavioural and neural signatures of structured internal representations in people with a diagnosis of schizophrenia (PScz, n = 29) and matched control participants (n = 28) using magnetoencephalography (MEG). Participants were asked to infer correct sequential relationships between task pictures by applying a pre-learned task template to visual experiences containing these pictures. In Chapter 3 I show that, during a post-task rest session, controls exhibited fast spontaneous neural reactivation of task state representations that replayed inferred relationships. Replay was coincident with increased ripple power in hippocampus, which may be related to NMDAR availability (Chapter 4). PScz showed both reduced replay and augmented ripple power, convergent with genetic mouse models. These abnormalities were linked to impairments in behavioural acquisition of task structure, and to its subsequent representation in visually evoked neural responses. In Chapter 5 I explore the temporal coupling between replay onsets and DMN activation. I show an impairment in this association in PScz, which related to subsequent mnemonic maintenance of learned task structure, complementing previous reports of DMN abnormalities in the condition. Finally, in Chapter 6, using a separate verbal fluency task, I show that PScz exhibit evidence of reduced use of (semantic) associative information when sampling concepts from memory. Together, my results provide support for a hypothesis that schizophrenia is associated with abnormalities in neural and behavioural correlates of cognitive map representation.
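
    Replay analyses of this kind typically decode state reactivation probabilities from MEG data and then test whether states reactivate in task-consistent order at short lags. The following is a deliberately simplified, synthetic-data sketch of such a forward-minus-backward "sequenceness" measure; the helper function and every parameter are illustrative assumptions rather than the thesis pipeline.

        import numpy as np

        rng = np.random.default_rng(1)

        def sequenceness(state_prob, transitions, max_lag=60):
            """Simplified forward-minus-backward replay measure.

            state_prob : (T, n_states) decoded reactivation probabilities over time.
            transitions: (n_states, n_states) binary matrix of task transitions i -> j.
            Returns, for each lag, the mean lagged correlation along task transitions
            minus the same quantity along the reversed transitions.
            """
            T, n = state_prob.shape
            z = (state_prob - state_prob.mean(0)) / (state_prob.std(0) + 1e-12)
            fwd, bwd = [], []
            for lag in range(1, max_lag + 1):
                # lagged correlation between every pair of state time courses
                c = z[:-lag].T @ z[lag:] / (T - lag)
                fwd.append(c[transitions.astype(bool)].mean())
                bwd.append(c[transitions.T.astype(bool)].mean())
            return np.array(fwd) - np.array(bwd)

        # Toy demo: embed fast forward sweeps A->B->C->D (40 ms state-to-state lag)
        # into otherwise random "decoded" probabilities sampled at 10 ms resolution.
        n_states, T, lag_steps = 4, 6000, 4
        probs = rng.random((T, n_states)) * 0.1
        for start in rng.choice(T - n_states * lag_steps, size=200, replace=False):
            for k in range(n_states):
                probs[start + k * lag_steps, k] += 1.0
        task = np.zeros((n_states, n_states))
        task[[0, 1, 2], [1, 2, 3]] = 1
        s = sequenceness(probs, task)
        print("peak forward sequenceness at lag", (np.argmax(s) + 1) * 10, "ms")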

    Goals and information processing in human decisions

    We do not make decisions in a void. Every day, we act in awareness of our context, adjusting our objectives according to the situations we encounter. Operating effectively under multiple goals is fundamental for appropriate learning and decision-making, and deficiencies in this capacity can lie at the core of mental disorders such as anxiety, depression, or post-traumatic stress disorder. In this thesis, I present studies I conducted to investigate how goals impact different stages of the decision process, from simple perceptual choices to subjective value preferences. Previous studies have described how animals assess alternatives and integrate evidence to make decisions, but this work has mostly focused on simplified scenarios with a single goal. My experiments tackle the question of how people adjust information processing in tasks that involve more than one objective. Through various manipulations of behavioural goals, such as decision framing, I show that (i) attention and evidence accumulation, (ii) brain representations, and (iii) decision confidence are all affected by changes in context. Using behavioural testing, computational models, and neuroimaging, I show that goals play a crucial role in evidence integration and the allocation of visual attention. My findings indicate that brain patterns adapt to enhance goal-relevant information during learning and the valuation of alternatives. Finally, I report goal-dependent asymmetries in the generation of decision confidence, with the evidence for the option most relevant to the goal being overweighted. In conclusion, I show that the entire decision process is highly flexible and serves current behavioural demands. These findings support reinterpreting reported decision biases and irrationalities as attributes of adaptive processing directed towards goal fulfilment.
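
    The interplay of attention and evidence accumulation referred to above is often formalized as a gaze-weighted accumulator. The sketch below assumes an aDDM-style discounting of the momentarily unattended option, with illustrative parameters rather than those fitted in these studies.

        import numpy as np

        rng = np.random.default_rng(2)

        def simulate_attention_ddm(v_left, v_right, theta=0.3, drift_scale=0.002,
                                   noise_sd=0.02, bound=1.0, dt=1.0, max_t=10000):
            """One trial of an attention-weighted accumulator (aDDM-style sketch).

            While one option is fixated, the unfixated option's value is discounted
            by theta; gaze alternates with random dwell times. Returns choice and
            response time in ms.
            """
            rdv, t = 0.0, 0
            look_left = rng.random() < 0.5
            dwell_end = rng.integers(200, 600)           # dwell duration in ms
            while abs(rdv) < bound and t < max_t:
                if t >= dwell_end:                       # switch gaze
                    look_left = not look_left
                    dwell_end = t + rng.integers(200, 600)
                if look_left:
                    drift = drift_scale * (v_left - theta * v_right)
                else:
                    drift = drift_scale * (theta * v_left - v_right)
                rdv += drift * dt + rng.normal(0.0, noise_sd) * np.sqrt(dt)
                t += 1
            return ("left" if rdv > 0 else "right"), t

        choices = [simulate_attention_ddm(v_left=4, v_right=3)[0] for _ in range(500)]
        print("P(choose left) ~", choices.count("left") / 500)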

    27th Annual Computational Neuroscience Meeting (CNS*2018): Part One


    Quantization in acquisition and computation networks

    Thesis (Ph.D.), Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2013. Cataloged from PDF version of thesis. Includes bibliographical references (p. 151-165). In modern systems, it is often desirable to extract relevant information from large amounts of data collected at different spatial locations. Applications include sensor networks, wearable health-monitoring devices and a variety of other systems for inference. Several existing source coding techniques, such as Slepian-Wolf and Wyner-Ziv coding, achieve asymptotic compression optimality in distributed systems. However, these techniques are rarely used in sensor networks because of decoding complexity and prohibitively long code length. Moreover, the fundamental limits that arise from existing techniques are intractable to describe for a complicated network topology or when the objective of the system is to perform some computation on the data rather than to reproduce the data. This thesis bridges the technological gap between the needs of real-world systems and the optimistic bounds derived from asymptotic analysis. Specifically, we characterize fundamental trade-offs when the desired computation is incorporated into the compression design and the code length is one. To obtain both performance guarantees and achievable schemes, we use high-resolution quantization theory, which is complementary to the Shannon-theoretic analyses previously used to study distributed systems. We account for varied network topologies, such as those where sensors are allowed to collaborate or the communication links are heterogeneous. In these settings, a small amount of intersensor communication can provide a significant improvement in compression performance. As a result, this work suggests new compression principles and network design for modern distributed systems. Although the ideas in the thesis are motivated by current and future sensor network implementations, the framework applies to a wide range of signal processing questions. We draw connections between the fidelity criteria studied in the thesis and distortion measures used in perceptual coding. As a consequence, we determine the optimal quantizer for expected relative error (ERE), a measure that is widely useful but is often neglected in the source coding community. We further demonstrate that applying the ERE criterion to psychophysical models can explain the Weber-Fechner law, a longstanding hypothesis of how humans perceive the external world. Our results are consistent with the hypothesis that human perception is Bayesian optimal for information acquisition conditioned on limited cognitive resources, thereby supporting the notion that the brain is efficient at acquisition and adaptation. By John Z. Sun, Ph.D.
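
    The expected-relative-error result mentioned above can be made concrete with the standard high-resolution (Bennett-integral) argument; the following is a textbook-style sketch, with the source density chosen here purely for illustration rather than taken from the thesis. An $N$-point quantizer with point density $\lambda(x)$, $\int \lambda(x)\,dx = 1$, applied to a source with density $f_X$ under the weighted squared-error distortion $d(x,\hat{x}) = w(x)\,(x-\hat{x})^{2}$ incurs, at high resolution,

        D \approx \frac{1}{12 N^{2}} \int \frac{w(x)\, f_X(x)}{\lambda(x)^{2}}\, dx,

    which, by Hölder's inequality, is minimized by the point density

        \lambda^{*}(x) \propto \bigl( w(x)\, f_X(x) \bigr)^{1/3}.

    For expected relative error, $w(x) = 1/x^{2}$, so $\lambda^{*}(x) \propto \bigl( f_X(x)/x^{2} \bigr)^{1/3}$. If, for instance, the source density is scale-invariant on a bounded positive range, $f_X(x) \propto 1/x$, then $\lambda^{*}(x) \propto 1/x$ and the compressor $c(x) = \int^{x} \lambda^{*}(t)\, dt \propto \log x$: a logarithmic stimulus-to-representation mapping of the Weber-Fechner form.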

    Rhythmic fluctuations in mnemonic signatures during associative recall

    How information processing and neural representations are linked to the underlying hippocampal neural rhythms during episodic memory retrieval is yet to be fully explored in human subjects. In this doctoral thesis, the temporal order of these relationships was investigated, with emphasis on how the underlying processes evolve over time. Empirical evidence and neural network models suggest that the hippocampus and the hippocampal theta rhythm play a central role in episodic memory. In the first two studies, different oscillatory dynamics in the hippocampal circuit, thought to provide optimal states for encoding and retrieval, were investigated. The third study investigated the role of the hippocampal theta oscillation as an adaptive mechanism regulating competition between similar memories. Lastly, the fourth study investigated the role of sharp-wave ripples in promoting successful episodic memory retrieval. Throughout the four chapters, memory content was decoded using multivariate pattern classification, and the timing of memory reactivation was linked to two prominent oscillatory brain signatures: the hippocampal theta rhythm on the one hand, and hippocampal sharp-wave ripples on the other. In sum, this doctoral thesis provides support for powerful computations organized along the hippocampal theta oscillation, and for a close interplay between the hippocampus and neocortical areas, foremost at the time of retrieval.
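
    The link between decoded memory reactivation and hippocampal rhythms described above can be illustrated with a generic phase-binning analysis; the synthetic signals, filter band and binning below are assumptions made purely for illustration, not the analyses performed in the thesis.

        import numpy as np
        from scipy.signal import butter, filtfilt, hilbert

        rng = np.random.default_rng(3)
        fs = 500.0                                   # sampling rate (Hz)
        t = np.arange(0, 60, 1 / fs)                 # 60 s of data

        # Synthetic stand-ins: an LFP-like signal with a 6 Hz theta component, and
        # a decoder-evidence time course whose strength is theta-phase modulated.
        theta_arg = 2 * np.pi * 6 * t
        lfp = np.sin(theta_arg) + 0.5 * rng.normal(size=t.size)
        evidence = 0.5 + 0.3 * np.cos(theta_arg - np.pi) + 0.2 * rng.normal(size=t.size)

        # 1) Band-pass the signal in the theta range and extract instantaneous phase.
        b, a = butter(3, [4 / (fs / 2), 8 / (fs / 2)], btype="band")
        phase = np.angle(hilbert(filtfilt(b, a, lfp)))

        # 2) Bin decoder evidence by theta phase to see where reactivation peaks.
        bins = np.linspace(-np.pi, np.pi, 13)
        idx = np.digitize(phase, bins) - 1
        tuning = np.array([evidence[idx == k].mean() for k in range(12)])
        centres = 0.5 * (bins[:-1] + bins[1:])
        print(f"reactivation strongest near theta phase {centres[np.argmax(tuning)]:.2f} rad")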