22 research outputs found

    Structure and limits of unconscious episodic memory


    Larger capacity for unconscious versus conscious episodic memory

    Episodic memory is memory for experienced events. A peak competence of episodic memory is the mental combination of events to infer commonalities. Inferring commonalities may proceed with or without consciousness of the events. Yet what distinguishes conscious from unconscious inference? This question inspired nine experiments that featured strongly and weakly masked cartoon clips presented for unconscious and conscious inference. Each clip featured a scene with a visually impenetrable hiding place. Five animals crossed the scene one by one; each animal's trajectory represented one event. The animals moved through the hiding place, where they might linger or not. The participants' task was to observe the animals' entrances and exits and maintain a mental record of which animals hid simultaneously. We manipulated information load to explore capacity limits. Memory for inferences was tested immediately, 3.5 min, or 6 min following encoding. Participants retrieved inferences well when encoding was conscious. When encoding was unconscious, participants needed to respond intuitively, and only habitually intuitive decision makers exhibited significant delayed retrieval of inferences drawn unconsciously. Their unconscious retrieval performance did not drop significantly with increasing information load, whereas conscious retrieval performance dropped significantly. A working memory network, including the hippocampus, was activated during both conscious and unconscious inference and correlated with retrieval success. An episodic retrieval network, also including the hippocampus, was activated during both conscious and unconscious retrieval of inferences and correlated with retrieval success. Only conscious encoding and retrieval recruited additional brain regions outside these networks. Hence, level of consciousness influenced the memories' behavioral impact, memory capacity, and the neural representational code.

    Acoustic stimulation during sleep predicts long-lasting increases in memory performance and beneficial amyloid response in older adults.

    BACKGROUND Sleep and neurodegeneration are assumed to be locked in a bidirectional vicious cycle. Improving sleep could break this cycle and help to prevent neurodegeneration. We tested multi-night phase-locked acoustic stimulation (PLAS) during slow wave sleep (SWS) as a non-invasive method to improve SWS, memory performance, and plasma amyloid levels. METHODS 32 healthy older adults (mean age 68.9) completed a between-subject, sham-controlled, three-night intervention, preceded by a sham-PLAS baseline night. RESULTS PLAS induced increases in sleep-associated spectral-power bands as well as a 24% increase in slow wave-coupled spindles, which are known to support memory consolidation. There was no significant group difference in memory performance or amyloid-beta between the intervention and control groups. However, the magnitude of PLAS-induced physiological responses was associated with memory performance up to 3 months post-intervention and with beneficial changes in plasma amyloid. These results were exclusive to the intervention group. DISCUSSION Multi-night PLAS is associated with long-lasting benefits in memory and metabolite clearance in older adults, rendering PLAS a promising tool on which to build long-term protocols for the prevention of cognitive decline.
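    "Phase-locked" stimulation means sounds are delivered at a target phase of the ongoing slow wave. As a rough offline sketch of the analysis side of this idea (locating positive slow-wave peaks in delta-filtered EEG), consider the following. All parameters here (band edges, the 20 ”V amplitude threshold, the function name) are illustrative assumptions, not the study's actual pipeline, and real PLAS must predict the upcoming peak online from the instantaneous phase.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt, find_peaks

    def slow_wave_peaks(eeg, fs, band=(0.5, 4.0), min_amp_uv=20.0):
        """Locate positive slow-wave peaks in one EEG channel (offline sketch)."""
        nyq = fs / 2
        b, a = butter(3, [band[0] / nyq, band[1] / nyq], btype="band")
        slow = filtfilt(b, a, eeg)                       # delta-band component
        peaks, _ = find_peaks(slow,
                              height=min_amp_uv,         # only sizeable waves
                              distance=int(0.8 * fs))    # peaks >= ~0.8 s apart
        return peaks, slow

    # Synthetic demo: a 1 Hz "slow oscillation" of ~40 uV plus a little noise.
    fs = 200
    t = np.arange(0, 10, 1 / fs)
    rng = np.random.default_rng(0)
    eeg = 40 * np.sin(2 * np.pi * 1.0 * t) + rng.normal(0, 2, t.size)
    peaks, _ = slow_wave_peaks(eeg, fs)  # roughly one peak per second
    ```

    In a closed-loop system, the sample indices returned here would correspond to the moments at which a stimulus (a sound click, or a word in the sleep-learning studies below) is gated.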

    The hierarchy of coupled sleep oscillations reverses with aging in humans.

    A well-orchestrated coupling hierarchy of slow waves and spindles during slow wave sleep supports memory consolidation. In old age, the duration of slow wave sleep and the number of coupling events decrease. The coupling hierarchy deteriorates, predicting memory loss and brain atrophy. Here, we investigate the dynamics of this physiological change in slow wave-spindle coupling at a frontocentral electroencephalography position in a large sample (N=340; 237 female, 103 male) spanning most of the human lifespan (ages 15-83). We find that, instead of changing abruptly, spindles gradually shift from being driven by slow waves to driving them, reversing the coupling hierarchy typically seen in younger brains. The reversal was stronger the lower the slow wave frequency; it starts around midlife (∌age 40-48), with an established reversed hierarchy at ages 56-83. Notably, coupling strength remains unaffected by age. In older adults, deteriorating slow wave-spindle coupling, measured using the phase slope index (PSI) and the number of coupling events, is associated with blood plasma glial fibrillary acidic protein (GFAP) levels, a marker of astrocyte activation. Data-driven models suggest that decreased sleep time and higher age lead to fewer coupling events, paralleled by increased astrocyte activation. Counterintuitively, astrocyte activation is associated with a back-shift of the coupling hierarchy (PSI) towards a "younger" status, along with increased coupling occurrence and strength, potentially suggesting compensatory processes. As the changes in coupling hierarchy occur gradually starting at midlife, we suggest there exists a sizable window of opportunity for early interventions to counteract undesirable trajectories associated with neurodegeneration.
    Significance Statement: Evidence accumulates that sleep disturbances and cognitive decline are bidirectionally and causally linked, forming a vicious cycle. Improving sleep quality could break this cycle. One marker of sleep quality is a clear hierarchical structure of sleep oscillations. Previous studies showed that sleep oscillations decouple in old age. Here, we show that, rather, the hierarchical structure gradually shifts across the human lifespan and reverses in old age, while coupling strength remains unchanged. This shift is associated with markers of astrocyte activation in old age. The shifting hierarchy resembles brain maturation, plateau, and wear processes. This study furthers our comprehension of this important neurophysiological process and its dynamic evolution across the human lifespan.
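    Slow wave-spindle coupling is commonly quantified from the slow-oscillation phase at which spindle activity is maximal. The sketch below shows one standard Hilbert-transform approach to that quantity (the circular mean of slow-oscillation phase weighted by the spindle amplitude envelope); it is a minimal illustration with assumed band edges and filter settings, not the PSI-based pipeline used in the study.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    def so_spindle_coupling_phase(eeg, fs,
                                  so_band=(0.5, 4.0), sp_band=(12.0, 16.0)):
        """Preferred slow-oscillation (SO) phase of spindle activity.

        Returns the circular mean of SO phase weighted by the spindle
        amplitude envelope; 0 rad corresponds to the SO peak in this
        convention. Bands and filter order are illustrative assumptions.
        """
        def bandpass(x, lo, hi):
            b, a = butter(3, [lo / (fs / 2), hi / (fs / 2)], btype="band")
            return filtfilt(b, a, x)

        so_phase = np.angle(hilbert(bandpass(eeg, *so_band)))   # SO phase
        sp_env = np.abs(hilbert(bandpass(eeg, *sp_band)))       # spindle envelope
        return float(np.angle(np.sum(sp_env * np.exp(1j * so_phase))))

    # Synthetic demo: 14 Hz spindles whose amplitude peaks at SO peaks,
    # so the coupling phase should come out near 0 rad.
    fs = 200
    t = np.arange(0, 30, 1 / fs)
    so = np.cos(2 * np.pi * 1.0 * t)
    spindle = 0.3 * (1 + so) / 2 * np.sin(2 * np.pi * 14.0 * t)
    phase = so_spindle_coupling_phase(so + spindle, fs)
    ```

    A shift of this preferred phase across age groups is one way the reversal of the coupling hierarchy described above can show up in data.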

    Sleep-learning impairs subsequent awake-learning

    Although we can learn new information while asleep, we usually cannot consciously remember the sleep-formed memories, presumably because learning occurred in an unconscious state. Here, we ask whether sleep-learning expedites the subsequent awake-learning of the same information. To answer this question, we reanalyzed data (ZĂŒst et al., 2019, Curr Biol) from napping participants who learned new semantic associations between pseudowords and translation-words (guga–ship) while in slow-wave sleep. They retrieved sleep-formed associations unconsciously on an implicit memory test following awakening. Participants then took five runs of paired-associate learning to probe carry-over effects of sleep-learning on awake-learning. Surprisingly, sleep-learning diminished awake-learning when participants learned semantic associations that were congruent with sleep-learned associations (guga–boat). Yet learning associations that conflicted with sleep-learned associations (guga–coin) was unimpaired relative to learning new associations (resun–table; baseline). We speculate that the impeded wake-learning originated in deficient synaptic downscaling and the resulting synaptic saturation in neurons that were activated during both sleep-learning and awake-learning.

    Sleep-learning Impairs Subsequent Wake-learning

    Humans can unconsciously acquire new information during deep sleep. Although sleep-played information can guide behavior during subsequent wakefulness, sleep-formed memories cannot be remembered consciously after awakening. We explored whether sleep-learning might expedite conscious learning during subsequent wakefulness by providing a first bout of carving a new memory trace on which ensuing wake-learning can build. We analyzed previously unreported data acquired in a recent study on vocabulary learning during slow-wave sleep (ZĂŒst et al., 2019, Curr Biol). Sleep-played vocabulary was successfully retrieved in an implicit memory test administered following awakening. However, sleep-learning diminished rather than enhanced wake relearning of the same vocabulary. We speculate that vocabulary learning during sleep may have interfered with the synaptic down-scaling of hippocampal and neocortical language-related neurons, which were then too saturated for the further potentiation required for wake relearning of the same vocabulary.

    The role of slow wave sleep in the development of dementia and its potential for preventative interventions

    The increasing incidence of dementia underlines the necessity to identify early biomarkers of imminent cognitive decline. Recent findings suggest that cognitive decline and the pathophysiology of Alzheimer's disease are closely linked to disruptions in slow wave sleep (SWS), the deepest sleep stage. SWS is essential for memory functions and displays a potentially causal and bidirectional link to the accumulation of amyloid beta. Accordingly, improving SWS in older adults, especially those at risk for dementia, might slow the rate of cognitive decline. Recent work suggests that SWS can be improved by acoustic stimulation that specifically targets the electrophysiological peaks of the slow waves. In older adults, this approach is still fairly new and is accompanied by challenges posed by the specific complexity of their sleep physiology, such as lower-amplitude slow waves and fragmented sleep architecture. We suggest an approach that tackles these issues and attempts to re-instate a sleep physiology resembling that of a younger, healthier brain. With enough SWS of high quality, metabolic clearance and memory functions could benefit and help slow the process of cognitive aging. Ultimately, acoustic stimulation to enhance SWS could serve as a cost-effective, non-invasive tool to combat cognitive decline.

    Implicit Vocabulary Learning during Sleep Is Bound to Slow-Wave Peaks

    Learning while asleep is a dream of mankind but is often deemed impossible because sleep lacks the conscious awareness and neurochemical milieu thought to be necessary for learning. Current evidence for sleep learning in humans is inconclusive. To explore conditions under which verbal learning might occur, we hypothesized that peaks of slow waves would be conducive to verbal learning because the peaks define periods of neural excitability. While young German-speaking women and men were in slow-wave sleep during a nap, we played them a series of word pairs comprising pseudowords, e.g., “tofer,” and actual German words, e.g., “Haus” (house). When the presentation of the second word of a pair (e.g., “Haus” of “tofer–Haus”) coincided with an ongoing slow-wave peak, the chances increased that a new semantic association between the two words had been formed and retained. Sleep-formed associations carried over into wakefulness, where they guided forced choices on an implicit memory test. Reactivations of sleep-formed associations were mirrored by fMRI-measured activation increases in cortical language areas and in the hippocampus, a brain structure critical for relational binding. We infer that implicit relational binding had occurred during peaks of slow oscillations, recruiting a hippocampal-neocortical network comparable to that of vocabulary learning in the waking state.

    Implicit memory for content and speaker of messages heard during slow-wave sleep

    Although sleep is a state of unconsciousness, the sleeping brain does not completely cease to process external events. In fact, our brain can distinguish between meaningful and meaningless messages and can even learn contingencies between non-verbal events while asleep. Here, we asked whether sleeping humans can encode new verbal messages, learn the voices of unfamiliar speakers, and form associations between speakers and messages. To this aim, we presented 28 sentences uttered by 28 unfamiliar speakers to participants who were in EEG-defined slow-wave sleep. After waking, participants performed three tests that assessed recognition of sleep-played speakers, messages, and speaker-message associations. Recognition accuracy in all tests was at chance level, suggesting that sleep-played stimuli were not learned explicitly. However, response latencies were significantly shorter for correct versus incorrect decisions in the message recognition test, indicating implicit memory for sleep-played messages (but not for speakers or speaker-message combinations). Furthermore, participants with excellent implicit memory for sleep-played messages also displayed implicit memory for speakers (but not for speaker-message associations), as suggested by a significant correlation between the response-latency differences for the recognition of messages and of speakers. Implicit memory for speakers was verified by EEG at test: listening to sleep-played versus new speakers evoked a late centro-parietal negativity. Event-related EEG recorded during sleep revealed that peaks resembling the up-states of sleep slow waves contributed to sleep-learning: participants with larger evoked slow-wave peaks later showed stronger implicit memory. Overall, humans appear able to implicitly learn the semantic content and the speakers of sleep-played messages, and these forms of sleep-learning are mediated by slow waves.