3 research outputs found

    Closed-loop auditory stimulation of sleep slow oscillations: Basic principles and best practices.

    Sleep is essential for our physical and mental well-being. During sleep, despite the paucity of overt behavior, our brain remains active and exhibits a wide range of coupled brain oscillations. Slow oscillations in particular are characteristic of sleep; however, whether they are directly involved in the functions of sleep or are mere epiphenomena is not yet fully understood. To disentangle the causality of these relationships, experiments that detect and manipulate sleep oscillations in real time are essential. In this review, we first outline the theoretical principles of closed-loop auditory stimulation (CLAS) as a method for studying the role of slow oscillations in the functions of sleep. We then describe technical guidelines and best practices for performing CLAS and analyzing the results of such experiments. We further provide an overview of how CLAS has been used to investigate the causal role of slow oscillations in various sleep functions. We close by discussing important caveats, open questions, and potential topics for future research.
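
    The core technique summarized above is to detect slow oscillations in the ongoing EEG in real time and deliver auditory stimuli time-locked to a particular oscillation phase. The Python sketch below illustrates that idea only in outline; the sampling rate, the −80 µV down-state threshold, the 0.5–1.25 Hz band, the 500 ms up-phase delay, and the refractory period are illustrative assumptions rather than values taken from the paper, and the offline zero-phase filter stands in for the causal filtering a real closed-loop system would need.

import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 250                   # EEG sampling rate in Hz (assumed)
DOWN_THRESHOLD_UV = -80.0  # down-state detection threshold in microvolts (assumed)
UP_PHASE_DELAY_S = 0.5     # delay from down-state to predicted up-phase (assumed)
REFRACTORY_S = 2.5         # minimum spacing between stimulations (assumed)

def slow_oscillation_band(eeg_uv, fs=FS):
    """Band-pass the EEG to the slow-oscillation range (~0.5-1.25 Hz).
    Zero-phase filtering is used here for simplicity; a real closed-loop
    system needs a causal filter."""
    sos = butter(2, [0.5, 1.25], btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, eeg_uv)

def schedule_clicks(eeg_uv, fs=FS):
    """Return sample indices at which an auditory click would be triggered."""
    so = slow_oscillation_band(eeg_uv, fs)
    clicks, last_click = [], -np.inf
    for i in range(1, len(so)):
        # a negative-going threshold crossing marks a putative down-state
        if so[i - 1] > DOWN_THRESHOLD_UV >= so[i]:
            t_click = i + int(UP_PHASE_DELAY_S * fs)
            if t_click < len(so) and t_click - last_click > REFRACTORY_S * fs:
                clicks.append(t_click)
                last_click = t_click
    return np.array(clicks)

if __name__ == "__main__":
    # synthetic 30 s "EEG": a 0.8 Hz slow oscillation plus noise, in microvolts
    t = np.arange(0, 30, 1 / FS)
    eeg = 120 * np.sin(2 * np.pi * 0.8 * t) + 20 * np.random.randn(t.size)
    print("click times (s):", schedule_clicks(eeg) / FS)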

    Closed-loop auditory stimulation of sleep slow oscillations: basic principles and best practices

    Sleep is an indispensable part of our lives and plays a critical role in our physical and mental well-being. During sleep, despite the paucity of behavior, our brain stays active and exhibits a wide range of coupled brain oscillations. Activity in these sleep-characteristic brain oscillations has been linked to various functions of sleep. However, whether these oscillations mediate those functions or are mere epiphenomena is not yet fully understood. To disentangle the causality of these relationships, experiments using non-invasive stimulation techniques have been essential. In particular, auditory stimulation aligned with neural activity in a closed-loop fashion has drawn increasing attention in recent years because of its specificity and practical advantages. In this review, we summarize closed-loop auditory stimulation experiments that have generated evidence for a causal role of slow oscillations in the information-reprocessing functions of sleep. Furthermore, we provide technical details and guidelines on best practices in closed-loop auditory stimulation, from how to perform the stimulation to analysis strategies. Besides discussing important caveats and open questions, we also touch on various areas in which closed-loop auditory stimulation is applicable, from fundamental investigations of memory processing and endocrine function to potential clinical applications. Finally, we propose potential topics for future research.
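
    The abstract above also points to analysis strategies for such experiments. One common strategy is to average the EEG around stimulation triggers and around matched sham triggers (detections without a tone) to obtain the evoked response. The sketch below assumes a single-channel signal, a 250 Hz sampling rate, and a −1 s to +3 s epoch window; these are illustrative choices, not recommendations taken from the review.

import numpy as np

FS = 250  # EEG sampling rate in Hz (assumed)

def epoch(eeg_uv, trigger_samples, fs=FS, pre_s=1.0, post_s=3.0):
    """Cut fixed-length epochs around each trigger sample; epochs that would
    run past the recording edges are dropped."""
    pre, post = int(pre_s * fs), int(post_s * fs)
    segments = [eeg_uv[t - pre:t + post] for t in trigger_samples
                if t - pre >= 0 and t + post <= len(eeg_uv)]
    return np.array(segments)  # shape: (n_epochs, pre + post)

def evoked_responses(eeg_uv, stim_triggers, sham_triggers, fs=FS):
    """Return (time axis in s, stimulation average, sham average) for one channel."""
    stim = epoch(eeg_uv, stim_triggers, fs).mean(axis=0)
    sham = epoch(eeg_uv, sham_triggers, fs).mean(axis=0)
    time = np.arange(-1.0, 3.0, 1 / fs)
    return time, stim, sham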

    High-precision spatial analysis of mouse courtship vocalization behavior reveals sex and strain differences

    Abstract Mice display a wide repertoire of vocalizations that varies with sex, strain, and context. Especially during social interaction, including sexually motivated dyadic interaction, mice emit sequences of ultrasonic vocalizations (USVs) of high complexity. Because animals of both sexes vocalize, reliable attribution of USVs to their emitter is essential. The state of the art in 2D sound localization for USVs achieves a spatial resolution of several centimeters, yet animals interact at much closer range, e.g. snout to snout, so improved algorithms are required to assign USVs reliably. We present a novel algorithm, SLIM (Sound Localization via Intersecting Manifolds), which achieves a 2–3-fold improvement in accuracy (13.1–14.3 mm) using only four microphones and extends to larger microphone arrays and localization in 3D. This accuracy allows reliable assignment of 84.3% of all USVs in our dataset. We apply SLIM to courtship interactions between adult C57Bl/6J wildtype mice and mice carrying a heterozygous Foxp2 variant (R552H). The improved spatial accuracy reveals that vocalization behavior depends on the spatial relation between the interacting mice: females vocalized more during close snout-to-snout interaction, whereas males vocalized more when their snout was close to the female's ano-genital region. Furthermore, the acoustic properties of the USVs (duration, Wiener entropy, and sound level) depend on the spatial relation between the interacting mice as well as on the genotype. In conclusion, the improved attribution of vocalizations to their emitters provides a foundation for a better understanding of social vocal behaviors.
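
    The abstract does not give enough algorithmic detail to reproduce SLIM itself, so the sketch below only illustrates the generic time-difference-of-arrival (TDOA) baseline that 2D microphone-array localization of USVs builds on: estimate pairwise arrival-time differences from cross-correlations and grid-search the source position that best explains them. The microphone layout, arena size, sampling rate, and grid step are assumptions for illustration; SLIM's manifold-based formulation replaces this brute-force search.

import numpy as np

FS = 250_000            # sampling rate in Hz, typical for ultrasonic recordings (assumed)
SPEED_OF_SOUND = 343.0  # m/s in air

# four microphones at the corners of a 0.4 m x 0.4 m arena (assumed layout)
MICS = np.array([[0.0, 0.0], [0.4, 0.0], [0.0, 0.4], [0.4, 0.4]])

def measured_tdoas(signals, fs=FS):
    """Estimate the arrival-time difference of each microphone relative to
    microphone 0 from the peak of the cross-correlation."""
    ref = signals[0]
    tdoas = []
    for sig in signals[1:]:
        xcorr = np.correlate(sig, ref, mode="full")
        lag = np.argmax(xcorr) - (len(ref) - 1)   # positive lag: sig arrives later
        tdoas.append(lag / fs)
    return np.array(tdoas)

def localize(signals, grid_step=0.005):
    """Grid-search the 2D source position that best explains the measured TDOAs."""
    observed = measured_tdoas(signals)
    coords = np.arange(0.0, 0.4 + grid_step, grid_step)
    best_pos, best_err = None, np.inf
    for x in coords:
        for y in coords:
            dist = np.linalg.norm(MICS - np.array([x, y]), axis=1)
            predicted = (dist[1:] - dist[0]) / SPEED_OF_SOUND
            err = np.sum((predicted - observed) ** 2)
            if err < best_err:
                best_pos, best_err = np.array([x, y]), err
    return best_pos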