
    Hyperscanning Alone Cannot Prove Causality. Multibrain Stimulation Can


    Towards a unified neural mechanism for reactive adaptive behaviour

    Surviving in natural environments requires animals to sense sudden events and swiftly adapt behaviour accordingly. The study of such Reactive Adaptive Behaviour (RAB) has been central to a number of research streams, all orbiting around movement science but progressing in parallel, with little cross-field fertilization. We first provide a concise review of these research streams, independently describing four types of RAB: (1) cortico-muscular resonance, (2) stimulus-locked response, (3) online motor correction, and (4) action stopping. We then highlight remarkable similarities across these four RABs, suggesting that they might be subserved by the same neural mechanism, and propose directions for future research on this topic.

    Neural alpha oscillations index the balance between self-other integration and segregation in real-time joint action

    Shared knowledge and interpersonal coordination are prerequisites for most forms of social behavior. Influential approaches to joint action have conceptualized these capacities in relation to the separate constructs of co-representation (knowledge) and self-other entrainment (coordination). Here we investigated how brain mechanisms involved in co-representation and entrainment interact to support joint action. To do so, we used a musical joint action paradigm to show that the neural mechanisms underlying co-representation and self-other entrainment are linked via a process, indexed by EEG alpha oscillations, that regulates the balance between self-other integration and segregation in real time. Pairs of pianists performed short musical items while action familiarity and interpersonal (behavioral) synchronization accuracy were manipulated in a factorial design. Action familiarity referred to whether or not pianists had rehearsed the musical material performed by the other beforehand. Interpersonal synchronization was manipulated via congruent or incongruent tempo change instructions that biased performance timing towards the impending, new tempo. When pianists were familiar with each other's parts, millisecond-scale variations in interpersonal synchronization were associated with a modulation of alpha power over right centro-parietal scalp regions. Specifically, high behavioral entrainment was associated with self-other integration, as indexed by alpha suppression. Conversely, low behavioral entrainment encouraged reliance on internal knowledge and thus led to self-other segregation, indexed by alpha enhancement. These findings suggest that alpha oscillations index the processing of information about self and other depending on the compatibility of internal knowledge and external (environmental) events at finely resolved timescales.

    Local spatial analysis: an easy-to-use adaptive spatial EEG filter

    Spatial EEG filters are widely used to isolate event-related potential (ERP) components. The most commonly used spatial filters (e.g., the average reference and the surface Laplacian) are "stationary." Stationary filters are conceptually simple, easy to use, and fast to compute, but all assume that the EEG signal does not change across sensors and time. Given that ERPs are intrinsically nonstationary, applying stationary filters can lead to misinterpretations of the measured neural activity. In contrast, "adaptive" spatial filters (e.g., independent component analysis, ICA; and principal component analysis, PCA) infer their weights directly from the spatial properties of the data. They are, thus, not affected by the shortcomings of stationary filters. The issue with adaptive filters is that understanding how they work and how to interpret their output requires advanced statistical and physiological knowledge. Here, we describe a novel, easy-to-use, and conceptually simple adaptive filter (local spatial analysis, LSA) for highlighting local components masked by large widespread activity. This approach exploits the statistical information stored in the trial-by-trial variability of stimulus-evoked neural activity to estimate the spatial filter parameters adaptively at each time point. Using both simulated data and real ERPs elicited by stimuli of four different sensory modalities (audition, vision, touch, and pain), we show that this method outperforms widely used stationary filters and allows the identification of novel ERP components masked by large widespread activity. Implementation of the LSA filter in MATLAB is freely available to download. NEW & NOTEWORTHY: EEG spatial filtering is important for exploring brain function. Two classes of filters are commonly used: stationary and adaptive. Stationary filters are simple to use but wrongly assume that stimulus-evoked EEG responses (ERPs) are stationary. Adaptive filters do not make this assumption but require solid statistical and physiological knowledge. Bridging this gap, we present local spatial analysis (LSA), an adaptive, yet computationally simple, spatial filter based on linear regression that separates local and widespread brain activity (https://www.iannettilab.net/lsa.html or https://github.com/rorybufacchi/LSA-filter).
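    The abstract describes LSA only in general terms (a linear regression whose parameters are re-estimated from trial-by-trial variability at every time point); the reference implementation is the linked MATLAB/GitHub code. As a rough Python sketch of that general idea only, where the across-channel mean is assumed as the "widespread" regressor and the ERP decomposition is chosen purely for illustration (neither is guaranteed to match the actual LSA algorithm):

```python
import numpy as np

def local_widespread_split(erp_trials):
    """Split each channel's ERP into a part explained by widespread activity
    and a residual, candidate "local" part, re-fitting the regression at every
    time point. erp_trials has shape (n_trials, n_channels, n_times)."""
    n_trials, n_channels, n_times = erp_trials.shape
    local = np.zeros((n_channels, n_times))
    widespread = np.zeros((n_channels, n_times))

    # Assumed proxy for widespread activity: the across-channel mean of each trial.
    global_signal = erp_trials.mean(axis=1)                # (n_trials, n_times)

    for t in range(n_times):
        g = global_signal[:, t]
        g_c = g - g.mean()
        denom = np.dot(g_c, g_c) + 1e-12                   # guard against zero variance
        for ch in range(n_channels):
            y = erp_trials[:, ch, t]
            # Least-squares slope of the channel's trial-by-trial data on the
            # global signal, estimated anew at this time point (hence "adaptive").
            beta = np.dot(g_c, y - y.mean()) / denom
            widespread[ch, t] = beta * g.mean()            # ERP part shared with global activity
            local[ch, t] = y.mean() - widespread[ch, t]    # what remains after removing it
    return local, widespread
```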

    Endogenous sources of interbrain synchrony in duetting pianists

    When people interact with each other, their brains synchronise. However, it remains unclear whether interbrain synchrony (IBS) is functionally relevant for social interaction or stems from exposure of individual brains to identical sensorimotor information. To disentangle these views, the current dual-EEG study investigated amplitude-based IBS in pianists jointly performing duets containing a silent pause followed by a tempo change. First, we manipulated the similarity of the anticipated tempo change and measured IBS during the pause, hence capturing the alignment of purely endogenous, temporal plans without sound or movement. Notably, right posterior gamma IBS was higher when partners planned similar tempi, predicted whether partners’ tempi matched after the pause, and was modulated only in real, not in surrogate, pairs. Second, we manipulated the familiarity with the partner’s actions and measured IBS during joint performance with sound. Although sensorimotor information was similar across conditions, gamma IBS was higher when partners were unfamiliar with each other’s part and had to attend more closely to the sound of the performance. These combined findings demonstrate that IBS is not merely an epiphenomenon of shared sensorimotor information, but can also hinge on endogenous, cognitive processes crucial for behavioural synchrony and successful social interaction.
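    The abstract specifies amplitude-based IBS in the gamma band, evaluated against surrogate pairs. A minimal sketch of how such a coupling measure could be computed is below; the 30-45 Hz band, the envelope-correlation metric, and circular-shift surrogates (the study instead compared real with surrogate pairs of participants) are all assumptions of this illustration, not details taken from the paper:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def gamma_envelope(eeg, fs, band=(30.0, 45.0)):
    """Band-pass one channel into an assumed gamma range and return its amplitude envelope."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    return np.abs(hilbert(filtfilt(b, a, eeg)))

def amplitude_ibs(sig_a, sig_b, fs):
    """Amplitude-based inter-brain coupling: correlation of the two partners' envelopes."""
    return np.corrcoef(gamma_envelope(sig_a, fs), gamma_envelope(sig_b, fs))[0, 1]

def ibs_null(sig_a, sig_b, fs, n_surrogates=200, seed=0):
    """Null distribution from deliberately misaligned partners (circular time shifts)."""
    rng = np.random.default_rng(seed)
    env_a, env_b = gamma_envelope(sig_a, fs), gamma_envelope(sig_b, fs)
    shifts = rng.integers(1, env_b.size, size=n_surrogates)
    return np.array([np.corrcoef(env_a, np.roll(env_b, s))[0, 1] for s in shifts])
```

    The observed coupling would then be compared against this null distribution (e.g., as a percentile) to judge whether it exceeds what shared signal structure alone predicts.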

    Cortico-cerebellar audio-motor regions coordinate self and other in musical joint action

    Joint music performance requires flexible sensorimotor coordination between self and other. Cognitive and sensory parameters of joint action—such as shared knowledge or temporal (a)synchrony—influence this coordination by shifting the balance between self-other segregation and integration. To investigate the neural bases of these parameters and their interaction during joint action, we asked pianists to play on an MR-compatible piano, in duet with a partner outside of the scanner room. Motor knowledge of the partner’s musical part and the temporal compatibility of the partner’s action feedback were manipulated. First, we found stronger activity and functional connectivity within cortico-cerebellar audio-motor networks when pianists had practiced their partner’s part before. This indicates that they simulated and anticipated the auditory feedback of the partner by virtue of an internal model. Second, we observed stronger cerebellar activity and reduced behavioral adaptation when pianists encountered subtle asynchronies between these model-based anticipations and the perceived sensory outcome of (familiar) partner actions, indicating a shift towards self-other segregation. These combined findings demonstrate that cortico-cerebellar audio-motor networks link motor knowledge and other-produced sounds depending on cognitive and sensory factors of the joint performance, and play a crucial role in balancing self-other integration and segregation.

    Lateral prefrontal cortex is a hub for music production from structural rules to movements

    Complex sequential behaviours, such as speaking or playing music, entail flexible rule-based chaining of single acts. However, it remains unclear how the brain translates abstract structural rules into movements. We combined music production with multi-modal neuroimaging to dissociate high-level structural and low-level motor planning. Pianists played novel musical chord sequences on a muted MR-compatible piano by imitating a model hand on screen. Chord sequences were manipulated in terms of musical harmony and context length to assess structural planning, and in terms of fingers used for playing to assess motor planning. A model of probabilistic sequence processing confirmed temporally extended dependencies between chords, as opposed to local dependencies between movements. Violations of structural plans activated the left inferior frontal and middle temporal gyrus, and the fractional anisotropy of the ventral pathway connecting these two regions positively predicted behavioural measures of structural planning. A bilateral fronto-parietal network was instead activated by violations of motor plans. Both structural and motor networks converged in lateral prefrontal cortex, with anterior regions contributing to musical structure building, and posterior areas to movement planning. These results establish a promising approach to study sequence production at different levels of action representation.

    Ultralow-frequency neural entrainment to pain

    Nervous systems exploit regularities in the sensory environment to predict sensory input, adjust behavior, and thereby maximize fitness. Entrainment of neural oscillations allows retaining temporal regularities of sensory information, a prerequisite for prediction. Entrainment has been extensively described at the frequencies of periodic inputs most commonly present in visual and auditory landscapes (e.g., >0.5 Hz). An open question is whether neural entrainment also occurs for regularities at much longer timescales. Here, we exploited the fact that the temporal dynamics of thermal stimuli in natural environments can unfold very slowly. We show that ultralow-frequency neural oscillations preserved a long-lasting trace of sensory information through neural entrainment to periodic thermo-nociceptive input at frequencies as low as 0.1 Hz. Importantly, revealing the functional significance of this phenomenon, both the power and the phase of the entrainment predicted individual pain sensitivity. In contrast, periodic auditory input at the same ultralow frequency did not entrain ultralow-frequency oscillations. These results demonstrate that functionally significant neural entrainment can occur at temporal scales far longer than those commonly explored. The non-supramodal nature of our results suggests that ultralow-frequency entrainment might be tuned to the temporal scale of the statistical regularities characteristic of different sensory modalities.
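    The claim that both the power and the phase of the 0.1 Hz entrainment predict pain sensitivity presupposes estimating those two quantities from long recordings (one cycle at 0.1 Hz lasts 10 s, so many minutes of data are needed to resolve that frequency bin). A generic frequency-tagging estimate, not the authors' actual pipeline, might look like this:

```python
import numpy as np

def entrainment_power_phase(eeg, fs, f_stim=0.1):
    """Power and phase of the EEG component at the (assumed) stimulation
    frequency, read off the Fourier transform of the whole recording."""
    x = np.asarray(eeg, dtype=float)
    x = x - x.mean()                               # remove the DC offset
    spectrum = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
    k = int(np.argmin(np.abs(freqs - f_stim)))     # bin nearest to the stimulation rate
    power = (np.abs(spectrum[k]) / x.size) ** 2    # normalised power at that bin
    phase = float(np.angle(spectrum[k]))           # phase relative to recording onset
    return power, phase
```

    Relating these per-participant estimates to pain ratings, for example with a simple correlation across participants, would then test their predictive value.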