93 research outputs found

    Unsupervised vector-based classification of single-molecule charge transport data

    The stochastic nature of single-molecule charge transport measurements requires the collection of large data sets to capture the full complexity of a molecular system. Data analysis is then guided by certain expectations, for example a plateau feature in the tunnelling current-distance trace, and the molecular conductance is extracted from a suitable histogram analysis. However, differences in molecular conformation or electrode contact geometry, the number of molecules in the junction, or dynamic effects may lead to very different molecular signatures. Since their manifestation is a priori unknown, an unsupervised classification algorithm, making no prior assumptions about the data, is clearly desirable. Here we present such an approach based on multivariate pattern analysis and apply it to simulated and experimental single-molecule charge transport data. We demonstrate how different event shapes are clearly separated by this algorithm and how statistics about different event classes can be extracted when conventional methods of analysis fail.
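    As a purely illustrative sketch of this kind of unsupervised, vector-based analysis (not the authors' actual algorithm), each current-distance trace could be resampled onto a common grid to form a fixed-length feature vector and then clustered without labels; the trace format, feature construction and number of clusters below are assumptions.

    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.cluster import KMeans

    def trace_to_vector(current, distance, n_bins=64):
        # Resample log-current versus electrode displacement onto a fixed grid
        # so that traces of different lengths become comparable vectors.
        grid = np.linspace(distance.min(), distance.max(), n_bins)
        return np.interp(grid, distance, np.log10(np.abs(current) + 1e-12))

    def cluster_traces(traces, n_clusters=4):
        # traces: iterable of (current, distance) array pairs from breaking-junction runs
        X = np.array([trace_to_vector(c, d) for c, d in traces])
        X = StandardScaler().fit_transform(X)  # zero-mean, unit-variance features
        return KMeans(n_clusters=n_clusters, n_init=10).fit_predict(X)  # class label per trace

    Per-class conductance histograms could then be built from the traces assigned to each label, which is the kind of event-class statistic the abstract refers to.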

    Motor skill learning between selection and execution.

    Learning motor skills evolves from the effortful selection of single movement elements to their combined fast and accurate production. We review recent trends in the study of skill learning, which suggest a hierarchical organization of the representations underlying such expert performance, with premotor areas encoding short sequential movement elements (chunks) or particular component features (timing/spatial organization). This hierarchical representation allows the system to utilize elements of well-learned skills in a flexible manner. One neural correlate of skill development is the emergence of specialized neural circuits that can produce the required elements in a stable and invariant fashion. We discuss the challenges in detecting these changes with fMRI.

    Function of the ventral premotor cortex in auditory-motor integration of musical rhythm

    The tendency to move in synchrony with an auditory rhythmical pulse is considered a human universal. Nevertheless, it remains unknown which neural mechanisms give rise to the urge and ability to accurately couple one's own movements to an auditory rhythm. Using functional magnetic resonance imaging (fMRI) and transcranial magnetic stimulation (TMS), the present thesis demonstrates a causal contribution of a motor-related brain region with prominent connections to auditory areas, the ventral premotor cortex (PMv), to auditory-motor integration of musical rhythm. The current findings suggest a critical role of the PMv in both perceptual preference for and motor coupling to a musical rhythm, and reveal additional neural mechanisms that support auditory-motor timing. The thesis provides a neuroanatomically grounded model of auditory-motor integration of rhythm which incorporates the present experimental findings into a framework of sensorimotor control and cognition.

    Tempo and intensity of pre-task music modulate neural activity during reactive task performance

    Research has shown that not only do young athletes purposively use music to manage their emotional state (Bishop, Karageorghis, & Loizou, 2007), but also that brief periods of music listening may facilitate their subsequent reactive performance (Bishop, Karageorghis, & Kinrade, 2009). We report an fMRI study in which young athletes lay in an MRI scanner and listened to a popular music track immediately prior to performing a three-choice reaction time task; intensity and tempo were modified such that six excerpts (2 intensities × 3 tempi) were created. Neural activity was measured throughout. Faster tempi and higher intensity collectively yielded activation in structures integral to visual perception (inferior temporal gyrus), allocation of attention (cuneus, inferior parietal lobule, supramarginal gyrus), and motor control (putamen) during reactive performance. The implications for music listening as a pre-competition strategy in sport are discussed.

    Neural Competitive Queuing of Ordinal Structure Underlies Skilled Sequential Action.

    Fluent retrieval and execution of movement sequences is essential for daily activities, but the neural mechanisms underlying sequence planning remain elusive. Here participants learned finger press sequences with different orders and timings and reproduced them in a magnetoencephalography (MEG) scanner. We classified the MEG patterns for each press in the sequence and examined pattern dynamics during preparation and production. Our results demonstrate the "competitive queuing" (CQ) of upcoming action representations, extending previous computational and non-human primate recording studies to non-invasive measures in humans. In addition, we show that CQ reflects an ordinal template that generalizes across specific motor actions at each position. Finally, we demonstrate that CQ predicts participants' production accuracy and originates from parahippocampal and cerebellar sources. These results suggest that the brain learns and controls multiple sequences by flexibly combining representations of specific actions and interval timing with high-level, parallel representations of sequence position.
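    A minimal sketch of the decoding step described above, assuming a trials-by-features MEG matrix and a standard linear classifier; the data layout, classifier choice and cross-validation scheme are illustrative assumptions rather than the study's actual pipeline.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    def decode_press_identity(X, y, folds=5):
        # X: (n_trials, n_sensors * n_timepoints) MEG patterns around single presses
        # y: finger pressed on each trial (class labels)
        clf = LogisticRegression(max_iter=1000)
        # Cross-validated accuracy of classifying press identity from MEG patterns.
        # The fitted classifier could then be applied to preparation-period data to
        # test whether decoder evidence for upcoming presses is graded by serial
        # position, which is the competitive-queuing signature described above.
        return cross_val_score(clf, X, y, cv=folds).mean()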

    Surmising synchrony of sound and sight: Factors explaining variance of audiovisual integration in hurdling, tap dancing and drumming.

    Auditory and visual percepts are integrated even when they are not perfectly temporally aligned with each other, especially when the visual signal precedes the auditory signal. This window of temporal integration for asynchronous audiovisual stimuli is relatively well examined in the case of speech, while other natural action-induced sounds have been widely neglected. Here, we studied the detection of audiovisual asynchrony in three different whole-body actions with natural action-induced sounds: hurdling, tap dancing and drumming. In Study 1, we examined whether audiovisual asynchrony detection, assessed by a simultaneity judgment task, differs as a function of sound production intentionality. Based on previous findings, we expected auditory and visual signals to be integrated over a wider temporal window for actions creating sounds intentionally (tap dancing) than for actions creating sounds incidentally (hurdling). While the percentages of perceived synchrony differed in the expected way, we identified two further factors, high event density and low rhythmicity, that also induced higher synchrony ratings. Therefore, we systematically varied event density and rhythmicity in Study 2, this time using drumming stimuli to exert full control over these variables, together with the same simultaneity judgment tasks. The results suggest that high event density leads to a bias to integrate rather than segregate auditory and visual signals, even at relatively large asynchronies. Rhythmicity had a similar, albeit weaker, effect when event density was low. Our findings demonstrate that shorter asynchronies and visual-first asynchronies lead to higher synchrony ratings of whole-body actions, pointing to clear parallels with audiovisual integration in speech perception. Overconfidence in the naturally expected, that is, the synchrony of sound and sight, was stronger for intentional (vs. incidental) sound production and for movements with high (vs. low) rhythmicity, presumably because both encourage predictive processes. In contrast, high event density appears to increase synchrony judgments simply because it makes the detection of audiovisual asynchrony more difficult. More studies using real-life audiovisual stimuli with varying event densities and rhythmicities are needed to fully uncover the general mechanisms of audiovisual integration.
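    As an illustrative sketch of how a temporal integration window can be estimated from a simultaneity judgment task, the proportion of "synchronous" responses can be modelled as a Gaussian over audiovisual asynchrony; the SOA values and response proportions below are hypothetical, and the parametric form is an assumption, not the study's analysis.

    import numpy as np
    from scipy.optimize import curve_fit

    def synchrony_curve(soa, amplitude, center, width):
        # Proportion of "synchronous" responses as a function of audiovisual
        # asynchrony (negative SOA = visual leads), modelled as a Gaussian.
        return amplitude * np.exp(-(soa - center) ** 2 / (2.0 * width ** 2))

    # Hypothetical example data, not taken from the study.
    soas = np.array([-400, -300, -200, -100, 0, 100, 200, 300, 400], dtype=float)
    p_sync = np.array([0.25, 0.45, 0.70, 0.90, 0.95, 0.80, 0.55, 0.35, 0.20])
    params, _ = curve_fit(synchrony_curve, soas, p_sync, p0=[1.0, -50.0, 150.0])
    print("window centre (ms):", params[1], "window width (SD, ms):", params[2])

    A wider fitted width corresponds to a broader temporal window over which sound and sight are still judged as synchronous.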

    A unifying motor control framework for task-specific dystonia.

    Task-specific dystonia is a movement disorder characterized by a painless loss of dexterity specific to a particular motor skill. This disorder is prevalent among writers, musicians, dancers and athletes. No current treatment is predictably effective, and the disorder generally ends the careers of affected individuals. Traditional disease models of dystonia have a number of limitations with regard to task-specific dystonia. We therefore discuss emerging evidence that the disorder has its origins within normal compensatory mechanisms of a healthy motor system in which the representation and reproduction of motor skill are disrupted. We describe how risk factors for task-specific dystonia can be stratified and translated into mechanisms of dysfunctional motor control. The proposed model aims to define new directions for experimental research and stimulate therapeutic advances for this highly disabling disorder.
