43 research outputs found

    Unique contributions of perceptual and conceptual humanness to object representations in the human brain

    The human brain is able to quickly and accurately identify objects in a dynamic visual world. Objects evoke different patterns of neural activity in the visual system, which reflect object category memberships. However, the underlying dimensions of object representations in the brain remain unclear. Recent research suggests that an object's similarity to humans is one of the main dimensions used by the brain to organise objects, but the nature of the human-similarity features driving this organisation is still unknown. Here, we investigate the relative contributions of perceptual and conceptual features of humanness to the representational organisation of objects in the human visual system. We collected behavioural judgements of the human-similarity of various objects, which were compared with time-resolved neuroimaging responses to the same objects. The behavioural judgement tasks targeted either perceptual or conceptual humanness features to determine their respective contributions to perceived human-similarity. Behavioural and neuroimaging data revealed significant and unique contributions of both perceptual and conceptual features of humanness, each explaining unique variance in the neuroimaging data. Furthermore, our results showed distinct spatio-temporal dynamics in the processing of conceptual and perceptual humanness features, with later and more lateralised brain responses to conceptual features. This study highlights the critical importance of social requirements in information processing and organisation in the human brain.

    Getting your sea legs

    Sea travel mandates changes in the control of the body. The process by which we adapt bodily control to life at sea is known as getting one's sea legs. We conducted the first experimental study of bodily control as maritime novices adapted to the motion of a ship at sea. We evaluated postural activity (stance width, stance angle, and the kinematics of body sway) before and during a sea voyage. In addition, we evaluated the role of the visible horizon in the control of body sway. Finally, we related data on postural activity to two subjective experiences that are associated with sea travel: seasickness and mal de débarquement. Our results revealed rapid changes in postural activity among novices at sea. Before the beginning of the voyage, the temporal dynamics of body sway differed among participants as a function of their (subsequent) severity of seasickness. Body sway measured at sea differed among participants as a function of their (subsequent) experience of mal de débarquement. We discuss implications of these results for general theories of the perception and control of bodily orientation, for the etiology of motion sickness, and for general phenomena of perceptual-motor adaptation and learning.

    Neural tracking and integration of 'self' and 'other' in improvised interpersonal coordination

    Humans coordinate their movements with one another in a range of everyday activities and skill domains. Optimal joint performance requires the continuous anticipation of and adaptation to each other's movements, especially when actions are spontaneous rather than pre-planned. Here we employ dual-EEG and frequency-tagging techniques to investigate how the neural tracking of self- and other-generated movements supports interpersonal coordination during improvised motion. LEDs flickering at 5.7 and 7.7 Hz were attached to participants’ index fingers in 28 dyads as they produced novel patterns of synchronous horizontal forearm movements. EEG responses at these frequencies revealed enhanced neural tracking of self-generated movements when leading and of other-generated movements when following. A marker of self-other integration at 13.4 Hz (inter-modulation frequency of 5.7 and 7.7 Hz) peaked when no leader was designated, and mutual adaptation and movement synchrony were maximal. Furthermore, the amplitude of EEG responses reflected differences in the capacity of dyads to synchronize their movements, offering a neurophysiologically grounded perspective for understanding perceptual-motor mechanisms underlying joint action. © 2019 Elsevier Inc.
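The frequency-tagging logic above can be illustrated with a minimal sketch (not the authors' analysis pipeline): the response to each flicker frequency, and to their intermodulation product (5.7 + 7.7 = 13.4 Hz), is read out as the amplitude of the EEG spectrum at the corresponding frequency bin. The signal here is simulated; all variable names are illustrative.

```python
import numpy as np

def tagged_amplitudes(eeg, fs, freqs):
    """Return the FFT amplitude of `eeg` at each frequency in `freqs` (Hz)."""
    n = len(eeg)
    spectrum = np.abs(np.fft.rfft(eeg)) / n                 # one-sided amplitude spectrum
    fft_freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    # pick the bin closest to each tagging frequency
    return {f: spectrum[np.argmin(np.abs(fft_freqs - f))] for f in freqs}

# Simulated "EEG": two tagged responses plus their intermodulation product
rng = np.random.default_rng(0)
fs = 500
t = np.arange(0, 10, 1 / fs)                                # 10 s at 500 Hz
eeg = (np.sin(2 * np.pi * 5.7 * t)                          # response tagged to one finger
       + np.sin(2 * np.pi * 7.7 * t)                        # response tagged to the other
       + 0.3 * np.sin(2 * np.pi * 13.4 * t)                 # 5.7 + 7.7 Hz integration marker
       + 0.1 * rng.standard_normal(len(t)))                 # broadband noise

amps = tagged_amplitudes(eeg, fs, [5.7, 7.7, 13.4])
```

With 10 s of data the spectral resolution is 0.1 Hz, so each tagging frequency falls exactly on an FFT bin, which is why frequency tagging can separate responses that are close together in frequency.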

    Dynamic modulation of beta band cortico-muscular coupling induced by audio-visual rhythms

    Human movements often spontaneously fall into synchrony with auditory and visual environmental rhythms. Related behavioral studies have shown that motor responses are automatically and unintentionally coupled with external rhythmic stimuli. However, the neurophysiological processes underlying such motor entrainment remain largely unknown. Here we used electroencephalography (EEG) and electromyography (EMG) to investigate the modulation of neural and muscular activity induced by periodic audio and/or visual sequences. The sequences were presented at either 1 Hz or 2 Hz while participants maintained constant finger pressure on a force sensor. The results revealed that although there was no change of amplitude in participants’ EMG in response to the sequences, the synchronization between EMG and EEG recorded over motor areas in the beta (12–40 Hz) frequency band was dynamically modulated, with maximal coherence occurring about 100 ms before each stimulus. These modulations in beta EEG–EMG motor coherence were found for the 2 Hz audio-visual sequences, confirming at a neurophysiological level the enhancement of motor entrainment with multimodal rhythms that fall within preferred perceptual and movement frequency ranges. Our findings identify beta band cortico-muscular coupling as a potential underlying mechanism of motor entrainment, further elucidating the nature of the link between sensory and motor systems in humans.
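Cortico-muscular coupling of the kind described above is commonly quantified as magnitude-squared coherence between EEG and EMG, averaged within the beta band. A minimal sketch with simulated signals (not the authors' pipeline; a shared 25 Hz component stands in for a common cortical drive):

```python
import numpy as np
from scipy.signal import coherence

def beta_band_coherence(eeg, emg, fs, band=(12, 40)):
    """Mean magnitude-squared EEG-EMG coherence within `band` (Hz)."""
    f, cxy = coherence(eeg, emg, fs=fs, nperseg=fs)   # 1 s segments -> 1 Hz frequency bins
    mask = (f >= band[0]) & (f <= band[1])
    return cxy[mask].mean()

rng = np.random.default_rng(0)
fs, n = 1000, 30_000                                  # 30 s at 1 kHz
drive = np.sin(2 * np.pi * 25 * np.arange(n) / fs)    # shared 25 Hz (beta) rhythm
eeg = drive + 0.5 * rng.standard_normal(n)            # cortical signal + noise
emg = drive + 0.5 * rng.standard_normal(n)            # muscular signal + noise

coupled = beta_band_coherence(eeg, emg, fs)
uncoupled = beta_band_coherence(eeg, rng.standard_normal(n), fs)
```

Because coherence is estimated by averaging over segments, independent signals still yield a small positive bias, which is why the coupled and uncoupled conditions are compared rather than tested against zero.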

    Does movement amplitude of a co-performer affect individual performance in musical synchronization?

    Interpersonal coordination in musical ensembles often involves multisensory cues, with visual information about body movements supplementing co-performers’ sounds. Previous research on the influence of movement amplitude of a visual stimulus on basic sensorimotor synchronization has shown mixed results. Uninstructed visuomotor synchronization seems to be influenced by the amplitude of a visual stimulus, but instructed visuomotor synchronization is not. While music performance presents a special case of visually mediated coordination, involving both uninstructed (spontaneously coordinating ancillary body movements with co-performers) and instructed (producing sound on a beat) forms of synchronization, the underlying mechanisms might also support rhythmic interpersonal coordination in the general population. We asked whether visual cue amplitude would affect nonmusicians’ synchronization of sound and head movements in a musical drumming task designed to be accessible regardless of musical experience. Given the mixed prior results, we considered two competing hypotheses. H1: higher-amplitude visual cues will improve synchronization. H2: visual cues of different amplitudes will have no effect on synchronization. Participants observed a human-derived motion capture avatar with three levels of movement amplitude, or a still image of the avatar, while drumming along to the beat of tempo-changing music. The moving avatars were always timed to match the music. We measured temporal asynchrony (drumming relative to the music), predictive timing, ancillary movement fluctuation, and cross-spectral coherence of ancillary movements between the participant and avatar. The competing hypotheses were tested using conditional equivalence testing. This method involves using a statistical equivalence test in the event that standard hypothesis tests show no differences. Our results showed no statistical differences across visual cue types. Therefore, we conclude that there is not a strong effect of visual stimulus amplitude on instructed synchronization.
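The conditional equivalence procedure mentioned above can be sketched as follows: run a standard difference test first, and only if it is non-significant run two one-sided tests (TOST) against pre-specified equivalence bounds. This is a generic illustration, not the authors' analysis code; the function name and the bounds are illustrative.

```python
import numpy as np
from scipy import stats

def conditional_equivalence(x, y, bounds=(-0.5, 0.5), alpha=0.05):
    """Difference test first; if non-significant, TOST within `bounds` (raw units)."""
    t, p = stats.ttest_ind(x, y)
    if p < alpha:
        return "different"
    diff = np.mean(x) - np.mean(y)
    se = np.sqrt(np.var(x, ddof=1) / len(x) + np.var(y, ddof=1) / len(y))
    df = len(x) + len(y) - 2
    # Two one-sided tests: reject both "diff <= lower" and "diff >= upper"
    p_lower = stats.t.sf((diff - bounds[0]) / se, df)
    p_upper = stats.t.cdf((diff - bounds[1]) / se, df)
    return "equivalent" if max(p_lower, p_upper) < alpha else "inconclusive"
```

The third outcome matters: a non-significant difference test alone is "inconclusive", and only a significant equivalence test licenses the conclusion that any effect is smaller than the bounds.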

    Neural tracking of visual periodic motion

    Periodicity is a fundamental property of biological systems, including human movement systems. Periodic movements support displacements of the body in the environment as well as interactions and communication between individuals. Here, we use electroencephalography (EEG) to investigate the neural tracking of visual periodic motion, and more specifically, the relevance of spatiotemporal information contained at and between their turning points. We compared EEG responses to visual sinusoidal oscillations versus nonlinear Rayleigh oscillations, which are both typical of human movements. These oscillations contain the same spatiotemporal information at their turning points but differ between turning points, with Rayleigh oscillations having an earlier peak velocity, a feature shown to increase an individual's capacity to produce accurately synchronized movements. EEG analyses highlighted the relevance of spatiotemporal information between the turning points by showing that the brain precisely tracks subtle differences in velocity profiles, as indicated by earlier EEG responses for Rayleigh oscillations. The results suggest that the brain is particularly responsive to velocity peaks in visual periodic motion, supporting their role in conveying behaviorally relevant timing information at a neurophysiological level. The results also suggest key functions of neural oscillations in the Alpha and Beta frequency bands, particularly in the right hemisphere. Together, these findings provide insights into the neural mechanisms underpinning the processing of visual periodic motion and the critical role of velocity peaks in enabling proficient visuomotor synchronization.

    Neural tracking of the musical beat is enhanced by low-frequency sounds

    Music makes us move, and using bass instruments to build the rhythmic foundations of music is especially effective at inducing people to dance to periodic pulse-like beats. Here, we show that this culturally widespread practice may exploit a neurophysiological mechanism whereby low-frequency sounds shape the neural representations of rhythmic input by boosting selective locking to the beat. Cortical activity was captured using electroencephalography (EEG) while participants listened to a regular rhythm or to a relatively complex syncopated rhythm conveyed either by low tones (130 Hz) or high tones (1236.8 Hz). We found that cortical activity at the frequency of the perceived beat is selectively enhanced compared with other frequencies in the EEG spectrum when rhythms are conveyed by bass sounds. This effect is unlikely to arise from early cochlear processes, as revealed by auditory physiological modeling, and was particularly pronounced for the complex rhythm requiring endogenous generation of the beat. The effect is likewise not attributable to differences in perceived loudness between low and high tones, as a control experiment manipulating sound intensity alone did not yield similar results. Finally, the privileged role of bass sounds is contingent on allocation of attentional resources to the temporal properties of the stimulus, as revealed by a further control experiment examining the role of a behavioral task. Together, our results provide a neurobiological basis for the convention of using bass instruments to carry the rhythmic foundations of music and to drive people to move to the beat.

    huSync: a model and system for the measure of synchronization in small groups: a case study on musical joint action

    Human communication entails subtle non-verbal modes of expression, which can be analyzed quantitatively using computational approaches and thus support human sciences. In this paper we present huSync, a computational framework and system that utilizes trajectory information extracted using pose estimation algorithms from video sequences to quantify synchronization between individuals in small groups. The system is exploited to study interpersonal coordination in musical ensembles. Musicians communicate with each other through sounds and gestures, providing nonverbal cues that regulate interpersonal coordination. huSync was applied to recordings of concert performances by a professional instrumental ensemble playing two musical pieces. We examined effects of different aspects of musical structure (texture and phrase position) on interpersonal synchronization, which was quantified by computing phase locking values of head motion for all possible within-group pairs. Results indicate that interpersonal coupling was stronger for polyphonic textures (ambiguous leadership) than homophonic textures (clear melodic leader), and this difference was greater in early portions of phrases than at endings (where coordination demands are highest). Results were cross-validated against an analysis of audio features, showing links between phase locking values and event density. This research produced a system, huSync, that can quantify synchronization in small groups and is sensitive to dynamic modulations of interpersonal coupling related to ambiguity in leadership and coordination demands, in standard video recordings of naturalistic human group interaction. huSync enabled a better understanding of the relationship between interpersonal coupling and musical structure, thus enhancing collaborations between human and computer scientists.
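The phase locking value (PLV) used above to quantify pairwise coupling can be sketched in a few lines: extract each signal's instantaneous phase via the Hilbert transform, then take the magnitude of the mean phase-difference vector (1 = constant phase lag, 0 = no consistent relation). This is a generic illustration on simulated head-sway signals, not the huSync implementation.

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(a, b):
    """PLV between two movement time series: 1 = fixed phase lag, 0 = none."""
    phase_a = np.angle(hilbert(a - np.mean(a)))       # instantaneous phase of signal a
    phase_b = np.angle(hilbert(b - np.mean(b)))       # instantaneous phase of signal b
    return np.abs(np.mean(np.exp(1j * (phase_a - phase_b))))

fs = 200
t = np.arange(0, 10, 1 / fs)
head1 = np.sin(2 * np.pi * 2 * t)                     # 2 Hz head sway, musician 1
head2 = np.sin(2 * np.pi * 2 * t + 0.8)               # same tempo, constant phase lag
plv = phase_locking_value(head1, head2)               # near 1: strong coupling
```

Note that PLV is insensitive to the size of the lag itself: two musicians consistently offset by a fixed fraction of a beat score as high as two moving perfectly in phase.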

    What would be Usain Bolt's 100-meter sprint world record without Tyson Gay? Unintentional interpersonal synchronization between the two sprinters

    Despite the desire of athletes to separate themselves from their competitors, to be faster or better, their performance is often influenced by those they are competing with. Here we show that the unintentional or spontaneous interpersonal synchronization of athletes' movements may partially account for such performance modifications. We examined the 100-m final of Usain Bolt in the 12th IAAF World Championship in Athletics (Berlin, 2009) in which he broke the world record, and demonstrate that Usain Bolt and Tyson Gay, who ran side-by-side throughout the race, spontaneously and intermittently synchronized their steps. This finding demonstrates that even the most optimized individual motor skills can be modulated by the simple presence of another individual via interpersonal coordination processes. It extends previous research by showing that the hard constraints of individual motor performance do not overwhelm the occurrence of spontaneous interpersonal synchronization, and opens promising new research directions for better understanding and improving athletic performance.

    Computation of continuous relative phase and modulation of frequency of human movement

    Continuous relative phase measures have been used to quantify the coordination between different body segments in several activities. Our aim in this study was to investigate how the methods traditionally used to compute the continuous phase of human rhythmic movement are affected by modulations of frequency. We compared the continuous phase computed with the traditional method derived from the position-velocity phase plane and with the Hilbert transform. The methods were tested using sinusoidal signals with a modulation of frequency between or within cycles. Our results showed that the continuous phase computed with the phase-plane method exhibits oscillations in the phase time series that are not expected for a sinusoidal signal, and that the continuous phase is overestimated with the Hilbert transform. We proposed a new method that produces a correct estimation of continuous phase by using half-cycle estimations of frequency to normalize the phase planes prior to calculating phase angles. The findings of the current study have important implications for computing continuous relative phase when investigating human movement coordination.
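The two traditional methods compared above can be sketched as follows for a constant-frequency sinusoid: the phase-plane method takes the angle in the position-velocity plane (with velocity normalised by the angular frequency), and the Hilbert method takes the angle of the analytic signal. This is a minimal illustration of the two baselines, not the authors' proposed half-cycle normalisation method; sign conventions vary across the literature, and here phase is defined so that it increases over time.

```python
import numpy as np
from scipy.signal import hilbert

def phase_plane(x, fs, freq):
    """Continuous phase from the position-velocity plane; velocity is
    normalised by the (assumed constant) angular frequency 2*pi*freq."""
    v = np.gradient(x) * fs                           # numerical velocity (units/s)
    return np.unwrap(np.arctan2(-v / (2 * np.pi * freq), x))

def phase_hilbert(x):
    """Continuous phase from the analytic signal."""
    return np.unwrap(np.angle(hilbert(x - np.mean(x))))

fs = 200
t = np.arange(0, 5, 1 / fs)
x = np.cos(2 * np.pi * 1.0 * t)                       # 1 Hz oscillation, 5 cycles

phi_pp = phase_plane(x, fs, 1.0)                      # advances ~2*pi per cycle
phi_h = phase_hilbert(x)
```

For a pure sinusoid both phases advance by 2π per cycle; the problems described in the abstract arise when the frequency is modulated, because the phase-plane normalisation then uses the wrong frequency for part of the signal, which is what the proposed half-cycle frequency estimation corrects.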