
    Instrumental Music Influences Recognition of Emotional Body Language

    In everyday life, emotional events are perceived by multiple sensory systems. Research has shown that recognition of emotions in one modality is biased towards the emotion expressed in a simultaneously presented but task-irrelevant modality. In the present study, we combine visual and auditory stimuli that convey similar affective meaning but have a low probability of co-occurrence in everyday life. Dynamic face-blurred whole-body expressions of a person grasping an object while expressing happiness or sadness are presented in combination with fragments of happy or sad instrumental classical music. Participants were instructed to categorize the emotion expressed by the visual stimulus. The results show that recognition of body language is influenced by the auditory stimuli. These findings indicate that crossmodal influences, as previously observed for audiovisual speech, can also be obtained from the ignored auditory modality to the attended visual modality in audiovisual stimuli that consist of whole bodies and music.

    Emotional Voice and Emotional Body Postures Influence Each Other Independently of Visual Awareness

    Multisensory integration may occur independently of visual attention, as previously shown with compound face-voice stimuli. In two experiments, we investigated whether the perception of whole body expressions and the perception of voices influence each other when observers are not aware of seeing the bodily expression. In the first experiment, participants categorized masked happy and angry bodily expressions while ignoring congruent or incongruent emotional voices. The onset asynchrony between target and mask varied from −50 to +133 ms. The results show that the congruency between the emotion in the voice and the bodily expression influences audiovisual perception independently of the visibility of the stimuli. In the second experiment, participants categorized the emotional voices, combined with masked bodily expressions, as fearful or happy. This experiment showed that bodily expressions presented outside visual awareness still influence prosody perception. Our experiments show that audiovisual integration between bodily expressions and affective prosody can take place outside of, and independently of, visual awareness.

    Proper Motion Study of the Magellanic Clouds using SPM material

    Absolute proper motions are determined for stars and galaxies to V=17.5 over a 450 square-degree area that encloses both Magellanic Clouds. The proper motions are based on photographic and CCD observations of the Yale/San Juan Southern Proper Motion program, which span a baseline of 40 years. Multiple, local relative proper motion measures are combined in an overlap solution using photometrically selected Galactic Disk stars to define a global relative system that is then transformed to absolute using external galaxies and Hipparcos stars to tie into the ICRS. The resulting catalog of 1.4 million objects is used to derive the mean absolute proper motions of the Large Magellanic Cloud and the Small Magellanic Cloud: (\mu_\alpha\cos\delta,\mu_\delta)_{LMC}=(+1.89,+0.39)\pm(0.27,0.27)\;{\rm mas\,yr}^{-1} and (\mu_\alpha\cos\delta,\mu_\delta)_{SMC}=(+0.98,-1.01)\pm(0.30,0.29)\;{\rm mas\,yr}^{-1}. These mean motions are based on best-measured samples of 3822 LMC stars and 964 SMC stars. A dominant portion (0.25 mas yr^{-1}) of the formal errors is due to the estimated uncertainty in the inertial system of the Hipparcos Catalog stars used to anchor the bright end of our proper motion measures. A more precise determination can be made for the proper motion of the SMC {\it relative} to the LMC: (\mu_\alpha\cos\delta,\mu_\delta)_{SMC-LMC}=(-0.91,-1.49)\pm(0.16,0.15)\;{\rm mas\,yr}^{-1}. This differential value is combined with measurements of the proper motion of the LMC taken from the literature to produce new absolute proper-motion determinations for the SMC, as well as an estimate of the total velocity difference of the two clouds to within \pm 54\;{\rm km\,s}^{-1}. Comment: 50 pages (referee format), 13 figures. Accepted for publication in A
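
    The link between the 0.25 mas yr^{-1} systematic floor and a velocity uncertainty of the order of the quoted ±54 km s^{-1} follows from the standard proper-motion-to-velocity conversion, v_t = 4.74 μ d. Below is a minimal sketch in Python, assuming a round LMC distance of ~50 kpc (the distance is an assumption for illustration, not a value quoted in the abstract):

        import math

        # Illustrative sketch (not from the paper): tangential velocity from a proper motion,
        # v_t [km/s] = 4.74047 * mu [arcsec/yr] * d [pc]. The constant is 1 AU/yr in km/s.
        # The LMC distance used below is an assumed round value.

        K = 4.74047  # km/s per (arcsec/yr) per pc

        def tangential_velocity(mu_mas_per_yr, distance_kpc):
            """Tangential velocity in km/s for mu in mas/yr at a distance in kpc."""
            # The mas->arcsec (1e-3) and kpc->pc (1e3) factors cancel, so K is unchanged.
            return K * mu_mas_per_yr * distance_kpc

        # The 0.25 mas/yr systematic floor at an assumed ~50 kpc corresponds to roughly 59 km/s,
        # the same order as the +/-54 km/s total-velocity uncertainty quoted above.
        print(tangential_velocity(0.25, 50.0))

        # Magnitude of the quoted relative SMC-LMC proper motion, about 1.75 mas/yr
        print(math.hypot(-0.91, -1.49))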

    Music Alters Visual Perception

    Background: Visual perception is not a passive process: in order to process visual input efficiently, the brain actively uses previous knowledge (e.g., memory) and expectations about what the world should look like. However, perception is not only influenced by previous knowledge. The perception of emotional stimuli, in particular, is influenced by the emotional state of the observer. In other words, how we perceive the world depends not only on what we know of the world, but also on how we feel. In this study, we further investigated the relation between mood and perception. Methods and Findings: We had observers perform a difficult stimulus detection task in which they had to detect schematic happy and sad faces embedded in noise. Mood was manipulated by means of music. We found that observers were more accurate in detecting faces congruent with their mood, corroborating earlier research. However, in trials in which no actual face was presented, observers made a significant number of false alarms. The content of these false alarms, or illusory percepts, was strongly influenced by the observers' mood. Conclusions: As illusory percepts are believed to reflect the content of internal representations that the brain employs during top-down processing of visual input, we conclude that top-down modulation of visual processing is not purely predictive in nature: mood, in this case manipulated by music, may also directly alter the way we perceive the world.
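
    The task described above is a standard detection design: face-present trials yield hits and misses, while noise-only trials yield false alarms and correct rejections. A minimal signal-detection sketch, using made-up counts rather than data from the study, illustrates how such rates separate sensitivity from response bias per mood condition:

        from statistics import NormalDist

        # Illustrative signal-detection sketch; the trial counts are invented, not study data.
        z = NormalDist().inv_cdf  # inverse of the standard normal CDF

        def dprime_and_criterion(hits, misses, false_alarms, correct_rejections):
            """Sensitivity d' and response criterion c from a 2x2 detection outcome table."""
            hit_rate = hits / (hits + misses)
            fa_rate = false_alarms / (false_alarms + correct_rejections)
            d_prime = z(hit_rate) - z(fa_rate)              # discriminability of face vs. noise
            criterion = -0.5 * (z(hit_rate) + z(fa_rate))   # bias toward reporting a face
            return d_prime, criterion

        # Hypothetical mood-congruent vs. mood-incongruent conditions
        print(dprime_and_criterion(70, 30, 20, 80))  # higher hit rate -> larger d'
        print(dprime_and_criterion(55, 45, 20, 80))  # same false-alarm rate, lower sensitivity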

    Measurement of Angular Distributions and R= sigma_L/sigma_T in Diffractive Electroproduction of rho^0 Mesons

    Production and decay angular distributions were extracted from measurements of exclusive electroproduction of the rho^0(770) meson over a range in the negative four-momentum squared of the virtual photon, 0.5 < Q^2 < 4 GeV^2, and in the photon-nucleon invariant mass, 3.8 < W < 6.5 GeV. The experiment was performed with the HERMES spectrometer, using a longitudinally polarized positron beam and a ^3He gas target internal to the HERA e^{\pm} storage ring. The event sample combines rho^0 mesons produced incoherently off individual nucleons and coherently off the nucleus as a whole. The distributions in one production angle and in two angles describing the rho^0 -> pi+ pi- decay yielded measurements of eight elements of the spin-density matrix, including one that had not been measured before. The results are consistent with the dominance of helicity-conserving amplitudes and natural parity exchange. The improved precision achieved at 47 GeV reveals evidence for an energy dependence in the ratio R of the longitudinal to transverse cross sections at constant Q^2. Comment: 15 pages, 15 embedded figures, LaTeX for SVJour(epj) document class. Revision: Fig. 15 corrected, recent data added to Figs. 10, 12, 14, 15; minor changes to text.
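
    The ratio R = sigma_L/sigma_T in the title is commonly obtained from the longitudinal spin-density matrix element r^04_00 and the virtual-photon polarization parameter epsilon, assuming s-channel helicity conservation: R = r^04_00 / [epsilon (1 - r^04_00)]. A minimal sketch of that standard relation, with placeholder numbers rather than HERMES results:

        # Standard relation used to extract R = sigma_L/sigma_T from the spin-density
        # matrix element r^04_00, assuming s-channel helicity conservation.
        # The inputs below are placeholders, not measurements from this paper.

        def R_from_r00(r00, eps):
            """R = sigma_L/sigma_T from r^04_00 and the photon polarization parameter eps."""
            return r00 / (eps * (1.0 - r00))

        print(R_from_r00(r00=0.5, eps=0.8))  # placeholder inputs give R = 1.25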

    How Bodies and Voices Interact in Early Emotion Perception

    Successful social communication draws strongly on the correct interpretation of others' body and vocal expressions. Both can provide emotional information and often occur simultaneously, yet their interplay has hardly been studied. Using electroencephalography, we investigated the temporal development underlying their neural interaction in auditory and visual perception. In particular, we tested whether this interaction qualifies as true integration, following multisensory integration principles such as inverse effectiveness. Emotional vocalizations were embedded in either low or high levels of noise and presented with or without video clips of matching emotional body expressions. In both high and low noise conditions, a reduction in auditory N100 amplitude was observed for audiovisual stimuli. However, only under high noise did the N100 peak earlier in the audiovisual than in the auditory condition, suggesting facilitatory effects as predicted by the inverse effectiveness principle. Similarly, we observed earlier N100 peaks in response to emotional compared to neutral audiovisual stimuli. This was not the case in the unimodal auditory condition. Furthermore, suppression of beta-band oscillations (15–25 Hz), primarily reflecting biological motion perception, was modulated 200–400 ms after the vocalization. For emotional stimuli, the difference in suppression between audiovisual and auditory-only stimuli was larger under high than under low noise, whereas no such difference was observed for neutral stimuli. This observation is in accordance with the inverse effectiveness principle and suggests a modulation of integration by emotional content. Overall, the results show that ecologically valid, complex stimuli such as combined body and vocal expressions are effectively integrated very early in processing.
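
    A simple way to picture the inverse-effectiveness test described above is to compare the relative audiovisual gain over the auditory-alone response at low versus high noise. The sketch below uses hypothetical response values, not data from the study:

        # Illustrative inverse-effectiveness check (hypothetical values, not study data):
        # the relative audiovisual gain should be larger when the unimodal signal is weaker.

        def multisensory_gain(av_response, a_response):
            """Relative change of the audiovisual response with respect to auditory alone."""
            return (av_response - a_response) / a_response

        # Hypothetical N100 amplitudes (arbitrary units); AV stimulation reduces the N100.
        low_noise = multisensory_gain(av_response=4.5, a_response=5.0)    # -10%
        high_noise = multisensory_gain(av_response=3.0, a_response=4.5)   # about -33%
        print(low_noise, high_noise)  # larger relative AV effect under high noise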