41 research outputs found

    Validating Kinematic Displays for the Perception of Musical Performance

    Get PDF
    Human gestures carry characteristic meanings in communication and represent a link between intention and body. This paper describes a pilot study investigating the role of ancillary musical gestures in understanding musical meaning from the listener's standpoint. We conducted a perceptual experiment using motion-capture recordings of musicians. Participants were presented with video recordings and reconstructed point-light displays of music performances. By asking them to rate certain music-related parameters, we found that the abstract motions of the point-light displays yielded ratings similar to those of the real recordings. This suggests that body motion alone is sufficient to communicate certain musical impressions

    Perception and prediction of simple object interactions

    No full text
    For humans, it is useful to be able to visually detect an object's physical properties. One potentially important source of information is the way the object moves and interacts with other objects in the environment. Here, we use computer simulations of a virtual ball bouncing on a horizontal plane to study the correspondence between our ability to estimate the ball's elasticity and to predict its future path. Three experiments were conducted to address (1) perception of the ball's elasticity, (2) interaction with the ball, and (3) prediction of its trajectory. The results suggest that different strategies and information sources are used for passive perception versus actively predicting future behavior
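    The virtual stimulus described above can be illustrated with a minimal sketch (not the study's actual simulation code): a one-dimensional ball dropped under gravity onto a plane, where the ball's elasticity is modeled as a coefficient of restitution. Given that coefficient, the future path is fully determined, so an observer who estimated it correctly could in principle predict the trajectory. All names and parameter values here are illustrative assumptions.

```python
# Illustrative sketch of a bouncing-ball stimulus: 1-D motion under gravity
# with elasticity modeled as a coefficient of restitution e (0 < e < 1).

G = 9.81  # gravitational acceleration, m/s^2

def simulate_bounce(h0, e, dt=0.001, t_max=2.0):
    """Return (time, height) samples for a ball dropped from h0 with restitution e."""
    h, v, t = h0, 0.0, 0.0
    path = []
    while t < t_max:
        v -= G * dt          # semi-implicit Euler: update velocity first
        h += v * dt
        if h <= 0.0:         # impact: clamp to the plane, reverse and damp velocity
            h = 0.0
            v = -v * e
        path.append((t, h))
        t += dt
    return path

# A ball dropped from 1 m with e = 0.8 rebounds to roughly e^2 * 1 m = 0.64 m.
path = simulate_bounce(h0=1.0, e=0.8)
rebound_peak = max(h for t, h in path if t > 0.5)
```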

    The MPI Facial Expression Database — A Validated Database of Emotional and Conversational Facial Expressions

    Get PDF
    The ability to communicate is one of the core aspects of human life. For this, we use not only verbal but also nonverbal signals of remarkable complexity. Among the latter, facial expressions belong to the most important information channels. Despite the large variety of facial expressions we use in daily life, research on facial expressions has so far mostly focused on the emotional aspect. Consequently, most databases of facial expressions available to the research community also include only emotional expressions, neglecting the largely unexplored aspect of conversational expressions. To fill this gap, we present the MPI facial expression database, which contains a large variety of natural emotional and conversational expressions. The database contains 55 different facial expressions performed by 19 German participants. Expressions were elicited with the help of a method-acting protocol, which guarantees both well-defined and natural facial expressions. The method-acting protocol was based on everyday scenarios, which are used to define the necessary context information for each expression. All facial expressions are available in three repetitions, in two intensities, as well as from three different camera angles. A detailed frame annotation is provided, from which a dynamic and a static version of the database have been created. In addition to describing the database in detail, we also present the results of an experiment with two conditions that serve to validate the context scenarios as well as the naturalness and recognizability of the video sequences. Our results provide clear evidence that conversational expressions can be recognized surprisingly well from visual information alone. The MPI facial expression database will enable researchers from different research fields (including the perceptual and cognitive sciences, but also affective computing, as well as computer vision) to investigate the processing of a wider range of natural facial expressions

    Analysis and Synthesis of Facial Expressions Using Computer Graphics Animation and Psychophysics

    No full text
    The human face is one of the most ecologically relevant objects for visual perception. Although the face changes expressions constantly and in a variety of complex ways, we are able to interpret these with a quick glance at a face. In particular, facial motion plays a complex and important role in communication. It can be used, for example, to convey meaning, to express an emotion or to modify the meaning of what is said. My research is focused on what we can learn, using psychophysical methodologies, about the human visual system from the way faces move. I will attempt to develop a detailed cognitive model for the perception of expressions by exploring and differentiating the information channels contained in facial expressions. Here, I present the results of psychophysical experiments in which we manipulated real video sequences of facial expressions of different actors. In the first experiment, we scaled down the video sequences to find out how the recognition of an expression depends on the presented image size [2]. In a second set of experiments, Cunningham et al. selectively 'froze' portions of a face to produce an initial, systematic description of the parts of a face that are necessary and sufficient for the recognition of facial expressions [3]. Based on these experiments, I will outline future work in which we plan to use computer-animated faces [1]. This will allow us to produce realistic image sequences while retaining complete control over what occurs in the images (e.g., to finely alter temporal parameters such as the speed, acceleration, duration, or synchronization of facial motion). Finally, I want to propose a unifying framework of interpretation and manipulation for facial analysis and synthesis, which contains different, hierarchically organized levels of perception and simulation. Within this framework, we can systematically identify and analyze the information channels that are addressed by the cognitive experiments described above.
The results from this line of research are expected not only to shed light on perceptual mechanisms of expression recognition, but also to help improve computer animation in order to create perceptually consistent, realistic and believable conversational agents

    Using real-time 3D simulations to study the perception of elasticity

    No full text
    To know how an object will behave, it is useful to be able to visually infer its physical properties, such as its weight, hardness, or elasticity. One important source of information about an object is the way it moves and interacts with other objects in the scene. Here, we present two experiments in which subjects had to judge the elasticity of a ball from the way that it bounces. Previous research on this topic generally used simple 2D graphics to produce finely controllable, albeit not very realistic, simulations. With modern VR techniques, however, we can maintain the tight control over the parameters while creating highly realistic stimuli. Thus, we designed these psychophysical experiments using real-time 3D simulations of both simple and complex object interactions. Subjects viewed two of these simulations side by side and had to adjust the elasticity of the ball on the right to match the ball on the left. The entire simulation was shown either right-side-up or upside-down. Our results suggest that subjects generally rely on simple low-level image cues to match the elasticity: (1) For the simple bouncing events, subjects confounded elasticity with the height from which the ball was dropped; (2) Subjects were generally poor at matching elasticity for the complex bouncing events, in which the reliability of the low-level cues was minimized; (3) Performance was similar for right-side-up and for upside-down, suggesting a weak role of prior knowledge about gravity. We have found that using real-time VR simulations presents a number of unique challenges, and we will briefly review the successes and failures we encountered in the design and implementation of these experiments
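    The reported confound between elasticity and drop height follows from simple bounce physics: a ball with coefficient of restitution e dropped from height h rebounds to roughly e²·h, so the absolute rebound height is only an unambiguous cue when taken relative to the drop height. A minimal sketch of this relation (illustrative only, not the experiment's code):

```python
# Why absolute bounce height is an ambiguous cue to elasticity:
# each rebound peak is e^2 times the previous peak, so elasticity is
# only recoverable relative to the drop height.

import math

def rebound_heights(drop_height, e, n_bounces):
    """Successive peak heights of a ball with coefficient of restitution e."""
    heights = []
    h = drop_height
    for _ in range(n_bounces):
        h = e * e * h          # kinetic energy scales with e^2 per bounce
        heights.append(h)
    return heights

def elasticity_from_heights(h_drop, h_rebound):
    """Invert the relation: e = sqrt(h_rebound / h_drop)."""
    return math.sqrt(h_rebound / h_drop)

# Two balls with different elasticity produce the same first rebound height
# when their drop heights differ -- the low-level-cue confound reported above.
a = rebound_heights(drop_height=1.0, e=0.8, n_bounces=1)[0]
b = rebound_heights(drop_height=1.6, e=math.sqrt(0.4), n_bounces=1)[0]
```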

    Recognizing Physical Properties from the Motion of Objects: The Example of Elasticity

    No full text
    To determine or predict the behavior of an object, it is important to know some of its physical properties. An important source of information for inferring these properties is observing the object as it interacts with other objects. In this study, we conducted psychophysical experiments in which participants had to judge the elasticity of a ball from its motion behavior. For this, we used 3D simulation software to produce a controlled, physically correct rendering of the ball. In two different environments (free versus obstructed motion spaces, i.e., simple versus complex object interactions), participants had to adjust the elasticity of a ball so that it behaved like a "target" ball (match-to-sample). The results show that both simple visual image cues and cognitive influences (in the form of physical models) are integrated when judging elasticity. The weighting of these two information sources depends crucially on the complexity of the object interaction

    Mastering Identity Crises

    No full text
    With dial-up Internet connections, the provider assigns a new IP address on every dial-in. This quickly becomes a problem when such machines need to be reached from the Internet. Special nameserver services for changing IP addresses promise a remedy
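    The nameserver services described here work by having the client push its current address to the service whenever it changes. A minimal sketch of such a client, assuming a dyndns2-style HTTP update endpoint (the URL and hostname below are placeholders, not a real service):

```python
# Sketch of a dynamic-DNS update client. UPDATE_ENDPOINT is a hypothetical
# placeholder for a dyndns2-style service; no real service is contacted here.

from urllib.parse import urlencode

UPDATE_ENDPOINT = "https://dyndns.example.net/nic/update"  # hypothetical

def build_update_url(hostname, ip):
    """URL for a dyndns2-style update request (hostname + current address)."""
    return UPDATE_ENDPOINT + "?" + urlencode({"hostname": hostname, "myip": ip})

def needs_update(cached_ip, current_ip):
    """Only contact the service when the dial-up address actually changed."""
    return current_ip is not None and current_ip != cached_ip

# Example: after a re-dial the provider hands out a new address,
# so the client would send one update request and cache the new IP.
if needs_update("203.0.113.10", "203.0.113.77"):
    url = build_update_url("myhost.example.net", "203.0.113.77")
```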

    Music and Motion: How Music-Related Ancillary Body Movements Contribute to the Experience of Music

    No full text
    Expressive performer movements in musical performances represent implied levels of communication and can carry characteristic meanings of embodied human expressivity. This study investigated the contribution of ancillary body movements to the perception of musical performances. Using kinematic displays of four clarinetists, we conducted perceptual experiments in which participants were asked to rate specific music-related dimensions of the performance and the performer. Additionally, motions of particular body parts, such as movements of the arms and torso, as well as motion amplitudes of the whole body, were manipulated in the kinematic display. We found that manipulations of arm and torso movements affect the observers' ratings of the musicians less than manipulations concerning the movement of the whole body. The results suggest that the multimodal experience of musicians depends less on the players' particular body motion behaviors than on the players' overall relative motion characteristics

    Characteristics of the Speaking Voice Compared to Sustained Phonation in Voice Diagnostics

    No full text

    Temporal Changes in Vocal Performance Capacity in Different Vocal Loading Tests

    No full text