14 research outputs found

    Sound and Posture: an Overview of Recent Findings

    No full text
    Although it was long neglected, the sound and posture domain has aroused increasing interest in recent years. In this position paper, we present an overview of our recent findings in this field and put them in perspective with the literature. We bring evidence to support the view that spatial cues provided by auditory information can be integrated by humans for better postural control.

    The influence of horizontally rotating sound on standing balance

    No full text
    Postural control is known to result from the integration and processing of various sensory inputs by the central nervous system. Among these afferent inputs, the role of auditory information in postural regulation has been addressed in relatively few studies, which have led to conflicting results. The purpose of the present study was to investigate the influence of a rotating auditory stimulus, delivered by an immersive 3D sound spatialization system, on the standing posture of young subjects. The postural sway of 20 upright, blindfolded subjects was recorded using a force platform. Various sound source rotation velocities, each followed by sudden immobilization of the sound, were compared with two control conditions: no sound and a stationary sound source. The experiment showed that subjects reduced their body sway amplitude and velocity in the presence of rotating sound compared with the control conditions. The faster the sound source rotated, the greater the reduction in body sway. Moreover, disruption of postural regulation was observed as soon as the sound source was immobilized. These results suggest that auditory information cannot be neglected in postural control and that it acts as additional information influencing postural regulation.
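
    For context, body sway in such force-platform studies is typically quantified from the centre of pressure (COP). The following minimal Python sketch is not taken from the paper; the sampling rate and the exact metric definitions (sway range and mean sway velocity) are assumptions used for illustration only.

```python
# Sketch (assumed metrics, not the study's analysis code): standard postural-sway
# measures computed from force-platform centre-of-pressure (COP) data.
import numpy as np

def sway_metrics(cop_xy: np.ndarray, fs: float = 100.0) -> dict:
    """cop_xy: (N, 2) array of COP positions in mm (AP, ML); fs: sampling rate in Hz."""
    centred = cop_xy - cop_xy.mean(axis=0)                    # remove mean COP position
    amplitude = centred.max(axis=0) - centred.min(axis=0)     # sway range per axis (mm)
    path = np.linalg.norm(np.diff(cop_xy, axis=0), axis=1)    # distance between successive samples
    mean_velocity = path.sum() / (len(cop_xy) / fs)           # total sway path / duration (mm/s)
    return {"amplitude_ap_ml_mm": amplitude, "mean_velocity_mm_s": mean_velocity}

# Example with synthetic data: 60 s of slow sway sampled at 100 Hz.
t = np.arange(0, 60, 1 / 100.0)
cop = np.column_stack([3 * np.sin(2 * np.pi * 0.3 * t), 2 * np.sin(2 * np.pi * 0.4 * t)])
print(sway_metrics(cop))
```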

    Exploring the usability of sound strategies for guiding task: toward a generalization of sonification design

    No full text
    This article proposes a new Parameter Mapping Sonification approach intended to facilitate and generalize sonification design across applications. We first define the target as a concept enabling a general sonification strategy that is not limited to specific data types; this concept separates the sound from the information to be displayed. Rather than directly displaying data dimensions through the variation of a specific sound parameter, the approach displays the distance between a given data value and the requested value. We then present a taxonomy of sound strategies that allows the construction of several strategy types. Finally, several sound strategies are evaluated in a user experiment, and the taxonomy is discussed on the basis of users' guidance behavior during a guiding task.
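
    To illustrate the idea of distance-based parameter mapping, here is a minimal sketch under our own assumptions (a pure-tone pitch mapping chosen for simplicity, not one of the strategies evaluated in the article): the sound encodes the distance to the target rather than the raw data value.

```python
# Sketch (illustrative assumption, not the authors' implementation): the pitch of a
# short pure tone encodes the normalised distance between the current value and the
# target value, so the tone converges as the user approaches the target.
import numpy as np

def sonify_distance(value: float, target: float, max_distance: float,
                    fs: int = 44100, dur: float = 0.2) -> np.ndarray:
    """Return one short tone whose pitch rises as the distance to the target shrinks."""
    distance = min(abs(value - target) / max_distance, 1.0)   # normalised distance in [0, 1]
    freq = 220.0 + (1760.0 - 220.0) * (1.0 - distance)        # far -> 220 Hz, on target -> 1760 Hz
    t = np.arange(int(fs * dur)) / fs
    return 0.5 * np.sin(2 * np.pi * freq * t)

# Guidance loop example: the tone gets higher as the displayed value approaches the target.
for v in (10.0, 6.0, 3.0, 1.0, 0.1):
    tone = sonify_distance(v, target=0.0, max_distance=10.0)
```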

    Spatial Cues Provided by Sound Improve Postural Stabilization: Evidence of a Spatial Auditory Map?

    Get PDF
    It has long been suggested that sound plays a role in the postural control process. Few studies, however, have explored sound and posture interactions. The present paper focuses on the specific impact of audition on posture, seeking to determine the attributes of sound that may be useful for postural purposes. We investigated the postural sway of young, healthy blindfolded subjects in two experiments involving different static auditory environments. In the first experiment, we compared the effect on sway of a simple environment built from three static sound sources in two different rooms: a normal vs. an anechoic room. In the second experiment, the same auditory environment was enriched in various ways, including the ambisonics synthesis of an immersive environment, and subjects stood on two different surfaces: a foam vs. a normal surface. The results of both experiments suggest that the spatial cues provided by sound can be used to improve postural stability; the richer the auditory environment, the better this stabilization. We interpret these results by invoking the "spatial hearing map" theory: listeners build their own mental representation of their surrounding environment, which provides them with spatial landmarks that help them to better stabilize.
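
    For readers unfamiliar with ambisonics, the sketch below shows the general principle of first-order B-format encoding of a mono source at a given direction, the kind of representation used to synthesise an immersive scene before decoding to loudspeakers. It is illustrative only and makes no claim about the study's actual rendering chain or convention.

```python
# Sketch (generic first-order Ambisonics, traditional FuMa-style B-format):
# encode a mono signal as (W, X, Y, Z) channels for a static point source.
import numpy as np

def encode_b_format(signal: np.ndarray, azimuth_deg: float, elevation_deg: float):
    """Return (W, X, Y, Z) channels for a source at the given azimuth/elevation."""
    az, el = np.radians(azimuth_deg), np.radians(elevation_deg)
    w = signal / np.sqrt(2.0)               # omnidirectional component
    x = signal * np.cos(az) * np.cos(el)    # front-back
    y = signal * np.sin(az) * np.cos(el)    # left-right
    z = signal * np.sin(el)                 # up-down
    return w, x, y, z

# Example: a 1 s, 440 Hz source 30 degrees to the left, at ear height.
fs = 44100
s = np.sin(2 * np.pi * 440 * np.arange(fs) / fs)
W, X, Y, Z = encode_b_format(s, azimuth_deg=30.0, elevation_deg=0.0)
```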

    Sonification binaurale pour l'aide à la navigation

    No full text
    This manuscript presents an augmented reality system based on 3D sound and sonification, whose aim is to provide navigation assistance for visually impaired users. The design of this system was addressed along three axes. First, 3D sound generation via binaural synthesis is limited by the need for HRTF individualisation. A new method based on brain plasticity is established to adapt individuals to non-individual HRTFs using an audio-kinaesthetic platform, reversing the standard paradigm. This method has shown the potential for rapid adaptation of the auditory system to virtual auditory cues without the use of vision. Second, spatial data sonification is investigated in the context of a system for locating and grasping objects in the peripersonal space. Sound localization performance was examined by comparing real and virtual sound sources. On the basis of the results, a distance sonification method was developed with the aim of improving user performance. Rather than employing sonification by sound synthesis, the proposed method varies the parameters of an audio effect applied to a base sound; this allows the user to select and change the base sound without requiring additional learning. Finally, we present a new sonification method designed to answer end-user needs in terms of aesthetics and sonification customization. "Morphocons" are short audio units whose aim is the construction of a sound vocabulary based on the temporal evolution of sound. An identification test highlights the efficiency of morphocons for conveying the same information with various types of sounds.
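
    As a rough illustration of effect-based distance sonification, the sketch below maps distance to the cutoff of a low-pass filter applied to an arbitrary base sound. The choice of filter, cutoff range, and distance range are assumptions for this example; the thesis does not specify this particular mapping.

```python
# Sketch (assumed mapping, not the thesis implementation): distance sonification that
# varies an audio-effect parameter (here a low-pass cutoff) applied to a user-chosen
# base sound, so the sound gets "brighter" as the target gets closer.
import numpy as np
from scipy.signal import butter, lfilter

def sonify_distance_with_effect(base_sound: np.ndarray, distance_m: float,
                                max_distance_m: float = 5.0, fs: int = 44100) -> np.ndarray:
    """Map distance to a low-pass cutoff: near -> open filter, far -> muffled sound."""
    d = min(max(distance_m / max_distance_m, 0.0), 1.0)       # normalised distance
    cutoff_hz = 8000.0 * (1.0 - d) + 300.0 * d                # near: 8 kHz, far: 300 Hz
    b, a = butter(2, cutoff_hz / (fs / 2.0), btype="low")     # 2nd-order Butterworth
    return lfilter(b, a, base_sound)

# Example: the same base sound rendered at two distances.
fs = 44100
base = np.random.default_rng(0).uniform(-0.3, 0.3, fs)        # 1 s of noise as a stand-in base sound
near = sonify_distance_with_effect(base, distance_m=0.5, fs=fs)
far = sonify_distance_with_effect(base, distance_m=4.5, fs=fs)
```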

    Soma - Performance audio-tactile

    Get PDF

    Rapid Auditory System Adaptation Using a Virtual Auditory Environment

    No full text
    Various studies have highlighted plasticity of the auditory system driven by visual stimuli, which limits the trained field of perception to the visual field. The aim of the present study is to investigate auditory system adaptation using an audio-kinesthetic platform. Participants were placed in a Virtual Auditory Environment allowing the physical position of a virtual sound source to be associated with an alternate set of acoustic spectral cues, or Head-Related Transfer Functions (HRTFs), through the use of a tracked ball manipulated by the subject. This set-up has the advantage of not being limited to the visual field while offering a natural perception-action coupling through the constant awareness of one's hand position. Adaptation to non-individualized HRTFs was carried out through a spatial search game application. A total of 25 subjects participated: subjects presented with modified cues using non-individualized HRTFs and a control group using individually measured HRTFs, the latter to account for any learning effect due to the game itself. The training game lasted 12 minutes and was repeated over 3 consecutive days. Adaptation effects were measured with repeated localization tests. Results showed a significant performance improvement for vertical localization and a significant reduction in the front/back confusion rate after 3 sessions.
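
    The two outcome measures mentioned here, vertical localization error and front/back confusion rate, can be computed from target versus response directions roughly as in the sketch below. The response format and the 90-degree frontal-plane criterion are assumptions, not the study's actual analysis scripts.

```python
# Sketch (hypothetical data format): elevation error and front/back confusion rate
# computed from (azimuth, elevation) pairs, azimuth in (-180, 180] degrees.
import numpy as np

def localization_scores(targets, responses):
    """targets/responses: lists of (azimuth_deg, elevation_deg) pairs."""
    t = np.asarray(targets, dtype=float)
    r = np.asarray(responses, dtype=float)
    elevation_error = np.mean(np.abs(t[:, 1] - r[:, 1]))   # mean absolute elevation error (deg)
    # Front/back confusion: target and response fall on opposite sides of the frontal plane.
    front_t = np.abs(t[:, 0]) < 90.0
    front_r = np.abs(r[:, 0]) < 90.0
    confusion_rate = np.mean(front_t != front_r)
    return elevation_error, confusion_rate

targets = [(30, 0), (150, 10), (-45, 20), (-135, -10)]
responses = [(35, 5), (20, 15), (-40, 10), (-130, -5)]      # the second trial is a front/back confusion
print(localization_scores(targets, responses))
```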

    Perception de la verticale en présence d’indices d’orientation visuels ou sonores : vers une dépendance allocentrée ?

    Get PDF
    Subjective vertical in the presence of visual or auditory cues: towards an allocentric dependence? The psychological differentiation approach initiated by Witkin and Asch (1948) on the perception of verticality yielded the concept of field dependence, through which observers can be distinguished by their tendency to be influenced or not by the tilt of a visual frame when judging the direction of gravity (i.e., the subjective vertical [SV]). Since then, field dependence has mostly been considered a marker of a preferential sensitivity to visual information with respect to other sensory modalities (e.g., vestibular or somatosensory). This pilot study tackles the issue of field dependence in spatial perception from a novel perspective. We hypothesized that orientation cues issued from the same reference frame centred on the near surroundings (i.e., an allocentric reference frame) could lead to comparable distinctions between observers, whatever the sensory modality conveying these cues. We tested 23 participants on an SV adjustment task facing two allocentric scenes, one visual and one auditory. Our results show a strong correlation between SV settings in the two sensory conditions in which the allocentric scene was tilted. These findings suggest that individuals may differ in how they process spatial information issued from the same reference frame, irrespective of the sensory modality conveying that information. Received 14 December 2017, accepted 9 September 2019. Keywords: spatial perception, field dependence, reference frame, orientation, vision, audition.
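
    The reported analysis is, at its core, a between-condition correlation of SV settings across participants. The sketch below uses clearly labelled synthetic numbers, not the study's data, just to make the analysis concrete.

```python
# Sketch (synthetic data only): correlation between subjective-vertical (SV) settings
# obtained with a tilted visual scene and with a tilted auditory scene, across 23 participants.
import numpy as np

rng = np.random.default_rng(1)
sv_visual = rng.normal(loc=4.0, scale=3.0, size=23)              # SV tilt (deg), tilted visual scene
sv_auditory = 0.8 * sv_visual + rng.normal(scale=1.5, size=23)   # correlated auditory-scene settings
r = np.corrcoef(sv_visual, sv_auditory)[0, 1]                    # Pearson correlation across participants
print(f"Pearson r = {r:.2f}")
```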