27 research outputs found

    A phenomenological approach to investigate the pre-reflexive contents of consciousness during sound production

    Get PDF
    International audience
    This article describes a listening experiment based on elicitation interviews that aims to describe the conscious experience of a subject exposed to a perceptual stimulus. As opposed to traditional listening experiments, in which subjects are generally influenced by closed or suggestive questions and limited to predefined, forced choices, elicitation interviews make it possible to gain deeper insight into the listener's perception, in particular into the pre-reflexive content of conscious experience. Inspired by previous elicitation interviews during which subjects passively listened to sounds, this experiment is based on an active task in which the subjects were asked to reproduce a sound with a stylus on a graphic tablet that controlled a synthesis model. The reproduction was followed by an elicitation interview. The trace of the graphic gesture as well as the answers recorded during the interview were then analyzed. Results revealed that the subjects varied their focus between the evoked sound source and intrinsic sound properties, and also described sensations induced by the experience.

    SkyPole

    No full text
    A method for locating the north celestial pole from the skylight polarization pattern

    Influence of Music on Perceived Emotions in Film

    No full text
    "Influence of Music on Perceived Emotions in Film" received an Honorable Mention award at the AES 153rd Convention. Honorable Mention awards are given only to the top 3 papers in overall reviewer score, with additional input from the AES publications committee.
    International audience
    Film music plays a core role in film production and reception: it not only contributes to the film's aesthetics and creativity, but also affects viewers' experience and enjoyment. Film music composers often aim to serve the film narrative, immerse viewers in the setting and story, convey clues and, importantly, act on their emotions. Yet how film music influences viewers remains poorly understood. We conducted a perceptual study to analyse the impact of music on the perception of emotions in film. We developed an online interface for time-based emotion annotation of audio/video media clips based on the two-dimensional Valence/Arousal (VA) model. Participants reported their perceived emotions over time in the VA space for three media conditions: film scene presented without sound (video only), film music presented without video (audio only), and film scene with accompanying music and sound effects (both video and audio modalities). Sixteen film clips were selected, four for each of four genres (action & drama, romance, comedy, and horror). Thirty-eight participants completed the study (12 females and 26 males from many countries, average age: 28.9). Density scatter plots are used to visualise the spread of emotion ratings in the VA space and differences across media conditions and film clips. Results from linear mixed-effect models show significant effects of the audiovisual media condition and film genre on VA ratings, in line with previous results by Parke et al. [1]. Perceived VA ratings across media conditions follow an almost linear relationship, increasing in strength in the following order: film alone, film with music/sound, music alone. We illustrate this effect by plotting the VA rating centre of mass across conditions. VA ratings for the film-alone condition are closer to the origin of the space than those of the two other media conditions, indicating that the inclusion of music yields stronger emotions (higher VA ratings). Certain individual factors (musical ability, familiarity, preference) also seem to impact the perception of arousal and valence while viewing films. Our online emotion annotation interface was overall well received, and suggestions to improve the display of reference emotion tags are discussed.
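The centre-of-mass comparison described above can be sketched in a few lines. This is a minimal illustration, not the authors' analysis code; the ratings array and condition encoding are hypothetical placeholders for the study's VA annotations.

```python
import numpy as np

# Hypothetical VA ratings: each row is (condition, valence, arousal),
# with conditions 0 = video only, 1 = video + audio, 2 = audio only.
ratings = np.array([
    [0, 0.1, 0.2],
    [0, -0.2, 0.1],
    [1, 0.4, 0.5],
    [1, 0.3, 0.6],
    [2, 0.7, 0.8],
    [2, 0.6, 0.7],
])

def va_centre_of_mass(ratings, condition):
    """Mean valence/arousal over all ratings for one media condition."""
    rows = ratings[ratings[:, 0] == condition]
    return rows[:, 1:].mean(axis=0)

for cond, name in enumerate(["video only", "video + audio", "audio only"]):
    v, a = va_centre_of_mass(ratings, cond)
    print(f"{name}: valence={v:.2f}, arousal={a:.2f}")
```

With real data, the distance of each condition's centre of mass from the VA origin summarises the reported ordering (film alone closest, music alone farthest).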

    Exploring Sound Perception through Vocal Imitations

    No full text
    International audience
    Understanding how sounds are perceived and interpreted is an important challenge for researchers dealing with auditory perception. The ecological approach to perception suggests that the salient perceptual information that enables a listener to recognize events through sounds is contained in specific structures called invariants. Identifying such invariants is of interest from a fundamental point of view, to better understand auditory perception, and is also useful for including perceptual considerations in the modelling and control of sounds. Among the different approaches used to identify perceptually relevant sound structures, vocal imitations are believed to bring a fresh perspective to the field. The main goal of this paper is to better understand how invariants are transmitted through vocal imitations. A sound corpus containing different types of known invariants, obtained from an existing synthesizer, was established. Participants took part in a test in which they were asked to imitate the sound corpus. A continuous and sparse model adapted to the specificities of vocal imitations was then developed and used to analyze the imitations. Results show that participants were able to highlight salient elements of the sounds which partially correspond to the invariants used in the sound corpus. This study also confirms that vocal imitations reveal how these invariants are transmitted through our perception, and offers promising perspectives on auditory investigations.

    Tour d'horizon des capteurs polarimétriques dédiés à la robotique mobile

    No full text
    National audience
    Polarized-light-based navigation is a new form of navigation that exploits bio-inspired or conventional sensors in which attributes of the sky's polarization pattern are computed for orientation or localization. This survey focuses on research progress in imaging and non-imaging sensor implementations for heading detection. Imaging methods rely on cameras equipped with optical filters (static or rotating) or wire-grid filters. Today, so-called division-of-focal-plane (DoFP) cameras are preferred, such as the SONY IMX250MZR camera [1], to detect symmetry axes or to match polarization patterns, in angle of polarization as well as in degree of polarization, in order to deduce a heading. The only optical compass available on the American market, SkyPass from Polaris Sensor Technologies, Inc., costs €36,000 and targets military applications [2]. An emerging alternative is to cap a colour camera with a wave plate, exploiting the properties of birefringent materials: the dependence of the retardance on ray incidence transforms the polarization state into shades of colour, using "S"-shaped retarder plates [3] or linear ones (Stellantis patent). Non-imaging methods rely on photodiodes topped with polarizers and use, for example, the biological model of Thomas Labhart (1988) implemented on board the AntBot robot [4].
    REFERENCES
    [1] C. Lane, D. Rode, and T. Rösgen, "Calibration of a polarization image sensor and investigation of influencing factors," Applied Optics, vol. 61, no. 6, pp. C37–C45, 2022.
    [2] L. M. Eshelman, A. M. Smith, K. M. Smith, and D. B. Chenault, "Unique navigation solution utilizing sky polarization signatures," in Polarization: Measurement, Analysis, and Remote Sensing XV, vol. 12112, p. 1211203, SPIE, 2022.
    [3] Y. Fan, R. Zhang, Z. Liu, and J. Chu, "A skylight orientation sensor based on s-waveplate and linear polarizer for autonomous navigation," IEEE Sensors Journal, vol. 21, no. 20, pp. 23551–23557, 2021.
    [4] J. Dupeyroux, J. R. Serres, and S. Viollet, "AntBot: A six-legged walking robot able to home like desert ants in outdoor environments," Science Robotics, vol. 4, no. 27, p. eaau0307, 2019.
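The angle and degree of linear polarization that these DoFP sensors measure follow directly from the four micro-polarizer intensities (0°, 45°, 90°, 135°) via the linear Stokes parameters. A minimal sketch of that standard computation, not tied to any particular camera's SDK:

```python
import numpy as np

def aolp_dolp(i0, i45, i90, i135):
    """Angle and degree of linear polarization from the four
    micro-polarizer intensities of a DoFP sensor (e.g. per super-pixel)."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)   # total intensity
    s1 = i0 - i90                         # Stokes Q
    s2 = i45 - i135                       # Stokes U
    aolp = 0.5 * np.arctan2(s2, s1)       # angle of linear polarization (rad)
    dolp = np.sqrt(s1**2 + s2**2) / s0    # degree of linear polarization [0, 1]
    return aolp, dolp

# Fully linearly polarized light aligned with the 0° polarizer:
aolp, dolp = aolp_dolp(1.0, 0.5, 0.0, 0.5)  # aolp = 0 rad, dolp = 1
```

The same formulas apply per pixel to full images when the arguments are NumPy arrays.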

    SkyPole: a geolocation algorithm based on polarized vision without using astronomical ephemerides

    No full text
    International audience
    Global Navigation Satellite Systems (GNSS) are widely used due to their easy access to outdoor GNSS signals and their spatial precision. However, such systems are sensitive to jamming and spoofing. Simple and robust navigation strategies can be found in animals, deprived by essence of any GNSS system. Studies have shown that animals like bees or ants utilize the sky's polarization pattern for navigation. We recently proposed a method inspired by migratory birds, which calibrate their magnetic compass through the celestial rotation of night stars or the daytime polarization pattern. By considering the temporal properties of the sky's polarization pattern as relevant navigation information, we developed a bio-inspired method to find the geographical north bearing and the observer's latitude, requiring only skylight polarization observations during the day. To reduce the noise susceptibility of our method, we added a pre-processing step using a least-squares method based on skylight polarization models, and a segmentation process based on a convolutional autoencoder neural network trained with simulated data.

    SkyPole - A bio-inspired polarimetric compass for GPS-denied navigation

    No full text
    National audience
    Global Navigation Satellite Systems (GNSS) are widely used due to their precision. However, such systems are susceptible to jamming and spoofing. As an alternative, bio-inspired methods have been developed. Studies have shown that some insects, particularly desert ants, utilize the sky's polarization pattern for navigation. The use of solar ephemerides combined with the estimation of the sun's position through the polarization pattern enables direct geographical positioning. Nevertheless, animals do not have access to these ephemerides, and their use of the polarization pattern as a navigation reference remains poorly understood. We propose here an alternative method inspired by migratory birds, which calibrate their magnetic compass through the celestial rotation of night stars or the daytime polarization pattern. Similar to Brines, we consider the temporal properties of the sky's polarization pattern as relevant navigation information. We developed a bio-inspired method to find the geographical north bearing and the observer's latitude, requiring only skylight polarization observations, provided here by a commercial polarimetric camera. Skylight is mostly linearly polarized and characterized by two parameters: the angle of linear polarization (AoLP) and the degree of linear polarization (DoLP). Our method consists of processing only skylight DoLP images taken at different instants, in order to find the north celestial pole (NCP) from temporal invariances of the DoLP pattern. Then, the geographical north bearing (true north) and the observer's latitude Φ can be deduced from the NCP's coordinates. To experimentally validate our approach, we utilized a polarimetric camera (PHX050SQC from Lucid Vision Labs, sensor ref. Sony IMX250MYR) equipped with a 185° fisheye lens (FE185C57HA-1, Fujinon), which was situated on the roof of the INT Lab at La Timone, Marseille, France (43.2870135°N, 5.4034385°E). Our validation yielded a Mean Absolute Error (MAE) of 2.6° in azimuth and 3.8° in latitude.
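The core idea — locate the NCP as the sky point whose DoLP stays constant over time, then read latitude and true north off its image coordinates — can be sketched as follows. This is a simplified illustration, not the published pipeline: it assumes an upward-facing camera with an equidistant fisheye projection (a fixed `px_per_deg` scale) and a known image centre, and it uses raw per-pixel temporal variance where the paper adds least-squares and autoencoder pre-processing.

```python
import numpy as np

def find_ncp(dolp_stack):
    """Return the (row, col) of the pixel whose DoLP varies least over
    time - a proxy for the north celestial pole, around which the sky's
    polarization pattern rotates.  dolp_stack: (T, H, W) DoLP images."""
    variance = dolp_stack.var(axis=0)
    return np.unravel_index(np.argmin(variance), variance.shape)

def latitude_and_north(row, col, centre, px_per_deg):
    """For an upward-facing fisheye, the observer's latitude equals the
    elevation of the NCP above the horizon, and true north is the NCP's
    azimuth.  Assumes an equidistant projection: radius ∝ zenith angle."""
    dy, dx = row - centre[0], col - centre[1]
    radial_px = np.hypot(dx, dy)
    elevation = 90.0 - radial_px / px_per_deg          # deg above horizon
    azimuth = np.degrees(np.arctan2(dx, -dy)) % 360.0  # 0° = image "up"
    return elevation, azimuth
```

The fact that latitude equals the celestial pole's elevation is classical spherical astronomy; the bio-inspired contribution is recovering that pole from DoLP temporal invariance alone, without ephemerides.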

    Multi-Touch Interactions for Model-Based Sonification

    Get PDF
    Tünnermann R, Hermann T. Multi-Touch Interactions for Model-Based Sonification. In: Aramaki M, Kronland-Martinet R, Ystad S, Jensen K, eds. Proceedings of the 15th International Conference on Auditory Display. Copenhagen, Denmark: Re:New – Digital Arts Forum; 2009.
    This paper presents novel interaction modes for Model-Based Sonification (MBS) via a multi-touch interface. We first lay out details of the constructed multi-touch surface. This is followed by a description of the Data Sonogram Sonification Model and how it is implemented using the system. Modifications of the original sonification model, such as the limited space scans, are described and discussed with sonification examples. Videos showing examples of interaction are provided for various data sets. Beyond Data Sonograms, the presented system provides a basis for the implementation of known and novel sonification models. We discuss the available interaction modes with multi-touch surfaces and how these interactions can be profitably used to control spatial and non-spatial sonification models.
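The Data Sonogram model excites the data with a virtual shock wave expanding from the interaction point: each data point contributes a sonic event when the wavefront reaches it. A minimal sketch of that onset-scheduling principle, with placeholder data and a hypothetical wave speed (the paper's actual implementation and sound synthesis are not reproduced here):

```python
import numpy as np

def data_sonogram_events(points, touch, speed=1.0):
    """Schedule one sonic event per data point: a spherical shock wave
    expands from the touch point, and each point's onset time is its
    distance from the touch divided by the wave speed.  Returns
    (onset_time, point_index) pairs sorted by onset."""
    dists = np.linalg.norm(points - touch, axis=1)
    onsets = dists / speed
    order = np.argsort(onsets)
    return [(float(onsets[i]), int(i)) for i in order]

# Three 2-D data points excited from the origin (a touch at (0, 0)):
pts = np.array([[3.0, 4.0], [0.0, 1.0], [6.0, 8.0]])
events = data_sonogram_events(pts, np.array([0.0, 0.0]))
# events -> [(1.0, 1), (5.0, 0), (10.0, 2)]
```

In an interactive setting each touch would trigger such a schedule, with the event's timbre or level derived from local data properties; limited space scans would simply restrict `points` to a neighbourhood of the touch.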