30 research outputs found

    Physics-based Concatenative Sound Synthesis of Photogrammetric models for Aural and Haptic Feedback in Virtual Environments

    We present a novel physics-based concatenative sound synthesis (CSS) methodology for congruent interactions across the physical, graphical, aural and haptic modalities in Virtual Environments. Navigation in aural and haptic corpora of annotated audio units is driven by user interactions with highly realistic photogrammetry-based models in a game engine, where automated and interactive positional, physics and graphics data are supported. From a technical perspective, the contribution extends existing CSS frameworks by avoiding the mapping or mining of annotation data onto real-time performance attributes, while guaranteeing degrees of novelty and variation for the same gesture
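
    A minimal, hypothetical sketch of the kind of corpus navigation the abstract describes: audio units annotated with descriptors are selected from physics-engine data (e.g. impact energy), with a small random element so the same gesture does not always return the same unit. The class, function and field names below are illustrative assumptions, not the paper's framework.

```python
# Toy nearest-neighbour unit selector for concatenative sound synthesis,
# driven by physics data from a game engine. Illustrative sketch only.
import random
from dataclasses import dataclass

@dataclass
class AudioUnit:
    samples: list          # raw audio samples for this unit/grain
    impact_energy: float   # annotated descriptor, e.g. normalised 0..1
    material: str          # annotated material label, e.g. "wood"

def select_unit(corpus, material, impact_energy, k=3):
    """Pick one of the k closest units for the requested material.

    Choosing randomly among the k best (rather than always the single
    nearest) gives some novelty and variation for the same gesture.
    """
    candidates = [u for u in corpus if u.material == material]
    candidates.sort(key=lambda u: abs(u.impact_energy - impact_energy))
    return random.choice(candidates[:k]) if candidates else None

# Example: a collision reported by the physics engine, with its relative
# velocity already mapped to an impact-energy descriptor.
corpus = [AudioUnit([0.0], 0.2, "wood"), AudioUnit([0.0], 0.7, "wood")]
unit = select_unit(corpus, material="wood", impact_energy=0.65)
```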

    Bipedal steps in the development of rhythmic behavior in humans

    We contrast two related hypotheses of the evolution of dance: H1: maternal bipedal walking influenced the fetal experience of sound and associated movement patterns; H2: the human transition to bipedal gait produced more isochronous/predictable locomotion sound, resulting in early music-like behavior associated with the acoustic advantages conferred by moving bipedally in pace. The cadence of walking is around 120 beats per minute, similar to the tempo of dance and music. Human walking displays long-term constancies. Dyads often subconsciously synchronize their steps. The major amplitude component of the step is a distinctly produced beat. Human locomotion influences, and interacts with, emotions, and passive listening to music activates brain motor areas. Across dance genres, footwork is most often performed in time with the musical beat. Brain development is largely shaped by early sensory experience, with hearing developed from week 18 of gestation. Newborns react to sounds, melodies, and rhythmic poems to which they have been exposed in utero. If the sound and vibrations produced by the footfalls of a walking mother are transmitted to the fetus in coordination with the cadence of the motion, a connection between isochronous sound and rhythmical movement may develop. The rhythmical sounds of human maternal locomotion differ substantially from those of nonhuman primates, while the maternal heartbeat is likely to have a similar isochronous character across primates, suggesting a relatively more influential role of footfall in the development of rhythmic/musical abilities in humans. Associations of gait, music, and dance are numerous. The apparent absence of musical and rhythmic abilities in nonhuman primates, which display little bipedal locomotion, corroborates that bipedal gait may be linked to the development of rhythmic abilities in humans. Bipedal stimuli in utero may primarily boost ontogenetic development, while the acoustical advantage hypothesis proposes a mechanism in phylogenetic development

    Internet of Things for fall prediction and prevention

    The Internet of Things (IoT) is enabling a breakthrough in the development of innovative healthcare systems. IoT-based health applications are expected to change the paradigm traditionally followed by physicians for diagnosis, by moving health monitoring from the clinical environment to the domestic space. Fall avoidance is a field where the continuous monitoring allowed by an IoT-based framework offers tremendous benefits to the user; falls are highly damaging due to both physical and psychological injuries. Currently, the most promising approaches to reducing fall injuries are fall prediction, which strives to predict a fall before its occurrence, and fall prevention, which assesses balance and muscle strength through clinical functional tests. In this context, the IoT-based framework provides real-time emergency notification as soon as a fall is predicted, mid-term analysis of the monitored activities, and data logging for long-term analysis by clinical experts. This approach gives experts more information for estimating the risk of a future fall and for suggesting proper exercises
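
    A minimal sketch (not the paper's system) of the three time scales the abstract describes for an IoT fall-avoidance framework: immediate alerts when a fall is predicted, a mid-term window of recent activity, and long-term data logging for clinicians. The threshold, field names and alert channel are hypothetical.

```python
# Illustrative pipeline for one wearable-sensor reading.
import json, time
from collections import deque

FALL_RISK_THRESHOLD = 0.8             # assumed score above which we alert
recent_activity = deque(maxlen=1000)  # mid-term window of sensor readings

def handle_reading(reading, risk_model, log_file):
    """Process one sensor reading from a wearable device."""
    recent_activity.append(reading)

    # Long-term analysis: append every reading to persistent storage.
    with open(log_file, "a") as f:
        f.write(json.dumps({"t": time.time(), **reading}) + "\n")

    # Real-time prediction: notify immediately if the model flags high risk.
    risk = risk_model(reading)
    if risk > FALL_RISK_THRESHOLD:
        send_alert(f"High fall risk detected (score {risk:.2f})")

def send_alert(message):
    # Placeholder for an emergency notification channel (SMS, push, ...).
    print("ALERT:", message)

# Example: feed one accelerometer reading through a dummy risk model.
dummy_model = lambda r: min(1.0, abs(r["accel_z"]) / 20.0)
handle_reading({"accel_z": 19.0}, dummy_model, "fall_log.jsonl")
```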

    Determining sound, smell, and touch attributes in small urban parks using NGT

    The senses are always interrelated in complex ways in a landscape. Since multi-sensory integration is considered an influential factor in human environmental perception, engaging the non-visual (sound, smell, touch) factors can add to our knowledge. The literature review initially addressed the effectiveness of non-visual factors; its summary extracted Natural, Mechanical, Human and Instrumental attributes for sound; Natural, Environment-related and Human-body attributes for smell; and Natural and Furniture attributes for touch. Building on the literature, the research then applied the Nominal Group Technique (NGT) to determine more salient information regarding the availability of non-visual attributes in urban environments such as small urban parks. The findings offer some insight into design elements: the extracted information could help designers and policy makers propose applicable and appropriate combinations of elements in urban areas such as small urban parks, in order to establish a more successful environment

    As light as your footsteps: altering walking sounds to change perceived body weight, emotional state and gait

    An ever more sedentary lifestyle is a serious problem in our society. Enhancing people’s exercise adherence through technology remains an important research challenge. We propose a novel approach for a system supporting walking that draws on basic findings in neuroscience research. Our shoe-based prototype senses a person’s footsteps and alters in real time the frequency spectrum of the sound they produce while walking. The resulting sounds are consistent with those produced by either a lighter or a heavier body. Our user study showed that modified walking sounds change one’s own perceived body weight and lead to a related gait pattern. In particular, augmenting the high frequencies of the sound leads to the perception of having a thinner body and enhances the motivation for physical activity, inducing a more dynamic swing and a shorter heel strike. We discuss the opportunities and the questions our findings open
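
    A minimal sketch, assuming a block-based audio pipeline, of how high-frequency content of a recorded footstep could be boosted to make it sound lighter, as the abstract describes. This is an illustration rather than the authors' shoe-based prototype; the cutoff and gain values are arbitrary assumptions.

```python
# Boost frequencies above a cutoff in one block of footstep audio.
import numpy as np

def boost_high_frequencies(block, sample_rate, cutoff_hz=1000.0, gain_db=6.0):
    """Return the audio block with frequencies above cutoff_hz amplified."""
    spectrum = np.fft.rfft(block)
    freqs = np.fft.rfftfreq(len(block), d=1.0 / sample_rate)
    gain = 10.0 ** (gain_db / 20.0)
    spectrum[freqs >= cutoff_hz] *= gain
    return np.fft.irfft(spectrum, n=len(block))

# Example: one 1024-sample block of a footstep recording at 44.1 kHz.
footstep_block = np.random.randn(1024) * 0.1   # stand-in for real audio
lighter_block = boost_high_frequencies(footstep_block, 44100)
```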

    Using virtual reality for assessing the role of noise in the audio-visual design of an urban public space

    Sound planning is not often included in the urban design process, despite the well-known audio-visual interactions in human perception. A methodology is proposed to compare the overall appreciation of future renovation alternatives for urban public spaces using Virtual Reality technology. This method is applied to assess the role of noise in the overall appreciation of a walk on a bridge crossing a highway. The auralization is a dynamic 3D surround presentation based on B-format (ambisonics) recordings, filtered by means of full-wave numerical calculations that obtain the sound field behind noise barriers along the bridge's edge. Four different styles of visual street design, including different noise barrier heights, were evaluated in combination with the 4 corresponding predicted sound fields for their pleasantness by 71 normal-hearing participants on 4 separate days. Each day participants experienced all the visual environments with only one soundscape (to elude direct sound comparison), and sound was not mentioned at all in the first part of the experiment. Even in this non-focussed context, a statistically significant effect of the sound environment on the overall appreciation was found. In general, pleasantness increases as the traffic noise level is reduced, but the visual design has a stronger impact. When the soundscape was mentioned while introducing the evaluation, slightly lower (but statistically significantly different) pleasantness ratings were obtained. Instead of increasing noise barrier height, improving the visual design of a lower barrier seems more effective for increasing pleasantness. Visual designs including vegetation strongly outperform the others. The virtual experience was rated as immersive and realistic
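
    A minimal sketch (not the study's auralization pipeline) of one standard operation in ambisonics-based VR audio: rotating a first-order B-format signal about the vertical axis so that sources stay world-fixed while the listener's head turns. The (W, X, Y, Z) channel ordering and the counter-clockwise yaw convention are assumptions, not taken from the paper.

```python
# Yaw rotation of a first-order B-format (ambisonics) block.
import numpy as np

def rotate_bformat_yaw(w, x, y, z, yaw_rad):
    """Counter-rotate the sound field by the head yaw angle (radians,
    counter-clockwise when viewed from above)."""
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    x_rot = c * x + s * y
    y_rot = -s * x + c * y
    return w, x_rot, y_rot, z   # W and Z are unaffected by yaw rotation

# Example: a 1-second block of 4-channel B-format audio at 48 kHz.
block = np.random.randn(4, 48000) * 0.01
w2, x2, y2, z2 = rotate_bformat_yaw(*block, yaw_rad=np.deg2rad(30))
```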

    Mixed Reality Browsers and Pedestrian Navigation in Augmented Cities

    In this paper, we use a declarative format for positional audio with synchronization between audio chunks using SMIL. This format has been specifically designed for the type of audio used in AR applications, and the audio engine associated with it runs on mobile platforms (iOS, Android). Our MRB browser, called IXE, uses a format based on volunteered geographic information (OpenStreetMap), and OSM documents for IXE can be fully authored inside OSM editors like JOSM. This is in contrast with other AR browsers like Layar, Junaio and Wikitude, which use a Point of Interest (POI) based format with no notion of ways. This introduces a fundamental difference, and in some sense a duality relation, between IXE and the other AR browsers. In IXE, Augmented Virtuality (AV) navigation along a route (composed of ways) is central and AR interaction with objects is delegated to associated 3D activities. In AR browsers, navigation along a route is delegated to associated map activities and AR interaction with objects is central. IXE supports multiple tracking technologies and therefore allows both indoor navigation in buildings and outdoor navigation at the level of sidewalks. A first Android version of the IXE browser will be released at the end of 2013. Being based on volunteered geographic information, it will allow building accessible pedestrian networks in augmented cities
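
    A minimal sketch of why a way-based format matters for pedestrian navigation: unlike a flat list of POIs, an OpenStreetMap way carries an ordered sequence of nodes that can be followed as a route. The toy parser below is for illustration only and is not the IXE browser's actual format handling.

```python
# Extract ordered (lat, lon) polylines for highway=footway ways in OSM XML.
import xml.etree.ElementTree as ET

OSM_SNIPPET = """
<osm>
  <node id="1" lat="48.8500" lon="2.2900"/>
  <node id="2" lat="48.8510" lon="2.2910"/>
  <node id="3" lat="48.8520" lon="2.2920"/>
  <way id="10">
    <nd ref="1"/> <nd ref="2"/> <nd ref="3"/>
    <tag k="highway" v="footway"/>
  </way>
</osm>
"""

def footway_routes(osm_xml):
    """Return one ordered coordinate list per highway=footway way."""
    root = ET.fromstring(osm_xml)
    nodes = {n.get("id"): (float(n.get("lat")), float(n.get("lon")))
             for n in root.findall("node")}
    routes = []
    for way in root.findall("way"):
        tags = {t.get("k"): t.get("v") for t in way.findall("tag")}
        if tags.get("highway") == "footway":
            routes.append([nodes[nd.get("ref")] for nd in way.findall("nd")])
    return routes

print(footway_routes(OSM_SNIPPET))
```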

    As Light as You Aspire to Be: Changing body perception with sound to support physical activity

    Supporting exercise adherence through technology remains an important HCI challenge. Recent work showed that altering walking sounds leads people to perceive themselves as thinner/lighter and happier, and to walk more dynamically. While this novel approach shows potential for physical activity, it raises critical questions for technology design. We ran two studies in the context of exertion (gym step, stair climbing) to investigate how individual factors modulate the effect of sound and the duration of the after-effects. The results confirm that the effects of sound on body perception occur even in physically demanding situations and through ubiquitous wearable devices. We also show that the effect of sound interacted with participants’ body weight and masculinity/femininity aspirations, but not with gender. Additionally, changes in body perception did not hold once the feedback stopped; however, changes in body feelings and behaviour appeared to persist for longer. We discuss the results in terms of the malleability of body perception and highlight opportunities for supporting exercise adherence