
    Is Vivaldi smooth and takete? Non-verbal sensory scales for describing music qualities

    Studies on the perception of music qualities (such as induced or perceived emotions, performance styles, or timbre nuances) make extensive use of verbal descriptors. Although many authors have noted that particular music qualities can hardly be described by means of verbal labels, few studies have tried alternatives. This paper explores the use of non-verbal sensory scales to represent different perceived qualities in Western classical music. Musically trained and untrained listeners were asked to listen to six musical excerpts in major key and to evaluate them from a sensorial and semantic point of view (Experiment 1). The same design (Experiment 2) was used with musically trained and untrained listeners who listened to six musical excerpts in minor key. The overall findings indicate that subjects' ratings on non-verbal sensory scales are consistent throughout, and the results support the hypothesis that sensory scales can convey specific sensations that cannot be described verbally, offering interesting insights into the relationship between music and other sensorial experiences. Such research can foster applications in music information retrieval and the exploration of timbre spaces, together with experiments in different musical cultures and contexts.
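
    A minimal illustrative sketch of how the reported consistency of listeners' ratings on one sensory scale could be checked; the Cronbach's-alpha computation, the 6 x 20 ratings matrix and all values are assumptions for illustration, not the authors' analysis.

        import numpy as np

        def cronbach_alpha(ratings: np.ndarray) -> float:
            """Cronbach's alpha, treating each listener as an 'item' rated over excerpts."""
            k = ratings.shape[1]                     # number of listeners
            item_vars = ratings.var(axis=0, ddof=1)  # variance of each listener's ratings
            total_var = ratings.sum(axis=1).var(ddof=1)
            return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

        rng = np.random.default_rng(0)
        ratings = rng.normal(5.0, 1.0, size=(6, 20))  # 6 excerpts x 20 listeners (fake data)
        print(f"alpha = {cronbach_alpha(ratings):.2f}")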

    Shared neural representations of tactile roughness intensities by somatosensation and touch observation using an associative learning method

    Previous human fMRI studies have reported activation of somatosensory areas not only during actual touch but also during touch observation. However, it has remained unclear how the brain encodes visually evoked tactile intensities. Using an associative learning method, we investigated neural representations of roughness intensities evoked by (a) tactile exploration and (b) visual observation of tactile exploration. Moreover, we explored (c) modality-independent neural representations of roughness intensities using a cross-modal classification method. Case (a) showed significant decoding performance in the anterior cingulate cortex (ACC) and the supramarginal gyrus (SMG), while in case (b) the bilateral posterior parietal cortices, the inferior occipital gyrus, and the primary motor cortex were identified. Case (c) revealed shared neural activity patterns in the bilateral insula, the SMG, and the ACC. Interestingly, the insular cortices were identified only in the cross-modal classification, suggesting a potential role in modality-independent tactile processing. We further examined correlations of confusion patterns between behavioral and neural similarity matrices for each region. Significant correlations were found solely in the SMG, reflecting a close relationship between SMG activity and the perception of roughness intensity. The present findings may deepen our understanding of the brain mechanisms underlying the intensity perception of tactile roughness.
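
    A minimal sketch of the cross-modal classification idea described above: train a decoder on voxel patterns from tactile trials and test it on patterns from touch-observation trials. The linear SVM, array shapes and random data are illustrative assumptions, not the authors' pipeline.

        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import LinearSVC

        rng = np.random.default_rng(1)
        n_trials, n_voxels = 60, 200
        X_tactile = rng.normal(size=(n_trials, n_voxels))  # ROI patterns from tactile runs (placeholder)
        X_visual = rng.normal(size=(n_trials, n_voxels))   # ROI patterns from touch-observation runs (placeholder)
        y = rng.integers(0, 3, size=n_trials)              # three roughness intensities

        clf = make_pipeline(StandardScaler(), LinearSVC())
        clf.fit(X_tactile, y)          # train on one modality
        acc = clf.score(X_visual, y)   # test on the other (cross-modal decoding)
        print(f"cross-modal decoding accuracy: {acc:.2f} (chance is about 0.33)")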

    Multisensory Analysis of Consumer-Product Interaction During Ceramic Tile Shopping Experiences

    The need to design products that engage several senses has been increasingly recognised by design and marketing professionals. Many works analyse the impact of sensory stimuli on the hedonic, cognitive, and emotional responses of consumers, as well as on their satisfaction and intention to purchase. However, there is much less information about the utilitarian dimension related to a sensory, non-reflective analysis of the tangible elements of the experience, the sequential role played by different senses, and their relative importance. This work analyses the sensorial dimension of consumer interactions in shops. Consumers were filmed in two ceramic tile shops and their behaviour was analysed according to a previously validated checklist. The sequence of actions, their frequency of occurrence, and the duration of inspections were recorded, and consumers were classified according to their sensory exploration strategies. Results show that inspection patterns are intentional but shift throughout the interaction. Considering the whole sequence, vision is the dominant sense, followed by touch. However, sensory dominance varies throughout the sequence, with dominance differences appearing both between the senses and within the senses of vision, touch and audition. Cluster analysis classified consumers into two groups: those who were more interactive and those who were visual and passive evaluators. These results are important for understanding consumer interaction patterns, which senses are involved (including their importance and hierarchy), and which sensory properties of tiles are evaluated during the shopping experience. Moreover, this information is crucial for setting design guidelines to improve sensory interactions and bridge sensory demands with product features. The Spanish Ministry of Culture and Education funded this research with Grant No. PSE-020400-2007-1. Artacho Ramírez, MÁ.; Alcantara Alcover, E.; Martínez, N. (2020). Multisensory Analysis of Consumer-Product Interaction During Ceramic Tile Shopping Experiences. Multisensory Research, 33(2), 213-249. https://doi.org/10.1163/22134808-20191391
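
    A minimal sketch of the kind of cluster analysis mentioned above, grouping consumers by their sensory exploration behaviour. The choice of k-means, k = 2, and the feature set are assumptions for illustration only; the abstract does not specify the algorithm or features used.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(2)
        # Hypothetical per-consumer features:
        # touch actions per visit, visual-only inspections, mean inspection time (s)
        features = rng.normal(loc=[8.0, 15.0, 4.0], scale=[3.0, 5.0, 2.0], size=(40, 3))

        X = StandardScaler().fit_transform(features)
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
        print(np.bincount(labels))  # sizes of the two groups (e.g. interactive vs visual/passive)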

    Bodily awareness and novel multisensory features

    According to the decomposition thesis, perceptual experiences resolve without remainder into their different modality-specific components. Contrary to this view, I argue that certain cases of multisensory integration give rise to experiences representing features of a novel type. Through the coordinated use of bodily awareness (understood here as encompassing both proprioception and kinaesthesis) and the exteroceptive sensory modalities, one becomes perceptually responsive to spatial features whose instances could not be represented by any of the contributing modalities functioning in isolation. I develop an argument for this conclusion focusing on two cases: 3D shape perception in haptic touch and experiencing an object's egocentric location in crossmodally accessible, environmental space.

    Roughness and spatial density judgments on visual and haptic textures using virtual reality

    The purpose of this study was to investigate multimodal visual-haptic texture perception using virtual reality techniques. Participants judged a broad range of textures according to their roughness and their spatial density under visual, haptic, and visual-haptic exploration conditions. Participants were well able to differentiate between the textures using both the roughness and the spatial density judgments. When provided with visual-haptic textures, participants' performance increased for both judgments, indicating sensory combination of visual and haptic texture information. Most interestingly, performance for density and roughness judgments did not differ significantly, indicating that these estimates are highly correlated. This may be due to the fact that our textures were generated in virtual reality using a haptic point-force display (PHANToM). In conclusion, it seems that the roughness and spatial density estimates were based on the same physical parameters, given the display technology used.
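
    One common way to model the sensory combination suggested here (not claimed by the abstract itself) is maximum-likelihood cue integration, in which visual and haptic estimates are weighted by their reliabilities and the predicted bimodal variance is lower than either unimodal variance. A minimal sketch with made-up unimodal thresholds:

        # Hypothetical unimodal discrimination thresholds (standard deviations)
        sigma_v, sigma_h = 0.20, 0.25
        w_v = sigma_h**2 / (sigma_v**2 + sigma_h**2)  # reliability-based visual weight
        w_h = 1.0 - w_v                               # haptic weight
        sigma_vh = (sigma_v**2 * sigma_h**2 / (sigma_v**2 + sigma_h**2)) ** 0.5
        print(f"weights: visual={w_v:.2f}, haptic={w_h:.2f}; predicted bimodal sigma={sigma_vh:.3f}")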

    Mapping a multi-sensory identity territory at the early design stage

    This article presents a kansei design methodology. It is placed at the very beginning of the design process and aims to influence the following steps in order to improve the user's understanding and experience of the designed product. The experimentation subtly combines the design-thinking approach of learning by doing with the quantitative approach of kansei engineering. The research presented is based on the results of a previous study that defined the semantic and emotional scope of future hybrid cars for the European market using visual stimuli. This kansei design methodology creates and assesses multi-sensory atmospheres in order to provide tangible directions composed of vision, touch, hearing and smell stimuli. From the cognitive and affective responses of the 42 participants, we were able to detail three directions for future car interiors that aim to enrich styling design briefs and to influence design strategies such as the management of the different grades. The research presented here was supported by the Kansei Design department of Toyota Motor Europe (TME-KD); this collaboration also brought an industrial context to the work.

    Crossmodal audio and tactile interaction with mobile touchscreens

    Touchscreen mobile devices often use cut-down versions of desktop user interfaces, placing high demands on the visual sense that may prove awkward in mobile settings. The research in this thesis addresses the problems encountered by situationally impaired mobile users by using crossmodal interaction to exploit the abundant similarities between the audio and tactile modalities. By making information available to both senses, users can receive the information in the most suitable way, without having to abandon their primary task to look at the device. This thesis begins with a literature review of related work, followed by a definition of crossmodal icons: two icons may be considered crossmodal if and only if they provide a common representation of data which is accessible interchangeably via different modalities. Two experiments investigated possible parameters for use in crossmodal icons, with results showing that rhythm, texture and spatial location are effective. A third experiment focused on learning multi-dimensional crossmodal icons and the extent to which this learning transfers between modalities. The results showed identification rates of 92% for three-dimensional audio crossmodal icons when trained on the tactile equivalents, and identification rates of 89% for tactile crossmodal icons when trained on the audio equivalents. Crossmodal icons were then incorporated into a mobile touchscreen QWERTY keyboard. Experiments showed that keyboards with audio or tactile feedback produce fewer errors and greater text-entry speeds than standard touchscreen keyboards. The next study examined how environmental variables affect user performance with the same keyboard. The data showed that each modality performs differently under varying levels of background noise or vibration, and the exact levels at which these performance decreases occur were established. The final study involved a longitudinal evaluation of a touchscreen application, CrossTrainer, focusing on longitudinal effects on performance with audio and tactile feedback, the impact of context on performance, and personal modality preference. The results show that crossmodal audio and tactile icons are a valid method of presenting information to situationally impaired mobile touchscreen users, with recognition rates of 100% over time. The thesis concludes with a set of guidelines on the design and application of crossmodal audio and tactile feedback to enable application and interface designers to employ such feedback in their systems.
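
    A hypothetical sketch of the crossmodal-icon idea defined above: one abstract message encoded with the shared parameters rhythm, texture and spatial location, renderable as either audio or tactile feedback. The class and field names are illustrative only and are not taken from the thesis.

        from dataclasses import dataclass

        @dataclass
        class CrossmodalIcon:
            rhythm: tuple[float, ...]  # pulse durations in seconds, shared across modalities
            texture: str               # e.g. "smooth" or "rough" (timbre or vibration waveform)
            location: str              # spatial position, e.g. "left", "centre", "right"

            def render_audio(self) -> str:
                return f"audio: {self.texture} timbre, rhythm {self.rhythm}, panned {self.location}"

            def render_tactile(self) -> str:
                return f"tactile: {self.texture} vibration, rhythm {self.rhythm}, actuator {self.location}"

        icon = CrossmodalIcon(rhythm=(0.1, 0.1, 0.3), texture="rough", location="left")
        print(icon.render_audio())    # same underlying message...
        print(icon.render_tactile())  # ...accessible via either modality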

    It Sounds Cool: Exploring Sonification of Mid-Air Haptic Textures Exploration on Texture Judgments, Body Perception, and Motor Behaviour

    Ultrasonic mid-air haptic technology allows textured surfaces to be perceptually rendered onto the user's hand. Unlike real textured surfaces, however, mid-air haptic feedback lacks the implicit multisensory cues needed to reliably infer a texture's attributes (e.g., its roughness). In this paper, we combined mid-air haptic textures with congruent sound feedback to investigate how sonification could influence people's (1) explicit judgments of texture attributes, (2) explicit sensations of their own hand, and (3) implicit motor behavior during haptic exploration. Our results showed that audio cues (presented alone or combined with haptics) influenced participants' judgments of the texture attributes (roughness, hardness, moisture and viscosity), produced some hand sensations (the feeling of having a smoother, softer, looser, more flexible, colder, wetter and more natural hand), and changed participants' speed (moving faster or slower) while exploring the texture. We then conducted a principal component analysis to better understand and visualize these results, and conclude with a short discussion on how audio-haptic associations can be used to create embodied experiences in emerging application scenarios in the metaverse.
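
    A minimal sketch of the principal component analysis step mentioned above, applied to made-up participant ratings of the four texture attributes named in the abstract; the data, standardisation and two-component choice are assumptions, not the authors' analysis.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(3)
        attributes = ["roughness", "hardness", "moisture", "viscosity"]
        ratings = rng.normal(4.0, 1.5, size=(30, len(attributes)))  # fake rating-scale data

        pca = PCA(n_components=2)
        scores = pca.fit_transform(StandardScaler().fit_transform(ratings))
        print("explained variance ratios:", pca.explained_variance_ratio_)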

    Tactual perception: a review of experimental variables and procedures

    This paper reviews the literature on tactual perception. Throughout this review we highlight some of the most relevant variables in the touch literature: the interaction between touch and other senses; the type of stimuli, from abstract stimuli such as vibrations to two- and three-dimensional stimuli, also considering concrete stimuli such as the relation between familiar and unfamiliar stimuli or the haptic perception of faces; the type of participants, separating studies with blind participants from studies with children and adults, and including an analysis of sex differences in performance; and finally, the type of tactile exploration, considering conditions of active and passive touch, the relevance of movement in touch, and the relation between exploration and time. This review intends to present an organised overview of the main variables in touch experiments, attending to the main findings described in the literature, in order to guide the design of future work on tactual perception and memory. This work was funded by the Portuguese Foundation for Science and Technology through PhD scholarship SFRH/BD/35918/2007.