
    A transdisciplinary collaborative journey leading to sensorial clothing

    Recent science funding initiatives have enabled participants from a diverse array of disciplines to engage in common spaces for developing solutions for new wearables. These initiatives include collaborations between the arts and sciences, fields which have traditionally contributed very different forms of knowledge, methodology, and results. However, many such collaborations turn out to be science communication and dissemination activities that make no concrete contribution to technological innovation. Magic Lining, a transdisciplinary collaborative project involving artistic and scientific partners working in the fields of e-textile design, cognitive neuroscience, and human-computer interaction, creates a shared experiential knowledge space. This article focuses on the research question of how a transdisciplinary collaborative design process, involving material explorations, prototyping, first-person-perspective studies, and user studies, can lead to the creation of a garment that invites various perceptual and emotional responses in its wearer. The article reflects on the design journey, highlighting the transdisciplinary team's research-through-design experience and shared language for knowledge exchange. This process has revealed new research paths for the emerging field of 'sensorial clothing', combining the various team members' fields of expertise and resulting in a wearable prototype. This work was partially supported by the VERTIGO project as part of the STARTS program of the European Commission, based on technological elements from the project Magic Shoes (grant PSI2016-79004-R, Ministerio de Economía, Industria y Competitividad of Spain, AEI/FEDER). The work was also supported by the project Magic outFIT, funded by the Spanish Agencia Estatal de Investigación (PID2019-105579RB-I00/AEI/10.13039/501100011033). Aleksander Väljamäe's work was supported by the Estonian Research Council grant PUT1518, and Ana Tajadura-Jiménez's work was supported by the RYC-2014-15421 grant, Ministerio de Economía, Industria y Competitividad of Spain.

    Vestibular Contributions to the Sense of Body, Self, and Others

    There is increasing evidence that vestibular signals and the vestibular cortex are not only involved in oculomotor and postural control, but also contribute to higher-level cognition. Yet, despite recent efforts in the field, the exact location of the human vestibular cortex and its role in various perceptual, emotional, and cognitive processes remain debated. Here, we argue for a vestibular contribution to what is thought to fundamentally underlie human consciousness, i.e., the bodily self. We will present empirical evidence from various research fields to support our hypothesis of a vestibular contribution to aspects of the bodily self, such as basic multisensory integration, body schema, body ownership, agency, and self-location. We will argue that the vestibular system is especially important for global aspects of the self, most crucially for implicit and explicit spatiotemporal self-location. Furthermore, we propose a novel model of how vestibular signals could underlie not only the perception of the self but also the perception of others, thereby playing an important role in embodied social cognition.

    Multisensory self-motion processing in humans

    Humans obtain and process sensory information from various modalities to ensure successful navigation through the environment. While visual, vestibular, and auditory self-motion perception have been extensively investigated, studies on tactile self-motion perception are comparably rare. In my thesis, I investigated tactile self-motion perception and its interaction with the visual modality. In one of two behavioral studies, I analyzed the influence of a tactile heading stimulus introduced as a distractor on visual heading perception. In the second behavioral study, I analyzed visuo-tactile perception of self-motion direction (heading). In both studies, visual self-motion was simulated as forward motion over a 2D ground plane. Tactile self-motion was simulated by airflow towards the subjects' forehead, mimicking the experience of travel wind, e.g., during a bike ride. In the analysis of the subjects' perceptual reports, I focused on possible visuo-tactile interactions and applied different models to describe the integration of visuo-tactile heading stimuli. Lastly, in a functional magnetic resonance imaging (fMRI) study, I investigated neural correlates of visual and tactile perception of traveled distance (path integration) and its modulation by prediction and cognitive task demands.
    In my first behavioral study, subjects indicated perceived heading from unimodal visual (optic flow), unimodal tactile (tactile flow), or a combination of stimuli from both modalities simulating either congruent or incongruent heading (bimodal condition). In the bimodal condition, the subjects' task was to indicate visually perceived heading; hence, tactile stimuli were behaviorally irrelevant. In bimodal trials, I found a significant interaction of stimuli from both modalities: visually perceived heading was biased towards the tactile heading direction for offsets of up to 10° between the two heading directions.
    The relative weighting of stimuli from both modalities in the visuo-tactile interaction was examined in my second behavioral study. Subjects indicated perceived heading from unimodal visual, unimodal tactile, and bimodal trials. Here, in bimodal trials, stimuli from both modalities were presented as behaviorally relevant. By varying eye position relative to head position during stimulus presentation, possible influences of the different reference frames of the visual and tactile modalities were investigated. In different sensory modalities, incoming information is encoded relative to the reference system of the receiving sensory organ (e.g., relative to the retina in vision or relative to the skin in somatosensation). In unimodal tactile trials, heading perception was shifted towards eye position. In bimodal trials, varying head and eye position had no significant effect on perceived heading: subjects indicated perceived heading based on both the visual and the tactile stimulus, independently of the behavioral relevance of the tactile stimulus. In sum, the results of both studies suggest that the tactile modality plays a greater role in self-motion perception than previously thought.
    Besides the perception of travel direction (heading), information about traveled speed and duration is integrated to achieve a measure of the distance traveled (path integration). One previous behavioral study has shown that tactile flow can be used for the reproduction of travel distance (Churan et al., 2017). However, studies on neural correlates of tactile distance encoding in humans are lacking entirely.
    In my third study, subjects solved two path integration tasks from unimodal visual and unimodal tactile self-motion stimuli while brain activity was measured by means of fMRI. The two tasks differed in their cognitive task demands. In the first task, subjects replicated (Active trial) a previously observed traveled distance (Passive trial) (Reproduction task). In the second task, subjects traveled a self-chosen distance (Active trial), which was then recorded and played back to them (Passive trial) (Self task). Predictive coding theory postulates an internal model that creates predictions about sensory outcomes; mismatches between predictions and sensory input enable the system to sharpen future predictions (Teufel et al., 2018). Recent studies have suggested a synergistic interaction between prediction and cognitive demands that can reverse the attenuating effect of prediction. In my study, this hypothesis was tested by manipulating cognitive demands between the two tasks. For both tasks, Active trials compared to Passive trials showed BOLD enhancement in early sensory cortices and suppression of higher-order areas (e.g., the intraparietal lobule (IPL)). For both modalities, enhancement of early sensory areas might facilitate the task-solving processes at hand, thereby reversing the hypothesized attenuating effect of prediction. Suppression of the IPL indicates this area as an amodal comparator of predictions and incoming self-motion signals.
    In conclusion, I was able to show that tactile self-motion information, i.e., tactile flow, provides significant information for the processing of two key features of self-motion perception: heading and path integration. Neural correlates of tactile path integration were investigated by means of fMRI, showing similarities between visual and tactile path integration at early processing stages as well as shared neural substrates in higher-order areas located in the IPL. Future studies should further investigate the perception of different self-motion parameters in the tactile modality to extend the understanding of this less researched, but important, modality.
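
    A note on modelling: the thesis mentions applying "different models" to describe the integration of visuo-tactile heading stimuli but does not name them here. A common reference point for such analyses is reliability-weighted (maximum-likelihood) cue combination, in which each cue is weighted by its inverse variance. The sketch below illustrates that idea only; the function name and the example variances are hypothetical and not taken from the thesis.

```python
import math

def mle_heading_estimate(heading_vis, var_vis, heading_tac, var_tac):
    """Reliability-weighted (maximum-likelihood) combination of two heading cues.

    Each cue is weighted by its inverse variance; the combined estimate
    has a lower variance than either unimodal estimate.
    """
    w_vis = (1.0 / var_vis) / (1.0 / var_vis + 1.0 / var_tac)
    w_tac = 1.0 - w_vis
    heading_combined = w_vis * heading_vis + w_tac * heading_tac
    var_combined = (var_vis * var_tac) / (var_vis + var_tac)
    return heading_combined, var_combined

# Hypothetical example: visual heading 0 deg (SD 3 deg), tactile heading 10 deg (SD 8 deg).
est, var = mle_heading_estimate(0.0, 3.0 ** 2, 10.0, 8.0 ** 2)
print(f"combined heading ~ {est:.1f} deg, SD ~ {math.sqrt(var):.1f} deg")
```

    Under these assumed reliabilities the combined estimate is pulled a few degrees toward the tactile cue, which is qualitatively consistent with the reported bias toward tactile heading for offsets of up to 10°.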

    Haptics: Science, Technology, Applications

    This open access book constitutes the proceedings of the 13th International Conference on Human Haptic Sensing and Touch Enabled Computer Applications, EuroHaptics 2022, held in Hamburg, Germany, in May 2022. The 36 regular papers included in this book were carefully reviewed and selected from 129 submissions. They were organized in topical sections as follows: haptic science; haptic technology; and haptic applications.

    Design and Assessment of Vibrotactile Biofeedback and Instructional Systems for Balance Rehabilitation Applications.

    Sensory augmentation, a type of biofeedback, is a technique for supplementing or reinforcing native sensory inputs. In the context of balance-related applications, it provides users with additional information about body motion, usually with respect to the gravito-inertial environment. Multiple studies have demonstrated that biofeedback, regardless of the feedback modality (i.e., vibrotactile, electrotactile, auditory), decreases body sway during real-time use within a laboratory setting. However, in their current laboratory-based form, existing vibrotactile biofeedback devices are not appropriate for use in clinical and/or home-based rehabilitation settings due to the expense, size, and operating complexity of the instrumentation required. This dissertation describes the design, development, and preliminary assessment of two technologies that support clinical and home-based balance rehabilitation training. The first system provides vibrotactile-based instructional motion cues to a trainee based on the measured difference between the expert's and trainee's motions. The design of the vibrotactile display is supported by a study that characterizes the non-volitional postural responses to vibrotactile stimulation applied to the torso. This study shows that vibration applied individually by tactors over the internal oblique and erector spinae muscles induces a postural shift of the order of one degree oriented in the direction of the stimulation. Furthermore, human performance is characterized both experimentally and theoretically when the expert-trainee error thresholds and nature of the control signal are varied. The results suggest that expert-subject cross-correlation values were maximized and position errors and time delays were minimized when the controller uses a 0.5 error threshold and a proportional-plus-derivative feedback control signal, and that subject performance decreases as motion speed and complexity increase. The second system provides vibrotactile biofeedback about body motion using a cell phone. The system is capable of providing real-time vibrotactile cues that inform corrective trunk tilt responses. When feedback is available, both healthy subjects and those with vestibular involvement significantly reduce their anterior-posterior or medial-lateral root-mean-square body sway, have significantly smaller elliptical area fits to their sway trajectory, spend a significantly greater mean percentage time within the no-feedback zone, and show a significantly greater A/P or M/L mean power frequency. Ph.D. Mechanical Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies. http://deepblue.lib.umich.edu/bitstream/2027.42/91546/1/channy_1.pd
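
    The abstract specifies that trainee performance was best with a 0.5 error threshold and a proportional-plus-derivative control signal driving the instructional cues. As a rough illustration of that scheme, the sketch below turns the expert-trainee tilt error into a directional vibrotactile command with a dead zone; the gains, units, sampling rate, and tactor mapping are assumptions for illustration, not values from the dissertation.

```python
class PDCueGenerator:
    """Turns the expert-trainee tilt error into a directional vibrotactile cue.

    Sketch only: gains, units, and tactor mapping are hypothetical; the
    dissertation reports a 0.5 error threshold and proportional-plus-derivative
    feedback, but not these particular parameter values.
    """

    def __init__(self, kp=1.0, kd=0.3, threshold=0.5):
        self.kp = kp                # proportional gain
        self.kd = kd                # derivative gain
        self.threshold = threshold  # dead zone: no vibration below this
        self._prev_error = None

    def update(self, expert_tilt, trainee_tilt, dt):
        """Return -1, 0, or +1: which side's tactor to drive, or none."""
        error = expert_tilt - trainee_tilt
        d_error = 0.0 if self._prev_error is None else (error - self._prev_error) / dt
        self._prev_error = error
        u = self.kp * error + self.kd * d_error
        if abs(u) < self.threshold:
            return 0               # inside the dead zone: no cue
        return 1 if u > 0 else -1  # cue on the side the trainee should move toward

# Hypothetical usage at 50 Hz: expert at 2.0 deg, trainee lagging at 0.5 deg.
cue = PDCueGenerator()
print(cue.update(expert_tilt=2.0, trainee_tilt=0.5, dt=0.02))  # -> 1
```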

    Gulliver’s virtual travels: active embodiment in extreme body sizes for modulating our body representations

    The perceptual experience of body and space can be modulated by changing action capabilities or by manipulating the perceived body dimensions through multisensory stimulation. This study adds to the pre-existing literature by investigating alterations in bodily experience following embodiment in both enlarged and shrunken bodies while participants actively navigated a virtual environment. A normal-sized body served as a reference condition. After each embodied navigation, participants estimated the height and width of three different body parts. Results revealed that embodiment in the shrunken body induced a significant reduction in participants' body image, while no changes were reported after embodiment in the enlarged body. Findings are discussed in terms of previous literature exploring the constraints implicated in ownership over different bodies.

    Sensorimotor learning and self-motion perception in human balance control


    Vestibular contribution to bodily self-consciousness and multisensory cortical processing

    How does the self relate to the body? Bodily self-consciousness, i.e. the sense of being a subject bound to a body, involves a first-person perspective (1PP), i.e. the sense of being directed at the world. Prior research suggests that bodily self-consciousness depends on brain mechanisms integrating multisensory bodily signals. However, the specific multisensory mechanisms of 1PP are poorly understood. Here, I defend the thesis that the vestibular system, i.e. the sensory system encoding rotational and linear accelerations of the head, contributes to 1PP and related multisensory processing in the brain. The first part of my thesis presents experimental evidence showing that 1PP was influenced by multisensory conflict about the direction of gravity and the location of the body. 1PP depended on integrated visual-vestibular signals and was functionally distinct from another aspect of bodily self-consciousness: self-identification, i.e. the feeling that a particular body is 'mine'. The second part of my thesis presents the electrical neural correlates by which vestibular stimulation affected somatosensory and visual cortical processing. Passive whole-body yaw rotation naturally and selectively stimulated the vestibular system while the evoked responses to somatosensory or visual stimuli were recorded by electroencephalography. Electrical neuroimaging analysis showed temporal-specific vestibular effects on somatosensory and visual evoked potentials, localized by source estimations to distinct regions of the somatosensory, visual, and vestibular cortical networks. Collectively, the results from my thesis suggest that the vestibular system contributes to 1PP and multisensory cortical processing and imply that the vestibular system should not be neglected when studying higher brain function and neurobiological mechanisms of consciousness.

    The effects of body orientation and humeral elevation angle on shoulder muscle activity and shoulder joint position sense

    The purpose of this study was to determine the effects of body tilt on shoulder muscle activity and repositioning accuracy during humeral elevation to three positions in the sagittal plane (70, 90, and 110 degrees). Thirty-eight subjects underwent testing in an unconstrained joint position sense task. Kinematics were measured with a magnetic tracking device, while muscle activation was measured with surface electromyography. The joint position sense task consisted of subjects moving their arms to a predetermined position in space with the help of visual feedback from a head-mounted display interfaced with the magnetic tracking device. Subjects were then asked to reproduce the presented shoulder position in the absence of visual feedback. The protocol was performed under two tilts: upright and tilted back 90 degrees from vertical. This allowed for the comparison of joint position sense at the same elevation angles but different levels of shoulder muscle activation, by altering the orientation of the subjects in the gravitational field. When comparing these two tilts, we found that subjects matched with greater accuracy and precision at 90 and 110 degrees of elevation when they were upright (p < 0.05). We also found that anterior deltoid muscle activity was significantly greater at all three elevation angles in the upright condition. These data, taken together, support the hypothesis that unconstrained shoulder joint position sense is enhanced with increased muscular activation levels.
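
    The comparison above is framed in terms of matching accuracy and precision. Standard joint-position-sense measures for that are constant error (signed bias), variable error (trial-to-trial spread), and absolute error; the sketch below computes them from repeated reposition trials. The measure names and the example data are assumptions for illustration; the abstract does not state which error definitions the study used.

```python
import numpy as np

def repositioning_errors(target_deg, reproduced_deg):
    """Common joint-position-sense error measures over repeated trials.

    constant error -> signed bias (accuracy)
    variable error -> trial-to-trial SD (precision)
    absolute error -> mean unsigned deviation
    """
    diff = np.asarray(reproduced_deg, dtype=float) - target_deg
    return {
        "constant_error_deg": float(np.mean(diff)),
        "variable_error_deg": float(np.std(diff, ddof=1)),
        "absolute_error_deg": float(np.mean(np.abs(diff))),
    }

# Hypothetical trials reproducing a 90-degree elevation target:
print(repositioning_errors(90.0, [88.5, 91.0, 89.2, 90.4, 87.9]))
```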