723 research outputs found

    Equal Intensity Contours for Whole-Body Vibrations Compared with Vibrations Cross-Modally Matched to Isophones

    Full text link
    Abstract. In this study, two experiments were conducted to determine the curves of equal intensity perception for sinusoidal vertical whole-body vibrations (WBV) of seated subjects over the frequency range from 10 Hz to 250 Hz. Vibrations were presented to subjects using a flat hard seat. In total, 10 participants were asked to match the intensity of different vibrations using a method of adjustment. The obtained contours were compared with the threshold of vibration perception and with vibrations cross-modally matched to tones from isophones. The shapes of the equal intensity contours in the present study show reasonable agreement with the contours from other studies, despite the use of different methodologies and experimental questions. The contours show a characteristic similar to the perception threshold. No dependency of the shape of the contours on vibration magnitude was found in the applied dynamic range. However, large inter-individual variations were observed. The results imply that vibration curves that are cross-modally matched to isophones show similar characteristics.
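
    As a rough illustration of the matching procedure this abstract describes, the following Python sketch simulates a method-of-adjustment trial in which an observer raises or lowers a comparison vibration until it feels as intense as a reference. The observer model, step size, and sensitivity offsets are invented for illustration and are not parameters from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed frequency-dependent sensitivity offsets (dB), one per test
# frequency; these values are illustrative, not measured data.
sensitivity_db = {10: 0.0, 31.5: 4.0, 80: 10.0, 160: 16.0, 250: 20.0}

def perceived_intensity(freq_hz, accel_db, sens):
    """Toy observer: perceived intensity = physical level minus a
    frequency-dependent sensitivity offset, plus trial-to-trial noise."""
    return accel_db - sens[freq_hz] + rng.normal(0, 1.0)

def match_by_adjustment(ref_freq, ref_db, test_freq, start_db,
                        step_db=2.0, trials=40):
    """Adjust the test level up or down until its perceived intensity
    brackets the reference; return the matched level in dB."""
    level = start_db
    for _ in range(trials):
        ref = perceived_intensity(ref_freq, ref_db, sensitivity_db)
        test = perceived_intensity(test_freq, level, sensitivity_db)
        level += step_db if test < ref else -step_db
    return level

# One point on an equal-intensity contour: the 250 Hz level matching a
# 31.5 Hz reference presented at 100 dB (arbitrary reference units).
print(match_by_adjustment(31.5, 100.0, 250, start_db=100.0))
```

    Repeating the matching run across all test frequencies would trace out one equal-intensity contour for the chosen reference level.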

    Engineering data compendium. Human perception and performance. User's guide

    Get PDF
    The concept underlying the Engineering Data Compendium was the product of a research and development program (the Integrated Perceptual Information for Designers project) aimed at facilitating the application of basic research findings in human performance to the design of military crew systems. The principal objective was to develop a workable strategy for (1) identifying and distilling information of potential value to system design from the existing research literature, and (2) presenting this technical information in a form that would aid its accessibility, interpretability, and applicability for system designers. The present four volumes of the Engineering Data Compendium represent the first implementation of this strategy. This is the first volume, the User's Guide, containing a description of the program and instructions for its use.

    The temporal pattern of impulses in primary afferents analogously encodes touch and hearing information

    Full text link
    An open question in neuroscience is the contribution of temporal relations between individual impulses in primary afferents to conveying sensory information. We investigated this question in touch and hearing, while looking for any shared coding scheme. In both systems, we artificially induced temporally diverse afferent impulse trains and probed the evoked perceptions in human subjects using psychophysical techniques. First, we investigated whether the temporal structure of a fixed number of impulses conveys information about the magnitude of tactile intensity. We found that clustering the impulses into periodic bursts elicited graded increases of intensity as a function of burst impulse count, even though fewer afferents were recruited throughout the longer bursts. The interval between successive bursts of peripheral neural activity (the burst-gap) has been demonstrated in our lab to be the most prominent temporal feature for coding skin vibration frequency, as opposed to either spike rate or periodicity. Second, given the similarities between the tactile and auditory systems, we explored the auditory system for an equivalent neural coding strategy. By using brief acoustic pulses, we showed that the burst-gap is a shared temporal code for pitch perception between the modalities. Following this evidence of parallels in temporal frequency processing, we next assessed the perceptual frequency equivalence between the two modalities using auditory and tactile pulse stimuli of simple and complex temporal features in cross-sensory frequency discrimination experiments. Identical temporal stimulation patterns in tactile and auditory afferents produced equivalent perceived frequencies, suggesting an analogous temporal frequency computation mechanism. The new insights into encoding tactile intensity through clustering of fixed-charge electric pulses into bursts suggest a novel approach to conveying varying contact forces to neural interface users, requiring no modulation of either stimulation current or base pulse frequency. Increased control over the temporal patterning of pulses in cochlear implant users might improve pitch perception and speech comprehension. The perceptual correspondence between touch and hearing not only suggests the possibility of establishing cross-modal comparison standards for robust psychophysical investigations, but also supports the plausibility of cross-sensory substitution devices.
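
    The burst-gap stimulus construction described above can be made concrete with a short sketch. The Python code below builds pulse trains in which a fixed number of impulses is clustered into periodic bursts separated by a constant burst-gap; all pulse counts, intervals, and the sample rate are assumed values, not the study's settings.

```python
import numpy as np

def burst_train(n_bursts, pulses_per_burst, intra_burst_interval_s,
                burst_gap_s, fs=44100):
    """Return pulse onset times (s) and a unit-impulse waveform at fs.
    The burst-gap is the silent interval between the last pulse of one
    burst and the first pulse of the next."""
    onsets = []
    t = 0.0
    for _ in range(n_bursts):
        for p in range(pulses_per_burst):
            onsets.append(t + p * intra_burst_interval_s)
        t = onsets[-1] + burst_gap_s
    duration = onsets[-1] + 0.01
    wave = np.zeros(int(duration * fs))
    for onset in onsets:
        wave[int(onset * fs)] = 1.0  # fixed-shape impulse at each onset
    return np.array(onsets), wave

# Two trains with identical burst-gaps but different impulse grouping:
# on the burst-gap account, perceived frequency should follow the gap,
# not the overall pulse rate, while intensity grows with pulses per burst.
onsets_a, _ = burst_train(n_bursts=10, pulses_per_burst=3,
                          intra_burst_interval_s=0.002, burst_gap_s=0.025)
onsets_b, _ = burst_train(n_bursts=10, pulses_per_burst=2,
                          intra_burst_interval_s=0.002, burst_gap_s=0.025)
```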

    How touch and hearing influence visual processing in sensory substitution, synaesthesia and cross-modal correspondences

    Get PDF
    Sensory substitution devices (SSDs) systematically turn visual dimensions into patterns of tactile or auditory stimulation. After training, a user of these devices learns to translate these audio or tactile sensations back into a mental visual picture. Most previous SSDs translate greyscale images using intuitive cross-sensory mappings to help users learn the devices. However, more recent SSDs have started to incorporate additional colour dimensions such as saturation and hue. Chapter two examines how previous SSDs have translated the complexities of colour into hearing or touch. The chapter explores whether colour is useful for SSD users, how SSD and veridical colour perception differ, and how optimal cross-sensory mappings might be considered. After long-term training, some blind users of SSDs report visual sensations from tactile or auditory stimulation. A related phenomenon is synaesthesia, a condition where stimulation of one modality (i.e. touch) produces an automatic, consistent and vivid sensation in another modality (i.e. vision). Tactile-visual synaesthesia is an extremely rare variant that can shed light on how the tactile-visual system is altered when touch can elicit visual sensations. Chapter three reports a series of investigations on the tactile discrimination abilities and phenomenology of tactile-vision synaesthetes, alongside questionnaire data from synaesthetes unavailable for testing. Chapter four introduces a new SSD to test whether the presentation of colour information in sensory substitution affects object and colour discrimination. Chapter five presents experiments on intuitive auditory-colour mappings across a wide variety of sounds. These findings are used to predict the reported colour hallucinations resulting from LSD use while listening to these sounds. Chapter six uses a new sensory substitution device designed to test the utility of these intuitive sound-colour links for visual processing. These findings are discussed with reference to how cross-sensory links, LSD and synaesthesia can inform optimal SSD design for visual processing.
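
    For readers unfamiliar with how SSDs "turn visual dimensions into patterns of auditory stimulation", the following minimal Python sketch shows one common greyscale mapping (columns scanned left to right, row mapped to pitch, brightness to loudness). The scan time and frequency range are assumptions, and the colour-capable devices discussed in the thesis add further dimensions beyond this simple scheme.

```python
import numpy as np

def image_to_sound(image, scan_s=1.0, f_lo=200.0, f_hi=5000.0, fs=22050):
    """image: 2D array with values in [0, 1], row 0 at the top.
    Each column becomes a time slice; each row a sine component whose
    amplitude follows pixel brightness (top rows = highest pitch)."""
    rows, cols = image.shape
    freqs = np.geomspace(f_hi, f_lo, rows)        # log-spaced pitches
    col_len = int(scan_s * fs / cols)             # samples per column
    t = np.arange(col_len) / fs
    out = np.zeros(cols * col_len)
    for c in range(cols):
        col = sum(image[r, c] * np.sin(2 * np.pi * freqs[r] * t)
                  for r in range(rows))
        out[c * col_len:(c + 1) * col_len] = col / rows  # normalise
    return out

# A diagonal line renders as a falling frequency sweep over one second.
audio = image_to_sound(np.eye(16))
```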

    Perceptual Organization

    Get PDF
    Perceiving the world of real objects seems so easy that it is difficult to grasp just how complicated it is. Not only do we need to construct the objects quickly; the objects also keep changing, even though we think of them as having a consistent, independent existence (Feldman, 2003). Yet we usually get it right; there are few failures. We can perceive a tree in a blinding snowstorm, spot a deer bounding across a tree line, dodge a snowball, catch a baseball, detect the crack of a breaking branch amidst the rustling of trees in a strong windstorm, predict the sounds of a dripping faucet, or track a street musician strolling down the road.

    Musical Haptics

    Get PDF
    Haptic Musical Instruments; Haptic Psychophysics; Interface Design and Evaluation; User Experience; Musical Performance

    Crossmodal audio and tactile interaction with mobile touchscreens

    Get PDF
    Touchscreen mobile devices often use cut-down versions of desktop user interfaces, placing high demands on the visual sense that may prove awkward in mobile settings. The research in this thesis addresses the problems encountered by situationally impaired mobile users by using crossmodal interaction to exploit the abundant similarities between the audio and tactile modalities. By making information available to both senses, users can receive the information in the most suitable way, without having to abandon their primary task to look at the device. This thesis begins with a literature review of related work followed by a definition of crossmodal icons. Two icons may be considered to be crossmodal if and only if they provide a common representation of data, which is accessible interchangeably via different modalities. Two experiments investigated possible parameters for use in crossmodal icons, with results showing that rhythm, texture and spatial location are effective. A third experiment focused on learning multi-dimensional crossmodal icons and the extent to which this learning transfers between modalities. The results showed identification rates of 92% for three-dimensional audio crossmodal icons when trained in the tactile equivalents, and identification rates of 89% for tactile crossmodal icons when trained in the audio equivalents. Crossmodal icons were then incorporated into a mobile touchscreen QWERTY keyboard. Experiments showed that keyboards with audio or tactile feedback produce fewer errors and greater speeds of text entry compared to standard touchscreen keyboards. The next study examined how environmental variables affect user performance with the same keyboard. The data showed that each modality performs differently with varying levels of background noise or vibration, and established the exact levels at which these performance decreases occur. The final study involved a longitudinal evaluation of a touchscreen application, CrossTrainer, focusing on longitudinal effects on performance with audio and tactile feedback, the impact of context on performance, and personal modality preference. The results show that crossmodal audio and tactile icons are a valid method of presenting information to situationally impaired mobile touchscreen users, with recognition rates of 100% over time. This thesis concludes with a set of guidelines on the design and application of crossmodal audio and tactile feedback to enable application and interface designers to employ such feedback in all systems.
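
    The thesis's definition of crossmodal icons (a common representation of data accessible interchangeably via different modalities) can be sketched as a small data structure. The specific encodings below (texture to timbre or amplitude modulation, location to stereo pan or actuator index) are illustrative assumptions, not the thesis's actual mappings.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CrossmodalIcon:
    rhythm: tuple   # pulse durations in ms, shared verbatim by both senses
    texture: str    # e.g. "smooth" or "rough"
    location: str   # e.g. "left", "middle", "right"

def render_audio(icon: CrossmodalIcon):
    """One possible audio rendering: texture -> timbre, location -> pan."""
    timbre = {"smooth": "sine", "rough": "sawtooth"}[icon.texture]
    pan = {"left": -1.0, "middle": 0.0, "right": 1.0}[icon.location]
    return {"durations_ms": icon.rhythm, "timbre": timbre, "pan": pan}

def render_tactile(icon: CrossmodalIcon):
    """The same parameters on vibration: texture -> amplitude modulation,
    location -> which actuator on the device fires."""
    modulated = icon.texture == "rough"
    actuator = {"left": 0, "middle": 1, "right": 2}[icon.location]
    return {"durations_ms": icon.rhythm, "modulated": modulated,
            "actuator": actuator}

# The same abstract icon is delivered through whichever sense is free.
new_message = CrossmodalIcon(rhythm=(120, 120, 360), texture="rough",
                             location="left")
print(render_audio(new_message))
print(render_tactile(new_message))
```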

    Interactions between the auditory and vibrotactile senses : a study of perceptual effects

    Get PDF
    Thesis (Ph.D.), Harvard-MIT Division of Health Sciences and Technology, February 2010. Cataloged from the PDF version of the thesis. Includes bibliographical references (p. 160-175).
    This project is an experimental study of perceptual interactions between auditory and tactile stimuli. The experiments present vibrotactile stimuli to the fingertip and auditory tones diotically in broadband noise. Our hypothesis states that if the auditory and tactile systems integrate, performance with the two sensory stimuli presented simultaneously will differ from performance with the individual sensory stimuli. The research consists of work in two major areas: (1) studies of the detection of auditory and tactile sinusoidal stimuli at levels near the threshold of perception (masked thresholds for auditory stimuli and absolute thresholds for tactile stimuli); and (2) studies of loudness matching employing various combinations of auditory and tactile stimuli presented at supra-threshold levels. Results were compared to three models of auditory-tactile integration. The objective detection studies explore the effects of three major variables on perceptual integration: (a) the starting phase of the auditory relative to the tactile stimulus; (b) the temporal synchrony of stimulation between the two modalities; and (c) the frequency of stimulation within each modality. Detection performance for combined auditory-tactile (A+T) presentations was measured using stimulus levels that yielded 63%-77%-correct unimodal performance in a 2-Interval, 2-Alternative Forced-Choice procedure. Results for combined vibrotactile and auditory detection indicated: (1) for synchronous presentation of 500-msec, 250 Hz sinusoidal stimuli, percent-correct scores in the combined A+T conditions were significantly higher than scores within each single modality; (2) scores in the A+T conditions were not affected by the relative phase of the 250 Hz auditory and tactile stimuli; (3) for asynchronous presentation of auditory and tactile 250 Hz stimuli, scores in the A+T conditions improved only when the tactile stimulus preceded the auditory stimulus (and not vice versa); and (4) the highest rates of detection for the combined-modality stimulus were obtained when the stimulating frequencies in the two modalities were equal or closely spaced (and within the Pacinian range). The lack of a phase effect suggests that integration operates on stimulus envelopes rather than on temporal fine structure. The effects of asynchronous presentation imply a shorter time constant in the auditory than in the tactile modality and are consistent with time constants deduced from single-modality masking experiments. The effects of frequency depend both on the absolute frequency and on the relative frequency of stimulation within each modality. In general, we found that an additive sensitivity model best explained detection performance when tones were presented synchronously and at the same frequency. In the second area of research, loudness matching was employed in a subjective study of the effects of frequency on auditory-tactile integration for stimuli presented at supra-threshold levels. These experiments, which were derived from previous auditory studies demonstrating the dependence of loudness on the critical-band spacing of tonal signals, employed various combinations of auditory and tactile stimuli that were presented at equally loud levels in isolation. Loudness matches were obtained for auditory-only (A+A) and auditory-tactile (A+T) stimuli that were both close together and farther apart in frequency. The results show that the matched loudness of an auditory pure tone is greater when the frequencies of the combined stimuli (both A+A and A+T) are farther apart than when they are close in frequency. These results are consistent with the results of the previous experiment exploring frequency relationships at near-threshold levels, as well as with results in the psychoacoustic literature, and suggest that the auditory and tactile systems interact in a frequency-specific manner similar to the interactions of purely auditory stimuli. The research conducted here demonstrates objective and subjective perceptual effects that support the mounting anatomical and physiological evidence for interactions between the auditory and tactual sensory systems. By E. Courtenay Wilson, Ph.D.
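
    The "additive sensitivity model" mentioned above can be illustrated with standard signal detection theory, where 2-Interval 2-AFC percent correct relates to sensitivity via P(C) = Phi(d'/sqrt(2)). The Python sketch below compares an additive combination of unimodal d' values against quadrature (independent-channel) summation; the thesis's exact model formulations may differ, so treat this purely as an assumed illustration.

```python
from math import sqrt
from statistics import NormalDist

phi = NormalDist().cdf
phi_inv = NormalDist().inv_cdf

def pc_to_dprime(pc):
    """Invert P(C) = Phi(d'/sqrt(2)) for a 2I-2AFC task."""
    return sqrt(2) * phi_inv(pc)

def dprime_to_pc(d):
    return phi(d / sqrt(2))

# Unimodal scores chosen from the 63%-77% range used in the experiments.
pc_a, pc_t = 0.70, 0.70
d_a, d_t = pc_to_dprime(pc_a), pc_to_dprime(pc_t)

# Additive sensitivity: combined d' is the sum of the unimodal d' values.
pc_additive = dprime_to_pc(d_a + d_t)
# Independent-channel alternative: d' values add in quadrature.
pc_quadrature = dprime_to_pc(sqrt(d_a**2 + d_t**2))

print(f"additive: {pc_additive:.3f}, quadrature: {pc_quadrature:.3f}")
```

    The additive model predicts noticeably higher combined percent correct than quadrature summation, which is the kind of separation that lets detection data discriminate between integration models.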

    Tinnitus: an MRI study on brain mechanisms

    Get PDF