
    Investigating perceptual congruence between information and sensory parameters in auditory and vibrotactile displays

    A fundamental interaction between a computer and its user(s) is the transmission of information between the two, and there are many situations where this interaction must occur non-visually, for example through sound or vibration. To design successful interactions in these modalities, it is necessary to understand how users perceive mappings between information and acoustic or vibration parameters, so that these parameters can be designed to be perceived as congruent. This thesis investigates several data-sound and data-vibration mappings by using psychophysical scaling to understand how users perceive the mappings. It also investigates the impact that using these methods during design has when they are integrated into an auditory or vibrotactile display. To investigate acoustic parameters that may provide more perceptually congruent data-sound mappings, Experiments 1 and 2 explored several psychoacoustic parameters for use in a mapping. These studies found that applying amplitude modulation (perceived as roughness) or broadband noise to a signal resulted in performance that was similar to conducting the task visually. Experiments 3 and 4 used scaling methods to map how a user perceived a change in an information parameter for a given change in an acoustic or vibrotactile parameter. Experiment 3 showed that increases in acoustic parameters that are generally considered undesirable in music were perceived as congruent with information parameters with negative valence, such as stress or danger. Experiment 4 found that data-vibration mappings were more generalised: a given increase in a vibrotactile parameter was almost always perceived as an increase in an information parameter, regardless of the valence of the information parameter. Experiments 5 and 6 investigated the impact that using results from the scaling methods of Experiments 3 and 4 had on users' performance with an auditory or vibrotactile display. These experiments also explored the impact that the complexity of the context in which the display was placed had on user performance. These studies found that using mappings based on scaling results did not significantly affect users' performance with a simple auditory display, but it did reduce response times in a more complex use case.
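    As a minimal illustration of the amplitude-modulation ("roughness") manipulation explored in Experiments 1 and 2, the following Python sketch generates an amplitude-modulated carrier tone. The carrier frequency, modulation rate and depth below are assumptions chosen for demonstration, not the values used in the thesis.

        import numpy as np

        def am_rough_tone(carrier_hz=440.0, mod_hz=40.0, depth=1.0,
                          duration_s=1.0, sample_rate=44100):
            """Return a sine carrier whose amplitude is modulated at mod_hz."""
            t = np.arange(int(sample_rate * duration_s)) / sample_rate
            carrier = np.sin(2 * np.pi * carrier_hz * t)
            # Modulator sweeps between (1 - depth) and 1, scaling the carrier's amplitude;
            # modulation rates in the tens of hertz are typically heard as roughness.
            modulator = 1.0 - depth * 0.5 * (1.0 + np.sin(2 * np.pi * mod_hz * t))
            return carrier * modulator

        signal = am_rough_tone()  # one second of a "rough" 440 Hz tone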

    Multi-Moji: Combining Thermal, Vibrotactile and Visual Stimuli to Expand the Affective Range of Feedback

    This paper explores the combination of multiple concurrent modalities for conveying emotional information in HCI: temperature, vibration and abstract visual displays. Each modality has been studied individually, but each can only convey a limited range of emotions within two-dimensional valence-arousal space. This paper is the first to systematically combine multiple modalities to expand the available affective range. Three studies were conducted: Study 1 measured the emotionality of vibrotactile feedback by itself; Study 2 measured the perceived emotional content of three bimodal combinations: vibrotactile + thermal, vibrotactile + visual, and visual + thermal; Study 3 then combined all three modalities. Results show that combining modalities increases the available range of emotional states, particularly in the problematic top-right and bottom-left quadrants of the dimensional model. We also provide a novel lookup resource for designers to identify stimuli to convey a range of emotions.
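    The paper's lookup resource is presented as a design aid; a minimal sketch of how such a mapping could be represented in code is given below. The quadrant names and stimulus descriptions are placeholders for illustration, not the paper's measured results.

        from typing import Dict, NamedTuple

        class Stimulus(NamedTuple):
            thermal: str       # e.g. direction and size of temperature change
            vibrotactile: str  # e.g. pulse rate and envelope
            visual: str        # e.g. colour and animation speed

        # Placeholder entries keyed by valence-arousal quadrant.
        AFFECTIVE_LOOKUP: Dict[str, Stimulus] = {
            "high-valence/high-arousal": Stimulus("warm +3 C", "fast sharp pulses", "bright, fast pulsing"),
            "low-valence/low-arousal":   Stimulus("cool -3 C", "slow smooth pulses", "dim, slow pulsing"),
        }

        def stimulus_for(quadrant: str) -> Stimulus:
            """Look up a candidate multimodal stimulus for a target emotional quadrant."""
            return AFFECTIVE_LOOKUP[quadrant]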

    Effects of modality, urgency and situation on responses to multimodal warnings for drivers

    Signifying road-related events with warnings can be highly beneficial, especially when immediate attention is needed. This thesis describes how modality, urgency and situation can influence driver responses to multimodal displays used as warnings. These displays utilise all combinations of audio, visual and tactile modalities, reflecting different urgency levels. In this way, a new rich set of cues is designed, conveying information multimodally, to enhance reactions during driving, which is a highly visual task. The importance of the signified events to driving is reflected in the warnings, and safety-critical or non-critical situations are communicated through the cues. Novel warning designs are considered, using both abstract displays, with no semantic association to the signified event, and language-based ones, using speech. These two cue designs are compared to discover their strengths and weaknesses as car alerts. The situations in which the new cues are delivered are varied, by simulating both critical and non-critical events and both manual and autonomous car scenarios. A novel set of guidelines for using multimodal driver displays is finally provided, considering the modalities utilised, the urgency signified, and the situation simulated.
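    As a small illustration of the cue space described above, the sketch below enumerates the seven non-empty combinations of the audio, visual and tactile channels from which a warning display can be built; pairing each combination with an urgency level is left to the designer and is not specified by the thesis abstract.

        from itertools import combinations

        MODALITIES = ("audio", "visual", "tactile")

        def display_combinations():
            """Yield every non-empty subset of the three warning modalities (7 in total)."""
            for k in range(1, len(MODALITIES) + 1):
                for combo in combinations(MODALITIES, k):
                    yield combo

        for combo in display_combinations():
            print(" + ".join(combo))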

    Using pressure input and thermal feedback to broaden haptic interaction with mobile devices

    Pressure input and thermal feedback are two under-researched aspects of touch in mobile human-computer interfaces. Pressure input could provide a wide, expressive range of continuous input for mobile devices. Thermal stimulation could provide an alternative means of conveying information non-visually. This thesis investigated 1) how accurate pressure-based input on mobile devices could be when the user was walking and provided with only audio feedback, and 2) what forms of thermal stimulation are both salient and comfortable, and so could be used to design structured thermal feedback for conveying multi-dimensional information. The first experiment tested control of pressure on a mobile device when sitting and using audio feedback. Targeting accuracy was >= 85% when maintaining 4-6 levels of pressure across 3.5 Newtons, using only audio feedback and a Dwell selection technique. Two further experiments tested control of pressure-based input when walking and found accuracy was very high (>= 97%) even when walking and using only audio feedback, when using a rate-based input method. A fourth experiment tested how well each digit of one hand could apply pressure to a mobile phone individually and in combination with others. Each digit could apply pressure highly accurately, but not equally so, while some performed better in combination than alone. 2- or 3-digit combinations were more precise than 4- or 5-digit combinations. Experiment 5 compared one-handed, multi-digit pressure input using all 5 digits to traditional two-handed multitouch gestures for a combined zooming and rotating map task. Results showed comparable performance, with multitouch being ~1% more accurate but pressure input being ~0.5 sec faster overall. Two experiments, one while sitting indoors and one while walking indoors, tested how salient and subjectively comfortable/intense various forms of thermal stimulation were. Faster or larger changes were more salient, faster to detect and less comfortable, and cold changes were more salient and faster to detect than warm changes. The two final studies designed two-dimensional structured ‘thermal icons’ that could convey two pieces of information. When indoors, icons were correctly identified with 83% accuracy. When outdoors, accuracy dropped to 69% when sitting and 61% when walking. This thesis provides the first detailed study of how precisely pressure can be applied to mobile devices when walking and provided with audio feedback, and the first systematic study of how to design thermal feedback for interaction with mobile devices in mobile environments.
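    The rate-based input method that gave the highest walking accuracy maps applied force to the speed at which the selection cursor moves, rather than to its position. A minimal sketch of that control scheme is below; the dead-zone, gain and force-range values are illustrative assumptions, not the thesis's calibrated settings.

        def rate_based_update(cursor, pressure_newtons, dt,
                              dead_zone_n=0.2, gain=1.5, max_force_n=3.5):
            """Advance the cursor at a velocity proportional to the applied pressure.

            Forces below the dead zone are ignored; the effective force is capped at
            max_force_n, matching the ~3.5 N range used for pressure targeting above.
            """
            force = min(max(pressure_newtons - dead_zone_n, 0.0), max_force_n)
            return cursor + gain * force * dt

        # Example: holding ~2 N for 0.1 s moves the cursor by gain * (2.0 - 0.2) * 0.1.
        position = rate_based_update(cursor=0.0, pressure_newtons=2.0, dt=0.1)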

    Somatic ABC's: A Theoretical Framework for Designing, Developing and Evaluating the Building Blocks of Touch-Based Information Delivery

    Situations of sensory overload are steadily becoming more frequent as the ubiquity of technology approaches reality, particularly with the advent of socio-communicative smartphone applications and pervasive, high-speed wireless networks. Although the ease of accessing information has improved our communication effectiveness and efficiency, our visual and auditory modalities, those that today's computerized devices and displays largely engage, have become overloaded, creating possibilities for distractions, delays and high cognitive load, which in turn can lead to a loss of situational awareness and increase the chances of life-threatening situations such as texting while driving. Surprisingly, alternative modalities for information delivery have seen little exploration. Touch, in particular, is a promising candidate given that the skin is our largest sensory organ, with impressive spatial and temporal acuity. Although some approaches have been proposed for touch-based information delivery, they are not without limitations, including high learning curves, limited applicability and/or limited expression. This is largely due to the lack of a versatile, comprehensive design theory: specifically, a theory that addresses the design of touch-based building blocks for expandable, efficient, rich and robust touch languages that are easy to learn and use. Moreover, beyond design, there is a lack of implementation and evaluation theories for such languages. To overcome these limitations, a unified theoretical framework, inspired by natural spoken language, is proposed: Somatic ABC's, for Articulating (designing), Building (developing) and Confirming (evaluating) touch-based languages. To evaluate the usefulness of Somatic ABC's, its design, implementation and evaluation theories were applied to create communication languages for two distinct application areas: audio-described movies and motor learning. These applications were chosen as they presented opportunities for complementing communication by offloading information, typically conveyed visually and/or aurally, to the skin. For both studies, it was found that Somatic ABC's aided the design, development and evaluation of rich somatic languages with distinct and natural communication units.

    Developing an interactive overview for non-visual exploration of tabular numerical information

    This thesis investigates the problem of obtaining overview information from complex tabular numerical data sets non-visually. Blind and visually impaired people need to access and analyse numerical data, both in education and in professional occupations. Obtaining an overview is a necessary first step in data analysis, for which current non-visual data accessibility methods offer little support. This thesis describes a new interactive parametric sonification technique called High-Density Sonification (HDS), which facilitates the process of extracting overview information from the data easily and efficiently by rendering multiple data points as single auditory events. Beyond obtaining an overview of the data, experimental studies showed that the capabilities of human auditory perception and cognition to extract meaning from HDS representations could be used to reliably estimate relative arithmetic mean values within large tabular data sets. Following a user-centred design methodology, HDS was implemented as the primary form of overview information display in a multimodal interface called TableVis. This interface supports the active process of interactive data exploration non-visually, making use of proprioception to maintain contextual information during exploration (non-visual focus+context), vibrotactile data annotations (EMA-Tactons) that can be used as external memory aids to prevent high mental workload levels, and speech synthesis to access detailed information on demand. A series of empirical studies was conducted to quantify the performance attained in the exploration of tabular data sets for overview information using TableVis, by comparing HDS with the main current non-visual accessibility technique (speech synthesis) and by quantifying the effect of different sizes of data sets on user performance. These studies showed that HDS resulted in better performance than speech, and that this performance was not heavily dependent on the size of the data set. In addition, levels of subjective workload during exploration tasks using TableVis were investigated, resulting in the proposal of EMA-Tactons, vibrotactile annotations that the user can add to the data in order to prevent working memory saturation in the most demanding data exploration scenarios. An experimental evaluation found that EMA-Tactons significantly reduced mental workload in data exploration tasks. Thus, the work described in this thesis provides a basis for the interactive non-visual exploration of numerical data tables across a broad range of sizes, offering techniques to extract overview information quickly, to perform perceptual estimations of data descriptors (relative arithmetic mean) and to manage demands on mental workload through vibrotactile data annotations, while seamlessly linking with explorations at different levels of detail and preserving spatial data representation metaphors to support collaboration with sighted users.
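    A minimal sketch of the idea behind High-Density Sonification is given below: each table row is rendered as a single auditory event whose pitch reflects the row's arithmetic mean relative to the whole table. The MIDI-style pitch mapping is an assumption for illustration, not TableVis's actual implementation.

        import numpy as np

        def hds_pitch_for_row(row, table_min, table_max, low_midi=48, high_midi=84):
            """Map a row's mean to a MIDI note number: higher relative mean, higher pitch."""
            relative = (float(np.mean(row)) - table_min) / (table_max - table_min)
            return int(round(low_midi + relative * (high_midi - low_midi)))

        table = np.array([[3.0, 5.0, 8.0],
                          [20.0, 22.0, 19.0],
                          [11.0, 10.0, 12.0]])
        pitches = [hds_pitch_for_row(row, table.min(), table.max()) for row in table]
        # Comparing the pitches of two rows gives a quick, non-visual estimate of which
        # row has the larger mean, without reading any individual value.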

    Musical Haptics

    Haptic Musical Instruments; Haptic Psychophysics; Interface Design and Evaluation; User Experience; Musical Performance

    Electrotactons: designing and evaluating electrotactile cues

    Electrotactile feedback is a novel haptic feedback modality that can be used to evoke a desired level of alertness and emotion, or to convey multidimensional information to the user. However, there is a lack of research investigating its basic design parameters and how they can be used to create effective tactile cues. This thesis investigates the effect of electrotactile feedback on the subjective perception of specific sensations, such as urgency, annoyance, valence and arousal, to find the number of distinguishable levels in each sensation. These levels are then used for designing structured, abstract electrotactile messages called Electrotactons, which have potential benefits over vibration-based cues due to the greater flexibility of the actuators. Experiments 1, 2 and 4 investigated the effects of manipulating the basic electrotactile parameters of pulse width, amplitude and pulse frequency on perceived sensations. The results showed that all parameters had a significant effect on the perceived sensations, except that pulse frequency had no effect on valence. Also, pulse frequencies of 30 PPS and above did not influence the perceived sensations. Experiment 3 investigated the use of pulse width, amplitude and pulse frequency to convey three types of information simultaneously encoded into an electrotactile cue. This was the first attempt to design Electrotactons using the basic parameters of electrotactile feedback. The results showed overall recognition rates of 38.19% for the complete Electrotactons. For the individual component parameters, pulse width had a recognition rate of 71.67%, amplitude 70.27%, and pulse frequency 66.36%. Experiment 5 investigated intensity and pulse frequency to determine how many distinguishable levels could be perceived. Results showed that both intensity and pulse frequency significantly affected perception, with four distinguishable levels of intensity and two of pulse frequency. Experiment 6 used the intensity and pulse frequency levels from Experiment 5 to improve the design of Electrotactons on three body locations using two different sizes of electrodes. The results showed overall recognition rates of up to 65.31% for the complete Electrotactons. For the individual component parameters, intensity had a recognition rate of 68.68%, and pulse frequency 94.41%. These results add significant new knowledge about the parameter space of electrotactile cue design and help designers select suitable properties to use when creating electrotactile cues.
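    To illustrate the kind of encoding used in Experiment 3, the sketch below combines one level of each of the three studied parameters (pulse width, amplitude, pulse frequency) into a single abstract cue. The level values are assumptions chosen for illustration only; real electrotactile stimulation requires carefully validated, hardware- and safety-dependent limits.

        from dataclasses import dataclass

        # Hypothetical discrete levels for each parameter (three levels each).
        PULSE_WIDTHS_US = (100, 200, 300)   # pulse width in microseconds -> dimension 1
        AMPLITUDES_MA   = (1.0, 2.0, 3.0)   # amplitude in milliamps      -> dimension 2
        PULSE_FREQS_PPS = (5, 15, 30)       # pulse frequency in PPS      -> dimension 3

        @dataclass
        class Electrotacton:
            pulse_width_us: int
            amplitude_ma: float
            pulse_freq_pps: int

        def encode(dim1: int, dim2: int, dim3: int) -> Electrotacton:
            """Encode three independent information values (0-2 each) into one cue."""
            return Electrotacton(PULSE_WIDTHS_US[dim1], AMPLITUDES_MA[dim2], PULSE_FREQS_PPS[dim3])

        cue = encode(2, 0, 1)  # e.g. 300 us pulses at 1.0 mA, 15 pulses per second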

    Exploitation of haptic renderings to communicate risk levels of falling

    Falls represent a major cause of injury that can lead to death, and this is even more pronounced in the elderly. Indeed, aging brings deteriorations (gait disturbances, balance disorders, and sensorimotor impairments) that may lead to falls. The research project presented in this thesis focuses on the problem of reducing the risk of falling. This study proposes a solution for the communication of haptic information to reduce the risk of falling. This solution is part of the design of a haptic communication system in a controlled environment. This new system introduces the notion of haptic perception through the communication of information by touch using the foot, which the literature does not generally address. For the design of this system, we first studied the use of tactile stimuli to evaluate the possibility of communicating a risk level through the haptic modality. Then, having hypothesized that some factors could influence the communication of stimuli representing the risk levels of falling, we conducted a second study to evaluate the effect of auditory disturbances during the communication of these stimuli. Third, to determine whether the user had the necessary time to act after perceiving the risk level, we analysed the variation of simple reaction time when walking on different types of ground surfaces. These results encouraged us to carry out a fourth assessment of reaction time using a new device, coupled with a smartphone, that can be positioned at different locations on the body. Several experiments were conducted to validate each of these steps. With this, we can now communicate a risk level of falling to users through the haptic channel using an active device and easily differentiable stimuli. In addition, we can evaluate auditory factors during such haptic perception. Finally, we can evaluate the physiological characteristics of the users (response time) while seated and while walking on different types of ground surfaces.
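    As a minimal sketch of the kind of mapping described above, the code below associates each discrete risk-of-falling level with an easily differentiable vibrotactile pattern for a foot-worn actuator. The pattern parameters are illustrative assumptions, not the stimuli validated in the thesis.

        # Hypothetical pulse-train patterns: more pulses and shorter gaps signal higher risk.
        RISK_PATTERNS = {
            "low":    {"pulses": 1, "pulse_ms": 200, "gap_ms": 0},
            "medium": {"pulses": 2, "pulse_ms": 200, "gap_ms": 150},
            "high":   {"pulses": 4, "pulse_ms": 120, "gap_ms": 80},
        }

        def pattern_for(risk_level: str) -> dict:
            """Return the vibrotactile pattern to play for a given risk level."""
            return RISK_PATTERNS[risk_level]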