
    Master of Science

    Haptic interactions with smartphones are generally restricted to vibrotactile feedback that offers limited distinction between delivered tactile cues. The lateral movement of a small, high-friction contactor at the fingerpad can be used to induce skin stretch tangent to the skin's surface. This method has been demonstrated to reliably communicate four cardinal directions with 1 mm translations of the device's contactor, when finger motion is properly restrained. While earlier research used a thimble to restrain the finger, this interface has been made portable by incorporating a simple conical hole as a finger restraint. An initial portable device design used RC hobby servos and the conical-hole finger restraint, but the shape and size of this portable device were not compatible with smartphone form factors. This design also had significant compliance and backlash that had to be compensated for with additional control schemes. In contrast, this thesis presents the design, fabrication, and testing of a low-profile skin-stretch display (LPSSD) with a novel actuation design for delivering complex tactile cues with minimal backlash or hysteresis of the skin contactor or "tactor." This flatter mechanism features embedded sensors for fingertip cursor control and selection. The device's nonlinear tactor motions are compensated for using table look-up and high-frequency open-loop control to create direction cues with 1.8 mm radial tactor displacements in 16 directions (distributed evenly every 22.5°) before returning to center. Two LPSSDs are incorporated into a smartphone peripheral and used in single-handed and bimanual tests to identify the 16 directions. Users also participated in "relative" identification tests in which they were first given a reference direction cue in the forward/north direction, followed by the cue direction they were to identify. Tests were performed with the users' thumbs oriented in the forward direction and with the thumbs angled inward slightly, similar to the angled-thumb orientation of console game controllers. Users were found to have better performance with the angled-thumb orientation. They performed similarly when stimuli were delivered to their right or left thumbs, and had significantly better performance judging direction cues with both thumbs simultaneously. Participants also performed slightly better in identifying the relative direction cues than the absolute cues.
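    The compensation scheme described above lends itself to a short illustration. The following Python sketch shows how 16 direction cues spaced every 22.5° with a 1.8 mm radial displacement might be generated, with a lookup table standing in for the device's nonlinearity compensation; all function and constant names here are hypothetical, not taken from the thesis.

```python
import math

# Hypothetical sketch: generate 16 radial direction cues (one every 22.5 degrees)
# with a 1.8 mm tactor displacement, correcting commanded positions through a
# lookup table, as the abstract describes at a high level. The identity
# "calibration" below is a placeholder for measured data.

RADIUS_MM = 1.8
NUM_DIRECTIONS = 16

def lookup_compensation(x_mm, y_mm):
    """Stand-in for the measured table that corrects nonlinear tactor motion."""
    return x_mm, y_mm  # identity mapping in place of real calibration data

def direction_cue(index):
    """Compensated (x, y) target in mm for direction cue `index` (0..15)."""
    angle = math.radians(index * 360.0 / NUM_DIRECTIONS)  # 22.5 degree steps
    return lookup_compensation(RADIUS_MM * math.cos(angle),
                               RADIUS_MM * math.sin(angle))

def render_cue(index, steps=50):
    """Out-and-back trajectory (center -> target -> center) for open-loop streaming."""
    tx, ty = direction_cue(index)
    outward = [(tx * s / steps, ty * s / steps) for s in range(steps + 1)]
    return outward + outward[::-1]  # return to center, as described in the abstract
```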

    Doctor of Philosophy

    When interacting with objects, humans utilize their sense of touch to gain information about the object and their surroundings. However, in video games, virtual reality, and training exercises, humans do not always have information available through the sense of touch. Several types of haptic feedback devices have been created to provide touch information in these scenarios. This dissertation describes the use of tactile skin stretch feedback to provide cues that convey direction information to a user. The direction cues can be used to guide a user or provide information about the environment. The tactile skin stretch feedback devices described herein provide feedback directly to the hands, just as in many real-life interactions involving the sense of touch. The devices utilize a moving tactor (actuated skin contact surface, also called a contactor) and surrounding material to give the user a sense of relative motion. Several game controller prototypes were constructed with skin stretch feedback embedded into the device to interface with the fingers. Experiments were conducted to evaluate user performance in moving the joysticks to match the direction of the stimulus. These experiments investigated stimulus masking effects with both skin stretch feedback and vibrotactile feedback. A controller with feedback on the thumb joysticks was found to have higher user accuracy. Next, precision grip and power grip skin stretch feedback devices were created to investigate cues that convey motion in a three-dimensional space. Experiments were conducted to compare the two devices and to explore user accuracy in identifying different direction cue types. The precision grip device was found to be superior in communicating direction cues to users in four degrees of freedom. Finally, closed-loop control was implemented to guide users to a specific location and orientation within a three-dimensional space. Experiments were conducted to improve controller feedback, which in turn improved user performance. Experiments were also conducted to investigate the feasibility of providing multiple cues in succession, in order to guide a user with multiple motions of the hand. It was found that users can successfully reach multiple target locations and orientations in succession.
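    As one way to picture the closed-loop guidance described above, the sketch below maps the error between the hand's current position and a target position onto a saturated skin-stretch direction cue. It is a minimal proportional scheme under assumed gains and travel limits, not the dissertation's actual controller.

```python
import numpy as np

# Illustrative proportional guidance (not the dissertation's controller): the
# error between the hand's current position and a target position becomes a
# 3D skin-stretch cue, clipped to an assumed tactor travel limit.

MAX_DISPLACEMENT_MM = 1.0  # assumed tactor travel limit
GAIN = 0.5                 # assumed gain: mm of stretch per mm of position error

def guidance_cue(current_pos, target_pos):
    """Return a stretch vector (mm) pointing toward the target, saturated at the limit."""
    error = np.asarray(target_pos, dtype=float) - np.asarray(current_pos, dtype=float)
    cue = GAIN * error
    norm = np.linalg.norm(cue)
    if norm > MAX_DISPLACEMENT_MM:
        cue *= MAX_DISPLACEMENT_MM / norm
    return cue

# Hand at the origin, target 10 mm forward: the cue saturates at 1 mm forward.
print(guidance_cue([0.0, 0.0, 0.0], [0.0, 10.0, 0.0]))
```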

    Master of Science

    Haptic feedback in modern game controllers is limited to vibrotactile feedback. The addition of skin-stretch feedback would significantly improve the type and quality of haptic feedback provided by game controllers. Skin-stretch feedback requires small forces (around a few newtons) and translations (as small as 0.5 mm) to provide identifiable direction cues. Prior work has developed skin-stretch mechanisms in two form factors: a flat form factor and a tall but compact (cubic) form factor. These mechanisms have been shown to be effective actuators for skin-stretch feedback, and are small enough to fit inside a game controller. Additional prior work has shown that the cubic skin-stretch mechanism can be integrated into a thumb joystick for use with game controllers. This thesis presents the design, characterization, and testing of two skin-stretch game controllers. The first game controller provides skin stretch via a 2-axis mechanism integrated into its thumb joysticks; it uses the cubic skin-stretch mechanism to drive the skin stretch. Concerns that users' motions of the joystick could negatively impact the saliency of skin stretch rendered from the joystick prompted the design of a second controller that provides 2-axis skin stretch to users' middle fingers on the back side of the controller. Two experiments were conducted with the two controllers. One experiment had participants identify the direction of skin stretch from a selection of 8 possible directions. This test compared users' accuracies with both controllers, and with five different finger restraints on the back-tactor controller. Results show that users' identification accuracy was similar across feedback conditions. A second experiment used skin stretch to rotationally guide participants to a randomized target angle. Three different feedback strategies were tested. Results showed that a strategy called sinusoidal feedback, which provided feedback that varied in frequency and amplitude as a function of the user's position relative to the tactor, performed significantly better on all performance metrics than the other feedback strategies. Notably, sinusoidal feedback requires only two spatially separated 1-axis skin-stretch actuators to provide feedback, whereas the other, lower-performing feedback strategies used two 2-axis skin-stretch actuators.
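    One plausible reading of the sinusoidal feedback strategy is sketched below: a 1-axis skin-stretch command whose amplitude and frequency both scale with the remaining angular error, with the sign of the error selecting which of the two spatially separated actuators renders the cue. The constants and the exact mapping are illustrative assumptions, not values from the thesis.

```python
import math

# One possible reading of "sinusoidal feedback": a 1-axis tactor command whose
# amplitude and frequency both grow with the remaining angular error, with the
# sign of the error choosing which of the two separated actuators fires.
# All constants are assumptions, not values from the thesis.

MAX_AMPLITUDE_MM = 0.5
MIN_FREQ_HZ, MAX_FREQ_HZ = 1.0, 10.0
MAX_ERROR_DEG = 180.0

def sinusoidal_feedback(error_deg, t):
    """Tactor displacement (mm) at time t for a given angular error to the target."""
    scale = min(abs(error_deg) / MAX_ERROR_DEG, 1.0)
    amplitude = MAX_AMPLITUDE_MM * scale
    freq = MIN_FREQ_HZ + (MAX_FREQ_HZ - MIN_FREQ_HZ) * scale
    side = 1.0 if error_deg >= 0 else -1.0  # selects the left or right 1-axis actuator
    return side * amplitude * math.sin(2.0 * math.pi * freq * t)
```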

    Designing smart garments for rehabilitation


    Gloved Human-Machine Interface

    Certain exemplary embodiments can provide a system, machine, device, manufacture, circuit, composition of matter, and/or user interface adapted for and/or resulting from, and/or a method and/or machine-readable medium comprising machine-implementable instructions for, activities that can comprise and/or relate to: tracking movement of a gloved hand of a human; interpreting a gloved finger movement of the human; and/or in response to interpreting the gloved finger movement, providing feedback to the human.

    Somatic ABC's: A Theoretical Framework for Designing, Developing and Evaluating the Building Blocks of Touch-Based Information Delivery

    Situations of sensory overload are steadily becoming more frequent as the ubiquity of technology approaches reality--particularly with the advent of socio-communicative smartphone applications, and pervasive, high-speed wireless networks. Although the ease of accessing information has improved our communication effectiveness and efficiency, our visual and auditory modalities--those modalities that today's computerized devices and displays largely engage--have become overloaded, creating possibilities for distractions, delays, and high cognitive load, which in turn can lead to a loss of situational awareness and increase the chances of life-threatening situations such as texting while driving. Surprisingly, alternative modalities for information delivery have seen little exploration. Touch, in particular, is a promising candidate given that it is our largest sensory organ, with impressive spatial and temporal acuity. Although some approaches have been proposed for touch-based information delivery, they are not without limitations, including high learning curves, limited applicability, and/or limited expression. This is largely due to the lack of a versatile, comprehensive design theory--specifically, a theory that addresses the design of touch-based building blocks for expandable, efficient, rich, and robust touch languages that are easy to learn and use. Moreover, beyond design, there is a lack of implementation and evaluation theories for such languages. To overcome these limitations, a unified theoretical framework, inspired by natural spoken language, is proposed, called Somatic ABC's, for Articulating (designing), Building (developing), and Confirming (evaluating) touch-based languages. To evaluate the usefulness of Somatic ABC's, its design, implementation, and evaluation theories were applied to create communication languages for two very different application areas: audio-described movies and motor learning. These applications were chosen because they presented opportunities for complementing communication by offloading information, typically conveyed visually and/or aurally, to the skin. For both studies, it was found that Somatic ABC's aided the design, development, and evaluation of rich somatic languages with distinct and natural communication units.

    Ubiquitous haptic feedback in human-computer interaction through electrical muscle stimulation

    [no abstract]

    Integrating passive ubiquitous surfaces into human-computer interaction

    Mobile technologies enable people to interact with computers ubiquitously. This dissertation investigates how ordinary, ubiquitous surfaces can be integrated into human-computer interaction to extend the interaction space beyond the edge of the display. It turns out that acoustic and tactile features generated during an interaction can be combined to identify input events, the user, and the surface. In addition, it is shown that a heterogeneous distribution of different surfaces is particularly suitable for realizing versatile interaction modalities. However, privacy concerns must be considered when selecting sensors, and context can be crucial in determining whether and what interaction to perform.
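    To make the feature-fusion idea concrete, the sketch below concatenates per-event acoustic and tactile feature vectors and trains a stock classifier to recover, for example, the surface that was touched. The feature extraction, the synthetic data, and the choice of scikit-learn's RandomForestClassifier are placeholders for illustration, not the dissertation's actual pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Minimal sketch of fusing acoustic and tactile features from a touch event and
# classifying, e.g., the surface that was struck. Everything here is synthetic.

def fuse_features(acoustic_vec, tactile_vec):
    """Concatenate per-event acoustic and tactile feature vectors."""
    return np.concatenate([acoustic_vec, tactile_vec])

rng = np.random.default_rng(0)
acoustic = rng.normal(size=(100, 32))      # placeholder acoustic features per event
tactile = rng.normal(size=(100, 16))       # placeholder tactile (e.g., IMU) features
X = np.array([fuse_features(a, t) for a, t in zip(acoustic, tactile)])
y_surface = rng.integers(0, 3, size=100)   # placeholder labels: table / wall / notebook

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y_surface)
print(clf.predict(X[:5]))                  # predicted surface labels for five events
```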

    Principles and Guidelines for Advancement of Touchscreen-Based Non-visual Access to 2D Spatial Information

    Graphical materials such as graphs and maps are often inaccessible to millions of blind and visually-impaired (BVI) people, which negatively impacts their educational prospects, ability to travel, and vocational opportunities. To address this longstanding issue, a three-phase research program was conducted that builds on and extends previous work establishing touchscreen-based haptic cuing as a viable alternative for conveying digital graphics to BVI users. Although promising, this approach poses unique challenges that can only be addressed by schematizing the underlying graphical information based on perceptual and spatio-cognitive characteristics pertinent to touchscreen-based haptic access. Towards this end, this dissertation empirically identified a set of design parameters and guidelines through a logical progression of seven experiments. Phase I investigated perceptual characteristics related to touchscreen-based graphical access using vibrotactile stimuli, with results establishing three core perceptual guidelines: (1) a minimum line width of 1mm should be maintained for accurate line-detection (Exp-1), (2) a minimum interline gap of 4mm should be used for accurate discrimination of parallel vibrotactile lines (Exp-2), and (3) a minimum angular separation of 4mm should be used for accurate discrimination of oriented vibrotactile lines (Exp-3). Building on these parameters, Phase II studied the core spatio-cognitive characteristics pertinent to touchscreen-based non-visual learning of graphical information, with results leading to the specification of three design guidelines: (1) a minimum width of 4mm should be used for supporting tasks that require tracing of vibrotactile lines and judging their orientation (Exp-4), (2) a minimum width of 4mm should be maintained for accurate line tracing and learning of complex spatial path patterns (Exp-5), and (3) vibrotactile feedback should be used as a guiding cue to support the most accurate line tracing performance (Exp-6). Finally, Phase III demonstrated that schematizing line-based maps based on these design guidelines leads to development of an accurate cognitive map. Results from Experiment-7 provide theoretical evidence in support of learning from vision and touch as leading to the development of functionally equivalent amodal spatial representations in memory. Findings from all seven experiments contribute to new theories of haptic information processing that can guide the development of new touchscreen-based non-visual graphical access solutions.
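    As a rough illustration of how these empirically derived parameters might be consumed downstream, the sketch below collects them as rendering constants and checks a proposed map rendering against them. The constant names and the validation helper are assumptions made for illustration, not artifacts from the dissertation.

```python
# Illustrative packaging of the reported parameters as rendering constants for a
# touchscreen vibrotactile map renderer, plus an assumed validation helper.
# Neither the names nor the helper come from the dissertation.

GUIDELINES_MM = {
    "min_line_width_detection": 1.0,  # Exp-1: accurate line detection
    "min_interline_gap": 4.0,         # Exp-2: discriminating parallel lines
    "min_line_width_tracing": 4.0,    # Exp-4/5: tracing lines and path patterns
}

def check_map_spec(line_width_mm, interline_gap_mm, tracing_required):
    """List guideline violations for a proposed non-visual map rendering."""
    issues = []
    min_width = (GUIDELINES_MM["min_line_width_tracing"] if tracing_required
                 else GUIDELINES_MM["min_line_width_detection"])
    if line_width_mm < min_width:
        issues.append(f"line width {line_width_mm} mm is below the {min_width} mm minimum")
    if interline_gap_mm < GUIDELINES_MM["min_interline_gap"]:
        issues.append(f"interline gap {interline_gap_mm} mm is below the 4 mm minimum")
    return issues

print(check_map_spec(line_width_mm=2.0, interline_gap_mm=3.0, tracing_required=True))
```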