
    Tactons: structured tactile messages for non-visual information display

    Tactile displays are now becoming available in a form that can be easily used in a user interface. This paper describes a new form of tactile output. Tactons, or tactile icons, are structured, abstract messages that can be used to communicate information non-visually. A range of parameters can be used for Tacton construction, including the frequency, amplitude and duration of a tactile pulse, plus other parameters such as rhythm and location. Tactons have the potential to improve interaction in a range of different areas, particularly where the visual display is overloaded, limited in size or not available, such as interfaces for blind people or in mobile and wearable devices. This paper describes Tactons, the parameters used to construct them and some possible ways to design them. Examples of where Tactons might prove useful in user interfaces are given.
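    To make the parameter space concrete, here is a minimal Python sketch of how a Tacton could be modeled as a data structure. The class and field names are illustrative assumptions, not from the paper.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class TactilePulse:
    """One pulse in a Tacton, using the parameters the paper lists."""
    frequency_hz: float   # vibration frequency of the pulse
    amplitude: float      # normalized intensity, 0.0-1.0
    duration_ms: int      # length of the pulse

@dataclass
class Tacton:
    """A structured tactile message: a rhythm of pulses at a body location."""
    location: str                                        # e.g. "left_wrist"
    pulses: List[TactilePulse] = field(default_factory=list)
    inter_pulse_gap_ms: int = 50                         # gaps shape the rhythm

# Example: a two-pulse "new message" Tacton on the wrist
new_message = Tacton(
    location="left_wrist",
    pulses=[TactilePulse(250, 1.0, 100), TactilePulse(250, 0.6, 200)],
)
```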

    D-touch: A Consumer-Grade Tangible Interface Module and Musical Applications

    We define a class of tangible media applications that can be implemented on consumer-grade personal computers. These applications interpret user manipulation of physical objects in a restricted space and produce unlocalized outputs. We propose a generic approach to implementing such interfaces using flexible fiducial markers, which identify objects to a robust, fast video-processing algorithm so they can be recognized and tracked in real time. We describe an implementation of the technology, then report two new, flexible music performance applications that demonstrate and validate it.
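    As an illustration of the recognize-and-track loop the abstract describes, the sketch below uses OpenCV's ArUco markers (OpenCV 4.7+) as a stand-in for d-touch's own topology-based fiducials; the webcam loop and the parameter-mapping step are assumptions, not the d-touch implementation.

```python
import cv2

# ArUco markers stand in for d-touch's fiducials here; any robust,
# ID-carrying marker system fits the same capture-detect-map loop.
dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

cap = cv2.VideoCapture(0)  # consumer-grade webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = detector.detectMarkers(gray)
    if ids is not None:
        for marker_id, quad in zip(ids.flatten(), corners):
            cx, cy = quad[0].mean(axis=0)  # marker centre in pixels
            # map (marker_id, cx, cy) to a musical parameter here
            print(marker_id, cx, cy)
    if cv2.waitKey(1) == 27:  # Esc to quit
        break
cap.release()
```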

    Sheet Music Unbound: A fluid approach to sheet music display and annotation on a multi-touch screen

    In this thesis we present the design and prototype implementation of a Digital Music Stand that focuses on fluid music layout management and free-form digital ink annotation. An analysis of user constraints and available technology led us to select a 21.5” multi-touch monitor as the preferred input and display device. This comfortably displays two A4 pages of music side by side with space for a control panel. The analysis also identified single-handed input as a viable choice for musicians. Finger input was chosen to avoid the need for any additional input equipment. To support layout reflow and zooming we developed a vector-based music representation built around the bar structure. This representation supports animated transitions in a way that gives responsive, dynamic interaction with multi-touch gesture input. In developing the prototype, particular attention was paid to the problem of drawing small, intricate annotations accurately located on the music using a fingertip. The zoomable nature of the music structure was leveraged to accomplish this, and an evaluation was carried out to establish the best level of magnification. The thesis demonstrates, in the context of music, that annotation and layout management (typically treated as two distinct tasks) can be integrated into a single task, yielding fluid and natural interaction.
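    The bar-structured representation is what makes reflow cheap: a layout pass only has to pack whole bars into lines. The greedy sketch below shows the idea; the function name and bar widths are illustrative, not from the thesis.

```python
from typing import List

def reflow_bars(bar_widths: List[float], line_width: float) -> List[List[int]]:
    """Greedy reflow: pack bar indices into lines without splitting a bar.

    Because layout is structured around bars, changing line_width
    (e.g. after a zoom gesture) just means re-running this pass.
    """
    lines, current, used = [], [], 0.0
    for i, w in enumerate(bar_widths):
        if current and used + w > line_width:
            lines.append(current)
            current, used = [], 0.0
        current.append(i)
        used += w
    if current:
        lines.append(current)
    return lines

# Two zoom levels give two layouts of the same bars
print(reflow_bars([90, 120, 80, 100, 110], line_width=300))  # [[0, 1, 2], [3, 4]]
print(reflow_bars([90, 120, 80, 100, 110], line_width=220))  # [[0, 1], [2, 3], [4]]
```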

    SymbolDesign: A User-centered Method to Design Pen-based Interfaces and Extend the Functionality of Pointer Input Devices

    A method called "SymbolDesign" is proposed that can be used to design user-centered interfaces for pen-based input devices. It can also extend the functionality of pointer input devices such as the traditional computer mouse or the Camera Mouse, a camera-based computer interface. Users can create their own interfaces by choosing single-stroke movement patterns that are convenient to draw with the selected input device and by mapping them to a desired set of commands. A pattern could be the trace of a moving finger detected with the Camera Mouse or a symbol drawn with an optical pen. The core of the SymbolDesign system is a dynamically created classifier, in the current implementation an artificial neural network. The architecture of the neural network automatically adjusts according to the complexity of the classification task. In experiments, subjects used the SymbolDesign method to design and test the interfaces they created, for example, to browse the web. The experiments demonstrated good recognition accuracy and responsiveness of the user interfaces. The method provided an easily designed and easily used computer input mechanism for people without physical limitations, and, with some modifications, has the potential to become a computer access tool for people with severe paralysis.
    National Science Foundation (IIS-0093367, IIS-0308213, IIS-0329009, EIA-0202067)
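    A rough sketch of the pipeline the abstract outlines: resample each single-stroke trace to a fixed-length feature vector, then train a classifier whose capacity grows with the size of the command set. scikit-learn's MLPClassifier stands in here for the paper's self-adjusting network, and the sizing rule is an invented stand-in for its architecture adaptation.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def resample_stroke(points: np.ndarray, n: int = 32) -> np.ndarray:
    """Resample an (m, 2) single-stroke trace to n evenly spaced points
    along its arc length, then flatten to a fixed-length feature vector."""
    d = np.concatenate([[0.0], np.cumsum(np.linalg.norm(np.diff(points, axis=0), axis=1))])
    t = np.linspace(0, d[-1], n)
    xs = np.interp(t, d, points[:, 0])
    ys = np.interp(t, d, points[:, 1])
    return np.stack([xs, ys], axis=1).ravel()

def make_classifier(n_classes: int) -> MLPClassifier:
    # Crude stand-in for the paper's self-adjusting architecture:
    # give the hidden layer more units as the command set grows.
    return MLPClassifier(hidden_layer_sizes=(8 * n_classes,), max_iter=2000)

# X: list of user-drawn strokes, y: the commands they were mapped to
# clf = make_classifier(len(set(y)))
# clf.fit(np.stack([resample_stroke(s) for s in X]), y)
```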

    Musical Gesture through the Human Computer Interface: An Investigation using Information Theory

    This study applies information theory to investigate human ability to communicate using continuous control sensors, with a particular focus on informing the design of digital musical instruments. There is an active practice of building and evaluating such instruments, for instance in the New Interfaces for Musical Expression (NIME) conference community. The fidelity of the instruments can depend on the included sensors, and although much anecdotal evidence and craft experience informs the use of these sensors, relatively little is known about the ability of humans to control them accurately. This dissertation addresses this issue and related concerns, including continuous control performance with increasing degrees of freedom, pursuit tracking in comparison with pointing, and the estimates that musical interface designers and researchers make of human performance with continuous control sensors. The methodology models the human-computer system as an information channel, applying concepts from information theory to performance data collected in studies of human subjects using sensing devices. These studies not only add to knowledge about human abilities, but also inform issues in musical mappings, ergonomics, and usability.
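    A standard way to operationalize the channel view is to estimate the mutual information between intended targets and produced responses from a confusion matrix. The sketch below shows that general information-theoretic computation; it is not necessarily the exact estimator used in the dissertation.

```python
import numpy as np

def mutual_information_bits(confusion: np.ndarray) -> float:
    """Mutual information I(X;Y) in bits from a count matrix whose rows
    are intended targets and columns are what the user/sensor produced."""
    p = confusion / confusion.sum()
    px = p.sum(axis=1, keepdims=True)   # marginal over intended targets
    py = p.sum(axis=0, keepdims=True)   # marginal over produced responses
    nz = p > 0                          # skip empty cells (0 log 0 = 0)
    return float(np.sum(p[nz] * np.log2(p[nz] / (px @ py)[nz])))

# Perfect 4-target control transmits log2(4) = 2 bits per selection
print(mutual_information_bits(np.eye(4) * 25))  # 2.0
```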

    Mobile EduFun School Application

    Modern handheld devices such as smartphones, tablets, and PDAs have become increasingly powerful and widespread. Breakthroughs in technology have enabled the automation of teaching processes, opening the door to a wide range of learning possibilities; most smartphones, for instance, now include cameras and processors comparable to the personal computers of only a few years ago. Mobile educational applications are therefore a natural channel for passing knowledge and information to the young generation. The objective of this project is to develop a Mobile EduFun School application for the Android platform that guides children step by step through basic numeracy, providing counting and arithmetic exercises that challenge and strengthen their ability to solve basic mathematical operations while understanding simple mathematics and applying mathematical skills in everyday life. The scope of the study emphasizes numerical and arithmetic operations for children aged 7 to 9, and development proceeds incrementally and iteratively, following modern mobile application architecture. The project is implemented in the Java programming language using the Eclipse Classic Integrated Development Environment (IDE) and the Android Software Development Kit (SDK), which includes the necessary libraries and custom tools for the Android platform. The first phase of the final-year project produced positive responses in preliminary surveys and a completed first prototype: a mock-up of the application's basic framework and interface.
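    The project itself targets Java on Android; purely as an illustration of the exercise logic described above, here is a Python sketch of an age-scaled question generator. The operand ranges are invented difficulty steps, not taken from the project.

```python
import random

def make_exercise(age: int) -> tuple[str, int]:
    """Generate one arithmetic question scaled to ages 7-9.
    Operand ranges are illustrative, not from the project."""
    top = {7: 10, 8: 20, 9: 50}.get(age, 10)
    a, b = random.randint(1, top), random.randint(1, top)
    if random.random() < 0.5:
        return f"{a} + {b} = ?", a + b
    hi, lo = max(a, b), min(a, b)       # keep answers non-negative
    return f"{hi} - {lo} = ?", hi - lo

question, answer = make_exercise(8)
print(question)  # e.g. "14 + 7 = ?"
```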

    Proceedings of the 3rd IUI Workshop on Interacting with Smart Objects

    These are the proceedings of the 3rd IUI Workshop on Interacting with Smart Objects. Objects that we use in everyday life are expanding their once-restricted interaction capabilities and offering functionality far beyond their original purpose. They feature computing capabilities that let them capture, process, and store information and interact with their environments, turning them into smart objects.

    Spartan Daily, September 10, 2008

    Volume 131, Issue 7
    https://scholarworks.sjsu.edu/spartandaily/10491/thumbnail.jp

    "Spindex" (speech index) enhances menu navigation user experience of touch screen devices in various input gestures: tapping, wheeling, and flicking

    In a large number of electronic devices, users interact with the system by navigating through various menus. Auditory menus can complement or even replace visual menus, so research on auditory menus has recently increased for mobile devices as well as desktop computers. Despite the potential importance of auditory displays on touch screen devices, little research has attempted to enhance the effectiveness of auditory menus for those devices. In the present study, I investigated how advanced auditory cues enhance auditory menu navigation on a touch screen smartphone, especially for new input gestures such as tapping, wheeling, and flicking for navigating a one-dimensional menu. Moreover, I examined whether advanced auditory cues improve user experience not only in visuals-off situations but also in visuals-on contexts. To this end, I used a novel auditory menu enhancement called a "spindex" (i.e., speech index), in which brief audio cues inform users of where they are in a long menu. In this study, each item in a menu was preceded by a sound based on the item's initial letter. One hundred twenty-two undergraduates navigated through an alphabetized list of 150 song titles. The study was a split-plot design that manipulated auditory cue type (text-to-speech (TTS) alone vs. TTS plus spindex), visual mode (on vs. off), and input gesture style (tapping, wheeling, and flicking). Target search time and subjective workload for TTS + spindex were lower than for TTS alone in all input gesture types, regardless of visual mode. Also, on subjective rating scales, participants rated the TTS + spindex condition higher than plain TTS on being 'effective' and 'functionally helpful'. The interaction between input methods and output modes (i.e., auditory cue types) and its effects on navigation behaviors were also analyzed based on the two-stage navigation strategy model used in auditory menus. Results are discussed in analogy with visual search theory and in terms of practical applications of spindex cues.
    M.S. Committee Chair: Bruce N. Walker; Committee Member: Frank Durso; Committee Member: Gregory M. Cors
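    The spindex idea reduces to prepending a brief initial-letter cue to each spoken menu item. A minimal sketch of that pairing follows; the cue filenames are hypothetical placeholders for pre-recorded letter sounds.

```python
def spindex_playlist(items):
    """Pair each alphabetized menu item with a brief cue derived from
    its initial letter, as in the study's TTS + spindex condition."""
    for item in sorted(items, key=str.lower):
        letter = item[0].upper()
        cue = f"spindex_{letter}.wav"   # hypothetical pre-recorded letter sound
        yield cue, item                  # play cue, then full TTS of the item

songs = ["Yesterday", "Yellow Submarine", "Across the Universe", "Blackbird"]
for cue, title in spindex_playlist(songs):
    print(cue, "->", title)
# spindex_A.wav -> Across the Universe
# spindex_B.wav -> Blackbird
# spindex_Y.wav -> Yesterday
# spindex_Y.wav -> Yellow Submarine
```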

    Haptic Media Scenes

    The aim of this thesis is to apply new media phenomenology and enactive embodied cognition approaches to explain the role of haptic sensitivity and communication in personal computer environments for productivity. Prior theory has given little attention to the role of the haptic senses in influencing cognitive processes, and does not frame the richness of haptic communication in interaction design: haptic interactivity in HCI has historically tended to be designed and analyzed from a perspective on communication as transmission, the sending and receiving of haptic signals. The haptic sense may mediate not only contact confirmation and affirmation but also rich semiotic and affective messages, yet there is a strong contrast between this inherent ability of haptic perception and present-day support for haptic communication interfaces. I therefore ask: How do the haptic senses (touch and proprioception) impact our cognitive faculties when mediated through digital and sensor technologies? How may these insights be employed in interface design to facilitate rich haptic communication? To answer these questions, I use theoretical close readings that embrace two research fields, new media phenomenology and enactive embodied cognition. The theoretical discussion is supported by neuroscientific evidence and tested empirically through case studies centered on digital art. I use these insights to develop the concept of the haptic figura, an analytical tool for framing the communicative qualities of haptic media. The concept gauges rich machine-mediated haptic interactivity and communication in systems with a material solution supporting active haptic perception and the mediation of semiotic and affective messages that are understood and felt. As such, the concept may function as a design tool for developers, but also for media critics evaluating haptic media. The tool is used to frame a discussion of the opportunities and shortcomings of haptic interfaces for productivity, differentiating between media systems for the hand and for the full body. The significance of this investigation lies in demonstrating that haptic communication is an underutilized element in personal computer environments for productivity, and in providing an analytical framework for a more nuanced understanding of haptic communication as enabling the mediation of a range of semiotic and affective messages beyond notification and confirmation interactivity.