2,390 research outputs found

    An intuitive model of perceptual grouping for HCI design

    Understanding and exploiting the abilities of the human visual system is an important part of the design of usable user interfaces and information visualizations. Good design enables quick, easy and veridical perception of key components of that design. An important facet of human vision is its ability to perform "perceptual organization" seemingly effortlessly: it transforms individual feature estimates into perception of coherent regions, structures, and objects. We perceive regions grouped by proximity and feature similarity, curves grouped by good continuation, and regions of coherent texture. In this paper, we discuss a simple model for a broad range of perceptual grouping phenomena. It takes as input an arbitrary image and returns a structure describing the predicted visual organization of the image. We demonstrate that this model can capture aspects of traditional design rules and predicts visual percepts in classic perceptual grouping displays.
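
    To make the grouping-by-proximity idea above concrete, the following is a minimal illustrative sketch, not the authors' model: point-like elements are grouped whenever a chain of pairwise distances stays below a threshold. The function name, threshold value, and sample points are assumptions made purely for illustration.

        # A toy illustration of grouping by proximity (not the paper's model):
        # single-link clustering of 2D points under a distance threshold.
        from itertools import combinations

        def group_by_proximity(points, threshold=1.5):
            """Group points whose chains of pairwise distances stay within threshold."""
            parent = list(range(len(points)))  # union-find over point indices

            def find(i):
                while parent[i] != i:
                    parent[i] = parent[parent[i]]
                    i = parent[i]
                return i

            for i, j in combinations(range(len(points)), 2):
                (x1, y1), (x2, y2) = points[i], points[j]
                if ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5 <= threshold:
                    parent[find(i)] = find(j)  # merge the two groups

            groups = {}
            for i, p in enumerate(points):
                groups.setdefault(find(i), []).append(p)
            return list(groups.values())

        # Two perceptual groups separated by a gap larger than the threshold.
        print(group_by_proximity([(0, 0), (1, 0), (1, 1), (5, 5), (6, 5)]))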

    Disciplining the body? Reflections on the cross disciplinary import of ‘embodied meaning’ into interaction design

    The aim of this paper is, above all, to critically examine and clarify some of the negative implications that the idea of ‘embodied meaning’ has for the emergent field of interaction design research. The term ‘embodied meaning’ was originally brought into HCI research from phenomenology and cognitive semantics in order to better understand how users’ experience of new technological systems relies to an increasing extent on full-body interaction. Embodied approaches to technology design can thus be found in Winograd & Flores (1986), Dourish (2001), Lund (2003), Klemmer, Hartmann & Takayama (2006), Hornecker & Buur (2006), and Hurtienne & Israel (2007), among others. However, fertile as this cross-disciplinary import may be, design research can generally be criticised for being ‘undisciplined’ because of its tendency merely to take over reductionist ideas of embodied meaning from those neighbouring disciplines without questioning the inherent limitations it thereby subscribes to. In this paper I focus on this reductionism and what it means for interaction design research. I start out by introducing the field of interaction design and two central research questions that it raises. This serves as a prerequisite for understanding the overall intention of bringing the notion of ‘embodied meaning’ from cognitive semantics into design research. Narrowing my account down to the concepts of ‘image schemas’ and their ‘metaphorical extension’, I then explain in more detail what is reductionist about the notion of embodied meaning. Having done so, I shed light on the consequences this reductionism might have for design research by examining a recently developed framework for intuitive user interaction along with two case examples. In so doing I sketch an alternative view of embodied meaning for interaction design research. Keywords: Interaction Design, Embodied Meaning, Tangible User Interaction, Design Theory, Cognitive Semiotics.

    The use of analytical models in human-computer interface design

    Some of the many analytical models in human-computer interface design that are currently being developed are described. The usefulness of analytical models for human-computer interface design is evaluated. Can the use of analytical models be recommended to interface designers? The answer, based on the empirical research summarized here, is: not at this time. There are too many unanswered questions concerning the validity of models and their ability to meet the practical needs of design organizations.

    Designing Affordances on Embedded Interfaces

    The purpose of this study was to examine the effects of a user interface on users' performance during an online shopping checkout task. Two interfaces were developed using the principles of Rasmussen's SRK model: a high-affordance and a low-affordance interface. Seventy undergraduate and graduate students performed a simulated online shopping task with the two interfaces. It was hypothesized that the high-affordance interface would require less time and fewer clicks to complete the shopping task than the low-affordance interface. In addition, it was predicted that participants would prefer the high-affordance interface. The findings revealed that participants spent more time on the task using the high-affordance interface, but the difference was not statistically significant. Participants made significantly fewer clicks using the high-affordance interface than they did using the low-affordance interface. Compared to the low-affordance interface, a significantly higher percentage of users reported that they would prefer the high-affordance interface. This is one of the first studies to examine the application of the SRK model to the design of consumer interfaces. Based on these results, the SRK model may be considered another conceptual tool for making interfaces easier to use and consumer experiences more satisfying and enjoyable.

    Human-Computer/Device Interaction

    Any interaction refers to communication between two or more entities (be they abstract/conceptual or physical). Successful interaction depends on the properties of each entity involved in the interaction as well as on the capabilities of the interacting entities. With the diversified use and application of computers and of specialized devices for specific tasks, such as biomechanical and biomedical devices, interaction design also needs to study the context of those tasks. Moreover, with the inclusion of embedded systems and smart devices, computer architecture needs to consider the opportunities these present rather than focusing only on hardware performance. In particular, HCI can be improved because current technologies make it possible to build smart interaction in which the user interacts with devices implicitly and in a less obtrusive way. In light of this, the design and architecture of an engineered product need to strive to make the product usable and used while making it useful to the user. This can be achieved if interaction design is guided by scrutinizing the user model with respect to the usability attributes, in view of the context of its task as well as the platform capabilities and constraints, as discussed in this chapter.

    Understanding interaction mechanics in touchless target selection

    Indiana University-Purdue University Indianapolis (IUPUI)
    We use gestures frequently in daily life: to interact with people, pets, or objects. But interacting with computers using mid-air gestures continues to challenge the design of touchless systems. Traditional approaches to touchless interaction focus on exploring gesture inputs and evaluating user interfaces. I shift the focus from gesture elicitation and interface evaluation to touchless interaction mechanics. I argue for a novel approach to generating design guidelines for touchless systems: to use fundamental interaction principles instead of a reactive adaptation to the sensing technology. In five sets of experiments, I explore visual and pseudo-haptic feedback, motor intuitiveness, handedness, and perceptual Gestalt effects. In particular, I study the interaction mechanics of touchless target selection. To that end, I introduce two novel interaction techniques: touchless circular menus, which allow command selection using directional strokes, and interface topographies, which use pseudo-haptic feedback to guide steering–targeting tasks. Results illuminate different facets of touchless interaction mechanics. For example, motor-intuitive touchless interactions explain how our sensorimotor abilities inform touchless interface affordances: we often make a holistic oblique gesture instead of several orthogonal hand gestures while reaching toward a distant display. Following the Gestalt theory of visual perception, we found that similarity between user interface (UI) components decreased user accuracy, while good continuity made users faster. Other findings include hemispheric asymmetry affecting transfer of training between dominant and nondominant hands, and pseudo-haptic feedback improving touchless accuracy. The results of this dissertation contribute design guidelines for future touchless systems. Practical applications of this work include the use of touchless interaction techniques in various domains such as entertainment, consumer appliances, surgery, patient-centric health settings, smart cities, interactive visualization, and collaboration.
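
    As an illustration of how a directional stroke could be mapped to a command in a circular menu, here is a minimal sketch; it is not the dissertation's implementation, and the command labels, sector count, and coordinates are hypothetical.

        # Illustrative sketch: map a mid-air directional stroke to one of N
        # sectors of a touchless circular menu. Not the dissertation's
        # implementation; the command labels are hypothetical.
        import math

        COMMANDS = ["copy", "paste", "delete", "undo",
                    "zoom", "rotate", "save", "open"]  # hypothetical commands

        def select_command(start, end, commands=COMMANDS):
            """Map a stroke from start to end (screen coordinates) to a sector."""
            dx, dy = end[0] - start[0], end[1] - start[1]
            angle = math.atan2(-dy, dx) % (2 * math.pi)   # screen y grows downward
            sector_width = 2 * math.pi / len(commands)
            sector = int((angle + sector_width / 2) // sector_width) % len(commands)
            return commands[sector]

        # A stroke up and to the right lands in the sector centred at 45 degrees.
        print(select_command((100, 100), (170, 30)))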

    Perceptually relevant browsing environments for large texture databases

    This thesis describes the development of a large database of texture stimuli, the production of a similarity matrix reflecting human judgements of similarity about the database, and the development of three browsing models that exploit structure in the perceptual information for navigation. Rigorous psychophysical comparison experiments are carried out, and the SOM (Self-Organising Map) is found to be the fastest of the three browsing models under examination. We investigate scalable methods of augmenting a similarity matrix using the SOM browsing environment to introduce previously unknown textures. Further psychophysical experiments reveal that our method produces a data organisation that is as fast to navigate as that derived from the perceptual grouping experiments.
    Engineering and Physical Sciences Research Council (EPSRC)
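
    The general idea behind an SOM-based browsing layout is to arrange items on a 2D grid so that perceptually similar items land in nearby cells. The sketch below is a generic Self-Organising Map, not the thesis's system; the feature vectors, grid size, and training parameters are illustrative assumptions.

        # Generic Self-Organising Map sketch (not the thesis's system): place
        # items with similar feature vectors into nearby cells of a 2D grid.
        import numpy as np

        def train_som(features, grid=(8, 8), epochs=100, lr0=0.5, sigma0=3.0, seed=0):
            """Train a small SOM; returns the grid of weight vectors."""
            rng = np.random.default_rng(seed)
            h, w = grid
            weights = rng.random((h, w, features.shape[1]))
            coords = np.dstack(np.meshgrid(np.arange(h), np.arange(w), indexing="ij"))
            for t in range(epochs):
                lr = lr0 * (1 - t / epochs)              # decaying learning rate
                sigma = sigma0 * (1 - t / epochs) + 0.5  # shrinking neighbourhood
                for x in features[rng.permutation(len(features))]:
                    # Best-matching unit: the cell whose weight is closest to x.
                    d = np.linalg.norm(weights - x, axis=2)
                    bmu = np.array(np.unravel_index(np.argmin(d), (h, w)))
                    # Pull the BMU and its grid neighbours toward x.
                    grid_dist = np.linalg.norm(coords - bmu, axis=2)
                    influence = np.exp(-(grid_dist ** 2) / (2 * sigma ** 2))
                    weights += lr * influence[..., None] * (x - weights)
            return weights

        # Assign each (hypothetical) texture feature vector to its nearest cell.
        features = np.random.default_rng(1).random((50, 16))
        som = train_som(features)
        cells = [np.unravel_index(np.argmin(np.linalg.norm(som - f, axis=2)), som.shape[:2])
                 for f in features]
        print(cells[:5])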

    The use of analytical models in human-computer interface design

    Recently, a large number of human-computer interface (HCI) researchers have investigated building analytical models of the user, which are often implemented as computer models. These models simulate the cognitive processes and task knowledge of the user in ways that allow a researcher or designer to estimate various aspects of an interface's usability, such as when user errors are likely to occur. This information can lead to design improvements. Analytical models can supplement design guidelines by providing designers rigorous ways of analyzing the information-processing requirements of specific tasks (i.e., task analysis). These models offer the potential of improving early designs and replacing some of the early phases of usability testing, thus reducing the cost of interface design. This paper describes some of the many analytical models that are currently being developed and evaluates the usefulness of analytical models for human-computer interface design. This paper will focus on computational, analytical models, such as the GOMS model, rather than less formal, verbal models, because the more exact predictions and task descriptions of computational models may be useful to designers. The paper also discusses some of the practical requirements for using analytical models in complex design organizations such as NASA.
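
    As an example of the kind of prediction an analytical model in the GOMS family can produce, the sketch below uses a Keystroke-Level Model (KLM) style calculation: execution time is estimated as a sum of operator times. The operator durations are the commonly cited textbook estimates, and the example task sequence is hypothetical.

        # Keystroke-Level Model (KLM) style sketch: predict execution time as a
        # sum of operator times. Operator durations are the commonly cited
        # textbook estimates; the example task sequence is hypothetical.
        OPERATOR_TIMES = {
            "K": 0.20,  # keystroke or button press (skilled typist)
            "P": 1.10,  # point at a target with a mouse
            "H": 0.40,  # home hands between keyboard and mouse
            "M": 1.35,  # mental preparation
        }

        def klm_estimate(sequence):
            """Estimate execution time in seconds for a list of KLM operators."""
            return sum(OPERATOR_TIMES[op] for op in sequence)

        # Hypothetical task: think, point at a field, click, home to keyboard, type "42".
        task = ["M", "P", "K", "H", "K", "K"]
        print(f"Predicted time: {klm_estimate(task):.2f} s")  # 3.45 s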