    Integrating perceptual and cognitive modeling for adaptive and intelligent human-computer interaction

    This paper describes technology and tools for intelligent human-computer interaction (IHCI), in which human cognitive, perceptual, motor, and affective factors are modeled and used to adapt the H-C interface. IHCI emphasizes that human behavior encompasses both apparent human behavior and the hidden mental state behind behavioral performance. IHCI expands on the interpretation of human activities known as W4 (what, where, when, who). While W4 addresses only the apparent perceptual aspect of human behavior, the W5+ technology for IHCI described in this paper also addresses the why and how questions, whose solution requires recognizing specific cognitive states. IHCI integrates parsing and interpretation of nonverbal information with a computational cognitive model of the user, which, in turn, feeds into processes that adapt the interface to enhance operator performance and support rational decision-making. The proposed technology is based on a general four-stage interactive framework, which moves from parsing the raw sensory-motor input, to interpreting the user's motions and emotions, to building an understanding of the user's current cognitive state. It then diagnoses problems in the situation and adapts the interface appropriately. The interactive component of the system improves processing at each stage. Examples of perceptual, behavioral, and cognitive tools are described throughout the paper. Adaptive and intelligent HCI are important for novel applications of computing, including ubiquitous and human-centered computing. © 2002 IEEE
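    The four-stage framework the abstract describes (parse sensory-motor input → interpret motions and emotions → infer the user's cognitive state → diagnose and adapt the interface) can be sketched as a simple pipeline. This is a minimal illustrative sketch, not the paper's implementation: all names (`UserModel`, `run_pipeline`, the event fields, the "overloaded" state, and the `simplify_display` adaptation) are assumptions introduced here.

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class UserModel:
        """Accumulated interpretation of the user across the four stages."""
        motions: list = field(default_factory=list)       # parsed motor events
        emotions: list = field(default_factory=list)      # parsed affective events
        interpretations: list = field(default_factory=list)
        cognitive_state: str = "unknown"
        adaptations: list = field(default_factory=list)

    def parse_sensorimotor(raw_events, model):
        # Stage 1: parse raw sensory-motor input into discrete observations.
        model.motions = [e for e in raw_events if e["channel"] == "motor"]
        model.emotions = [e for e in raw_events if e["channel"] == "affect"]

    def interpret_behavior(model):
        # Stage 2: interpret the user's motions and emotions (trivial rules here).
        if len(model.motions) > 3:
            model.interpretations.append("rapid_activity")
        for e in model.emotions:
            model.interpretations.append(f"affect:{e['value']}")

    def infer_cognitive_state(model):
        # Stage 3: build an understanding of the user's current cognitive state.
        if "affect:frustration" in model.interpretations:
            model.cognitive_state = "overloaded"
        else:
            model.cognitive_state = "nominal"

    def adapt_interface(model):
        # Stage 4: diagnose problems and adapt the interface appropriately.
        if model.cognitive_state == "overloaded":
            model.adaptations.append("simplify_display")

    def run_pipeline(raw_events):
        # Run the four stages in order; each stage enriches the shared model.
        model = UserModel()
        parse_sensorimotor(raw_events, model)
        interpret_behavior(model)
        infer_cognitive_state(model)
        adapt_interface(model)
        return model
    ```

    In the paper's framework each stage also feeds back interactively to improve earlier stages; this one-way sketch omits that loop for brevity.
    
    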