
    Intelligent interaction in diagnostic expert systems

    Abstract: Advisory systems help to improve quality in manufacturing. Such systems, however, both human and computerized, are less than perfect and frequently unwelcome. A sharp separation between working and learning modes is the main reason for the apparent hostility toward advisory systems. Intelligent interaction deploys computerized advisory capabilities by merging the working and learning modes. We have developed a knowledge-based interactive graphic interface to a circuit pack diagnostic expert system. The graphic interface integrates both the domain knowledge (i.e. the circuit pack) and the troubleshooting knowledge (i.e. diagnostic trees). Our interface dynamically changes the amount of detail presented to the user as well as the input choices the user is allowed to make. These changes are driven by knowledge-based models of the user and of the circuit pack troubleshooting domain. The resulting system, McR, instead of guiding the user by querying for input, monitors the user's actions, analyzes them, and offers help when needed. McR is able both to advise "how to do it" by reifying shallow knowledge from the deep knowledge, and to explain intelligently "how it works" by abstracting deep knowledge from the shallow knowledge. McR is used in conjunction with the STAREX expert system, which is installed at an AT&T factory.
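    As a concrete illustration of the monitor-and-advise loop described in the abstract, here is a minimal Python sketch. It is an assumption of how such a loop could look, not McR's actual implementation: it posits a diagnostic tree of troubleshooting steps and a simple user model whose skill estimate controls how much detail the advice carries. All names (DiagnosticNode, UserModel, monitor_action) are hypothetical.

```python
# Hypothetical sketch of a monitor-and-advise loop in the style the
# abstract describes; all names and structures are invented here.
from dataclasses import dataclass, field

@dataclass
class DiagnosticNode:
    """One step in a troubleshooting (diagnostic) tree."""
    test: str                     # action the expert system expects next
    detail: str                   # deep-knowledge rationale for the step
    children: dict = field(default_factory=dict)  # observed result -> next node

@dataclass
class UserModel:
    """Crude knowledge-based user model: skill rises with expected actions."""
    skill: float = 0.5

    def update(self, action_was_expected: bool) -> None:
        self.skill += 0.1 if action_was_expected else -0.1
        self.skill = min(1.0, max(0.0, self.skill))

def monitor_action(node: DiagnosticNode, user: UserModel, action: str) -> str:
    """Instead of querying the user, watch the action and advise if needed."""
    if action == node.test:
        user.update(True)
        return ""                 # user is on track: stay silent
    user.update(False)
    if user.skill < 0.4:
        # novice: "how to do it" -- concrete advice plus deep rationale
        return f"Try '{node.test}' next. Why: {node.detail}"
    # experienced user: brief "how it works" style nudge only
    return f"Expected step here is '{node.test}'."

# Example: the user probes the power rail when the tree expects a clock test.
node = DiagnosticNode(test="check clock signal",
                      detail="a dead clock makes every downstream test fail")
print(monitor_action(node, UserModel(skill=0.3), "probe power rail"))
```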

    A tangible programming environment model informed by principles of perception and meaning

    It is a fundamental Human-Computer Interaction problem to design a tangible programming environment that can be used by multiple persons and yet be individualised. This problem originates in the phenomenon that the meaning an object holds can vary across individuals; the research domain of Semiotics studies the meanings objects hold. This research investigated a solution in which the user designs aspects of the environment after it has been made operational, when the development team is no longer available to implement the user's design requirements. Also considered is how objects can be positioned so that the collection of objects is interpreted as a program. I therefore explored how some principles of the relative positioning of objects, as researched in the domains of Psychology and Art, could be applied to tangible programming environments. This study applied the Gestalt principle of perceptual grouping by proximity to the design of tangible programming environments, to determine whether a tangible programming environment is possible in which the relative positions of personally meaningful objects define the program. I did this by applying the Design Science Research methodology, with five iterations and evaluations involving children. The outcome is a model of a Tangible Programming Environment that incorporates Gestalt principles and Semiotic theory: Semiotic theory explains that the user can choose a physical representation of a program element that carries personal meaning, whereas the Gestalt principle of grouping by proximity predicts that objects can be arranged to appear as if linked to each other.
    School of Computing
    Ph.D. (Computer Science)
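    To make grouping by proximity concrete, the following is a minimal Python sketch, my own assumption rather than the implementation from the thesis: tangible objects are reduced to 2D positions, and single-linkage clustering with a distance threshold groups objects that appear linked, so each cluster can be read as one program unit. The threshold value and the name group_by_proximity are illustrative.

```python
# Minimal, hypothetical sketch: group tangible objects by proximity
# (Gestalt grouping) so each cluster reads as one program unit.
from math import dist

def group_by_proximity(positions, threshold=5.0):
    """Single-linkage clustering: objects closer than `threshold`
    (in the same arbitrary units as the positions) end up together."""
    parent = list(range(len(positions)))

    def find(i):                          # union-find with path halving
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for i in range(len(positions)):
        for j in range(i + 1, len(positions)):
            if dist(positions[i], positions[j]) <= threshold:
                parent[find(i)] = find(j)  # merge the two groups

    groups = {}
    for i in range(len(positions)):
        groups.setdefault(find(i), []).append(i)
    return list(groups.values())

# Example: three objects on a table; the two left-hand ones are perceived
# as one group (one program unit), the far one stands alone.
print(group_by_proximity([(0, 0), (3, 0), (20, 1)]))  # -> [[0, 1], [2]]
```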