    The Ubiquitous Interactor - Device Independent Access to Mobile Services

    The Ubiquitous Interactor (UBI) addresses the problems of design and development that arise around services that need to be accessed from many different devices. In UBI, the same service can present itself with different user interfaces on different devices. This is done by separating interaction between users and services from presentation. The interaction is kept the same for all devices, while different presentation information is provided for different devices. This way, tailored user interfaces for many different devices can be created without multiplying development and maintenance work. In this paper we describe the system design of UBI, the system implementation, and two services implemented for the system: a calendar service and a stockbroker service.
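
    To make the separation of interaction from presentation concrete, the sketch below illustrates the general idea in Python. It is not the UBI API: the InteractionAct class, the presentation table, and the render function are hypothetical stand-ins for how the same abstract interaction could be paired with device-specific presentation information.

```python
# Hypothetical sketch (not the actual UBI API): a service emits abstract
# interaction acts, and a per-device renderer pairs them with presentation
# information to produce a concrete user interface.

from dataclasses import dataclass

@dataclass
class InteractionAct:
    """Device-independent description of one user-service exchange."""
    kind: str      # e.g. "select", "input", "output"
    name: str      # identifier the service uses for this act
    data: object   # payload, e.g. a list of choices or a prompt

# Presentation information is kept separately, per device profile.
PRESENTATION = {
    "desktop": {"select": "dropdown", "input": "text_field", "output": "label"},
    "phone":   {"select": "picker",   "input": "keypad",     "output": "toast"},
}

def render(act: InteractionAct, device: str) -> str:
    """Map the same interaction act to a device-specific widget."""
    widget = PRESENTATION[device][act.kind]
    return f"<{widget} name='{act.name}'>{act.data}</{widget}>"

# A calendar-like service would emit the same acts for every device;
# only the presentation table differs.
appointment = InteractionAct("input", "new_appointment", "Enter appointment")
print(render(appointment, "desktop"))  # rendered as a text_field widget
print(render(appointment, "phone"))    # rendered as a keypad widget
```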

    Generic framework for the personal omni-remote controller using M2MI

    A Generic Framework for the Personal Omni-Remote Controller Using M2MI is a master's thesis outlining a generic framework for a wireless omni-remote controller that controls neighboring appliances by using Many-to-Many Invocation (M2MI). M2MI is an object-oriented abstraction of broadcast communication. First, this paper introduces the history of remote controllers and analyzes omni-remote controller projects by other researchers in this area, such as the Pebbles PDA project at Carnegie Mellon University and HP's COOLTOWN project. Second, this paper describes a generic framework for a personal omni-remote controller system, including its architecture, type hierarchy, and service discovery. In this framework, a modular approach and a decentralized dual-mode service discovery scheme are introduced. When users request a certain type of service, their omni-remote controller application first discovers the available appliances in the vicinity and then brings up the corresponding control module for the target appliance, so users can control the appliance through the user interface of that control module. To join the omni-remote controller system, servers and clients need to follow the system's type hierarchy convention. Finally, several implementations are given to show the control of different appliances with different capabilities, including thermostats, TVs with parental control, and washing machines.
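
    The following sketch illustrates the type-hierarchy and discovery idea in Python. It is not the thesis code and does not use M2MI; the Appliance/TV/Thermostat classes, the NEARBY registry, and the discover function are hypothetical, with a plain local lookup standing in for M2MI's broadcast invocation.

```python
# Hypothetical sketch (not the thesis code): appliances advertise themselves
# under a shared type hierarchy, and a controller discovers nearby services
# of a requested type before loading the matching control module.

class Appliance:                       # root of the type hierarchy convention
    def describe(self) -> str: ...

class TV(Appliance):
    def set_channel(self, channel: int) -> None: ...

class Thermostat(Appliance):
    def set_temperature(self, celsius: float) -> None: ...

class LivingRoomTV(TV):
    def __init__(self):
        self.channel = 1
    def describe(self):
        return "Living-room TV"
    def set_channel(self, channel):
        self.channel = channel

# Stand-in for M2MI's broadcast invocation: discovery is simulated here
# by filtering a local registry of appliances in the vicinity.
NEARBY = [LivingRoomTV()]

def discover(service_type):
    """Return all nearby appliances implementing the requested type."""
    return [a for a in NEARBY if isinstance(a, service_type)]

tvs = discover(TV)
if tvs:
    tvs[0].set_channel(7)   # the controller would bring up a TV control module
    print(tvs[0].describe(), "-> channel", tvs[0].channel)
```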

    Data-Centric Collaboration for Wired and Wireless Platforms

    With the proliferation of mobile computing devices, there is an increasing demand for applications supporting collaboration among users working in the field and in the office. A key component of collaboration in this domain is the sharing and manipulation of information across very different devices and communication channels. We propose a novel, data-centric collaboration paradigm in which each user can obtain a subset of the shared data, and the data may be visualized differently for different users. The amount of data and the visualization technique reflect the user's interests and/or computing and communication capabilities. The users collaborate on and exchange data, and the data is dynamically transformed to adapt to the particular computing/network platform. The resulting design is simple yet very powerful and scalable. It has been implemented and tested by developing several complex groupware applications.
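
    A minimal sketch of the data-centric idea, assuming hypothetical user profiles and data items (none of these names come from the paper): each user receives only the subset of shared data matching their interests and capabilities, and each item is transformed for that user's platform.

```python
# Hypothetical sketch (not the paper's system): shared data is filtered to a
# per-user subset and transformed to match that user's device capabilities.

SHARED_DATA = [
    {"id": 1, "topic": "survey", "detail": "full field notes ...", "size_kb": 900},
    {"id": 2, "topic": "map",    "detail": "vector layers ...",    "size_kb": 4500},
]

USER_PROFILES = {
    "field_pda":      {"topics": {"survey"},        "max_kb": 1_000,  "view": "text_summary"},
    "office_desktop": {"topics": {"survey", "map"}, "max_kb": 10_000, "view": "rich_map"},
}

def subset_for(user: str):
    """Select only the items this user is interested in and can handle."""
    p = USER_PROFILES[user]
    return [d for d in SHARED_DATA
            if d["topic"] in p["topics"] and d["size_kb"] <= p["max_kb"]]

def transform(item: dict, user: str) -> str:
    """Adapt the visualization to the user's computing/network platform."""
    view = USER_PROFILES[user]["view"]
    return f"[{view}] {item['topic']}: {item['detail'][:20]}..."

for user in USER_PROFILES:
    print(user, [transform(d, user) for d in subset_for(user)])
```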

    Dynamic graphical user interface generation for web-based public display applications

    Public digital displays are moving towards open display networks, resulting in a shift in focus from single-purpose public displays, developed with a single task or application in mind, to general-purpose displays that can run several applications developed by different vendors. In this new paradigm, it is important to facilitate the development of interactive public display applications and provide programmers with toolkits for incorporating interaction features. An important function of such toolkits is to support interaction with public displays through a user's smartphone, allowing users to discover and interact with the public display applications configured on a given display. This paper describes our approach to providing dynamically generated graphical user interfaces for public display applications, which is part of the PuReWidgets toolkit.
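
    The sketch below illustrates the general idea of dynamically generating a web user interface from abstract widget descriptions. It is not the PuReWidgets API; the widget records and the generate_html function are hypothetical placeholders for a toolkit that serves such a page to the user's smartphone.

```python
# Hypothetical sketch (not the PuReWidgets API): an application declares
# abstract widgets, and a web UI is generated on demand for the user's
# smartphone browser.

WIDGETS = [
    {"type": "button", "id": "next_slide", "label": "Next"},
    {"type": "list",   "id": "poll",       "label": "Vote", "options": ["A", "B", "C"]},
]

def generate_html(widgets) -> str:
    """Render a minimal HTML form from the abstract widget descriptions."""
    parts = ["<form action='/interact' method='post'>"]
    for w in widgets:
        if w["type"] == "button":
            parts.append(f"<button name='{w['id']}'>{w['label']}</button>")
        elif w["type"] == "list":
            opts = "".join(f"<option>{o}</option>" for o in w["options"])
            parts.append(
                f"<label>{w['label']}<select name='{w['id']}'>{opts}</select></label>"
            )
    parts.append("</form>")
    return "\n".join(parts)

print(generate_html(WIDGETS))
```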

    Generating speech user interfaces from interaction acts

    We have applied interaction acts, an abstract user-service interaction specification, to speech user interfaces to investigate how well the approach lends itself to a new type of user interface. We used interaction acts to generate a VoiceXML-based speech user interface and identified two main issues connected to the differences between graphical user interfaces and speech user interfaces. The first issue concerns the structure of the user interface: generating speech user interfaces and GUIs from the same underlying structure easily results in a speech user interface that is too hierarchical and difficult to use. The second issue is user input: interpreting spoken user input is fundamentally different from handling user input in GUIs. We have shown that it is possible to generate speech user interfaces based on interaction acts, and a small user study supports this result. We discuss these issues, some possible solutions, and some results from preliminary user studies.
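
    A small sketch of the structural issue and one possible mitigation, assuming a hypothetical interaction-act hierarchy (the names and the flattening step are illustrative, not the paper's generator): the GUI-oriented hierarchy is flattened into a single spoken menu before VoiceXML-style markup is emitted.

```python
# Hypothetical sketch (not the paper's generator): a hierarchy of interaction
# acts is flattened before being turned into VoiceXML-style markup, since a
# deep menu tree that works as a GUI becomes hard to navigate by voice.

INTERACTION_ACTS = {
    "calendar": {
        "appointments": ["add appointment", "delete appointment"],
        "settings": ["change reminder"],
    }
}

def flatten(acts: dict) -> list[str]:
    """Collapse the GUI-oriented hierarchy into one flat list of spoken choices."""
    choices = []
    for group in acts.values():
        for options in group.values():
            choices.extend(options)
    return choices

def to_voicexml(choices: list[str]) -> str:
    """Emit a single VoiceXML-style menu offering every choice at the top level."""
    items = "\n".join(
        f"    <choice next='#{c.replace(' ', '_')}'>{c}</choice>" for c in choices
    )
    return (
        "<vxml version='2.1'>\n"
        "  <menu>\n"
        f"    <prompt>Say one of: {', '.join(choices)}</prompt>\n"
        f"{items}\n"
        "  </menu>\n"
        "</vxml>"
    )

print(to_voicexml(flatten(INTERACTION_ACTS)))
```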

    Automatically Generating Personalized User Interfaces with SUPPLE

    Today's computer–human interfaces are typically designed with the assumption that they are going to be used by an able-bodied person, who is using a typical set of input and output devices, who has typical perceptual and cognitive abilities, and who is sitting in a stable, warm environment. Any deviation from these assumptions may drastically hamper the person's effectiveness: not because of any inherent barrier to interaction, but because of a mismatch between the person's effective abilities and the assumptions underlying the interface design. We argue that automatic personalized interface generation is a feasible and scalable solution to this challenge. We present our Supple system, which can automatically generate interfaces adapted to a person's devices, tasks, preferences, and abilities. In this paper we formally define interface generation as an optimization problem and demonstrate that, despite a large solution space (of up to 10^17 possible interfaces), the problem is computationally feasible. In fact, for a particular class of cost functions, Supple produces exact solutions in under a second for most cases, and in a little over a minute in the worst case encountered, thus enabling run-time generation of user interfaces. We further show how several different design criteria can be expressed in the cost function, enabling different kinds of personalization. We also demonstrate how this approach enables extensive user- and system-initiated run-time adaptations to the interfaces after they have been generated. Supple is not intended to replace human user interface designers; instead, it offers alternative user interfaces for those people whose devices, tasks, preferences, and abilities are not sufficiently addressed by hand-crafted designs. Indeed, the results of our study show that, compared to manufacturers' defaults, interfaces automatically generated by Supple significantly improve the speed, accuracy, and satisfaction of people with motor impairments.
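
    To make the optimization framing concrete, here is a toy sketch that is not SUPPLE itself: it brute-forces all widget assignments and keeps the cheapest one that fits a screen budget. The candidate widgets, sizes, and costs are invented for illustration; the actual system reports exact solutions for spaces of up to 10^17 interfaces in under a second for most cases, so its search is far more efficient than this.

```python
# Hypothetical sketch (not SUPPLE): interface generation cast as optimization,
# i.e. choosing one widget per UI element so that total cost is minimal while
# the interface still fits the device's screen budget.

from itertools import product

# Candidate widgets per abstract element: (widget name, size units, cost).
# The cost could encode, for example, expected manipulation time for a user.
CANDIDATES = {
    "volume":  [("slider", 4, 1.0), ("spinner", 2, 2.5)],
    "channel": [("list", 6, 1.2), ("spinner", 2, 2.0)],
    "power":   [("button", 1, 0.5)],
}
SCREEN_BUDGET = 8  # size units available on the target device

def generate(candidates, budget):
    """Exhaustively search widget assignments; keep the cheapest feasible one."""
    best, best_cost = None, float("inf")
    for combo in product(*candidates.values()):
        size = sum(w[1] for w in combo)
        cost = sum(w[2] for w in combo)
        if size <= budget and cost < best_cost:
            best = dict(zip(candidates, (w[0] for w in combo)))
            best_cost = cost
    return best, best_cost

print(generate(CANDIDATES, SCREEN_BUDGET))
```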

    Shared Substance: Developing Flexible Multi-Surface Applications

    This paper presents a novel middleware for developing flexible interactive multi-surface applications. Using a scenario-based approach, we identify the requirements for this type of application. We then introduce Substance, a data-oriented framework that decouples functionality from data, and Shared Substance, a middleware implemented in Substance that provides powerful sharing abstractions. We describe our implementation of two applications with Shared Substance and discuss the insights gained from these experiments. Our finding is that the combination of a data-oriented programming model with middleware support for sharing data and functionality provides a flexible, robust solution with low viscosity at both design time and run time.
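
    A rough sketch of the data-oriented idea, with hypothetical names (Node, attach, share) that are not the Shared Substance API: data lives in plain tree nodes, functionality is attached separately as facets, and a subtree can be replicated to another surface as a naive stand-in for the sharing abstractions.

```python
# Hypothetical sketch (not the Shared Substance middleware): a data-oriented
# model where nodes hold plain data, functionality is attached separately as
# "facets", and a subtree can be shared with another surface/process.

class Node:
    """A named tree node holding data; behaviour lives outside the node."""
    def __init__(self, name, **data):
        self.name, self.data = name, data
        self.children, self.facets = [], []

    def add(self, child):
        self.children.append(child)
        return child

def attach(node, facet):
    """Attach functionality (a facet) to a data node without subclassing it."""
    node.facets.append(facet)

def share(node, remote_scene):
    """Naive stand-in for the sharing abstraction: replicate a subtree elsewhere."""
    copy = remote_scene.add(Node(node.name, **node.data))
    for child in node.children:
        share(child, copy)
    return copy

local = Node("wall_display")
slide = local.add(Node("slide", title="Results"))
attach(slide, lambda n: print("render", n.data["title"]))  # rendering facet

remote = Node("tabletop")
share(slide, remote)   # the slide's data is now visible on another surface
print([c.name for c in remote.children])
```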

    Interactive Systems: Design, Specification, and Verification
