    Feel My Pain: Design and Evaluation of Painpad, a Tangible Device for Supporting Inpatient Self-Logging of Pain

    Monitoring patients' pain is a critical issue for clinical caregivers, particularly among staff responsible for providing analgesic relief. However, collecting regularly scheduled pain readings from patients can be difficult and time-consuming for clinicians. In this paper we present Painpad, a tangible device developed to allow patients to self-log their pain. We report findings from two hospital-based field studies in which Painpad was deployed to a total of 78 inpatients recovering from ambulatory surgery. We find that Painpad improves the frequency of, and compliance with, pain logging, and that self-logged scores may be more faithful to patients' experienced pain than the corresponding scores reported to nurses. We also show that older adults may prefer tangible interfaces over tablet-based alternatives for reporting their pain, and we contribute design lessons for pain-logging devices intended for use in hospital settings.

    Modelling and correcting for the impact of the gait cycle on touch screen typing accuracy

    Walking and typing on a smartphone is an extremely common interaction. Previous research has shown that error rates are higher when walking than when stationary. In this paper we analyse the acceleration data logged in an experiment in which users typed whilst walking, and extract the gait phase angle. We find statistically significant relationships between tapping time, error rate and gait phase angle. We then use the gait phase as an additional input to an offset model, and show that this allows more accurate touch interaction for walking users than a model that considers only the recorded tap position.
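
    As a rough illustration of the kind of pipeline this abstract describes, the sketch below estimates a gait phase angle from the vertical accelerometer channel and feeds it, together with the recorded tap position, into a simple linear offset model. The filtering band, the Hilbert-transform phase estimate and the choice of regressor are assumptions made for the example, not the authors' actual model.

```python
# Minimal sketch (not the paper's model): estimate gait phase from vertical
# acceleration and use it, alongside the recorded tap position, in a linear
# offset model that predicts the intended tap location.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert
from sklearn.linear_model import LinearRegression

def gait_phase(accel_z, fs):
    """Instantaneous gait phase angle (radians) from vertical acceleration."""
    # Band-pass around typical step frequencies (assumed 0.5-3 Hz).
    b, a = butter(2, [0.5 / (fs / 2), 3.0 / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, accel_z - accel_z.mean())
    return np.angle(hilbert(filtered))  # phase in [-pi, pi]

def fit_offset_model(tap_xy, phase_at_tap, target_xy):
    """Fit a model mapping (tap position, gait phase) -> intended position.

    Encoding the phase as sin/cos avoids the discontinuity at +/-pi.
    """
    features = np.column_stack([tap_xy,
                                np.sin(phase_at_tap),
                                np.cos(phase_at_tap)])
    return LinearRegression().fit(features, target_xy)

def correct_tap(model, tap_xy, phase):
    """Apply the fitted model to a single recorded tap."""
    features = np.column_stack([np.atleast_2d(tap_xy),
                                [[np.sin(phase), np.cos(phase)]]])
    return model.predict(features)[0]
```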

    Design and User Satisfaction of Interactive Maps for Visually Impaired People

    Multimodal interactive maps are a solution for presenting spatial information to visually impaired people. In this paper, we present an interactive multimodal map prototype based on a tactile paper map, a multi-touch screen and audio output. We first describe the different steps for designing an interactive map: drawing and printing the tactile paper map, the choice of multi-touch technology, the interaction techniques and the software architecture. We then describe the method used to assess user satisfaction. We provide data showing that an interactive map - although based on a single, elementary double-tap interaction - has been met with a high level of user satisfaction. Interestingly, satisfaction is independent of a user's age, previous visual experience or Braille experience. This prototype will be used as a platform to design advanced interactions for spatial learning.
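
    The core interaction mentioned above is a double tap that triggers an audio description of the touched map element. The following sketch is illustrative only: the region list, thresholds and speak() placeholder are assumptions for the example, not the prototype's actual software.

```python
# Minimal sketch: detect a double tap on a touch surface and announce the
# map element under the finger via audio output.
import time

DOUBLE_TAP_WINDOW = 0.4   # assumed max seconds between the two taps
DOUBLE_TAP_RADIUS = 30    # assumed max pixel distance between the two taps

# Hypothetical map annotation: screen-space rectangles with spoken labels.
REGIONS = [
    {"bbox": (100, 100, 220, 180), "label": "Town hall"},
    {"bbox": (300, 240, 420, 320), "label": "Railway station"},
]

def region_at(x, y):
    """Return the label of the map element under (x, y), if any."""
    for region in REGIONS:
        x0, y0, x1, y1 = region["bbox"]
        if x0 <= x <= x1 and y0 <= y <= y1:
            return region["label"]
    return None

def speak(text):
    print(f"[TTS] {text}")  # placeholder for a text-to-speech engine

class DoubleTapMap:
    def __init__(self):
        self._last_tap = None  # (timestamp, x, y) of the previous tap

    def on_tap(self, x, y):
        now = time.monotonic()
        if self._last_tap:
            t, lx, ly = self._last_tap
            if (now - t <= DOUBLE_TAP_WINDOW
                    and abs(x - lx) <= DOUBLE_TAP_RADIUS
                    and abs(y - ly) <= DOUBLE_TAP_RADIUS):
                label = region_at(x, y)
                speak(label if label else "No map element here")
                self._last_tap = None
                return
        self._last_tap = (now, x, y)
```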

    The effect of age and font size on reading text on handheld computers

    Though there have been many studies of computer-based text reading, only a few have considered the small screens of handheld computers. This paper presents an investigation into the effect of varying font size between 2 and 16 point on reading text on a handheld computer. By using both older and younger participants, the possible effects of age were examined. Reading speed and accuracy were measured, and subjective views of participants were recorded. Objective results showed that there was little difference in reading performance above 6 point, but subjective comments from participants showed a preference for sizes in the middle of the range. We therefore suggest, for reading tasks, that designers of interfaces for mobile computers provide fonts in the range of 8-12 point to maximize readability for the widest range of users.

    Assessing the effectiveness of multi-touch interfaces for DP operation

    Navigating a vessel using dynamic positioning (DP) systems close to offshore installations is a challenge. The operator's only means of manipulating the system is through its interface, which can be categorized into the physical appearance of the equipment and the visualization of the system. Are there possibilities of interaction between the operator and the system that can reduce strain and cognitive load during DP operations? Can parts of the system (e.g. displays) be physically brought closer to the user to enhance the feeling of control when operating the system? Can these changes make DP operations more efficient and safe? These questions inspired this research project, which investigates the use of multi-touch and hand gestures known from consumer products to directly manipulate the visualization of a vessel in the 3D scene of a DP system. Usability methodologies and evaluation techniques that are widely used in consumer market research were used to investigate how these interaction techniques, which are new to the maritime domain, could make interaction with the DP system more efficient and transparent during both standard and safety-critical operations. After investigating which gestures felt natural to use by running user tests with a paper prototype, the gestures were implemented in a Rolls-Royce DP system and tested in a static environment. The results showed that the test participants performed significantly faster using direct gesture manipulation than using traditional button/menu interaction. To support these results, further tests were carried out to investigate how gestures are performed in a moving environment, using a motion platform to simulate rough sea conditions. This paper discusses the key results and lessons learned from the collection of four user experiments, together with the choice of evaluation techniques.
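
    To make the idea of direct gesture manipulation concrete, the sketch below maps common two-finger gestures (pinch, drag, twist) onto the camera of a 3D vessel view. The camera model and the gesture-to-parameter mapping are assumptions made for this illustration; they are not the Rolls-Royce DP system's implementation.

```python
# Minimal sketch: translate two-finger touch input into pan, zoom and
# rotation of a 3D vessel view.
import math

class VesselViewCamera:
    def __init__(self):
        self.pan = [0.0, 0.0]   # scene-space offset
        self.zoom = 1.0         # scale factor
        self.heading = 0.0      # rotation about the vertical axis, radians

    def apply_two_finger_gesture(self, prev, curr):
        """prev/curr: ((x1, y1), (x2, y2)) touch points from two frames."""
        (p1, p2), (c1, c2) = prev, curr

        # Pinch: the ratio of finger distances scales the zoom.
        d_prev = math.dist(p1, p2)
        d_curr = math.dist(c1, c2)
        if d_prev > 0:
            self.zoom *= d_curr / d_prev

        # Drag: movement of the midpoint pans the scene (scaled by zoom).
        mid_prev = ((p1[0] + p2[0]) / 2, (p1[1] + p2[1]) / 2)
        mid_curr = ((c1[0] + c2[0]) / 2, (c1[1] + c2[1]) / 2)
        self.pan[0] += (mid_curr[0] - mid_prev[0]) / self.zoom
        self.pan[1] += (mid_curr[1] - mid_prev[1]) / self.zoom

        # Twist: change in the angle between the fingers rotates the view.
        a_prev = math.atan2(p2[1] - p1[1], p2[0] - p1[0])
        a_curr = math.atan2(c2[1] - c1[1], c2[0] - c1[0])
        self.heading += a_curr - a_prev
```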

    User-centered design of a dynamic-autonomy remote interaction concept for manipulation-capable robots to assist elderly people in the home

    In this article, we describe the development of a human-robot interaction concept for service robots to assist elderly people in the home with physical tasks. Our approach is based on the insight that robots are not yet able to handle all tasks autonomously with sufficient reliability in the complex and heterogeneous environments of private homes. We therefore employ remote human operators to assist with tasks a robot cannot handle completely autonomously. Our development methodology was user-centered and iterative, with six user studies carried out at various stages involving a total of 241 participants. The concept is under implementation on the Care-O-bot 3 robotic platform. The main contributions of this article are (1) the results of a survey in the form of a ranking of the demands of elderly people and informal caregivers for a range of 25 robot services, (2) the results of an ethnography investigating the suitability of emergency teleassistance and telemedical centers for incorporating robotic teleassistance, and (3) a user-validated human-robot interaction concept with three user roles and three corresponding user interfaces, designed as a solution to the problem of engineering reliable service robots for home environments.
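
    The dynamic-autonomy idea described above can be summarised as a simple handoff rule: the robot acts on its own when it is confident enough, and otherwise escalates to a remote operator. The sketch below illustrates that rule with hypothetical stub classes; it is not the Care-O-bot 3 implementation, and the confidence threshold and interfaces are assumptions made for the example.

```python
# Minimal sketch of a dynamic-autonomy handoff: attempt the task
# autonomously, fall back to a remote human operator on low confidence
# or failure. All classes here are hypothetical stubs.
from dataclasses import dataclass

CONFIDENCE_THRESHOLD = 0.8  # assumed cut-off for acting without assistance

@dataclass
class Result:
    success: bool
    detail: str = ""

class Robot:
    def estimate_confidence(self, task):
        return 0.6  # e.g. derived from perception/grasp-planning scores

    def attempt_autonomously(self, task):
        return Result(False, "grasp failed")

    def sensor_snapshot(self):
        return {"camera": "...", "map": "..."}

class TeleassistanceCenter:
    def request_operator(self, task, snapshot):
        return self  # operator session

    def complete(self, task):
        return Result(True, "completed with operator assistance")

def execute_task(task, robot, center):
    """Try the task autonomously; hand over to a remote operator if needed."""
    if robot.estimate_confidence(task) >= CONFIDENCE_THRESHOLD:
        result = robot.attempt_autonomously(task)
        if result.success:
            return result
    # Low confidence or failure: escalate to the teleassistance operator,
    # who sees the robot's sensor view and completes or guides the task.
    session = center.request_operator(task, robot.sensor_snapshot())
    return session.complete(task)
```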