
    Grasp Your Pain: A Tangible Tool to Explore the Logging and Assessment of Pain

    Pain is a subjective and innate experience that can be difficult to describe. Chronic pain is associated with decreased quality of life, and it is prevalent in cancer populations. With a growing elderly population, the global cancer burden is expected to grow rapidly in the coming years. Expressing pain and symptom experiences is essential for patients to receive proper treatment and care. Self-reporting tools are useful and reliable measures of patients' symptoms. A commonly used assessment form in palliative care is ESAS-r, the revised Edmonton Symptom Assessment System. It lets the patient rate a list of symptoms on a scale from 0 to 10, depending on their intensity. Research suggests that ESAS-r only captures a snapshot of the patients' symptom profile, and that it is burdensome to patients and clinical staff. There is a need for self-assessment tools that are easy to use, non-intrusive, and can be used in situ. The research in this thesis explores the use of a tangible tool (Grasp) and squeezing as an input method to log pain and symptom experiences. Grasp consists of a small stone-like object. When squeezed, it logs the time and duration of the interaction. Squeezes are then visualized on an accompanying interface. Through a mixed methods research approach, a pilot study and a clinical trial were conducted. The former gathered participant (N=8) opinions on Grasp and the use of squeeze duration to log experiences. The latter explored the implementation of Grasp alongside ESAS-r in a cancer ward (nurses = 6, patients = 8). Two broad research questions were examined: RQ1: How can tangible interaction through Grasp support the logging of experiences? and RQ2: How do palliative cancer patients and nurses experience Grasp as a tool for the logging, assessment, and communication of pain and symptoms compared to ESAS-r? Findings from the pilot suggest that there is potential in using Grasp and squeeze duration to log events, and that interacting with the tool can potentially help distract from or externalize negative experiences. Participants in both studies found Grasp easy to use and the visualizations intuitive and meaningful. Nurses and patients were generally satisfied with Grasp as a tool, and it helped paint a fuller picture of the patients' symptoms compared to ESAS-r alone. However, patients were sometimes too ill to use Grasp, and the research was limited by barriers related to clinical environments. Further research is needed to explore the potential of tangible interaction and squeezing as an input method with other patient groups. The affective dimension of the interaction should also be investigated further. (Master's thesis in information science, INFO390, MASV-INF)
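    The squeeze input described above reduces to a very small data model: each squeeze is one event with a start time and a duration, and the accompanying interface visualizes the resulting list. The sketch below illustrates such a logger; class and method names are illustrative assumptions, not taken from the thesis.

        # Minimal sketch of squeeze-event logging (time + duration per squeeze).
        # Names are illustrative; the real Grasp firmware/interface may differ.
        import time
        from dataclasses import dataclass
        from typing import List, Optional


        @dataclass
        class SqueezeEvent:
            start: float     # Unix timestamp at which the squeeze began
            duration: float  # how long the object was held squeezed, in seconds


        class SqueezeLogger:
            def __init__(self) -> None:
                self.events: List[SqueezeEvent] = []
                self._pressed_at: Optional[float] = None

            def on_press(self) -> None:
                # Called when the pressure sensor detects a squeeze starting.
                self._pressed_at = time.time()

            def on_release(self) -> None:
                # Called on release; stores one timestamped event with its duration.
                if self._pressed_at is None:
                    return
                now = time.time()
                self.events.append(SqueezeEvent(self._pressed_at, now - self._pressed_at))
                self._pressed_at = None

    Recording only onset time and duration keeps the interaction non-intrusive; any interpretation of what a squeeze meant is deferred to the visualization and the conversation between patient and nurse.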

    Cocreation In Context: An Evaluation of Participatory Technology Design for Enhancing Community Engagement with Public Spaces

    The integration of interactive public displays into urban regeneration is a dynamic research area, extending beyond traditional screens to offer accessible, alternative interactions in outdoor settings. These displays not only give users a wealth of information but also afford in-depth, emotionally charged interactions, evoking nostalgia and re-engaging people with a space. In the ever-evolving urban landscape, this thesis addresses the challenges posed by adapting spaces and the effect this can have on the way people engage with them, highlighting a need for innovative interventions to help communities remain engaged with their local spaces. This work centres on integrating interactive public displays into urban environments, with a primary focus on evoking nostalgia (an affectionate feeling for the past [54]) and topophilia (emotional connections to a space [271]). This thesis embarks on a journey, engaging a diverse range of users, including designers, children, regeneration experts, and community members. It commences with a nine-month deployment of a tangible embedded interface in a dynamic urban context, which received over 10,000 session interactions. This real-world experience shows the importance of incorporating aspects of cultural heritage and immersive content into informative designs. Subsequently, the journey navigates the landscape of cocreation methods and collaborative efforts, culminating in the development of a multisensory installation that integrates olfactory, auditory, and visual elements. This exploration delves into the intricacies of community engagement, technology integration, nostalgia, and the dynamics of urban regeneration. As these chapters unfold, valuable lessons are drawn from field experiences, guiding a reflective journey through the design process with the aim of advancing interactive public display creation by addressing gaps in design tools and methodologies. The result is a set of feature requirements for an overarching master design suite, a contribution that advances the field of HCI within urban regeneration. The thesis brings together valuable lessons learned and recommendations for integrating novel multimodal installations of the future, while also addressing the need for adaptable tools to facilitate their design. This holistic approach underscores the transformative potential of technology in shaping urban environments and community engagement.

    Eyewear Computing – Augmenting the Human with Head-Mounted Wearable Assistants

    The seminar was composed of workshops and tutorials on head-mounted eye tracking, egocentric vision, optics, and head-mounted displays. The seminar welcomed 30 academic and industry researchers from Europe, the US, and Asia with diverse backgrounds, including wearable and ubiquitous computing, computer vision, developmental psychology, optics, and human-computer interaction. In contrast to several previous Dagstuhl seminars, we used an ignite talk format to reduce the time of talks to one half-day and to leave the rest of the week for hands-on sessions, group work, general discussions, and socialising. The key results of this seminar are 1) the identification of key research challenges and summaries of breakout groups on multimodal eyewear computing, egocentric vision, security and privacy issues, skill augmentation and task guidance, eyewear computing for gaming, as well as prototyping of VR applications, 2) a list of datasets and research tools for eyewear computing, 3) three small-scale datasets recorded during the seminar, 4) an article in ACM Interactions entitled “Eyewear Computers for Human-Computer Interaction”, as well as 5) two follow-up workshops on “Egocentric Perception, Interaction, and Computing” at the European Conference on Computer Vision (ECCV) and on “Eyewear Computing” at the ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp).

    Human Machine Interfaces for Teleoperators and Virtual Environments

    In March 1990, a meeting was held on the general theme of teleoperation research into virtual environment display technology. This is a collection of conference-related fragments that gives a glimpse of the potential of the following fields and how they interact: sensorimotor performance; human-machine interfaces; teleoperation; virtual environments; performance measurement and evaluation methods; and design principles and predictive models.

    Advancing proxy-based haptic feedback in virtual reality

    This thesis advances haptic feedback for Virtual Reality (VR). Our work is guided by Sutherland's 1965 vision of the ultimate display, which calls for VR systems to control the existence of matter. To push towards this vision, we build upon proxy-based haptic feedback, a technique characterized by the use of passive tangible props. The goal of this thesis is to tackle the central drawback of this approach, namely its inflexibility, which still prevents it from fulfilling the vision of the ultimate display. Guided by four research questions, we first showcase the applicability of proxy-based VR haptics by employing the technique for data exploration. We then extend the VR system's control over users' haptic impressions in three steps. First, we contribute the class of Dynamic Passive Haptic Feedback (DPHF) alongside two novel concepts for conveying kinesthetic properties, like virtual weight and shape, through weight-shifting and drag-changing proxies. Conceptually orthogonal to this, we study how visual-haptic illusions can be leveraged to unnoticeably redirect the user's hand when reaching towards props. Here, we contribute a novel perception-inspired algorithm for Body Warping-based Hand Redirection (HR), an open-source framework for HR, and psychophysical insights. The thesis concludes by demonstrating that the combination of DPHF and HR can outperform the individual techniques in terms of the achievable flexibility of proxy-based haptic feedback.
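    Body warping-based HR is typically realized by rendering the virtual hand at an offset from the tracked hand that grows as the reach progresses, so the real hand lands on the physical prop while the virtual hand lands on the virtual target. The sketch below shows a generic linear warp of this kind; it is a simplified illustration under that assumption, not the perception-inspired algorithm contributed by the thesis, and all names are illustrative.

        # Generic linear body-warping hand redirection (simplified illustration).
        # All names are assumptions; the thesis' own algorithm differs in how the
        # warp is shaped to stay below perceptual detection thresholds.
        import numpy as np


        def warped_hand_position(real_hand: np.ndarray,
                                 reach_start: np.ndarray,
                                 physical_target: np.ndarray,
                                 virtual_target: np.ndarray) -> np.ndarray:
            """Position at which to render the virtual hand for the current frame."""
            total = np.linalg.norm(physical_target - reach_start)
            travelled = np.linalg.norm(real_hand - reach_start)
            # Warp ratio: 0.0 at the start of the reach, 1.0 when the prop is reached.
            alpha = np.clip(travelled / total, 0.0, 1.0) if total > 0 else 1.0
            # Full offset between the physical prop and the virtual object it stands in for.
            offset = virtual_target - physical_target
            return real_hand + alpha * offset

    Applying the offset gradually means the rendered and tracked hands coincide at the start of the reach and diverge by at most the full prop-to-target offset at the moment of contact, which is what keeps the redirection unnoticeable for small offsets.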