
    Eyes-free interaction with aural user interfaces

    Indiana University-Purdue University Indianapolis (IUPUI)
    Existing web applications force users to focus their visual attention on mobile devices while browsing content and services on the go (e.g., while walking or driving). To support mobile, eyes-free web browsing and minimize interaction with devices, designers can leverage the auditory channel. Although acoustic interfaces have proven effective at reducing visual attention, designing aural information architectures for the web remains a perplexing challenge because of the web's non-linear structure. To address this problem, we introduce and evaluate techniques that remodel existing information architectures as "playlists" of web content, called aural flows. Aural flows for mobile web browsing are exemplified in ANFORA News, a semi-aural mobile site designed to facilitate browsing large collections of news stories. An exploratory study involving frequent news readers (n=20) investigated the usability and navigation experience of ANFORA News in a mobile setting. The initial evidence suggests that aural flows are a promising paradigm for supporting eyes-free mobile navigation on the go. Interacting with aural flows, however, requires users to select interface buttons, tethering visual attention to the mobile device even when it is unsafe. To reduce visual interaction with the screen, we also explore the use of simulated voice commands to control aural flows. In a study, 20 participants browsed aural flows either through a visual interface alone or with a visual interface augmented by voice commands. The results suggest that using voice commands halves the time spent looking at the device but yields walking speeds, system usability, and cognitive effort ratings similar to using buttons. To test the potential of aural flows in a more distracting context, a study (n=60) was conducted in a driving simulation lab. Each participant drove through three driving scenario complexities: low, moderate, and high. Within each driving complexity, participants were exposed to one of three aural application conditions: no device, voice-controlled aural flows (ANFORADrive), or an alternative solution on the market (Umano). The results suggest that voice-controlled aural flows do not affect distraction, overall safety, cognitive effort, driving performance, or driving behavior compared to the no-device condition.
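    The core construct here, the aural flow, is essentially an ordered playlist of web pages read aloud and steered by a handful of commands issued through buttons or voice. The sketch below only illustrates that idea; the type names, command set, and text-to-speech callback are assumptions, not ANFORA's actual implementation.

```typescript
// Minimal sketch of an aural flow: an ordered "playlist" of web pages read
// aloud in sequence. All names here are illustrative, not taken from ANFORA.

interface FlowItem {
  title: string; // headline announced before the story body
  url: string;   // source page the item was extracted from
  body: string;  // text handed to a text-to-speech engine
}

type FlowCommand = "play" | "next" | "previous";

class AuralFlow {
  private index = 0;

  constructor(
    private readonly items: FlowItem[],
    private readonly speak: (text: string) => void, // e.g. a TTS wrapper
  ) {}

  // On-screen buttons and recognized voice commands funnel into the same
  // handler, so adding voice control does not change the flow logic.
  handle(command: FlowCommand): void {
    if (command === "next" && this.index < this.items.length - 1) this.index += 1;
    if (command === "previous" && this.index > 0) this.index -= 1;
    const item = this.items[this.index];
    this.speak(`${item.title}. ${item.body}`);
  }
}

// Example usage: two stories read through a console-based stand-in for TTS.
const flow = new AuralFlow(
  [
    { title: "Story one", url: "https://example.com/1", body: "..." },
    { title: "Story two", url: "https://example.com/2", body: "..." },
  ],
  (text) => console.log(`[TTS] ${text}`),
);
flow.handle("play"); // reads "Story one"
flow.handle("next"); // advances and reads "Story two"
```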

    Understanding Advice Sharing among Physicians: Towards Trust-Based Clinical Alerts

    Safe prescribing of medications relies on drug safety alerts, but up to 96% of such warnings are ignored by physicians. Prior research has proposed improvements to the design of alerts, but with only limited increases in adherence. We propose a different perspective: before redesigning alerts, we focus on improving the trust between physicians and computerized advice by examining why physicians trust their medical colleagues. To understand trusted advice among physicians, we conducted three contextual inquiries in a hospital setting (22 participants) and corroborated our findings with a survey (37 participants). Drivers that guide physicians in trusting peer advice include timeliness of the advice, collaborative language, empathy, level of specialization, and medical hierarchy. Based on these findings, we introduce seven design directions for trust-based alerts: endorsement, transparency, team sensing, collaborative, empathic, conflict mitigating, and agency laden. Our work contributes novel alert design strategies to improve the effectiveness of drug safety advice.

    Navigating the Aural Web: Augmenting User Experience for Visually Impaired and Mobile Users

    Poster abstract: The current web navigation paradigm structures interaction around vision and thus hampers users in two eyes-free scenarios: mobile computing and information access for the visually impaired. Users in both scenarios are unable to navigate complex information architectures efficiently because of the strictly linear perceptual bandwidth of the aural channel. To combat this problem, we are conducting a long-term research program aimed at establishing novel design strategies that augment aural navigation while users browse the complex information architectures typical of the web. A pervasive problem in designing for web accessibility (especially for screen-reader users) is providing efficient access to large collections of content, which typically manifests as long lists indexing the underlying pages. Cognitively managing the interaction with long lists is cumbersome in the aural paradigm because users need to listen attentively to each list item, decide which link to follow, and then select it. For every non-relevant page selected, screen-reader users must go back to the list to select another page. Our most recent study compared the performance of index-based web navigation with guided-tour navigation (navigation without lists) for screen-reader users. Guided-tour navigation allows users to move directly back and forth across the content pages of a collection, bypassing lists. An experiment (N=10), conducted at the Indiana School for the Blind and Visually Impaired (ISBVI), examined these web navigation strategies during fact-finding tasks. Guided-tour navigation significantly reduced time on task, number of pages visited, number of keystrokes, and perceived cognitive effort while enhancing the navigational experience. By augmenting existing navigation methods for screen-reader users, our research offers web designers strategies to improve web accessibility without costly site redesign. This research material is based upon work supported by the National Science Foundation under Grant #1018054.
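    For illustration, the guided-tour strategy described above replaces the index-based round trip (list, page, back to the list, next link) with direct previous/next movement between sibling pages. The sketch below is a hypothetical rendering of that idea in code; none of the names come from the study's system.

```typescript
// Sketch of guided-tour navigation: content pages in a collection are chained
// so a screen-reader user can step between them without returning to an index
// list. All names are hypothetical, not taken from the study.

interface Page {
  title: string;
  content: string;
}

class GuidedTour {
  private position = 0;

  constructor(private readonly pages: Page[]) {}

  current(): Page {
    return this.pages[this.position];
  }

  // "next"/"previous" replace the index-based round trip of
  // list -> page -> back to list -> choose another link.
  next(): Page {
    if (this.position < this.pages.length - 1) this.position += 1;
    return this.current();
  }

  previous(): Page {
    if (this.position > 0) this.position -= 1;
    return this.current();
  }
}

// Example: stepping through a three-page collection with no list in between.
const tour = new GuidedTour([
  { title: "Page A", content: "..." },
  { title: "Page B", content: "..." },
  { title: "Page C", content: "..." },
]);
console.log(tour.next().title);     // "Page B"
console.log(tour.previous().title); // back to "Page A"
```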

    Eyes-free interaction with aural user interfaces

    Poster abstract: People engaged in parallel tasks, such as walking and browsing the web, cannot efficiently access web content and safely monitor their surroundings at the same time. To combat this, we investigate techniques for designing novel aural interfaces that remodel existing web information architectures as linear, aural flows to be listened to. An aural flow is a design-driven, concatenated sequence of pages that can be listened to with minimal interaction. Aural flows are exemplified in ANFORA News, a semi-aural mobile site optimized for aurally browsing large collections of news stories on the go. An exploratory study involving frequent news readers (n=20) investigated the usability and navigation experience with ANFORA News in a mobile setting. Initial evidence suggests that aural flows are a promising paradigm to support eyes-free mobile navigation on the go, but users still require assistance and additional learning to fully master the aural mechanics of the flows. To enable a more natural interaction with aural flows, we are currently exploring linkless navigation, which lets users control the flow via a small set of dialogic commands issued by voice. Overall, our approach opens new avenues for designing appropriate aural user interfaces for content-intensive web systems. This research material is based on work supported by the National Science Foundation under Grant #1018054. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect those of the NSF.

    Semi-aural Interfaces: Investigating Voice-controlled Aural Flows

    To support mobile, eyes-free web browsing, users can listen to 'playlists' of web content called aural flows. Interacting with aural flows, however, requires users to select interface buttons, tethering visual attention to the mobile device even when it is unsafe (e.g., while walking). This research extends interaction with aural flows through simulated voice commands as a way to reduce visual interaction. This paper presents the findings of a study with 20 participants who browsed aural flows either through a visual interface only or through a visual interface augmented with voice commands. Results suggest that using voice commands reduced the time spent looking at the device by half but yielded system usability and cognitive effort ratings similar to using buttons. Overall, the low cognitive effort engendered by aural flows, regardless of interaction modality, allowed participants to do more non-instructed activities (e.g., looking at the surrounding environment) than instructed activities (e.g., focusing on the user interface).
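    Since the finding hinges on voice commands driving the same controls as the on-screen buttons, one way to picture the voice layer is a small mapping from recognized phrases to the existing button actions. This is a sketch only; the vocabulary and function names below are assumed for illustration and are not the paper's command set.

```typescript
// Hypothetical mapping from recognized voice phrases to the same actions that
// the on-screen buttons trigger. The vocabulary below is assumed, not ANFORA's.

type FlowAction = "play" | "pause" | "next" | "previous";

const voiceVocabulary: Record<string, FlowAction> = {
  play: "play",
  resume: "play",
  pause: "pause",
  stop: "pause",
  next: "next",
  skip: "next",
  previous: "previous",
  "go back": "previous",
};

// A recognizer transcript is normalized and matched against the vocabulary;
// unrecognized phrases are ignored, keeping the command set small and robust.
function dispatch(transcript: string, onAction: (action: FlowAction) => void): void {
  const phrase = transcript.trim().toLowerCase();
  const action = voiceVocabulary[phrase];
  if (action !== undefined) onAction(action);
}

// Example: a button press and a voice phrase end up in the same handler.
dispatch("skip", (action) => console.log(`flow action: ${action}`)); // flow action: next
```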

    Involving patients as key stakeholders in the design of cardiovascular implantable electronic device data dashboards: Implications for patient care

    Background: Data from remote monitoring (RM) of cardiovascular implantable electronic devices (CIEDs) are currently not accessible to patients despite demand. The typical RM report contains multiple pages of data intended for trained technicians to read and interpret; a patient-centered approach is needed to curate these data to meet individual user needs. Objective: The purpose of this study was to understand which RM data elements are important to patients and to gain design insights for displaying meaningful data in a digital dashboard. Methods: Adults with implantable cardioverter-defibrillators (ICDs) and pacemakers (PMs) participated in this 2-phase, user-centered design study. Phase 1 included a card-sorting activity to prioritize device data elements. Phase 2 included one-on-one design sessions to gather insights and feedback about a visual display (labels and icons). Results: Twenty-nine adults (mean age 71.8 ± 11.6 years; 51.7% female; 89.7% white) participated. Priority data elements for both the ICD and PM groups in phase 1 (n = 19) were related to cardiac episodes, device activity, and impedance values. Recommended battery replacement time was a high priority for the PM group but not the ICD group. Phase 2 (n = 10) revealed that patients would like descriptive, nontechnical terms to depict the data, and icons that are intuitive and informative. Conclusion: This user-centered design study demonstrated that patients with ICDs and PMs were able to prioritize specific data from a comprehensive list of data elements they had never seen before. This work contributes to the goal of sharing RM data with patients in a way that optimizes the RM feature of CIEDs for improving patient outcomes and clinical care.

    Untold Stories in User-Centered Design of Mobile Health: Practical Challenges and Strategies Learned From the Design and Evaluation of an App for Older Adults With Heart Failure

    Background: User-centered design (UCD) is a powerful framework for creating useful, easy-to-use, and satisfying mobile health (mHealth) apps. However, the literature seldom reports the practical challenges of implementing UCD, particularly in the field of mHealth. Objective: This study aims to characterize the practical challenges encountered when implementing UCD for mHealth and to propose strategies for addressing them. Methods: Our multidisciplinary team implemented a UCD process to design and evaluate a mobile app for older adults with heart failure. During and after this process, we documented the challenges the team encountered and the strategies they used or considered using to address those challenges. Results: We identified 12 challenges: 3 concerning UCD as a whole and 9 across the UCD stages of formative research, design, and evaluation. Challenges included the timing of stakeholder involvement, overcoming designers’ assumptions, adapting methods to end users, and managing heterogeneity among stakeholders. To address these challenges, we provide practical recommendations for UCD researchers and practitioners. Conclusions: UCD is a gold-standard approach that is increasingly adopted for mHealth projects. Although UCD methods are well described and easily accessible, the practical challenges and strategies for implementing them are underreported. To improve the implementation of UCD for mHealth, we must tell and learn from these traditionally untold stories.

    Erratum to: Providing Patients with Implantable Cardiac Device Data through a Personal Health Record: A Qualitative Study

    Erratum to: Providing Patients with Implantable Cardiac Device Data through a Personal Health Record: A Qualitative Study [Appl Clin Inform. 2017].

    Patient responses to daily cardiac resynchronization therapy device data: A pilot trial assessing a novel patient-centered digital dashboard in everyday life

    Background: Heart failure (HF) is a growing public health problem in the United States. Implantable cardiac resynchronization therapy (CRT) devices reduce mortality and morbidity, and remote monitoring (RM) of these devices improves outcomes. However, patient RM adherence is low, due in part to patients’ lack of access to their RM data. Providing these data to patients may increase engagement, but the data must be appropriately tailored to ensure understanding. Objective: The purpose of this study was to examine patients’ experiences interacting with their RM data through a novel digital dashboard as part of daily life. Methods: In this mixed-methods pilot study, 10 patients with implantable CRT defibrillators were given access to a patient-centered RM data dashboard, updated daily, for 6–12 months. Health literacy, engagement, electronic portal (MyChart, Epic Systems Corporation) logins, and RM adherence were measured before and after the study; system usability scores were collected at exit; and dashboard views were tracked. Exit interviews were conducted to elucidate patients’ experiences. Results: Participants (100% white; 60% male; age 34–80 years [mean ± SD 62.0 ± 13.4]) had adequate health literacy, showed increased MyChart logins (P = .0463), and showed a nonsignificant increase in RM adherence. Participants viewed their dashboards 0–42 times (mean 14.9 ± 12.5). Interviews revealed that participants generally appreciated access to their data, understood it, and responded to changes; however, questions and concerns remained regarding data interpretation and visualization. Conclusion: Preliminary findings support the potential future integration of a CRT RM data dashboard into the daily care of HF patients. With appropriate informational support and personalization, sharing RM data with patients in a tailored dashboard may improve health engagement.

    Uncertainty Management Among Older Adults with Heart Failure: Responses to Receiving Implanted Device Data using a Fictitious Scenario Interview Method

    Heart failure (HF) is a complex chronic illness that affects the older adult population, requiring medical therapy and day-to-day management to prevent worsening and exacerbation. Patients with HF are often treated with cardiac implantable electronic devices (CIEDs), which capture diagnostic and predictive parameters for HF. In this work we explore how patients would respond to receiving data from an implanted device, using a fictitious-scenario interview method with 24 older adults with HF. We applied an uncertainty management lens to better understand how patients face uncertain outcomes and integrate novel data into their decision making. The findings provide insight into how patients would engage with and respond to a technology that provides an indicator of their HF status from an implanted device.