98 research outputs found

    Augmenting the Spatial Perception Capabilities of Users Who Are Blind

    Get PDF
    People who are blind face a series of challenges and limitations resulting from their inability to see, forcing them either to seek the assistance of a sighted individual or to work around the challenge through an inefficient adaptation (e.g. following the walls of a room to reach a door rather than walking straight to it). These challenges are directly related to blind users' lack of the spatial perception capabilities normally provided by the human visual system. To overcome these spatial perception challenges, modern technologies can be used to convey spatial perception data through sensory substitution interfaces. This work is the culmination of several projects addressing varying spatial perception problems for blind users. First, we consider the development of non-visual natural user interfaces for interacting with large displays. This work explores the haptic interaction space to find useful and efficient haptic encodings for the spatial layout of items on large displays. Multiple interaction techniques are presented which build on prior research (Folmer et al. 2012), and the usability of the most efficient of these encodings is evaluated with blind children. Next, we evaluate the use of wearable technology to aid blind individuals in navigating large open spaces that lack the tactile landmarks used in traditional white cane navigation. We explore the design of a computer vision application with an unobtrusive aural interface that minimizes veering while the user crosses a large open space. Together, these projects represent an exploration of modern technology for augmenting the spatial perception capabilities of blind users.

    The Social Network: How People with Visual Impairment use Mobile Phones in Kibera, Kenya

    Get PDF
    Living in an informal settlement with a visual impairment can be very challenging, often resulting in social exclusion. Mobile phones have been shown to be hugely beneficial to people with sight loss in formal and high-income settings. However, little is known about whether these results hold true for visually impaired people (VIPs) in informal settlements. We present the findings of a case study of mobile technology use by VIPs in Kibera, an informal settlement in Nairobi. We used contextual interviews, ethnographic observations and a co-design workshop to explore how VIPs use mobile phones in their daily lives, and how this use influences their social infrastructure. Our findings suggest that mobile technology supports and shapes the creation of social infrastructure. However, this is only made possible through the VIPs' existing support networks, which are mediated through four types of interaction: direct, supported, dependent and restricted.

    Designing Multimodal Mobile Interaction for a Text Messaging Application for Visually Impaired Users

    Get PDF
    While mobile devices have seen significant accessibility advances in recent years, people with visual impairments still face substantial barriers, especially in contexts where both hands are not free to hold the device, such as when walking outside. By combining body-based gestures and voice in a multimodal interface, we aim to achieve fully hands- and vision-free interaction with mobile devices. In this article, we describe this vision and present the design of a prototype text messaging application inspired by it. The article also presents a user study in which the suitability of the proposed approach was assessed and the performance of our prototype was compared with that of existing SMS applications. Study participants received the prototype positively, and it also supported better performance in tasks that involved text editing.

    How a Diverse Research Ecosystem Has Generated New Rehabilitation Technologies: Review of NIDILRR’s Rehabilitation Engineering Research Centers

    Get PDF
    Over 50 million United States citizens (1 in 6 people in the US) have a developmental, acquired, or degenerative disability. The average US citizen can expect to live 20% of his or her life with a disability. Rehabilitation technologies play a major role in improving the quality of life for people with a disability, yet widespread and highly challenging needs remain. Within the US, a major effort aimed at the creation and evaluation of rehabilitation technology has been the Rehabilitation Engineering Research Centers (RERCs) sponsored by the National Institute on Disability, Independent Living, and Rehabilitation Research. As envisioned at their conception by a panel of the National Academy of Sciences in 1970, these centers were intended to take a “total approach to rehabilitation”, combining medicine, engineering, and related science, to improve the quality of life of individuals with a disability. Here, we review the scope, achievements, and ongoing projects of an unbiased sample of 19 currently active or recently terminated RERCs. Specifically, for each center, we briefly explain the needs it targets, summarize key historical advances, identify emerging innovations, and consider future directions. Our assessment from this review is that the RERC program indeed involves a multidisciplinary approach, with 36 professional fields involved, although 70% of research and development staff are in engineering fields, 23% in clinical fields, and only 7% in basic science fields; significantly, 11% of the professional staff have a disability related to their research. We observe that the RERC program has substantially diversified the scope of its work since the 1970s, addressing more types of disabilities using more technologies, and, in particular, often now focusing on information technologies.
RERC work also now often views users as integrated into an interdependent society through technologies that people both with and without disabilities co-use (such as the internet, wireless communication, and architecture). In addition, RERC research has evolved to view users as able to improve outcomes through learning, exercise, and plasticity (rather than as static), interventions that can be optimally timed. We provide examples of rehabilitation technology innovation produced by the RERCs that illustrate this increasingly diverse scope and evolving perspective. We conclude by discussing growth opportunities and possible future directions of the RERC program.

    The role of mobile technology for fall risk assessment for individuals with multiple sclerosis

    Get PDF
    Multiple Sclerosis (MS) is a chronic, progressive neurodegenerative disease that affects one million people in the United States (Wallin et al., 2019). Common MS symptoms include impaired coordination, poor walking and balance, and fatigue, and these symptoms put people with MS (pwMS) at a higher risk for falls (Cameron & Nilsagard, 2018). Falls are highly prevalent among pwMS and can result in detrimental consequences including bone fractures and even death (Matsuda et al., 2011). To prevent falls and fall-related injuries, it is important to first assess multiple risk factors and then intervene through targeted treatments (Palumbo et al., 2015). Fall risk can be assessed through self-report measures, clinical performance tests, or with technology such as force plates and motion capture systems (Kanekar & Aruin, 2013). However, clinicians have time constraints, the technology is expensive, and trained personnel are needed. Moreover, due to the COVID-19 pandemic, access to in-person clinical visits is limited. As a result, pwMS may not receive fall risk screening and remain vulnerable to fall-related injuries. Mobile technology offers a solution to increase access to fall risk screening using an affordable, ubiquitous, and portable tool (Guise et al., 2014; Marrie et al., 2019). Therefore, the overarching goal of this study was to develop a usable fall risk health application (app) for pwMS to self-assess their fall risk in the home setting. Four studies were performed: 1) smartphone accelerometry was tested as a measure of postural control in pwMS; 2) a fall risk algorithm was developed for a mobile health app; 3) a fall risk app, Steady-MS, was developed and its usability was tested; and 4) the feasibility of home-based procedures for using Steady-MS was determined. Results suggest that smartphone accelerometry can assess postural control in pwMS.
This information was used to develop an algorithm to measure overall fall risk in pwMS, which was then incorporated into Steady-MS. Steady-MS was found to be usable by MS users and feasible to use in the home setting. The results from this project demonstrate that pwMS can independently assess their fall risk with Steady-MS in their homes. For the first time, pwMS are equipped to self-assess their fall risk and can monitor and manage their risk. Home-based assessments also open the potential to offer individualized, targeted treatments to prevent falls. Ultimately, Steady-MS increases access to home-based assessments to reduce falls and improve functional independence for people with MS.
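    The first study above tests smartphone accelerometry as a measure of postural control. As a rough illustration of what such a measurement can involve (a hypothetical sketch, not the Steady-MS algorithm), common sway metrics such as RMS sway magnitude and sway path length can be computed directly from trunk-held accelerometer samples recorded during quiet standing:

```python
import math

def sway_metrics(samples):
    """Compute simple postural-sway metrics from accelerometer samples.

    samples: list of (ax, ay) mediolateral/anteroposterior accelerations
             (m/s^2), recorded with the phone held against the trunk
             during quiet standing.
    Returns (RMS sway magnitude, total sway path length).
    """
    n = len(samples)
    # Remove the per-axis mean (gravity component and sensor bias).
    mx = sum(a[0] for a in samples) / n
    my = sum(a[1] for a in samples) / n
    centered = [(ax - mx, ay - my) for ax, ay in samples]

    # RMS of the resultant acceleration: overall sway magnitude.
    rms = math.sqrt(sum(ax * ax + ay * ay for ax, ay in centered) / n)

    # Path length: cumulative distance travelled by the acceleration trace.
    path = sum(
        math.hypot(centered[i][0] - centered[i - 1][0],
                   centered[i][1] - centered[i - 1][1])
        for i in range(1, n)
    )
    return rms, path
```

    Larger values of either metric indicate greater postural instability; a real app would add filtering, calibration, and clinically validated thresholds before reporting risk.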

    Around-Body Interaction: Leveraging Limb Movements for Interacting in a Digitally Augmented Physical World

    Full text link
    Recent technological advances have made head-mounted displays (HMDs) smaller and untethered, fostering the vision of ubiquitous interaction with information in a digitally augmented physical world. For interacting with such devices, three main types of input have emerged so far (besides finger gestures, which are not very intuitive): 1) touch input on the frame of the device, 2) touch input on accessories (controllers), and 3) voice input. While these techniques have both advantages and disadvantages depending on the current situation of the user, they largely ignore the skills and dexterity that we show when interacting with the real world: throughout our lives, we have trained extensively to use our limbs to interact with and manipulate the physical world around us. This thesis explores how the skills and dexterity of our upper and lower limbs, acquired and trained in interacting with the real world, can be transferred to the interaction with HMDs. Thus, this thesis develops the vision of around-body interaction, in which we use the space around our body, defined by the reach of our limbs, for fast, accurate, and enjoyable interaction with such devices. This work contributes four interaction techniques, two for the upper limbs and two for the lower limbs: The first contribution shows how the proximity between our head and hand can be used to interact with HMDs. The second contribution extends the interaction with the upper limbs to multiple users and illustrates how the registration of augmented information in the real world can support cooperative use cases. The third contribution shifts the focus to the lower limbs and discusses how foot taps can be leveraged as an input modality for HMDs. The fourth contribution presents how lateral shifts of the walking path can be exploited for mobile and hands-free interaction with HMDs while walking.
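    The fourth contribution leverages lateral shifts of the walking path as hands-free input. As a purely illustrative sketch (not the thesis's implementation), such a shift can be detected by comparing a smoothed recent lateral offset of the walking trajectory against a baseline taken at the start of the walk:

```python
def detect_shift(lateral_positions, window=5, threshold=0.3):
    """Detect a sustained lateral shift in a walking path.

    lateral_positions: sideways offsets (metres) sampled along the walk,
                       e.g. from the HMD's inside-out tracking.
    window: number of samples averaged to smooth step-to-step wobble.
    threshold: offset (metres) from the baseline that counts as input.
    Returns "left", "right", or None.
    """
    if len(lateral_positions) < 2 * window:
        return None
    baseline = sum(lateral_positions[:window]) / window   # start of walk
    recent = sum(lateral_positions[-window:]) / window    # current path
    offset = recent - baseline
    if offset > threshold:
        return "right"
    if offset < -threshold:
        return "left"
    return None
```

    The averaging window and threshold here are arbitrary; in practice they would be tuned so that natural gait variability does not trigger false commands.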

    Inclusive Augmented and Virtual Reality: A Research Agenda

    Get PDF
    Augmented and virtual reality experiences present significant barriers for disabled people, making it challenging to fully engage with immersive platforms. Whilst researchers have started to explore potential solutions addressing these accessibility issues, we currently lack a comprehensive understanding of the research areas requiring further investigation to support the development of inclusive AR/VR systems. To address current gaps in knowledge, we led a series of multidisciplinary sandpits with relevant stakeholders (i.e., academic researchers, industry specialists, people with lived experience of disability, assistive technologists, and representatives from disability organisations, charities, and special needs educational institutions) to collaboratively explore research challenges, opportunities, and solutions. Based on insights shared by participants, we present a research agenda identifying key areas where further work is required in relation to specific forms of disability (i.e., across the spectrum of physical, visual, cognitive, and hearing impairments), including wider considerations associated with the development of more accessible immersive platforms.

    Accessible On-Body Interaction for People With Visual Impairments

    Get PDF
    While mobile devices offer new opportunities for people with disabilities to gain independence in everyday activities, modern touchscreen-based interfaces can present accessibility challenges for low vision and blind users. Even with state-of-the-art screen readers, it can be difficult or time-consuming to select specific items without visual feedback. The smooth surface of the touchscreen provides little tactile feedback compared to physical button-based phones. Furthermore, in a mobile context, hand-held devices present additional accessibility issues when one or both of the user's hands are unavailable for interaction (e.g., one hand may be holding a cane or a dog leash). To improve mobile accessibility for people with visual impairments, I investigate on-body interaction, which employs the user's own skin surface as the input space. On-body interaction may offer an alternative or complementary means of mobile interaction for people with visual impairments by enabling non-visual interaction with extra tactile and proprioceptive feedback compared to a touchscreen. In addition, on-body input may free the user's hands and offer efficient interaction, as it can eliminate the need to pull out or hold the device. Despite this potential, little work has investigated the accessibility of on-body interaction for people with visual impairments. Thus, I begin by identifying the needs and preferences for accessible on-body interaction. From there, I evaluate user performance in target acquisition and shape drawing tasks on the hand compared to on a touchscreen. Building on these studies, I focus on the design, implementation, and evaluation of an accessible on-body interaction system for visually impaired users.
The contributions of this dissertation are: (1) identification of the perceived advantages and limitations of on-body input compared to a touchscreen phone, (2) empirical evidence of the performance benefits of on-body input over touchscreen input in terms of speed and accuracy, (3) implementation and evaluation of an on-body gesture recognizer using finger- and wrist-mounted sensors, and (4) design implications for accessible non-visual on-body interaction for people with visual impairments.
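    Contribution (3) describes an on-body gesture recognizer built on finger- and wrist-mounted sensors. The dissertation's recognizer is not reproduced here, but a minimal template matcher in the style of the well-known $1 unistroke recognizer (resample each stroke to a fixed number of points, then pick the nearest template by average point distance) conveys the basic idea of classifying traced gestures:

```python
import math

def resample(points, n=16):
    # Resample a 2D stroke to n evenly spaced points along its length.
    length = sum(math.dist(points[i - 1], points[i]) for i in range(1, len(points)))
    if length == 0:
        return [points[0]] * n
    step = length / (n - 1)
    out = [points[0]]
    pts = list(points)
    acc = 0.0
    i = 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if d > 0 and acc + d >= step:
            # Interpolate a new point exactly one step along the stroke.
            t = (step - acc) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            out.append(q)
            pts.insert(i, q)   # continue measuring from the inserted point
            acc = 0.0
        else:
            acc += d
        i += 1
    while len(out) < n:        # guard against floating-point shortfall
        out.append(pts[-1])
    return out[:n]

def classify(trace, templates):
    # Return the template name whose resampled stroke is closest on average.
    t = resample(trace)
    def dist(a, b):
        return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)
    return min(templates, key=lambda name: dist(t, resample(templates[name])))
```

    A real on-body recognizer would first reconstruct the 2D trace from the wearable sensor signals and would typically also normalize for scale and rotation, which this sketch omits.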

    The 16th international symposium on wearable computers, ISWC 2012, adjunct proceedings, Newcastle Upon Tyne, UK, June 18-22, 2012

    Get PDF