190 research outputs found

    DESIGNING AND IMPLEMENTING ACCESSIBLE WEARABLE INTERACTIONS FOR PEOPLE WITH MOTOR IMPAIRMENTS

    Emerging wearable technologies like fitness bands, smartwatches, and head-mounted displays (HMDs) are entering the mainstream market. Unlike smartphones and tablets, these wearables, worn on the body or clothing, are always available and have the potential to provide quick access to information [7]. For instance, HMDs can provide relatively hands-free interaction compared to smartphones, and smartwatches and activity trackers can continuously collect health- and fitness-related information about their wearer. However, there are over 20 million people in the U.S. with upper-body motor impairments [133] who may be unable to benefit from these wearables. For example, the small interaction spaces of smartwatches may present accessibility challenges. Yet, few studies have explored the potential impacts of these wearables, evaluated their accessibility, or investigated ways to design accessible wearable interactions for people with motor impairments. To inform the design of future wearable technologies, my dissertation investigates three threads of research: (1) assessing the accessibility of wearable technologies like HMDs, smartwatches, and fitness trackers; (2) understanding the potential impacts of sharing automatically tracked fitness-related information for people with mobility impairments; and (3) implementing and evaluating accessible interactions for HMDs and smartwatches. As part of my first research thread, I conducted two formative studies investigating the accessibility of HMDs and fitness trackers and found that people with motor impairments experienced accessibility challenges like problematic form factors, irrelevant data tracking, and difficulty with existing input. For my second research thread, I investigated the potential impacts of sharing automatically tracked data from fitness trackers with peers with similar impairments and with therapists, and presented design opportunities for tools to support sharing.
Towards my third research thread, I addressed the HMD accessibility issues identified earlier by building custom wearable touchpads to control a commercial HMD. Next, I explored the touchscreen and non-touchscreen areas (bezel, wristband, and the user's body) of smartwatches for accessible interaction. Lastly, I built bezel input and compared it with touchscreen input for accessible smartwatch interaction. The techniques implemented and evaluated in this dissertation will enable more equitable and independent use of wearable technologies for people with motor impairments.

    Investigando Natural User Interfaces (NUIs): tecnologias e interação em contexto de acessibilidade [Investigating Natural User Interfaces (NUIs): technologies and interaction in an accessibility context]

    Advisor: Maria Cecília Calani Baranauskas. Doctoral thesis, Universidade Estadual de Campinas, Instituto de Computação. Abstract: Natural User Interfaces (NUIs) represent a new interaction paradigm, promising to be more intuitive and easier to use than its mouse-and-keyboard predecessor.
In a context where technology is becoming increasingly invisible and pervasive, not only the number but also the diversity of people participating in this context is growing. It must therefore be studied how this new interaction paradigm can, in fact, be accessible to all the people who may use it in their daily routine. Furthermore, the paradigm itself must be characterized, to understand what makes it, in fact, natural. In this thesis we present the path we took in search of these two answers: how to characterize NUIs in the current technological context, and how to make NUIs accessible to all. To do so, we first present a systematic literature review of the state of the art. We then show a set of heuristics for the design and evaluation of NUIs, which were applied in practical case studies. Afterwards, we structure the ideas of this research within the artifacts of Organizational Semiotics, obtaining insights into how to design NUIs with accessibility, whether through Universal Design or by proposing Assistive Technologies. We then present three case studies with NUI systems that we designed. From these case studies, we expanded our theoretical framework and were finally able to identify three elements that sum up our characterization of NUI: differences, affordances, and enaction. Doctorate in Computer Science. Funding: 160911/2015-0, CAPES, CNPq.

    Accessible On-Body Interaction for People With Visual Impairments

    While mobile devices offer people with disabilities new opportunities to gain independence in everyday activities, modern touchscreen-based interfaces can present accessibility challenges for low-vision and blind users. Even with state-of-the-art screen readers, it can be difficult or time-consuming to select specific items without visual feedback. The smooth surface of the touchscreen provides little tactile feedback compared to physical button-based phones. Furthermore, in a mobile context, hand-held devices present additional accessibility issues when both of the user's hands are not available for interaction (e.g., one hand may be holding a cane or a dog leash). To improve mobile accessibility for people with visual impairments, I investigate on-body interaction, which employs the user's own skin surface as the input space. On-body interaction may offer an alternative or complementary means of mobile interaction for people with visual impairments by enabling non-visual interaction with extra tactile and proprioceptive feedback compared to a touchscreen. In addition, on-body input may free users' hands and offer efficient interaction, as it can eliminate the need to pull out or hold the device. Despite this potential, little work has investigated the accessibility of on-body interaction for people with visual impairments. Thus, I begin by identifying needs and preferences for accessible on-body interaction. From there, I evaluate user performance in target acquisition and shape drawing tasks on the hand compared to on a touchscreen. Building on these studies, I focus on the design, implementation, and evaluation of an accessible on-body interaction system for visually impaired users.
The contributions of this dissertation are: (1) identification of perceived advantages and limitations of on-body input compared to a touchscreen phone, (2) empirical evidence of the performance benefits of on-body input over touchscreen input in terms of speed and accuracy, (3) implementation and evaluation of an on-body gesture recognizer using finger- and wrist-mounted sensors, and (4) design implications for accessible non-visual on-body interaction for people with visual impairments.
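The abstract does not describe the gesture recognizer's algorithm. As an illustration of the general idea behind recognizers of this kind, the following is a minimal nearest-neighbor template matcher over resampled 2-D traces, in the style of the well-known $1 unistroke recognizer; the function names and gesture templates are illustrative assumptions, not details taken from the dissertation.

```python
import math

def resample(pts, n=16):
    """Resample a 2-D trace to n evenly spaced points along its path."""
    pts = [tuple(p) for p in pts]
    total = sum(math.dist(pts[i - 1], pts[i]) for i in range(1, len(pts)))
    interval = total / (n - 1)
    out, acc, i = [pts[0]], 0.0, 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if d > 0 and acc + d >= interval and len(out) < n:
            # Interpolate a new point exactly one interval along the path.
            t = (interval - acc) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            out.append(q)
            pts.insert(i, q)  # q becomes the start of the remaining segment
            acc = 0.0
        else:
            acc += d
        i += 1
    while len(out) < n:  # guard against floating-point round-off
        out.append(pts[-1])
    return out

def recognize(trace, templates, n=16):
    """Return the name of the template whose resampled path is closest to the trace."""
    r = resample(trace, n)
    def mean_dist(name):
        s = resample(templates[name], n)
        return sum(math.dist(a, b) for a, b in zip(r, s)) / n
    return min(templates, key=mean_dist)
```

For example, with templates `{"swipe right": [(0, 0), (1, 0)], "swipe up": [(0, 0), (0, 1)]}`, a noisy trace such as `[(0, 0), (0.4, 0.05), (1, 0.02)]` is matched to "swipe right". A practical recognizer would also normalize traces for scale and position before matching; that step is omitted here for brevity.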

    New Trends in Neuromechanics and Motor Rehabilitation

    Neuromechanics has been used to identify optimal rehabilitation protocols that successfully improve motor deficits in various populations, such as elderly people and individuals with neurological diseases (e.g., stroke, Parkinson’s disease, and essential tremor). By investigating structural and functional changes in the central and peripheral nervous systems based on neuromechanical theories and findings, we can expand our knowledge of the underlying neurophysiological mechanisms and specific motor impairment patterns before and after therapies to further develop new training programs (e.g., non-invasive brain stimulation). Thus, the aim of this Special Issue is to present the main contributions of researchers and rehabilitation specialists in biomechanics, motor control, neurophysiology, neuroscience, and rehabilitation science. The current collection provides new neuromechanical approaches addressing theoretical, methodological, and practical topics to facilitate progress in motor recovery.

    HandSight: A Touch-Based Wearable System to Increase Information Accessibility for People with Visual Impairments

    Many activities of daily living such as getting dressed, preparing food, wayfinding, or shopping rely heavily on visual information, and the inability to access that information can negatively impact the quality of life for people with vision impairments. While numerous researchers have explored solutions for assisting with visual tasks that can be performed at a distance, such as identifying landmarks for navigation or recognizing people and objects, few have attempted to provide access to nearby visual information through touch. Touch is a highly attuned means of acquiring tactile and spatial information, especially for people with vision impairments. By supporting touch-based access to information, we may help users to better understand how a surface appears (e.g., document layout, clothing patterns), thereby improving quality of life. To address this gap in research, this dissertation explores methods to augment a visually impaired user’s sense of touch with interactive, real-time computer vision to access information about the physical world. These explorations span three application areas: reading and exploring printed documents, controlling mobile devices, and identifying colors and visual textures. At the core of each application is a system called HandSight that uses wearable cameras and other sensors to detect touch events and identify surface content beneath the user’s finger. To create HandSight, we designed and implemented the physical hardware, developed signal processing and computer vision algorithms, and designed real-time feedback that enables users to interpret visual or digital content. We involved visually impaired users throughout the design and development process, conducting several user studies to assess usability and robustness and to improve our prototype designs.
The contributions of this dissertation include: (i) developing and iteratively refining HandSight, a novel wearable system to assist visually impaired users in their daily lives; (ii) evaluating HandSight across a diverse set of tasks, and identifying tradeoffs of a finger-worn approach in terms of physical design, algorithmic complexity and robustness, and usability; and (iii) identifying broader design implications for future wearable systems and for the fields of accessibility, computer vision, augmented and virtual reality, and human-computer interaction.
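One of the application areas named above, identifying colors, can be illustrated with a minimal sketch: mapping an RGB sample (as might be read from a finger-mounted camera) to the nearest named color for speech output. The palette and distance metric below are illustrative assumptions, not HandSight's actual implementation.

```python
import math

# Illustrative palette; HandSight's actual color vocabulary and matching
# method are not specified in the abstract.
PALETTE = {
    "black": (0, 0, 0),
    "white": (255, 255, 255),
    "red": (255, 0, 0),
    "green": (0, 128, 0),
    "blue": (0, 0, 255),
    "yellow": (255, 255, 0),
}

def nearest_color(rgb):
    """Return the palette name closest to the sampled RGB value (Euclidean distance)."""
    return min(PALETTE, key=lambda name: math.dist(PALETTE[name], rgb))
```

For example, a camera sample of `(250, 10, 10)` maps to "red". A production system would likely work in a perceptually uniform color space (e.g., CIELAB) rather than raw RGB, since Euclidean RGB distance does not match human color perception well.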

    Evaluating Context-Aware Applications Accessed Through Wearable Devices as Assistive Technology for Students with Disabilities

    The purpose of these two single-subject design studies was to evaluate the use of wearable and context-aware technologies for college students with intellectual disability and autism as tools to increase independence and vocational skills. There is a compelling need for the development of tools and strategies that will facilitate independence and self-sufficiency and address poor outcomes in adulthood for students with disabilities. Technology is considered to be a great equalizer for people with disabilities. The proliferation of new technologies allows access to real-time, contextually based information as a means to compensate for limitations in cognitive functioning and decrease the complexity of the prerequisite skills needed to use earlier technologies successfully. Six students participated in two single-subject design studies: three students participated in Study I and three different students participated in Study II. The results of these studies are discussed in the context of applying new technologies to help individuals with intellectual disability and autism self-manage technological supports to learn new skills, set reminders, and enhance independence. During Study I, students were successfully taught to use a wearable smartglasses device, which delivered digital auditory and visual information, to complete three novel vocational tasks. The results indicated that all students learned all vocational tasks using the wearable device. Students also continued to use the device beyond the initial training phase to self-direct their learning and self-manage prompts for task completion as needed. During Study II, students were successfully taught to use a wearable smartwatch device to enter novel appointments for the coming week, as well as complete the tasks associated with each appointment. The results indicated that all students were able to self-operate the wearable device to enter appointments, attend all appointments on time, and complete all associated tasks.

    Exploring the Landscape of Ubiquitous In-home Health Monitoring: A Comprehensive Survey

    Ubiquitous in-home health monitoring systems have become popular in recent years due to the rise of digital health technologies and the growing demand for remote health monitoring. These systems enable individuals to increase their independence by allowing them to monitor their health from home and gain more control over their well-being. In this study, we perform a comprehensive survey on this topic by reviewing a large body of literature in the area. We investigate these systems from various aspects, namely sensing technologies, communication technologies, intelligent and computing systems, and application areas. Specifically, we provide an overview of in-home health monitoring systems and identify their main components. We then present each component and discuss its role within in-home health monitoring systems. In addition, we provide an overview of the practical use of ubiquitous technologies in the home for health monitoring. Finally, we identify the main challenges and limitations based on the existing literature and provide eight recommendations for potential future research directions. We conclude that, despite extensive research on the individual components, the development of effective in-home health monitoring systems still requires further investigation. Comment: 35 pages, 5 figures