5 research outputs found

    SmartAbility: Detection of reduced physical abilities through smartphone sensors

    This paper describes an Android application developed to accompany the SmartAbility Framework, in order to recommend technologies for people with reduced physical ability. The framework is a culmination of previously conducted research, including requirements elicitation and technology trials. The application is based on a previous prototype version that required manual input of user abilities. The presented version detects the actions that users are able to perform independently through sensor technologies built into Android devices, such as the accelerometer and step counter. The knowledge contained within the framework is subsequently used to derive recommendations of technologies that could be suitable for the user. Future enhancements to the application will enable fully automatic detection, without the requirement for manual input. The exploitation of the SmartAbility application is anticipated to increase technology awareness amongst the user community, as well as providing a means for assistive technology manufacturers to promote their products.
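    The core idea above, deriving technology recommendations from the abilities a user can perform independently, can be sketched as a simple lookup. The ability names and technology entries below are illustrative assumptions, not the framework's actual knowledge base:

```python
# Hedged sketch of the SmartAbility recommendation step: map abilities a
# user can perform independently to candidate assistive technologies.
# ABILITY_TO_TECH is a made-up stand-in for the framework's knowledge.
ABILITY_TO_TECH = {
    "head_movement": ["head-tracking mouse", "switch-based scanning"],
    "speech": ["voice assistant", "speech-to-text input"],
    "walking": ["step-counter fitness coaching"],
}

def recommend(detected_abilities):
    """Return de-duplicated technology suggestions for detected abilities."""
    suggestions = []
    for ability in detected_abilities:
        for tech in ABILITY_TO_TECH.get(ability, []):
            if tech not in suggestions:
                suggestions.append(tech)
    return suggestions

print(recommend(["speech", "head_movement"]))
```

    In the published application the abilities would come from the device's sensors (e.g. the accelerometer detecting movement) rather than a hand-written list; the sketch only shows the mapping stage.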

    The development and evaluation of the SmartAbility Android Application to detect users’ abilities

    The SmartAbility Android Application recommends Assistive Technology (AT) for people with reduced physical ability by focusing on the actions (abilities) that can be performed independently. The Application utilises built-in sensor technologies in Android devices to detect user abilities, including head and limb movements, speech and blowing. The Application was evaluated by 18 participants with varying physical conditions and assessed through the System Usability Scale (SUS) and NASA Task Load Index (TLX). The Application achieved a SUS score of 72.5 (indicating ‘Good Usability’) with low levels of Temporal Demand and Frustration and medium levels of Mental Demand, Physical Demand and Effort. It is anticipated that the SmartAbility Application will be disseminated to the AT domain, to improve quality of life for people with reduced physical ability.
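    For readers unfamiliar with how a SUS score such as 72.5 is produced, the standard scoring procedure is: ten items rated 1-5, odd items contribute (rating − 1), even items contribute (5 − rating), and the sum is multiplied by 2.5 to give a 0-100 score. The sample responses below are illustrative, not the study's data:

```python
# Standard System Usability Scale (SUS) scoring (Brooke's scheme).
# Odd-numbered items are positively worded: contribute (rating - 1).
# Even-numbered items are negatively worded: contribute (5 - rating).
# The total (0-40) is scaled by 2.5 onto a 0-100 range.
def sus_score(ratings):
    if len(ratings) != 10:
        raise ValueError("SUS requires exactly 10 item ratings")
    total = 0
    for i, rating in enumerate(ratings, start=1):
        total += (rating - 1) if i % 2 == 1 else (5 - rating)
    return total * 2.5

# Illustrative respondent: 4 on every odd item, 2 on every even item.
sample = [4, 2, 4, 2, 4, 2, 4, 2, 4, 2]
print(sus_score(sample))  # each item contributes 3 -> 30 * 2.5 = 75.0
```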

    Automatic Detection of User Abilities through the SmartAbility Framework

    This paper presents a proposed smartphone application for the unique SmartAbility Framework that supports interaction with technology for people with reduced physical ability, by focusing on the actions that they can perform independently. The Framework is a culmination of knowledge obtained through previously conducted technology feasibility trials and controlled usability evaluations involving the user community. The Framework is an example of ability-based design that focuses on the abilities of users instead of their disabilities. The paper includes a summary of Versions 1 and 2 of the Framework, including the results of a two-phased validation approach, conducted at the UK Mobility Roadshow and via a focus group of domain experts. A holistic model developed by adapting the House of Quality (HoQ) matrix of the Quality Function Deployment (QFD) approach is also described. A systematic literature review of sensor technologies built into smart devices establishes the capabilities of sensors in the Android and iOS operating systems. The review defines a set of inclusion and exclusion criteria, as well as search terms used to elicit literature from online repositories. The key contribution is the mapping of ability-based sensor technologies onto the Framework, to enable the future implementation of a smartphone application. Through the exploitation of the SmartAbility application, the Framework will increase technology awareness amongst people with reduced physical ability and provide a promotional tool for assistive technology manufacturers.

    The development of a SmartAbility Framework to enhance multimodal interaction for people with reduced physical ability.

    Assistive technologies are an evolving market due to the number of people worldwide who have conditions resulting in reduced physical ability (also known as disability). Various classification schemes exist to categorise disabilities, as well as government legislation to ensure equal opportunities within the community. However, there is a notable absence of a process to map physical conditions to technologies in order to improve Quality of Life for this user group. This research is characterised primarily under the Human Computer Interaction (HCI) domain, although aspects of Systems of Systems (SoS) and Assistive Technologies have been applied. The thesis focuses on examples of multimodal interactions leading to the development of a SmartAbility Framework that aims to assist people with reduced physical ability by utilising their abilities to suggest interaction mediums and technologies. The framework was developed through a predominantly interpretivist methodology consisting of a variety of research methods, including state-of-the-art literature reviews, requirements elicitation, feasibility trials and controlled usability evaluations to compare multimodal interactions. The developed framework was subsequently validated through the involvement of the intended user community and domain experts, and supported by a concept demonstrator incorporating the SmartATRS case study. The aim and objectives of this research were achieved through the following key outputs and findings:
    - A comprehensive state-of-the-art literature review focussing on physical conditions and their classifications, HCI concepts relevant to multimodal interaction (ergonomics of human-system interaction, Design For All and Universal Design), SoS definition and analysis techniques involving System of Interest (SoI), and currently available products with potential uses as assistive technologies.
    - A two-phased requirements elicitation process applying surveys and semi-structured interviews to elicit the daily challenges of people with reduced physical ability, their interests in technology and the requirements for assistive technologies obtained through collaboration with a manufacturer.
    - Findings from feasibility trials involving monitoring brain activity using an electroencephalograph (EEG), tracking facial features through Tracking Learning Detection (TLD), applying iOS Switch Control to track head movements and investigating smartglasses.
    - Results of controlled usability evaluations comparing multimodal interactions with the technologies deemed feasible from the trials. The user community of people with reduced physical ability was involved throughout the process to maximise the usefulness of the data obtained.
    - An initial SmartDisability Framework developed from the results and observations ascertained through requirements elicitation, feasibility trials and controlled usability evaluations, which was validated through semi-structured interviews and a focus group.
    - An enhanced SmartAbility Framework that addressed the SmartDisability validation feedback by reducing the number of elements, using simplified and positive terminology and incorporating concepts from Quality Function Deployment (QFD).
    - A final consolidated version of the SmartAbility Framework that was validated through semi-structured interviews with additional domain experts and addressed all key suggestions.
    The results demonstrated that it is possible to map technologies to people with physical conditions by considering the abilities that they can perform independently, without external support or the exertion of significant physical effort. This led to a realisation that the term ‘disability’ has a negative connotation that can be avoided through the use of the phrase ‘reduced physical ability’. It is important to promote this rationale to the wider community through exploitation of the framework. This requires a SmartAbility smartphone application to be developed that allows users to input their abilities in order for recommendations of interaction mediums and technologies to be provided.

    The development of assistive technology to reveal knowledge of physical world concepts in young people who have profound motor impairments.

    Cognitively able children and young people who have profound motor impairments and complex communication needs (the target group or TG) face many barriers to learning, communication, personal development, physical interaction and play experiences, compared to their typically developing peers. Physical interaction (and play) are known to be important components of child development, but this group currently has few suitable ways in which to participate in these activities. Furthermore, the TG may have knowledge about real-world physical concepts despite having limited physical interaction experiences, but it can be difficult to reveal this knowledge, and conventional assessment techniques are not suitable for this group, largely due to accessibility issues. This work presents a pilot study involving a robotics-based intervention which enabled members of the TG to experience simulated physical interaction, and was designed to identify and develop the knowledge and abilities of the TG relating to physical concepts involving temporal, spatial or movement elements. The intervention involved the participants using an eye-gaze-controlled robotic arm with a custom-made haptic feedback device to complete a set of tasks. To address issues with assessing the TG, two new digital Assistive Technology (AT) accessible assessments were created for this research, one using static images, the other video clips. Two participants belonging to the TG took part in the study. The outcomes indicated a high level of capability in performing the tasks, with the participants exhibiting a level of knowledge and ability which was much higher than anticipated. One explanation for this finding could be that they have acquired this knowledge through past experiences and ‘observational learning’. The custom haptic device was found to be useful for assessing the participants’ sense of ‘touch’ in a way which is less invasive than conventional ‘pin-prick’ techniques.
    The new digital AT accessible assessments seemed especially suitable for one participant, while results were mixed for the other. This suggests that a combination of ‘traditional’ assessment and a ‘practical’ intervention assessment approach may help to provide a clearer, more rounded understanding of individuals within the TG. The work makes contributions to knowledge in the field of disability and Assistive Technology, specifically regarding: AT accessible assessments; haptic device design for the TG; the combination of robotics, haptics and eye gaze for use by the TG to interact with the physical world; a deeper understanding of the TG in general; and insights into designing for and working with the TG. The work and information gathered can help therapists and education staff to identify strengths and gaps in knowledge and skills, to focus learning and therapy activities appropriately, and to change the perceptions of those who work with this group, encouraging them to broaden their expectations of the TG.