
    Understanding older adults' perceptions of usefulness of an assistive home robot

    Developing robots that are useful to older adults is more than simply creating robots that complete household tasks. To ensure that older adults perceive a robot to be useful, careful consideration of the users’ capabilities, robot autonomy, and the task is needed (Venkatesh & Davis, 2000). The purpose of this study was to investigate the construct of perceived usefulness within the context of robot assistance. Mobile older adults (N = 12) and older adults with mobility loss (N = 12) participated in an autonomy-selection think-aloud task and a persona-based interview. Findings suggest that older adults with mobility loss preferred an autonomy level where they command/control the robot themselves, whereas mobile older adults’ preferences were split between commanding/controlling the robot themselves and the robot commanding/controlling itself. Reasons for these preferences were related to decision making and were task specific. Additionally, findings from the persona-based interview support the Technology Acceptance Model (TAM) constructs, as well as adaptability, reliability, and trust, as positively correlated with perceptions of usefulness. However, despite these positive correlations, barriers and facilitators of acceptance identified in the interviews suggest that perceived usefulness judgments are complex, and some questionnaire constructs were interpreted differently across participants. Thus, care should be taken when applying TAM constructs to other domains, such as robot assistance to promote older adult independence.

    Toward a Framework for Levels of Robot Autonomy in Human-Robot Interaction

    Autonomy is a critical construct related to human-robot interaction (HRI) and varies widely across robot platforms. Levels of robot autonomy (LORA), ranging from teleoperation to fully autonomous systems, influence the way in which humans and robots interact with one another. Thus, there is a need to understand HRI by identifying variables that influence, and are influenced by, robot autonomy. Our overarching goal is to develop a framework for LORA in HRI. To reach this goal, our framework draws links between HRI and human-automation interaction, a field with a long history of studying and understanding human-related variables. The construct of autonomy is reviewed and redefined within the context of HRI. Additionally, this framework proposes a process for determining a robot’s autonomy level by categorizing autonomy along a 10-point taxonomy. The framework is intended to be treated as a guideline for determining autonomy, categorizing the LORA along a qualitative taxonomy, and considering HRI variables (e.g., acceptance, situation awareness, reliability) that may be influenced by the LORA.
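
    To make the idea of placing a robot on a 10-point autonomy scale concrete, the minimal sketch below assumes autonomy can be scored by how much of the sensing, planning, and acting the robot handles rather than the human. The level names, the sense/plan/act criterion, and the bucketing rule are placeholder assumptions for illustration, not the taxonomy or process defined in the framework.

    ```python
    # Illustrative only: level names and scoring rule are placeholders, not the
    # taxonomy or process defined in the framework itself.
    from dataclasses import dataclass

    LORA_LEVELS = [  # hypothetical 10-point scale, teleoperation -> full autonomy
        "1 - manual teleoperation",
        "2 - assisted teleoperation",
        "3 - batch processing",
        "4 - decision support",
        "5 - shared control, human initiative",
        "6 - shared control, robot initiative",
        "7 - executive control",
        "8 - supervisory control",
        "9 - near-full autonomy",
        "10 - full autonomy",
    ]

    @dataclass
    class TaskAllocation:
        """Fraction of each primitive the robot (vs. the human) performs, 0.0-1.0."""
        sense: float
        plan: float
        act: float

    def estimate_lora(alloc: TaskAllocation) -> str:
        """Bucket an assumed sense/plan/act allocation onto the 10-point scale."""
        score = (alloc.sense + alloc.plan + alloc.act) / 3.0  # 0 = human does everything
        return LORA_LEVELS[min(int(score * 10), 9)]

    # Example: a robot that senses and acts largely on its own but leaves most
    # planning decisions to the human operator.
    print(estimate_lora(TaskAllocation(sense=0.9, plan=0.2, act=0.8)))
    ```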

    Design Considerations for Technology Interventions to Support Social and Physical Wellness for Older Adults with Disability

    Social and physical wellness are important considerations for maintaining one’s health into older age and remaining independent. However, some segments of the older adult population, such as those aging with disability, are at increased risk for loneliness and reduced physical activity, which could result in negative health consequences. There is a critical need to understand how to deploy social and physical wellness interventions for people aging with disability. We provide an overview of constructs related to social and physical wellness, as well as evidence-based interventions effective with older populations. Our review yields considerations for how interventions may need to be developed or modified to be efficacious for this population segment. Technology may be a key component in adopting interventions, particularly tele-technologies, which we define and discuss in depth.

    Leaving the Lecture Hall: Conducting HF/E Outside the Classroom

    Georgia Tech HF/E students initiated and managed a multi-semester project to experience the nuances of conducting HF/E outside the classroom setting. This article focuses on the lessons learned beyond the classroom: project management, team coordination, communication with non-HF/E team members, application of research methods, and integration of data to prioritize and guide design changes. The goal of this article is to help guide other HF/E students and educators when implementing similar projects by providing the lessons we learned from this experience.

    Recognizing facial expression of virtual agents, synthetic faces, and human faces: the effects of age and character type on emotion recognition

    An agent’s facial expression may communicate emotive state to users both young and old. The ability to recognize emotions has been shown to differ with age, with older adults more commonly misidentifying the facial emotions of anger, fear, and sadness. This research study examined whether emotion recognition of facial expressions differed between different types of on-screen agents and between age groups. Three on-screen characters were compared: a human, a synthetic human, and a virtual agent. In this study, 42 younger (age 28-28) and 42 older (age 65-85) adults completed an emotion recognition task with static pictures of the characters demonstrating four basic emotions (anger, fear, happiness, and sadness) and neutral. The human face resulted in the highest proportion match, followed by the synthetic human, and then the virtual agent with the lowest proportion match. Both the human and synthetic human faces resulted in age-related differences for the emotions anger, fear, sadness, and neutral, with younger adults showing a higher proportion match. The virtual agent showed age-related differences for the emotions anger, fear, happiness, and neutral, with younger adults showing a higher proportion match. The data analysis and interpretation of the present study differed from previous work by utilizing two approaches to understanding emotion recognition. First, the misattributions participants made when identifying emotions were investigated. Second, a similarity index of the feature placement between any two virtual agent emotions was calculated, suggesting that emotions were commonly misattributed as other emotions similar in appearance. Overall, these results suggest that age-related differences extend beyond human faces to other types of on-screen characters, and that differences between older and younger adults in emotion recognition may be further explained by perceptual discrimination between two emotions of similar feature appearance.
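
    The abstract does not say how the feature-placement similarity index was computed, so the sketch below is only one plausible reading: each expression is represented by (x, y) positions of a few facial features, and similarity is the inverse of the mean displacement between matching features. The feature names and coordinates are made up for illustration.

    ```python
    # Hypothetical sketch: the abstract does not give the formula, so this assumes
    # each expression is a dict of facial-feature (x, y) positions and defines
    # similarity as 1 / (1 + mean Euclidean displacement between matching features).
    import math

    def similarity_index(expr_a: dict, expr_b: dict) -> float:
        """Feature-placement similarity between two expressions (1.0 = identical)."""
        shared = expr_a.keys() & expr_b.keys()
        if not shared:
            raise ValueError("expressions share no facial features")
        mean_disp = sum(math.dist(expr_a[f], expr_b[f]) for f in shared) / len(shared)
        return 1.0 / (1.0 + mean_disp)

    # Toy feature placements (arbitrary units). Anger and fear both use a lowered
    # brow here, so they score as more similar than anger and happiness.
    anger     = {"brow": (0, -2), "eyelid": (0, -1), "mouth_corner": (1, -1)}
    fear      = {"brow": (0, -1), "eyelid": (0,  1), "mouth_corner": (1, -1)}
    happiness = {"brow": (0,  1), "eyelid": (0,  0), "mouth_corner": (2,  2)}

    print(similarity_index(anger, fear))       # higher: features placed similarly
    print(similarity_index(anger, happiness))  # lower: features placed differently
    ```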