44 research outputs found

    A Human-Robot Interaction Perspective on Assistive and Rehabilitation Robotics

    Assistive and rehabilitation devices are a promising and challenging field of recent robotics research. Motivated by societal needs such as aging populations, such devices can support motor functionality and subject training. The design, control, sensing, and assessment of the devices become more sophisticated due to a human in the loop. This paper gives a human–robot interaction perspective on current issues and opportunities in the field. On the topic of control and machine learning, approaches that support but do not distract subjects are reviewed. Options to provide sensory user feedback that are currently missing from robotic devices are outlined. Parallels between device acceptance and affective computing are made. Furthermore, requirements for functional assessment protocols that relate to real-world tasks are discussed. In all topic areas, the design of human-oriented frameworks and methods is dominated by challenges related to the close interaction between the human and robotic device. This paper discusses the aforementioned aspects in order to open up new perspectives for future robotic solutions.

    Peripheral neuroergonomics - An elegant way to improve Human-Robot Interaction?

    The day does not seem far away when robots will be an active part of our daily life, just like electric appliances already are. Hence, there is an increasing need for paradigms, tools, and techniques to design proper human-robot interaction in a human-centered fashion (Beckerle et al., 2017). To this end, appropriate Human-Machine Interfaces (HMIs) are required, and there is a growing body of research showing how the Peripheral Nervous System (PNS) might be the ideal channel through which this interaction could proficiently happen.

    Biosignal-Based Human–Machine Interfaces for Assistance and Rehabilitation: A Survey

    As a definition, a Human–Machine Interface (HMI) enables a person to interact with a device. Starting from elementary equipment, the recent development of novel techniques and unobtrusive devices for biosignal monitoring paved the way for a new class of HMIs, which take such biosignals as inputs to control various applications. The current survey reviews the large literature of the last two decades regarding biosignal-based HMIs for assistance and rehabilitation, to outline the state of the art and identify emerging technologies and potential future research trends. PubMed and other databases were surveyed using specific keywords. The retrieved studies were further screened at three levels (title, abstract, full text), and eventually 144 journal papers and 37 conference papers were included. Four macrocategories were considered to classify the different biosignals used for HMI control: biopotential, muscle mechanical motion, body motion, and their combinations (hybrid systems). The HMIs were also classified according to their target application by considering six categories: prosthetic control, robotic control, virtual reality control, gesture recognition, communication, and smart environment control. An ever-growing number of publications has been observed over the last years. Most of the studies (about 67%) pertain to the assistive field, while 20% relate to rehabilitation and 13% to assistance and rehabilitation. A moderate increase can be observed in studies focusing on robotic control, prosthetic control, and gesture recognition in the last decade. In contrast, studies on the other targets experienced only a small increase. Biopotentials are no longer the leading control signals, and the use of muscle mechanical motion signals has experienced a considerable rise, especially in prosthetic control. Hybrid technologies are promising, as they could lead to higher performance. However, they also increase HMIs' complexity, so their usefulness should be carefully evaluated for the specific application.
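The reported field breakdown lends itself to a simple tally over classified records. The sketch below uses made-up placeholder studies (not the survey's actual data, which comprised 181 papers) to show how such per-field percentages can be computed:

```python
from collections import Counter

# Hypothetical study records for illustration; the real survey classified
# 181 papers (144 journal + 37 conference) along the signal/target axes.
studies = [
    {"signal": "biopotential", "target": "prosthetic control", "field": "assistance"},
    {"signal": "muscle mechanical motion", "target": "prosthetic control", "field": "assistance"},
    {"signal": "hybrid", "target": "robotic control", "field": "rehabilitation"},
]

# Share of studies per application field, rounded to whole percent.
fields = Counter(s["field"] for s in studies)
total = sum(fields.values())
shares = {k: round(100 * v / total) for k, v in fields.items()}
print(shares)  # → {'assistance': 67, 'rehabilitation': 33}
```

With the survey's real records in place of the placeholders, the same tally would reproduce the reported 67/20/13 split.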

    A Closed-Loop Neurobotic System for Fine Touch Sensing

    Objective. Fine touch sensing relies on peripheral-to-central neurotransmission of somesthetic percepts, as well as on active motion policies shaping tactile exploration. This paper presents a novel neuroengineering framework for robotic applications based on the multistage processing of fine tactile information in the closed action–perception loop. Approach. The integrated system modules focus on (i) neural coding principles of spatiotemporal spiking patterns at the periphery of the somatosensory pathway, (ii) probabilistic decoding mechanisms mediating cortical-like tactile recognition, and (iii) decision-making and low-level motor adaptation underlying active touch sensing. We probed the resulting neural architecture through a Braille reading task. Main results. Our results on the peripheral encoding of primary contact features are consistent with experimental data on human slow-adapting type I mechanoreceptors. They also suggest that second-order processing by cuneate neurons may resolve perceptual ambiguities, contributing to a fast and highly performing online discrimination of Braille inputs by a downstream probabilistic decoder. The implemented multilevel adaptive control provides robustness to motion inaccuracy, while making the number of finger accelerations covariate with Braille character complexity. The resulting modulation of fingertip kinematics is coherent with that observed in human Braille readers. Significance. This work provides a basis for the design and implementation of modular neuromimetic systems for fine touch discrimination in robotics.
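The probabilistic decoding stage can be illustrated with a minimal maximum-likelihood sketch. Assuming each afferent channel emits Poisson-distributed spike counts with a character-specific mean rate, a decoder picks the character whose rate profile makes the observed counts most likely. The rates and character set below are invented for illustration; the paper's decoder operates on far richer spatiotemporal spiking patterns:

```python
import math

# Hypothetical per-character mean spike counts for three afferent channels
# (illustrative values, not from the paper).
rates = {
    "A": [8.0, 2.0, 1.0],
    "B": [2.0, 9.0, 3.0],
}

def log_poisson(k, lam):
    # log P(k | lam) for a Poisson-distributed spike count k
    return k * math.log(lam) - lam - math.lgamma(k + 1)

def decode(counts):
    # Maximum-likelihood character under independent Poisson channels
    # with a uniform prior over characters.
    def loglik(ch):
        return sum(log_poisson(k, lam) for k, lam in zip(counts, rates[ch]))
    return max(rates, key=loglik)

print(decode([7, 1, 2]))  # counts close to the "A" profile → 'A'
```

The same structure extends to time-binned counts, which is one way a downstream decoder can exploit spatiotemporal rather than purely rate-based codes.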

    Haptic Glove and Platform with Gestural Control for Neuromorphic Tactile Sensory Feedback in Medical Telepresence

    Advancements in the study of the human sense of touch are fueling the field of haptics. This is paving the way for augmenting sensory perception during object palpation in tele-surgery and reproducing the sensed information through tactile feedback. Here, we present a novel tele-palpation apparatus that enables the user to detect nodules of distinct stiffness buried in an ad hoc polymeric phantom. The contact force measured by the platform was encoded using a neuromorphic model and reproduced on the index fingertip of a remote user through a haptic glove embedding a piezoelectric disk. We assessed the effectiveness of this feedback in allowing nodule identification under two experimental conditions of real-time telepresence: In Line of Sight (ILS), where the platform was placed in the visible range of a user; and the more demanding Not In Line of Sight (NILS), with the platform and the user being 50 km apart. We found that the percentage of correct identifications was higher for stiffer inclusions than for softer ones (average of 74% within the duration of the task), in both telepresence conditions evaluated. These promising results call for further exploration of tactile augmentation technology for telepresence in medical interventions.
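The force-to-spike encoding step can be sketched with a generic leaky integrate-and-fire (LIF) neuron as a stand-in for the platform's neuromorphic model; all constants below are illustrative assumptions, not the apparatus parameters:

```python
# Minimal LIF encoder mapping a sampled contact-force trace to spike times.
# Constants (dt, tau, gain, v_thresh) are illustrative assumptions.

def lif_encode(force, dt=0.001, tau=0.02, gain=40.0, v_thresh=1.0):
    """Return spike times (s) for a list of force samples (N)."""
    v, spikes = 0.0, []
    for i, f in enumerate(force):
        # Leaky integration of the force-driven input current.
        v += dt * (-v / tau + gain * f)
        if v >= v_thresh:       # threshold crossing -> emit a spike
            spikes.append(i * dt)
            v = 0.0             # reset membrane state after the spike
    return spikes

# A stiffer inclusion yields a larger contact force and hence a higher
# spike rate on the actuator driving the fingertip.
soft = lif_encode([1.5] * 200)   # 200 ms of moderate force
stiff = lif_encode([3.0] * 200)  # 200 ms of stronger force
print(len(soft), len(stiff))
```

Spike trains of this kind can then drive the piezoelectric disk in the glove, so that perceived vibration intensity tracks the remotely measured stiffness.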

    Body and the senses in spatial experience: the implications of kinesthetic and synesthetic perceptions for design thinking

    Human perception has long been a critical subject of design thinking. While various studies have stressed the link between thinking and acting, particularly in spatial experience, the term "design thinking" seems to disconnect conceptual thinking from physical expression or process. Spatial perception is multimodal and fundamentally bound to the body, which is not a mere receptor of sensory stimuli but an active agent engaged with the perceivable environment. The body apprehends the experience in which one's kinesthetic engagement and knowledge play an essential role. Although design disciplines have integrated the abstract, metaphoric, and visual aspects of the body and its movement into conceptual thinking, studies have pointed out that design disciplines have emphasized visuality above the other sensory domains and heavily engaged with the perception of visual configurations, relying on the Gestalt principles. Gestalt psychology must be valued for its attention to a whole. However, the theories of design elements and principles over-emphasizing such visuality posit the aesthetics of design mainly as visual value and understate other sensorial and perceptual aspects. Although the visual approach may provide a practical means to represent and communicate ideas, a design process heavily driven by visuality can exhibit weaknesses that undermine certain aspects of spatial experience, despite its complexity. Grounded in Merleau-Ponty's notion of multisensory perception, this article discusses the relationship between body awareness and spatial perception and its implication for design disciplines concerning built environments. Special attention is given to the concepts of kinesthetic and synesthetic phenomena, known as multisensory and cross-sensory, respectively. This discussion integrates the corporeal and spatiotemporal realms of human experience into the discourse of kinesthetic and synesthetic perceptions. Based on the conceptual, theoretical, and precedent analyses, this article proposes three models for design thinking: Synesthetic Translation, Kinesthetic Resonance, and Kinesthetic Engagement. To discuss the concepts rooted in action-based perception and embodied cognition, this study borrows the neurological interpretation of haptic perception, interoception, and proprioception of space. This article suggests how consideration of the kinesthetic or synesthetic body can deepen and challenge the existing models of the perceptual aspects of environmental psychology adopted in design disciplines.