36 research outputs found

    Minimally Invasive Expeditionary Surgical Care Using Human-Inspired Robots

    This technical report serves as an updated collection of contributions from subject matter experts on surgical care using human-inspired robotics for human exploration. It summarizes the Blue Sky Meeting, organized by the Florida Institute for Human and Machine Cognition (IHMC), Pensacola, Florida, and held on October 2-3, 2018. It contains an executive summary, the final report, all of the presentation materials, and an updated reference list.

    EEG theta and Mu oscillations during perception of human and robot actions.

    The perception of others' actions supports important skills such as communication, intention understanding, and empathy. Are mechanisms of action processing in the human brain specifically tuned to process biological agents? Humanoid robots can perform recognizable actions, but can look and move differently from humans, and as such can be used in experiments to address such questions. Here, we recorded EEG as participants viewed actions performed by three agents. In the Human condition, the agent had biological appearance and motion. The other two conditions featured a state-of-the-art robot in two different appearances: Android, which had biological appearance but mechanical motion, and Robot, which had mechanical appearance and motion. We explored whether sensorimotor mu (8-13 Hz) and frontal theta (4-8 Hz) activity exhibited selectivity for biological entities, in particular whether the visual appearance and/or the motion of the observed agent was biological. Sensorimotor mu suppression has been linked to the motor simulation aspect of action processing (and the human mirror neuron system, MNS), and frontal theta to semantic and memory-related aspects. For all three agents, action observation induced significant attenuation in the power of mu oscillations, with no difference between agents. Thus, mu suppression, considered an index of MNS activity, does not appear to be selective for biological agents. Observation of the Robot resulted in greater frontal theta activity compared to the Android and the Human, whereas the latter two did not differ from each other. Frontal theta thus appears to be sensitive to visual appearance, suggesting that agents that are not sufficiently biological in appearance may impose greater memory processing demands on the observer. Studies combining robotics and neuroscience such as this one can allow us to explore the neural basis of action processing on the one hand, and inform the design of social robots on the other.
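The band-power analysis described above (mu at 8-13 Hz, theta at 4-8 Hz, with suppression expressed relative to a baseline) can be sketched roughly as follows. This is an illustrative example on a synthetic signal, not the study's actual pipeline; the sampling rate, seed, and amplitudes are assumptions.

```python
# Illustrative sketch: estimating mu (8-13 Hz) and theta (4-8 Hz) band power
# from an EEG trace via Welch's PSD, and expressing mu suppression as the
# log-ratio of action-observation power to baseline power.
# The signal here is synthetic; parameters are assumptions, not the paper's.
import numpy as np
from scipy.signal import welch

fs = 250  # sampling rate in Hz (assumed)
rng = np.random.default_rng(0)
t = np.arange(0, 4, 1 / fs)

def band_power(x, fs, lo, hi):
    """Mean PSD within [lo, hi] Hz."""
    f, pxx = welch(x, fs=fs, nperseg=fs * 2)
    mask = (f >= lo) & (f <= hi)
    return pxx[mask].mean()

# Baseline: strong 10 Hz mu rhythm plus noise.
baseline = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
# Action observation: attenuated mu rhythm (simulated suppression).
action = 0.4 * np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

mu_suppression = np.log(band_power(action, fs, 8, 13) /
                        band_power(baseline, fs, 8, 13))
theta_power = band_power(action, fs, 4, 8)
print(f"mu suppression (log ratio): {mu_suppression:.2f}")  # negative => attenuation
```

A negative log-ratio corresponds to the mu attenuation during action observation reported in the abstract.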

    Artificial Companion: building an impacting relation

    In this paper we show that we are witnessing an evolution from traditional human-computer interaction to a kind of intense exchange between the human user and a new generation of virtual or real systems - Embodied Conversational Agents (ECAs) or affective robots - bringing the interaction to another level, the "relation level". We call these systems "companions", that is to say, systems with which the user wants to build a kind of life-long relationship. We thus argue that we need to go beyond the concepts of acceptability and believability of systems to get closer to the human, and look toward the concept of "impact". We will see that this problem is shared between the research communities working on Embodied Conversational Agents (ECAs) and on affective robotics. We put forward a definition of an "impacting relation" that will enable believable interactive ECAs or robots to become believable impacting companions.

    Development of a Multisensor-Based Bio-Botanic Robot and Its Implementation Using a Self-Designed Embedded Board

    This paper presents the design concept of a bio-botanic robot whose behavior is based on plant growth. In addition, it can reflect the different phases of plant growth depending on the proportional amounts of light, temperature, and water. The mechanism is made up of a processed aluminum base, a spring, polydimethylsiloxane (PDMS), and an actuator, which together constitute the plant base and plant body. The control system consists of two microcontrollers and a self-designed embedded development board, where the main controller transmits the values of the environmental sensing module within the embedded board to a sub-controller. The sub-controller determines the growth stage, growth height, and timing, and transmits its decision value back to the main controller. Finally, based on the data transmitted by the sub-controller, the main controller controls the growth phase of the bio-botanic robot using a servo motor and a leaf actuator. The research result not only helps children understand the variation of plant growth but is also entertainment-educational, demonstrating the growth process of the bio-botanic robot in a short time.
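The sub-controller's role above (mapping light, temperature, and water readings to a growth stage and height) could be sketched as below. The weights, thresholds, stage names, and heights are all assumptions for illustration; the paper does not specify them.

```python
# A minimal sketch of the sub-controller's decision logic: mapping
# normalized light, temperature, and water readings to a growth stage and
# a target height. All thresholds and stage names are assumptions.
def growth_stage(light, temperature, water):
    """Return (stage, height_mm) from normalized sensor readings in [0, 1]."""
    # A crude growth score weighting the three environmental factors.
    score = 0.4 * light + 0.3 * temperature + 0.3 * water
    if score < 0.25:
        return "seed", 0
    elif score < 0.5:
        return "sprout", 30
    elif score < 0.75:
        return "growing", 80
    return "bloom", 120

stage, height = growth_stage(light=0.9, temperature=0.7, water=0.8)
print(stage, height)  # → bloom 120
```

In the robot, the returned stage would drive the servo motor and leaf actuator via the main controller.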

    Deep neural network approach in human-like redundancy optimization for anthropomorphic manipulators

    © 2013 IEEE. Human-like behavior has emerged in robotics as a way of improving the quality of Human-Robot Interaction (HRI). For imitating human-like behavior, kinematic mapping between a human arm and a robot manipulator is one of the popular solutions. To fulfill this requirement, a reconstruction method called swivel motion was adopted to achieve human-like imitation. This approach models the regression relationship between the robot pose and the swivel motion angle, then reproduces human-like swivel motion using the manipulator's redundant degrees of freedom. This characteristic holds for most redundant anthropomorphic robots. Although artificial neural network (ANN) based approaches show moderate robustness, their predictive performance is limited. In this paper, we propose a novel deep convolutional neural network (DCNN) structure to enhance reconstruction and reduce online prediction time. Finally, we used the trained DCNN model for redundancy control of a 7-DoF anthropomorphic robot arm (LWR4+, KUKA, Germany) for validation. A demonstration is presented to show the human-like behavior of the anthropomorphic manipulator. The proposed approach can also be applied to control other anthropomorphic robot manipulators in industrial or biomedical engineering applications.
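The swivel angle that the regression above predicts has a simple geometric meaning: for a 7-DoF arm, the elbow can rotate about the shoulder-wrist axis without moving the wrist, and the swivel angle parameterizes that redundant circle. A geometric sketch, with the reference direction and frame choices as assumptions:

```python
# Geometric sketch of the swivel angle for a redundant anthropomorphic arm:
# the angle of the elbow about the shoulder-wrist axis, measured from the
# projection of a reference direction (default: downward). Frame choices
# are assumptions for illustration.
import numpy as np

def swivel_angle(shoulder, elbow, wrist, ref=np.array([0.0, 0.0, -1.0])):
    """Signed elbow angle about the shoulder-wrist axis, in radians."""
    axis = wrist - shoulder
    axis = axis / np.linalg.norm(axis)
    # Project the elbow offset and reference direction onto the plane
    # normal to the shoulder-wrist axis.
    e = elbow - shoulder
    e_perp = e - np.dot(e, axis) * axis
    r_perp = ref - np.dot(ref, axis) * axis
    e_perp = e_perp / np.linalg.norm(e_perp)
    r_perp = r_perp / np.linalg.norm(r_perp)
    # Signed angle between the two projections.
    cos_a = np.clip(np.dot(r_perp, e_perp), -1.0, 1.0)
    sign = np.sign(np.dot(np.cross(r_perp, e_perp), axis))
    return sign * np.arccos(cos_a)

s = np.array([0.0, 0.0, 0.0])
w = np.array([0.6, 0.0, 0.0])
e = np.array([0.3, 0.0, -0.2])  # elbow hanging below the shoulder-wrist line
print(np.degrees(swivel_angle(s, e, w)))  # → 0.0 (elbow at the reference direction)
```

The learned model in the paper replaces an analytic choice of this angle with a pose-conditioned prediction, so the arm selects the human-like elbow position among the infinitely many valid ones.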

    Pressure mapping using nanocomposite-enhanced foam and machine learning

    Pressure mapping has garnered considerable interest in the healthcare and robotic industries. Low-cost and large-area compliant devices, as well as fast and effective computational algorithms, have been proposed in the last few years to facilitate distributed pressure sensing. One approach is to use electrical impedance tomography (EIT) to reconstruct the contact pressure distribution of piezoresistive materials. While tremendous success has been demonstrated, conventional algorithms may be unsuitable for real-time monitoring due to their computational demands and runtime. Moreover, the low resolution of reconstructed images is a well-known issue related to the regularization strategies typically employed in traditional EIT methods. Therefore, in this study, two different supervised machine learning (ML) approaches, namely radial basis function networks and deep neural networks, were employed to efficiently solve the inverse EIT problem and improve the resolution of reconstructed pressure maps. The demonstration of high-resolution pressure mapping, specifically for identifying pressure hotspots, was achieved using a carbon nanotube-based thin film integrated with foam.
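The radial basis function (RBF) network idea mentioned above amounts to learning a direct regression from boundary measurements to the pressure map, sidestepping iterative EIT reconstruction at inference time. A toy sketch, where the synthetic linear map standing in for the EIT relationship, the layer sizes, and the kernel width are all assumptions:

```python
# Minimal sketch of an RBF-network regression from boundary measurements
# to a pressure map. The data and the linear "ground truth" map are
# synthetic stand-ins for the real EIT forward/inverse relationship.
import numpy as np

rng = np.random.default_rng(1)
n_meas, n_pixels = 16, 64          # boundary measurements -> pressure pixels
A = rng.standard_normal((n_pixels, n_meas)) * 0.1  # toy mapping to learn

# Training data: measurement vectors and their (toy) pressure maps.
X = rng.standard_normal((200, n_meas))
Y = X @ A.T

# RBF layer: Gaussian activations around fixed centers, then a linear
# readout solved in closed form by least squares.
centers = X[rng.choice(len(X), 32, replace=False)]
gamma = 0.05

def rbf_features(X, centers, gamma):
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

Phi = rbf_features(X, centers, gamma)
W, *_ = np.linalg.lstsq(Phi, Y, rcond=None)

# Reconstruct a held-out pressure map in a single forward pass.
x_test = rng.standard_normal((1, n_meas))
y_hat = rbf_features(x_test, centers, gamma) @ W
err = np.linalg.norm(y_hat - x_test @ A.T) / np.linalg.norm(x_test @ A.T)
print(f"relative reconstruction error: {err:.3f}")
```

The runtime advantage claimed in the abstract comes from this structure: reconstruction is one feature evaluation and one matrix multiply, rather than an iterative regularized inversion.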

    The thing that should not be: predictive coding and the uncanny valley in perceiving human and humanoid robot actions

    Using functional magnetic resonance imaging (fMRI) repetition suppression, we explored the selectivity of the human action perception system (APS), which consists of temporal, parietal and frontal areas, for the appearance and/or motion of the perceived agent. Participants watched body movements of a human (biological appearance and movement), a robot (mechanical appearance and movement) or an android (biological appearance, mechanical movement). With the exception of the extrastriate body area, which showed more suppression for human-like appearance, the APS was not selective for appearance or motion per se. Instead, distinctive responses were found to the mismatch between appearance and motion: whereas suppression effects for the human and robot were similar to each other, they were stronger for the android, notably in the bilateral anterior intraparietal sulcus, a key node in the APS. These results could reflect increased prediction error as the brain negotiates an agent that appears human but does not move biologically, and help explain the ‘uncanny valley’ phenomenon.