Development of an intelligent object for grasp and manipulation research
Kõiva R, Haschke R, Ritter H. Development of an intelligent object for grasp and manipulation research. Presented at ICAR 2011, Tallinn, Estonia. In this paper we introduce a novel device, called iObject, which is equipped with tactile and motion tracking sensors that allow for the evaluation of human and robot grasping and manipulation actions. Contact location and contact force, object acceleration in space (6D) and orientation relative to the earth (3D magnetometer) are measured and transmitted wirelessly over a Bluetooth connection. By allowing human-human, human-robot and robot-robot comparisons to be made, iObject is a versatile tool for studying manual interaction. To demonstrate the efficiency and flexibility of iObject for the study of bimanual interactions, we report on a physiological experiment and evaluate the main parameters of the considered dual-handed manipulation task.
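As a hedged illustration of the kind of data stream such a device exposes, the sketch below decodes one hypothetical sensor frame; the field layout, scales, and names are assumptions, since the abstract does not specify the iObject wire format.

```python
import struct

# Assumed little-endian frame: taxel id (uint16), contact force in N
# (float32), 6-DoF acceleration (6 x float32), 3-axis magnetometer
# (3 x float32). The actual iObject protocol is not given in the abstract.
FRAME_FMT = "<Hf6f3f"
FRAME_SIZE = struct.calcsize(FRAME_FMT)  # 42 bytes under this assumption

def decode_frame(payload: bytes) -> dict:
    """Unpack one hypothetical iObject Bluetooth frame into named fields."""
    taxel, force, ax, ay, az, wx, wy, wz, mx, my, mz = \
        struct.unpack(FRAME_FMT, payload)
    return {
        "taxel": taxel,                        # contact location (cell id)
        "force_N": force,                      # contact force
        "accel_6d": (ax, ay, az, wx, wy, wz),  # linear + angular acceleration
        "mag": (mx, my, mz),                   # orientation relative to earth
    }
```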
Interactive Force Control Based on Multimodal Robot Skin for Physical Human-Robot Collaboration
This work proposes and realizes a control architecture that can support the deployment of a large-scale robot skin in a Human-Robot Collaboration scenario. It is shown how whole-body tactile feedback can extend the capabilities of robots during dynamic interactions by providing information about multiple contacts across the robot's surface. Specifically, an uncalibrated skin system is used to implement stable force control while simultaneously handling the multi-contact interactions of a user. The system formulates control tasks for force control, tactile guidance, collision avoidance, and compliance, and fuses them with a multi-priority redundancy resolution strategy. The approach is evaluated on an omnidirectional mobile manipulator with dual arms covered with robot skin. Results are assessed under dynamic conditions, showing that multi-modal tactile information enables robust force control while at the same time remaining responsive to a user's interactions.
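The multi-priority fusion named above is commonly realized with null-space projection; the following minimal sketch shows that standard scheme, with placeholder Jacobians and task velocities rather than the paper's actual controller.

```python
import numpy as np

def prioritized_qdot(tasks, n_joints, rcond=1e-4):
    """tasks: (J, xdot) pairs ordered from highest to lowest priority,
    e.g. force control, tactile guidance, collision avoidance, compliance.
    Each task is executed in the null space of all higher-priority tasks."""
    qdot = np.zeros(n_joints)
    N = np.eye(n_joints)                           # remaining motion freedom
    for J, xdot in tasks:
        JN = J @ N
        JN_pinv = np.linalg.pinv(JN, rcond=rcond)  # truncated SVD for robustness
        qdot = qdot + JN_pinv @ (xdot - J @ qdot)  # correct residual task error
        N = N - JN_pinv @ JN                       # shrink the null space
    return qdot
```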
NASA space station automation: AI-based technology review
Research and Development projects in automation for the Space Station are discussed. Artificial Intelligence (AI) based automation technologies are planned to enhance crew safety through reduced need for EVA, increase crew productivity through the reduction of routine operations, increase space station autonomy, and augment space station capability through the use of teleoperation and robotics. AI technology will also be developed for the servicing of satellites at the Space Station, system monitoring and diagnosis, space manufacturing, and the assembly of large space structures.
A flexible sensor technology for the distributed measurement of interaction pressure
We present a sensor technology for the measurement of physical human-robot interaction pressure, developed over recent years at Scuola Superiore Sant'Anna. The system is composed of flexible matrices of opto-electronic sensors covered by a soft silicone cover. This sensory system is completely modular and scalable, allowing one to cover areas of any size and shape and to measure different pressure ranges. In this work we present the main application areas for this technology. A first generation of the system was used to monitor human-robot interaction in upper-limb (NEUROExos; Scuola Superiore Sant'Anna) and lower-limb (LOPES; University of Twente) exoskeletons for rehabilitation. A second generation, with increased resolution and a wireless connection, was used to develop a pressure-sensitive foot insole and an improved human-robot interaction measurement system. The experimental characterization of the latter system, along with its validation on three healthy subjects, is presented here for the first time. A perspective on future uses and developments of the technology is finally outlined.
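A minimal sketch of how one modular patch of such a matrix might be scanned and converted to pressure is given below; read_cell() and the calibration constants are hypothetical stand-ins for the real hardware interface.

```python
import numpy as np

ROWS, COLS = 8, 8           # one modular sensor patch (size is illustrative)
GAIN_KPA_PER_COUNT = 0.05   # assumed calibration slope
OFFSET_COUNTS = 512         # assumed zero-pressure ADC reading

def read_cell(r: int, c: int) -> int:
    """Placeholder for the hardware driver returning one raw ADC count."""
    raise NotImplementedError

def scan_pressure_map() -> np.ndarray:
    """Return a ROWS x COLS map of interaction pressure in kPa."""
    raw = np.array([[read_cell(r, c) for c in range(COLS)]
                    for r in range(ROWS)], dtype=float)
    return np.clip((raw - OFFSET_COUNTS) * GAIN_KPA_PER_COUNT, 0.0, None)
```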
Active haptic perception in robots: a review
In the past few years a new scenario for robot-based applications has emerged. Service and mobile robots have opened new market niches, and new frameworks for shop-floor robot applications have been developed. In all these contexts, robots are required to perform tasks in open-ended, possibly dynamically varying, conditions. These new requirements also call for a change of paradigm in the design of robots: on-line and safe feedback motion control becomes the core of modern robot systems. Future robots will learn autonomously, interact safely, and possess qualities like self-maintenance. Attaining these features would have been relatively easy if a complete model of the environment were available, and if the robot actuators could execute motion commands perfectly relative to this model. Unfortunately, a complete world model is not available, and robots have to plan and execute tasks in the presence of environmental uncertainties, which makes sensing an important component of new-generation robots. For this reason, today's new-generation robots are equipped with more and more sensing components, and consequently they are ready to actively deal with the high complexity of the real world. Complex sensorimotor tasks such as exploration require coordination between the motor system and the sensory feedback. For robot control purposes, sensory feedback should be adequately organized in terms of relevant features and the associated data representation. In this paper, we propose an overall functional picture linking sensing to action in closed-loop sensorimotor control of robots for touch (hands, fingers). Basic qualities of haptic perception in humans inspire the models and categories comprising the proposed classification. The objective is to provide a reasoned, principled perspective on the connections between the different taxonomies used in the robotics and human haptics literature. The specific case of active exploration is chosen to ground interesting use cases, for two reasons. First, in the literature on haptics, exploration has been treated only to a limited extent compared to grasping and manipulation. Second, exploration involves specific robot behaviors that exploit distributed and heterogeneous sensory data.
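To make the sensing-to-action link concrete, here is a deliberately minimal sense-act loop for tactile exploration, with hypothetical sensor and arm interfaces; it illustrates the closed-loop organization the review discusses, not any specific system from it.

```python
def explore_surface(sensor, arm, steps=100, target_force=1.0, gain=0.002):
    """Slide a fingertip along a surface while regulating contact force.
    `sensor` and `arm` are assumed interfaces: sensor.normal_force() returns
    the measured normal force in N, arm.move_delta() commands a small motion."""
    for _ in range(steps):
        force = sensor.normal_force()       # extract the relevant feature
        dz = gain * (target_force - force)  # approach if light, retreat if hard
        arm.move_delta(dx=0.001, dz=dz)     # advance tangentially, adjust normally
```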
On the Use of Large Area Tactile Feedback for Contact Data Processing and Robot Control
The progress in microelectronics and embedded systems has recently enabled the realization of devices for robots functionally similar to the human skin, providing large-area tactile feedback over the whole robot body. The availability of such systems, commonly referred to as robot skins, makes it possible to measure the contact pressure distribution applied to the robot body over an arbitrary area. Large-area tactile systems open new scenarios for contact processing, at both the control and the cognitive level, enabling the interpretation of physical contacts. This thesis addresses these topics by proposing techniques that exploit large-area tactile feedback for: (i) contact data processing and classification; (ii) robot control.
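As a hedged example of contribution (i), contact data processing of this kind often starts by segmenting the skin's pressure map into distinct contact regions; the sketch below uses standard connected-component labeling with illustrative thresholds and is not the thesis's specific pipeline.

```python
import numpy as np
from scipy import ndimage

def segment_contacts(pressure_map: np.ndarray, threshold: float = 0.5):
    """Split a taxel pressure map into contacts; return centroid and total
    pressure per contact (threshold value is illustrative)."""
    active = pressure_map > threshold   # suppress sensor noise
    labels, n = ndimage.label(active)   # one label per contact blob
    contacts = []
    for k in range(1, n + 1):
        contacts.append({
            "centroid": ndimage.center_of_mass(pressure_map, labels, k),
            "total_pressure": pressure_map[labels == k].sum(),
        })
    return contacts
```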
Proximity and Visuotactile Point Cloud Fusion for Contact Patches in Extreme Deformation
Equipping robots with the sense of touch is critical to emulating the capabilities of humans in real-world manipulation tasks. Visuotactile sensors are a popular tactile sensing strategy due to data output compatible with computer vision algorithms and accurate, high-resolution estimates of local object geometry. However, these sensors struggle to accommodate high deformations of the sensing surface during object interactions, hindering more informative contact with cm-scale objects frequently encountered in the real world. The soft interfaces of visuotactile sensors are often made of hyperelastic elastomers, which are difficult to simulate quickly and accurately when extremely deformed for tactile information. Additionally, many visuotactile sensors that rely on strict internal light conditions or pattern tracking will fail if the surface is highly deformed. In this work, we propose an algorithm that fuses proximity and visuotactile point clouds for contact patch segmentation that is entirely independent of membrane mechanics. This algorithm exploits the synchronous, high-resolution proximity and visuotactile modalities enabled by an extremely deformable, selectively transmissive soft membrane, which uses visible light for visuotactile sensing and infrared light for proximity depth. We present the hardware design, membrane fabrication, and evaluation of our contact patch algorithm in low (10%), medium (60%), and high (100%+) membrane strain states. We compare our algorithm against three baselines: proximity-only, tactile-only, and a membrane mechanics model. Our proposed algorithm outperforms all baselines with an average RMSE under 2.8 mm of the contact patch geometry across all strain ranges. We demonstrate our contact patch algorithm in four applications: varied-stiffness membranes, torque- and shear-induced wrinkling, closed-loop control for whole-body manipulation, and pose estimation.
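The fusion principle stated above, contact wherever the membrane surface meets the object surface, can be sketched directly on point clouds; the nearest-neighbor tolerance test below mirrors that membrane-mechanics-free idea but is an assumption, not the authors' implementation.

```python
import numpy as np
from scipy.spatial import cKDTree

def contact_patch(tactile_pts: np.ndarray, proximity_pts: np.ndarray,
                  tol: float = 0.002) -> np.ndarray:
    """Return membrane (visuotactile) points lying within `tol` meters of
    the proximity (object) cloud, i.e. where the two surfaces coincide."""
    dists, _ = cKDTree(proximity_pts).query(tactile_pts)  # nearest-neighbor gap
    return tactile_pts[dists < tol]
```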
Adaptive physical human-robot interaction (PHRI) with a robotic nursing assistant.
Recently, more and more robots are being investigated for future applications in healthcare. For instance, in nursing assistance, seamless Human-Robot Interaction (HRI) is very important for sharing workspaces and workloads between medical staff, patients, and robots. In this thesis we introduce a novel robot, the Adaptive Robot Nursing Assistant (ARNA), and its underlying components. ARNA has been designed specifically to assist nurses with day-to-day tasks such as walking patients, pick-and-place item retrieval, and routine patient health monitoring. Adaptive HRI in nursing applications creates a positive user experience and increases nurse productivity and task completion rates, as reported by experimentation with human subjects. ARNA has been designed to include interface devices such as tablets, force sensors, pressure-sensitive robot skins, LIDAR, and an RGBD camera. These interfaces are combined with adaptive controllers and estimators within a proposed framework that contains multiple innovations.

A research study was conducted on methods of deploying an ideal Human-Machine Interface (HMI), in this case a tablet-based interface. The initial study indicates that a traded-control level of autonomy is ideal for teleoperation of ARNA by a patient. The proposed method of using the HMI devices makes the performance of the robot similar for both skilled and unskilled workers. A neuro-adaptive controller (NAC), which contains several neural networks to estimate and compensate for system non-linearities, was implemented on the ARNA robot. By linearizing the system, a cross-over usability condition is met through which humans find it more intuitive to learn to use the robot in any location of its workspace. A novel Base-Sensor Assisted Physical Interaction (BAPI) controller is introduced in this thesis, which utilizes a force-torque sensor at the base of the ARNA robot manipulator to detect full-body collisions and make interaction safer. Finally, a human-intent estimator (HIE) is proposed to estimate human intent while the robot and user are physically collaborating during certain tasks, such as adaptive walking. The NAC with the HIE module was validated on a PR2 robot through user studies; its implementation on the ARNA robot platform can be easily accomplished, as the controller is model-free and can learn robot dynamics online.

A new framework, Directive Observer and Lead Assistant (DOLA), is proposed for ARNA which enables the user to interact with the robot in two modes: physically, by direct push-guiding, and remotely, through a tablet interface. In both cases, the human is being “observed” by the robot, then guided and/or advised during interaction. If the user has trouble completing the given tasks, the robot adapts its repertoire to lead the user toward completing the goals. The proposed framework incorporates interface devices as well as adaptive control systems in order to facilitate a higher-performance interaction between the user and the robot than was previously possible. The ARNA robot was deployed and tested in a hospital environment at the School of Nursing of the University of Louisville. The user-experience tests were conducted with the help of healthcare professionals, and several metrics, including completion time, completion rate, and level of user satisfaction, were collected to shed light on the performance of various components of the proposed framework. The results indicate an overall positive response towards the use of such an assistive robot in the healthcare environment.
The analysis of the gathered data is included in this document. To summarize, this research study makes the following contributions: (i) user-experience studies with the ARNA robot in patient-sitter and walker scenarios, evaluating both physical and non-physical human-machine interfaces; (ii) evaluation and validation of the human-intent estimator (HIE) and neuro-adaptive controller (NAC); (iii) the novel Base-Sensor Assisted Physical Interaction (BAPI) controller; (iv) simulation models for packaged tactile sensors, validated against experimental data; and (v) the Directive Observer and Lead Assistant (DOLA) framework for ARNA using adaptive interfaces.
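As a hedged illustration of the BAPI idea, the check below flags a whole-body collision when the wrench measured at the manipulator base deviates from the wrench predicted from the arm's own motion; the prediction model and threshold are placeholders, not the thesis's controller.

```python
import numpy as np

FORCE_RESIDUAL_LIMIT_N = 15.0   # assumed safety threshold

def collision_detected(measured_wrench: np.ndarray,
                       predicted_wrench: np.ndarray) -> bool:
    """Wrenches are 6-vectors [Fx, Fy, Fz, Tx, Ty, Tz] at the robot base."""
    residual = measured_wrench - predicted_wrench  # unexplained external load
    return bool(np.linalg.norm(residual[:3]) > FORCE_RESIDUAL_LIMIT_N)
```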