Visual servoing by partitioning degrees of freedom
There are many design factors and choices when mounting a vision system for robot control. Such factors include the kinematic and dynamic characteristics of the robot's degrees of freedom (DOF), which determine what velocities and fields of view a camera can achieve. Another is that additional motion components (such as pan-tilt units) are often mounted on a robot and introduce synchronization problems. When a task does not require visually servoing every robot DOF, the designer must choose which ones to servo. Questions then arise as to what roles, if any, the remaining DOF play in the task. Without an analytical framework, the designer resorts to intuition and try-and-see implementations. This paper presents a frequency-based framework that identifies the parameters that factor into tracking. The framework gives design insight that was then used to synthesize a control law exploiting the kinematic and dynamic attributes of each DOF. The resulting multi-input multi-output control law, which we call partitioning, defines an underlying joint coupling to servo camera motions. The net effect is that, by employing both visual and kinematic feedback loops, a robot can quickly position and orient a camera in a large assembly workcell. Real-time experiments tracking people and robot hands are presented using a 5-DOF hybrid (3-DOF Cartesian gantry plus 2-DOF pan-tilt unit) robot.
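The core idea of partitioning, splitting the DOF into a fast visually-servoed group and a slower kinematically-coupled group, can be sketched as follows. This is a minimal illustration, not the paper's actual control law; the gains, the choice of which DOF are "fast", and the coupling structure are all assumptions for the example.

```python
import numpy as np

# Illustrative constants (not from the paper).
K_IMG = 2.0      # visual gain on the fast pan-tilt DOF
K_COUPLE = 0.5   # kinematic coupling gain on the slow gantry DOF

def partitioned_step(img_err, pt_angles):
    """One cycle of a partitioned control law (sketch).

    The fast pan-tilt DOF servo the image-plane error directly,
    while the slow Cartesian gantry is driven by the pan-tilt
    deflection, 'unwinding' the fast joints back toward center.
    """
    # Fast visual loop: pan-tilt rates proportional to image error.
    pt_rates = -K_IMG * np.asarray(img_err)
    # Slow kinematic loop: gantry moves to reduce pan-tilt deflection,
    # coupling the translational DOF to the rotational ones.
    gantry_rates = K_COUPLE * np.asarray(pt_angles)
    return pt_rates, gantry_rates
```

The coupling term is what keeps the wide-travel gantry doing the gross motion while the agile pan-tilt unit handles the high-frequency tracking.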
Joint-coupled compensation effects in visually servoed tracking
Humans have degrees of freedom (DOF) of varying bandwidths, and one casually observes that we coordinate these DOF while visually tracking. This suggests that joint interplay aids tracking performance. In a control scheme we call partitioning, both image and kinematic data are used to visually servo a 5-DOF robot by defining a joint coupling among the rotational and translational DOF. Analysis of simulations and experiments reveals that a robot's fast-bandwidth joints physically serve as lead compensators when coupled to slower joints, thus reducing tracking lag.
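The lead-compensation effect can be seen in a toy simulation: a slow first-order joint tracks a ramp with a large steady-state lag, while a coupled fast joint that tracks the slow joint's residual error cancels most of that lag. The time constants and coupling below are illustrative assumptions, not values from the paper.

```python
import numpy as np

DT, T = 0.01, 5.0
TAU_SLOW, TAU_FAST = 1.0, 0.05   # joint time constants [s] (assumed)

t = np.arange(0.0, T, DT)
target = t.copy()                # ramp target, e.g. a moving person

slow = np.zeros_like(t)
fast = np.zeros_like(t)
for k in range(1, len(t)):
    # Slow joint: first-order lag toward the target.
    slow[k] = slow[k-1] + DT / TAU_SLOW * (target[k-1] - slow[k-1])
    # Fast joint: first-order lag toward the slow joint's residual
    # error -- the coupling that makes it act like a lead compensator.
    err = target[k-1] - slow[k-1]
    fast[k] = fast[k-1] + DT / TAU_FAST * (err - fast[k-1])

lag_alone = abs(target[-1] - slow[-1])            # slow joint by itself
lag_coupled = abs(target[-1] - (slow[-1] + fast[-1]))  # with coupling
```

For a ramp input, the slow joint alone lags by roughly its time constant, while the coupled pair ends up nearly on target.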
Modeling Hose Dynamics for Unmanned Aerial Vehicles
Bridges and other large pieces of infrastructure accumulate massive amounts of dirt, dust, and other particulates that can obscure the structure when scanning to discern structural integrity. Traditionally, these particulates have been removed by humans operating handheld compressed-air hoses, often while mounting ladders -- a risky and inefficient task. To improve infrastructure scanning, unmanned aerial vehicles (UAV) equipped with hoses could be used to clean the structure in place of the current method.
The challenge in equipping a UAV with a hose is compensating for the reaction forces and torques produced by the fluid expelled from the hose. To counteract these reaction forces and torques, the hose dynamics should be carefully modeled and incorporated into the controller architecture.
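As a first-order sanity check, the steady-state reaction force of a jet leaving a nozzle follows from momentum flux: F = ṁ·v = ρ·A·v². The function below is an illustrative sketch of that textbook relation, not the paper's model, which would also need the unsteady hose dynamics.

```python
import math

RHO_AIR = 1.2  # approximate sea-level air density [kg/m^3]

def nozzle_reaction_force(diameter_m, exit_velocity_ms, rho=RHO_AIR):
    """Steady-state jet reaction force: F = mdot * v = rho * A * v^2."""
    area = math.pi * (diameter_m / 2.0) ** 2   # nozzle cross-section
    mdot = rho * area * exit_velocity_ms       # mass flow rate [kg/s]
    return mdot * exit_velocity_ms             # force [N]

# Example: a 1 cm nozzle expelling air at 100 m/s pushes back with
# roughly 1 N -- already significant for a small UAV.
force = nozzle_reaction_force(0.01, 100.0)
```

Because the force scales with v², modest increases in exit velocity quickly dominate a small UAV's control authority, which is why the modeling matters.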
Parallel Manipulator-Gripper for Mobile Manipulating UAVs
Unmanned Aerial Vehicles (UAVs) were originally developed for military use, but over time they have taken on valuable roles in surveillance, work assistance, and intelligence gathering for both civilian and military operations.
The ability of UAVs to manipulate or carry objects expands the types of tasks achievable by unmanned aerial systems. High-degree-of-freedom robots with dexterous arms enable a wide variety of applications. Most manipulators are serial, so each joint motor disturbs the UAV's stability. Our lab, DASL, has presented a parallel-mechanism manipulator for UAVs, which has less impact on the UAV's center of gravity (CoG) and allows highly precise manipulation.
This work therefore focuses on a 6-degree-of-freedom parallel manipulator and gripper (PMG) concept for unmanned aerial vehicles that can be used for multiple purposes. Depending on the purpose, the grasper module on the manipulator's end-effector changes. The design and mechanism are proposed, and the final results are given.
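One reason 6-DOF parallel mechanisms suit UAV mounting is that their inverse kinematics are closed-form: each actuator length is just the distance between a base anchor and the transformed platform anchor. The sketch below shows this generic Stewart-type relation; the geometry is illustrative and not the PMG's actual dimensions.

```python
import numpy as np

def leg_lengths(base_pts, plat_pts, t, R):
    """Inverse kinematics of a generic 6-DOF parallel manipulator.

    base_pts, plat_pts: (6, 3) anchor coordinates in base and
    platform frames; t: platform translation; R: platform rotation.
    Returns the six actuator lengths for that pose.
    """
    moved = (R @ plat_pts.T).T + t            # platform anchors in base frame
    return np.linalg.norm(moved - base_pts, axis=1)

# Toy geometry: platform anchors directly above identical base anchors,
# raised by 1 m with no rotation -> every leg has length 1.
pts = np.array([[1.0, 0, 0], [0, 1, 0], [-1, 0, 0],
                [0, -1, 0], [0.5, 0.5, 0], [-0.5, -0.5, 0]])
lengths = leg_lengths(pts, pts, np.array([0.0, 0.0, 1.0]), np.eye(3))
```

Because all six actuators share the load and act on a common platform, pose errors do not accumulate joint-by-joint as in a serial arm, which is what yields the CoG and precision benefits noted above.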
Integration of vision and force sensors for grasping
This paper describes a set of methods that can be used to integrate real-time external vision sensing with internal force and position sensing to estimate contact forces applied by the fingers of a hand. Estimating these forces and contacts is essential to performing dexterous manipulation tasks. Most robotic hands are either sensorless or lack the ability to accurately and robustly report position and force information relating to contact. By adding external vision sensing, we can complement any internal sensors to more accurately estimate forces and contact positions. Experiments are described that use real-time visual trackers in conjunction with internal strain gauges and a new tactile sensor to accurately estimate finger contacts and applied forces for a three-fingered robotic hand.
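One simple way to combine two independent estimates of the same force, one vision-based, one from strain gauges, is inverse-variance weighting. This is a generic sensor-fusion sketch under the assumption of independent noisy estimates, not necessarily the method the paper uses.

```python
def fuse_force(f_vision, var_vision, f_strain, var_strain):
    """Inverse-variance weighted fusion of two independent force
    estimates. The lower-variance (more trusted) sensor dominates."""
    w_v = 1.0 / var_vision
    w_s = 1.0 / var_strain
    return (w_v * f_vision + w_s * f_strain) / (w_v + w_s)

# Equal trust -> plain average; unequal trust -> weighted toward
# the more reliable sensor.
balanced = fuse_force(1.0, 1.0, 2.0, 1.0)        # -> 1.5
vision_trusted = fuse_force(0.0, 0.01, 10.0, 100.0)  # near 0.0
```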
CQAR: Closed quarter aerial robot design for reconnaissance, surveillance and target acquisition tasks in urban areas
International Journal of Computational Intelligence, Volume 1, Number 4, 2004. Retrieved April 2006 from http://prism2.mem.drexel.edu/~paul/papers/ohIjci2004.pdf
This paper describes a prototype aircraft that can fly slowly and safely and transmit wireless video for tasks like reconnaissance, surveillance, and target acquisition. The aircraft is designed to fly in closed quarters like forests, buildings, caves, and tunnels, which are often spacious but where GPS reception is poor. Envisioned is that a small, safe, slow-flying vehicle can assist in performing dull, dangerous, and dirty tasks like disaster mitigation, search-and-rescue, and structural damage assessment.
Autonomous hovering of a fixed-wing micro air vehicle
Paper presented at the 2006 IEEE International Conference on Robotics and Automation, ICRA 2006, Orlando, FL.
Recently, there has been a need to acquire intelligence in hostile or dangerous environments such as caves, forests, or urban areas. Rather than risking human life, backpackable, bird-sized aircraft equipped with a wireless camera can be rapidly deployed to gather reconnaissance in such environments. However, they first must be designed to fly in tight, cluttered terrain. This paper discusses an additional flight modality for a fixed-wing aircraft, enabling it to supplement existing endurance superiority with hovering capabilities. An inertial measurement sensor and an onboard processing and control unit, used to achieve autonomous hovering, are also described. This is, to the best of our knowledge, the first documented success in autonomously hovering a fixed-wing Micro Air Vehicle.
A MAV that flies like an airplane and hovers like a helicopter
IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM), Monterey, CA, pp. 699-704, July 2005
Near-Earth environments such as forests, caves, tunnels, and urban structures make reconnaissance, surveillance, and search-and-rescue missions difficult and dangerous to accomplish. Micro Air Vehicles (MAVs), equipped with wireless cameras, can assist in such missions by providing real-time situational awareness. This paper describes an additional flight modality enabling fixed-wing MAVs to supplement existing endurance superiority with hovering capabilities. This secondary flight mode can also be used to avoid imminent collisions by quickly transitioning from cruise to hover flight. A sensor suite that allows autonomous hovering by regulating the aircraft's yaw, pitch, and roll angles is also described.
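Regulating yaw, pitch, and roll about a hover setpoint is classically done with one feedback loop per axis. The sketch below shows a generic per-axis PID regulator driving a simple plant toward zero attitude error; the gains and plant are illustrative assumptions, not the papers' implementation.

```python
class AxisPID:
    """One PID loop per attitude axis (yaw, pitch, or roll)."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, setpoint, measured):
        """Return the control effort for one sample period."""
        err = setpoint - measured
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Toy usage: regulate a first-order attitude plant from a 0.5 rad
# offset back to the hover setpoint of 0 rad.
pid = AxisPID(kp=2.0, ki=0.0, kd=0.1, dt=0.01)
angle = 0.5
for _ in range(2000):
    angle += pid.update(0.0, angle) * 0.01
```

In a real hover controller three such loops would run in parallel, fed by the IMU's estimated attitude, with outputs mapped to elevator, rudder, and aileron deflections.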
Human-in-the-loop camera control for a mechatronic broadcast boom
IEEE/ASME Transactions on Mechatronics, 12(1): pp. 41-52.
Platforms like gantries, booms, aircraft, and submersibles are often used in the broadcasting industry. To avoid collisions and occlusions, such mechatronic platforms often possess redundant degrees of freedom (DOFs). As a result, manual manipulation of such platforms demands much skill. This paper describes the implementation of several controllers that, by using computer vision, attempt to reduce the number of manually manipulated DOFs. Experiments were performed to assess the performance of each controller. A model for such a system was developed and validated. To determine how visual servoing can improve tracking, a novice operator and an expert were asked to manually track a moving target with the assistance of visual servoing. The results of these tests were analyzed and compared.
Flying insect inspired vision for autonomous aerial robot maneuvers in near-earth environments
Proceedings of the 2004 IEEE International Conference on Robotics & Automation. Retrieved April 2006 from http://prism2.mem.drexel.edu/~paul/papers/greenOhBarrowsIcra2004.pdf
Near-Earth environments are time consuming, labor intensive, and possibly dangerous to safeguard. Accomplishing tasks like bomb detection, search-and-rescue, and reconnaissance with aerial robots could save resources. This paper describes the adoption of insect behavior and flight patterns to develop a MAV sensor suite. A prototype called CQAR: Closed Quarter Aerial Robot, which is capable of flying in and around buildings, through tunnels, and in and out of caves, will be used to validate the efficiency of such a method when equipped with optic flow microsensors.
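A well-known insect-inspired strategy with optic flow sensors is the centering response: steer away from the side reporting larger flow, since nearer surfaces produce faster image motion. The function below is a minimal sketch of that behavior; the sign convention and gain are assumptions for illustration.

```python
def centering_command(flow_left, flow_right, gain=1.0):
    """Insect-inspired centering response.

    Returns a normalized yaw command: positive means turn right
    (away from a nearer left wall), negative means turn left.
    """
    denom = abs(flow_left) + abs(flow_right) or 1.0  # avoid divide-by-zero
    return gain * (flow_left - flow_right) / denom

# Left wall closer -> larger left-side flow -> turn right (positive).
cmd = centering_command(2.0, 1.0)
```

Balancing the two flow magnitudes in this way lets a vehicle thread corridors and tunnels without range sensors or GPS, which is exactly the near-Earth regime the prototype targets.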