Exact N=2 Supergravity Solutions With Polarized Branes
We construct several classes of exact supersymmetric supergravity solutions
describing D4 branes polarized into NS5 branes and F-strings polarized into D2
branes. These setups belong to the same universality class as the perturbative
solutions used by Polchinski and Strassler to describe the string dual of N=1*
theories. The D4-NS5 setup can be interpreted as a string dual to a confining
4+1 dimensional theory with 8 supercharges, whose properties we discuss. By
T-duality, our solutions give Type IIB supersymmetric backgrounds with
polarized branes.
Comment: 22 pages. v2: references added, details clarified
The group structure of non-Abelian NS-NS transformations
We study the transformations of the worldvolume fields of a system of
multiple coinciding D-branes under gauge transformations of the supergravity
Kalb-Ramond field. We find that the pure gauge part of these NS-NS
transformations can be written as a U(N) symmetry of the underlying Yang-Mills
group, but that in general the full NS-NS variations get mixed up non-trivially
with the U(N). We compute the commutation relations and the Jacobi identities
of the larger group formed by the NS-NS and U(N) transformations.
Comment: LaTeX, 11 pages. v2: typos corrected; version to appear in JHEP
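For orientation, the Jacobi identities mentioned above are the standard consistency condition that any set of generators T_a of the enlarged algebra must satisfy,

    [[T_a, T_b], T_c] + [[T_b, T_c], T_a] + [[T_c, T_a], T_b] = 0.

This generic form is quoted here only for reference; the specific structure constants of the combined NS-NS and U(N) transformations are the ones computed in the paper.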
On The Interaction Of D0-Brane Bound States And RR Photons
We consider the interaction between a D0-brane bound state and 1-form RR
photons in the world-line theory. Since the RR gauge fields in the world-line
theory depend on the matrix coordinates of the D0-branes, the gauge fields
themselves appear as matrices in the formulation. At the
classical level, we derive the Lorentz-like equations of motion for D0-branes,
and it is observed that the center-of-mass is colourless with respect to the
SU(N) sector of the background. Using the path integral method, the
perturbation theory for the interaction between the bound state and the RR
background is developed. We discuss what kind of field theory might correspond
to the amplitudes calculated in the perturbation expansion of the world-line
theory. Qualitative considerations indicate that a map between the world-line
theory and a non-Abelian gauge theory is very likely to exist.
Comment: LaTeX, 28 pages, 4 eps figures. v2 and v3: eqs. (3.18) and (B.2)
corrected, very small changes
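For comparison, the "Lorentz-like" structure mentioned above generalizes the familiar Abelian Lorentz force law for a charged point particle,

    m d^2 x^i / dt^2 = q ( E^i + ε^{ijk} (dx_j/dt) B_k ),

with the matrix-valued D0-brane coordinates and the matrix RR gauge fields playing the roles of x^i and (E, B). The Abelian form is shown here only as a point of reference, not as the paper's equation.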
Interaction between M2-branes and Bulk Form Fields
We construct the interaction terms between the world-volume fields of
multiple M2-branes and the 3- and 6-form fields in the context of ABJM theory
with U(N) × U(N) gauge symmetry. A consistency check is made in the
simplest case of a single M2-brane, i.e., our construction matches the known
effective action of a single M2-brane coupled to the antisymmetric 3-form
field. We show
that when dimensionally reduced, our couplings coincide with the effective
action of D2-branes coupled to R-R 3- and 5-form fields in type IIA string
theory. We also comment on the relation between a coupling with a specific
6-form field configuration and the supersymmetry preserving mass deformation in
ABJM theory.
Comment: 30 pages; version to appear in JHEP
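For reference, the known single-M2-brane coupling against which the check above is made is the standard minimal coupling of the membrane to the 3-form potential,

    S_C = T_{M2} ∫_{Σ_3} P[C_3],

where Σ_3 is the M2-brane world volume, P[...] denotes the pullback to the world volume, and T_{M2} is the membrane tension. This schematic single-brane expression is quoted for orientation only; the multiple-membrane couplings constructed in the paper are more involved.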
Cognitive vision system for control of dexterous prosthetic hands: Experimental evaluation
Background: Dexterous prosthetic hands that were developed recently, such as SmartHand and i-LIMB, are highly sophisticated; they have individually controllable fingers and a thumb that can abduct/adduct. This flexibility allows the implementation of many different grasping strategies, but it also requires new control algorithms that can exploit the many degrees of freedom available. The current study presents and tests the operation of a new control method for dexterous prosthetic hands.

Methods: The central component of the proposed method is an autonomous controller comprising a vision system with rule-based reasoning mounted on a dexterous hand (CyberHand). The controller, termed the cognitive vision system (CVS), mimics biological control and generates commands for prehension. The CVS was integrated into a hierarchical control structure: 1) the user triggers the system and controls the orientation of the hand; 2) a high-level controller automatically selects the grasp type and size; and 3) an embedded hand controller implements the selected grasp using closed-loop position/force control. The operation of the control system was tested in 13 healthy subjects who used the CyberHand, attached to the forearm, to grasp and transport 18 objects placed at two different distances.

Results: The system correctly estimated grasp type and size (nine commands in total) in about 84% of the trials. In an additional 6% of the trials, the grasp type and/or size differed from the optimal ones but were still good enough for the grasp to succeed. If the control task was simplified by decreasing the number of possible commands, the classification accuracy increased (e.g., 93% for guessing the grasp type only).

Conclusions: The original outcome of this research is a novel controller empowered by vision and reasoning and capable of high-level analysis (i.e., determining object properties) and autonomous decision making (i.e., selecting the grasp type and size). The automatic control eases the burden on the user and, as a result, the user can concentrate on what he/she does, not on how he/she should do it. The tests showed that the performance of the controller was satisfactory and that the users were able to operate the system with minimal prior training.
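The three-level hierarchy described in the Methods (user trigger and hand orientation, high-level grasp selection, embedded closed-loop grasp execution) can be summarized with a short sketch. The Python below is purely illustrative; the names GraspPlan, classify_grasp, and HandController, and the rule thresholds, are hypothetical and are not taken from the CVS or CyberHand software.

    from dataclasses import dataclass

    @dataclass
    class GraspPlan:
        grasp_type: str     # e.g. "palmar", "lateral", "pinch"
        aperture_mm: float  # estimated grasp size

    def classify_grasp(object_width_mm: float, object_height_mm: float) -> GraspPlan:
        """Level 2: rule-based mapping from estimated object properties
        (supplied by the vision system) to one of the predefined grasps.
        The thresholds here are placeholders, not the published rules."""
        if object_width_mm < 30:
            return GraspPlan("pinch", aperture_mm=object_width_mm + 10)
        if object_height_mm > object_width_mm:
            return GraspPlan("palmar", aperture_mm=object_width_mm + 20)
        return GraspPlan("lateral", aperture_mm=object_width_mm + 15)

    class HandController:
        """Level 3: embedded hand controller that executes the selected grasp
        with closed-loop position/force control (represented here by a stub)."""
        def execute(self, plan: GraspPlan) -> bool:
            print(f"Preshaping for {plan.grasp_type} grasp, "
                  f"aperture {plan.aperture_mm:.0f} mm")
            return True  # report success once the grasp is stable

    def on_user_trigger(object_width_mm: float, object_height_mm: float) -> bool:
        """Level 1: the user triggers the system and orients the hand;
        the autonomous layers then select and execute the grasp."""
        plan = classify_grasp(object_width_mm, object_height_mm)  # level 2
        return HandController().execute(plan)                     # level 3

    if __name__ == "__main__":
        on_user_trigger(object_width_mm=65.0, object_height_mm=120.0)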
Robots for Humanity: Using Assistive Robots to Empower People with Disabilities
Assistive mobile manipulators have the potential
to one day serve as surrogates and helpers for people with
disabilities, giving them the freedom to perform tasks such as
scratching an itch, picking up a cup, or socializing with their
families. This article introduces a collaborative project with the
goal of putting assistive mobile manipulators into real homes
to work with people with disabilities. Through a participatory
design process in which users have been actively involved from
day one, we are identifying and developing assistive capabilities
for the PR2 robot. Our approach is to develop a diverse suite
of open source software tools that blend the capabilities of the
user and the robot. Within this article, we introduce the project,
describe our progress, and discuss lessons we have learned.
Keywords: Medical robotics, Biomedical equipment, Robots, Control systems, Handicapped aids, Software development, Exoskeletons, Sensory aids, Human factors, Prosthetics
ROS Commander (ROSCo): Behavior Creation for Home Robots
Presented at the 2013 IEEE International Conference on Robotics and Automation (ICRA), Karlsruhe, Germany, May 6-10, 2013. DOI: 10.1109/ICRA.2013.6630616

We introduce ROS Commander (ROSCo), an open source system that enables expert users to construct, share, and deploy robot behaviors for home robots. A user builds a behavior in the form of a Hierarchical Finite State Machine (HFSM) out of generic, parameterized building blocks, with a real robot in the development and test loop. Once constructed, users save behaviors in an open format for direct use with robots, or for use as parts of new behaviors. When the system is deployed, a user can show the robot where to apply behaviors relative to fiducial markers (AR Tags), which allows the robot to quickly become operational in a new environment. We show evidence that the underlying state machine representation and current building blocks are capable of spanning a variety of desirable behaviors for home robots, such as opening a refrigerator door with two arms, flipping a light switch, unlocking a door, and handing an object to someone. Our experiments show that sensor-driven behaviors constructed with ROSCo can be executed in realistic home environments with success rates between 80% and 100%. We conclude by describing a test in the home of a person with quadriplegia, in which the person was able to automate parts of his home using previously built behaviors.
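The behavior representation described above, a Hierarchical Finite State Machine assembled from parameterized building blocks, can be illustrated with a minimal sketch. The following Python is a generic HFSM skeleton written for this summary; it does not use the actual ROSCo or ROS APIs, and the state names and parameters are hypothetical.

    class State:
        """A generic, parameterized building block: run() returns an outcome
        label that the parent machine uses to pick the next state."""
        def __init__(self, name, **params):
            self.name, self.params = name, params
        def run(self):
            print(f"executing {self.name} with {self.params}")
            return "succeeded"

    class StateMachine(State):
        """A state machine is itself a State, which is what makes the
        representation hierarchical: machines can nest inside machines."""
        def __init__(self, name, transitions, initial):
            super().__init__(name)
            self.transitions = transitions  # {state: {outcome: next_state}}
            self.initial = initial
        def run(self):
            current = self.initial
            while current is not None:
                outcome = current.run()
                current = self.transitions.get(current, {}).get(outcome)
            return "succeeded"

    # Hypothetical "flip a light switch" behavior anchored to an AR tag.
    approach = State("approach_tag", tag_id=4, standoff_m=0.5)
    reach    = State("reach_to_switch", offset_m=(0.0, 0.1, 0.2))
    flip     = State("flip_switch", force_n=5.0)

    behavior = StateMachine(
        "flip_light_switch",
        transitions={approach: {"succeeded": reach},
                     reach:    {"succeeded": flip},
                     flip:     {"succeeded": None}},  # None terminates
        initial=approach,
    )
    behavior.run()

In the deployed system, a state's parameters would be filled in relative to the detected fiducial marker so that a saved behavior can be reused in a new environment.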
