3 research outputs found

    Visual and Kinematic Coordinated Control of Mobile Manipulating Unmanned Aerial Vehicles

    Manipulating objects using arms mounted to unmanned aerial vehicles (UAVs) is attractive because UAVs may access many locations that are otherwise inaccessible to traditional mobile manipulation platforms such as ground vehicles. Historically, UAVs have been employed in ways that avoid interaction with the environment at all costs. The recent trend of increasing small-UAV lift capacity and the decreasing weight of manipulator components make the realization of mobile manipulating UAVs imminent. Despite recent work, several major challenges remain to be overcome before it is common practice to manipulate objects from UAVs. Among these challenges, the constantly moving UAV platform and the compliance of manipulator arms make it difficult to position the UAV and end-effector relative to an object of interest precisely enough for reliable manipulation. Solving this challenge will bring UAVs one step closer to performing meaningful tasks such as infrastructure repair, disaster response, law enforcement, and personal assistance. Toward a solution, this thesis describes a way forward that uses the UAV to coarsely position a manipulator within reach of the end-effector's goal position in the world. The manipulator then performs the fine positioning of the end-effector, rejecting position perturbations caused by UAV motions. An algorithm is described that coordinates the redundant degrees of freedom of an aerial manipulation system, allowing the motions of the manipulator to serve as inputs to the UAV's position controller. To demonstrate this algorithm, the manipulator's six degrees of freedom are servoed using visual sensing to drive an eye-in-hand camera to a specified pose relative to a target, while motions of the host platform are treated as perturbations. Simultaneously, the host platform's degrees of freedom are regulated using kinematic information from the manipulator. This ultimately drives the UAV to a position that allows the manipulator to assume a pose, relative to the UAV, that maximizes reachability, thus facilitating the arm's ability to compensate for undesired UAV motions. Maintaining this loose kinematic coupling between the redundant degrees of freedom of the host UAV and the manipulator allows the controller to be applied to a wide variety of platforms, including manned aircraft, rather than a single instance of a purpose-built system. As a result of this loose coupling, careful consideration must be given to the manipulator design so that it can achieve useful poses while minimally influencing the stability of the host UAV. Accordingly, the novel application of a parallel manipulator mechanism is described.

    Ph.D., Mechanical Engineering -- Drexel University, 201
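    As a rough illustration of the division of labor this abstract describes (arm does fine visual servoing, UAV follows the arm back toward a high-reachability pose), the Python sketch below shows one control cycle. All names, shapes, and gains here are illustrative assumptions, not the thesis's actual implementation.

        import numpy as np

        def coordinated_step(pose_err, q, J_arm, q_neutral, dt,
                             k_arm=1.0, k_uav=0.5):
            """One cycle of the loose kinematic coupling (hypothetical sketch).

            pose_err  : 6-vector error of the eye-in-hand camera to the target pose
            q         : current manipulator joint vector (N,)
            J_arm     : 6xN manipulator Jacobian at q
            q_neutral : joint pose that maximizes reachability
            """
            # Fine positioning: resolved-rate visual servoing drives the camera
            # toward the target, treating UAV motion as a perturbation to reject.
            dq = k_arm * (np.linalg.pinv(J_arm) @ pose_err)
            q_next = q + dq * dt

            # Coarse positioning: command the UAV so the arm drifts back toward
            # its neutral pose; manipulator kinematics are the only input the
            # UAV position loop sees, which is what keeps the coupling loose.
            uav_vel_cmd = -k_uav * (J_arm @ (q_next - q_neutral))
            return q_next, uav_vel_cmd

    Because the UAV loop consumes only joint offsets and a Jacobian, the same structure could in principle wrap any host platform whose velocity can be commanded, which is consistent with the abstract's portability claim.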

    A Human-Embodied Drone for Dexterous Aerial Manipulation

    Current drones perform a wide variety of tasks in surveillance, photography, agriculture, package delivery, etc. However, these tasks are performed passively, without human interaction. Aerial manipulation shifts this paradigm by equipping drones with robotic arms that allow them to interact with the environment rather than simply sense it. For example, in construction, aerial manipulation combined with human interaction could allow operators to perform tasks such as hosing decks, drilling into surfaces, and sealing cracks via a drone. This integration with drones will henceforth be known as dexterous aerial manipulation. Our recent work integrated the worker's experience into aerial manipulation using haptic technology. The net effect was that such a system could enable the worker to leverage drones and complete tasks remotely while using haptics at the task site. However, the tasks were completed within the operator's line of sight. Until now, immersive AR/VR frameworks have rarely been integrated into aerial manipulation. Yet such a framework allows the drone to embody and transport the operator's senses, actions, and presence to a remote location in real time. As a result, the operator can both physically interact with the environment and socially interact with actual workers on the worksite. This dissertation presents a human-embodied drone interface for dexterous aerial manipulation. Using VR/AR technology, the interface allows the operator to leverage their intelligence to collaboratively perform desired tasks anytime, anywhere with a drone that possesses great dexterity.
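    One core piece of any such interface is mapping the operator's tracked hand motion onto end-effector setpoints the aerial manipulator can safely follow. The sketch below is a minimal, hypothetical version of that mapping (the scaling factor, workspace clamp, and all names are assumptions, not the dissertation's design).

        import numpy as np

        def vr_to_setpoint(hand_pos, hand_ref, ee_ref, scale=0.5, radius=0.4):
            """Map a VR hand-controller position to an end-effector setpoint."""
            ee_ref = np.asarray(ee_ref, dtype=float)
            # Scale operator hand motion down into the manipulator workspace.
            offset = scale * (np.asarray(hand_pos, dtype=float)
                              - np.asarray(hand_ref, dtype=float))
            # Clamp the command to a sphere around the reference pose so the
            # operator cannot drag the arm outside its reachable set.
            r = np.linalg.norm(offset)
            if r > radius:
                offset *= radius / r
            return ee_ref + offset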

    Commande référencée vision pour drones à décollages et atterrissages verticaux (Vision-based control for vertical take-off and landing UAVs)

    Get PDF
    The miniaturization of computers has paved the way for unmanned aerial vehicles (UAVs): flying vehicles embedding computers that make them partially or fully automated for missions such as exploring cluttered environments or replacing human-piloted vehicles in hazardous or tedious missions. A key challenge in designing such vehicles is the information they need in order to move, and thus the sensors to be used to obtain that information. Many such sensors have drawbacks (e.g., the risk of being jammed or occluded). In this context, the use of a video camera offers interesting prospects. The goal of this PhD work was to study the use of such a camera in a minimal-sensor setting: essentially, the use of visual and inertial data. The work focused on the development of control laws that give the closed-loop system stability and robustness properties. In particular, one of the major difficulties came from the very limited knowledge of the environment in which the UAV operates. First, the stabilization problem was studied under a small-displacements assumption (linearity assumption), and a control law taking performance criteria into account was defined. Second, it was shown how the small-displacements assumption could be relaxed through nonlinear control design. The case of trajectory following was then considered, based on a generic formulation of the position-error measurement with respect to an unknown reference point. Finally, experimental validation of this work was started during the thesis and helped validate many of the steps and challenges associated with real-world experiments. The work concludes with perspectives for future research.

    TOULOUSE-ISAE (315552318) / Sudoc
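    To make the small-displacements step concrete, here is a minimal sketch of linear state feedback on discretized hover dynamics for one horizontal axis, with the position error supplied by vision relative to an unknown reference point and velocity from inertial data. The matrices and gains are illustrative assumptions, not the thesis's actual control law.

        import numpy as np

        dt = 0.02                               # control period, seconds (assumed)
        A = np.array([[1.0, dt], [0.0, 1.0]])   # state: [position offset, velocity]
        B = np.array([[0.0], [dt]])             # thrust-tilt input enters the velocity
        K = np.array([[2.0, 1.5]])              # hand-tuned stabilizing gains

        # Sanity check: all closed-loop eigenvalues lie inside the unit circle.
        assert np.all(np.abs(np.linalg.eigvals(A - B @ K)) < 1.0)

        def stabilize(x_vis, v_inertial):
            """x_vis: visually measured offset to the reference point;
            v_inertial: velocity estimate from inertial data."""
            x = np.array([[x_vis], [v_inertial]])
            u = -(K @ x)                        # linear state feedback
            return float(u[0, 0])

    The linearity assumption is what lets a single constant gain matrix stabilize the loop; the nonlinear designs mentioned above are precisely what remove this restriction for large displacements.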