4 research outputs found

    Vision-based Feature matching as a tool for Robotic Localization

    The algorithms used for robotic localization usually rely on computing the distance travelled from speed and travel time (odometry). Other categories of localization include template matching and geometric methods such as trilateration and triangulation. These methods, while reliable, involve substantial setup and infrastructure costs. The proposed solution provides a more robust approach to template matching as well as an easier setup.
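    As a rough illustration of the vision-based feature-matching idea behind this work, the sketch below compares a stored reference image against a current camera frame using ORB features and a ratio test; the file names and the scoring heuristic are illustrative assumptions, not the paper's actual pipeline.

```python
# Minimal sketch: match a live frame against a stored reference view with ORB
# features (hypothetical file names; not the paper's method).
import cv2

reference = cv2.imread("reference_location.png", cv2.IMREAD_GRAYSCALE)
current = cv2.imread("current_frame.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=1000)
kp_ref, des_ref = orb.detectAndCompute(reference, None)
kp_cur, des_cur = orb.detectAndCompute(current, None)

# Brute-force Hamming matcher with Lowe's ratio test to discard ambiguous matches.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
raw_matches = matcher.knnMatch(des_ref, des_cur, k=2)
good = [m for m, n in raw_matches if m.distance < 0.75 * n.distance]

# A high fraction of retained matches suggests the robot is near the stored location.
score = len(good) / max(len(kp_ref), 1)
print(f"match score: {score:.2f}")
```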

    From Optimal Synthesis to Optimal Visual Servoing for Autonomous Vehicles

    This thesis focuses on the characterization of optimal (shortest) paths to a desired position for a robot with unicycle kinematics and an on-board camera with limited Field-Of-View (FOV), which must keep a given feature in sight. In particular, I provide a complete optimal synthesis for the problem, i.e., a language of optimal control words, and a global partition of the motion plane induced by shortest paths, such that a word in the optimal language is uniquely associated with a region and completely describes the shortest path from any starting point in that region to the goal point. Moreover, I provide a generalization to the case of arbitrary FOVs, including the cases in which the direction of motion is not an axis of symmetry of the FOV, or is not even contained in the FOV. Finally, based on the available shortest path synthesis, feedback control laws are defined for any point on the motion plane, exploiting geometric properties of the synthesis itself. Moreover, by using a slightly generalized stability analysis setting, namely stability on a manifold, a proof of stability is given for the controlled system. Lastly, simulation results are reported to demonstrate the effectiveness of the proposed technique.
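    To make the setting concrete, the sketch below simulates unicycle kinematics and checks that a fixed feature at the origin stays inside a symmetric FOV cone; the constant control inputs and the 30-degree half-angle are placeholders, not the optimal synthesis derived in the thesis.

```python
# Minimal sketch of unicycle kinematics with a field-of-view check on a fixed
# feature at the origin; the control inputs here are placeholders.
import numpy as np

def step(state, v, omega, dt=0.05):
    """Integrate unicycle kinematics (x, y, theta) with forward Euler."""
    x, y, theta = state
    return np.array([x + v * np.cos(theta) * dt,
                     y + v * np.sin(theta) * dt,
                     theta + omega * dt])

def feature_in_fov(state, half_fov=np.deg2rad(30)):
    """True if the feature at the origin lies within the symmetric FOV cone."""
    x, y, theta = state
    bearing = np.arctan2(-y, -x) - theta          # feature angle in the body frame
    bearing = (bearing + np.pi) % (2 * np.pi) - np.pi
    return abs(bearing) <= half_fov

state = np.array([2.0, 1.0, np.pi])
for _ in range(200):
    state = step(state, v=0.3, omega=-0.1)        # placeholder constant inputs
    if not feature_in_fov(state):
        print("feature left the FOV")
        break
```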

    Images Interpolation for Image-based Control under Large Displacement

    The principal deficiency of image-based visual servoing is that the induced (3D) trajectories are not optimal and sometimes, especially when the displacement to realize is large, these trajectories are not physically valid, leading to the failure of the servoing process. Furthermore, visual control needs a matching step between the features extracted from the initial image and those of the desired one. This step can be problematic (or impossible) when the camera displacement between the acquisitions of the initial and desired images is large and/or the scene is complex. To overcome these deficiencies, we couple an image interpolation process between N relay images extracted from a database with an image-based trajectory tracking scheme. The camera calibration and the model of the observed scene are not assumed to be known. The relay images are interpolated in such a way that the corresponding camera trajectory is minimal. First, a closed-form collineation path is obtained, then the analytical form of the image feature trajectories is derived, and these trajectories are efficiently tracked using a purely image-based control. Experimental results obtained on a six-DOF eye-in-hand robotic system are presented and confirm the validity of the proposed approach.
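    For intuition on an interpolated collineation path, the sketch below generates relay homographies between the identity (initial view) and a desired homography by interpolating in the matrix-logarithm sense; the matrix H_goal, the determinant normalization, and the geodesic-style interpolation are illustrative assumptions, not the closed form derived in the paper.

```python
# Minimal sketch of a collineation (homography) path between the initial and
# desired views, interpolated via matrix log/exp; H_goal is arbitrary.
import numpy as np
from scipy.linalg import expm, logm

H_goal = np.array([[1.05, 0.02, 15.0],
                   [-0.01, 0.98, -8.0],
                   [1e-4, 2e-4, 1.0]])
H_goal /= np.cbrt(np.linalg.det(H_goal))   # normalize so that det(H) = 1

log_H = logm(H_goal)

def relay_homography(s):
    """Homography at path parameter s in [0, 1]; s=0 gives identity, s=1 gives H_goal."""
    return np.real(expm(s * log_H))

# Sample a few relay collineations along the path, from which feature
# trajectories could be generated and tracked.
relays = [relay_homography(s) for s in np.linspace(0.0, 1.0, 5)]
for H in relays:
    print(np.round(H, 3))
```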

    Commande référencée vision pour drones à décollages et atterrissages verticaux (Vision-based control for vertical take-off and landing UAVs)

    The miniaturization of computers has paved the way for unmanned aerial vehicles (UAVs): flying vehicles embedding computers that make them partially or fully autonomous, for missions such as exploring hard-to-reach or cluttered environments or replacing humanly piloted vehicles in hazardous or tedious missions. A key challenge in the design of such vehicles is the information they need in order to move, and thus the sensors to be used to obtain that information. Many of these sensors have drawbacks (in particular the risk of jamming or occlusion). In this context, the use of a video camera offers an interesting prospect. The goal of this PhD work was to study the use of such a camera in a minimal-sensor setting: essentially the use of visual and inertial data. The work focused on the development of control laws giving the closed-loop system stability and robustness properties. In particular, one of the major difficulties addressed comes from the very limited knowledge of the environment in which the UAV evolves. The thesis first studied the stabilization of the UAV under a small-displacement assumption (linearity assumption); a control law taking performance criteria into account was defined. Second, we showed how the small-displacement assumption could be relaxed through nonlinear control design. The case of trajectory tracking was then considered, relying on a generic framework for measuring the position error with respect to an unknown reference point. Finally, the experimental validation of these results was begun during the thesis and made it possible to validate many of the steps and challenges associated with their implementation in real conditions. The thesis concludes with prospects for continuing this work.
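    As a toy illustration of the small-displacement (linearized) stabilization step, the sketch below runs a PD law on a double-integrator lateral axis driven by a vision-like position error; the model, gains, and error definition are illustrative assumptions, not the control laws developed in the thesis.

```python
# Minimal sketch: PD stabilization of one linearized lateral axis using a
# vision-like position error to an assumed fixed reference (illustrative only).
dt, kp, kd = 0.02, 2.0, 1.5
pos, vel = 0.5, 0.0            # initial lateral offset from the visual reference (m)

for _ in range(500):
    error = -pos               # position error to the (assumed fixed) reference point
    accel = kp * error - kd * vel
    vel += accel * dt
    pos += vel * dt

print(f"final offset: {pos:.4f} m")   # converges close to zero for these gains
```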