
    Visually Guided Control of Movement

    The papers given at an intensive, three-week workshop on visually guided control of movement are presented. The participants were researchers from academia, industry, and government, with backgrounds in visual perception, control theory, and rotorcraft operations. The papers include invited lectures and preliminary reports of research initiated during the workshop. Three major topics are addressed: extraction of environmental structure from motion; perception and control of self-motion; and spatial orientation. Each topic is considered from both theoretical and applied perspectives, and implications for control and display design are suggested.

    Visuomotor control, eye movements, and steering : A unified approach for incorporating feedback, feedforward, and internal models

    The authors present an approach to the coordination of eye movements and locomotion in naturalistic steering tasks. It is based on recent empirical research, in particular on driver eye movements, that poses challenges for existing accounts of how we visually steer a course. They first analyze how the ideas of feedback, feedforward, and internal models are treated in control-theoretic steering models within vision science and engineering, which share an underlying architecture but have historically developed in very separate ways. The authors then show how these traditions can be naturally (re)integrated with each other and with contemporary neuroscience to better understand the skill and gaze strategies involved. They then propose a conceptual model that (a) gives a unified account of the coordination of gaze and steering control, (b) incorporates higher-level path planning, and (c) draws on the literature on paired forward and inverse models in predictive control. Although each of these elements (a–c) has been considered before (also in the context of driving), integrating them into a single framework, together with the authors' multiple-waypoint identification hypothesis within that framework, is novel. The proposed hypothesis is relevant to all forms of visually guided locomotion. Peer reviewed.
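
    The feedback/feedforward distinction the authors discuss can be illustrated with a minimal steering-controller sketch. This is a generic textbook-style example, not the authors' model: the gains, the error terms, and the simple additive structure are all illustrative assumptions.

```python
# Minimal sketch of feedforward + feedback steering (illustrative only).
# The feedforward term anticipates the planned path's curvature; the
# feedback term corrects visually sensed heading and lateral errors
# relative to the current waypoint.

def steering_command(planned_curvature, heading_error, lateral_error,
                     k_heading=1.0, k_lateral=0.5):
    feedforward = planned_curvature  # open-loop, from the path plan
    feedback = k_heading * heading_error + k_lateral * lateral_error
    return feedforward + feedback

# Example: gentle left curve (curvature 0.05) with small tracking errors.
cmd = steering_command(0.05, heading_error=0.02, lateral_error=-0.1)
```

    An internal (forward) model in the predictive-control sense would additionally simulate the vehicle's response to the command to anticipate future error, which is what the paired forward/inverse-model literature the authors draw on formalizes.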

    Precision Landing of a Quadrotor UAV on a Moving Target Using Low-Cost Sensors

    With the use of unmanned aerial vehicles (UAVs) becoming more widespread, a need for precise autonomous landings has arisen. In the maritime setting, precise autonomous landings provide a safe way to recover UAVs deployed from a ship. On land, numerous applications have been proposed for UAV and unmanned ground vehicle (UGV) teams in which autonomous docking is required so that the UGVs can recover or service a UAV in the field. Current state-of-the-art approaches rely on expensive inertial measurement sensors and RTK or differential GPS systems, a solution that is impractical for many UAV systems. This thesis proposes a framework for performing precision landings on a moving target using low-cost sensors. Vision from a downward-facing camera is used to track a target on the landing platform and generate high-quality relative pose estimates. The landing procedure consists of three stages. First, a rendezvous stage commands the quadrotor on a path to intercept the target. A target acquisition stage then ensures that the quadrotor is tracking the landing target. Finally, visual measurements of the relative pose to the landing target are used in the target tracking stage, where control and estimation are performed in a body-planar frame without GPS or magnetometer measurements. A comprehensive overview of the control and estimation required to realize the three-stage landing approach is presented. Critical parts of the landing framework were implemented on an AscTec Pelican testbed, with the AprilTag visual fiducial system chosen as the landing target. Implementation details that improve the AprilTag detection pipeline are presented. Simulated and experimental results validate key portions of the landing framework: the novel relative estimation scheme is evaluated in an indoor positioning system, tracking and landing on a moving target is demonstrated in an indoor environment, and outdoor tests validate the target tracking performance in the presence of wind.
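
    The three-stage procedure described above can be sketched as a simple state machine. The stage names follow the abstract; the transition conditions, signals, and thresholds are invented for illustration and are not the thesis implementation.

```python
# Hypothetical state machine for the rendezvous -> acquisition -> tracking
# landing sequence. The thresholds (2.0 m, 0.1 m) are arbitrary examples.

def next_stage(stage, target_visible, distance_to_target, altitude):
    if stage == "rendezvous":
        # Fly an intercept path until the camera detects the landing target.
        return "acquisition" if target_visible else "rendezvous"
    if stage == "acquisition":
        # Confirm stable visual tracking before committing to the descent.
        if target_visible and distance_to_target < 2.0:
            return "tracking"
        return "acquisition"
    if stage == "tracking":
        # Vision-only relative-pose control (no GPS/magnetometer) to touchdown.
        return "landed" if altitude < 0.1 else "tracking"
    return stage
```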

    Multimodal Sensory Integration for Perception and Action in High Functioning Children with Autism Spectrum Disorder

    Movement disorders are among the earliest observed features of autism spectrum disorder (ASD), present in infancy, yet we do not understand the neural basis for impaired goal-directed movements in this population. To reach for an object, it is necessary to perceive the state of the arm and the object using multiple sensory modalities (e.g., vision, proprioception), to integrate those sensations into a motor plan, to execute the plan, and to update the plan based on the sensory consequences of action. In this dissertation, I present three studies in which I recorded hand paths of children with ASD and typically developing (TD) controls as they grasped the handle of a robotic device to control a cursor displayed on a video screen. First, participants performed discrete and continuous movements to capture targets. Cursor feedback was perturbed from the hand's actual position to introduce visuo-spatial conflict between visual and proprioceptive feedback. Relative to controls, children with ASD made greater errors, consistent with deficits of sensorimotor adaptive and strategic compensations. Second, participants performed a two-interval forced-choice discrimination task in which they perceived two movements of the visual cursor and/or the robot handle and then indicated which of the two movements was more curved. Children with ASD were impaired in their ability to discriminate movement kinematics when provided visual and proprioceptive information simultaneously, suggesting deficits of visuo-proprioceptive integration. Finally, participants made goal-directed reaching movements against a load while undergoing simultaneous functional magnetic resonance imaging (fMRI). The load remained constant (predictable) within an initial block of trials and then varied randomly within four additional blocks. Children with ASD exhibited greater movement variability than controls under both constant and randomly varying loads. fMRI analysis identified marked differences in the extent and intensity of the neural activity supporting goal-directed reaching in children with ASD compared to TD children in both environmental conditions. Taken together, the three studies reveal deficits of multimodal sensory integration in children with ASD during perception and execution of goal-directed movements, and show that ASD-related motor performance deficits have a telltale neural signature, as revealed by functional MR imaging.
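
    The cursor perturbation in the first study is an instance of the standard visuomotor-rotation manipulation: the on-screen cursor is rotated about the start position relative to the true hand position, creating the conflict between vision and proprioception described above. The sketch below is illustrative only; the 30-degree rotation is an arbitrary example, not the study's actual parameter.

```python
import math

def perturbed_cursor(hand_x, hand_y, angle_deg=30.0):
    """Rotate the displayed cursor away from the true hand position."""
    a = math.radians(angle_deg)
    return (hand_x * math.cos(a) - hand_y * math.sin(a),
            hand_x * math.sin(a) + hand_y * math.cos(a))

# A hand movement straight toward a target appears rotated on screen,
# so the participant must adapt the motor plan to re-capture the target.
```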

    Sensorless Haptic Force Feedback for Telemanipulation using two identical Delta Robots

    Bilateral teleoperation allows users to interact with objects in remote environments by providing the operator with haptic feedback. In this thesis, two control schemes were implemented to guarantee stability and transparency: a position-position control scheme with gravity and passivity compensation, and a bilateral force-sensorless acceleration control implemented with Kalman filters and disturbance observers. Both methods were tested using two identical Delta robots.
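
    A position-position bilateral scheme can be sketched in a few lines: each robot is servoed toward the other's position, so resistance encountered by the remote robot is reflected back to the operator as force feedback. This is a generic single-axis illustration with made-up gains, not the thesis controller (which additionally includes passivity compensation, Kalman filtering, and disturbance observers).

```python
# Single-joint position-position bilateral coupling (illustrative gains).

def bilateral_forces(q_master, q_slave, kp=50.0, kd=0.0,
                     dq_master=0.0, dq_slave=0.0):
    err = q_master - q_slave
    derr = dq_master - dq_slave
    f_slave = kp * err + kd * derr      # drives the slave toward the master
    f_master = -(kp * err + kd * derr)  # reflects resistance to the operator
    return f_master, f_slave
```

    Gravity compensation, mentioned in the abstract, would add each robot's own gravity torque to its command so that the coupling gains act only on the operator/environment interaction rather than on supporting the mechanism's weight.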

    Aerial Vehicles

    This book contains 35 chapters written by experts in developing techniques for making aerial vehicles more intelligent, more reliable, more flexible in use, and safer in operation. It will also serve as an inspiration for further improvement of the design and application of aerial vehicles. The advanced techniques and research described here may also be applicable to other high-tech areas such as robotics, avionics, vetronics, and space.

    Researcher's guide to the NASA Ames Flight Simulator for Advanced Aircraft (FSAA)

    Performance, limitations, supporting software, and current checkout and operating procedures are presented for the flight simulator in terms useful to the researcher who intends to use it. Suggestions to help the researcher prepare the experimental plan are also given. The FSAA's central computer, cockpit, and visual and motion systems are addressed individually, but their interaction is considered as well. Data requirements, available options, user responsibilities, and occupancy procedures are given in a form that facilitates the initial communication required with the NASA operations group.