
    Optical flow sensing and the inverse perception problem for flying bats

The movements of birds, bats, and other flying species are governed by complex sensorimotor systems that allow the animals to react to stationary environmental features as well as to wind disturbances, other animals in nearby airspace, and a wide variety of unexpected challenges. The paper and talk describe research that analyzes the three-dimensional trajectories of bats flying in a habitat in Texas. The trajectories are computed with stereoscopic methods using data from synchronous thermal videos that were recorded with high temporal and spatial resolution from three viewpoints. Following our previously reported work, we examine the possibility that bat trajectories in this habitat are governed by optical flow sensing that interpolates periodic distance measurements from echolocation. Using an idealized geometry of bat eyes, we introduce the concept of time-to-transit, and recall research suggesting that this quantity is computed by the animals' visual cortex. Several steering control laws based on time-to-transit are proposed for an idealized flight model, and it is shown that these can be used to replicate the observed flight of what we identify as typical bats. Although the vision-based motion control laws we propose and the protocols for switching between them are quite simple, some of the trajectories that have been synthesized are qualitatively bat-like. Examination of the control protocols that generate these trajectories suggests that bat motions are governed both by their reactions to a subset of key feature points and by their memories of where these feature points are located.
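
For the idealized geometry the abstract mentions, time-to-transit reduces to along-track distance divided by speed. Below is a minimal sketch in Python, assuming a planar unicycle-style flight model and a simple "turn away from the soonest-transiting feature" law; the function names, gains, and sign conventions are illustrative stand-ins, not the paper's actual control laws:

```python
import numpy as np

def time_to_transit(agent_pos, heading, speed, feature_pos):
    """Time until a stationary feature passes abeam of an agent moving
    at constant velocity: along-track distance divided by speed."""
    d = np.asarray(feature_pos, float) - np.asarray(agent_pos, float)
    along_track = d @ np.array([np.cos(heading), np.sin(heading)])
    return along_track / speed

def steer_away_from_nearest(taus, bearings, k=1.0):
    """One illustrative tau-based law: turn away from the feature that
    will transit soonest, turning harder the more imminent the transit.
    Convention: bearing > 0 means the feature is to the left, and a
    positive turn rate is a leftward (counterclockwise) turn."""
    ahead = [(t, b) for t, b in zip(taus, bearings) if t > 0]
    if not ahead:
        return 0.0
    t_min, b_min = min(ahead)
    return -k * np.sign(b_min) / t_min

def step_flight(pos, heading, speed, turn_rate, dt):
    """Idealized planar flight model used to integrate steering commands."""
    heading += turn_rate * dt
    pos = np.asarray(pos, float) + speed * dt * np.array(
        [np.cos(heading), np.sin(heading)])
    return pos, heading
```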

    Perception and steering control in paired bat flight

Animals within groups need to coordinate their reactions to perceived environmental features and to each other in order to move safely from one point to another. This paper extends our previously published work on the flight patterns of Myotis velifer that have been observed in a habitat near Johnson City, Texas. Each evening, these bats emerge from a cave in sequences of small groups that typically contain no more than three or four individuals, and they thus provide ideal subjects for studying leader-follower behaviors. By analyzing the flight paths of a group of M. velifer, the data show that the flight behavior of a follower bat is influenced by that of a leader bat in a way that is not well explained by existing pursuit laws, such as classical pursuit, constant bearing, and motion camouflage. Thus we propose an alternative steering law based on virtual loom, a concept we introduce to capture the geometrical configuration of the leader-follower pair. It is shown that this law may be integrated with our previously proposed vision-enabled steering laws to synthesize trajectories whose statistics fit those of the bats in our data set. The results suggest that bats use perceived information about both the environment and their neighbors for navigation.
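
The abstract does not give equations for the virtual loom law, so it is not reproduced here; the pursuit laws the paper rules out are standard, however. Below is a minimal sketch of two of them, assuming a planar follower steered by a turning-rate command (gains and angle conventions are illustrative):

```python
import numpy as np

def classical_pursuit(follower_pos, follower_heading, leader_pos, k=1.0):
    """Classical pursuit: turn so the follower's heading aligns with the
    instantaneous line of sight to the leader."""
    los = np.arctan2(leader_pos[1] - follower_pos[1],
                     leader_pos[0] - follower_pos[0])
    err = np.arctan2(np.sin(los - follower_heading),
                     np.cos(los - follower_heading))  # wrap to (-pi, pi]
    return k * err  # turning-rate command

def constant_bearing(follower_pos, follower_heading, leader_pos,
                     bearing_ref, k=1.0):
    """Constant-bearing pursuit: hold the line-of-sight angle at a fixed
    offset bearing_ref instead of driving it to zero."""
    los = np.arctan2(leader_pos[1] - follower_pos[1],
                     leader_pos[0] - follower_pos[0])
    err = np.arctan2(np.sin(los - bearing_ref - follower_heading),
                     np.cos(los - bearing_ref - follower_heading))
    return k * err
```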

    Perceptual modalities guiding bat flight in a native habitat

Flying animals accomplish high-speed navigation through fields of obstacles using a suite of sensory modalities that blend spatial memory with input from vision, tactile sensing, and, in the case of most bats and some other animals, echolocation. Although a good deal of previous research has focused on the role of individual modes of sensing in animal locomotion, our understanding of sensory integration and the interplay among modalities is still meager. To understand how bats integrate sensory input from echolocation, vision, and spatial memory, we conducted an experiment in which bats flying in their natural habitat were challenged over the course of several evening emergences with a novel obstacle placed in their flight path. Our analysis of reconstructed flight data suggests that vision, echolocation, and spatial memory, together with the possible use of predictive navigation, are mutually reinforcing aspects of a composite perceptual system that guides flight. Together with recent developments in robotics, our paper points to the possible interpretation that while each stream of sensory information plays an important role in bat navigation, it is the emergent effects of combining modalities that enable bats to fly through complex spaces.

    Saliency Based Control in Random Feature Networks

The ability to rapidly focus attention and react to salient environmental features enables animals to move agilely through their habitats. To replicate this kind of high-performance movement control in synthetic systems, we propose a new approach to feedback control that bases control actions on randomly perceived features. Connections are made with recent work incorporating communication protocols into networked control systems. The concepts of random channel controllability and random channel observability for LTI control systems are introduced and studied.
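
The formal definitions of random channel controllability and observability are in the paper; as a toy illustration of the underlying idea, one can ask whether a discrete-time LTI state remains reconstructible when only a random subset of output channels is seen at each step. A sketch under that assumption (the system matrices, the masking model, and the rank test below are mine, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(0)

# Discrete-time LTI system x[k+1] = A x[k], y[k] = C_k x[k], where C_k
# keeps a random subset of the rows of C at each step.
A = np.array([[1.0, 0.1], [0.0, 1.0]])
C = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])

def random_channel_obs_matrix(A, C, horizon, p_keep=0.5):
    """Stack C_k A^k along one random channel-selection sequence; full
    column rank suggests the state can be reconstructed from the
    randomly perceived channels on that sequence."""
    blocks, Ak = [], np.eye(A.shape[0])
    for _ in range(horizon):
        keep = rng.random(C.shape[0]) < p_keep  # random channel mask
        if keep.any():
            blocks.append(C[keep] @ Ak)
        Ak = A @ Ak
    return np.vstack(blocks) if blocks else np.zeros((0, A.shape[0]))

O = random_channel_obs_matrix(A, C, horizon=10)
print("observable along this channel sequence:",
      np.linalg.matrix_rank(O) == A.shape[0])
```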

    Communicating through motion in dance and animal groups

This study explores principles of motion-based communication in animal and human group behavior. It develops models of cooperative control that involve communication through actions aimed at a shared objective, and it seeks to understand collective motion in multi-agent models in which progress toward a desired objective requires interaction with the environment. In conducting a formal study of these problems, we first investigate the leader-follower interaction in a dance performance. Here, the prototype model is salsa. Salsa is of interest because it is a structured interaction between a leader (usually a male dancer) and a follower (usually a female dancer). Success in a salsa performance depends on how effectively the dance partners communicate with each other using hand, arm, and body motion. We construct a mathematical framework in terms of a Dance Motion Description Language (DMDL). This provides a way to specify control protocols for dance moves and to represent every performance as a sequence of letters and corresponding motion signals. An enhanced (intermediate-level) form of salsa is discussed in which the constraints on motion transitions are described by simple rules suggested by topological knot theory. It is shown that the proficiency hierarchy in dance is effectively captured by the proposed complexity metrics. To investigate the group behavior of animals reacting to environmental features, we have analyzed a large data set derived from 3-D video recordings of groups of Myotis velifer emerging from a cave. A detailed statistical analysis of large numbers of trajectories indicates that, within certain bounds of animal diversity, there appear to be common characteristics in the animals' reactions to features in a clearly defined flight corridor near the mouth of the cave. A set of vision-based motion control primitives is proposed and shown to be effective in synthesizing bat-like flight paths near groups of obstacles. A comparison of synthesized paths and actual bat motions culled from our data set suggests that motions are not based purely on reactions to environmental features; spatial memory and reactions to the movement of other bats may also play a role. It is argued that most bats employ a hybrid navigation strategy that combines reactions to nearby obstacles and other visual features with some combination of spatial memory and reactions to the motions of other bats.
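
To give a flavor of the DMDL idea, a performance can be encoded as a string over a small alphabet of moves and scored with a transition-based complexity measure. A toy sketch; the alphabet, the example sequence, and the entropy-style metric are illustrative stand-ins, not the thesis's actual DMDL or its complexity metrics:

```python
from collections import Counter
from math import log2

# Illustrative move alphabet (hypothetical): B = basic step,
# R = right turn, C = cross-body lead, H = hammerlock.
performance = "BBRBCBBRHBCB"

def transition_entropy(seq):
    """A simple proxy complexity metric: entropy of the move-to-move
    transition distribution. Richer transition structure scores higher.
    (The thesis's own DMDL complexity metrics may differ.)"""
    bigrams = Counter(zip(seq, seq[1:]))
    total = sum(bigrams.values())
    return -sum((n / total) * log2(n / total) for n in bigrams.values())

print(f"complexity: {transition_entropy(performance):.2f} bits")
```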

    Discovering useful parts for pose estimation in sparsely annotated datasets

Our work introduces a novel way to increase pose estimation accuracy by discovering parts from unannotated regions of training images. Discovered parts are used to generate more accurate appearance likelihoods for traditional part-based models such as Pictorial Structures and its derivatives. Our experiments on images of a hawkmoth in flight show that the proposed approach significantly improves over existing work for this application while also being more generally applicable, localizing landmarks at least twice as accurately as a baseline based on a Mixture of Pictorial Structures (MPS) model. Our unique High-Resolution Moth Flight (HRMF) dataset is made publicly available with annotations (https://arxiv.org/abs/1605.00707).
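
The paper's discovered parts feed appearance likelihoods into a Pictorial Structures model, whose standard inference core on a chain of parts is exact dynamic programming over unary (appearance) and pairwise (deformation) costs. A minimal sketch of that generic machinery, under the assumption of a chain rather than a general tree; this is not the paper's own pipeline:

```python
import numpy as np

def chain_pictorial_structures(unary, pairwise):
    """Exact MAP inference for a chain-structured pictorial structures
    model by dynamic programming.
    unary:    list of (n_locations,) appearance costs, one per part
    pairwise: list of (n_i, n_{i+1}) deformation costs between parts
    Returns the minimising location index for each part."""
    cost = unary[0].copy()
    back = []
    for i in range(1, len(unary)):
        total = cost[:, None] + pairwise[i - 1]  # all transitions
        back.append(np.argmin(total, axis=0))    # best predecessor
        cost = np.min(total, axis=0) + unary[i]
    locs = [int(np.argmin(cost))]                # backtrack
    for bp in reversed(back):
        locs.append(int(bp[locs[-1]]))
    return locs[::-1]

# Toy usage: 3 parts, 4 candidate locations each.
rng = np.random.default_rng(0)
unary = [rng.random(4) for _ in range(3)]
pairwise = [rng.random((4, 4)) for _ in range(2)]
print(chain_pictorial_structures(unary, pairwise))
```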

    Visual Navigation Using Sparse Optical Flow and Time-to-Transit

Drawing inspiration from biology, we describe the way in which visual sensing with a monocular camera can provide a reliable signal for navigation of mobile robots. The work takes inspiration from the classic paper that described a behavioral strategy pursued by diving sea birds based on a visual cue called time-to-contact. A closely related concept, time-to-transit (tau), is defined, and it is shown that steering laws based on monocular camera perceptions of tau can reliably steer a mobile vehicle. The contribution of the paper is two-fold: it provides a simple theory of robust vision-based steering control, and it shows how the theory guides the implementation of robust visual navigation in ROS-Gazebo simulations as well as in deployment and experiments with a camera-equipped Jackal robot. As will be noted, there is an extensive literature on how animals use optical flow to guide their movements. The novelty of the work below is the introduction of the concepts of Eulerian optical flow and time-to-transit, tau, and the demonstration that control laws based on the tau values associated with an aggregated set of features in the field of view can be used to reliably steer a laboratory robot.
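
The key computational point in the abstract is that tau needs only an image coordinate and its optical-flow velocity, with no range or object-size information. A minimal sketch of per-feature tau and one plausible aggregated steering law in the spirit of the paper; the aggregation, the fallback, and the sign convention are assumptions:

```python
import numpy as np

def tau_from_flow(x, x_dot, eps=1e-9):
    """Time-to-transit of one tracked feature: its image coordinate x
    (measured from the optical axis) over its optical-flow velocity."""
    denom = x_dot if abs(x_dot) > eps else eps
    return x / denom

def steer_tau_balance(features, k=1.0):
    """Average tau over features left and right of the optical axis and
    steer to equalise the two sides. Sign convention (illustrative):
    positive output turns toward the side with more time to spare.
    `features` is a list of (x, x_dot) pairs."""
    taus = np.array([tau_from_flow(x, xd) for x, xd in features])
    xs = np.array([x for x, _ in features])
    left, right = taus[xs < 0], taus[xs >= 0]
    if left.size == 0 or right.size == 0:
        return 0.0  # cannot balance with features on only one side
    return k * (np.mean(right) - np.mean(left))
```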

    Perceptual aliasing in vision-based robot navigation

In order to create intelligent robots that are able to react to their environment through computer vision, it has been of interest to study how humans and animals receive and process visual information. Flying animals, such as birds and bats, use a vision processing technique called optical flow to navigate the environment. The key to making use of optical flow for feedback control is the idea of time-to-transit, a measure of how long it will take an observer to pass an object in its field of view. Using optical flow data alone, this time-to-transit (tau) can be calculated without knowing the distance to the object or the size of the object itself. Tau can be computed in real time and used as input to autonomous vehicle control laws. Vision-based navigation of autonomous robotic vehicles can support applications in both the military and civilian sectors. In this work, a series of feedback control laws for autonomous robot control, whose inputs are the frames of a video sequence in real time, are developed. Two control laws, termed motion primitives, are developed based on tau balancing and tau difference maximizing, and protocol-switching logic is established to determine when each should be employed. The tau balancing law utilizes information on both the right and left sides of the path environment, when available, and attempts to balance between them. The tau difference maximizing primitive, by contrast, aligns the vehicle motion with features on one side or the other. A third navigation strategy is also implemented in which the stages of sensing, perceiving, and acting are separated. A simulation environment is developed as a test-bed for studying the effects of changing control law parameters and decision variables for protocol switches. In some cases, it may appear as though one strategy can be used when the other is actually required. Such situations are referred to as occurrences of perceptual aliasing: the misinterpretation of perceptual cues, leading to the execution of an unsuitable action. Such misunderstanding of the environment can lead to dangerous motions of the vehicle, as would occur when the controller attempts to steer the vehicle between features on the left and right sides of a solid obstacle or wall in the vehicle's path. Without safeguards in place to prevent this misinterpretation, perceptual aliasing could cause a robot to collide with obstacles in its environment. Perceptual aliasing can occur whenever the most intuitive control strategy will not result in successful navigation. The problem is overcome through studies of human and animal perception, as well as a statistical analysis of the structure of optical flow and time-to-transit, to intelligently select which control strategy to implement. These control laws are composed to allow a robot to autonomously navigate a corridor environment with both straight and turning sections.
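
To make the two motion primitives and the switching question concrete, here is a minimal sketch: tau balancing, tau difference maximizing, and one plausible aliasing guard that commits to a single side when left and right taus look too alike (as they would on a solid wall spanning the view). The statistic and threshold are illustrative assumptions, not the thesis's actual decision variables:

```python
import numpy as np

def tau_balance(tau_l, tau_r, k=1.0):
    """Primitive 1 (tau balancing): equalise mean time-to-transit on the
    left and right, centring the vehicle between the two feature groups.
    Convention: positive output is an (arbitrary) rightward turn."""
    return k * (np.mean(tau_r) - np.mean(tau_l))

def tau_difference_max(tau_l, tau_r, k=1.0):
    """Primitive 2 (tau difference maximizing): commit to the more open
    side (larger mean tau) instead of splitting the difference."""
    return k if np.mean(tau_l) < np.mean(tau_r) else -k

def choose_primitive(tau_l, tau_r, spread_thresh=0.5):
    """Illustrative switching logic: if left and right taus are too
    alike, the apparent 'gap' may be a solid obstacle (perceptual
    aliasing), so commit to one side rather than balancing."""
    all_tau = np.concatenate([tau_l, tau_r])
    aliasing_risk = np.std(all_tau) / np.mean(all_tau) < spread_thresh
    return tau_difference_max if aliasing_risk else tau_balance

# Usage: pick a primitive from the current tau statistics, then apply it.
tau_l, tau_r = np.array([2.1, 2.3]), np.array([2.2, 2.0])
primitive = choose_primitive(tau_l, tau_r)
print(primitive.__name__, primitive(tau_l, tau_r))
```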

    Visual navigation with a 2-pixel camera -- possibilities and limitations

Borrowing terminology from fluid mechanics, the concepts of Eulerian and Lagrangian optical flow sensing are introduced. Eulerian optical flow sensing assumes that each photoreceptor in the camera or eye can instantaneously detect feature image points and their velocities on the retina. If this assumption is satisfied, even a two-pixel imaging system can provide a moving agent with information about its movement along a corridor that is precise enough to serve as a robustly reliable steering signal. Implementing Eulerian optical flow sensing poses significant challenges, however. Lagrangian optical flow, on the other hand, tracks feature image points as they move across the retina. This form of visual sensing is the basis for many standard computer vision implementations, including the Lucas-Kanade and Horn-Schunck methods. Lagrangian optical flow has its own challenges, not least of which is that it is badly confounded by rotational components of motion. Combined steering and sensing strategies for mitigating the effects of rotational motion are considered.
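
A two-pixel Eulerian sensor reduces to two instantaneous flow readings, one per corridor wall; the difference of their magnitudes is a steering signal, and the rotational confound enters as an additive flow component that must be removed first. A minimal sketch, with the de-rotation term and the sign convention as stated assumptions (the exact rotational correction depends on camera geometry):

```python
def two_pixel_steering(flow_left, flow_right, yaw_rate=0.0, k=1.0):
    """Eulerian sensing with two photoreceptors aimed at the corridor
    walls: each reports the instantaneous image velocity at its fixed
    retinal location. The nearer wall flows faster, so the difference of
    magnitudes is usable for steering. Convention: positive output is a
    rightward turn, away from a fast-flowing (near) left wall."""
    fl = flow_left - yaw_rate   # crude de-rotation; the exact correction
    fr = flow_right - yaw_rate  # depends on camera geometry (assumption)
    return k * (abs(fl) - abs(fr))
```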