Texture dependence of motion sensing and free flight behavior in blowflies
Lindemann JP, Egelhaaf M. Texture dependence of motion sensing and free flight behavior in blowflies. Frontiers in Behavioral Neuroscience. 2013;6:92.
Many flying insects exhibit an active flight and gaze strategy: purely translational flight segments alternate with quick turns called saccades. To generate such a saccadic flight pattern, the animals decide on the timing, direction, and amplitude of the next saccade during the preceding translatory intersaccadic interval. The information underlying these decisions is assumed to be extracted from the retinal image displacements (optic flow), which scale with the distance to objects during the intersaccadic flight phases. In an earlier study we proposed a saccade-generation mechanism based on the responses of large-field motion-sensitive neurons. In closed-loop simulations we achieved collision avoidance behavior in a limited set of environments but observed collisions in others. Here we show by open-loop simulations that the cause of this observation is the known texture dependence of elementary motion detection in flies, which is also reflected in the responses of the large-field neurons used in our model. We verified by electrophysiological experiments that this result is not an artifact of the sensory model. Even subtle changes in texture may lead to qualitative differences in the responses of both our model cells and their biological counterparts in the fly's brain. Nonetheless, the free flight behavior of blowflies is only moderately affected by such texture changes. This divergent texture dependence of motion-sensitive neurons and behavioral performance suggests either mechanisms that compensate for the texture dependence of the visual motion pathway at the level of the circuits generating the saccadic turn decisions, or the involvement of a hypothetical parallel pathway in saccadic control that provides the information for collision avoidance independently of the textural properties of the environment.
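The texture dependence at issue can be illustrated with a minimal Hassenstein-Reichardt correlation-type EMD. The sketch below is not the authors' simulation code; filter choice, time constants, and stimulus parameters are illustrative assumptions. It shows that two gratings drifting at the same velocity but with different spatial frequencies evoke different mean detector responses.

```python
import numpy as np

def lowpass(x, tau, dt=1.0):
    """First-order low-pass filter (discrete leaky integrator)."""
    alpha = dt / (tau + dt)
    y = np.empty_like(x)
    y[0] = x[0]
    for t in range(1, len(x)):
        y[t] = y[t - 1] + alpha * (x[t] - y[t - 1])
    return y

def reichardt_emd(left, right, tau=10.0):
    """Correlation-type EMD: each input is delayed (low-pass filtered) and
    multiplied with the undelayed neighboring input; the two mirror-symmetric
    half-detectors are subtracted (full opponency)."""
    return lowpass(left, tau) * right - lowpass(right, tau) * left

# Same velocity, different texture (spatial frequency) -> different response.
t = np.arange(4000, dtype=float)
v = 5.0                     # pattern velocity (deg per time step), fixed
dphi = 2.0                  # photoreceptor spacing (deg)
for fs in (0.02, 0.05):     # spatial frequency of the grating (cycles/deg)
    ft = v * fs             # temporal frequency seen by each photoreceptor
    left = np.sin(2 * np.pi * ft * t)
    right = np.sin(2 * np.pi * (ft * t - fs * dphi))
    resp = reichardt_emd(left, right)
    print(f"spatial frequency {fs}: mean response {resp[1000:].mean():+.4f}")
```

Running this prints clearly different mean responses for the two spatial frequencies although the simulated velocity is identical, which is exactly the ambiguity the abstract describes.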
Spatial vision in insects is facilitated by shaping the dynamics of visual input through behavioral action
Egelhaaf M, Boeddeker N, Kern R, Kurtz R, Lindemann JP. Spatial vision in insects is facilitated by shaping the dynamics of visual input through behavioral action. Frontiers in Neural Circuits. 2012;6:108.
Insects such as flies or bees, with their miniature brains, are able to control highly aerobatic flight maneuvers and to solve spatial vision tasks, such as avoiding collisions with obstacles, landing on objects, or even localizing a previously learnt, inconspicuous goal on the basis of environmental cues. In solving such spatial tasks, these insects still outperform man-made autonomous flying systems. To accomplish this extraordinary performance, flies and bees have been shown, by their characteristic behavioral actions, to actively shape the dynamics of the image flow on their eyes ("optic flow"). The neural processing of information about the spatial layout of the environment is greatly facilitated by segregating the rotational from the translational optic flow component through a saccadic flight and gaze strategy. This active vision strategy enables the nervous system to solve apparently complex spatial vision tasks in a particularly efficient and parsimonious way. The key idea of this review is that biological agents, such as flies or bees, acquire at least part of their strength as autonomous systems through active interactions with their environment, and not by simply processing passively gained information about the world. These agent-environment interactions lead to adaptive behavior in surroundings of a wide range of complexity. Animals with even tiny brains, such as insects, are capable of performing extraordinarily well in their behavioral contexts by making optimal use of the closed action-perception loop. Model simulations and robotic implementations show that the smart biological mechanisms of motion computation and visually guided flight control might help to find technical solutions, for example, when designing micro air vehicles carrying a miniaturized, low-weight on-board processor.
Contrast-Independent Biologically Inspired Motion Detection
Optic flow, i.e., retinal image movement resulting from ego-motion, is a crucial source of information used for obstacle avoidance and course control in flying insects. Optic flow analysis may prove promising for mobile robotics, although it is currently not among the standard techniques. Insects have developed a computationally cheap analysis mechanism for image motion. Detailed computational models, the so-called elementary motion detectors (EMDs), describe motion detection in insects. However, the technical application of EMDs is complicated by the strong effect of local pattern contrast on their motion response. Here we present augmented versions of an EMD, the (s)cc-EMDs, which normalise their responses for contrast and thereby reduce the sensitivity to contrast changes. Thus, velocity changes of moving natural images are reflected more reliably in the detector response. The (s)cc-EMDs can easily be implemented in hardware and software and can serve as a valuable novel visual motion sensor for mobile robots.
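One way to read the normalization idea is as a divisive gain control: divide the opponent correlation output by a running estimate of the local input energy. The sketch below is our own hedged illustration, not the published (s)cc-EMD circuit; function names, time constants, and the stabilizing constant eps are assumptions.

```python
import numpy as np

def lowpass(x, tau, dt=1.0):
    """First-order low-pass filter."""
    alpha = dt / (tau + dt)
    y = np.empty_like(x)
    y[0] = x[0]
    for t in range(1, len(x)):
        y[t] = y[t - 1] + alpha * (x[t] - y[t - 1])
    return y

def contrast_normalized_emd(left, right, tau=10.0, eps=1e-6):
    """Divide the opponent correlator output by a low-pass estimate of local
    input energy; both scale quadratically with contrast, so the ratio is
    approximately contrast-invariant (illustrative, not the (s)cc-EMD)."""
    raw = lowpass(left, tau) * right - lowpass(right, tau) * left
    energy = lowpass(left ** 2 + right ** 2, tau)
    return raw / (energy + eps)   # eps guards against division by zero

# The same moving grating at two contrasts yields nearly identical outputs.
t = np.arange(4000, dtype=float)
for contrast in (0.2, 1.0):
    left = contrast * np.sin(0.1 * t)
    right = contrast * np.sin(0.1 * t - 0.5)
    resp = contrast_normalized_emd(left, right)
    print(f"contrast {contrast}: mean normalized response {resp[1000:].mean():+.4f}")
```

A plain correlator's mean output would differ by a factor of 25 between these two contrasts; after the divisive normalization the two runs agree up to the eps term.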
Temporal Statistics of Natural Image Sequences Generated by Movements with Insect Flight Characteristics
Schwegmann A, Lindemann JP, Egelhaaf M. Temporal Statistics of Natural Image Sequences Generated by Movements with Insect Flight Characteristics. PLoS ONE. 2014;9(10):e110386.
Many flying insects, such as flies, wasps and bees, pursue a saccadic flight and gaze strategy. This behavioral strategy is thought to separate the translational and rotational components of self-motion and, thereby, to reduce the computational effort needed to extract information about the environment from the retinal image flow. Because of the distinguishing dynamic features of this active flight and gaze strategy of insects, the present study systematically analyzes the spatiotemporal statistics of image sequences generated during saccades and intersaccadic intervals in cluttered natural environments. We show that, in general, rotational movements with saccade-like dynamics elicit fluctuations and overall changes in brightness, contrast and spatial frequency up to two orders of magnitude larger than those elicited by translational movements at velocities characteristic of insects. Distinct changes in image parameters during translations are only caused by nearby objects. Image analysis based on larger patches in the visual field reveals smaller fluctuations in brightness and spatial-frequency composition compared to small patches. The temporal structure and extent of these changes in image parameters define the temporal constraints imposed on signal processing performed by the insect visual system under behavioral conditions in natural environments.
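The image parameters tracked in such an analysis can be estimated per patch and per frame. A minimal sketch, assuming simple textbook definitions (mean luminance, RMS contrast, spectrum-weighted mean spatial frequency) rather than the paper's exact estimators:

```python
import numpy as np

def patch_statistics(patch):
    """Return (brightness, RMS contrast, mean spatial frequency) of a 2D
    gray-level image patch. Definitions are illustrative textbook choices."""
    brightness = patch.mean()
    contrast = patch.std() / (brightness + 1e-9)           # RMS contrast
    spectrum = np.abs(np.fft.fft2(patch - brightness))     # drop the DC peak
    fy, fx = np.meshgrid(np.fft.fftfreq(patch.shape[0]),
                         np.fft.fftfreq(patch.shape[1]), indexing="ij")
    radial_freq = np.hypot(fx, fy)                         # cycles/pixel
    mean_freq = (radial_freq * spectrum).sum() / (spectrum.sum() + 1e-9)
    return brightness, contrast, mean_freq

# Track the statistics frame by frame, e.g. separately for saccadic and
# intersaccadic segments of a simulated flight sequence.
rng = np.random.default_rng(0)
frame = rng.random((64, 64))
print(patch_statistics(frame))
```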
Peripheral Processing Facilitates Optic Flow-Based Depth Perception
Li J, Lindemann JP, Egelhaaf M. Peripheral Processing Facilitates Optic Flow-Based Depth Perception. Frontiers in Computational Neuroscience. 2016;10:111.
Flying insects, such as flies or bees, rely on consistent information about the depth structure of the environment when performing flight maneuvers in cluttered natural environments. These behaviors include avoiding collisions, approaching targets and spatial navigation. Insects are thought to obtain depth information visually from the retinal image displacements ("optic flow") during translational ego-motion. Optic flow in the insect visual system is processed by a mechanism that can be modeled by correlation-type elementary motion detectors (EMDs). However, it is still an open question how spatial information can be extracted reliably from the highly contrast- and pattern-dependent EMD responses, especially if the vast range of light intensities encountered in natural environments is taken into account. This question is addressed here by systematically modeling the peripheral visual system of flies, including various adaptive mechanisms. Different model variants of the peripheral visual system were stimulated with image sequences that mimic the panoramic visual input during translational ego-motion in various natural environments, and the resulting peripheral signals were fed into an array of EMDs. We characterized the influence of each peripheral computational unit on the representation of spatial information in the EMD responses. Our model simulations reveal that information about the overall light level needs to be eliminated from the EMD input, as is accomplished under light-adapted conditions in the insect peripheral visual system. The response characteristics of large monopolar cells (LMCs) resemble those of a band-pass filter, which strongly reduces the contrast dependence of EMDs, effectively enhancing the representation of the nearness of objects and, especially, of their contours. We furthermore show that local brightness adaptation of photoreceptors allows for spatial vision under a wide range of dynamic light conditions.
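The two peripheral stages highlighted by the abstract, elimination of the overall light level and band-pass filtering, can be sketched as a preprocessing chain in front of an EMD array. A minimal illustration under our own assumptions (first-order filters, arbitrary time constants), not the paper's model:

```python
import numpy as np

def lowpass(x, tau, dt=1.0):
    """First-order low-pass filter."""
    alpha = dt / (tau + dt)
    y = np.empty_like(x)
    y[0] = x[0]
    for t in range(1, len(x)):
        y[t] = y[t - 1] + alpha * (x[t] - y[t - 1])
    return y

def photoreceptor(luminance, tau_adapt=500.0):
    """Local brightness adaptation (sketch): divide each sample by a slow
    running estimate of the local mean luminance, so the overall light
    level drops out and only relative modulations are passed on."""
    background = lowpass(luminance, tau_adapt)
    return luminance / (background + 1e-9) - 1.0

def lmc(signal, tau_fast=3.0, tau_slow=30.0):
    """Band-pass 'LMC' stage (sketch): the difference of a fast and a slow
    low-pass emphasizes contrast transients, i.e., object contours."""
    return lowpass(signal, tau_fast) - lowpass(signal, tau_slow)

# Chain: luminance -> adapted photoreceptor -> LMC -> EMD input.
t = np.arange(3000, dtype=float)
luminance = 100.0 * (1.0 + 0.3 * np.sin(0.05 * t))   # bright, modulated scene
emd_input = lmc(photoreceptor(luminance))
print(emd_input[1000:].std())
```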
Depth information in natural environments derived from optic flow by insect motion detection system: a model analysis
Schwegmann A, Lindemann JP, Egelhaaf M. Depth information in natural environments derived from optic flow by insect motion detection system: a model analysis. Frontiers in Computational Neuroscience. 2014;8:83.
Knowing the depth structure of the environment is crucial for moving animals in many behavioral contexts, such as collision avoidance, targeting objects, or spatial navigation. An important source of depth information is motion parallax. This powerful cue is generated on the eyes during translatory self-motion, with the retinal images of nearby objects moving faster than those of distant ones. To investigate how the visual motion pathway represents motion-based depth information, we analyzed its responses to image sequences recorded in natural cluttered environments with a wide range of depth structures. The analysis was based on an experimentally validated model of the visual motion pathway of insects, whose core elements are correlation-type elementary motion detectors (EMDs). The key result of our analysis is that the absolute EMD responses, i.e., the motion energy profile, represent the contrast-weighted nearness of environmental structures during translatory self-motion at a roughly constant velocity. In other words, the output of the EMD array highlights the contours of nearby objects. This conclusion is largely independent of the scale over which EMDs are spatially pooled and was corroborated by scrutinizing the motion energy profile after eliminating the depth structure from the natural image sequences. Hence, the well-established dependence of correlation-type EMDs on both the velocity and the textural properties of motion stimuli appears to be advantageous for representing behaviorally relevant information about the environment in a computationally parsimonious way.
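The "motion energy profile" can be read as spatially pooled absolute EMD output; under that reading, a short sketch (pooling kernel and scale are our assumptions):

```python
import numpy as np

def motion_energy_profile(emd_array_output, pool_width=5):
    """Rectify (absolute value) and spatially pool the responses of a 1D
    retinotopic EMD array. Per the study above, during translation at a
    roughly constant speed this profile approximates contrast-weighted
    nearness, peaking at contours of nearby objects (sketch)."""
    kernel = np.ones(pool_width) / pool_width
    return np.convolve(np.abs(emd_array_output), kernel, mode="same")

print(motion_energy_profile(np.array([0.0, 0.2, -1.5, 1.4, 0.1, 0.0])))
```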
Motion as a source of environmental information: a fresh view on biological motion computation by insect brains
Egelhaaf M, Kern R, Lindemann JP. Motion as a source of environmental information: a fresh view on biological motion computation by insect brains. Frontiers in Neural Circuits. 2014;8:127.
Despite their miniature brains, insects such as flies, bees and wasps are able to navigate through cluttered environments with highly aerobatic flight maneuvers. To accomplish this extraordinary performance, they rely on the spatial information contained in the retinal motion patterns induced on the eyes while moving around ("optic flow"). Thereby, they employ an active flight and gaze strategy that separates rapid saccade-like turns from translatory flight phases in which the gaze direction is kept largely constant. This behavioral strategy facilitates the processing of environmental information, because information about the distance of the animal to objects in the environment is only contained in the optic flow generated by translatory motion. However, motion detectors of the type widespread in biological systems do not represent the velocity of the optic flow vectors veridically, but also reflect textural information about the environment. This characteristic has often been regarded as a limitation of biological motion detection mechanisms. In contrast, from analyses challenging insect movement detectors with image flow as generated during translatory locomotion through cluttered natural environments, we conclude that this mechanism represents the contours of nearby objects. Contrast borders are a main carrier of functionally relevant object information in artificial and natural sceneries. The motion detection system thus segregates, in a computationally parsimonious way, the environment into behaviorally relevant nearby objects and, in many behavioral contexts, less relevant distant structures. Hence, by making use of an active flight and gaze strategy, insects are capable of performing extraordinarily well even with a computationally simple motion detection mechanism.
Local motion adaptation enhances the representation of spatial structure at EMD arrays
Li J, Lindemann JP, Egelhaaf M. Local motion adaptation enhances the representation of spatial structure at EMD arrays. PLOS Computational Biology. 2017;13(12):e1005919.
Neuronal representation and extraction of spatial information are essential for behavioral control. For flying insects, a plausible way to gain spatial information is to exploit the distance-dependent optic flow that is generated during translational self-motion. Optic flow is computed by arrays of local motion detectors retinotopically arranged in the second neuropile layer of the insect visual system. These motion detectors have adaptive response characteristics, i.e., their responses to motion with a constant or only slowly changing velocity decrease, while their sensitivity to rapid velocity changes is maintained or even increases. We analyzed by a modeling approach how motion adaptation affects signal representation at the output of arrays of motion detectors during simulated flight in artificial and natural 3D environments. We focused on translational flight, because spatial information is only contained in the optic flow induced by translational locomotion. Indeed, flies, bees and other insects segregate their flight into relatively long intersaccadic translational flight sections interspersed with brief and rapid saccadic turns, presumably to maximize the periods of translation (80% of the flight). With a novel adaptive model of the insect visual motion pathway we could show that the motion detector responses to background structures of cluttered environments are largely attenuated as a consequence of motion adaptation, while responses to foreground objects stay constant or even increase. This conclusion holds even under the dynamic flight conditions of insects.
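The adaptive response property described above (sustained responses decrease, sensitivity to rapid changes is kept) can be captured by a divisive gain control on each detector's output. A hedged sketch under our own assumptions, not the paper's adaptive model:

```python
import numpy as np

def adapt_motion_output(resp, tau_adapt=300.0, dt=1.0, eps=0.1):
    """Local motion adaptation (sketch): divide each detector output by a
    slow running average of its own magnitude. The response to sustained,
    constant-velocity motion decays toward a fixed level, while a rapid
    change, e.g. a nearby object sweeping through the receptive field,
    still evokes a large response."""
    alpha = dt / (tau_adapt + dt)
    state = 0.0
    out = np.zeros_like(resp)
    for t in range(len(resp)):
        state += alpha * (abs(resp[t]) - state)   # slow magnitude estimate
        out[t] = resp[t] / (state + eps)          # divisive gain control
    return out

# Constant background motion plus a brief foreground transient:
resp = np.full(2000, 0.5)
resp[1200:1230] = 2.0
adapted = adapt_motion_output(resp)
# Onset response, adapted background response, transient response:
print(adapted[5], adapted[1000], adapted[1210])
```

Running it shows the response to constant motion decaying from roughly 4.6 to below 1, while the foreground transient still drives the adapted output above 3.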
A Bio-inspired Collision Avoidance Model Based on Spatial Information Derived from Motion Detectors Leads to Common Routes
Bertrand O, Lindemann JP, Egelhaaf M. A Bio-inspired Collision Avoidance Model Based on Spatial Information Derived from Motion Detectors Leads to Common Routes. PLoS Computational Biology. 2015;11(11):e1004339.
Avoiding collisions is one of the most basic needs of any mobile agent, both biological and technical, whether searching around or aiming toward a goal. We propose a model of collision avoidance inspired by behavioral experiments on insects and by properties of the optic flow experienced on a spherical eye during translation, and test the interaction of this model with goal-driven behavior. Insects, such as flies and bees, actively separate the rotational and translational optic flow components via behavior, i.e., by employing a saccadic strategy of flight and gaze control. Optic flow experienced during translation, i.e., during intersaccadic phases, contains information on the depth structure of the environment, but this information is entangled with that on self-motion. Here, we propose a simple model to extract the depth structure from translational optic flow by using local properties of a spherical eye. On this basis, a motion direction of the agent is computed that ensures collision avoidance. Flying insects are thought to measure optic flow by correlation-type elementary motion detectors. Their responses depend, in addition to velocity, on the texture and contrast of objects and thus do not measure the velocity of objects veridically. Therefore, we initially used geometrically determined optic flow as input to the collision avoidance algorithm to show that depth information inferred from optic flow is sufficient to account for collision avoidance under closed-loop conditions. Then, the collision avoidance algorithm was tested with bio-inspired correlation-type elementary motion detectors at its input. Even then, the algorithm successfully led to collision avoidance and, in addition, replicated the characteristics of the collision avoidance behavior of insects. Finally, the collision avoidance algorithm was combined with a goal direction and tested in cluttered environments. The simulated agent then showed goal-directed behavior reminiscent of components of the navigation behavior of insects.
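The core geometric step can be sketched in 2D (the paper works on a full spherical eye, so this is a simplified reading, and the function names and mock stimulus are ours): during pure translation the flow magnitude at azimuth theta is proportional to nearness times sin(theta), so nearness can be recovered pointwise, summed into a nearness-weighted direction vector, and the agent steered away from it.

```python
import numpy as np

def avoidance_direction(flow_magnitude, azimuths, forward_speed):
    """During pure translation the flow magnitude at viewing direction
    theta is v * nearness * sin(theta), so nearness can be recovered
    pointwise. The vector sum of nearness-weighted viewing directions
    points toward the densest clutter; steer opposite to it (sketch)."""
    s = np.abs(np.sin(azimuths))
    valid = s > 0.2     # nearness is ill-conditioned near the flow poles
    nearness = np.zeros_like(flow_magnitude)
    nearness[valid] = flow_magnitude[valid] / (forward_speed * s[valid])
    com_x = np.sum(nearness * np.cos(azimuths))
    com_y = np.sum(nearness * np.sin(azimuths))
    return np.arctan2(-com_y, -com_x)   # heading away from nearby structures

# Mock flow field with a single obstacle about 45 deg to one side:
azimuths = np.linspace(-np.pi, np.pi, 72, endpoint=False)
flow = np.exp(-((azimuths - np.pi / 4) ** 2) / 0.1)
print(np.degrees(avoidance_direction(flow, azimuths, forward_speed=1.0)))
```

With the mock obstacle at +45 deg the sketch returns a heading of roughly -135 deg, i.e., directly away from the cluttered direction.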
Head orientation of walking blowflies is controlled by visual and mechanical cues
Monteagudo Ibarreta J, Lindemann JP, Egelhaaf M. Head orientation of walking blowflies is controlled by visual and mechanical cues. The Journal of Experimental Biology. 2017;220(24):4578-4582.
During locomotion, animals employ visual and mechanical cues to establish the orientation of their head, which reflects the orientation of the visual coordinate system. In certain situations, however, contradictory cues may suggest different orientations relative to the environment. We recorded blowflies walking on a horizontal or tilted surface surrounded by visual cues suggesting a variety of orientations. We found that the orientations relative to gravity suggested by the visual cues and by the walking surface were integrated, with the orientation of the surface being the major contributor to head orientation, while visual cues and gravity also play an important role. In contrast, visual cues did not affect body orientation much. Cue integration was modeled as the weighted sum of the orientations suggested by the different cues. Our model suggests that when visual cues are lacking, more weight is given to gravity.
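The weighted-sum model lends itself to a compact sketch. The weights below are hypothetical placeholders, not the fitted values from the paper, and we use a weighted circular mean so that orientations wrap correctly, which is an implementation choice of ours:

```python
import numpy as np

def integrate_cues(cue_angles_deg, weights):
    """Weighted cue integration (sketch): combine orientation estimates as
    a weighted circular mean so that angles near the +/-180 deg wrap-around
    do not cancel arithmetically."""
    a = np.radians(cue_angles_deg)
    return np.degrees(np.arctan2(np.sum(weights * np.sin(a)),
                                 np.sum(weights * np.cos(a))))

# Hypothetical example: surface tilt dominates, gravity and the visual
# panorama contribute less (weights are placeholders, not fitted values).
cues = np.array([20.0, 0.0, 8.0])        # surface, gravity, panorama (deg roll)
weights = np.array([0.6, 0.25, 0.15])
print(integrate_cues(cues, weights))     # predicted head roll (deg)
```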
- …