193 research outputs found

    How do field of view and resolution affect the information content of panoramic scenes for visual navigation? A computational investigation

    The visual systems of animals have to provide information to guide behaviour, and the informational requirements of an animal’s behavioural repertoire are often reflected in its sensory system. For insects, this is often evident in the optical array of the compound eye. One behaviour that insects share with many animals is the use of learnt visual information for navigation. As ants are expert visual navigators, their vision may be optimised for navigation. Here we take a computational approach, asking how the details of the optical array influence the informational content of scenes used in simple view-matching strategies for orientation. We find that robust orientation is best achieved with low-resolution visual information and a large field of view, similar to the optical properties seen in many ant species. A lower resolution allows for a trade-off between specificity and generalisation for stored views. Additionally, our simulations show that orientation performance increases if different portions of the visual field are treated as discrete visual sensors, each giving an independent directional estimate. This suggests that ants might benefit from processing information from their two eyes independently.
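The view-matching idea the abstract describes can be sketched in a few lines: treat a low-resolution panorama as a small image whose columns correspond to azimuth, so rotating the agent is a roll of the columns, and recover orientation by finding the rotation that best matches a stored view. This is a minimal illustrative sketch, not the authors' code; the array sizes and the SSD measure are assumptions.

```python
import numpy as np

def best_heading(stored, current):
    """Return the column shift (heading offset) that minimises the
    sum-of-squared-differences between a stored panoramic view and the
    current view. Rotating the agent corresponds to rolling the columns."""
    width = stored.shape[1]
    diffs = [np.sum((np.roll(current, s, axis=1) - stored) ** 2)
             for s in range(width)]
    return int(np.argmin(diffs))

# Toy example: the "current" view is the stored view rotated by 3 columns.
rng = np.random.default_rng(0)
stored = rng.random((4, 36))           # 4 x 36 low-resolution panorama
current = np.roll(stored, -3, axis=1)  # agent rotated by 3 columns
print(best_heading(stored, current))   # → 3
```

Lowering the resolution (fewer columns) coarsens the match but makes a stored view generalise to nearby positions, which is the trade-off the abstract refers to.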

    Ground-nesting insects could use visual tracking for monitoring nest position during learning flights

    Ants, bees and wasps are central place foragers. They leave their nests to forage and routinely return to their home-base. Most are guided by memories of the visual panorama and the visual appearance of the local nest environment when pinpointing their nest. These memories are acquired during highly structured learning walks or flights that are performed when leaving the nest for the first time, or whenever the insects had difficulties finding the nest during their previous return. Ground-nesting bees and wasps perform such learning flights daily when they depart for the first time. During these flights, the insects turn back to face the nest entrance and subsequently back away from the nest while flying along ever-increasing arcs that are centred on the nest. Flying along these arcs, the insects counter-turn in such a way that the nest entrance is always seen in the frontal visual field at slightly lateral positions. Here we asked how the insects keep track of the nest entrance location, given that it is a small, inconspicuous hole in the ground, surrounded by complex natural structures that undergo unpredictable perspective transformations as the insect pivots around the area and gains distance from it. We reconstructed the natural visual scene experienced by wasps and bees during their learning flights and applied a number of template-based tracking methods to these image sequences. We find that tracking with a fixed template fails very quickly in the course of a learning flight, but that continuously updating the template allowed us to reliably estimate nest direction in reconstructed image sequences. This is true even for later sections of learning flights, when the insects are so far away from the nest that they cannot resolve the nest entrance as a visual feature.
We discuss why visual goal-anchoring is likely to be important during the acquisition of visual-spatial memories and describe experiments to test whether insects indeed update nest-related templates during their learning flights. © 2014 Springer International Publishing Switzerland
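The contrast between fixed-template and continuously updated-template tracking can be sketched as follows: match a template against each frame by sum-of-squared-differences, then blend the matched patch back into the template so it follows the gradual appearance change. This is an illustrative sketch under assumed 1-D "frames", not the authors' method or data; the blending weight `alpha` is a hypothetical parameter (with `alpha = 0` reproducing fixed-template tracking).

```python
import numpy as np

def track(frames, template, alpha=0.5):
    """Track a target through a sequence of 1-D intensity rows by SSD
    template matching, blending the matched patch into the template
    after every frame (alpha = 0 gives fixed-template tracking)."""
    w = template.size
    positions = []
    for frame in frames:
        scores = [np.sum((frame[i:i + w] - template) ** 2)
                  for i in range(frame.size - w + 1)]
        pos = int(np.argmin(scores))
        positions.append(pos)
        # Update step: let the template drift with the target's appearance.
        template = (1 - alpha) * template + alpha * frame[pos:pos + w]
    return positions

# Toy sequence: the target patch moves and its appearance drifts brighter.
patch = np.array([1.0, 0.2, 0.9, 0.3, 0.8])
frames = []
for t, pos in enumerate([2, 5, 8]):
    frame = np.zeros(20)
    frame[pos:pos + 5] = patch + 0.3 * t   # appearance changes over time
    frames.append(frame)
print(track(frames, patch.copy()))         # → [2, 5, 8]
```

With a fixed template the growing appearance change eventually makes the background a better match, which mirrors the failure the abstract reports for fixed-template tracking.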

    A model of ant route navigation driven by scene familiarity

    In this paper we propose a model of visually guided route navigation in ants that captures the known properties of real behaviour whilst retaining mechanistic simplicity and thus biological plausibility. For an ant, the coupling of movement and viewing direction means that a familiar view specifies a familiar direction of movement. Since the views experienced along a habitual route will be more familiar, route navigation can be recast as a search for familiar views. This search can be performed with a simple scanning routine, a behaviour that ants have been observed to perform. We test this proposed route navigation strategy in simulation, by learning a series of routes through visually cluttered environments consisting of objects that are only distinguishable as silhouettes against the sky. In the first instance we determine view familiarity by exhaustive comparison with the set of views experienced during training. In further experiments we train an artificial neural network to perform familiarity discrimination using the training views. Our results indicate not only that the approach is successful, but also that the learnt routes show many of the characteristics of the routes of desert ants. As such, we believe the model represents the only detailed and complete model of insect route guidance to date. What is more, the model provides a general demonstration that visually guided routes can be produced with parsimonious mechanisms that do not specify when or what to learn, nor separate routes into sequences of waypoints.
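The scanning routine described above can be sketched directly: score each candidate heading by how familiar the view in that direction is, where familiarity is the (negated) distance to the closest training view, and move in the most familiar direction. A minimal sketch of the exhaustive-comparison variant, with assumed toy view vectors rather than the paper's simulated environments:

```python
import numpy as np

def familiarity(view, training_views):
    """Familiarity = negative SSD to the closest training view
    (exhaustive comparison over the training set)."""
    return -min(np.sum((view - t) ** 2) for t in training_views)

def scan_for_heading(views_by_heading, training_views):
    """Simulate the scanning routine: evaluate the view seen in each
    candidate heading and pick the most familiar direction."""
    scores = [familiarity(v, training_views) for v in views_by_heading]
    return int(np.argmax(scores))

# Toy example: heading 2 looks out on (almost) a remembered view.
rng = np.random.default_rng(2)
training_views = [rng.random(10) for _ in range(3)]
views_by_heading = [rng.random(10) for _ in range(5)]
views_by_heading[2] = training_views[0] + 0.01  # nearly a trained view
print(scan_for_heading(views_by_heading, training_views))  # → 2
```

The paper's neural-network variant replaces `familiarity` with a learned discriminator, so the stored views never need to be kept explicitly.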

    Effect of Spatial Charge Inhomogeneity on 1/f Noise Behavior in Graphene

    Scattering mechanisms in graphene are critical to understanding the limits of signal-to-noise ratios of unsuspended graphene devices. Here we present the four-probe low-frequency (1/f) noise characteristics of back-gated single-layer graphene (SLG) and bilayer graphene (BLG) samples. Contrary to the expected noise increase with resistance, the noise for SLG decreases near the Dirac point, possibly due to the effects of spatial charge inhomogeneity. For BLG, a similar noise reduction near the Dirac point is observed, but with a different gate dependence of its noise behavior. Some possible reasons for the different noise behavior between SLG and BLG are discussed.
    Comment: 28 pages, 3 figures + 3 supplementary figures

    Using deep autoencoders to investigate image matching in visual navigation

    This paper discusses the use of deep autoencoder networks to find a compressed representation of an image, which can be used for visual navigation. Images reconstructed from the compressed representation are tested to see if they retain enough information to be used as a visual compass (in which an image is matched with another to recall a bearing/movement direction), as this ability is at the heart of a visual route navigation algorithm. We show that both reconstructed images and compressed representations from different layers of the autoencoder can be used in this way, suggesting that a compact image code is sufficient for visual navigation and that deep networks hold promise for finding optimal visual encodings for this task.
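The visual-compass test on compressed codes can be sketched as follows: encode the current panorama at every rotation and return the rotation whose code best matches the code of the stored view. As an assumption for illustration, a fixed random linear projection stands in for the trained autoencoder encoder; this is a sketch of the matching procedure only, not the paper's network.

```python
import numpy as np

rng = np.random.default_rng(3)
W = rng.standard_normal((8, 36))  # stand-in linear encoder to an 8-D code

def encode(view):
    """Placeholder for a trained autoencoder's encoder."""
    return W @ view

def compass_bearing(stored_view, current_view):
    """Visual compass on compressed codes: encode the current view at
    every rotation and return the rotation whose code best matches the
    code of the stored view."""
    target = encode(stored_view)
    dists = [np.sum((encode(np.roll(current_view, s)) - target) ** 2)
             for s in range(current_view.size)]
    return int(np.argmin(dists))

stored = rng.random(36)                  # 1-D panoramic intensity profile
current = np.roll(stored, -5)            # agent rotated by 5 pixels
print(compass_bearing(stored, current))  # → 5
```

The same loop works whether the comparison is done on reconstructed images or directly on the codes, which is the distinction the paper evaluates.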

    Visually Guided Avoidance in the Chameleon (Chamaeleo chameleon): Response Patterns and Lateralization

    The common chameleon, Chamaeleo chameleon, is an arboreal lizard with highly independent, large-amplitude eye movements. In response to a moving threat, a chameleon on a perch responds with distinct avoidance movements that are expressed in its continuous positioning on the side of the perch distal to the threat. We analyzed body-exposure patterns during threat avoidance for evidence of lateralization, that is, asymmetry at the functional/behavioral levels. Chameleons were exposed to a threat approaching horizontally from the left or right, as they held onto a vertical pole that was either wider or narrower than the width of their head, providing, respectively, monocular or binocular viewing of the threat. We found two equal-sized sub-groups, each displaying lateralization of motor responses to a given direction of stimulus approach. Such an anti-symmetrical distribution of lateralization in a population may be indicative of situations in which organisms are regularly exposed to crucial stimuli from all spatial directions. This is because a bimodal distribution of responses to threat in a natural population will reduce the spatial advantage of predators.

    Absolute response of Fuji imaging plate detectors to picosecond-electron bunches

    The characterization of the absolute number of electrons generated by laser wakefield acceleration often relies on absolutely calibrated FUJI imaging plates (IPs), although their validity in the regime of extreme peak currents is untested. Here we present an extensive study of the sensitivity of BAS-SR and BAS-MS IPs to picosecond electron bunches of varying charge of up to 60 pC, performed at the electron accelerator ELBE, making use of about three orders of magnitude higher peak intensity than in prior studies. We demonstrate that the response of the IPs shows no saturation effect and that the BAS-SR IP sensitivity of 0.0081 photostimulated luminescence per electron confirms data from previous works surprisingly well. However, the use of an identical readout system and handling procedures turned out to be crucial and, if unnoticed, may be an important error source.
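With a calibrated sensitivity like the 0.0081 PSL per electron reported above, converting an integrated photostimulated-luminescence signal into bunch charge is a one-line calculation. A minimal sketch with an illustrative PSL value (the function name and input number are not from the paper):

```python
E_CHARGE = 1.602176634e-19  # elementary charge in coulombs

def bunch_charge_pC(total_psl, sensitivity=0.0081):
    """Convert an integrated PSL signal to bunch charge in picocoulombs,
    given a sensitivity in PSL per electron."""
    electrons = total_psl / sensitivity  # number of electrons
    return electrons * E_CHARGE * 1e12   # C -> pC

# Illustrative: 2.43e6 PSL corresponds to 3e8 electrons, i.e. ~48 pC,
# within the <= 60 pC range studied.
print(round(bunch_charge_pC(2.43e6), 2))  # → 48.07
```

The caveat in the abstract applies directly here: the sensitivity constant is only valid for the readout system and handling procedure it was calibrated with.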

    Multimodal influences on learning walks in desert ants (Cataglyphis fortis)

    Ants are excellent navigators that use multimodal information for navigation. To accurately localise the nest at the end of a foraging journey, visual cues, wind direction and also olfactory cues need to be learnt. Learning walks are performed at the start of an ant’s foraging career or when the appearance of the nest surroundings has changed. We investigated here whether the structure of such learning walks in the desert ant Cataglyphis fortis takes into account wind direction in conjunction with the learning of new visual information. Ants learnt to travel back and forth between their nest and a feeder, and we then introduced a black cylinder near their nest to induce learning walks in regular foragers. By doing this across days with different wind directions, we were able to probe how ants balance different sensory modalities. We found that (1) the ants’ outward headings are influenced by the wind direction, with their routes deflected such that they arrive downwind of their target, (2) a novel object along the route induces learning walks in experienced ants and (3) the structure of learning walks is shaped by the wind direction rather than by the position of the visual cue.