84 research outputs found

    How do field of view and resolution affect the information content of panoramic scenes for visual navigation? A computational investigation

    The visual systems of animals have to provide information to guide behaviour, and the informational requirements of an animal's behavioural repertoire are often reflected in its sensory system. For insects, this is often evident in the optical array of the compound eye. One behaviour that insects share with many animals is the use of learnt visual information for navigation. As ants are expert visual navigators, their vision may be optimised for navigation. Here we take a computational approach, asking how the details of the optical array influence the informational content of scenes used in simple view-matching strategies for orientation. We find that robust orientation is best achieved with low-resolution visual information and a large field of view, similar to the optical properties seen in many ant species. A lower resolution allows for a trade-off between specificity and generalisation for stored views. Additionally, our simulations show that orientation performance increases if different portions of the visual field are treated as discrete visual sensors, each giving an independent directional estimate. This suggests that ants might benefit from processing information from their two eyes independently.
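    The view-matching idea described above can be sketched in a few lines: a stored panoramic view is compared with the current view at every horizontal rotation, and the rotation minimising the pixel difference gives a directional estimate. This is a minimal illustration, not the paper's exact pipeline; the RMS metric, array shapes and function names are assumptions.

```python
import numpy as np

def rotational_image_difference(current, stored):
    """Compare a panoramic view against a stored view at every
    horizontal rotation; return the per-rotation RMS pixel difference.
    Views are 2-D arrays (elevation x azimuth); a column shift of the
    azimuth axis simulates the agent rotating on the spot."""
    n_azimuth = current.shape[1]
    diffs = np.empty(n_azimuth)
    for shift in range(n_azimuth):
        rotated = np.roll(current, shift, axis=1)
        diffs[shift] = np.sqrt(np.mean((rotated - stored) ** 2))
    return diffs

def best_heading(current, stored):
    """Heading (in azimuth columns) that best aligns the current view
    with the stored one: the minimum of the difference function."""
    return int(np.argmin(rotational_image_difference(current, stored)))
```

    Lowering the resolution in such a scheme widens the basin of attraction around the stored heading, which is one way to read the specificity/generalisation trade-off noted above.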

    A model of ant route navigation driven by scene familiarity

    In this paper we propose a model of visually guided route navigation in ants that captures the known properties of real behaviour whilst retaining mechanistic simplicity and thus biological plausibility. For an ant, the coupling of movement and viewing direction means that a familiar view specifies a familiar direction of movement. Since the views experienced along a habitual route will be more familiar, route navigation can be re-cast as a search for familiar views. This search can be performed with a simple scanning routine, a behaviour that ants have been observed to perform. We test this proposed route navigation strategy in simulation by learning a series of routes through visually cluttered environments consisting of objects that are only distinguishable as silhouettes against the sky. In the first instance we determine view familiarity by exhaustive comparison with the set of views experienced during training. In further experiments we train an artificial neural network to perform familiarity discrimination using the training views. Our results indicate not only that the approach is successful, but also that the learnt routes show many of the characteristics of the routes of desert ants. As such, we believe the model represents the only detailed and complete model of insect route guidance to date. What is more, the model provides a general demonstration that visually guided routes can be produced with parsimonious mechanisms that do not specify when or what to learn, nor separate routes into sequences of waypoints.
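    The exhaustive-comparison variant described above can be sketched as follows: familiarity is taken as the negated distance to the closest training view, and a scanning routine evaluates candidate headings and moves in the most familiar direction. The distance metric and function names are illustrative assumptions; the neural-network variant is not shown.

```python
import numpy as np

def familiarity(view, training_views):
    """Familiarity of a view = negated distance to the closest view
    experienced during training (exhaustive comparison)."""
    dists = [np.sqrt(np.mean((view - t) ** 2)) for t in training_views]
    return -min(dists)

def scan_for_heading(panorama, training_views):
    """Scanning routine: evaluate the view seen at every candidate
    heading (a column shift of the panoramic array) and return the
    most familiar direction of movement."""
    n_azimuth = panorama.shape[1]
    scores = [familiarity(np.roll(panorama, s, axis=1), training_views)
              for s in range(n_azimuth)]
    return int(np.argmax(scores))
```

    Note that nothing here specifies when or what to learn: the agent simply stores the views it experiences and, at each step, turns towards whichever heading looks most like training.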

    Electroluminescence from chirality-sorted (9,7)-semiconducting carbon nanotube devices

    We have measured the electroluminescence (EL) and photoluminescence of (9,7) semiconducting carbon nanotube devices and demonstrate that the electroluminescence wavelength is determined by the nanotube's chiral index (n,m). The devices were fabricated on Si3N4 membranes by dielectrophoretic assembly of tubes from a monochiral dispersion. Electrically driven (9,7) devices exhibit a single Lorentzian-shaped emission peak at 825 nm in the visible part of the spectrum. The emission could be assigned to the excitonic E22 interband transition by comparison of the electroluminescence spectra with corresponding photoluminescence excitation maps. We show a linear dependence of the EL peak width on the electrical current, and provide evidence for the inertness of Si3N4 surfaces with respect to the nanotubes' optical properties. Comment: 6 pages, 3 figures, submitted to Optics Express.

    PU.1 controls fibroblast polarization and tissue fibrosis

    Fibroblasts are polymorphic cells with pleiotropic roles in organ morphogenesis, tissue homeostasis and immune responses. In fibrotic diseases, fibroblasts synthesize abundant amounts of extracellular matrix, which induces scarring and organ failure. By contrast, a hallmark feature of fibroblasts in arthritis is degradation of the extracellular matrix because of the release of metalloproteinases and degrading enzymes, and subsequent tissue destruction. The mechanisms that drive these functionally opposing pro-fibrotic and pro-inflammatory phenotypes of fibroblasts remain unknown. Here we identify the transcription factor PU.1 as an essential regulator of the pro-fibrotic gene expression program. The interplay between transcriptional and post-transcriptional mechanisms that normally controls PU.1 expression is perturbed in various fibrotic diseases, resulting in the upregulation of PU.1, induction of fibrosis-associated gene sets and a phenotypic switch towards extracellular matrix-producing, pro-fibrotic fibroblasts. By contrast, pharmacological and genetic inactivation of PU.1 disrupts the fibrotic network and enables reprogramming of fibrotic fibroblasts into resting fibroblasts, leading to regression of fibrosis in several organs.

    Using deep autoencoders to investigate image matching in visual navigation

    This paper discusses the use of deep autoencoder networks to find a compressed representation of an image, which can be used for visual navigation. Images reconstructed from the compressed representation are tested to see if they retain enough information to be used as a visual compass (in which an image is matched with another to recall a bearing/movement direction), as this ability is at the heart of a visual route navigation algorithm. We show that both reconstructed images and compressed representations from different layers of the autoencoder can be used in this way, suggesting that a compact image code is sufficient for visual navigation and that deep networks hold promise for finding optimal visual encodings for this task.
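    As a hedged stand-in for the deep autoencoder, a PCA projection behaves like a linear autoencoder: images are encoded into a low-dimensional code and decoded back into image space, and either the code or the reconstruction could then feed a visual-compass comparison. The function names, shapes and code size below are illustrative assumptions, not the paper's architecture.

```python
import numpy as np

def fit_linear_autoencoder(images, code_size):
    """PCA via SVD as a linear stand-in for an autoencoder:
    'encode' projects an image onto the top principal components,
    'decode' maps a code back to image space."""
    X = images.reshape(len(images), -1).astype(float)
    mean = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    components = vt[:code_size]            # code_size x n_pixels

    def encode(img):
        return (img.ravel() - mean) @ components.T

    def decode(code):
        return (code @ components + mean).reshape(images.shape[1:])

    return encode, decode
```

    In this analogy, matching two codes directly corresponds to comparing compressed representations, while matching decoded images corresponds to the reconstructed-image compass test described above.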

    Using an insect mushroom body circuit to encode route memory in complex natural environments

    Ants, like many other animals, use visual memory to follow extended routes through complex environments, but it is unknown how their small brains implement this capability. The mushroom body neuropils have been identified as a crucial memory circuit in the insect brain, but their function has mostly been explored for simple olfactory association tasks. We show that a spiking neural model of this circuit, originally developed to describe fruit fly (Drosophila melanogaster) olfactory association, can also account for the ability of desert ants (Cataglyphis velox) to rapidly learn visual routes through complex natural environments. We further demonstrate that abstracting the key computational principles of this circuit, which include one-shot learning of sparse codes, enables the theoretical storage capacity of the ant mushroom body to be estimated at hundreds of independent images.
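    The key computational principles mentioned above can be sketched as a rate-based caricature: a fixed random projection onto many Kenyon cells (KCs), sparse winner-take-all activation, and one-shot depression of KC-to-output synapses so that familiar views evoke a weak "novelty" response. All sizes, the connection density and the winner-take-all shortcut are illustrative assumptions rather than the paper's fitted spiking model.

```python
import numpy as np

rng = np.random.default_rng(42)

N_INPUT, N_KC, SPARSITY = 100, 2000, 0.05   # illustrative sizes

# Fixed random visual-input -> Kenyon-cell connectivity (sparse, binary).
proj = (rng.random((N_KC, N_INPUT)) < 0.1).astype(float)

def kc_code(view):
    """Sparse KC code: only the most strongly driven ~5% of KCs fire
    (a winner-take-all stand-in for inhibitory gain control)."""
    drive = proj @ view
    k = int(SPARSITY * N_KC)
    code = np.zeros(N_KC, dtype=bool)
    code[np.argsort(drive)[-k:]] = True
    return code

weights = np.ones(N_KC)   # KC -> output neuron (MBON) synapses

def learn(view):
    """One-shot learning: depress the synapses of KCs active for this view."""
    weights[kc_code(view)] = 0.0

def novelty(view):
    """Output response: high for unfamiliar views, low for learned ones."""
    return weights @ kc_code(view)
```

    Because each view activates only a small, nearly unique subset of KCs, many views can be stored before their depressed synapses start to overlap, which is the intuition behind the capacity estimate of hundreds of images.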

    One Step Nucleic Acid Amplification (OSNA) - a new method for lymph node staging in colorectal carcinomas

    Background: Accurate histopathological evaluation of resected lymph nodes (LN) is essential for the reliable staging of colorectal carcinomas (CRC). With conventional sectioning and staining techniques, usually only parts of the LN are examined, which might lead to incorrect tumour staging. A molecular method called OSNA (One Step Nucleic Acid Amplification) may be suitable to determine the metastatic status of the complete LN and therefore improve staging.
    Methods: OSNA is based on a short homogenisation step and subsequent automated amplification of cytokeratin 19 (CK19) mRNA directly from the sample lysate, with results available in 30-40 minutes. In this study 184 frozen LN from 184 patients with CRC were investigated by both OSNA and histology (haematoxylin & eosin staining and CK19 immunohistochemistry), with half of the LN used for each method. Samples with discordant results were further analysed by RT-PCR for CK19 and carcinoembryonic antigen (CEA).
    Results: The concordance rate between histology and OSNA was 95.7%. Three LN were histology-positive/OSNA-negative and 5 LN histology-negative/OSNA-positive. RT-PCR supported the OSNA result in 3 discordant cases, suggesting that metastases were exclusively located in either the tissue analysed by OSNA or the tissue used for histology. If these samples were excluded, the concordance was 97.2%, the sensitivity 94.9%, and the specificity 97.9%. Three patients (3%) staged as UICC I or II by routine histopathology were upstaged as LN-positive by OSNA. One of these patients developed distant metastases (DMS) during follow-up.
    Conclusion: OSNA is a new and reliable method for molecular staging of lymphatic metastases in CRC and enables the examination of whole LN. It can be applied as a rapid diagnostic tool to estimate tumour involvement in LN during the staging of CRC.
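    The concordance figures above follow directly from the reported counts: 8 of 184 nodes were discordant (176/184, about 95.7%), and excluding the 3 RT-PCR-supported cases leaves 5 discordant of 181 (176/181, about 97.2%). A small sketch of the underlying arithmetic; the sensitivity/specificity helpers are the generic formulas, since the abstract does not report the full 2x2 counts.

```python
def concordance(n_total, n_discordant):
    """Fraction of samples on which the two methods agree."""
    return (n_total - n_discordant) / n_total

# Figures reported in the abstract: 184 nodes, 3 + 5 discordant,
# of which 3 were attributed to tissue allocation and excluded.
overall = concordance(184, 3 + 5)      # ~0.957
adjusted = concordance(184 - 3, 5)     # ~0.972

def sensitivity(tp, fn):
    """True-positive rate: detected metastatic nodes / all metastatic nodes."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """True-negative rate: correctly negative / all non-metastatic nodes."""
    return tn / (tn + fp)
```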

    Insect-Inspired Navigation Algorithm for an Aerial Agent Using Satellite Imagery

    Humans have long marveled at the ability of animals to navigate swiftly, accurately, and across long distances. Many mechanisms have been proposed for how animals acquire, store, and retrace learned routes, yet many of these hypotheses appear incongruent with behavioral observations and the animals' neural constraints. The "Navigation by Scene Familiarity Hypothesis", proposed originally for insect navigation, offers an elegantly simple solution for retracing previously experienced routes without the need for complex neural architectures and memory retrieval mechanisms. This hypothesis proposes that an animal can return to a target location by simply moving toward the most familiar scene at any given point. Proof-of-concept simulations have used computer-generated ant's-eye views of the world, but here we test the ability of scene familiarity algorithms to navigate training routes across satellite images extracted from Google Maps. We find that Google satellite images are so rich in visual information that familiarity algorithms can be used to retrace even tortuous routes with low-resolution sensors. We discuss the implications of these findings not only for animal navigation but also for the potential development of visual augmentation systems and robot guidance algorithms.