    The neural basis of Drosophila gravity-sensing and hearing

    The neural substrates that the fruit fly Drosophila uses to sense smell, taste and light share marked structural and functional similarities with ours, providing attractive models for dissecting sensory stimulus processing. Here we focus on two of the remaining and less understood prime sensory modalities: graviception and hearing. We show that the fly has implemented both sensory modalities in a single system, Johnston's organ, which houses specialized clusters of mechanosensory neurons, each of which monitors specific movements of the antenna. Gravity- and sound-sensitive neurons differ in their response characteristics, and only the latter express the candidate mechanotransducer channel NompC. The two neural subsets also differ in their central projections, feeding into neural pathways that are reminiscent of the vestibular and auditory pathways in our brain. By establishing the Drosophila counterparts of these sensory systems, our findings provide the basis for a systematic functional and molecular dissection of how different mechanosensory stimuli are detected and processed.

    A structured LED linear light as an economically priced and technical alternative to a laser line generator

    An interesting option for 3D object recognition is the triangulation principle. For this purpose a laser line generator is often used, but other structured linear light sources are also possible. For this, the linear light must have a defined width, brightness and sharpness. The linear light on the object, e.g. a groove, is deformed by the object's surface and is detected by digital image processing. Until now such linear light has been created with the help of a complex visual system using laser diodes. Because of some optical disadvantages of laser light in optical measurement applications, such as speckle, this paper examines the effect of using a structured LED light source instead of the laser generator. Consequently, the focus of this research lies on the necessary width, brightness, sharpness, length and depth of focus of the linear light generated by an LED light source for high-precision measuring. This research has been carried out by extensive computer-aided simulations. Several solutions are given and assessed in this paper.
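    The triangulation principle the abstract refers to can be sketched in a few lines: the projected light line shifts laterally when it crosses a height change on the surface, and the shift and the projection angle give the height. The function below is a minimal illustration under simplifying assumptions (perpendicular camera view, shift already converted from pixels to millimetres); the name and parameters are hypothetical, not taken from the paper.

    ```python
    import math

    def triangulate_height(line_shift_mm: float, projection_angle_deg: float) -> float:
        """Height change of a surface point from the lateral shift of the
        projected light line (simple line-triangulation geometry).

        Assumes the camera looks along the surface normal and the line is
        projected at `projection_angle_deg` from that normal, so a lateral
        line shift s corresponds to a height change h = s / tan(angle).
        """
        return line_shift_mm / math.tan(math.radians(projection_angle_deg))

    # At a 45-degree projection angle, a 1 mm line shift means 1 mm of height.
    h = triangulate_height(1.0, 45.0)
    ```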

    Method for a robust search line based estimation of intensity edge width in blurred gray scale images for quantification of motion- and out-of-focus-blur

    This paper presents a robust method for estimating the edge width at contours in gray-level intensity images in order to determine the degree of blur, i.e. motion and out-of-focus blur. Several methods exist for estimating intensity edge width, but many of them suffer from sensitivity to noise and therefore from large variances in the measurement results. The method is based on a histogram estimation of the bright and dark levels with respect to the noise, followed by a scaling. The scaled edge curve is then fitted with a Gaussian error function to obtain a functional description of the edge [1]. The fitted edge is subsequently used to calculate the edge width as described by the Thomas principle used for lens quality estimation [2]. The functionality of the algorithm is evaluated with synthetically noised and real captures at different optical magnifications, exposure times and velocities of relative motion between camera and measuring scene. Index Terms: image quality, edge quality estimation, image restoration basics, optical coordinate measuring.
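    The core idea of fitting a Gaussian error function to an edge profile and reading off a width can be sketched briefly. This is a simplified illustration, not the paper's algorithm: it skips the histogram-based level estimation and scaling step and uses an ordinary least-squares fit; the function names and the 10%-90% width convention are assumptions for the example.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.special import erf

    def erf_edge(x, dark, bright, center, sigma):
        # Gaussian-blurred ideal step edge: dark -> bright transition
        return dark + (bright - dark) * 0.5 * (1.0 + erf((x - center) / (sigma * np.sqrt(2.0))))

    def estimate_edge_width(x, profile):
        """Fit the error-function edge model to an intensity profile and
        return the 10%-90% edge width (= 2 * 1.2816 * sigma for a
        Gaussian-blurred edge)."""
        p0 = (profile.min(), profile.max(), x[len(x) // 2], 1.0)
        popt, _ = curve_fit(erf_edge, x, profile, p0=p0)
        sigma = abs(popt[3])
        return 2.0 * 1.2816 * sigma

    # Synthetic check: blurred edge with sigma = 2.0 plus mild noise.
    x = np.linspace(-20.0, 20.0, 200)
    rng = np.random.default_rng(0)
    profile = erf_edge(x, 30.0, 200.0, 0.5, 2.0) + rng.normal(0.0, 1.0, x.size)
    width = estimate_edge_width(x, profile)  # expected near 2 * 1.2816 * 2.0
    ```

    A wider estimated width on a real capture would indicate stronger motion or defocus blur; the paper's noise-aware level estimation is what makes such an estimate stable on noisy images.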

    An experimental study of motion blur in optical coordinate metrology for dynamic measurements of geometrical features

    Published in: Proceedings of the 14th Joint International IMEKO TC1 + TC7 + TC13 Symposium: "Intelligent quality measurements - theory, education and training", in conjunction with the 56th IWK, Ilmenau University of Technology, and the 11th SpectroNet Collaboration Forum, 31 August to 2 September 2011, JenTower Jena, Germany. Ilmenau: Univ.-Bibliothek, ilmedia, 2011. URN: urn:nbn:de:gbv:ilm1-2011imeko: