The Drosophila visual system: a super-efficient encoder

Abstract

To survive and reproduce, every animal must run accurate and diverse visual processes efficiently. Our understanding of how animals see, however, is limited by a lack of insight into how evolution optimises resource- and space-constrained neural machinery. It was recently shown in Drosophila that the photoreceptor cells, each corresponding to an individual "pixel" of the scene, react photomechanically to light changes by generating an ultrafast counter-motion, a photoreceptor microsaccade. Each photoreceptor moves in a specific direction at its particular location inside the compound eye, transiently readjusting its own light input. These mirror-symmetrically opposing microsaccades cause small timing differences in the electrical signals of the eye and brain networks, rapidly and accurately informing the fly of the 3D structure of the world. Remarkably, Drosophila has been shown to resolve angles finer than 1°, five times finer than the laws of optics would predict for a static fly eye. The results presented in this thesis demonstrate that hyperacute visual information is transmitted from the photoreceptors into the visual pathway, and I report a deep learning approach for discovering how the biological neural network (BNN) of the Drosophila compound eye samples and represents hyperacute stimuli. Using in vivo two-photon calcium imaging in transgenic flies, I recorded the responses of L2 neurons, OFF neurons in the early visual pathway, in 17 flies while presenting fine-resolution visual patterns. I show that Drosophila's hyperacute visual information is transmitted from the photoreceptors to the medulla, the second neuropile of the visual system. Additionally, I found that L2 neurons show direction-specific acuity, and I show that this acuity is a consequence of the photoreceptors' microsaccades.
Next, I show that an artificial neural network (ANN) with precisely positioned, photomechanically moving photoreceptors, shaping and feeding visual information to a lifelike-wired neuropile, learns to reproduce natural response dynamics. Remarkably, this ANN predicts realistic stimulus-locked responses and synaptic connection weights at each eye location, mapping the eye's experimentally verified hyperacute orientation sensitivity. By systematically altering the sampling dynamics and connections, I further show that without the realistic orientation-tuned photoreceptor microsaccades and connectome, performance degrades to suboptimal levels. My results demonstrate the importance of precise microsaccades and connectivity for efficient visual encoding and highlight the effect of morphodynamic information sampling on accurate perception.