
    A Visionary Approach to Listening: Determining The Role Of Vision In Auditory Scene Analysis

    To recognize and understand the auditory environment, the listener must first separate sounds that arise from different sources and capture each event. This process is known as auditory scene analysis. The aim of this thesis is to investigate whether and how visual information can influence auditory scene analysis. The thesis consists of four chapters. First, I reviewed the literature to provide a clear framework for how visual information affects the analysis of complex acoustic environments. In Chapter II, I examined psychophysically whether temporal coherence between auditory and visual stimuli was sufficient to promote auditory stream segregation in a mixture. I found that listeners were better able to report brief deviants in an amplitude-modulated target stream when a visual stimulus changed in size in a temporally coherent manner with the target than when the visual stream was coherent with the non-target auditory stream. This work demonstrates that temporal coherence between auditory and visual features can influence the way people analyse an auditory scene. In Chapter III, the integration of auditory and visual features in auditory cortex was examined by recording neuronal responses in awake and anaesthetised ferret auditory cortex to modified versions of the stimuli used in Chapter II. I demonstrated that temporal coherence between auditory and visual stimuli enhances the neural representation of a sound and influences which sound a neuron represents in a sound mixture. Visual stimuli elicited reliable changes in the phase of the local field potential, which provides mechanistic insight into this finding. Together, these findings provide evidence that early cross-modal integration underlies the behavioural effects in Chapter II. Finally, in Chapter IV, I investigated whether training can influence the ability of listeners to use visual cues for auditory stream analysis, and showed that this ability improved when listeners were trained to detect auditory-visual temporal coherence.
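
    As a concrete illustration of the Chapter II stimulus design, the sketch below builds a two-stream auditory mixture together with a visual signal whose size trace is temporally coherent with either the target or the non-target stream. It is a minimal toy example: the carrier frequencies, modulation rate, duration and the sinusoidal envelope are illustrative assumptions, not the parameters used in the thesis.

        # Minimal sketch of a temporally coherent audio-visual stimulus pair.
        # All parameters (carrier frequencies, modulation rate, duration) are
        # illustrative placeholders, not the values used in the thesis.
        import numpy as np

        def am_envelope(duration_s, rate_hz, fs, seed):
            """Slowly varying amplitude envelope built from a random-phase sinusoid."""
            rng = np.random.default_rng(seed)
            t = np.arange(int(duration_s * fs)) / fs
            phase = rng.uniform(0, 2 * np.pi)
            return 0.5 * (1 + np.sin(2 * np.pi * rate_hz * t + phase))  # range 0..1

        fs_audio = 44100          # audio sample rate (Hz)
        fs_video = 60             # visual frame rate (Hz)
        dur = 2.0                 # stimulus duration (s)

        # Target auditory stream: amplitude-modulated pure tone.
        env_target = am_envelope(dur, rate_hz=1.5, fs=fs_audio, seed=1)
        t_audio = np.arange(int(dur * fs_audio)) / fs_audio
        target_stream = env_target * np.sin(2 * np.pi * 440.0 * t_audio)

        # Non-target stream: independent envelope, different carrier frequency.
        env_masker = am_envelope(dur, rate_hz=1.5, fs=fs_audio, seed=2)
        masker_stream = env_masker * np.sin(2 * np.pi * 1000.0 * t_audio)

        mixture = target_stream + masker_stream

        # Visual stimulus: a disc whose radius follows the target envelope
        # (coherent condition) or the non-target envelope (incoherent condition),
        # resampled to the video frame rate.
        frame_idx = (np.arange(int(dur * fs_video)) * fs_audio / fs_video).astype(int)
        radius_coherent = 0.5 + 0.5 * env_target[frame_idx]     # arbitrary units
        radius_incoherent = 0.5 + 0.5 * env_masker[frame_idx]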

    Universal adaptive optics for microscopy through embedded neural network control

    The resolution and contrast of microscope imaging are often degraded by aberrations introduced by imperfect optical systems and by inhomogeneous refractive structures in specimens. Adaptive optics (AO) compensates for these aberrations and restores diffraction-limited performance. A wide range of AO solutions have been introduced, often tailored to a specific microscope type or application. Until now, a universal AO solution – one that can be readily transferred between microscope modalities – has not been deployed. We propose versatile and fast aberration correction using a physics-based, machine-learning-assisted wavefront-sensorless AO control (MLAO) method. Unlike previous ML methods, we used a specially constructed neural network (NN) architecture, designed using physical understanding of general microscope image formation, that was embedded in the control loop of different microscope systems. As a result, not only is the NN orders of magnitude simpler than previous NN methods, but the concept is translatable across microscope modalities. We demonstrated the method on a two-photon, a three-photon and a widefield three-dimensional (3D) structured illumination microscope. Results showed that the method outperformed commonly used modal-based sensorless AO methods. We also showed that our ML-based method was robust in a range of challenging imaging conditions, such as 3D sample structures, specimen motion, low signal-to-noise ratio and activity-induced fluorescence fluctuations. Moreover, as the bespoke architecture encapsulates physical understanding of the imaging process, the internal NN configuration is no longer a “black box” but provides physical insight into its internal workings, which could inform future designs.
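
    To make the idea of wavefront-sensorless control in a loop concrete, the sketch below runs a toy correction loop in which per-mode corrections are estimated from image-quality measurements acquired under small bias aberrations; a simple analytic estimator stands in for the trained network. The Gaussian image-quality model, the number of modes, the bias amplitude and all function names are assumptions for illustration, not the paper's MLAO implementation.

        # Toy wavefront-sensorless AO control loop. An analytic estimator stands
        # in for the trained neural network; the Gaussian image-quality model and
        # all parameters below are illustrative assumptions.
        import numpy as np

        N_MODES = 5      # number of corrected aberration modes (assumed)
        BIAS = 0.5       # bias amplitude applied per mode, in rad (assumed)

        def image_metric(residual):
            """Toy image-quality metric: maximal when the residual aberration is zero."""
            return np.exp(-np.sum(residual ** 2))

        def acquire_biased_metrics(aberration, correction):
            """One metric per +/- bias on each mode (2 * N_MODES 'images')."""
            metrics = []
            for k in range(N_MODES):
                for sign in (+1.0, -1.0):
                    bias = np.zeros(N_MODES)
                    bias[k] = sign * BIAS
                    metrics.append(image_metric(aberration - correction + bias))
            return np.asarray(metrics)

        def estimate_update(metrics):
            """Estimate the residual on each mode from the paired biased measurements.
            For this Gaussian metric the estimate is analytic; a trained network
            would play this role for real image data."""
            update = np.zeros(N_MODES)
            for k in range(N_MODES):
                m_plus, m_minus = metrics[2 * k], metrics[2 * k + 1]
                update[k] = -(np.log(m_plus) - np.log(m_minus)) / (4.0 * BIAS)
            return update

        rng = np.random.default_rng(0)
        true_aberration = rng.normal(0.0, 1.0, N_MODES)   # unknown specimen aberration
        correction = np.zeros(N_MODES)

        for it in range(3):                               # closed-loop iterations
            correction += estimate_update(acquire_biased_metrics(true_aberration, correction))
            print(f"iteration {it}: metric = {image_metric(true_aberration - correction):.4f}")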

    GCaMP8 transgenic mice learn to make visual decisions

    Transgenic mice engineered to express calcium indicators such as GCaMP have revolutionized the exploration of neuronal circuit function. The latest development, GCaMP8 transgenic mice, offers enhanced temporal kinetics and sensitivity of neural signals, opening new avenues for studying neuronal dynamics within behaviorally relevant time frames. However, in initial attempts it has been challenging to train these mice in visual decision-making tasks. Here we show that GCaMP8 transgenic mice, specifically TetO-jGCaMP8s x CaMK2a-tTA mice, learn to perform head-fixed visual decision tasks with a rate and accuracy comparable to wild-type mice. These proof-of-principle results enhance the utility of these transgenic animals in neuroscientific studies of learning and decision making.

    Acute Inactivation of Primary Auditory Cortex Causes a Sound Localisation Deficit in Ferrets

    The objective of this study was to demonstrate the efficacy of acute inactivation of brain areas by cooling in the behaving ferret and to demonstrate that cooling auditory cortex produced a localisation deficit that was specific to auditory stimuli. The effect of cooling on neural activity was measured in anesthetized ferret cortex. The behavioural effect of cooling was determined in a benchmark sound localisation task in which inactivation of primary auditory cortex (A1) is known to impair performance. Cooling strongly suppressed the spontaneous and stimulus-evoked firing rates of cortical neurons when the cooling loop was held at temperatures below 10°C, and this suppression was reversed when the cortical temperature recovered. Cooling of ferret auditory cortex during behavioural testing impaired sound localisation performance, with unilateral cooling producing selective deficits in the hemifield contralateral to cooling, and bilateral cooling producing deficits on both sides of space. The deficit in sound localisation induced by inactivation of A1 was not caused by motivational or locomotor changes since inactivation of A1 did not affect localisation of visual stimuli in the same context.
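
    The GLM analyses referenced in the figure and table entries below relate single-trial localisation outcomes to factors such as cooling condition, hemifield and stimulus duration. The sketch below fits a logistic GLM of that general kind to simulated trial data; the column names, factor coding and simulated effect sizes are assumptions for illustration, not the study's actual data or analysis.

        # Hedged sketch of a logistic GLM relating single-trial localisation accuracy
        # to cooling condition, hemifield and stimulus duration. The trial table is
        # simulated and all names/effect sizes are illustrative assumptions.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n_trials = 400

        # Simulated trial table: one row per sound-localisation trial.
        df = pd.DataFrame({
            "cooled": rng.integers(0, 2, n_trials),                     # 0 = warm, 1 = cooled
            "hemifield": rng.choice(["ipsi", "contra"], n_trials),      # relative to the cooled side
            "duration_ms": rng.choice([40, 100, 200, 1000], n_trials),  # stimulus duration
        })

        # Toy behaviour: accuracy drops for contralateral targets when cortex is cooled.
        p_correct = 0.85 - 0.30 * df["cooled"] * (df["hemifield"] == "contra")
        df["correct"] = (rng.random(n_trials) < p_correct).astype(int)

        # Binomial GLM (logistic regression) with a cooling-by-hemifield interaction.
        model = smf.logit("correct ~ C(cooled) * C(hemifield) + C(duration_ms)", data=df)
        print(model.fit(disp=0).summary())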

    Factors affecting visual localisation performance and unsigned error magnitude in GLM analysis for the midline location during unilateral cooling.


    Factors affecting midline sound localisation performance in the GLM analysis for the midline location during unilateral cooling.


    The effect of bilateral cooling on sound localisation performance.

    [A] Bars show the mean performance across all locations tested for each stimulus duration for warm and bilaterally cooled conditions. [B] Bars show the mean unsigned error magnitude for each stimulus duration for warm and bilaterally cooled conditions. Symbols: F1204 = triangles, F1311 = squares.

    Histological verification of cooling loop locations.

    [A] Positions of the cooling loops in each ferret (thick black lines). The auditory cortex is outlined in solid lines (sss: supra-sylvian sulcus, pss: pseudo-sylvian sulcus), with the location of the Medial Ectosylvian Gyrus (MEG, where A1 is located) indicated by dashed black lines, as determined by cytoarchitectonic boundaries derived from post-mortem histology. [B] Typical example of a neurofilament-stained brain section from underneath one of the loops (from F1204, right hemisphere). The cooling loop location is indicated by arrows. The red dashed box shows the region magnified in [C]. [C] Magnified image from [B], showing that the cortex positioned beneath the loop is indistinguishable from cortex elsewhere. Orientation axes: D = dorsal, R = rostral, M = medial.