42 research outputs found

    Cortical depth dependent functional responses in humans at 7T: improved specificity with 3D GRASE

    Ultra-high fields (7T and above) allow functional imaging with high contrast-to-noise ratios and improved spatial resolution. This, along with improved hardware and imaging techniques, allows the investigation of columnar and laminar functional responses. Using gradient-echo (GE) (T2*-weighted) sequences, layer-specific responses have been recorded from human (and animal) primary visual areas. However, their increased sensitivity to large surface veins potentially clouds the detection and interpretation of layer-specific responses. Conversely, spin-echo (SE) (T2-weighted) sequences are less sensitive to large veins and have been used to map cortical columns in humans. T2-weighted 3D GRASE with inner volume selection provides high isotropic resolution over extended volumes, overcoming some of the technical limitations of conventional 2D SE-EPI and thereby making layer-specific investigations feasible. Further, the demonstration of columnar-level specificity with 3D GRASE, despite contributions from both stimulated echoes and conventional T2 contrast, has made it an attractive alternative to 2D SE-EPI. Here, we assess the spatial specificity of cortical-depth-dependent 3D GRASE functional responses in human V1 and hMT by comparing them to GE responses. In doing so, we demonstrate that 3D GRASE is less sensitive to contributions from large veins in superficial layers, while showing increased specificity (functional tuning) throughout the cortex compared to GE.

    Two-Photon Imaging of Calcium in Virally Transfected Striate Cortical Neurons of Behaving Monkey

    Two-photon scanning microscopy has advanced our understanding of neural signaling in non-mammalian species and mammals. Various developments are needed to perform two-photon scanning microscopy over prolonged periods in non-human primates performing a behavioral task. In the striate cortex of two macaque monkeys, cortical neurons were transfected with a genetically encoded fluorescent calcium sensor, memTNXL, using AAV1 as a viral vector. By constructing an extremely rigid and stable apparatus holding both the two-photon scanning microscope and the monkey's head, single neurons were imaged at high magnification with minimal motion artifacts for up to ten months. Structural images of single neurons were obtained at high magnification. Changes in calcium during visual stimulation were measured as the monkeys performed a fixation task. Overall, functional responses and orientation tuning curves were obtained in 18.8% of the 234 labeled and imaged neurons. This demonstrates that two-photon scanning microscopy can be successfully performed in behaving primates.

    Mapping the Organization of Axis of Motion Selective Features in Human Area MT Using High-Field fMRI

    Functional magnetic resonance imaging (fMRI) at high magnetic fields has made it possible to investigate the columnar organization of the human brain in vivo with high accuracy and sensitivity. Until now, these results have been limited to the organizational principles of early visual cortex (V1). While the middle temporal area (MT) was the first extra-striate visual area shown to exhibit a columnar organization in monkeys, evidence of MT's columnar response properties and topographic layout in humans has remained elusive. Research using various approaches suggests response properties similar to those in monkeys, but has failed to provide direct evidence for direction or axis-of-motion selectivity in human area MT. By combining state-of-the-art pulse sequence design, high spatial resolution in all three dimensions (0.8 mm isotropic), optimized coil design, ultra-high-field magnets (7 Tesla), and novel high-resolution cortical grid sampling analysis tools, we provide the first direct evidence for large-scale axis-of-motion selective feature organization in human area MT, closely matching predictions from topographic columnar-level simulations.

    A framework for the first‑person internal sensation of visual perception in mammals and a comparable circuitry for olfactory perception in Drosophila

    Perception is a first-person internal sensation induced within the nervous system at the time of arrival of sensory stimuli from objects in the environment. Lack of access to its first-person properties has limited perception to being viewed as an emergent property, and it is currently studied using third-person observed findings from various levels. One feasible approach to understanding its mechanism is to build a hypothesis for the specific conditions and required circuit features of the nodal points where the mechanistic operation of perception takes place for one type of sensation in one species, and then to verify the presence of comparable circuit properties for perceiving a different sensation in a different species. The present work explains visual perception in the mammalian nervous system from a first-person frame of reference and provides explanations for the homogeneity of perception of visual stimuli above the flicker fusion frequency, the perception of objects at locations different from their actual positions, smooth pursuit and saccadic eye movements, the perception of object borders, and the perception of pressure phosphenes. Using results from temporal resolution studies and the known details of visual cortical circuitry, explanations are provided for (a) the perception of rapidly changing visual stimuli, (b) how objects are perceived in the correct orientation even though, from the third-person view, activity from the visual stimulus reaches the cortices in an inverted manner, and (c) the functional significance of the well-conserved columnar organization of the visual cortex.
A comparable circuitry detected in a different nervous system in a remote species, the olfactory circuitry of the fruit fly Drosophila melanogaster, provides an opportunity to explore circuit functions using genetic manipulations, which, along with high-resolution microscopic techniques and lipid membrane interaction studies, will be able to verify the structure-function details of the presented mechanism of perception.

    Time-varying wing-twist improves aerodynamic efficiency of forward flight in butterflies

    Insect wings can undergo significant chordwise (camber) as well as spanwise (twist) deformation during flapping flight, but the effect of these deformations is not well understood. The shape and size of butterfly wings lead to particularly large wing deformations, making them an ideal test case for investigating these effects. Here we use computational models derived from experiments on free-flying butterflies to understand the effect of time-varying twist and camber on the aerodynamic performance of these insects. High-speed videogrammetry is used to capture the wing kinematics, including deformation, of a Painted Lady butterfly (Vanessa cardui) in untethered, forward flight. These experimental results are then analyzed computationally using a high-fidelity, three-dimensional, unsteady Navier-Stokes flow solver. For comparison, a set of non-deforming, flat-plate wing (FPW) models of the wing motion are synthesized and subjected to the same analysis, along with a wing model that matches the time-varying wing-twist observed for the butterfly but has no camber deformation. The simulations show that the observed butterfly wing (OBW) outperforms all the flat-plate wings in terms of usable force production and the ratio of lift to power by at least 29% and 46%, respectively. This increase in the efficiency of lift production is at least three-fold greater than that reported for other insects. Interestingly, we also find that the twist-only wing (TOW) model recovers much of the performance of the OBW, demonstrating that wing-twist, not camber, is key to forward flight in these insects. The implications of this for the design of flapping-wing micro-aerial vehicles are discussed.
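    The performance margins quoted above are relative improvements of the observed wing over the flat-plate baseline. A minimal sketch of how such percentages are computed, using made-up placeholder lift and power values (the numbers below are NOT data from the study; only the 29% and 46% margins come from the abstract):

    ```python
    def percent_improvement(observed: float, baseline: float) -> float:
        """Relative improvement of `observed` over `baseline`, in percent."""
        return 100.0 * (observed - baseline) / baseline

    # Hypothetical per-wing-model metrics in arbitrary units, chosen only
    # to reproduce the margins reported in the abstract.
    usable_force = {"OBW": 1.29, "FPW": 1.00}    # observed butterfly wing vs flat plate
    lift_per_power = {"OBW": 1.46, "FPW": 1.00}  # ratio of lift to power

    force_gain = percent_improvement(usable_force["OBW"], usable_force["FPW"])
    efficiency_gain = percent_improvement(lift_per_power["OBW"], lift_per_power["FPW"])

    print(f"usable force gain: {force_gain:.0f}%")        # 29%
    print(f"lift-to-power gain: {efficiency_gain:.0f}%")  # 46%
    ```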

    Audiotactile interactions in temporal perception
