
    Olfactory modulation of flight in Drosophila is sensitive, selective and rapid

    Get PDF
    Freely flying Drosophila melanogaster respond to odors by increasing their flight speed and turning upwind. Both of these flight behaviors can be recapitulated in a tethered fly, which permits the odor stimulus to be precisely controlled. In this study, we investigated the relationship between these behaviors and odor-evoked activity in primary sensory neurons. First, we verified that these behaviors are abolished by mutations that silence olfactory receptor neurons (ORNs). We also found that antennal mechanosensors in Johnston's organ are required to guide upwind turns. Flight responses to an odor depend on the identity of the ORNs that are active, meaning that these behaviors involve odor discrimination and not just odor detection. Flight modulation can begin rapidly (within about 85 ms) after the onset of olfactory transduction. Moreover, just a handful of spikes in a single ORN type is sufficient to trigger these behaviors. Finally, we found that the upwind turn is triggered independently from the increase in wingbeat frequency, implying that ORN signals diverge to activate two independent and parallel motor commands. Together, our results show that odor-evoked flight modulations are rapid and sensitive responses to specific patterns of sensory neuron activity. This makes these behaviors a useful paradigm for studying the relationship between sensory neuron activity and behavioral decision-making in a simple and genetically tractable organism.

    The Laminar Organization of Visual Cortex: A Unified View of Development, Learning, and Grouping

    Full text link
    Why is all sensory and cognitive neocortex organized into layered circuits? How do these layers organize circuits that form functional columns in cortical maps? How do bottom-up, top-down, and horizontal interactions within the cortical layers generate adaptive behaviors? This chapter summarizes an evolving neural model which suggests how these interactions help the visual cortex to realize: (1) the binding process whereby cortex groups distributed data into coherent object representations; (2) the attentional process whereby cortex selectively processes important events; and (3) the developmental and learning processes whereby cortex shapes its circuits to match environmental constraints. It is suggested that the mechanisms which achieve property (3) imply properties (1) and (2). New computational ideas about feedback systems suggest how neocortex develops and learns in a stable way, and why top-down attention requires converging bottom-up inputs to fully activate cortical cells, whereas perceptual groupings do not.
    Defense Advanced Research Projects Agency and the Office of Naval Research (N00014-95-1-0409); National Science Foundation (IRI-97-20333); Office of Naval Research (N00014-95-1-0657)

    How does the Cerebral Cortex Work? Learning, Attention, and Grouping by the Laminar Circuits of Visual Cortex

    Full text link
    The organization of neocortex into layers is one of its most salient anatomical features. These layers include circuits that form functional columns in cortical maps. A major unsolved problem concerns how bottom-up, top-down, and horizontal interactions are organized within cortical layers to generate adaptive behaviors. This article models how these interactions help visual cortex to realize: (1) the binding process whereby cortex groups distributed data into coherent object representations; (2) the attentional process whereby cortex selectively processes important events; and (3) the developmental and learning processes whereby cortex shapes its circuits to match environmental constraints. New computational ideas about feedback systems suggest how neocortex develops and learns in a stable way, and why top-down attention requires converging bottom-up inputs to fully activate cortical cells, whereas perceptual groupings do not.
    Defense Advanced Research Projects Agency; National Science Foundation (IRI-97-20333); Office of Naval Research (N00014-95-1-0409, N00014-95-1-0657)

    An object's smell in the multisensory brain: how our senses interact during olfactory object processing

    Get PDF
    Object perception is a remarkable and fundamental cognitive ability that allows us to interpret and interact with the world we live in. In our everyday life, we constantly perceive objects – mostly without being aware of it, and through several senses at the same time. Although it might seem that object perception is accomplished without any effort, the underlying neural mechanisms are anything but simple. How we perceive objects in the world surrounding us is the result of a complex interplay of our senses. The aim of the present thesis was to explore, by means of functional magnetic resonance imaging, how our senses interact when we perceive an object’s smell in a multisensory setting where the amount of sensory stimulation increases, as well as in a unisensory setting where we perceive an object’s smell in isolation. In Study I, we sought to determine whether and how multisensory object information influences the processing of olfactory object information in the posterior piriform cortex (PPC), a region linked to olfactory object encoding. In Study II, we then expanded our search for integration effects during multisensory object perception to the whole brain, because previous research has demonstrated that multisensory integration is accomplished by a network of early sensory cortices and higher-order multisensory integration sites. We specifically aimed to determine whether there exist cortical regions that process multisensory object information regardless of which senses, and how many senses, the information arises from. In Study III, we then sought to unveil how our senses interact during olfactory object perception in a unisensory setting. Previous studies have shown that even in such unisensory settings, olfactory object processing is not exclusively accomplished by regions within the olfactory system but instead engages a more widespread network of brain regions, such as regions belonging to the visual system.
We aimed to determine what this visual engagement represents – that is, whether areas of the brain that are principally concerned with processing visual object information also hold neural representations of olfactory object information, and if so, whether these representations are similar for smells and pictures of the same objects. In Study I, we demonstrated that assisting inputs from our senses of vision and hearing increase the processing of olfactory object information in the PPC, and that the more assisting input we receive, the more the processing is enhanced. As this enhancement occurred only for matching inputs, it likely reflects integration of multisensory object information. Study II provided evidence for convergence of multisensory object information in the form of a non-linear response enhancement in the inferior parietal cortex: activation increased for bimodal compared to unimodal stimulation, and increased even further for trimodal compared to bimodal stimulation. As this multisensory response enhancement occurred independently of the congruency of the incoming signals, it likely reflects a process of relating the incoming sensory information streams to each other. Finally, Study III revealed that regions of the ventral visual object stream are engaged in recognition of an object’s smell and represent olfactory object information in the form of distinct neural activation patterns. While the visual system encodes information about both visual and olfactory objects, it appears to keep information from the two sensory modalities separate by representing smells and pictures of objects differently. Taken together, the studies included in this thesis reveal that olfactory object perception is a multisensory process that engages a widespread network of early sensory as well as higher-order cortical regions, even when we do not find ourselves in a multisensory setting but exclusively perceive an object’s smell.
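    The response pattern reported for Study II can be stated as a simple monotonicity check: mean activation rises from unimodal to bimodal to trimodal stimulation. The sketch below is purely illustrative and not the thesis's analysis code; the function name and all numbers are hypothetical example data.

```python
def shows_multisensory_enhancement(unimodal, bimodal, trimodal):
    """Return True if mean activation rises monotonically with the
    number of stimulated senses, the pattern described for Study II."""
    mean = lambda xs: sum(xs) / len(xs)
    return mean(unimodal) < mean(bimodal) < mean(trimodal)

# Hypothetical per-condition activation estimates (arbitrary units)
unimodal = [0.8, 1.0, 0.9]
bimodal = [1.4, 1.5, 1.3]
trimodal = [1.9, 2.1, 2.0]

print(shows_multisensory_enhancement(unimodal, bimodal, trimodal))  # True
```

    Note that this ordering alone does not distinguish additive from superadditive convergence; the study's claim of a non-linear enhancement rests on the statistical comparison of the increments, which is beyond this sketch.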

    Effect of Olfaction on the Perception of Movie Clips

    Get PDF
    Odours are important to many species, but their effect on human perception in the context of concurrent auditory and visual stimulation has received little investigation. Here we examined how the experience of viewing audio-visual movie clips changes when accompanied by congruent or incongruent odours. Using an olfactometer to control odourant delivery, thirty-five undergraduate students from Western University were randomly presented with 36 different odour-video pairs twice. Following each presentation, participants completed three Likert scales to assess multisensory interaction in terms of engagement, pleasantness, and emotional arousal. Comparison of congruent and incongruent odours to the no-odour control condition revealed that incongruent odours had a greater effect than congruent odours on participant ratings, and that this effect acted to negatively influence experience, reducing engagement, pleasantness, and emotional arousal. There was little difference between congruent odours and no odour on ratings of engagement and emotional arousal; however, even congruent odours reduced pleasantness ratings, suggesting all odours used were, to an extent, unpleasant. An interaction suggested that certain movies were more strongly modulated by odour than others. We interpret our results as evidence of crossmodal competition, in which the presence of an odour leads to suppression of the auditory and visual modalities. This was confirmed using functional magnetic resonance imaging in a single participant. Future research should continue to investigate the surprising role odour plays in multisensory interaction.
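    The comparison at the heart of this design is each odour condition's mean rating relative to the no-odour control. The snippet below is a minimal illustrative sketch, not the study's analysis code; the dictionary keys and all Likert values are hypothetical example data chosen to mimic the reported pattern.

```python
def mean(xs):
    return sum(xs) / len(xs)

# Hypothetical engagement ratings on a Likert scale, per condition
ratings = {
    "congruent": [5, 6, 5, 6],
    "incongruent": [3, 4, 3, 3],
    "no_odour": [5, 5, 6, 5],
}

# The reported pattern: incongruent odours lower ratings relative to
# the no-odour control, while congruent odours differ little from it.
effects = {cond: mean(vals) - mean(ratings["no_odour"])
           for cond, vals in ratings.items() if cond != "no_odour"}
print(effects)  # e.g. {'congruent': 0.25, 'incongruent': -2.0}
```

    In the actual study such condition differences would be tested across participants (e.g. with a repeated-measures ANOVA), not computed on pooled raw scores as in this toy example.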

    The Laminar Architecture of Visual Cortex and Image Processing Technology

    Full text link
    The mammalian neocortex is organized into layers which include circuits that form functional columns in cortical maps. A major unsolved problem concerns how bottom-up, top-down, and horizontal interactions are organized within cortical layers to generate adaptive behaviors. This article summarizes a model, called the LAMINART model, of how these interactions help visual cortex to realize: (1) the binding process whereby cortex groups distributed data into coherent object representations; (2) the attentional process whereby cortex selectively processes important events; and (3) the developmental and learning processes whereby cortex stably grows and tunes its circuits to match environmental constraints. Such Laminar Computing completes perceptual groupings that realize the property of Analog Coherence, whereby winning groupings bind together their inducing features without losing their ability to represent analog values of these features. Laminar Computing also efficiently unifies the computational requirements of preattentive filtering and grouping with those of attentional selection. It hereby shows how Adaptive Resonance Theory (ART) principles may be realized within the laminar circuits of neocortex. Applications include boundary segmentation and surface filling-in algorithms for processing Synthetic Aperture Radar images.
    Defense Advanced Research Projects Agency and the Office of Naval Research (N00014-95-1-0409); Office of Naval Research (N00014-95-1-0657)

    Towards a Unified Theory of Neocortex: Laminar Cortical Circuits for Vision and Cognition

    Full text link
    A key goal of computational neuroscience is to link brain mechanisms to behavioral functions. The present article describes recent progress towards explaining how laminar neocortical circuits give rise to biological intelligence. These circuits embody two new and revolutionary computational paradigms: Complementary Computing and Laminar Computing. Circuit properties include a novel synthesis of feedforward and feedback processing, of digital and analog processing, and of pre-attentive and attentive processing. This synthesis clarifies the appeal of Bayesian approaches but has a far greater predictive range that naturally extends to self-organizing processes. Examples from vision and cognition are summarized. A LAMINART architecture unifies properties of visual development, learning, perceptual grouping, attention, and 3D vision. A key modeling theme is that the mechanisms which enable development and learning to occur in a stable way imply properties of adult behavior. It is noted how higher-order attentional constraints can influence multiple cortical regions, and how spatial and object attention work together to learn view-invariant object categories. In particular, a form-fitting spatial attentional shroud can allow an emerging view-invariant object category to remain active while multiple view categories are associated with it during sequences of saccadic eye movements. Finally, the chapter summarizes recent work on the LIST PARSE model of cognitive information processing by the laminar circuits of prefrontal cortex. LIST PARSE models the short-term storage of event sequences in working memory, their unitization through learning into sequence, or list, chunks, and their read-out in planned sequential performance that is under volitional control. LIST PARSE provides a laminar embodiment of Item and Order working memories, also called Competitive Queuing models, that have been supported by both psychophysical and neurobiological data. 
    These examples show how variations of a common laminar cortical design can embody properties of visual and cognitive intelligence that seem, at least on the surface, to be mechanistically unrelated.
    National Science Foundation (SBE-0354378); Office of Naval Research (N00014-01-1-0624)

    Cortical Dynamics of Contextually-Cued Attentive Visual Learning and Search: Spatial and Object Evidence Accumulation

    Full text link
    How do humans use predictive contextual information to facilitate visual search? How are consistently paired scenic objects and positions learned and used to more efficiently guide search in familiar scenes? For example, a certain combination of objects can define a context for a kitchen and trigger a more efficient search for a typical object, such as a sink, in that context. A neural model, ARTSCENE Search, is developed to illustrate the neural mechanisms of such memory-based contextual learning and guidance, and to explain challenging behavioral data on positive/negative, spatial/object, and local/distant cueing effects during visual search. The model proposes how global scene layout at a first glance rapidly forms a hypothesis about the target location. This hypothesis is then incrementally refined by enhancing target-like objects in space as a scene is scanned with saccadic eye movements. The model clarifies the functional roles of neuroanatomical, neurophysiological, and neuroimaging data in visual search for a desired goal object. In particular, the model simulates the interactive dynamics of spatial and object contextual cueing in the cortical What and Where streams, starting from early visual areas through the medial temporal lobe to prefrontal cortex. After learning, model dorsolateral prefrontal cortical cells (area 46) prime possible target locations in posterior parietal cortex based on goal-modulated percepts of spatial scene gist represented in parahippocampal cortex, whereas model ventral prefrontal cortical cells (area 47/12) prime possible target object representations in inferior temporal cortex based on the history of viewed objects represented in perirhinal cortex.
    The model hereby predicts how the cortical What and Where streams cooperate during scene perception, learning, and memory to accumulate evidence over time to drive efficient visual search of familiar scenes.
    CELEST, an NSF Science of Learning Center (SBE-0354378); SyNAPSE program of the Defense Advanced Research Projects Agency (HR0011-09-3-0001, HR0011-09-C-0011)

    Olfaction, Emotion & the Amygdala: arousal-dependent modulation of long-term autobiographical memory and its association with olfaction

    Get PDF
    The sense of smell is set apart from other sensory modalities: odours possess the capacity to immediately trigger strong emotional memories. Moreover, odorous stimuli provide a higher degree of memory retention than other sensory stimuli. Odour perception, even in its most elemental form – olfaction – already involves limbic structures, an early involvement that is not paralleled in other sensory modalities. Bearing in mind this considerable connectivity with limbic structures, and the fact that activation of the amygdala is capable of instantaneously evoking emotions and facilitating the encoding of memories, it is unsurprising that the sense of smell has its characteristic nature. The aim of this review is to analyse current understanding of higher olfactory information processing as it relates to the ability of odours to spontaneously cue highly vivid, affectively toned, and often very old autobiographical memories (episodes known anecdotally as Proust phenomena). Particular emphasis is placed on the diversity of functions attributed to the amygdala. Its role in modulating the encoding and retrieval of long-term memory is investigated with reference to lesion, electrophysiological, immediate early gene, and functional imaging studies in both rodents and humans. Additionally, the influence of hormonal modulation and the adrenergic system on emotional memory storage is outlined. I finish by proposing a schematic of some of the critical neural pathways that underlie the odour-associated encoding and retrieval of emotionally toned autobiographical memories.