
    Neuronal Modifications During Visuomotor Association Learning Assessed by Electric Brain Tomography

    Summary: In everyday life, specific situations call for specific reactions. Through repetitive practice, such stimulus-response associations can be learned and performed automatically. The aim of the present EEG study was to characterize learning-dependent modifications in neuronal pathways during short-term practice of visuomotor associations. Participants performed a visuomotor association task comprising four visual stimuli that had to be associated with four keys, learned by trial and error. We assumed that distinct cognitive processes, e.g., visual perception and decision making, might be dominant during early learning, whereas advanced learning might be indicated by increased neuronal activation in integration- and memory-related regions. To assess learning progress, visual- and movement-related brain potentials were measured and compared between three learning stages (early, intermediate, and late). The results revealed significant differences between the learning stages during distinct time intervals. Related to visual stimulus presentation, low-resolution electromagnetic brain tomography (LORETA) revealed strong neuronal activation in a parieto-prefrontal network in time intervals between 100 and 400 ms post event during early learning. In relation to the motor response, neuronal activation was significantly increased during intermediate compared to early learning. Prior to the motor response (120-360 ms pre event), neuronal activation was detected in the cingulate motor area and the right dorsal premotor cortex. Subsequent to the motor response (68-430 ms post event), there was an increase in neuronal activation in visuomotor- and memory-related areas, including parietal cortex, SMA, premotor, dorsolateral prefrontal, and parahippocampal cortex. The present study has shown specific time elements of a visuomotor-memory-related network, which might support learning progress during visuomotor association learning.
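    Purely as an illustration of the kind of stage-wise comparison described above, the following Python sketch splits hypothetical epoched EEG data chronologically into early, intermediate, and late learning stages and averages amplitudes in a 100-400 ms post-stimulus window; the array shapes, sampling rate, and random data are assumptions, not the study's actual pipeline.

```python
import numpy as np

# Hypothetical epoched data: trials x channels x samples, sampled at 500 Hz,
# with t = 0 at visual stimulus onset. All values are illustrative stand-ins.
fs = 500
epochs = np.random.randn(120, 32, fs)          # 120 trials, 32 channels, 1 s epochs
times = np.arange(epochs.shape[-1]) / fs       # seconds post stimulus

# Split trials chronologically into early, intermediate, and late learning stages.
stages = np.array_split(np.arange(epochs.shape[0]), 3)

# Mean amplitude in the 100-400 ms post-stimulus window, per stage and channel.
win = (times >= 0.100) & (times <= 0.400)
stage_means = [epochs[idx][:, :, win].mean(axis=(0, 2)) for idx in stages]

for name, m in zip(["early", "intermediate", "late"], stage_means):
    print(name, "grand-average amplitude:", m.mean())
```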

    Time Course of Neural Activity Correlated with Colored-Hearing Synesthesia

    Synesthesia is defined as the involuntary and automatic perception of a stimulus in two or more sensory modalities (i.e., cross-modal linkage). Colored-hearing synesthetes experience colors when hearing tones or spoken utterances. Based on event-related potentials, we employed electric brain tomography with high temporal resolution in colored-hearing synesthetes and nonsynesthetic controls during auditory verbal stimulation. The auditory-evoked potentials to words and letters differed between synesthetes and controls at the N1 and P2 components, showing longer latencies and lower amplitudes in synesthetes. The intracerebral sources of these components were estimated with low-resolution brain electromagnetic tomography and revealed stronger activation in synesthetes in left posterior inferior temporal regions, within the color area in the fusiform gyrus (V4), and in orbitofrontal brain regions (ventromedial and lateral). The differences occurred as early as 122 ms after stimulus onset. Our findings replicate and extend earlier reports with functional magnetic resonance imaging and positron emission tomography in colored-hearing synesthesia and contribute new information on the time course of synesthesia, demonstrating the fast and possibly automatic processing of this unusual and remarkable phenomenon.
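    As a hedged illustration of how N1 and P2 latencies and amplitudes could be extracted from an averaged evoked potential, the sketch below searches fixed time windows for the most extreme deflection; the window boundaries, sampling rate, and stand-in data are assumptions rather than the authors' procedure.

```python
import numpy as np

# Hypothetical grand-average auditory-evoked potential at one electrode for
# one group (e.g., synesthetes); the values are illustrative only.
fs = 1000                                   # Hz
times = np.arange(-0.1, 0.5, 1 / fs)        # -100 to 500 ms around word onset
aep = np.random.randn(times.size) * 0.5     # stand-in for a real average

def peak_in_window(times, signal, t_min, t_max, polarity):
    """Return (latency in ms, amplitude) of the most extreme point in a window."""
    mask = (times >= t_min) & (times <= t_max)
    seg = signal[mask] * polarity
    i = np.argmax(seg)
    return times[mask][i] * 1000, signal[mask][i]

# Assumed windows: N1 as a negative deflection ~80-150 ms, P2 positive ~150-250 ms.
n1_lat, n1_amp = peak_in_window(times, aep, 0.080, 0.150, polarity=-1)
p2_lat, p2_amp = peak_in_window(times, aep, 0.150, 0.250, polarity=+1)
print(f"N1: {n1_lat:.0f} ms, {n1_amp:.2f} uV;  P2: {p2_lat:.0f} ms, {p2_amp:.2f} uV")
```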

    The Law of Diminishing Returns of the Soil since Justus von Liebig: A Study in the History of Doctrine

    Contents: Justus von Liebig; The law of diminishing returns of the soil since the rise of international agricultural competition

    Neural correlate of spatial presence in an arousing and noninteractive virtual reality: an EEG and psychophysiology study

    Using electroencephalography (EEG), psychophysiology, and psychometric measures, this is the first study to investigate the neurophysiological underpinnings of spatial presence. Spatial presence is considered a sense of being physically situated within a spatial environment portrayed by a medium (e.g., television, virtual reality). Twelve healthy children and 11 healthy adolescents watched different virtual roller coaster scenarios. During a control session, the roller coaster cab drove through a horizontal roundabout track; the subsequent realistic roller coaster rides consisted of spectacular ups, downs, and loops. Low-resolution brain electromagnetic tomography (LORETA) and event-related desynchronization (ERD) were used to analyze the EEG data. As expected, we found that, compared to the control condition, experiencing a virtual roller coaster ride evoked strong spatial presence experiences in both groups, increased electrodermal reactions, and activations in parietal brain areas known to be involved in spatial navigation. In addition, brain areas that receive homeostatic afferents from somatic and visceral sensations of the body were strongly activated. Most interestingly, children (as compared to adolescents) reported higher spatial presence experiences and demonstrated a different frontal activation pattern: while adolescents showed increased activation in prefrontal areas known to be involved in the control of executive functions, children demonstrated decreased activity in these brain regions. Recent neuroanatomical and neurophysiological studies have shown that the frontal brain continues to develop toward adult status well into adolescence. Thus, our results imply that the increased spatial presence experience in children may result from the not yet fully developed control functions of the frontal cortex.
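    ERD is commonly quantified as the percentage change in band power between an activation interval and a reference interval. The sketch below computes an alpha-band ERD value with that conventional formula, using stand-in data and an assumed sampling rate; it is not the study's actual analysis pipeline.

```python
import numpy as np
from scipy.signal import welch

fs = 256  # Hz; hypothetical sampling rate

def band_power(segment, fs, band=(8.0, 12.0)):
    """Alpha-band power of a 1-D EEG segment via Welch's method."""
    freqs, psd = welch(segment, fs=fs, nperseg=fs * 2)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return np.trapz(psd[mask], freqs[mask])

# Stand-in data: reference segment (horizontal roundabout) vs. activation segment (ride).
reference = np.random.randn(fs * 10)
activation = np.random.randn(fs * 10)

R = band_power(reference, fs)
A = band_power(activation, fs)
erd_percent = (A - R) / R * 100   # negative values indicate desynchronization
print(f"Alpha ERD: {erd_percent:.1f} %")
```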

    Assessing brain activations associated with emotional regulation during virtual reality mood induction procedures

    Emotional regulation strategies are used by people to influence their emotional responses to external or internal emotional stimuli. The aim of this study was to evaluate the brain activations associated with the application of two different emotional regulation strategies (cognitive reappraisal and expressive suppression) during virtual reality mood induction procedures. We used the Emotiv EPOC headset to measure the brain electrical activity of participants while sadness was induced using a virtual reality environment. We monitored 24 participants, who were distributed among three experimental groups: a control group, a cognitive reappraisal group, and an expressive suppression group. In the control group, we found significant activations in several right frontal regions that are related to the induction of negative emotions. We also found significant activations in the limbic, occipital, and parietal regions in the emotional regulation groups; these regions are related to the application of emotional regulation strategies. The results are consistent with those reported in the literature, which were obtained with clinical neuroimaging systems. The work of A. Rodriguez was funded by the Spanish MEC under FPI Grant BES-2011-043316; the work of Miriam Clemente was funded by the Generalitat Valenciana under a VALi+d Grant. Rodríguez Ortega, A.; Rey, B.; Clemente Bellido, M.; Wrzesien, M.; Alcañiz Raya, M. L. (2015). Assessing brain activations associated with emotional regulation during virtual reality mood induction procedures. Expert Systems with Applications, 42(3), 1699-1709. https://doi.org/10.1016/j.eswa.2014.10.006
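    For illustration only, a between-group comparison of a region-averaged EEG measure could be set up as below; the group sizes, values, and the choice of an independent-samples t-test are assumptions, not the statistics reported in the paper.

```python
import numpy as np
from scipy.stats import ttest_ind

# Hypothetical per-participant activation values (e.g., mean band power over a
# right-frontal channel cluster) for two of the three groups; illustrative only.
rng = np.random.default_rng(0)
control = rng.normal(1.0, 0.3, size=8)        # 8 control participants
reappraisal = rng.normal(0.7, 0.3, size=8)    # 8 cognitive-reappraisal participants

# Independent-samples t-test comparing right-frontal activation between groups.
t, p = ttest_ind(control, reappraisal)
print(f"t = {t:.2f}, p = {p:.3f}")
```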

    Gender differences in hemispheric asymmetry for face processing

    BACKGROUND: Current cognitive neuroscience models predict a right-hemispheric dominance for face processing in humans. However, neuroimaging and electromagnetic data in the literature provide conflicting evidence of a right-sided brain asymmetry for decoding the structural properties of faces. The purpose of this study was to investigate whether this inconsistency might be due to gender differences in hemispheric asymmetry. RESULTS: In this study, event-related brain potentials (ERPs) were recorded in 40 healthy, strictly right-handed individuals (20 women and 20 men) while they observed infants' faces expressing a variety of emotions. Early face-sensitive P1 and N1 responses to neutral vs. affective expressions were measured over the occipital/temporal cortices, and the responses were analyzed according to viewer gender. The results showed a strong right-hemispheric dominance in men but a lack of asymmetry in the amplitude of the occipito-temporal N1 response in women, for both neutral and affective faces. CONCLUSION: Men showed asymmetric functioning of the visual cortex while decoding faces and expressions, whereas women showed more bilateral functioning. These results indicate the importance of gender effects in the lateralization of the occipito-temporal response during the processing of face identity, structure, familiarity, or affective content.
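    One conventional way to express the hemispheric asymmetry discussed here is a lateralization index computed from homologous left and right electrodes. The sketch below applies that index to made-up N1 amplitudes; it is not the authors' analysis.

```python
import numpy as np

# Hypothetical N1 peak amplitudes (absolute values, in uV) over homologous
# left and right occipito-temporal electrodes, one value per participant.
rng = np.random.default_rng(1)
left = rng.uniform(2.0, 6.0, size=20)
right = rng.uniform(2.0, 6.0, size=20)

# A common lateralization index: (right - left) / (right + left).
# Positive values indicate right-hemispheric dominance.
li = (right - left) / (right + left)
print(f"mean lateralization index: {li.mean():+.3f}")
```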

    Seeing Emotion with Your Ears: Emotional Prosody Implicitly Guides Visual Attention to Faces

    Interpersonal communication involves the processing of multimodal emotional cues, particularly facial expressions (visual modality) and emotional speech prosody (auditory modality), which can interact during information processing. Here, we investigated whether the implicit processing of emotional prosody systematically influences gaze behavior toward facial expressions of emotion. We analyzed the eye movements of 31 participants as they scanned a visual array of four emotional faces portraying fear, anger, happiness, and neutrality while listening to an emotionally inflected pseudo-utterance (Someone migged the pazing) spoken in a congruent or incongruent tone. Participants heard the emotional utterance during the first 1250 ms of a five-second visual array and then performed an immediate recall decision about the face they had just seen. The frequency and duration of first saccades and of total looks in three temporal windows (0-1250 ms, 1250-2500 ms, 2500-5000 ms) were analyzed according to the emotional content of faces and voices. Results showed that participants looked longer and more frequently at faces that matched the prosody in all three time windows (an emotion congruency effect), although this effect was often emotion-specific (with the greatest effects for fear). Effects of prosody on visual attention to faces persisted over time and could be detected long after the auditory information was no longer present. These data imply that emotional prosody is processed automatically during communication and that these cues play a critical role in how humans respond to related visual cues in the environment, such as facial expressions.
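    As a rough sketch of the time-window analysis described above, the code below bins hypothetical fixation records into the three windows and tallies look frequency and duration by prosody congruence; the data structure and field names are assumptions for illustration, not the study's data format.

```python
import numpy as np

# Hypothetical fixation records: (onset_ms, duration_ms, congruent_with_prosody).
fixations = np.array([
    (180, 220, 1), (900, 310, 0), (1600, 250, 1), (3100, 400, 1), (4200, 150, 0),
], dtype=[("onset", int), ("duration", int), ("congruent", int)])

# The three analysis windows used in the study (in ms from array onset).
windows = [(0, 1250), (1250, 2500), (2500, 5000)]
for lo, hi in windows:
    in_win = fixations[(fixations["onset"] >= lo) & (fixations["onset"] < hi)]
    for label, flag in (("congruent", 1), ("incongruent", 0)):
        sel = in_win[in_win["congruent"] == flag]
        print(f"{lo}-{hi} ms, {label}: {sel.size} looks, {sel['duration'].sum()} ms total")
```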

    Queen mandibular pheromone: questions that remain to be resolved

    The discovery of ‘queen substance’, and the subsequent identification and synthesis of key components of queen mandibular pheromone, has been of significant importance to beekeepers and to the beekeeping industry. Fifty years on, there is greater appreciation of the importance and complexity of queen pheromones, but many mysteries remain about the mechanisms through which pheromones operate. The discovery of sex pheromone communication in moths occurred within the same time period, but in this case, intense pressure to find better means of pest management resulted in a remarkable focusing of research activity on understanding pheromone detection mechanisms and the central processing of pheromone signals in the moth. We can benefit from this work, and here, studies on moths are used to highlight some of the gaps in our knowledge of pheromone communication in bees. A better understanding of pheromone communication in honey bees promises improved strategies for the successful management of these extraordinary animals.

    Towards an understanding of molecule capture by the antennae of male beetles belonging to the genus Rhipicera (Coleoptera, Rhipiceridae)

    Working on the hypothesis that an important function of the lamellate antennae of adult male beetles belonging to the genus Rhipicera is to detect scent associated with female conspecifics, and using field observations, anatomical models derived from X-ray microcomputed tomography, and scanning electron microscopy, we investigated the behavioral, morphological, and morphometric factors that may influence molecule capture by these antennae. We found that male beetles fly upwind in a zigzag manner, or face upwind when perching, behavior consistent with an animal that is tracking scent. Furthermore, the ultrastructure of the male and female antennae, like their gross morphology, is sexually dimorphic, with male antennae possessing many more of a particular type of receptor, the sensillum placodeum, than their female counterparts (approximately 30,000 vs. 100 per antenna, respectively). Based on this disparity, we assume that the sensilla placodea on the male antennae are responsible for detecting scent associated with female Rhipicera beetles. Molecule capture by male antennae in their alert, fanned states is likely to be favoured by: (a) male beetles adopting prominent, upright positions on high points when searching for scent; (b) the partitioning of antennae into many small segments; (c) antennal morphometry (height, width, outline area, total surface area, leakiness, and narrow channels); (d) the location of the sensilla placodea where they are most likely to encounter odorant molecules; and (e) well-dispersed sensilla placodea. The molecule-capturing ability of male Rhipicera antennae may be similar to that of the pectinate antennae of certain male moths.