21 research outputs found

    Comparison of non-invasive, scalp-recorded auditory steady-state responses in humans, rhesus monkeys, and common marmosets

    Get PDF
    Auditory steady-state responses (ASSRs) are basic neural responses used to probe the ability of auditory circuits to produce activity synchronized to repetitive external stimulation. Reduced ASSRs have been observed in patients with schizophrenia, especially at 40 Hz. Although the ASSR is a translatable biomarker with potential in both animal models and patients with schizophrenia, little is known about its features in monkeys. Here, we recorded ASSRs from humans, rhesus monkeys, and common marmosets using the same method to directly compare their characteristics across species. We used auditory trains at a wide range of frequencies to identify the frequency best suited to ASSR induction, because monkeys typically vocalize at stimulus frequencies different from those used by humans. We found that rhesus monkeys and marmosets also show auditory event-related potentials and phase-locking activity to gamma-frequency trains, although the frequency yielding the best synchronization differed among the species. These results suggest that the ASSR could be a useful translational, cross-species biomarker for examining the generation of gamma-band synchronization in nonhuman primate models of schizophrenia.
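    The phase-locking referred to above is commonly quantified as inter-trial phase coherence (ITC). The following is a minimal sketch of that metric, not the authors' analysis pipeline; the function name, filter settings, and epoch layout are illustrative assumptions.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    def inter_trial_phase_coherence(epochs, fs, freq, bandwidth=4.0):
        """ITC at a target frequency, e.g. 40.0 Hz for the 40-Hz ASSR.

        epochs : array (n_trials, n_samples) of stimulus-locked EEG epochs
        fs     : sampling rate in Hz
        Returns ITC per time point: 0 = random phase, 1 = perfect locking.
        """
        # Band-pass each epoch around the stimulation frequency
        lo = (freq - bandwidth / 2) / (fs / 2)
        hi = (freq + bandwidth / 2) / (fs / 2)
        b, a = butter(4, [lo, hi], btype="band")
        filtered = filtfilt(b, a, epochs, axis=1)
        # Instantaneous phase from the analytic signal
        phase = np.angle(hilbert(filtered, axis=1))
        # Length of the mean unit phase vector across trials
        return np.abs(np.mean(np.exp(1j * phase), axis=0))

    # Hypothetical usage: 100 one-second epochs sampled at 1 kHz
    # itc = inter_trial_phase_coherence(epochs, fs=1000, freq=40.0)
    ```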

    Cerebral cortical processing time is elongated in human brain evolution

    Get PDF
    Human brain processing is slower than the monkey's: does the brain run more slowly the further it evolves!? (Kyoto University press release, 2022-01-26.) An increase in the number of neurons is presumed to underlie the enhancement of cognitive abilities in brain evolution. The evolution of human cognition would then be expected to have been accompanied by a prolongation of net neural processing time, as the processing times of individual neurons accumulate over an expanded number of neurons. Here, we confirmed this prediction and quantified the amount of prolongation in vivo, using noninvasive measurements of brain responses to sounds in unanesthetized human and nonhuman primates. Latencies of the N1 component of auditory-evoked potentials recorded from the scalp were approximately 40, 50, 60, and 100 ms for the common marmoset, rhesus monkey, chimpanzee, and human, respectively. Importantly, the prominent increase in human N1 latency could not be explained by the physical lengthening of the auditory pathway, and therefore reflects an extended dwell time for auditory cortical processing. A longer time window for auditory cortical processing is advantageous for analyzing time-varying acoustic stimuli, such as those important for speech perception. A novel hypothesis concerning human brain evolution then emerges: the increase in cortical neuron number widened the timescale of sensory cortical processing, and the benefits of this outweighed the disadvantage of slower cognition and reaction.
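    The latency comparison above reduces to finding the trough of the averaged evoked potential within a search window. A toy sketch of that measurement follows; the window bounds are assumptions for illustration, not the authors' procedure.

    ```python
    import numpy as np

    def n1_latency_ms(aep, fs, window_ms=(20.0, 150.0)):
        """Estimate N1 latency as the most negative deflection of an
        averaged auditory-evoked potential within a search window.

        aep       : 1-D averaged AEP with the stimulus onset at sample 0
        fs        : sampling rate in Hz
        window_ms : (start, end) of the search window in milliseconds
        """
        start = int(window_ms[0] * fs / 1000)
        end = int(window_ms[1] * fs / 1000)
        trough = start + int(np.argmin(aep[start:end]))
        return trough * 1000.0 / fs

    # Per the species comparison above, a human AEP should give ~100 ms
    # and a marmoset AEP ~40 ms.
    ```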

    Cerebral Substrates for Controlling Rhythmic Movements

    No full text
    Our daily lives are filled with rhythmic movements, such as walking, sports, and dancing, but the mechanisms by which the brain controls rhythmic movements are poorly understood. In this review, we examine the literature on neuropsychological studies of patients with focal brain lesions and on functional brain imaging studies, primarily those using finger-tapping tasks. These studies suggest a close connection between the sensory and motor processing of rhythm, with no apparent distinction between the two functions. We therefore conducted two functional brain imaging studies to survey rhythm representations that are relatively independent of sensory and motor functions. First, we determined brain activations related to rhythm processing in a sensory modality-independent manner. Second, we examined body part-independent brain activation related to rhythm reproduction. Based on the previous literature and these findings, we discuss how individual brain areas contribute to rhythmic motor control and the mechanisms by which the brain controls rhythmic movements.

    The influence of tempo upon the rhythmic motor control in macaque monkeys.

    Get PDF
    We examined the behavioral features of isochronous repetitive movements in two macaque monkeys. The monkeys were required to press a button repeatedly in response to external cues. When the cue intervals were constant (isochronous) and sub-second, reaction times were shorter than in the random-interval condition. In contrast, under supra-second isochronous conditions, reaction times did not differ from those in the random-interval condition. These results suggest that monkeys can acquire isochronous rhythms when the intervals are sub-second, probably by relying on the automatic timing system. The conscious timing system for supra-second intervals, however, is not as well developed in monkeys as it is in humans.

    Evolutionary elongation of the time window of integration in auditory cortex: Macaque vs. human comparison of the effects of sound duration on auditory evoked potentials

    Get PDF
    The auditory cortex integrates auditory information over time to obtain neural representations of sound events, and the time scale of this integration critically affects perception. This work investigated species differences in the time scale of integration by comparing how the scalp-recorded cortical auditory evoked potentials (CAEPs) of humans and macaque monkeys decrease in amplitude as stimulus duration is shortened from 100 ms (or longer) to 2 ms. Cortical circuits tuned to processing sounds at short time scales should continue to produce large CAEPs to brief sounds, whereas circuits tuned to longer time scales should produce diminished responses. Four peaks were identified in the CAEPs and labeled P1, N1, P2, and N2 in humans and mP1, mN1, mP2, and mN2 in monkeys. In humans, the N1 diminished in amplitude as sound duration was decreased, consistent with the previously described temporal integration window of the N1 (>50 ms). In macaques, by contrast, the mN1 was unaffected by sound duration and was clearly elicited by even the briefest sounds. Brief sounds also elicited a significant mN2 in the macaque, but no significant N2 in humans. At earlier latencies, both the P1 (humans) and the mP1 (macaques) were elicited at their full amplitudes by even the briefest sounds. These findings suggest an elongation of the time scale of the late stages of human auditory cortical processing, as reflected by the N1/mN1 and later CAEP components. Longer integration time scales would allow neural representations of the complex auditory features that characterize speech and music.
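    The logic linking integration window to duration sensitivity can be made concrete with a toy leaky-integrator model. This is only an illustration of the reasoning, not a model fitted by the authors, and the time constants below are arbitrary.

    ```python
    import numpy as np

    def relative_amplitude(duration_ms, tau_ms):
        """Response to a sound of a given duration, relative to the asymptotic
        response to a long sound, for an integrator with time constant tau.
        A short window (small tau) saturates quickly, so even brief sounds
        evoke near-full responses; a long window leaves them diminished."""
        return 1.0 - np.exp(-np.asarray(duration_ms, dtype=float) / tau_ms)

    durations = np.array([2, 10, 25, 50, 100])  # ms, spanning the range tested
    print(relative_amplitude(durations, tau_ms=1))   # short window: large response even at 2 ms
    print(relative_amplitude(durations, tau_ms=50))  # long window: ~0.04 at 2 ms
    ```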

    Laminar Pattern of Projections Indicates the Hierarchical Organization of the Anterior Cingulate-Temporal Lobe Emotion System

    Get PDF
    The anterior cingulate cortex (ACC), surrounding the genu of the corpus callosum, plays important roles in emotional processing and is functionally divided into dorsal, perigenual, and subgenual subregions (dACC, pgACC, and sgACC, respectively). Previous studies have suggested that the pgACC and sgACC have distinct roles in the regulation of emotion. To elicit appropriate emotional responses, these ACC regions require sensory information from the environment. Anatomically, the ACC has rich connections with the temporal lobe, where higher-order processing of sensory information takes place. To clarify the organization of sensory inputs into the ACC subregions, we injected neuronal tracers into the pgACC, sgACC, and dACC and compared their afferent connections. Previously, we analyzed the afferent projections from the amygdala and found a distinct pattern for the sgACC. In the present study, the patterns of afferent projections were analyzed in the temporal cortex, especially the temporal pole (TP) and medial temporal areas. After tracer injections into the sgACC, we observed labeled neurons in the TP and in the subiculum of the hippocampal formation. The majority of the labeled cell bodies were found in the superficial layers of the TP (“feedforward”-type projections). The pgACC received afferent projections from the TP, the entorhinal cortex (EC), and the parahippocampal cortex (PHC), but not from the hippocampus. In each area, the labeled cells were found mainly in the deep layers (“feedback”-type projections). The pattern for the dACC was similar to that for the pgACC. Previous studies suggested that the pgACC, but not the sgACC, receives projections from the dorsolateral prefrontal cortex (DLPFC). Together, these data suggest that the sgACC plays crucial roles in emotional responses based on sensory and mnemonic inputs from the anterior temporal lobe, whereas the pgACC is more related to the cognitive control of emotion.

    Functional connectivity during rhythm encoding.

    No full text
    A) Three seed regions: right IPL (red), right IFG (blue), and left IFG (yellow), shown from left to right. B) Brain areas exhibiting significantly increased coupling with the seed regions during rhythm encoding. Colored areas show significant coupling with the seed region of the same color: red, blue, and yellow regions show significant functional connectivity with the right IPL, right IFG, and left IFG, respectively. Threshold: P < 0.05, FWE-corrected at the cluster level. Abbreviations: CB, cerebellum; IFG, inferior frontal gyrus; IPL, inferior parietal lobule; STG, superior temporal gyrus.

    A user-oriented evaluation of digital libraries: case study the 'electronic journals' service of the library and information service of the University of Patras, Greece

    Get PDF
    Respondents were asked to indicate which factors would discourage them from accessing an e-journals service. The choices provided by the questionnaire are detailed in Table XVIII; an "other" option allowed users to indicate any additional factor. A total of 203 people responded to this question. The most common reason cited for not reading e-journals, mentioned by 51.2 per cent of respondents, was the lack of enough information relevant to their interests.

    Rhythm working memory tasks.

    No full text
    A sample rhythm pattern was presented at the beginning of each trial. Each rhythm pattern consisted of three repeats of a three-pure-tone sequence. The participants were required to memorize the rhythm pattern within 6 s (encoding phase), to maintain the rhythm information for 6–12 s (maintenance phase), and to reproduce it within 8 s by tapping with the right index finger, the left index finger, or the right foot, or by articulation (retrieval phase). We used 20 rhythm patterns for each participant. Two of the three durations (SOA1, SOA2, and IUI) in each pattern were always the same. The SOAs and IUI were chosen from six possible durations (0.4, 0.5, 0.6, 0.8, 1.0, or 1.2 s) so that the duration of each rhythm pattern ranged from 4 to 6 s. Abbreviations: SOA, stimulus onset asynchrony; IUI, inter-unit-interval.
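    As a concrete illustration of these constraints, the sketch below draws (SOA1, SOA2, IUI) triplets meeting the stated rules. It assumes the pattern duration is three times the unit period (SOA1 + SOA2 + IUI), which is a simplification, not the authors' stimulus code.

    ```python
    import random

    DURATIONS = [0.4, 0.5, 0.6, 0.8, 1.0, 1.2]  # seconds, per the caption

    def make_rhythm_pattern():
        """Draw (SOA1, SOA2, IUI) such that exactly two of the three
        durations are equal and the whole pattern (three repeats of the
        three-tone unit) lasts 4-6 s."""
        while True:
            shared = random.choice(DURATIONS)
            odd = random.choice([d for d in DURATIONS if d != shared])
            slots = [shared, shared, shared]
            slots[random.randrange(3)] = odd  # place the odd duration
            soa1, soa2, iui = slots
            if 4.0 <= 3 * (soa1 + soa2 + iui) <= 6.0:
                return soa1, soa2, iui

    # e.g. 20 patterns per participant, as in the task description
    # patterns = [make_rhythm_pattern() for _ in range(20)]
    ```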