
    Detecting impaired language processing in patients with mild cognitive impairment using around-the-ear cEEgrid electrodes

    Mild cognitive impairment (MCI) is the term used to identify individuals with subjective and objective cognitive decline but with preserved activities of daily living and an absence of dementia. Although MCI can impact functioning in different cognitive domains, most notably episodic memory, relatively little is known about the comprehension of language in MCI. In this study, we used around-the-ear electrodes (cEEGrids) to identify impairments during language comprehension in patients with MCI. In a group of 23 patients with MCI and 23 age-matched controls, language comprehension was tested in a two-word phrase paradigm. We examined the oscillatory changes following word onset as a function of lexico-semantic single-word retrieval (e.g., swrfeq vs. swift) and multiword binding processes (e.g., horse preceded by swift vs. preceded by swrfeq). Electrophysiological signatures (as measured by the cEEGrids) were significantly different between patients with MCI and controls. In controls, lexical retrieval was associated with a rebound in the alpha/beta range, and binding was associated with a post-word alpha/beta suppression. In contrast, both the single-word retrieval and multiword binding signatures were absent in the MCI group. The signatures observed using cEEGrids in controls were comparable with those obtained with a full-cap EEG setup. Importantly, our findings suggest that patients with MCI have impaired electrophysiological signatures for comprehending single words and multiword phrases. Moreover, cEEGrid setups provide a noninvasive and sensitive clinical tool for detecting early impairments in language comprehension in MCI.
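    To illustrate the kind of measure the abstract describes (not the authors' actual pipeline), the sketch below computes alpha/beta-band power before and after a word-onset event and expresses the post-word change relative to the pre-word baseline, so that negative values indicate suppression and positive values a rebound. The sampling rate, window length, and synthetic signals are all assumptions for illustration.

    ```python
    import numpy as np

    def band_power(x, fs, fmin, fmax):
        """Power in the [fmin, fmax) Hz band via a plain FFT periodogram."""
        freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
        psd = np.abs(np.fft.rfft(x)) ** 2 / len(x)
        return psd[(freqs >= fmin) & (freqs < fmax)].sum()

    def relative_change(pre, post, fs, fmin=8.0, fmax=25.0):
        """Post-word change in alpha/beta power relative to a pre-word baseline.
        Negative values indicate suppression; positive values, a rebound."""
        p_pre = band_power(pre, fs, fmin, fmax)
        p_post = band_power(post, fs, fmin, fmax)
        return (p_post - p_pre) / p_pre

    # Synthetic illustration: a 10 Hz "alpha" oscillation whose amplitude
    # halves after word onset, mimicking binding-related suppression.
    fs = 250
    t = np.arange(0, 1.0, 1.0 / fs)
    rng = np.random.default_rng(0)
    pre = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
    post = 0.5 * np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
    print(relative_change(pre, post, fs))  # negative: suppression
    ```

    In practice such changes are computed per channel and per time-frequency bin with multitaper or wavelet methods, but the baseline-relative logic is the same.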

    Evaluation of cancer outcome assessment using MRI: A review of deep-learning methods

    Accurate evaluation of tumor response to treatment is critical to allow personalized treatment regimens according to the predicted response and to support clinical trials investigating new therapeutic agents by providing them with an accurate response indicator. Recent advances in medical imaging, computer hardware, and machine-learning algorithms have resulted in the increased use of these tools in medicine as a whole and specifically in cancer imaging for detection and characterization of malignant lesions, prognosis, and assessment of treatment response. Among the currently available imaging techniques, magnetic resonance imaging (MRI) plays an important role in the evaluation of treatment response for many cancers, given its superior soft-tissue contrast and its ability to allow multiplanar imaging and functional evaluation. In recent years, deep learning (DL) has become an active area of research, paving the way for computer-assisted clinical and radiological decision support. DL can uncover associations between imaging features and pertinent clinical outcomes that cannot be identified by the naked eye. The aim of this review is to highlight the use of DL in the evaluation of tumor response assessed on MRI. We will first provide an overview of common DL architectures used in medical imaging research in general. Then, we will review the studies to date that have applied DL to MRI for the task of treatment response assessment. Finally, we will discuss the challenges and opportunities of using DL within the clinical workflow.
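    As a minimal sketch of the core operation behind the convolutional architectures such reviews cover (not any specific model from the review), the example below implements one convolution layer with a ReLU nonlinearity and global average pooling in plain NumPy, turning a hypothetical image patch into a small feature vector. The 8x8 "MRI patch" and the two edge-detecting kernels are invented for illustration.

    ```python
    import numpy as np

    def conv2d(image, kernel):
        """Valid-mode 2D cross-correlation: the core op of a CNN layer."""
        kh, kw = kernel.shape
        h, w = image.shape
        out = np.empty((h - kh + 1, w - kw + 1))
        for i in range(out.shape[0]):
            for j in range(out.shape[1]):
                out[i, j] = (image[i:i + kh, j:j + kw] * kernel).sum()
        return out

    def tiny_cnn_features(image, kernels):
        """One conv layer + ReLU + global average pooling -> feature vector."""
        return np.array([np.maximum(conv2d(image, k), 0).mean() for k in kernels])

    # Hypothetical 8x8 patch with a vertical edge, probed by a horizontal-
    # gradient kernel and a vertical-gradient kernel.
    patch = np.zeros((8, 8))
    patch[:, 4:] = 1.0
    kernels = [np.array([[-1.0, 1.0]]), np.array([[-1.0], [1.0]])]
    feats = tiny_cnn_features(patch, kernels)
    print(feats.shape)  # (2,)
    ```

    Real response-assessment models stack many such layers (with learned kernels) and end in a classifier or regressor, but each layer is built from this same convolution-plus-nonlinearity unit.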

    Inflammation causes mood changes through alterations in subgenual cingulate activity and mesolimbic connectivity

    BACKGROUND: Inflammatory cytokines are implicated in the pathophysiology of depression. In rodents, systemically administered inflammatory cytokines induce depression-like behavior. Similarly, in humans, therapeutic interferon-alpha induces clinical depression in a third of patients. Conversely, patients with depression also show elevated pro-inflammatory cytokines. OBJECTIVES: To determine the neural mechanisms underlying inflammation-associated mood change and modulatory effects on circuits involved in mood homeostasis and affective processing. METHODS: In a double-blind, randomized crossover study, 16 healthy male volunteers received typhoid vaccination or saline (placebo) injection in two experimental sessions. Mood questionnaires were completed at baseline and at 2 and 3 hours. Two hours after injection, participants performed an implicit emotional face perception task during functional magnetic resonance imaging. Analyses focused on neurobiological correlates of inflammation-associated mood change and affective processing within regions responsive to emotional expressions and implicated in the etiology of depression. RESULTS: Typhoid but not placebo injection produced an inflammatory response indexed by increased circulating interleukin-6 and significant mood reduction at 3 hours. Inflammation-associated mood deterioration correlated with enhanced activity within subgenual anterior cingulate cortex (sACC) (a region implicated in the etiology of depression) during emotional face processing. Furthermore, inflammation-associated mood change reduced connectivity of sACC to amygdala, medial prefrontal cortex, nucleus accumbens, and superior temporal sulcus, which was modulated by peripheral interleukin-6. CONCLUSIONS: Inflammation-associated mood deterioration is reflected in changes in sACC activity and functional connectivity during evoked responses to emotional stimuli.
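    The connectivity result above rests on seed-based analysis: correlating a seed region's time series with the time series of other regions and comparing conditions. A minimal sketch of that computation, using synthetic time series (the region names and coupling strengths below are illustrative assumptions, not the study's data):

    ```python
    import numpy as np

    def seed_connectivity(seed_ts, region_ts):
        """Pearson correlation between a seed time series and each region."""
        return np.array([np.corrcoef(seed_ts, r)[0, 1] for r in region_ts])

    rng = np.random.default_rng(1)
    n = 200  # hypothetical number of fMRI volumes
    sacc = rng.standard_normal(n)  # stand-in sACC seed time series

    # One region coupled to the seed (placebo-like), one decoupled
    # (mimicking the inflammation-associated connectivity reduction).
    amygdala = 0.7 * sacc + 0.3 * rng.standard_normal(n)
    accumbens = rng.standard_normal(n)

    fc = seed_connectivity(sacc, [amygdala, accumbens])
    print(fc)  # strong coupling for the first region, near zero for the second
    ```

    In the actual study, such condition-wise connectivity estimates would further be related to peripheral interleukin-6 levels across participants; this sketch only shows the seed-correlation step.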

    Pre-Stimulus Activity Predicts the Winner of Top-Down vs. Bottom-Up Attentional Selection

    Our ability to process visual information is fundamentally limited. This leads to competition between sensory information that is relevant for top-down goals and sensory information that is perceptually salient but task-irrelevant. The aim of the present study was to identify, from EEG recordings, pre-stimulus and pre-saccadic neural activity that could predict whether top-down or bottom-up processes would win the competition for attention on a trial-by-trial basis. We employed a visual search paradigm in which a lateralized low contrast target appeared alone, or with a low (i.e., non-salient) or high contrast (i.e., salient) distractor. Trials with a salient distractor were of primary interest due to the strong competition between top-down knowledge and bottom-up attentional capture. Our results demonstrated that 1) in the 1-sec pre-stimulus interval, frontal alpha (8–12 Hz) activity was higher on trials where the salient distractor captured attention and the first saccade (bottom-up win); and 2) there was a transient pre-saccadic increase in posterior-parietal alpha (7–8 Hz) activity on trials where the first saccade went to the target (top-down win). We propose that the high frontal alpha reflects a disengagement of attentional control, whereas the transient posterior alpha time-locked to the saccade indicates sensory inhibition of the salient distractor and suppression of bottom-up oculomotor capture.
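    The trial-by-trial prediction described above amounts to estimating alpha power in a pre-stimulus window for each trial and asking whether it separates bottom-up wins from top-down wins. A minimal sketch of that idea on synthetic trials, using a simple median-split threshold as the "predictor" (the trial counts, amplitudes, and sampling rate are assumptions; the authors' analysis was more sophisticated):

    ```python
    import numpy as np

    def alpha_power(trial, fs, fmin=8.0, fmax=12.0):
        """Alpha-band (8-12 Hz) power from a plain FFT periodogram."""
        freqs = np.fft.rfftfreq(trial.size, d=1.0 / fs)
        psd = np.abs(np.fft.rfft(trial)) ** 2 / trial.size
        return psd[(freqs >= fmin) & (freqs <= fmax)].sum()

    fs = 250
    t = np.arange(0, 1.0, 1.0 / fs)  # 1-sec pre-stimulus window
    rng = np.random.default_rng(2)

    # Hypothetical trials: stronger frontal alpha before bottom-up wins.
    bottom_up = [1.5 * np.sin(2 * np.pi * 10 * t) + rng.standard_normal(t.size)
                 for _ in range(20)]
    top_down = [0.5 * np.sin(2 * np.pi * 10 * t) + rng.standard_normal(t.size)
                for _ in range(20)]

    powers = np.array([alpha_power(tr, fs) for tr in bottom_up + top_down])
    labels = np.array([1] * 20 + [0] * 20)  # 1 = bottom-up win

    # Trial-by-trial prediction with a median-split threshold on alpha power.
    pred = (powers > np.median(powers)).astype(int)
    acc = (pred == labels).mean()
    print(acc)
    ```

    With real EEG, the effect sizes are far smaller and classification would typically use cross-validated models over many channels, but the logic (single-trial band power as a predictor of the attentional outcome) is the same.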