42 research outputs found

    High frequency oscillations in epileptic and non-epileptic human hippocampus during a cognitive task

    Hippocampal high-frequency electrographic activity (HFOs) represents one of the major discoveries not only in epilepsy research but also in cognitive science over the past few decades. A fundamental challenge, however, has been the fact that physiological HFOs associated with normal brain function overlap in frequency with pathological HFOs. We investigated the impact of a cognitive task on HFOs with the aim of improving differentiation between epileptic and non-epileptic hippocampi in humans. Hippocampal activity was recorded with depth electrodes in 15 patients with focal epilepsy during a resting period and subsequently during a cognitive task. HFOs in ripple and fast ripple frequency ranges were evaluated in both conditions, and their rate, spectral entropy, relative amplitude and duration were compared in epileptic and non-epileptic hippocampi. The similarity of HFOs properties recorded at rest in epileptic and non-epileptic hippocampi suggests that they cannot be used alone to distinguish between hippocampi. However, both ripples and fast ripples were observed with higher rates, higher relative amplitudes and longer durations at rest as well as during a cognitive task in epileptic compared with non-epileptic hippocampi. Moreover, during a cognitive task, significant reductions of HFOs rates were found in epileptic hippocampi. These reductions were not observed in non-epileptic hippocampi. Our results indicate that although both hippocampi generate HFOs with similar features that probably reflect non-pathological phenomena, it is possible to differentiate between epileptic and non-epileptic hippocampi using a simple odd-ball task

    Joint inversion estimate of regional glacial isostatic adjustment in Antarctica considering a laterally varying Earth structure (ESA STSE Project REGINA)

    A major uncertainty in determining the mass balance of the Antarctic ice sheet from measurements of satellite gravimetry, and to a lesser extent satellite altimetry, is the poorly known correction for the ongoing deformation of the solid Earth caused by glacial isostatic adjustment (GIA). Although much progress has been made in consistently modelling the ice-sheet evolution throughout the last glacial cycle, as well as the induced bedrock deformation caused by these load changes, forward models of GIA remain ambiguous due to the lack of observational constraints on the ice sheet's past extent and thickness and on the mantle rheology beneath the continent. As an alternative to forward modelling GIA, we estimate GIA from multiple space-geodetic observations: GRACE, Envisat/ICESat and GPS. Making use of the different sensitivities of the respective satellite observations to current and past surface mass (ice mass) change and solid Earth processes, we estimate GIA based on viscoelastic response functions to disc load forcing. We calculate and distribute the viscoelastic response functions according to estimates of the variability of lithosphere thickness and mantle viscosity in Antarctica. We compare our GIA estimate with published GIA corrections and evaluate its impact on determining the ice mass balance in Antarctica from GRACE and satellite altimetry. Particular focus is placed on the Amundsen Sea Sector in West Antarctica, where uplift rates of several cm/yr have been measured by GPS. We show that most of this uplift is caused by the rapid viscoelastic response to recent ice-load changes, enabled by the presence of a low-viscosity upper mantle in West Antarctica. This paper presents the second and final contribution summarizing the work carried out within REGINA, a European Space Agency funded study (www.regina-science.eu).
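
    The combination step can be pictured as a weighted least-squares inversion in which each geodetic data set constrains disc-load parameters through its own design matrix of response functions; the sketch below is purely illustrative, with the matrices, data vectors and weights being placeholders rather than the REGINA processing chain.

        import numpy as np

        def joint_inversion(datasets):
            """Weighted least-squares estimate of disc-load parameters m from
            several observation groups, each with its own design matrix and noise level.

            datasets : list of (A, y, sigma) tuples, where
                       A     - design matrix of (viscoelastic) response functions (placeholder)
                       y     - observation vector (e.g. GRACE, altimetry or GPS rates)
                       sigma - assumed observation standard deviation of the group
            """
            N = sum((A.T @ A) / sigma**2 for A, y, sigma in datasets)   # normal matrix
            b = sum((A.T @ y) / sigma**2 for A, y, sigma in datasets)   # right-hand side
            return np.linalg.solve(N, b)

        # Toy example with three synthetic observation groups and 5 disc loads
        rng = np.random.default_rng(0)
        m_true = rng.normal(size=5)
        groups = []
        for n_obs, sigma in [(50, 1.0), (30, 0.5), (20, 2.0)]:   # stand-ins for "GRACE", "altimetry", "GPS"
            A = rng.normal(size=(n_obs, 5))
            y = A @ m_true + rng.normal(scale=sigma, size=n_obs)
            groups.append((A, y, sigma))
        print(joint_inversion(groups))   # should be close to m_true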

    A Novel Statistical Model for Predicting the Efficacy of Vagal Nerve Stimulation in Patients With Epilepsy (Pre-X-Stim) Is Applicable to Different EEG Systems

    Background: Identifying patients with intractable epilepsy who would benefit from therapeutic chronic vagal nerve stimulation (VNS) preoperatively remains a major clinical challenge. We have developed a statistical model for predicting VNS efficacy using only routine preimplantation electroencephalogram (EEG) recorded with the TruScan EEG device (Brazdil et al., 2019). It remains to be seen, however, whether this model can be applied in different clinical settings. Objective: To validate our model using EEG data acquired with a different recording system. Methods: We identified a validation cohort of eight patients implanted with VNS whose preimplantation EEG was recorded on the BrainScope device according to the study protocol. The classifier developed in our earlier work, named Pre-X-Stim, was then employed to classify these patients as predicted responders or non-responders based on the dynamics of the EEG power spectra. Predicted and real-world outcomes were compared to establish the applicability of this classifier. In total, two validation experiments were performed using two different validation approaches (single classifier or classifier voting). Results: The classifier achieved 75% accuracy, 67% sensitivity, and 100% specificity. Only two patients, both real-life responders, were classified incorrectly in both validation experiments. Conclusion: We have validated the Pre-X-Stim model on EEGs from a different recording system, which indicates that it can be applied under different technical conditions. Our approach, based on preoperative EEG, is easy to apply, financially undemanding, and has great potential for real-world clinical use.
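
    For reference, the reported 75% accuracy, 67% sensitivity and 100% specificity correspond to a confusion matrix over the eight patients in which the two misclassified real-life responders are the only errors; a minimal sketch of that bookkeeping follows (the per-patient labels are reconstructed for illustration, not taken from the study).

        # Reconstructed outcome table for the 8-patient validation cohort:
        # 1 = responder, 0 = non-responder. Two real-life responders were
        # predicted as non-responders; everyone else was classified correctly.
        actual    = [1, 1, 1, 1, 1, 1, 0, 0]
        predicted = [1, 1, 1, 1, 0, 0, 0, 0]

        tp = sum(a == 1 and p == 1 for a, p in zip(actual, predicted))  # 4
        tn = sum(a == 0 and p == 0 for a, p in zip(actual, predicted))  # 2
        fp = sum(a == 0 and p == 1 for a, p in zip(actual, predicted))  # 0
        fn = sum(a == 1 and p == 0 for a, p in zip(actual, predicted))  # 2

        accuracy    = (tp + tn) / len(actual)   # 6/8 = 0.75
        sensitivity = tp / (tp + fn)            # 4/6 ~ 0.67
        specificity = tn / (tn + fp)            # 2/2 = 1.00
        print(accuracy, sensitivity, specificity)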

    Ex vivo drug response profiling detects recurrent sensitivity patterns in drug-resistant acute lymphoblastic leukemia

    Drug sensitivity and resistance testing on diagnostic leukemia samples should provide important functional information to guide actionable target and biomarker discovery. We provide proof-of-concept data by profiling 60 drugs on 68 acute lymphoblastic leukemia (ALL) samples, mostly from resistant disease, in coculture with bone marrow stromal cells. Patient-derived xenografts retained the original pattern of mutations found in the matched patient material. Stromal coculture did not prevent leukemia cell cycle activity, but a specific sensitivity profile to cell cycle-related drugs identified samples with higher cell proliferation both in vitro and in vivo as leukemia xenografts. In patients with refractory relapses, individual patterns of marked drug resistance and exceptional responses to new agents of immediate clinical relevance were detected. The BCL2 inhibitor venetoclax was highly active below 10 nM in B-cell precursor ALL (BCP-ALL) subsets, including MLL-AF4 and TCF3-HLF ALL, and in some T-cell ALLs (T-ALLs), predicting in vivo activity as a single agent and in combination with dexamethasone and vincristine. Unexpected sensitivity to dasatinib, with half-maximal inhibitory concentration (IC50) values below 20 nM, was detected in 2 independent T-ALL cohorts, which correlated with similar cytotoxic activity of the SRC inhibitor KX2-391 and inhibition of SRC phosphorylation. A patient with refractory T-ALL was treated with dasatinib on the basis of drug profiling information and achieved a 5-month remission. Thus, drug profiling captures disease-relevant features and unexpected sensitivity to relevant drugs, which warrants further exploration of this functional assay in the context of clinical trials to develop drug repurposing strategies for patients with urgent medical needs.
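
    The sub-10 nM and sub-20 nM potencies quoted above are IC50-type readouts; a minimal sketch of estimating an IC50 by fitting a four-parameter Hill curve to viability measurements is shown below (the dose range, viability values and use of scipy.optimize.curve_fit are assumptions, not the study's pipeline).

        import numpy as np
        from scipy.optimize import curve_fit

        def hill(conc, top, bottom, ic50, slope):
            """Four-parameter logistic dose-response curve."""
            return bottom + (top - bottom) / (1.0 + (conc / ic50) ** slope)

        # Hypothetical viability data (fraction of untreated control) for one sample
        conc = np.array([1, 3, 10, 30, 100, 300, 1000], dtype=float)   # drug concentration in nM
        viab = np.array([0.98, 0.95, 0.70, 0.35, 0.15, 0.08, 0.05])

        p0 = [1.0, 0.0, 30.0, 1.0]                       # initial guesses: top, bottom, IC50, slope
        params, _ = curve_fit(hill, conc, viab, p0=p0, maxfev=10000)
        print(f"estimated IC50 ~ {params[2]:.1f} nM")    # samples below 10-20 nM would count as highly sensitive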

    Probabilistic functional tractography of the human cortex revisited

    In patients with pharmaco-resistant focal epilepsies investigated with intracranial electroencephalography (iEEG), direct electrical stimulation of a cortical region induces cortico-cortical evoked potentials (CCEPs) in distant cerebral cortex, whose properties can be used to infer large-scale brain connectivity. In 2013, we proposed a new probabilistic functional tractography methodology to study human brain connectivity. We have now been revisiting this method in the F-TRACT project (f-tract.eu) by developing a large multicenter CCEP database of several thousand stimulation runs performed in several hundred patients, together with processing tools to create a probabilistic atlas of human cortico-cortical connections. Here, we present a snapshot of the methods and data of F-TRACT using a pool of 213 epilepsy patients, all studied by stereo-electroencephalography with intracerebral depth electrodes. The CCEPs were processed using an automated pipeline with the following consecutive steps: detection of each stimulation run from stimulation artifacts in raw iEEG files, bad-channel detection with a machine learning approach, model-based stimulation artifact correction, and robust averaging over stimulation pulses. Effective connectivity between the stimulated and recording areas is then inferred from the properties of the first CCEP component, i.e. onset and peak latency, amplitude, duration and integral of the significant part. Finally, group statistics of CCEP features are computed for each brain parcel explored by iEEG electrodes. The localization (coordinates, white/gray matter relative positioning) of electrode contacts was obtained from imaging data (anatomical MRI or CT scans before and after electrode implantation). The iEEG contacts were repositioned in different brain parcellations from the segmentation of patients' anatomical MRI or from templates in the MNI coordinate system. The F-TRACT database using this first pool of 213 patients provided connectivity probability values for 95% of possible intrahemispheric and 56% of interhemispheric connections, and CCEP features for 78% of intrahemispheric and 14% of interhemispheric connections. In this report, we show examples of anatomo-functional connectivity matrices and associated directional maps. We also show how CCEP features, especially latencies, are related to spatial distances and allow estimation of the velocity distribution of neuronal signals at a large scale. Finally, we describe the impact on the estimated connectivity of the stimulation charge and of the contact localization in white or gray matter. The most relevant maps for the scientific community are available for download on f-tract.eu (David et al., 2017) and will be regularly updated over the following months as more data are added to the F-TRACT database. This will provide unprecedented knowledge of the dynamical properties of large fiber tracts in humans.
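
    One relationship mentioned above, between CCEP peak latency and the spatial distance separating the stimulated and recording parcels, lends itself to a simple conduction-velocity estimate; the sketch below fits latency against distance with a robust line and is purely illustrative (the data values and the choice of a Theil-Sen estimator are assumptions, not the F-TRACT implementation).

        import numpy as np
        from scipy.stats import theilslopes

        # Hypothetical (distance, peak latency) pairs pooled over connections:
        # distances in mm, latencies in ms.
        dist_mm    = np.array([12, 25, 38, 47, 60, 72, 85, 95, 110, 130], dtype=float)
        latency_ms = np.array([8, 11, 14, 17, 20, 24, 27, 30, 35, 40], dtype=float)

        # Robust linear fit: latency = intercept + slope * distance
        slope, intercept, lo, hi = theilslopes(latency_ms, dist_mm)

        # Slope is ms per mm, so the implied conduction velocity is its inverse;
        # mm/ms is numerically identical to m/s.
        velocity_m_s = 1.0 / slope
        print(f"apparent conduction velocity ~ {velocity_m_s:.1f} m/s")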

    Structural brain abnormalities in the common epilepsies assessed in a worldwide ENIGMA study


    Integration of airborne gravimetry data filtering into residual least-squares collocation: example from the 1 cm geoid experiment

    Low-pass filters are commonly used for the processing of airborne gravity observations. In this paper, for the first time, we include the resulting correlations consistently in the functional and stochastic model of residual least-squares collocation. We demonstrate the necessity of removing high-frequency noise from airborne gravity observations and derive corresponding parameters for a Gaussian low-pass filter. In doing so, we aim at an optimal combination of terrestrial and airborne gravity observations in the mountainous area of Colorado. We validate the combination in the framework of our participation in the ‘1 cm geoid experiment’. This regional geoid modeling inter-comparison exercise allows the calculation of a reference solution, defined as the mean value of 13 independent height anomaly results in this area. Our result is among the best and, at 7.5 mm, shows the lowest standard deviation with respect to the reference. From internal validation we furthermore conclude that the input from airborne and terrestrial gravity observations is consistent in large parts of the target area, but not necessarily in the highly mountainous areas. Therefore, the relative weighting between these two data sets turns out to be a main driver for the final result and is an important factor in explaining the remaining differences between the various height anomaly results in this experiment.
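
    To illustrate the kind of along-track smoothing discussed above, here is a minimal sketch of applying a Gaussian low-pass filter to a 1-D airborne gravity profile; the filter half-width, sample spacing and SciPy routine are assumptions, not the parameters derived in the paper.

        import numpy as np
        from scipy.ndimage import gaussian_filter1d

        def lowpass_gravity(profile, spacing_m, half_width_m):
            """Gaussian low-pass filter of an along-track gravity profile.

            profile      : 1-D array of gravity disturbances (e.g. mGal)
            spacing_m    : along-track sample spacing in metres (assumed constant)
            half_width_m : distance at which the Gaussian weight drops to 1/2 (assumed)
            """
            # Convert the half-width to the Gaussian sigma:
            # exp(-d^2 / (2 sigma^2)) = 0.5  =>  sigma = d / sqrt(2 ln 2)
            sigma_samples = (half_width_m / np.sqrt(2.0 * np.log(2.0))) / spacing_m
            return gaussian_filter1d(profile, sigma=sigma_samples, mode="nearest")

        # Toy profile: a smooth signal plus high-frequency noise, 100 m sample spacing
        x = np.arange(0, 50_000, 100.0)
        signal = 20 * np.sin(2 * np.pi * x / 20_000)
        noisy = signal + np.random.default_rng(1).normal(scale=3.0, size=x.size)
        filtered = lowpass_gravity(noisy, spacing_m=100.0, half_width_m=2_000.0)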

    Determination of the Regularization Parameter to Combine Heterogeneous Observations in Regional Gravity Field Modeling

    Various types of heterogeneous observations can be combined within a parameter estimation process using spherical radial basis functions (SRBFs) for regional gravity field refinement. In this process, regularization is in most cases inevitable, and choosing an appropriate value for the regularization parameter is a crucial issue. This study discusses the drawbacks of two frequently used methods for choosing the regularization parameter, the L-curve method and variance component estimation (VCE). To overcome their drawbacks, two approaches for determining the regularization parameter are proposed, which combine the L-curve method and VCE. The first approach, denoted “VCE-Lc”, starts with the calculation of the relative weights between the observation techniques by means of VCE; based on these weights, the L-curve method is applied to determine the regularization parameter. In the second approach, called “Lc-VCE”, the L-curve method first determines the regularization parameter, which is then kept fixed while the relative weights between the observation techniques are calculated by VCE. To evaluate the performance of the two proposed methods against the L-curve method and VCE, all four methods are applied to six study cases using four types of simulated observations in Europe, and their modeling results are compared with validation data. The RMS errors (w.r.t. the validation data) obtained by VCE-Lc and Lc-VCE are smaller than those obtained from the L-curve method and VCE in all six cases. VCE-Lc performs best among the four tested methods, regardless of whether SRBFs with smoothing or non-smoothing features are used. These results demonstrate the benefits of the two proposed methods for regularization parameter determination when different data sets are to be combined.
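
    The L-curve step referred to above can be summarized in a few lines: solve a regularized system for a range of regularization parameters, record the residual and solution norms, and pick the parameter at the corner of the log-log curve. The sketch below uses generic Tikhonov regularization and a maximum-curvature criterion; it is an illustration of the L-curve idea only, not the paper's SRBF implementation or the proposed VCE-Lc/Lc-VCE combinations.

        import numpy as np

        def tikhonov_solve(A, y, lam):
            """Solve min ||A x - y||^2 + lam * ||x||^2."""
            n = A.shape[1]
            return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ y)

        def lcurve_corner(A, y, lambdas):
            """Pick the regularization parameter at the corner of the log-log L-curve
            (point of maximum curvature, approximated by finite differences)."""
            rho, eta = [], []
            for lam in lambdas:
                x = tikhonov_solve(A, y, lam)
                rho.append(np.log(np.linalg.norm(A @ x - y)))   # residual norm
                eta.append(np.log(np.linalg.norm(x)))           # solution norm
            rho, eta = np.array(rho), np.array(eta)
            # discrete curvature of the parametric curve (rho(lam), eta(lam))
            d1r, d1e = np.gradient(rho), np.gradient(eta)
            d2r, d2e = np.gradient(d1r), np.gradient(d1e)
            curvature = np.abs(d1r * d2e - d1e * d2r) / (d1r**2 + d1e**2) ** 1.5
            return lambdas[int(np.argmax(curvature))]

        # Toy ill-conditioned problem
        rng = np.random.default_rng(0)
        A = rng.normal(size=(100, 40)) @ np.diag(np.logspace(0, -4, 40))
        y = A @ rng.normal(size=40) + rng.normal(scale=0.01, size=100)
        lam_best = lcurve_corner(A, y, np.logspace(-10, 2, 50))
        print(lam_best)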