A Fast EEG Forecasting Algorithm for Phase-Locked Transcranial Electrical Stimulation of the Human Brain
A growing body of research suggests that non-invasive electrical brain stimulation can more effectively modulate neural activity when phase-locked to the underlying brain rhythms. Transcranial alternating current stimulation (tACS) can potentially stimulate the brain in phase with its natural oscillations as recorded by electroencephalography (EEG), but matching these oscillations is a challenging problem due to the complex and time-varying nature of the EEG signals. Here we address this challenge by developing and testing a novel approach intended to deliver tACS phase-locked to the activity of the underlying brain region in real time. This novel approach extracts phase and frequency from a segment of EEG, then forecasts the signal to control the stimulation. A careful tuning of the EEG segment length and prediction horizon is required and has been investigated here for different EEG frequency bands. The algorithm was tested on EEG data from 5 healthy volunteers. Algorithm performance was quantified in terms of phase-locking values across a variety of EEG frequency bands. Phase-locking performance was found to be consistent across individuals and recording locations. With current parameters, the algorithm performs best when tracking oscillations in the alpha band (8–13 Hz), with a phase-locking value of 0.77 ± 0.08. Performance was maximized when the frequency band of interest had a dominant frequency that was stable over time. The algorithm performs faster, and provides better phase-locked stimulation, compared to other recently published algorithms devised for this purpose. The algorithm is suitable for use in future studies of phase-locked tACS in preclinical and clinical applications.
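The phase-locking value reported above is a standard metric: the magnitude of the mean complex phase difference between two signals, ranging from 0 (no locking) to 1 (perfect locking). A minimal sketch of how it can be computed (not the authors' implementation; the Hilbert-transform phase extraction and the sampling rate are illustrative assumptions):

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(x, y):
    """PLV between two signals: |mean(exp(i*(phi_x - phi_y)))|, in [0, 1]."""
    phi_x = np.angle(hilbert(x))          # instantaneous phase via Hilbert transform
    phi_y = np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * (phi_x - phi_y))))

# Illustrative check: a 10 Hz alpha-band target and a forecast that
# lags it by a constant 0.3 rad should give a PLV close to 1.
fs = 250.0                                # sampling rate in Hz (assumed)
t = np.arange(0, 2.0, 1.0 / fs)
target = np.sin(2 * np.pi * 10 * t)
forecast = np.sin(2 * np.pi * 10 * t + 0.3)
plv = phase_locking_value(target, forecast)
```

A constant phase offset still yields a PLV near 1, because the metric rewards a stable phase relationship rather than zero lag.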
Treatment-resistant major depressive disorder: Canadian expert consensus on definition and assessment
Background: Treatment-resistant depression (TRD) is a debilitating chronic mental illness that confers increased morbidity and mortality, decreases the quality of life, impairs occupational, social, and offspring development, and translates into increased costs on the healthcare system. The goal of this study is to reach an agreement on the concept, definition, staging model, and assessment of TRD. Methods: This study involved a review of the literature and a modified Delphi process for consensus agreement. The Appraisal of Guidelines for Research & Evaluation II guidelines were followed for the literature appraisal. Literature was assessed for quality and strength of evidence using the Grading of Recommendations Assessment, Development and Evaluation (GRADE) system. Canadian national experts in depression were invited for the modified Delphi process based on their prior clinical and research expertise. Survey items were considered to have reached a consensus if 80% or more of the experts supported the statement. Results: Fourteen Canadian experts were recruited for three rounds of surveys to reach a consensus on a total of 27 items. Experts agreed that a dimensional definition for treatment resistance was a useful concept to describe the heterogeneity of this illness. The use of staging models and clinical scales was recommended in evaluating depression. Risk factors and comorbidities were identified as potential predictors for treatment resistance. Conclusions: TRD is a meaningful concept both for clinical practice and research. An operational definition for TRD will allow for opportunities to improve the validity of predictors and therapeutic options for these patients.
Resting state cortico-thalamic-striatal connectivity predicts response to dorsomedial prefrontal rTMS in major depressive disorder
Despite its high toll on society, there has been little recent improvement in treatment efficacy for Major Depressive Disorder (MDD). The identification of biological markers of successful treatment response may allow for more personalized and effective treatment. Here we investigate whether resting state functional connectivity predicted response to treatment with repetitive transcranial magnetic stimulation (rTMS) to the dorsomedial prefrontal cortex (dmPFC). Twenty-five individuals with treatment-refractory MDD underwent a 4-week course of dmPFC-rTMS. Before and after treatment, subjects received resting state functional MRI scans and assessments of depressive symptoms using the Hamilton Depression Rating Scale (HAMD17). We found that higher baseline cortico-cortical connectivity (dmPFC-subgenual cingulate and subgenual cingulate to dorsolateral PFC) and lower cortico-thalamic, cortico-striatal and cortico-limbic connectivity were associated with better treatment outcomes. We also investigated how changes in connectivity over the course of treatment related to improvements in HAMD17 scores. We found that successful treatment was associated with increased dmPFC-thalamic connectivity and decreased sgACC-caudate connectivity. Our findings provide insight into which individuals might respond to rTMS treatment and the mechanisms through which these treatments work.
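Resting-state functional connectivity between two regions is commonly quantified as the Pearson correlation of their BOLD time series. A minimal sketch under that common convention (the study's actual preprocessing and connectivity pipeline is not specified here; the region names and synthetic signals are purely illustrative):

```python
import numpy as np

def functional_connectivity(ts_a, ts_b):
    """Pearson correlation between two regional BOLD time series."""
    return np.corrcoef(ts_a, ts_b)[0, 1]

# Synthetic stand-ins for ROI-averaged resting-state signals:
# two regions sharing a common fluctuation plus independent noise.
rng = np.random.default_rng(0)
shared = rng.standard_normal(200)                 # common component
roi_dmpfc = shared + 0.5 * rng.standard_normal(200)
roi_sgacc = shared + 0.5 * rng.standard_normal(200)
fc = functional_connectivity(roi_dmpfc, roi_sgacc)  # positive coupling expected
```

In practice each time series would be the mean preprocessed fMRI signal within an anatomically defined region of interest, and the correlations would typically be Fisher z-transformed before group statistics.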
Anhedonia and reward-circuit connectivity distinguish nonresponders from responders to dorsomedial prefrontal rTMS in major depression
Background
Depression is a heterogeneous mental illness. Neurostimulation treatments, by targeting specific nodes within the brain’s emotion-regulation network, may be useful both as therapies and as probes for identifying clinically relevant depression subtypes.
Methods
Here, we applied 20 sessions of magnetic resonance imaging-guided repetitive transcranial magnetic stimulation (rTMS) to the dorsomedial prefrontal cortex in 47 unipolar or bipolar patients with a medication-resistant major depressive episode.
Results
Treatment response was strongly bimodal, with individual patients showing either minimal or marked improvement. Compared with responders, nonresponders showed markedly higher baseline anhedonia symptomatology (including pessimism, loss of pleasure, and loss of interest in previously enjoyed activities) on item-by-item examination of Beck Depression Inventory-II and Quick Inventory of Depressive Symptomatology ratings. Congruently, on baseline functional magnetic resonance imaging, nonresponders showed significantly lower connectivity through a classical reward pathway comprising ventral tegmental area, striatum, and a region in ventromedial prefrontal cortex. Responders and nonresponders also showed opposite patterns of hemispheric lateralization in the connectivity of dorsomedial and dorsolateral regions to this same ventromedial region.
Conclusions
The results suggest distinct depression subtypes: one with preserved hedonic function that responds to dorsomedial rTMS, and another with disrupted hedonic function and abnormally lateralized connectivity through ventromedial prefrontal cortex that does not respond to dorsomedial rTMS. Future research directly comparing the effects of rTMS at different targets, guided by neuroimaging and clinical presentation, may clarify whether hedonia/reward circuit integrity is a reliable marker for optimizing rTMS target selection.
Search for dark matter produced in association with bottom or top quarks in √s = 13 TeV pp collisions with the ATLAS detector
A search for weakly interacting massive particle dark matter produced in association with bottom or top quarks is presented. Final states containing third-generation quarks and missing transverse momentum are considered. The analysis uses 36.1 fb−1 of proton–proton collision data recorded by the ATLAS experiment at √s = 13 TeV in 2015 and 2016. No significant excess of events above the estimated backgrounds is observed. The results are interpreted in the framework of simplified models of spin-0 dark-matter mediators. For colour-neutral spin-0 mediators produced in association with top quarks and decaying into a pair of dark-matter particles, mediator masses below 50 GeV are excluded assuming a dark-matter candidate mass of 1 GeV and unitary couplings. For scalar and pseudoscalar mediators produced in association with bottom quarks, the search sets limits on the production cross-section of 300 times the predicted rate for mediators with masses between 10 and 50 GeV and assuming a dark-matter mass of 1 GeV and unitary coupling. Constraints on colour-charged scalar simplified models are also presented. Assuming a dark-matter particle mass of 35 GeV, mediator particles with mass below 1.1 TeV are excluded for couplings yielding a dark-matter relic density consistent with measurements.
Enabling planetary science across light-years. Ariel Definition Study Report
Ariel, the Atmospheric Remote-sensing Infrared Exoplanet Large-survey, was adopted as the fourth medium-class mission in ESA's Cosmic Vision programme to be launched in 2029. During its 4-year mission, Ariel will study what exoplanets are made of, how they formed and how they evolve, by surveying a diverse sample of about 1000 extrasolar planets, simultaneously in visible and infrared wavelengths. It is the first mission dedicated to measuring the chemical composition and thermal structures of hundreds of transiting exoplanets, enabling planetary science far beyond the boundaries of the Solar System. The payload consists of an off-axis Cassegrain telescope (primary mirror 1100 mm x 730 mm ellipse) and two separate instruments (FGS and AIRS) simultaneously covering the 0.5-7.8 micron spectral range. The satellite is best placed into an L2 orbit to maximise the thermal stability and the field of regard. The payload module is passively cooled via a series of V-Groove radiators; the detectors for the AIRS are the only items that require active cooling via an active Ne JT cooler. The Ariel payload is developed by a consortium of more than 50 institutes from 16 ESA countries, which include the UK, France, Italy, Belgium, Poland, Spain, Austria, Denmark, Ireland, Portugal, Czech Republic, Hungary, the Netherlands, Sweden, Norway, Estonia, and a NASA contribution.
The QCD transition temperature: results with physical masses in the continuum limit II.
We extend our previous study [Phys. Lett. B643 (2006) 46] of the cross-over temperatures (T_c) of QCD. We improve our zero-temperature analysis by using physical quark masses and finer lattices. In addition to the kaon decay constant used for scale setting, we determine four quantities (the masses of the \Omega baryon, K^*(892) and \phi(1020) mesons, and the pion decay constant) which are found to agree with experiment. This implies that, independently of which of these quantities is used to set the overall scale, the same results are obtained within a few percent. At finite temperature we use finer lattices down to a <= 0.1 fm (N_t = 12 and N_t = 16 at one point). Our new results completely confirm our previous findings. We compare the results with those of the 'hotQCD' collaboration.
Comment: 19 pages, 8 figures, 3 tables
Measurement of the inclusive isolated-photon cross section in pp collisions at √s = 13 TeV using 36 fb−1 of ATLAS data
The differential cross section for isolated-photon production in pp collisions is measured at a centre-of-mass energy of 13 TeV with the ATLAS detector at the LHC using an integrated luminosity of 36.1 fb−1. The differential cross section is presented as a function of the photon transverse energy in different regions of photon pseudorapidity. The differential cross section as a function of the absolute value of the photon pseudorapidity is also presented in different regions of photon transverse energy. Next-to-leading-order QCD calculations from Jetphox and Sherpa as well as next-to-next-to-leading-order QCD calculations from Nnlojet are compared with the measurement, using several parameterisations of the proton parton distribution functions. The predictions provide a good description of the data within the experimental and theoretical uncertainties.
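At its simplest, a differential cross section in a given transverse-energy bin follows the standard counting relation dσ/dE_T = N / (ε · L · ΔE_T). A schematic sketch with made-up numbers (the actual ATLAS measurement additionally involves background subtraction, unfolding, and systematic uncertainties not shown here):

```python
def differential_cross_section(n_events, efficiency, lumi_fb, bin_width_gev):
    """d(sigma)/dE_T in fb/GeV for one bin:
    event count / (selection efficiency * integrated luminosity * bin width)."""
    return n_events / (efficiency * lumi_fb * bin_width_gev)

# Hypothetical bin: 5000 selected photons, 80% efficiency,
# 36.1 fb^-1 of data, and a 25 GeV-wide E_T bin.
dsigma = differential_cross_section(5000, 0.8, 36.1, 25.0)
```

Dividing by the bin width is what makes the result a density in E_T, so that bins of different widths remain directly comparable.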
Review of influenza-associated pulmonary aspergillosis in ICU patients and proposal for a case definition: an expert opinion
Purpose: Invasive pulmonary aspergillosis is increasingly reported in patients with influenza admitted to the intensive care unit (ICU). Classification of patients with influenza-associated pulmonary aspergillosis (IAPA) using the current definitions for invasive fungal diseases has proven difficult, and our aim was to develop case definitions for IAPA that can facilitate clinical studies. Methods: A group of 29 international experts reviewed current insights into the epidemiology, diagnosis and management of IAPA and proposed a case definition of IAPA through a process of informal consensus. Results: Since IAPA may develop in a wide range of hosts, an entry criterion was proposed rather than host factors. The entry criterion was defined as a patient requiring ICU admission for respiratory distress with a positive influenza test temporally related to ICU admission. In addition, proven IAPA required histological evidence of invasive septate hyphae and mycological evidence for Aspergillus. Probable IAPA required the detection of galactomannan or positive Aspergillus culture in bronchoalveolar lavage (BAL) or serum with pulmonary infiltrates, or a positive culture in upper respiratory samples with bronchoscopic evidence for tracheobronchitis or cavitating pulmonary infiltrates of recent onset. The IAPA case definitions may be useful to classify patients with COVID-19-associated pulmonary aspergillosis (CAPA), while awaiting further studies that provide more insight into the interaction between Aspergillus and the SARS-CoV-2-infected lung. Conclusion: A consensus case definition of IAPA is proposed, which will facilitate research into the epidemiology, diagnosis and management of this emerging acute and severe Aspergillus disease, and may be of use to study CAPA.
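The entry criterion plus proven/probable criteria described above can be read as a decision rule. A hypothetical sketch of that rule (the field names and the function itself are illustrative, not part of the published definition, which should be consulted for the precise clinical wording):

```python
def classify_iapa(entry_criterion, histology_hyphae, mycology_aspergillus,
                  bal_or_serum_positive, pulmonary_infiltrates,
                  upper_resp_culture, tracheobronchitis_or_cavitation):
    """Rule-based reading of the proposed IAPA case definitions.

    entry_criterion: ICU admission for respiratory distress with a
    positive influenza test temporally related to ICU admission.
    All other flags are boolean stand-ins for the mycological,
    histological and radiological evidence named in the abstract.
    """
    if not entry_criterion:
        return "not classifiable"
    # Proven: histological evidence of invasive septate hyphae plus
    # mycological evidence for Aspergillus.
    if histology_hyphae and mycology_aspergillus:
        return "proven IAPA"
    # Probable: galactomannan or Aspergillus culture in BAL/serum with
    # pulmonary infiltrates, OR upper-respiratory culture with
    # tracheobronchitis or recent-onset cavitating infiltrates.
    if (bal_or_serum_positive and pulmonary_infiltrates) or \
       (upper_resp_culture and tracheobronchitis_or_cavitation):
        return "probable IAPA"
    return "no IAPA"
```

Encoding a consensus definition this way makes the branch order explicit: the entry criterion gates everything, and proven status is checked before probable.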
Frameworks and tools for risk assessment of manufactured nanomaterials
Commercialization of nanotechnologies entails a regulatory requirement for understanding their environmental, health and safety (EHS) risks. Today we face challenges to assess these risks, which emerge from uncertainties around the interactions of manufactured nanomaterials (MNs) with humans and the environment. In order to reduce these uncertainties, it is necessary to generate sound scientific data on hazard and exposure by means of relevant frameworks and tools. The development of such approaches to facilitate the risk assessment (RA) of MNs has become a dynamic area of research. The aim of this paper was to review and critically analyse these approaches against a set of relevant criteria. The analysis concluded that none of the reviewed frameworks were able to fulfill all evaluation criteria. Many of the existing modelling tools are designed to provide screening-level assessments rather than to support regulatory RA and risk management. Nevertheless, there is a tendency towards developing more quantitative, higher-tier models, capable of incorporating uncertainty into their analyses. There is also a trend towards developing validated experimental protocols for material identification and hazard testing, reproducible across laboratories. These tools could enable a shift from a costly case-by-case RA of MNs towards a targeted, flexible and efficient process, based on grouping and read-across strategies and compliant with the 3R (Replacement, Reduction, Refinement) principles. In order to facilitate this process, it is important to transform the current efforts on developing databases and computational models into creating an integrated data and tools infrastructure to support the risk assessment and management of MNs.