
    Cerebellum in timing control: Evidence from contingent negative variation after cerebellar tDCS

    Background and aims: Timing control is defined as the ability to quantify time. Temporal estimation in the supra-second range is generally regarded as a conscious cognitive process, while estimation in the sub-second range is a more automatic one. It is accepted that the cerebellum contributes to temporal processing, but its precise function is still debated. The aim of this study was to better explore the role of the cerebellum in timing control. We transiently inhibited cerebellar activity and studied the effects on contingent negative variation (CNV) components in healthy subjects. Methods: Sixteen healthy subjects underwent an S1-S2 duration discrimination motor task, before and after cathodal and sham cerebellar tDCS, in two separate sessions. In the S1-S2 task they had to judge whether the duration of a probe interval was shorter (Short-ISI trial: 800 ms), longer (Long-ISI trial: 1600 ms), or equal to the Target interval of 1200 ms. For each interval trial in both tDCS sessions, we measured the total and W2-CNV areas, the reaction times (RTs) of correct responses, and the absolute number of errors before and after tDCS. Results: After cathodal tDCS, a significant reduction in total-CNV and W2-CNV amplitudes emerged selectively for Short-ISI (p < 0.001 and p = 0.003, respectively) and Target-ISI trials (total-CNV: p < 0.001; W2-CNV: p = 0.003); similarly, a higher number of errors emerged for Short-ISI (p = 0.004) and Target-ISI trials (p = 0.07) alone. No differences were detected for Long-ISI trials or after sham stimulation. Conclusions: These data indicate that cerebellar inhibition selectively altered the ability to estimate second and sub-second intervals. We speculate that the cerebellum regulates the attentional mechanisms of automatic timing control by making predictions of interval timing.

    Non-contrast CT markers of intracerebral hematoma expansion: a reliability study

    Objectives: We evaluated whether clinicians agree in the detection of non-contrast CT markers of intracerebral hemorrhage (ICH) expansion. Methods: From our local dataset, we randomly sampled 60 patients diagnosed with spontaneous ICH. Fifteen physicians and trainees (Stroke Neurology, Interventional and Diagnostic Neuroradiology) were trained to identify six density (Barras density, black hole, blend, hypodensity, fluid level, swirl) and three shape (Barras shape, island, satellite) expansion markers, using standardized definitions. Thirteen raters performed a second assessment. Inter- and intra-rater agreement were measured using Gwet’s AC1, with a coefficient > 0.60 indicating substantial to almost perfect agreement. Results: Almost perfect inter-rater agreement was observed for the swirl (0.85, 95% CI: 0.78-0.90) and fluid level (0.84, 95% CI: 0.76-0.90) markers, while the hypodensity (0.67, 95% CI: 0.56-0.76) and blend (0.62, 95% CI: 0.51-0.71) markers showed substantial agreement. Inter-rater agreement was otherwise moderate, and comparable between density and shape markers. Inter-rater agreement was lower for the three markers that require the rater to identify one specific axial slice (Barras density, Barras shape, island: 0.46, 95% CI: 0.40-0.52 versus others: 0.60, 95% CI: 0.56-0.63). Inter-rater agreement did not differ when stratified by raters’ experience, hematoma location, volume, or anticoagulation status. Intra-rater agreement was substantial to almost perfect for all but the black hole marker. Conclusion: In a large sample of raters with different backgrounds and expertise levels, only four of nine non-contrast CT markers of ICH expansion showed substantial to almost perfect inter-rater agreement.
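    Gwet’s AC1, the agreement statistic used in this study, corrects observed agreement for chance in a way that is less sensitive to marker prevalence than Cohen’s kappa. For two raters scoring a binary marker (present/absent), a minimal sketch of the computation (the function name and example ratings below are illustrative, not data from the study) might look like:

    ```python
    def gwet_ac1(r1, r2):
        """Gwet's AC1 chance-corrected agreement for two raters, binary ratings (0/1)."""
        n = len(r1)
        # Observed proportion of items on which the raters agree
        pa = sum(a == b for a, b in zip(r1, r2)) / n
        # Mean marginal prevalence of category 1 across both raters
        pi = (sum(r1) + sum(r2)) / (2 * n)
        # Chance-agreement probability under Gwet's model: 2*pi*(1-pi)
        pe = 2 * pi * (1 - pi)
        return (pa - pe) / (1 - pe)

    # Hypothetical ratings for five scans by two raters
    print(gwet_ac1([1, 1, 1, 0, 0], [1, 1, 0, 0, 0]))  # → 0.6
    ```

    Because the chance term shrinks when prevalence is very high or very low, AC1 avoids the "kappa paradox" of low coefficients despite high raw agreement, which is one reason it is often preferred in reliability studies with rare markers.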

    Adjunctive cenobamate in people with focal onset seizures: Insights from the Italian Expanded Access Program

    Objective: This study was undertaken to assess the effectiveness and tolerability of adjunctive cenobamate, variations in the load of concomitant antiseizure medications (ASMs), and predictors of clinical response in people with focal epilepsy. Methods: This was a retrospective study at 21 centers participating in the Italian Expanded Access Program. Effectiveness outcomes included retention and responder rates (≥50% and 100% reduction in baseline seizure frequency). Tolerability/safety outcomes included the rate of treatment discontinuation due to adverse events (AEs) and their incidence. Total drug load was quantified as the number of concomitant ASMs and the total defined daily dose (DDD). Concomitant ASMs were also classified according to their mechanism of action and pharmacokinetic interactions to perform explorative subgroup analyses. Results: A total of 236 subjects with a median age of 38 (Q1–Q3 = 27–49) years were included. At 12 months, the cenobamate retention rate was 78.8% and 57.5% of subjects were responders. The seizure freedom rates during the preceding 3 months were 9.8%, 12.2%, 16.3%, and 14.0% at 3, 6, 9, and 12 months, respectively. A higher percentage of responders was observed among subjects treated with clobazam, although the difference was not statistically significant. A total of 223 AEs were recorded in 133 of 236 participants, leading to cenobamate discontinuation in 8.5% of cases. At 12 months, a reduction of one or two concomitant ASMs occurred in 42.6% and 4.3% of subjects, respectively. The median total DDD of all concomitant ASMs decreased from 3.34 (Q1–Q3 = 2.50–4.47) at baseline to 2.50 (Q1–Q3 = 1.67–3.50) at 12 months (p < .001; median percentage reduction = 22.2%). The highest rates of cotreatment withdrawal and reductions in DDD were observed for sodium channel blockers and GABAergic modulators (above all for those linked to pharmacokinetic interactions), and for perampanel.
Significance: Adjunctive cenobamate was associated with a reduction in seizure frequency and in the burden of concomitant ASMs in adults with difficult-to-treat focal epilepsy. The type of concomitant ASM did not influence effectiveness, except for a favorable trend with clobazam.

    Development and Evaluation of a Simulation-Based Algorithm to Optimize the Planning of Interim Analyses for Clinical Trials in ALS

    BACKGROUND AND OBJECTIVES: Late-phase clinical trials for neurodegenerative diseases have a low probability of success. In this study, we introduce an algorithm that optimizes the planning of interim analyses for clinical trials in amyotrophic lateral sclerosis (ALS) to make better use of the time and resources available and to minimize the exposure of patients to ineffective or harmful drugs. METHODS: A simulation-based algorithm was developed to determine the optimal interim analysis scheme by integrating prior knowledge about the success rate of ALS clinical trials with drug-specific information obtained in early-phase studies. Interim analysis schemes were optimized by varying the number and timing of interim analyses, together with their decision rules about when to stop a trial. The algorithm was applied retrospectively to 3 clinical trials that investigated the efficacy of diaphragm pacing or ceftriaxone on survival in patients with ALS. Outcomes were additionally compared with conventional interim designs. RESULTS: We evaluated 183-1,351 unique interim analysis schemes for each trial. Application of the optimal designs correctly established lack of efficacy, would have concluded all studies 1.2-19.4 months earlier (a reduction of 4.6%-57.7% in trial duration), and could have reduced the number of randomized patients by 1.7%-58.1%. By means of simulation, we illustrate the efficiency for other treatment scenarios. The optimized interim analysis schemes outperformed conventional interim designs in most scenarios. DISCUSSION: Our algorithm uses prior knowledge to determine the uncertainty of the expected treatment effect in ALS clinical trials and optimizes the planning of interim analyses. Improving futility monitoring in ALS could minimize the exposure of patients to ineffective or harmful treatments and result in significant ethical and efficiency gains.
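    The core idea in this abstract, estimating by simulation how often an interim look would stop a trial early and how many patients that would spare, can be illustrated with a toy Monte Carlo sketch of a single futility analysis in a two-arm trial with a binary outcome. Every parameter name and the simple stopping rule below are illustrative assumptions, not the authors' algorithm:

    ```python
    import random

    def simulate_trial(n_total, n_interim, p_ctrl, p_trt, futility_margin,
                       n_sims=2000, seed=1):
        """Estimate the probability that a futility interim stops the trial,
        and the expected number of randomized patients, by simulation."""
        rng = random.Random(seed)
        stops = 0
        total_n = 0
        for _ in range(n_sims):
            # Simulate binary outcomes per arm up to the interim look
            ctrl = sum(rng.random() < p_ctrl for _ in range(n_interim))
            trt = sum(rng.random() < p_trt for _ in range(n_interim))
            effect = trt / n_interim - ctrl / n_interim
            if effect < futility_margin:
                # Stop early for futility: only interim patients were enrolled
                stops += 1
                total_n += 2 * n_interim
            else:
                # Continue to the planned full sample size
                total_n += 2 * n_total
        return stops / n_sims, total_n / n_sims
    ```

    Sweeping `n_interim` and `futility_margin` over a grid and scoring each scheme on expected duration and sample size, under both a "no effect" and a "true effect" scenario, mirrors in miniature how such an optimization over interim analysis schemes can work.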

    Combining Nitrous Oxide with Carbon Dioxide Decreases the Time to Loss of Consciousness during Euthanasia in Mice — Refinement of Animal Welfare?

    Carbon dioxide (CO2) is the most commonly used euthanasia agent for rodents, despite potentially causing pain and distress. Nitrous oxide (N2O) is used in humans to speed the induction of anaesthesia with volatile anaesthetics, via a mechanism referred to as the “second gas” effect. We therefore evaluated whether the addition of N2O to a rising CO2 concentration could serve as a welfare refinement of the euthanasia process in mice, by shortening the duration of conscious exposure to CO2. First, to assess the effect of N2O on the induction of anaesthesia in mice, 12 female C57Bl/6 mice were anaesthetized in a crossover protocol with the following combinations: isoflurane (5%) + O2 (95%); isoflurane (5%) + N2O (75%) + O2 (25%); and N2O (75%) + O2 (25%), at a total flow rate of 3 l/min into a 7 l induction chamber. The addition of N2O to isoflurane reduced the time to loss of the righting reflex by 17.6%. Second, 18 C57Bl/6 and 18 CD1 mice were individually euthanized by gradually filling the induction chamber with either CO2 (20% of the chamber volume.min−1), CO2 + N2O (20% and 60% of the chamber volume.min−1, respectively), or CO2 + nitrogen (N2) (20% and 60% of the chamber volume.min−1). Arterial partial pressures (Pa) of O2 and CO2 were measured, as well as blood pH and lactate. Compared with gradually rising CO2 alone, the addition of a high concentration of N2O to CO2 lowered the time to loss of the righting reflex by 10.3% (P < 0.001), led to a lower PaO2 (12.55 ± 3.67 mmHg, P < 0.001) and a higher lactataemia (4.64 ± 1.04 mmol.l−1, P = 0.026), without any behaviour indicative of distress. Nitrous oxide reduces the time of conscious exposure to gradually rising CO2 during euthanasia and hence may reduce the duration of any stress or distress to which mice are exposed during euthanasia.
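    The "gradual fill" protocol above delivers agent at a fixed fraction of the chamber volume per minute. Under a standard well-mixed wash-in assumption (an illustrative model, not one stated by the authors), the agent fraction in the chamber follows C(t) = 1 − e^(−rt), where r is the fill rate in chamber volumes per minute:

    ```python
    import math

    def chamber_concentration(fill_rate, t):
        """Well-mixed wash-in model: fraction of agent in the chamber after
        t minutes, with inflow at `fill_rate` chamber volumes per minute
        (e.g. 0.2 for the 20%/min CO2 fill used in gradual-fill protocols)."""
        return 1.0 - math.exp(-fill_rate * t)

    # At a 20%/min fill rate, the chamber reaches ~63% of the inflow
    # concentration after 5 minutes (one time constant, 1 - 1/e)
    print(round(chamber_concentration(0.2, 5.0), 3))  # → 0.632
    ```

    This is why "gradual fill" exposures rise smoothly rather than stepping to the final concentration: the instantaneous CO2 level a mouse experiences at loss of righting reflex depends on the fill rate and elapsed time, not just the inflow mixture.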

    The weekend effect on the provision of Emergency Surgery before and during the COVID-19 pandemic: case–control analysis of a retrospective multicentre database

    Introduction: The concept of a “weekend effect”, that is, substandard healthcare during weekends, has never been fully demonstrated, and the different outcomes of emergency surgical patients admitted during weekends may be due to different conditions at admission and/or different therapeutic approaches. The aim of this international audit was to identify any change in the pattern of emergency surgical admissions and treatments during weekends. Furthermore, we aimed to investigate the impact of the COVID-19 pandemic on the alleged “weekend effect”. Methods: The database of the CovidICE-International Study was interrogated, and 6263 patients were selected for analysis. Non-trauma patients aged 18 years or older admitted to 45 emergency surgery units in Europe in March–April 2019 and March–April 2020 were included. Demographic and clinical data were anonymised by the referring centre and centrally collected and analysed with a statistical package. This study was endorsed by the Association of Italian Hospital Surgeons (ACOI) and the World Society of Emergency Surgery (WSES). Results: Three-quarters of patients were admitted during workdays and only 25.7% during weekends. There was no difference in the distribution of gender, age, ASA class, or diagnosis between weekends and workdays. The first wave of the COVID-19 pandemic caused a one-third reduction in emergency surgical admissions during both workdays and weekends but did not change the relation between them. Treatment was more often surgical for patients admitted during weekends, with no difference between 2019 and 2020, and procedures were more often performed by open surgery. However, patients admitted during weekends had a threefold increased risk of laparoscopy-to-laparotomy conversion (1% vs. 3.4%). Hospital stay was longer in patients admitted during weekends, but those patients had a lower risk of readmission.
There was no difference in the rate of rescue surgery between weekends and workdays. Subgroup analysis revealed that interventional procedures for hot gallbladder were less frequently performed on patients admitted during weekends. Conclusions: Our analysis revealed that the demographic and clinical profiles of patients admitted during weekends do not differ significantly from those of workday admissions, but the therapeutic strategy may differ, probably owing to reduced availability of services and skill sets during weekends. The first wave of the COVID-19 pandemic did not affect this difference.