
    Cholinergic Interneurons Are Differentially Distributed in the Human Striatum

    BACKGROUND: The striatum (caudate nucleus, CN, and putamen, Put) is a group of subcortical nuclei involved in planning and executing voluntary movements as well as in cognitive processes. Its neuronal composition includes projection neurons, which connect the striatum with other structures, and interneurons, whose main roles are maintaining the striatal organization and regulating the projection neurons. The unique electrophysiological and functional properties of the cholinergic interneurons give them a crucial modulating function on the overall striatal response. METHODOLOGY/PRINCIPAL FINDINGS: This study used stereological methods to examine the volume and density (cells/mm³) of these interneurons, as visualized by choline acetyltransferase (ChAT) immunoreactivity, in the following territories of the CN and Put of nine normal human brains: 1) precommissural head; 2) postcommissural head; 3) body; 4) gyrus and 5) tail of the CN; 6) precommissural and 7) postcommissural Put. The distribution of ChAT interneurons was analyzed with respect to the topographical, functional and chemical territories of the dorsal striatum. The CN was more densely populated by cholinergic neurons than the Put, and their density increased along the anteroposterior axis of the striatum, with the CN body having the highest neuronal density. The associative territory of the dorsal striatum was by far the most densely populated. The striosomes of the CN precommissural head and the postcommissural Put contained the greatest number of ChAT-ir interneurons. The intrastriosomal ChAT-ir neurons were abundant on the periphery of the striosomes throughout the striatum. CONCLUSIONS/SIGNIFICANCE: These data reveal that cholinergic interneurons are differentially distributed across the distinct topographical and functional territories of the human dorsal striatum, as well as across its chemical compartments.
This heterogeneity may indicate that the posterior aspects of the CN require a special integration of information by interneurons. Interestingly, these striatal regions have been largely neglected in functional studies.

    Key epidemiological drivers and impact of interventions in the 2020 SARS-CoV-2 epidemic in England.

    We fitted a model of SARS-CoV-2 transmission in care homes and the community to regional surveillance data for England. Compared with other approaches, our model provides a synthesis of multiple surveillance data streams into a single coherent modeling framework, allowing transmission and severity to be disentangled from features of the surveillance system. Of the control measures implemented, only national lockdown brought the effective reproduction number (Rt,eff) below 1 consistently; if introduced 1 week earlier, it could have reduced deaths in the first wave from an estimated 48,600 to 25,600 [95% credible interval (CrI): 15,900 to 38,400]. The infection fatality ratio decreased from 1.00% (95% CrI: 0.85 to 1.21%) to 0.79% (95% CrI: 0.63 to 0.99%), suggesting improved clinical care. The infection fatality ratio was higher in the elderly residing in care homes (23.3%, 95% CrI: 14.7 to 35.2%) than in those residing in the community (7.9%, 95% CrI: 5.9 to 10.3%). On 2 December 2020, England was still far from herd immunity, with regional cumulative infection incidence between 7.6% (95% CrI: 5.4 to 10.2%) and 22.3% (95% CrI: 19.4 to 25.4%) of the population. Therefore, any vaccination campaign will need to achieve high coverage and a high degree of protection in vaccinated individuals to allow nonpharmaceutical interventions to be lifted without a resurgence of transmission.
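The "far from herd immunity" statement can be related to the classic well-mixed herd-immunity threshold 1 − 1/R0. A minimal sketch of this standard calculation (the R0 value used here is hypothetical for illustration, not an estimate from the study):

```python
# Classic herd-immunity threshold under homogeneous mixing: p_c = 1 - 1/R0.
# R0 = 3.0 below is a hypothetical illustrative value, not a study estimate.
def herd_immunity_threshold(r0):
    """Fraction of the population that must be immune for transmission to decline."""
    if r0 <= 1.0:
        return 0.0  # epidemic cannot sustain itself; no threshold required
    return 1.0 - 1.0 / r0

# Even the highest regional cumulative incidence reported (22.3%) is well
# below the threshold implied by, e.g., R0 = 3.
print(round(herd_immunity_threshold(3.0), 3))  # -> 0.667
```

This homogeneous-mixing formula ignores age structure and contact heterogeneity, so it overstates precision, but it makes clear why 7.6–22.3% cumulative incidence leaves a wide gap to any plausible threshold.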

    Importance of patient bed pathways and length of stay differences in predicting COVID-19 hospital bed occupancy in England.

    Background: Predicting bed occupancy for hospitalised patients with COVID-19 requires understanding of length of stay (LoS) in particular bed types. LoS can vary depending on the patient’s “bed pathway” - the sequence of transfers of individual patients between bed types during a hospital stay. In this study, we characterise these pathways, and their impact on predicted hospital bed occupancy. Methods: We obtained data from University College Hospital (UCH) and the ISARIC4C COVID-19 Clinical Information Network (CO-CIN) on hospitalised patients with COVID-19 who required care in general ward or critical care (CC) beds to determine possible bed pathways and LoS. We developed a discrete-time model to examine the implications of using either bed pathways or only average LoS by bed type to forecast bed occupancy. We compared model-predicted bed occupancy to publicly available bed occupancy data on COVID-19 in England between March and August 2020. Results: In both the UCH and CO-CIN datasets, 82% of hospitalised patients with COVID-19 only received care in general ward beds. We identified four other bed pathways, present in both datasets: “Ward, CC, Ward”, “Ward, CC”, “CC” and “CC, Ward”. Mean LoS varied by bed type, pathway, and dataset, between 1.78 and 13.53 days. For UCH, we found that using bed pathways improved the accuracy of bed occupancy predictions, while only using an average LoS for each bed type underestimated true bed occupancy. However, using the CO-CIN LoS dataset we were not able to replicate past data on bed occupancy in England, suggesting regional LoS heterogeneities. Conclusions: We identified five bed pathways, with substantial variation in LoS by bed type, pathway, and geography. This might be caused by local differences in patient characteristics, clinical care strategies, or resource availability, and suggests that national LoS averages may not be appropriate for local forecasts of bed occupancy for COVID-19. 
Trial registration: The ISARIC WHO CCP-UK study ISRCTN66726260 was retrospectively registered on 21/04/2020 and designated an Urgent Public Health Research Study by NIHR.

    Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

    Background The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89.6 per cent) than in countries with a middle (753 of 1242, 60.6 per cent; odds ratio (OR) 0.17, 95 per cent c.i. 0.14 to 0.21, P < 0.001) or low (363 of 860, 42.2 per cent; OR 0.08, 0.07 to 0.10, P < 0.001) HDI. Checklist use was less common for emergency laparotomy than for elective surgery in high-HDI countries (risk difference -9.4 (95 per cent c.i. -11.9 to -6.9) per cent; P < 0.001), but the relationship was reversed in low-HDI countries (+12.1 (+7.0 to +17.3) per cent; P < 0.001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0.60, 0.50 to 0.73; P < 0.001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.

    First measurement of the Z→μ⁺μ⁻ angular coefficients in the forward region of pp collisions at √s = 13 TeV

    The first study of the angular distribution of μ⁺μ⁻ pairs produced in the forward rapidity region via the Drell-Yan reaction pp → γ*/Z + X → ℓ⁺ℓ⁻ + X is presented, using data collected with the LHCb detector at a center-of-mass energy of 13 TeV, corresponding to an integrated luminosity of 5.1 fb⁻¹. The coefficients of the five leading terms in the angular distribution are determined as a function of the dimuon transverse momentum and rapidity. The results are compared to various theoretical predictions of the Z-boson production mechanism and can also be used to probe transverse-momentum-dependent parton distributions within the proton.
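For context, the dilepton angular distribution in Drell-Yan production is conventionally expanded in harmonic terms (usually in the Collins-Soper frame), and the "five leading terms" customarily correspond to the coefficients A0-A4. This is the standard textbook decomposition, not a formula quoted from the paper itself:

```latex
\frac{d\sigma}{d\cos\theta\,d\phi} \propto
  (1 + \cos^{2}\theta)
  + \tfrac{1}{2} A_{0}\,(1 - 3\cos^{2}\theta)
  + A_{1}\,\sin 2\theta \cos\phi
  + \tfrac{1}{2} A_{2}\,\sin^{2}\theta \cos 2\phi
  + A_{3}\,\sin\theta \cos\phi
  + A_{4}\,\cos\theta
  + \dots
```

Here θ and φ are the polar and azimuthal angles of the lepton in the chosen dilepton rest frame; A4, for example, is driven by the forward-backward asymmetry from γ*/Z interference.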

    Global variation in anastomosis and end colostomy formation following left-sided colorectal resection

    Background End colostomy rates following colorectal resection vary across institutions in high-income settings, being influenced by patient, disease, surgeon and system factors. This study aimed to assess global variation in end colostomy rates after left-sided colorectal resection. Methods This study comprised an analysis of GlobalSurg-1 and -2 international, prospective, observational cohort studies (2014, 2016), including consecutive adult patients undergoing elective or emergency left-sided colorectal resection within discrete 2-week windows. Countries were grouped into high-, middle- and low-income tertiles according to the United Nations Human Development Index (HDI). Factors associated with colostomy formation versus primary anastomosis were explored using a multilevel, multivariable logistic regression model. Results In total, 1635 patients from 242 hospitals in 57 countries undergoing left-sided colorectal resection were included: 113 (6·9 per cent) from low-HDI, 254 (15·5 per cent) from middle-HDI and 1268 (77·6 per cent) from high-HDI countries. There was a higher proportion of patients with perforated disease (57·5, 40·9 and 35·4 per cent; P < 0·001) and subsequent use of end colostomy (52·2, 24·8 and 18·9 per cent; P < 0·001) in low- compared with middle- and high-HDI settings. The association with colostomy use in low-HDI settings persisted (odds ratio (OR) 3·20, 95 per cent c.i. 1·35 to 7·57; P = 0·008) after risk adjustment for malignant disease (OR 2·34, 1·65 to 3·32; P < 0·001), emergency surgery (OR 4·08, 2·73 to 6·10; P < 0·001), time to operation at least 48 h (OR 1·99, 1·28 to 3·09; P = 0·002) and disease perforation (OR 4·00, 2·81 to 5·69; P < 0·001). Conclusion Global differences existed in the proportion of patients receiving end stomas after left-sided colorectal resection based on income, which went beyond case mix alone

    Liver graft washout is determinant for the efficient prevention of reperfusion injury: A role for the cytoskeleton and glycocalyx preservation

    Background: Hepatic ischemia-reperfusion injury (I/R) contributes to initial poor function or primary non-function after transplantation, as a consequence of cold preservation, rewarming and reperfusion. We evaluated the benefits of a new rinse solution against I/R. Experimental: Sprague-Dawley rats (180-200 g; n = 6 per group) were classified as follows: Group 1 (controls) = livers preserved in UW solution (24 hours; 4 °C) were flushed with Ringer lactate solution (RLS) and subjected to 2 h of reperfusion at 37 °C in an ex-vivo model; Group 2 = same as 1, but the liver grafts were flushed with the new rinse solution composed of CaCl2·2H2O, KH2PO4, NaH2PO4, MgSO4·7H2O, lactobionate and raffinose at pH 7.4; Group 3 = same as 2, but with polyethylene glycol-35 (PEG-35) added at 1 g/L; Group 4 = same as 3, but with PEG-35 at 5 g/L. Determinations: liver injury (AST/ALT); mitochondrial damage (GLDH); function (bile output and vascular resistance); oxidative stress (MDA); nitric oxide; and degradation of the glycocalyx (syndecan, hyaluronic acid) and cytoskeleton (filamentous and globular actin fractions) were assessed. Metalloproteinases (MMP2 and MMP9) were also determined, and actin confocal microscopy studies were carried out. Results: PEG-35 at 1 g/L and 5 g/L reduced AST/ALT levels and ameliorated liver function compared with Groups 1 and 2. This was accompanied by decreases in GLDH and MDA and increases in eNOS activation. Interestingly, the PEG-35 benefits were closely associated with the inhibition of MMP9 and the stabilization of the actin filament fraction (western blot and confocal microscopy). The glycocalyx was also protected (low syndecan-1 and hyaluronic acid levels). Conclusions: The most effective liver graft protection was observed with PEG-35 at 5 g/L ("graft post-conditioning"), which reinforced the stabilization of the liver graft cytoskeleton and glycocalyx.

    Markers of Early Renal Changes Induced by Industrial Pollutants. 1. Application to Workers Exposed to Mercury Vapour

    Several markers of renal changes were measured in a cohort of 50 workers exposed to elemental mercury (Hg) and in 50 control workers. After application of selection criteria, 44 exposed and 49 control workers were retained for the final statistical analysis. Exposed workers excreted on average 22 μg Hg/g creatinine, and their mean duration of exposure was 11 years. Three types of renal markers were studied, namely: functional markers (creatinine and β2-microglobulin in serum, urinary proteins of low or high molecular weight); cytotoxicity markers (tubular antigens and enzymes in urine); and biochemical markers (eicosanoids, thromboxane, fibronectin, kallikrein, sialic acid and glycosaminoglycans in urine, red blood cell membrane negative charges). Several blood-borne indicators of polyclonal activation were also measured to test the hypothesis that an immune mechanism might be involved in the renal toxicity of elemental Hg. The main renal changes associated with exposure to Hg were indicative of tubular cytotoxicity (increased leakage of tubular antigens and enzymes into urine) and biochemical alterations (decreased urinary excretion of some eicosanoids and glycosaminoglycans, and lowering of urinary pH). The concentrations of anti-DNA antibodies and total immunoglobulin E in serum were also positively associated with the concentration of Hg in urine and in blood, respectively. The renal effects were mainly found in workers excreting more than 50 μg Hg/g creatinine, which corroborates our previous estimate of the biological threshold of Hg in urine. However, as these effects were unrelated to the duration of exposure and not accompanied by functional changes (for example, microproteinuria), they may not necessarily represent clinically significant alterations of renal function.