682 research outputs found

    Regulation of food intake by astrocytes in the brainstem dorsal vagal complex

    This is the final version. Available on open access from Wiley via the DOI in this record.
    A role for glial cells in brain circuits controlling feeding has begun to be identified, with hypothalamic astrocyte signaling implicated in regulating energy homeostasis. The nucleus of the solitary tract (NTS), within the brainstem dorsal vagal complex (DVC), integrates vagal afferent information from the viscera and plays a role in regulating food intake. We hypothesized that astrocytes in this nucleus respond to, and influence, food intake. Mice fed high-fat chow for 12 hr during the dark phase showed NTS astrocyte activation, reflected in an increase in the number (65%) and morphological complexity of glial fibrillary acidic protein (GFAP)-immunoreactive cells adjacent to the area postrema (AP), compared to control chow-fed mice. To measure the impact of astrocyte activation on food intake, we delivered designer receptors exclusively activated by designer drugs (DREADDs) to DVC astrocytes (encompassing NTS, AP, and dorsal motor nucleus of the vagus) using an adeno-associated viral (AAV) vector (AAV-GFAP-hM3Dq_mCherry). Chemogenetic activation with clozapine-N-oxide (0.3 mg/kg) produced greater morphological complexity in astrocytes and reduced dark-phase feeding by 84% at 4 hr postinjection compared with vehicle treatment. hM3Dq activation of DVC astrocytes also reduced refeeding after an overnight fast (71% lower at 4 hr postinjection) compared to AAV-GFAP-mCherry-expressing control mice. DREADD-mediated astrocyte activation did not impact locomotion. hM3Dq activation of DVC astrocytes induced c-FOS in neighboring neuronal feeding circuits (including in the parabrachial nucleus). This indicates that NTS astrocytes respond to acute nutritional excess, are involved in the integration of peripheral satiety signals, and can reduce food intake when activated. Funded by Diabetes UK and the Medical Research Council (MRC).

    Measuring vascular reactivity with breath-holds after stroke: a method to aid interpretation of group-level BOLD signal changes in longitudinal fMRI studies

    Blood oxygenation level dependent (BOLD) contrast fMRI is a widely used technique to map brain function and to monitor its recovery after stroke. Since stroke has a vascular etiology, the neurovascular coupling between cerebral blood flow and neural activity may be altered, resulting in uncertainties when interpreting longitudinal BOLD signal changes. The purpose of this study was to demonstrate the feasibility of using a recently validated breath-hold task in patients with stroke, both to assess group-level changes in cerebrovascular reactivity (CVR) and to determine whether alterations in regional CVR over time would adversely affect interpretation of task-related BOLD signal changes. Three methods of analyzing the breath-hold data were evaluated. The CVR measures were compared over healthy tissue, infarcted tissue, and the peri-infarct tissue, both sub-acutely (~two weeks) and chronically (~four months). In this cohort, a lack of CVR differences in healthy tissue between the patients and controls indicates that any group-level BOLD signal change observed in these regions over time is unlikely to be related to vascular alterations. CVR was reduced in the peri-infarct tissue but remained unchanged over time. Therefore, although a lack of activation in this region compared to the controls may be confounded by a reduced CVR, longitudinal group-level BOLD changes may be more confidently attributed to neural activity changes in this cohort. By including this breath-hold-based CVR assessment protocol in future studies of stroke recovery, researchers can be more assured that longitudinal changes in BOLD signal reflect true alterations in neural activity.

    Risk Factors for Acute Leukemia in Children: A Review

    Although overall incidence is rare, leukemia is the most common type of childhood cancer. It accounts for 30% of all cancers diagnosed in children younger than 15 years. Within this population, acute lymphocytic leukemia (ALL) occurs approximately five times more frequently than acute myelogenous leukemia (AML) and accounts for approximately 78% of all childhood leukemia diagnoses. Epidemiologic studies of acute leukemias in children have examined possible risk factors, including genetic, infectious, and environmental factors, in an attempt to determine etiology. Only one environmental risk factor (ionizing radiation) has been significantly linked to ALL or AML. Most environmental risk factors have been found to be weakly and inconsistently associated with either form of acute childhood leukemia. Our review focuses on the demographics of childhood leukemia and the risk factors that have been associated with the development of childhood ALL or AML. The environmental risk factors discussed include ionizing radiation, non-ionizing radiation, hydrocarbons, pesticides, alcohol use, cigarette smoking, and illicit drug use. Knowledge of these particular risk factors can be used to support measures to reduce potentially harmful exposures and decrease the risk of disease. We also review genetic and infectious risk factors and other variables, including maternal reproductive history and birth characteristics.

    Incidence and Prediction of Falls in Dementia: A Prospective Study in Older People

    Falls are a major cause of morbidity and mortality in dementia, but there have been no prospective studies of risk factors for falling specific to this patient population, and no successful falls intervention/prevention trials. This prospective study aimed to identify modifiable risk factors for falling in older people with mild to moderate dementia. A total of 179 participants aged over 65 years were recruited from outpatient clinics in the UK (38 Alzheimer's disease (AD), 32 vascular dementia (VAD), 30 dementia with Lewy bodies (DLB), 40 Parkinson's disease with dementia (PDD), 39 healthy controls). A multifactorial assessment of baseline risk factors was performed and fall diaries were completed prospectively for 12 months. Dementia participants experienced nearly 8 times more incident falls (9118/1000 person-years) than controls (1023/1000 person-years; incidence density ratio: 7.58, 3.11-18.5). In dementia, significant univariate predictors of sustaining at least one fall included diagnosis of a Lewy body disorder (proportional hazard ratio (HR) adjusted for age and sex: 3.33, 2.11-5.26) and history of falls in the preceding 12 months (HR: 2.52, 1.52-4.17). In multivariate analyses, significant potentially modifiable predictors were symptomatic orthostatic hypotension (HR: 2.13, 1.19-3.80), autonomic symptom score (HR per point 0-36: 1.055, 1.012-1.099), and Cornell depression score (HR per point 0-40: 1.053, 1.01-1.099). Higher levels of physical activity were protective (HR per point 0-9: 0.827, 0.716-0.956). The management of symptomatic orthostatic hypotension, autonomic symptoms and depression, and the encouragement of physical activity may provide the core elements for the most fruitful strategy to reduce falls in people with dementia. Randomised controlled trials to assess such a strategy are a priority.
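    The incidence figures above can be sanity-checked with one line of arithmetic: dividing the two reported rates gives the crude (unadjusted) incidence density ratio, which sits near the adjusted 7.58 reported in the abstract. A minimal sketch, using only the rates stated above (variable names are illustrative):

    ```python
    # Fall rates reported in the abstract, in falls per 1000 person-years.
    rate_dementia = 9118
    rate_controls = 1023

    # Crude incidence density ratio = ratio of the two incidence rates.
    # The abstract's 7.58 is adjusted, so the crude value differs somewhat.
    crude_idr = rate_dementia / rate_controls
    print(f"crude IDR: {crude_idr:.2f}")  # crude IDR: 8.91
    ```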

    Nut production in Bertholletia excelsa across a logged forest mosaic: implications for multiple forest use

    Although many examples of multiple-use forest management may be found in tropical smallholder systems, few studies provide empirical support for the integration of selective timber harvesting with non-timber forest product (NTFP) extraction. Brazil nut (Bertholletia excelsa, Lecythidaceae) is one of the world's most economically important NTFP species, extracted almost entirely from natural forests across the Amazon Basin. An obligate out-crosser, the Brazil nut tree is pollinated by large-bodied bees, a process resulting in a hard, round fruit that takes up to 14 months to mature. As many smallholders turn to the financial security provided by timber, Brazil nut fruits are increasingly being harvested in logged forests. We tested the influence of tree- and stand-level covariates (distance to nearest cut stump and local logging intensity) on total nut production at the individual tree level in five recently logged Brazil nut concessions covering about 4000 ha of forest in Madre de Dios, Peru. Our field team accompanied Brazil nut harvesters during the traditional harvest period (January-April 2012 and January-April 2013) in order to collect data on fruit production. Three hundred and ninety-nine (approximately 80%) of the 499 trees included in this study were at least 100 m from the nearest cut stump, suggesting that concessionaires avoid logging near adult Brazil nut trees. Yet even for those trees on the edge of logging gaps, distance to nearest cut stump and local logging intensity did not have a statistically significant influence on Brazil nut production at the applied logging intensities (typically 1-2 timber trees removed per ha). In one concession where at least 4 trees per ha were removed, however, the logging intensity covariate yielded a marginally significant P value (0.09), highlighting a potential risk of a drop in nut production at higher intensities.
    While we do not suggest that logging activities should be completely avoided in Brazil nut rich forests, low logging intensities should be implemented where a buffer zone cannot be observed. The sustainability of this integrated management system will ultimately depend on a complex series of socioeconomic and ecological interactions. Yet we submit that our study provides an important initial step in understanding the compatibility of timber harvesting with a high-value NTFP, potentially allowing for diversification of forest use strategies in Amazonian Peru.

    Retuning of Inferior Colliculus Neurons Following Spiral Ganglion Lesions: A Single-Neuron Model of Converging Inputs

    Lesions of spiral ganglion cells, representing a restricted sector of the auditory nerve array, produce immediate changes in the frequency tuning of inferior colliculus (IC) neurons. There is a loss of excitation at the lesion frequencies, yet responses to adjacent frequencies remain intact and new regions of activity appear. This leads to immediate changes in tuning and in tonotopic progression. Similar effects are seen after different methods of peripheral damage and in auditory neurons in other nuclei. The mechanisms that underlie these postlesion changes are unknown, but the acute effects seen in the IC strongly suggest the "unmasking" of latent inputs by the removal of inhibition. In this study, we explore computational models of single neurons with a convergence of excitatory and inhibitory inputs from a range of characteristic frequencies (CFs), which can simulate the narrow prelesion tuning of IC neurons and account for the changes in CF tuning after a lesion. The models can reproduce the data if inputs are aligned relative to one another in a precise order along the dendrites of model IC neurons. Frequency tuning in these neurons approximates that seen physiologically. Removal of inputs representing a narrow range of frequencies leads to unmasking of previously subthreshold excitatory inputs, which causes changes in CF. Conversely, if all of the inputs converge at the same point on the cell body, receptive fields are broad and unmasking rarely results in CF changes. However, if the inhibition is tonic with no stimulus-driven component, then unmasking can still produce changes in CF.

    Criteria for the selective use of chest computed tomography in blunt trauma patients

    Item does not contain full text.
    PURPOSE: The purpose of this study was to derive parameters that predict which high-energy blunt trauma patients should undergo computed tomography (CT) for detection of chest injury. METHODS: This observational study prospectively included consecutive patients (≥16 years old) who underwent multidetector CT of the chest after a high-energy mechanism of blunt trauma in one trauma centre. RESULTS: We included 1,047 patients (median age, 37; 70% male), of whom 508 had chest injuries identified by CT. Using logistic regression, we identified nine predictors of chest injury presence on CT (including age ≥55 years, abnormal chest physical examination, altered sensorium, abnormal thoracic spine physical examination, abnormal chest conventional radiography (CR), abnormal thoracic spine CR, abnormal pelvic CR or abdominal ultrasound, and abnormal base excess). Of all patients with ≥1 positive predictor, 484 had injury on CT (95% of all 508 patients with injury). Of all 192 patients with no positive predictor, 24 (13%) had chest injury, of whom 4 (2%) had injuries that were considered clinically relevant. CONCLUSION: Omission of CT in patients without any positive predictor could reduce imaging frequency by 18%, while most clinically relevant chest injuries remain adequately detected.
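    The decision-rule arithmetic in this abstract follows directly from the reported counts: sensitivity is the fraction of injured patients flagged by at least one predictor, and the imaging reduction is the fraction of patients with no positive predictor. A minimal sketch using only the counts stated above (variable names are illustrative, not from the study):

    ```python
    # Counts reported in the abstract.
    patients_total = 1047   # all included blunt trauma patients
    injured_total = 508     # patients with chest injury on CT
    injured_flagged = 484   # injured patients with >=1 positive predictor
    no_predictor = 192      # patients with no positive predictor

    sensitivity = injured_flagged / injured_total      # fraction of injuries caught
    imaging_reduction = no_predictor / patients_total  # CTs that could be omitted

    print(f"sensitivity: {sensitivity:.0%}")            # sensitivity: 95%
    print(f"imaging reduction: {imaging_reduction:.0%}")  # imaging reduction: 18%
    ```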

    Alpha thalassaemia-mental retardation, X linked

    X-linked alpha thalassaemia mental retardation (ATR-X) syndrome in males is associated with profound developmental delay, facial dysmorphism, genital abnormalities and alpha thalassaemia. Female carriers are usually physically and intellectually normal. So far, 168 patients have been reported. Language is usually very limited. Seizures occur in about one third of the cases. While many patients are affectionate with their caregivers, some exhibit autistic-like behaviour. Patients present with facial hypotonia and a characteristic mouth. Genital abnormalities are observed in 80% of children and range from undescended testes to ambiguous genitalia. Alpha thalassaemia is not always present. This syndrome is X-linked recessive and results from mutations in the ATRX gene, which encodes the widely expressed ATRX protein. ATRX mutations cause diverse changes in the pattern of DNA methylation at heterochromatic loci, but it is not yet known whether this is responsible for the clinical phenotype. The diagnosis can be established by detection of alpha thalassaemia, identification of ATRX gene mutations, ATRX protein studies and X-inactivation studies. Genetic counselling can be offered to families. Management is multidisciplinary: young children must be carefully monitored for gastro-oesophageal reflux, as it may cause death. A number of individuals with ATR-X are fit and well in their 30s and 40s.

    Changes in practice patterns affecting in-hospital and post-discharge survival among ACS patients

    BACKGROUND: Adherence to clinical practice guidelines for the treatment of specific illnesses may result in unexpected outcomes, given that multiple therapies must often be given to patients with diverse medical conditions. Yet few studies have presented empirical evidence that quality improvement (QI) programs both change practice by improving adherence to guidelines and improve patient outcomes under the conditions of actual practice. Thus, we focus on patient survival following hospitalization for acute coronary syndrome (ACS) in three successive patient cohorts from the same community hospitals, with a quality improvement intervention occurring between cohorts two and three. METHODS: This study is a comparison of three historical cohorts of ACS patients in the same five community hospitals in 1994-95, 1997, and 2002-03. A quality improvement project, the Guidelines Applied to Practice (GAP), was implemented in these hospitals in 2001. Study participants were recruited from community hospitals located in two Michigan communities during three separate time periods. The cohorts comprise (1) patients enrolled between December 1993 and April 1995 (N = 814), (2) patients enrolled between February 1997 and September 1997 (N = 452), and (3) patients enrolled between January 14, 2002 and April 13, 2003 (N = 710). Mortality data were obtained from Michigan's Bureau of Vital Statistics for all three patient cohorts. Predictor variables, obtained from medical record reviews, included demographic information, indicators of disease severity (ejection fraction), co-morbid conditions, hospital treatment information concerning most invasive procedures, and the use of ACE inhibitors, beta-blockers and aspirin in the hospital and as discharge recommendations. RESULTS: Adjusted in-hospital mortality showed a marked improvement, with an HR = 0.16 (p < 0.001) comparing 2003 patients in the same hospitals to those 10 years earlier.
    Large gains in in-hospital mortality were maintained based on 1-year mortality rates after hospital discharge. CONCLUSION: Changes in practice patterns that follow recommended guidelines can significantly improve care for ACS patients. In-hospital mortality gains were maintained in the year following discharge.