Control of Leukocyte Trafficking by Stress-Associated Hormones
Leukocyte migration is a crucial process in both homeostatic and inflammatory conditions. The spatiotemporal distribution of immune cells is balanced between processes of cellular mobilization into the bloodstream, their adhesion to vascular beds and trafficking into tissues. Systemic regulation of leukocyte mobility is achieved by different signals including neuronal and hormonal cues, of which the catecholamines and glucocorticoids have been most extensively studied. These hormones are often associated with a stress response; however, they also regulate immune cell trafficking in the steady state, with effects dependent upon cell type, location, time of day, concentration, and duration of signal. Systemic administration of catecholamines, such as the sympathetic neurotransmitters adrenaline and noradrenaline, increases neutrophil numbers in the bloodstream but has different effects on other leukocyte populations. In contrast, local, endogenous sympathetic tone has been shown to be crucial for dynamic daily changes in adhesion molecule expression in the bone marrow and skeletal muscle, acting as a key signal to the endothelium and stromal cells to regulate immune cell trafficking. Conversely, glucocorticoids are often reported as anti-inflammatory, although recent data show a more complex role, particularly under steady-state conditions. Endogenous changes in circulating glucocorticoid concentration induce redistribution of cells and potentiate inflammatory responses, and in many paradigms glucocorticoid action is strongly influenced by time of day. In this review, we discuss the current knowledge of catecholamine and glucocorticoid regulation of leukocyte migration under homeostatic and stimulated conditions.
Is late-life dependency increasing or not? A comparison of the Cognitive Function and Ageing Studies (CFAS)
Background: Little is known about how dependency levels have changed between generational cohorts of older people. We estimated years lived in different care states at age 65 in 1991 and 2011 and made new projections of future demand for care. Methods: Two population-based studies of older people in defined geographical areas conducted two decades apart (the Cognitive Function and Ageing Studies) provided prevalence estimates of dependency in four states: high (24-hour care); medium (daily care); low (less than daily); independent. Years in each dependency state were calculated by Sullivan's method. To project future demand, the proportions in each dependency state (by age group and sex) were applied to the 2014 England population projections. Findings: Between 1991 and 2011 there were significant increases in years lived from age 65 with low (men: 1·7 years, 95% CI 1·0-2·4; women: 2·4 years, 95% CI 1·8-3·1) and high dependency (men: 0·9 years, 95% CI 0·2-1·7; women: 1·3 years, 95% CI 0·5-2·1). The majority of men's extra years of life were independent (36%) or with low dependency (36%), whilst for women the majority were spent with low dependency (58%), with only 5% being independent. There were substantial reductions in the proportions with medium and high dependency who lived in care homes, although, if these dependency and care home proportions remain constant in the future, further population ageing will require an extra 71,000 care home places by 2025. Interpretation: On average, older men now spend 2·4 years and women 3·0 years with substantial care needs (medium or high dependency), and most will live in the community. These findings have considerable implications for older people's families, who provide the majority of unpaid care, but they also supply valuable new information for governments and care providers planning the resources and funding required for the care of their future ageing populations.
Immunologic Profiling of the Atlantic Salmon Gill by Single Nuclei Transcriptomics
ACKNOWLEDGMENTS The authors thank all of the animal staff at Kårvik havbruksstasjonen for their expert care of the research animals, and the University of Manchester Genomics Technology core facility (UK) for performing Chromium 10x library preparation for snRNAseq. We also thank the reviewers for their constructive comments on the original manuscript. FUNDING AW is supported by the Tromsø forskningsstiftelse (TFS) grant awarded to DH (TFS2016DH). The Sentinel North Transdisciplinary Research Program of Université Laval and UiT, awarded to DH, supports this work. SW is supported by a Tromsø forskningsstiftelse (TFS) starter grant (TFS2016SW). Experimental costs were covered by HFSP grant "Evolution of seasonal timers" (RGP0030/2015) awarded to AL and DH. Storage resources were provided by the Norwegian National Infrastructure for Research Data (NIRD, project NS9055K).
Decreased Erythrocyte CCS Content is a Biomarker of Copper Overload in Rats
Copper (Cu) is an essential trace metal that is toxic in excess. It is therefore important to be able to accurately assess Cu deficiency or overload. Cu chaperone for Cu/Zn superoxide dismutase (CCS) protein expression is elevated in tissues of Cu-deficient animals. Increased CCS content in erythrocytes is particularly sensitive to decreased Cu status. Given the lack of a non-invasive, sensitive and specific biomarker for the assessment of Cu excess, we investigated whether CCS expression in erythrocytes reflects Cu overload. Rats were fed diets containing normal or high levels of Cu for 13 weeks. Diets contained 6.3 ± 0.6 (Cu-N), 985 ± 14 (Cu-1000) or 1944 ± 19 (Cu-2000) mg Cu/kg diet. Rats showed a variable response to the high Cu diets. Some rats showed severe Cu toxicity, while other rats showed no visible signs of toxicity and grew normally. Also, some rats had high levels of Cu in liver, whereas others had liver Cu concentrations within the normal range. Erythrocyte CCS protein expression was 30% lower in Cu-2000 rats compared to Cu-N rats (P < 0.05). Notably, only rats that accumulated high levels of Cu in liver had lower erythrocyte CCS (47% reduction, P < 0.05) compared to rats fed normal levels of Cu. Together, these data indicate that decreased erythrocyte CCS content is associated with Cu overload in rats and should be evaluated further as a potential biomarker for assessing Cu excess in humans
Is late-life dependency increasing or not? A comparison of the Cognitive Function and Ageing Studies (CFAS)
Background: Little is known about how the proportions of dependency states have changed between generational cohorts of older people. We aimed to estimate years lived in different dependency states at age 65 years in 1991 and 2011, and new projections of future demand for care. Methods: In this population-based study, we compared two Cognitive Function and Ageing Studies (CFAS I and CFAS II) of older people (aged ≥65 years) who were permanently registered with a general practice in three defined geographical areas (Cambridgeshire, Newcastle, and Nottingham; UK). These studies were done two decades apart (1991 and 2011). General practices provided lists of individuals to be contacted and were asked to exclude those who had died or might die over the next month. Baseline interviews were done in the community and care homes. Participants were stratified by age, and interviews occurred only after written informed consent was obtained. Information collected included basic sociodemographics, cognitive status, urinary incontinence, and self-reported ability to do activities of daily living. CFAS I was assigned as the 1991 cohort and CFAS II as the 2011 cohort, and both studies provided prevalence estimates of dependency in four states: high dependency (24-h care), medium dependency (daily care), low dependency (less than daily), and independent. Years in each dependency state were calculated by Sullivan's method. To project future demands for social care, the proportions in each dependency state (by age group and sex) were applied to the 2014 England population projections. Findings: Between 1991 and 2011, there were significant increases in years lived from age 65 years with low dependency (1·7 years [95% CI 1·0-2·4] for men and 2·4 years [1·8-3·1] for women) and increases with high dependency (0·9 years [0·2-1·7] for men and 1·3 years [0·5-2·1] for women). 
The majority of men's extra years of life were spent independent (36·3%) or with low dependency (36·3%) whereas for women the majority were spent with low dependency (58·0%), and only 4·8% were independent. There were substantial reductions in the proportions with medium and high dependency who lived in care homes, although, if these dependency and care home proportions remain constant in the future, further population ageing will require an extra 71 215 care home places by 2025. Interpretation: On average older men now spend 2·4 years and women 3·0 years with substantial care needs, and most will live in the community. These findings have considerable implications for families of older people who provide the majority of unpaid care, but the findings also provide valuable new information for governments and care providers planning the resources and funding required for the care of their future ageing populations. Funding: Medical Research Council (G9901400) and (G06010220), with support from the National Institute for Health Research Comprehensive Local research networks in West Anglia and Trent, UK, and Neurodegenerative Disease Research Network in Newcastle, UK
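Sullivan's method, named in the abstract above, combines life-table person-years with cross-sectional prevalence to give expected years lived in each dependency state. A minimal sketch in Python, using illustrative made-up life-table numbers rather than CFAS data:

```python
def sullivan_years(person_years, prevalence, survivors_at_65):
    """Expected years lived in one dependency state from age 65.

    person_years    -- life-table person-years lived in each age interval (nLx)
    prevalence      -- proportion observed in the state in each interval (pi_x)
    survivors_at_65 -- life-table survivors at exact age 65 (l65)
    """
    # Weight each interval's person-years by the state's prevalence,
    # then express the total per survivor reaching age 65.
    state_years = sum(L * p for L, p in zip(person_years, prevalence))
    return state_years / survivors_at_65

# Illustrative (hypothetical) inputs: three broad age bands from age 65.
nLx = [900_000, 600_000, 200_000]   # person-years lived in each band
pi = [0.05, 0.15, 0.40]             # prevalence of high dependency per band
l65 = 100_000                       # survivors at exact age 65

print(round(sullivan_years(nLx, pi, l65), 2))  # → 2.15 years
```

The method's key assumption is that current cross-sectional prevalence applies to the cohort as it ages, which is why the study repeats the calculation for the 1991 and 2011 cohorts separately.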
REVERBa couples the circadian clock to hepatic glucocorticoid action.
The glucocorticoid receptor (GR) is a major drug target in inflammatory disease. However, chronic glucocorticoid (GC) treatment leads to disordered energy metabolism, including increased weight gain, adiposity, and hepatosteatosis - all programs modulated by the circadian clock. We demonstrated that while anti-inflammatory GC actions were maintained irrespective of dosing time, the liver was significantly more GC sensitive during the day. Temporal segregation of GC action was underpinned by a physical interaction of GR with the circadian transcription factor REVERBa and co-binding with liver-specific hepatocyte nuclear transcription factors (HNFs) on chromatin. REVERBa promoted efficient GR recruitment to chromatin during the day, acting in part by maintaining histone acetylation, with REVERBa-dependent GC responses providing segregation of carbohydrate and lipid metabolism. Importantly, deletion of Reverba inverted circadian liver GC sensitivity and protected mice from hepatosteatosis induced by chronic GC administration. Our results reveal a mechanism by which the circadian clock acts through REVERBa in the liver, on elements bound by HNF4A/HNF6, to direct GR action on energy metabolism.
Changing prevalence and treatment of depression among older people over two decades
Background: Depression is a leading cause of disability, with older people particularly susceptible to poor outcomes. Aims: To investigate whether the prevalence of depression and antidepressant use have changed across two decades in older people. Method: The Cognitive Function and Ageing Studies (CFAS I and CFAS II) are two English population-based cohort studies of older people aged ≥65 years, with baseline measurements for each cohort conducted two decades apart (between 1990 and 1993 and between 2008 and 2011). Depression was assessed by the Geriatric Mental State examination and diagnosed with the Automated Geriatric Examination for Computer-Assisted Taxonomy algorithm. Results: In CFAS I, 7635 people aged ≥65 years were interviewed, of whom 1457 were diagnostically assessed. In CFAS II, 7762 people were interviewed and diagnostically assessed. Age-standardised depression prevalence in CFAS II was 6.8% (95% CI 6.3-7.5%), representing a non-significant decline from CFAS I (risk ratio 0.82, 95% CI 0.64-1.07, P = 0.14). At the time of CFAS II, 10.7% of the population (95% CI 10.0-11.5%) were taking antidepressant medication, more than twice the proportion in CFAS I (risk ratio 2.79, 95% CI 1.96-3.97, P < 0.0001). Among care home residents, depression prevalence was unchanged, but the use of antidepressants increased from 7.4% (95% CI 3.8-13.8%) to 29.2% (95% CI 22.6-36.7%). Conclusions: A substantial increase in the proportion of the population reporting taking antidepressant medication is seen across two decades for people aged ≥65 years. However, there was no evidence for a change in age-specific prevalence of depression.
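The prevalence comparisons above are summarised as risk ratios with 95% confidence intervals. A minimal sketch of one standard approach, the Katz log-normal interval (the study's actual method may differ), using hypothetical counts:

```python
import math

def risk_ratio_ci(a, n1, b, n2, z=1.96):
    """Risk ratio of group 1 vs group 2 with an approximate 95% CI.

    a/n1 and b/n2 are cases/total in each group; z=1.96 gives 95% coverage.
    Uses the Katz log-normal method: CI is symmetric on the log scale.
    """
    rr = (a / n1) / (b / n2)
    # Standard error of log(RR) under the Katz approximation.
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical example: 80/1000 cases in cohort II vs 100/1000 in cohort I.
rr, lo, hi = risk_ratio_ci(80, 1000, 100, 1000)
print(round(rr, 2), round(lo, 2), round(hi, 2))
```

A CI spanning 1.0, as in the depression comparison above (0.64-1.07), is what makes the observed decline non-significant.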
Finishing the euchromatic sequence of the human genome
The sequence of the human genome encodes the genetic instructions for human physiology, as well as rich information about human evolution. In 2001, the International Human Genome Sequencing Consortium reported a draft sequence of the euchromatic portion of the human genome. Since then, the international collaboration has worked to convert this draft into a genome sequence with high accuracy and nearly complete coverage. Here, we report the result of this finishing process. The current genome sequence (Build 35) contains 2.85 billion nucleotides interrupted by only 341 gaps. It covers ∼99% of the euchromatic genome and is accurate to an error rate of ∼1 event per 100,000 bases. Many of the remaining euchromatic gaps are associated with segmental duplications and will require focused work with new methods. The near-complete sequence, the first for a vertebrate, greatly improves the precision of biological analyses of the human genome including studies of gene number, birth and death. Notably, the human genome seems to encode only 20,000-25,000 protein-coding genes. The genome sequence reported here should serve as a firm foundation for biomedical research in the decades ahead.
The Changing Landscape for Stroke Prevention in AF: Findings From the GLORIA-AF Registry Phase 2
Background GLORIA-AF (Global Registry on Long-Term Oral Antithrombotic Treatment in Patients with Atrial Fibrillation) is a prospective, global registry program describing antithrombotic treatment patterns in patients with newly diagnosed nonvalvular atrial fibrillation at risk of stroke. Phase 2 began when dabigatran, the first non-vitamin K antagonist oral anticoagulant (NOAC), became available. Objectives This study sought to describe phase 2 baseline data and compare these with the pre-NOAC era collected during phase 1. Methods During phase 2, 15,641 consenting patients were enrolled (November 2011 to December 2014); 15,092 were eligible. This pre-specified cross-sectional analysis describes eligible patients' baseline characteristics. Atrial fibrillation disease characteristics, medical outcomes, and concomitant diseases and medications were collected. Data were analyzed using descriptive statistics. Results Of the total patients, 45.5% were female; median age was 71 (interquartile range: 64, 78) years. Patients were from Europe (47.1%), North America (22.5%), Asia (20.3%), Latin America (6.0%), and the Middle East/Africa (4.0%). Most had high stroke risk (CHA2DS2-VASc [Congestive heart failure, Hypertension, Age ≥75 years, Diabetes mellitus, previous Stroke, Vascular disease, Age 65 to 74 years, Sex category] score ≥2; 86.1%); 13.9% had moderate risk (CHA2DS2-VASc = 1). Overall, 79.9% received oral anticoagulants, of whom 47.6% received NOAC and 32.3% vitamin K antagonists (VKA); 12.1% received antiplatelet agents; 7.8% received no antithrombotic treatment. For comparison, the proportion of phase 1 patients (of N = 1,063 all eligible) prescribed VKA was 32.8%, acetylsalicylic acid 41.7%, and no therapy 20.2%. In Europe in phase 2, treatment with NOAC was more common than VKA (52.3% and 37.8%, respectively); 6.0% of patients received antiplatelet treatment; and 3.8% received no antithrombotic treatment.
In North America, 52.1%, 26.2%, and 14.0% of patients received NOAC, VKA, and antiplatelet drugs, respectively; 7.5% received no antithrombotic treatment. NOAC use was less common in Asia (27.7%), where 27.5% of patients received VKA, 25.0% antiplatelet drugs, and 19.8% no antithrombotic treatment. Conclusions The baseline data from GLORIA-AF phase 2 demonstrate that in newly diagnosed nonvalvular atrial fibrillation patients, NOAC have been highly adopted into practice, becoming more frequently prescribed than VKA in Europe and North America. Worldwide, however, a large proportion of patients remain undertreated, particularly in Asia and North America. (Global Registry on Long-Term Oral Antithrombotic Treatment in Patients With Atrial Fibrillation [GLORIA-AF]; NCT01468701)
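The CHA2DS2-VASc score used above to stratify stroke risk is a published additive scoring rule. A minimal sketch of the standard scoring (not GLORIA-AF's own code):

```python
def cha2ds2_vasc(age, female, chf, hypertension, diabetes,
                 stroke_tia, vascular_disease):
    """Return the CHA2DS2-VASc stroke-risk score (0-9).

    Two-point items: age >= 75 (A2) and prior stroke/TIA (S2);
    all other risk factors contribute one point each.
    """
    score = 0
    score += 2 if age >= 75 else (1 if age >= 65 else 0)  # A2 / A (65-74)
    score += 1 if chf else 0               # C: congestive heart failure
    score += 1 if hypertension else 0      # H: hypertension
    score += 1 if diabetes else 0          # D: diabetes mellitus
    score += 2 if stroke_tia else 0        # S2: previous stroke/TIA
    score += 1 if vascular_disease else 0  # V: vascular disease
    score += 1 if female else 0            # Sc: sex category (female)
    return score

# A 71-year-old woman with hypertension: 1 (age 65-74) + 1 (H) + 1 (Sc).
print(cha2ds2_vasc(71, True, False, True, False, False, False))  # → 3
```

Under this rule, the registry's "high risk" stratum (score ≥2) includes, for example, every woman aged 65 or older, which helps explain why 86.1% of the cohort qualified.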