Behavioral Responses to Combinations of Timed Light, Food Availability, and Ultradian Rhythms in the Common Vole (Microtus arvalis)
Light is the main entraining signal of the central circadian clock, which drives circadian organization of activity. When food is made available during only certain parts of the day, it can entrain the clock in the liver without changing the phase of the central circadian clock. Although a hallmark of food entrainment is behavioral anticipation of food availability, the extent of behavioral alterations in response to food availability has not been fully characterized. The authors investigated interactions between light and temporal food availability in the timing of activity in the common vole. Temporally restricted food availability enhanced re-entrainment to a phase advance in light entrainment when it was shifted together with the light, and attenuated re-entrainment when it remained at the same time of day. When light-entrained behavior was challenged with temporal food availability cycles of a different period, two distinct activity components were observed. Moreover, the present data indicate that in the presence of food and light cycles of different period lengths, an activity component emerged that appeared to be driven by a free-running (light-entrainable) clock. Because the authors have previously shown that in the common vole altering activity through running-wheel availability can alter the effectiveness of food availability in entraining the clock in the liver, they included running-wheel availability as a parameter that alters the circadian/ultradian balance in activity. In the current protocols, running-wheel availability enhanced the entraining potential of both light and food availability, in a differential way. The data presented here show that in the vole activity is a complex of individually driven components and that this activity is, itself, an important modulator of the effectiveness of entraining signals such as light and food.
SCN-AVP release of mPer1/mPer2 double-mutant mice in vitro
Background: Circadian organisation of behavioural and physiological rhythms in mammals is largely driven by the clock in the suprachiasmatic nuclei (SCN) of the hypothalamus. In this clock, a molecular transcriptional repression and activation mechanism generates near-24-hour rhythms. One of the outputs of the molecular clock in specific SCN neurons is arginine-vasopressin (AVP), which is responsive to transcriptional activation by clock gene products. As negative regulators, the protein products of the period genes are thought to repress transcriptional activity of the positive limb after heterodimerisation with CRYPTOCHROME. When both the Per1 and Per2 genes are rendered dysfunctional by targeted deletion of the PAS heterodimer binding domain, mice lose circadian organisation of behaviour upon release into constant environmental conditions. To what degree the period genes are involved in the control of AVP output is unknown.
Methods: Using an in vitro slice culture setup, SCN-AVP release of cultures made of 10 wildtype and 9 Per1/2 double-mutant mice was assayed. Mice were sacrificed in either the early light phase of the light-dark cycle, or in the early subjective day on the first day of constant dark.
Results: Here we report that in arrhythmic homozygous Per1/2 double-mutant mice there is still a diurnal peak in in vitro AVP release from the SCN similar to that of wildtypes but distinctly different from the release pattern from the paraventricular nucleus. Such a modulation of AVP release is unexpected in mice in which the circadian clockwork is thought to be disrupted.
Conclusion: Our results suggest that the circadian clock in these animals, although deficient in (most) behavioural and molecular rhythms, may still be (partially) functional, possibly as an hourglass mechanism. The level of perturbation of the clock in Per1/2 double mutants may therefore be less than was originally thought.
Reduced glucose concentration enhances ultradian rhythms in Pdcd5 promoter activity in vitro
Intrinsically driven ultradian rhythms in the hourly range are often co-expressed with circadian rhythms in various physiological processes, including metabolic processes such as feeding behaviour, gene expression and cellular metabolism. Several behavioural observations show that reduced energy intake or increased energy expenditure leads to a re-balancing of ultradian and circadian timing, favouring ultradian feeding and activity patterns when energy availability is limited. This suggests a close link between ultradian rhythmicity and metabolic homeostasis, but we currently lack models to test this hypothesis at a cellular level. We therefore transduced 3T3-L1 pre-adipocyte cells with a reporter construct that drives a destabilised luciferase via the Pdcd5 promoter, a gene we previously showed to exhibit robust ultradian rhythms in vitro. Ultradian rhythmicity in Pdcd5 promoter-driven bioluminescence was observed in >80% of all cultures that were synchronised with dexamethasone, whereas significantly fewer non-synchronised cultures exhibited ultradian rhythmicity (∼11%). Cosine fits to ultradian bioluminescence rhythms of cells cultured and measured in low glucose concentrations (2 mM and 5 mM) yielded significantly higher amplitudes than in all other cultures, and a shorter period (6.9 h vs. 8.2 h, N = 12). Our findings show substantial ultradian rhythmicity in Pdcd5 promoter activity in cells in which the circadian clocks have been synchronised in vitro, which is in line with observations of circadian synchronisation of behavioural ultradian rhythms. Critically, we show that the amplitude of ultradian rhythms is enhanced in low glucose conditions, suggesting that low energy availability enhances ultradian rhythmicity at the cellular level in vitro.
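The cosine-fitting approach used to estimate amplitude and period above can be sketched with a standard cosinor-style nonlinear fit. The snippet below is illustrative only — the data are synthetic and the parameter names are assumptions, not the study's own analysis code:

```python
import numpy as np
from scipy.optimize import curve_fit

def cosinor(t, mesor, amplitude, period, acrophase):
    # rhythm-adjusted mean (MESOR) plus a cosine of free period and phase
    return mesor + amplitude * np.cos(2 * np.pi * (t - acrophase) / period)

# synthetic "bioluminescence" trace: a 6.9 h ultradian rhythm plus noise
# (hypothetical values; units and sampling are not taken from the study)
t = np.arange(0, 48, 0.25)                       # hours
rng = np.random.default_rng(1)
y = cosinor(t, 10.0, 3.0, 6.9, 2.0) + rng.normal(0, 0.3, t.size)

# rough initial guesses: mean level, spread, and an approximate period
p0 = [np.mean(y), np.std(y), 7.0, 0.0]
params, _ = curve_fit(cosinor, t, y, p0=p0)
mesor, amplitude, period, acrophase = params
```

Because the period is fitted as a free nonlinear parameter, a reasonable initial guess (here 7 h) is needed for convergence; comparing fitted amplitudes between conditions then mirrors the amplitude comparison reported above.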
Host circadian rhythms are disrupted during malaria infection in parasite genotype-specific manners
Infection can dramatically alter behavioural and physiological traits as hosts become sick and subsequently return to health. Such “sickness behaviours” include disrupted circadian rhythms in both locomotor activity and body temperature. Host sickness behaviours vary in pathogen species-specific manners, but the influence of pathogen intraspecific variation is rarely studied. We examine how infection with the murine malaria parasite, Plasmodium chabaudi, shapes sickness in terms of parasite genotype-specific effects on host circadian rhythms. We reveal that circadian rhythms in host locomotor activity patterns and body temperature become differentially disrupted in parasite genotype-specific manners. Locomotor activity and body temperature in combination provide more sensitive measures of health than commonly used virulence metrics for malaria (e.g. anaemia). Moreover, patterns of host disruption cannot be explained simply by variation in replication rate across parasite genotypes or the severity of anaemia each parasite genotype causes. It is well known that disruption to circadian rhythms is associated with non-infectious diseases, including cancer, type 2 diabetes, and obesity. Our results reveal that disruption of host circadian rhythms is a genetically variable virulence trait of pathogens, with implications for host health and disease tolerance.
Muscle mass, muscle strength and mortality in kidney transplant recipients:results of the TransplantLines Biobank and Cohort Study
Background: Survival of kidney transplant recipients (KTR) is low compared with the general population. Low muscle mass and muscle strength may contribute to lower survival, but practical measures of muscle status suitable for routine care have not been evaluated for their association with long-term survival and their relation with each other in a large cohort of KTR. Methods: Data of outpatient KTR ≥ 1 year post-transplantation, included in the TransplantLines Biobank and Cohort Study (ClinicalTrials.gov Identifier: NCT03272841), were used. Muscle mass was determined as appendicular skeletal muscle mass indexed for height² (ASMI) through bio-electrical impedance analysis (BIA), and by 24-h urinary creatinine excretion rate indexed for height² (CERI). Muscle strength was determined by hand grip strength indexed for height² (HGSI). Secondary analyses were performed using parameters not indexed for height². Cox proportional hazards models were used to investigate the associations between muscle mass and muscle strength and all-cause mortality, both in univariable and multivariable models with adjustment for potential confounders, including age, sex, body mass index (BMI), estimated glomerular filtration rate (eGFR) and proteinuria. Results: We included 741 KTR (62% male, age 55 ± 13 years, BMI 27.3 ± 4.6 kg/m²), of which 62 (8%) died during a median [interquartile range] follow-up of 3.0 [2.3–5.7] years. Compared with patients who survived, patients who died had similar ASMI (7.0 ± 1.0 vs. 7.0 ± 1.0 kg/m²; P = 0.57), lower CERI (4.2 ± 1.1 vs. 3.5 ± 0.9 mmol/24 h/m²; P < 0.001) and lower HGSI (12.6 ± 3.3 vs. 10.4 ± 2.8 kg/m²; P < 0.001).
We observed no association between ASMI and all-cause mortality (HR 0.93 per SD increase; 95% confidence interval [CI] [0.72, 1.19]; P = 0.54), whereas CERI and HGSI were significantly associated with mortality, independent of potential confounders (HR 0.57 per SD increase; 95% CI [0.44, 0.81]; P = 0.002 and HR 0.47 per SD increase; 95% CI [0.33, 0.68]; P < 0.001, respectively), and associations of CERI and HGSI with mortality remained independent of each other (HR 0.68 per SD increase; 95% CI [0.47, 0.98]; P = 0.04 and HR 0.53 per SD increase; 95% CI [0.36, 0.76]; P = 0.001, respectively). Similar associations were found for unindexed parameters. Conclusions: Higher muscle mass assessed by creatinine excretion rate and higher muscle strength assessed by hand grip strength are complementary in their association with lower risk of all-cause mortality in KTR. Muscle mass assessed by BIA is not associated with mortality. Routine assessment using both 24-h urine samples and hand grip strength is recommended, to potentially target interdisciplinary interventions to improve muscle status in KTR at risk for poor survival.
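The "HR per SD increase" figures above come from Cox proportional hazards regression on a standardised covariate. As an illustrative sketch only (synthetic data; not the study's analysis, which also adjusted for confounders), the Cox partial likelihood for a single covariate can be maximised directly:

```python
import numpy as np
from scipy.optimize import minimize

# synthetic cohort: one standardised covariate with a protective effect
rng = np.random.default_rng(0)
n = 200
x = rng.standard_normal(n)                   # covariate on a per-SD scale
beta_true = -1.0                             # true log hazard ratio
t = rng.exponential(1.0 / np.exp(beta_true * x))   # event times
c = rng.exponential(2.0, n)                  # independent censoring times
time = np.minimum(t, c)
event = (t <= c).astype(float)               # 1 = died, 0 = censored

def neg_log_partial_likelihood(beta, x, time, event):
    # sort by descending time so each prefix is the risk set at that event
    order = np.argsort(-time)
    x_s, e_s = x[order], event[order]
    eta = beta[0] * x_s
    log_risk = np.logaddexp.accumulate(eta)  # log sum(exp(eta)) over risk set
    return -np.sum(e_s * (eta - log_risk))

res = minimize(neg_log_partial_likelihood, x0=[0.0],
               args=(x, time, event), method="BFGS")
beta_hat = res.x[0]
hr_per_sd = np.exp(beta_hat)                 # hazard ratio per SD increase
```

A hazard ratio below 1 per SD increase, as for CERI and HGSI above, means higher values of the measure are associated with lower mortality risk; in practice one would use an established implementation with tie handling and covariate adjustment rather than this minimal sketch.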
Timing of host feeding drives rhythms in parasite replication
Circadian rhythms enable organisms to synchronise the processes underpinning survival and reproduction to anticipate daily changes in the external environment. Recent work shows that daily (circadian) rhythms also enable parasites to maximise fitness in the context of ecological interactions with their hosts. Because parasite rhythms matter for their fitness, understanding how they are regulated could lead to innovative ways to reduce the severity and spread of diseases. Here, we examine how host circadian rhythms influence rhythms in the asexual replication of malaria parasites. Asexual replication is responsible for the severity of malaria and fuels transmission of the disease, yet, how parasite rhythms are driven remains a mystery. We perturbed feeding rhythms of hosts by 12 hours (i.e. diurnal feeding in nocturnal mice) to desynchronise the hosts' peripheral oscillators from the central, light-entrained oscillator in the brain and their rhythmic outputs. We demonstrate that the rhythms of rodent malaria parasites in day-fed hosts become inverted relative to the rhythms of parasites in night-fed hosts. Our results reveal that the hosts' peripheral rhythms (associated with the timing of feeding and metabolism), but not rhythms driven by the central, light-entrained circadian oscillator in the brain, determine the timing (phase) of parasite rhythms. Further investigation reveals that parasite rhythms correlate closely with blood glucose rhythms. In addition, we show that parasite rhythms resynchronise to the altered host feeding rhythms when food availability is shifted, which is not mediated through rhythms in the host immune system. Our observations suggest that parasites actively control their developmental rhythms. Finally, counter to expectation, the severity of disease symptoms expressed by hosts was not affected by desynchronisation of their central and peripheral rhythms. 
Our study at the intersection of disease ecology and chronobiology opens up a new arena for studying host-parasite-vector coevolution and has broad implications for applied bioscience.
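The inversion of parasite rhythms in day-fed versus night-fed hosts described above is, quantitatively, a phase shift of about half the cycle. A simple way to estimate such a shift between two rhythmic time series is circular cross-correlation; the sketch below uses synthetic cosine traces (hypothetical data, not the study's recordings):

```python
import numpy as np

# hypothetical hourly rhythms over four days with a 24 h period
t = np.arange(0, 96, 1.0)                       # hours
night_fed = np.cos(2 * np.pi * t / 24)          # reference rhythm
day_fed = np.cos(2 * np.pi * (t - 12) / 24)     # candidate inverted rhythm

def phase_shift_hours(a, b, period=24, dt=1.0):
    """Lag of b relative to a (hours, modulo the period), estimated
    from the peak of the circular cross-correlation."""
    n = int(period / dt)
    a = a - a.mean()
    b = b - b.mean()
    cc = [np.sum(a * np.roll(b, k)) for k in range(n)]
    return np.argmax(cc) * dt

shift = phase_shift_hours(night_fed, day_fed)
```

For these traces the estimated lag is 12 hours, i.e. a full inversion relative to a 24 h cycle, matching the kind of antiphase relationship reported for parasites in day-fed versus night-fed hosts.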
Melioidosis in travelers: An analysis of Dutch melioidosis registry data 1985–2018
Background: Melioidosis, caused by the Gram-negative bacterium Burkholderia pseudomallei, is an opportunistic infection across the tropics. Here, we provide a systematic overview of imported human cases in a non-endemic country over a 25-year period. Methods: All 5
Occurrence and timing of withdrawal of life-sustaining measures in traumatic brain injury patients: a CENTER-TBI study
Funder: National Institute for Health Research (UK). Background: In patients with severe brain injury, withdrawal of life-sustaining measures (WLSM) is common in intensive care units (ICU). WLSM constitutes a dilemma: instituting WLSM too early could result in death despite the possibility of an acceptable functional outcome, whereas delaying WLSM could unnecessarily burden patients, families, clinicians, and hospital resources. We aimed to describe the occurrence and timing of WLSM, and factors associated with timing of WLSM, in European ICUs in patients with traumatic brain injury (TBI). Methods: The CENTER-TBI Study is a prospective multi-center cohort study. For the current study, patients with TBI admitted to the ICU and aged 16 or older were included. Occurrence and timing of WLSM were documented. For the analyses, we dichotomized timing of WLSM into early (< 72 h after injury) versus later (≥ 72 h after injury) based on recent guideline recommendations. We assessed factors associated with initiating WLSM early versus later, including geographic region, center, patient, injury, and treatment characteristics, with univariable and multivariable (mixed effects) logistic regression. Results: A total of 2022 patients aged 16 or older were admitted to the ICU. ICU mortality was 13% (n = 267). Of these, 229 (86%) patients died after WLSM and were included in the analyses. The occurrence of WLSM varied between regions, ranging from 0% in Eastern Europe to 96% in Northern Europe. In 51% of the patients, WLSM was early.
Patients in the early WLSM group had a lower maximum therapy intensity level (TIL) score than patients in the later WLSM group (median of 5 versus 10). The strongest independent variables associated with early WLSM were one unreactive pupil (odds ratio (OR) 4.0, 95% confidence interval (CI) 1.3–12.4) or two unreactive pupils (OR 5.8, CI 2.6–13.1) compared to two reactive pupils, and an Injury Severity Score (ISS) over 41 (OR per point above 41 = 1.1, CI 1.0–1.1). Timing of WLSM was not significantly associated with region or center. Conclusion: WLSM occurs early in half of the patients, mostly in severely injured patients with TBI affecting brainstem reflexes. We found no regional or center influences on the timing of WLSM. Whether WLSM is always appropriate or may contribute to a self-fulfilling prophecy requires further research and argues for reluctance to institute WLSM early in case of any doubt about prognosis.
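The odds ratios with 95% confidence intervals reported above come from (mixed-effects) logistic regression. For intuition only, the simplest unadjusted version of such an estimate — an odds ratio with a Wald confidence interval from a 2×2 table — can be computed directly; the counts below are made up for illustration and are not the study's data:

```python
import math

def odds_ratio_wald(a, b, c, d, z=1.96):
    """Odds ratio and 95% Wald CI from a 2x2 table:
       exposed:   a with outcome, b without
       unexposed: c with outcome, d without"""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log odds ratio
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# hypothetical counts: early WLSM by pupil reactivity
or_, lo, hi = odds_ratio_wald(30, 10, 20, 40)
```

An interval that excludes 1 (as for the pupil-reactivity ORs above) indicates a statistically significant association; the study's actual estimates were additionally adjusted for patient, injury, and center characteristics, which this sketch does not capture.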
How do 66 European institutional review boards approve one protocol for an international prospective observational study on traumatic brain injury? Experiences from the CENTER-TBI study
Abstract: Background: The European Union (EU) aims to optimize patient protection and efficiency of health-care research by harmonizing procedures across Member States. Nonetheless, further improvements are required to increase multicenter research efficiency. We investigated IRB procedures in a large prospective European multicenter study on traumatic brain injury (TBI), aiming to inform and stimulate initiatives to improve efficiency. Methods: We reviewed relevant documents regarding IRB submission and IRB approval from European neurotrauma centers participating in the Collaborative European NeuroTrauma Effectiveness Research in Traumatic Brain Injury (CENTER-TBI) study. Documents included detailed information on IRB procedures and the duration from IRB submission until approval(s). They were translated and analyzed to determine the level of harmonization of IRB procedures within Europe. Results: From 18 countries, 66 centers provided the requested documents. The primary IRB review was conducted centrally (N = 11, 61%) or locally (N = 7, 39%), and primary IRB approval was obtained after one (N = 8, 44%), two (N = 6, 33%) or three (N = 4, 23%) review rounds, with median durations until primary IRB approval of 50 and 98 days, respectively. Additional IRB approval was required in 55% of countries and could increase the duration to 535 days. Total duration from submission until required IRB approval was obtained was 114 days (IQR 75–224) and appeared to be shorter after submission to local IRBs compared to central IRBs (50 vs. 138 days, p = 0.0074). Conclusion: We found variation in IRB procedures between and within European countries. There were differences in submission and approval requirements, number of review rounds and total duration. Research collaborations could benefit from the implementation of more uniform legislation and regulation, while acknowledging local cultural habits and moral values between countries.