Exercise-induced modulation of cardiac lipid content in healthy lean young men
Cardiac lipid accumulation is associated with decreased cardiac function and energy status (PCr/ATP). It has been suggested that elevated plasma fatty acid (FA) concentrations are responsible for the cardiac lipid accumulation. Therefore, the aim of the present study was to investigate whether elevating plasma FA concentrations by exercise results in an increased cardiac lipid content, and whether this influences cardiac function and energy status. Eleven male subjects (age 25.4 ± 1.1 years, BMI 23.6 ± 0.8 kg/m2) performed a 2-h cycling protocol, once while staying fasted and once while ingesting glucose, to create a state of high versus low plasma FA concentrations, respectively. Cardiac lipid content was measured by proton magnetic resonance spectroscopy (1H-MRS) at baseline, directly after exercise and again 4 h post-exercise, together with systolic function (by multi-slice cine-MRI) and cardiac energy status (by 31P-MRS). Plasma FA concentrations were increased threefold during exercise and ninefold during recovery in the fasted state compared with the glucose-fed state (p < 0.01). Cardiac lipid content was elevated at the end of the fasted test day (from 0.26 ± 0.04 to 0.44 ± 0.04%, p = 0.003), while it did not change with glucose supplementation (from 0.32 ± 0.03 to 0.26 ± 0.05%, p = 0.272). Furthermore, PCr/ATP was decreased by 32% in the high plasma FA state compared with the low FA state (n = 6, p = 0.014). However, in the high FA state, the ejection fraction 4 h post-exercise was higher compared with the low FA state (63 ± 2 vs. 59 ± 2%, p = 0.018). Elevated plasma FA concentrations, induced by exercise in the fasted state, lead to increased cardiac lipid content, but do not acutely hamper systolic function. Although the lower cardiac energy status is in line with a lipotoxic action of cardiac lipid content, a causal relationship cannot be proven.
Celio, Orkand Named Up and Coming Leaders
Background: Although auditory verbal hallucinations (AVH) are a core symptom of schizophrenia, they also occur in non-psychotic individuals, in the absence of other psychotic, affective, cognitive and negative symptoms. AVH have been hypothesized to result from deviant integration of inferior frontal, parahippocampal and superior temporal brain areas. However, a direct link between dysfunctional connectivity and AVH has not yet been established. To determine whether hallucinations are indeed related to aberrant connectivity, AVH should be studied in isolation, for example in non-psychotic individuals with AVH. Method: Resting-state connectivity was investigated in 25 non-psychotic subjects with AVH and 25 matched control subjects using seed regression analysis with the (1) left and (2) right inferior frontal, (3) left and (4) right superior temporal and (5) left parahippocampal areas as the seed regions. To correct for cardiorespiratory (CR) pulsatility rhythms in the functional magnetic resonance imaging (fMRI) data, heartbeat and respiration were monitored during scanning and the fMRI data were corrected for these rhythms using RETROICOR, an image-based method for retrospective correction of physiological motion effects. Results: In comparison with the control group, non-psychotic individuals with AVH showed increased connectivity between the left and the right superior temporal regions and also between the left parahippocampal region and the left inferior frontal gyrus. Moreover, this group did not show a negative correlation between the left superior temporal region and the right inferior frontal region, as was observed in the healthy control group. Conclusions: Aberrant connectivity of frontal, parahippocampal and superior temporal brain areas can be specifically related to the predisposition to hallucinate in the auditory domain.
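Seed regression analysis of the kind described above boils down to correlating the mean signal time course of a seed region with the time courses of other regions or voxels. A minimal illustrative sketch follows; the time series and region names are invented, and this is not the study's actual preprocessing or regression pipeline:

```python
import math

def pearson_r(x, y):
    """Pearson correlation between two equal-length time series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical signals: a seed time course and a target region that
# partially tracks it (values chosen only for illustration).
seed = [0.1, 0.4, 0.3, 0.8, 0.6, 0.9, 0.2, 0.5]
target = [0.2, 0.5, 0.2, 0.7, 0.7, 1.0, 0.1, 0.4]
connectivity = pearson_r(seed, target)
```

In practice this correlation (or a regression coefficient) is computed per voxel against each of the five seed regions, after nuisance correction such as RETROICOR.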
Treatment-specific risk of subsequent malignant neoplasms in five-year survivors of diffuse large B-cell lymphoma
Background: The introduction of rituximab significantly improved the prognosis of diffuse large B-cell lymphoma (DLBCL), emphasizing the importance of evaluating the long-term consequences of exposure to radiotherapy, alkylating agents and anthracycline-containing (immuno)chemotherapy among DLBCL survivors. Methods: Long-term risk of subsequent malignant neoplasms (SMNs) was examined in a multicenter cohort comprising 2373 5-year DLBCL survivors treated at ages 15-61 years in 1989-2012. Observed SMN numbers were compared with expected cancer incidence to estimate standardized incidence ratios (SIRs) and absolute excess risks (AERs/10 000 person-years). Treatment-specific risks were assessed using multivariable Cox regression. Results: After a median follow-up of 13.8 years, 321 survivors developed one or more SMNs (SIR 1.5, 95% CI 1.3-1.8, AER 51.8). SIRs remained increased for at least 20 years after first-line treatment (SIR ≥20-year follow-up 1.5, 95% CI 1.0-2.2, AER 81.8) and were highest among patients ≤40 years at first DLBCL treatment (SIR 2.7, 95% CI 2.0-3.5). Lung (SIR 2.0, 95% CI 1.5-2.7, AER 13.4) and gastrointestinal cancers (SIR 1.5, 95% CI 1.2-2.0, AER 11.8) accounted for the largest excess risks. Treatment with >4500 mg/m2 cyclophosphamide/>300 mg/m2 doxorubicin versus ≤2250 mg/m2/≤150 mg/m2, respectively, was associated with increased solid SMN risk (hazard ratio 1.5, 95% CI 1.0-2.2). Survivors who received rituximab had a lower risk of subdiaphragmatic solid SMNs (hazard ratio 0.5, 95% CI 0.3-1.0) compared with survivors who did not receive rituximab. Conclusion: Five-year DLBCL survivors have an increased risk of SMNs. Risks were higher for survivors ≤40 years at first treatment and survivors treated with >4500 mg/m2 cyclophosphamide/>300 mg/m2 doxorubicin, and may be lower for survivors treated in the rituximab era, emphasizing the need for studies with longer follow-up for rituximab-treated patients.
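The standardized incidence ratio and absolute excess risk reported above are simple functions of observed versus expected case counts. A minimal sketch of the arithmetic; the expected count and person-years below are illustrative round numbers, not the cohort's actual values:

```python
def sir_and_aer(observed, expected, person_years):
    """Standardized incidence ratio (SIR) and absolute excess risk
    (AER) per 10,000 person-years, from observed and expected counts."""
    sir = observed / expected
    aer = (observed - expected) / person_years * 10_000
    return sir, aer

# 321 observed SMNs (from the study); the expected count and
# person-years here are hypothetical, chosen only to illustrate
# how SIR ~1.5 and AER ~51.8 arise from the formulas.
sir, aer = sir_and_aer(observed=321, expected=214, person_years=20_656)
```

Confidence intervals for the SIR are usually obtained from the Poisson distribution of the observed count, which requires more than this sketch shows.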
High cellular monocyte activation in people living with human immunodeficiency virus on combination antiretroviral therapy and lifestyle-matched controls is associated with greater inflammation in cerebrospinal fluid
Background. Increased monocyte activation and intestinal damage have been shown to be predictive of the increased morbidity and mortality observed in treated people living with human immunodeficiency virus (PLHIV). Methods. We performed a cross-sectional analysis of cellular and soluble markers of monocyte activation, coagulation, intestinal damage, and inflammation in plasma and cerebrospinal fluid (CSF) of PLHIV with suppressed plasma viremia on combination antiretroviral therapy, of age- and demographically comparable HIV-negative individuals participating in the Comorbidity in Relation to AIDS (COBRA) cohort and, where appropriate, of age-matched blood bank donors (BBD). Results. People living with HIV, HIV-negative individuals, and BBD had comparable percentages of classical, intermediate, and nonclassical monocytes. Expression of CD163, CD32, CD64, HLA-DR, CD38, CD40, CD86, CD91, CD11c, and CX3CR1 on monocytes did not differ between PLHIV and HIV-negative individuals, but it differed significantly from that of BBD. Principal component analysis revealed that 57.5% of PLHIV and 62.5% of HIV-negative individuals had a high monocyte activation profile compared with 2.9% of BBD. Cellular monocyte activation in the COBRA cohort was strongly associated with soluble markers of monocyte activation and inflammation in the CSF. Conclusions. People living with HIV and HIV-negative COBRA participants had high levels of cellular monocyte activation compared with age-matched BBD. High monocyte activation was predictive of inflammation in the CSF.
Multivariate genome-wide analyses of the well-being spectrum
We introduce two novel methods for multivariate genome-wide-association meta-analysis (GWAMA) of related traits that correct for sample overlap. A broad range of simulation scenarios supports the added value of our multivariate methods relative to univariate GWAMA. We applied the novel methods to life satisfaction, positive affect, neuroticism, and depressive symptoms, collectively referred to as the well-being spectrum (Nobs = 2,370,390), and found 304 significant independent signals. Our multivariate approaches resulted in a 26% increase in the number of independent signals relative to the four univariate GWAMAs and in an ~57% increase in the predictive power of polygenic risk scores. Supporting transcriptome- and methylome-wide analyses (TWAS and MWAS, respectively) uncovered an additional 17 and 75 independent loci, respectively. Bioinformatic analyses, based on gene expression in brain tissues and cells, showed that genes differentially expressed in the subiculum and GABAergic interneurons are enriched in their effect on the well-being spectrum
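A polygenic risk score of the kind whose predictive power is evaluated above is, at its core, a weighted sum of per-variant allele dosages. A minimal sketch; the variants, dosages and effect-size weights below are hypothetical, and real scoring pipelines add steps (clumping, p-value thresholding, standardization) not shown here:

```python
def polygenic_score(dosages, weights):
    """Weighted sum of allele dosages (0, 1 or 2 copies of the effect
    allele per variant) and per-variant effect-size weights."""
    assert len(dosages) == len(weights)
    return sum(d * w for d, w in zip(dosages, weights))

# Hypothetical individual genotyped at five variants, with made-up
# GWAMA effect sizes as weights.
dosages = [0, 1, 2, 1, 0]
weights = [0.02, -0.01, 0.03, 0.05, -0.04]
score = polygenic_score(dosages, weights)
```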
Testing effectiveness of the revised Cape Town modified early warning and SBAR systems: a pilot pragmatic parallel group randomised controlled trial
Abstract
Background
Nurses’ recognition of clinical deterioration is crucial for patient survival. Evidence for the effectiveness of modified early warning scores (MEWS) is derived from large observational studies in developed countries.
Methods
We tested the effectiveness of the paper-based Cape Town (CT) MEWS vital signs observation chart and situation-background-assessment-recommendation (SBAR) communication guide. Outcomes were: proportion of appropriate responses to deterioration, differences in recording of clinical parameters and serious adverse events (SAEs) in intervention and control trial arms. Public teaching hospitals for adult patients in Cape Town were randomised to implementation of the CT MEWS/SBAR guide or usual care (observation chart without track-and-trigger information) for 31 days on general medical and surgical wards. Nurses in intervention wards received training, as they had no prior knowledge of early warning systems. Identification and reporting of patient deterioration in intervention and control wards were compared. In the intervention arm, 24 day-shift and 23 night-shift nurses received training. Clinical records were reviewed retrospectively at trial end. Only records of patients who had given signed consent were reviewed.
Results
We recruited two of six CT general hospitals. We consented 363 patients and analysed 292 (80.4%) patient records (n = 150, 51.4% intervention; n = 142, 48.6% control arm). Assistance was summoned for fewer patients with abnormal vital signs in the intervention arm (2/45, 4.4% versus (vs) 11/81, 13.6%; OR 0.29 (0.06–1.39)), particularly for low systolic blood pressure. There was a significant difference in recording between trial arms for parameters listed on the MEWS chart but omitted from the standard observations chart: oxygen saturation, level of consciousness, pallor/cyanosis, pain, sweating, wound oozing, pedal pulses, glucose concentration, haemoglobin concentration, and “looks unwell”. SBAR was used twice. There was no statistically significant difference in SAEs (5/150, 3.3% vs 3/143, 2.1%; P = 0.72, OR 1.61 (0.38–6.86)).
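The crude odds ratios quoted above follow directly from the 2×2 event counts. A minimal sketch using a standard Wald (Woolf) interval on the log odds ratio; this is an assumption about the method, not necessarily the trialists' exact analysis, though it reproduces the reported serious-adverse-event estimate of 1.61 (0.38–6.86):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald (Woolf) 95% CI for a 2x2 table:
    a events / b non-events in one arm, c events / d non-events
    in the other."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Serious adverse events: 5 of 150 intervention vs 3 of 143 control,
# as reported above.
or_, lo, hi = odds_ratio_ci(a=5, b=145, c=3, d=140)
```

With counts this small, an exact (e.g. Fisher-based) interval would normally be preferred; the Wald interval is shown only because it is the simplest closed form.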
Conclusions
The revised CT MEWS observations chart improved recording of certain parameters, but did not improve nurses’ ability to identify early signs of clinical deterioration or to summon assistance. Recruitment of only two hospitals and exclusion of patients too ill to consent limit generalisation of the results. Further work is needed on educational preparation for the CT MEWS/SBAR and its impact on nurses’ reporting behaviour.
Trial registration
Pan African Clinical Trials Registry, PACTR201406000838118. Registered on 2 June 2014, www.pactr.org