Natural Choline from Egg Yolk Phospholipids Is More Efficiently Absorbed Compared with Choline Bitartrate; Outcomes of A Randomized Trial in Healthy Adults
Choline is a vitamin-like essential nutrient, important throughout one’s lifespan. Therefore,
choline salts are added to infant formula, supplements and functional foods. However, if choline
is present in a natural form, e.g. bound to phospholipids, it may be more efficiently absorbed.
The study’s aim was to evaluate whether choline uptake is improved after consumption of an egg yolk
phospholipid drink containing 3 g of phospholipid-bound choline, compared with a control drink
with 3 g of choline bitartrate. We performed a randomized, double-blind, cross-over trial with
18 participants. Plasma choline, betaine and dimethylglycine concentrations were determined before
and up to six hours after consumption of the drinks. The plasma choline response, as determined
by the incremental area under the curve, was four times higher after consumption of the egg yolk
phospholipid drink compared with the control drink (p < 0.01). Similar outcomes were also observed
for choline’s main metabolites, betaine (p < 0.01) and dimethylglycine (p = 0.01). Consumption of
natural choline from egg yolk phospholipids improved choline absorption compared to consumption
of chemically produced choline bitartrate. This information is of relevance for the food industry: instead of adding choline salts, adding choline from egg yolk phospholipids can improve choline uptake and positively impact health.
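As a rough illustration of the incremental area under the curve (iAUC) used above to quantify the plasma choline response, the sketch below computes the trapezoidal area above the fasting baseline. The time points and concentrations are invented for illustration and are not data from the trial.

```python
# Minimal iAUC sketch: trapezoidal area above the fasting (t = 0) baseline,
# with negative increments clipped to zero (a common convention for
# postprandial responses). Values below are illustrative, not study data.
import numpy as np

def incremental_auc(times_h, concentrations):
    times = np.asarray(times_h, dtype=float)
    conc = np.asarray(concentrations, dtype=float)
    increments = np.clip(conc - conc[0], 0.0, None)       # rise above baseline
    widths = np.diff(times)                                # hours per interval
    return float(np.sum(0.5 * (increments[:-1] + increments[1:]) * widths))

t = [0, 0.5, 1, 2, 4, 6]                                   # hours
egg_yolk_drink = [9.0, 14.0, 16.0, 15.0, 12.0, 10.5]       # hypothetical umol/L
bitartrate     = [9.0, 10.5, 11.0, 10.5, 10.0,  9.5]       # hypothetical umol/L

ratio = incremental_auc(t, egg_yolk_drink) / incremental_auc(t, bitartrate)
print(f"iAUC ratio (phospholipid vs bitartrate): {ratio:.1f}")
```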
Sample Stability and Protein Composition of Saliva: Implications for Its Use as a Diagnostic Fluid
Saliva is an easily accessible plasma ultrafiltrate. Therefore, saliva can be an attractive alternative to blood for the measurement of diagnostic protein markers. Our aim was to determine the stability and protein composition of saliva. Protein stability at room temperature was examined by incubating fresh whole saliva with and without inhibitors of proteases and bacterial metabolism, followed by Surface Enhanced Laser Desorption/Ionization (SELDI) analyses. Protein composition was determined by sodium dodecyl sulfate polyacrylamide gel electrophoresis (SDS-PAGE) fractionation of saliva proteins, followed by digestion of excised bands and identification by liquid chromatography tandem mass spectrometry (LC-MS/MS). Results show that rapid protein degradation occurs within 30 minutes after sample collection, and that degradation already starts during collection. Protease inhibitors partly prevented degradation, while inhibition of bacterial metabolism did not affect it. Three stable degradation products of 2937 Da, 3370 Da and 4132 Da were discovered, which can be used as markers to monitor sample quality. Saliva proteome analyses revealed 218 proteins, of which 84 can also be found in blood plasma. Based on a comparison with seven other proteomics studies on whole saliva, we identified 83 new saliva proteins. We conclude that saliva is a promising diagnostic fluid when precautions are taken against protein breakdown.
Whole Grain Wheat Consumption Affects Postprandial Inflammatory Response in a Randomized Controlled Trial in Overweight and Obese Adults with Mild Hypercholesterolemia in the Graandioos Study
BACKGROUND: Whole grain wheat (WGW) consumption is associated with health benefits in observational studies. However, WGW randomized controlled trial (RCT) studies show mixed effects. OBJECTIVES: The health impact of WGW consumption was investigated by quantification of the body's resilience, which was defined as the "ability to adapt to a standardized challenge." METHODS: A double-blind RCT was performed with overweight and obese (BMI: 25-35 kg/m2) men (n = 19) and postmenopausal women (n = 31) aged 45-70 y, with mildly elevated plasma total cholesterol (>5 mmol/L), who were randomly assigned to either 12-wk WGW (98 g/d) or refined wheat (RW). Before and after the intervention, a standardized mixed-meal challenge was performed. Plasma samples were taken after overnight fasting and postprandially (30, 60, 120, and 240 min). Thirty-one biomarkers were quantified, focusing on metabolism, liver, cardiovascular health, and inflammation. Linear mixed models were used to evaluate fasting compared with postprandial intervention effects. Health space models were used to evaluate intervention effects as composite markers representing resilience of inflammation, liver, and metabolism. RESULTS: Postprandial biomarker changes related to the liver showed a decreased alanine aminotransferase response with WGW (P = 0.03) and an increased β-hydroxybutyrate response with RW (P = 0.001). Postprandial changes related to inflammation showed increased C-reactive protein (P = 0.001), IL-6 (P = 0.02), and IL-8 (P = 0.007) and decreased IL-1B (P = 0.0002) with RW, and decreased C-reactive protein (P < 0.0001), serum amyloid A (P < 0.0001), IL-8 (P = 0.02), and IL-10 (P < 0.0001) with WGW. Health space visualization demonstrated diminished inflammatory (P < 0.01) and liver resilience (P < 0.01) with RW, whereas liver resilience was rejuvenated by WGW (P < 0.05). CONCLUSIONS: Twelve-week 98 g/d WGW consumption can promote liver and inflammatory resilience in overweight and obese subjects with mildly elevated plasma cholesterol. The health space approach appeared appropriate to evaluate intervention effects as composite markers. This trial was registered at www.clinicaltrials.gov as NCT02385149.
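As a sketch only (not the authors' analysis code), a simplified per-biomarker linear mixed model of the kind described above could be set up in Python with statsmodels; the data, column names, and model structure below are assumptions for illustration, with only treatment and postprandial time as fixed effects.

```python
# Synthetic long-format data (one row per subject x time point) purely to
# make the sketch runnable; column names are assumptions, not study data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
times = [0, 30, 60, 120, 240]                      # minutes after the meal
rows = []
for s in range(50):
    treatment = "WGW" if s < 25 else "RW"
    base = rng.normal(2.0, 0.5)                    # subject-specific CRP level
    for t in times:
        rows.append({"subject_id": s, "treatment": treatment,
                     "time_min": t, "crp": base + rng.normal(0, 0.2)})
df = pd.DataFrame(rows)

# Fixed effects for treatment and postprandial time; random intercept per
# subject to account for repeated measures.
model = smf.mixedlm("crp ~ treatment * C(time_min)", data=df,
                    groups=df["subject_id"])
print(model.fit().summary())
```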
Variation in Structure and Process of Care in Traumatic Brain Injury: Provider Profiles of European Neurotrauma Centers Participating in the CENTER-TBI Study.
INTRODUCTION: The strength of evidence underpinning care and treatment recommendations in traumatic brain injury (TBI) is low. Comparative effectiveness research (CER) has been proposed as a framework to provide evidence for optimal care for TBI patients. The first step in CER is to map the existing variation. The aim of the current study was to quantify variation in general structural and process characteristics among centers participating in the Collaborative European NeuroTrauma Effectiveness Research in Traumatic Brain Injury (CENTER-TBI) study. METHODS: We designed a set of 11 provider profiling questionnaires with 321 questions about various aspects of TBI care, chosen based on literature and expert opinion. After pilot testing, questionnaires were disseminated to 71 centers from 20 countries participating in the CENTER-TBI study. Reliability of the questionnaires was estimated by calculating a concordance rate among the 5% of questions that were duplicated. RESULTS: All 71 centers completed the questionnaires. The median concordance rate among duplicate questions was 0.85. The majority of centers were academic hospitals (n = 65, 92%), designated as a level I trauma center (n = 48, 68%), and situated in an urban location (n = 70, 99%). The availability of facilities for neuro-trauma care varied across centers; e.g., 40 (57%) had a dedicated neuro-intensive care unit (ICU), 36 (51%) had an in-hospital rehabilitation unit, and the organization of the ICU was closed in 64% (n = 45) of the centers. In addition, we found wide variation in processes of care, such as the ICU admission policy and intracranial pressure monitoring policy among centers. CONCLUSION: Even among high-volume, specialized neurotrauma centers there is substantial variation in structures and processes of TBI care. This variation provides an opportunity to study the effectiveness of specific aspects of TBI care and to identify best practices with CER approaches.
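For illustration, a concordance rate over duplicated questions can be computed as the fraction of duplicate pairs answered identically, with the median then taken across centers. The sketch below uses invented answers and is not the CENTER-TBI analysis code.

```python
# Per-center concordance over duplicated questions, then the median across
# centers. Answers below are invented purely for illustration.
from statistics import median

def concordance_rate(duplicate_pairs):
    agree = sum(first == second for first, second in duplicate_pairs)
    return agree / len(duplicate_pairs)

centers = {
    "center_A": [("yes", "yes"), ("no", "no"), ("20 mmHg", "25 mmHg")],
    "center_B": [("yes", "yes"), ("closed", "closed"), ("no", "no")],
    "center_C": [("no", "yes"), ("yes", "yes"), ("level I", "level I")],
}

rates = {name: concordance_rate(pairs) for name, pairs in centers.items()}
print(rates)
print("median concordance:", median(rates.values()))
```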
Occurrence and timing of withdrawal of life-sustaining measures in traumatic brain injury patients: a CENTER-TBI study
Funder: National Institute for Health Research (UK). Background: In patients with severe brain injury, withdrawal of life-sustaining measures (WLSM) is common in intensive care units (ICU). WLSM constitutes a dilemma: instituting WLSM too early could result in death despite the possibility of an acceptable functional outcome, whereas delaying WLSM could unnecessarily burden patients, families, clinicians, and hospital resources. We aimed to describe the occurrence and timing of WLSM, and factors associated with the timing of WLSM, in European ICUs in patients with traumatic brain injury (TBI). Methods: The CENTER-TBI Study is a prospective multi-center cohort study. For the current study, patients with TBI admitted to the ICU and aged 16 or older were included. Occurrence and timing of WLSM were documented. For the analyses, we dichotomized the timing of WLSM into early (< 72 h after injury) versus later (≥ 72 h after injury) based on recent guideline recommendations. We assessed factors associated with initiating WLSM early versus later, including geographic region, center, patient, injury, and treatment characteristics, with univariable and multivariable (mixed effects) logistic regression. Results: A total of 2022 patients aged 16 or older were admitted to the ICU. ICU mortality was 13% (n = 267). Of these, 229 (86%) patients died after WLSM and were included in the analyses. The occurrence of WLSM varied between regions, ranging from 0% in Eastern Europe to 96% in Northern Europe. In 51% of the patients, WLSM was early. Patients in the early WLSM group had a lower maximum therapy intensity level (TIL) score than patients in the later WLSM group (median of 5 versus 10). The strongest independent variables associated with early WLSM were one unreactive pupil (odds ratio (OR) 4.0, 95% confidence interval (CI) 1.3–12.4) or two unreactive pupils (OR 5.8, CI 2.6–13.1) compared with two reactive pupils, and an Injury Severity Score (ISS) above 41 (OR per point above 41 = 1.1, CI 1.0–1.1). Timing of WLSM was not significantly associated with region or center. Conclusion: WLSM occurred early in half of the patients, mostly in severely injured patients with TBI affecting brainstem reflexes. We found no regional or center influences on the timing of WLSM. Whether WLSM is always appropriate or may contribute to a self-fulfilling prophecy requires further research and argues for reluctance to institute WLSM early in case of any doubt about prognosis.
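As a minimal sketch (with assumed variable names and invented data, not the study's own scripts), the dichotomization at 72 h and a univariable logistic regression for early WLSM could look like this; the multivariable (mixed effects) model described above would additionally include region, center, patient, injury, and treatment covariates.

```python
# Dichotomize timing of WLSM at 72 h after injury and fit a univariable
# logistic regression for early WLSM. All values below are invented purely
# to make the sketch runnable; they are not study data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 200
df = pd.DataFrame({
    "hours_to_wlsm": rng.uniform(6, 240, n),        # hours from injury to WLSM
    "unreactive_pupils": rng.integers(0, 3, n),     # 0, 1 or 2 unreactive pupils
})
df["early_wlsm"] = (df["hours_to_wlsm"] < 72).astype(int)   # < 72 h = early

fit = smf.logit("early_wlsm ~ C(unreactive_pupils)", data=df).fit(disp=False)
print(np.exp(fit.params).round(2))                  # odds ratios vs 0 unreactive
```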
Variation in general supportive and preventive intensive care management of traumatic brain injury: a survey in 66 neurotrauma centers participating in the Collaborative European NeuroTrauma Effectiveness Research in Traumatic Brain Injury (CENTER-TBI) study
Abstract
Background
General supportive and preventive measures in the intensive care management of traumatic brain injury (TBI) aim to prevent or limit secondary brain injury and optimize recovery. The aim of this survey was to assess and quantify variation in perceptions on intensive care unit (ICU) management of patients with TBI in European neurotrauma centers.
Methods
We performed a survey as part of the Collaborative European NeuroTrauma Effectiveness Research in Traumatic Brain Injury (CENTER-TBI) study. We analyzed 23 questions focused on: 1) circulatory and respiratory management; 2) fever control; 3) use of corticosteroids; 4) nutrition and glucose management; and 5) seizure prophylaxis and treatment.
Results
The survey was completed predominantly by intensivists (n = 33, 50%) and neurosurgeons (n = 23, 35%) from 66 centers (97% response rate).
The most common cerebral perfusion pressure (CPP) target was > 60 mmHg (n = 39, 60%) and/or an individualized target (n = 25, 38%). To support CPP, crystalloid fluid loading (n = 60, 91%) was generally preferred over albumin (n = 15, 23%), and vasopressors (n = 63, 96%) over inotropes (n = 29, 44%). The most commonly reported target of partial pressure of carbon dioxide in arterial blood (PaCO2) was 36–40 mmHg (4.8–5.3 kPa) in case of controlled intracranial pressure (ICP) < 20 mmHg (n = 45, 69%) and 30–35 mmHg (4–4.7 kPa) in case of raised ICP (n = 40, 62%). Almost all respondents indicated that they generally treat fever (n = 65, 98%) with paracetamol (n = 61, 92%) and/or external cooling (n = 49, 74%). Conventional glucose management (n = 43, 66%) was preferred over tight glycemic control (n = 18, 28%). More than half of the respondents indicated that they aim for full caloric replacement within 7 days (n = 43, 66%) using enteral nutrition (n = 60, 92%). Indications for and duration of seizure prophylaxis varied, and levetiracetam was mostly reported as the agent of choice for both seizure prophylaxis (n = 32, 49%) and treatment (n = 40, 61%).
Conclusions
Practice preferences vary substantially regarding general supportive and preventive measures in TBI patients at ICUs of European neurotrauma centers. These results provide an opportunity for future comparative effectiveness research, since more evidence-based uniformity in good practices in general ICU management could have a major impact on TBI outcome.
Variation in neurosurgical management of traumatic brain injury
Background: Neurosurgical management of traumatic brain injury (TBI) is challenging, with only low-quality evidence available. We aimed to explore differences in neurosurgical strategies for TBI across Europe. Methods: A survey was sent to 68 centers participating in the Collaborative European Neurotrauma Effectiveness Research in Traumatic Brain Injury (CENTER-TBI) study. The questionnaire contained 21 questions, including the decision when to operate (or not) on traumatic acute subdural hematoma (ASDH) and intracerebral hematoma (ICH), and when to perform a decompressive craniectomy (DC) in raised intracranial pressure (ICP). Results: The survey was completed by 68 centers (100%). On average, 10 neurosurgeons work in each trauma center. In all centers, a neurosurgeon was available within 30 min. Forty percent of respondents reported a thickness or volume threshold for evacuation of an ASDH. Most respondents (78%) decide on a primary DC during evacuation of an ASDH when swelling is present. For ICH, 3% would perform an evacuation directly to prevent secondary deterioration, and 66% only in case of clinical deterioration. Most respondents (91%) reported considering a DC for refractory high ICP. The reported ICP cut-off for DC in refractory high ICP, however, differed: 60% use 25 mmHg, 18% 30 mmHg, and 17% 20 mmHg. Treatment strategies varied substantially between regions, specifically for the threshold for ASDH surgery and DC for refractory raised ICP. Within-center variation was also present: 31% reported variation within the hospital for inserting an ICP monitor and 43% for evacuating mass lesions. Conclusion: Despite a homogeneous organization, considerable practice variation exists in neurosurgical strategies for TBI in Europe. These results provide an incentive for comparative effectiveness research to determine elements of effective neurosurgical care.
High fat challenges with different fatty acids affect distinct atherogenic gene expression pathways in immune cells from lean and obese subjects
Early perturbations in vascular health can be detected by subjecting individuals to a high fat (HF) challenge and measuring their response capacity. Subtle responses can be determined by assessment of whole-genome transcriptional changes. We aimed to magnify differences in health by comparing gene-expression changes in peripheral blood mononuclear cells (PBMCs) in response to a high-MUFA or high-SFA challenge between subjects with different cardiovascular disease risk profiles, and to identify fatty-acid-specific gene-expression pathways. METHODS AND RESULTS: In a cross-over study, 17 lean and 15 obese men (50-70 y) received two 95 g fat shakes, high in SFAs or MUFAs. PBMC gene-expression profiles were assessed fasted and 4 h postprandially. Comparisons were made between groups and shakes. During fasting, 294 genes were significantly differentially expressed between lean and obese subjects. The challenge increased the differences to 607 genes after SFA and 2516 genes after MUFA. In both groups, SFA decreased the expression of cholesterol biosynthesis and uptake genes and increased that of cholesterol efflux genes. MUFA increased inflammatory genes and PPARα targets involved in β-oxidation. CONCLUSION: Based upon gene-expression changes, we conclude that a HF challenge magnifies differences in health, especially after MUFA. Our findings also demonstrate how SFAs and MUFAs exert distinct effects on lipid handling pathways in immune cells.
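As an illustrative sketch only (not the authors' transcriptomics pipeline), a simple per-gene screen between two groups with Benjamini-Hochberg FDR correction could look like this; the expression data are simulated.

```python
# Simulated expression matrices (subjects x genes) for the two groups;
# per-gene two-sample t-test, then Benjamini-Hochberg FDR correction.
import numpy as np
from scipy import stats
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(0)
lean  = rng.normal(0.0, 1.0, size=(17, 1000))   # 17 lean subjects, 1000 genes
obese = rng.normal(0.2, 1.0, size=(15, 1000))   # 15 obese subjects, 1000 genes

_, pvals = stats.ttest_ind(lean, obese, axis=0)
reject, _, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
print("genes passing FDR < 0.05:", int(reject.sum()))
```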
Design and operational improvements for high ammonium removal by one-stage vertical flow constructed wetlands
This paper deals with the monitoring of 4 vertical flow constructed wetland pilots of different designs treating raw wastewater. The main objective was to identify the critical design parameters to improve ammonium removal in one stage of treatment. Results showed different treatment performances for dissolved pollutants, confirming the importance of design parameters in achieving high ammonium removal. The applied load (i.e. the surface area per p.e.), the depth, and the granular material used as a filter medium (crushed granitic gravel or zeolite) influenced nitrification performance differently. Increasing the filtration depth from 40 to 100 cm improved ammonium removal from 54.2% to 75%. On the other hand, decreasing the treatment surface area from 0.4 to 0.25 m²/PE resulted in a drop in performance (40.3% removal), while no negative effects were observed for particulate pollution. The use of zeolite slightly improved ammonium removal (60.1%), but the difference was not significant.
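For clarity, the removal percentages quoted above relate inlet and outlet concentrations as removal (%) = (C_in − C_out) / C_in × 100. A tiny sketch with made-up concentrations:

```python
# Removal efficiency from inlet/outlet concentrations; values are
# hypothetical and only illustrate how a figure like 75% removal arises.
def removal_percent(c_in_mg_l, c_out_mg_l):
    return (c_in_mg_l - c_out_mg_l) / c_in_mg_l * 100.0

c_in, c_out = 60.0, 15.0        # mg NH4-N/L, assumed values
print(f"{removal_percent(c_in, c_out):.1f}% ammonium removal")
```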