Temporal artery temperature measurements versus bladder temperature in critically ill patients, a prospective observational study
Purpose: Accurate measurement of body temperature is important for the timely detection of fever or hypothermia in critically ill patients. In this prospective study, we evaluated whether the agreement between temperature measurements obtained with a temporal artery thermometer (TAT; test method) and bladder catheter-derived temperature measurements (BT; reference method) is sufficient for clinical practice in critically ill patients. Methods: Patients acutely admitted to the Intensive Care Unit were included. After BT was recorded, TAT measurements were performed by two independent researchers (TAT1; TAT2). The agreement between TAT and BT was assessed using Bland-Altman plots. Clinically acceptable limits of agreement (LOA) were defined a priori (<0.5°C). Subgroup analysis was performed in patients receiving norepinephrine. Results: In total, 90 critically ill patients (64 males; mean age 62 years) were included. The observed mean difference (TAT-BT) was 0.12°C (95% LOA -1.08°C to +1.32°C) for TAT1 and 0.14°C (95% LOA -1.05°C to +1.33°C) for TAT2. 36% (TAT1) and 42% (TAT2) of all paired measurements failed to meet the acceptable LOA of 0.5°C. Subgroup analysis showed that when patients were receiving intravenous norepinephrine, the measurements of the test method deviated more from the reference method (p = NS). Conclusion: The TAT is not sufficiently accurate for clinical practice in critically ill adults.
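A minimal sketch of the Bland-Altman computation behind the figures above (mean difference and 95% limits of agreement), assuming paired temperature arrays; the simulated values are placeholders, not the study measurements.

```python
import numpy as np

rng = np.random.default_rng(0)
bt = 37.0 + rng.normal(0.0, 0.8, 90)   # bladder temperature (reference), °C; simulated
tat = bt + rng.normal(0.12, 0.6, 90)   # temporal artery readings (test), °C; simulated

diff = tat - bt                        # paired differences (TAT - BT)
bias = diff.mean()                     # mean difference (bias)
sd = diff.std(ddof=1)
loa_low, loa_high = bias - 1.96 * sd, bias + 1.96 * sd

print(f"bias = {bias:.2f} °C, 95% LOA = ({loa_low:.2f}, {loa_high:.2f}) °C")
# share of pairs outside the clinically acceptable ±0.5 °C window
print(f"pairs outside ±0.5 °C: {(np.abs(diff) > 0.5).mean():.0%}")
```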
Entry-level career paths in the life sciences: generic skills in Dutch job postings
The importance of generic skills for life scientists is commonly recognised by employers, graduates, and higher education institutes. As it remains unclear which generic skills are relevant for different life sciences career paths, this study analyses 179 Dutch entry-level job postings to give an overview that can inform and inspire universities and students. We deductively coded nine career paths: life sciences industry, PhD student, quality compliance, research-related, sales & business, communication/education, information technology, consultancy, and policy. Generic skills were coded using an adapted categorisation of 46 generic skills within four categories: self, others, information, and tasks. Descriptive statistics and cluster analysis showed that although language, communication, and collaboration were the most frequently requested skills, the requested generic skills also differed between career paths and clusters. We conclude that while some generic skills are important across the board, others are relevant for specific life sciences career paths. To educate skilled life scientists, universities should consider flexibly integrating these generic skills into their life sciences programmes.
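The abstract does not specify which clustering algorithm was used; the sketch below shows one plausible approach (Ward hierarchical clustering on a career-path-by-skill frequency matrix) with made-up frequencies, purely to illustrate the kind of analysis described.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

paths = ["industry", "PhD student", "quality", "research", "sales", "communication"]
# rows: career paths; columns: fraction of postings requesting each generic
# skill (illustrative values, not the study's coded data)
skills = np.array([
    [0.8, 0.6, 0.4, 0.2],
    [0.9, 0.7, 0.8, 0.1],
    [0.7, 0.5, 0.3, 0.6],
    [0.9, 0.8, 0.7, 0.2],
    [0.8, 0.9, 0.2, 0.5],
    [0.9, 0.9, 0.3, 0.4],
])

Z = linkage(skills, method="ward")               # agglomerative clustering
labels = fcluster(Z, t=3, criterion="maxclust")  # cut tree into 3 clusters
for path, lab in zip(paths, labels):
    print(f"{path}: cluster {lab}")
```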
Markers of sulfadoxine-pyrimethamine resistance in Eastern Democratic Republic of Congo; implications for malaria chemoprevention.
BACKGROUND: Sulfadoxine-pyrimethamine (SP) is a cornerstone of malaria chemoprophylaxis and is considered for programmes in the Democratic Republic of Congo (DRC). However, SP efficacy is threatened by drug resistance, which is conferred by mutations in the dhfr and dhps genes. The World Health Organization has specified that intermittent preventive treatment for infants (IPTi) with SP should be implemented only if the prevalence of the dhps K540E mutation is under 50%. Limited current data on the prevalence of resistance-conferring mutations are available from Eastern DRC; the current study aimed to address this knowledge gap. METHODS: Dried blood-spot samples were collected from clinically suspected malaria patients [outpatient department (OPD)] and pregnant women attending antenatal care (ANC) at four sites in North and South Kivu, DRC. Quantitative PCR (qPCR) was performed on samples from individuals with positive and with negative rapid diagnostic test (RDT) results. Dhps K540E and A581G and dhfr I164L were assessed by nested PCR followed by allele-specific primer extension and detection by multiplex bead-based assays. RESULTS: Across populations, Plasmodium falciparum parasite prevalence was 47.9% (1160/2421) by RDT and 71.7% (1763/2421) by qPCR. Parasite density measured by qPCR in RDT-negative, qPCR-positive samples was very low, with a median of 2.3 parasites/µL (IQR 0.5-25.2). Resistance genotyping succeeded in 86.2% (937/1086) of RDT-positive samples and 55.5% (361/651) of RDT-negative/qPCR-positive samples. The prevalence of dhps K540E was high across sites (50.3-87.9%), with strong evidence for differences between sites (p < 0.001). Dhps A581G mutants were less prevalent (12.7-47.2%). The dhfr I164L mutation was found in one sample. CONCLUSIONS: The prevalence of the SP resistance marker dhps K540E exceeds 50% in all four study sites in North and South Kivu, DRC. K540E mutations regularly co-occurred with dhps A581G mutations but not with the dhfr I164L mutation. The current results do not support implementation of IPTi with SP in the study area.
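The between-site comparison ("strong evidence for differences between sites, p < 0.001") is the kind of result a chi-square test on a site-by-genotype contingency table produces. The counts below are hypothetical, chosen only to fall within the reported 50.3-87.9% prevalence range.

```python
from scipy.stats import chi2_contingency

# rows: four study sites; columns: [dhps K540E mutant, wild type]
# hypothetical counts, not the study data
counts = [
    [101, 100],   # ~50% mutant
    [140,  80],   # ~64% mutant
    [160,  60],   # ~73% mutant
    [176,  24],   # ~88% mutant
]
chi2, p, dof, _ = chi2_contingency(counts)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.2g}")
```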
A Prediction Model for Successful Increase of Adalimumab Dose Intervals in Patients with Crohn’s Disease: Secondary Analysis of the Pragmatic Open-Label Randomised Controlled Non-inferiority LADI Trial
Background: In the pragmatic open-label randomised controlled non-inferiority LADI trial, we showed that increasing adalimumab (ADA) dose intervals was non-inferior to conventional dosing with respect to persistent flares in patients with Crohn’s disease (CD) in clinical and biochemical remission. Aims: To develop a prediction model that identifies patients who can successfully increase their ADA dose interval, based on a secondary analysis of the trial data. Methods: Patients in the intervention group of the LADI trial increased ADA intervals to 3 and then to 4 weeks. The dose interval increase was defined as successful when patients had no persistent flare (> 8 weeks), no intervention-related severe adverse events, and no rescue medication use during the study, and were on an increased dose interval while in clinical and biochemical remission at week 48. Prediction models were based on logistic regression with relaxed LASSO and were internally validated using bootstrap optimism correction. Results: We included 109 patients, of whom 60.6% successfully increased their dose interval. Patients who were active smokers (odds ratio [OR] 0.90), had previous CD-related intra-abdominal surgeries (OR 0.85), proximal small bowel disease (OR 0.92), a higher Harvey-Bradshaw Index (OR 0.99), or higher faecal calprotectin (OR 0.997) were less likely to successfully increase their dose interval. The model had fair discriminative ability (AUC = 0.63), and net benefit analysis showed that it could be used to select patients who could increase their dose interval. Conclusion: The final prediction model seems promising for selecting patients who could successfully increase their ADA dose interval, but it should be validated externally before being applied in clinical practice. Clinical Trial Registration Number: ClinicalTrials.gov, NCT03172377.
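A hedged sketch of the modelling approach described above (penalised logistic regression, internally validated with bootstrap optimism correction of the AUC), on synthetic data. The trial used relaxed LASSO, which additionally refits the selected predictors with less shrinkage; plain L1-penalised regression is shown here for brevity.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegressionCV
from sklearn.metrics import roc_auc_score

# synthetic stand-in for the 109-patient dataset
X, y = make_classification(n_samples=109, n_features=8, random_state=1)

model = LogisticRegressionCV(penalty="l1", solver="liblinear", Cs=10, cv=5)
model.fit(X, y)
apparent_auc = roc_auc_score(y, model.predict_proba(X)[:, 1])

# bootstrap optimism correction: refit on resamples, compare resample AUC
# with the AUC of that refitted model on the original data
rng = np.random.default_rng(1)
optimism = []
for _ in range(100):
    idx = rng.integers(0, len(y), len(y))
    m = LogisticRegressionCV(penalty="l1", solver="liblinear", Cs=10, cv=5)
    m.fit(X[idx], y[idx])
    boot_auc = roc_auc_score(y[idx], m.predict_proba(X[idx])[:, 1])
    test_auc = roc_auc_score(y, m.predict_proba(X)[:, 1])
    optimism.append(boot_auc - test_auc)

print(f"optimism-corrected AUC = {apparent_auc - np.mean(optimism):.2f}")
```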
Association between i.v. thrombolysis volume and door-to-needle times in acute ischemic stroke
Centralization of intravenous thrombolysis (IVT) for acute ischemic stroke in high-volume centers is believed to improve door-to-needle times (DNT), but limited data support this assumption. We examined the association between DNT and IVT volume in a large Dutch province. We identified consecutive patients treated with IVT between January 2009 and 2013. Based on annualized IVT volume, hospitals were categorized as low-volume (≤24), medium-volume (25-49), or high-volume (≥50 IVT treatments per year). In logistic regression analysis, low-volume hospitals were used as the reference category. Of 17,332 stroke patients from 11 participating hospitals, 1962 received IVT (11.3%). We excluded 140 patients because of unknown DNT (n = 86) or in-hospital stroke (n = 54). There were two low-volume hospitals (101 patients in total), five medium-volume hospitals (747 patients), and four high-volume hospitals (974 patients). Median DNT was shorter in high-volume hospitals (30 min) than in medium-volume (42 min, p < 0.001) and low-volume hospitals (38 min, p < 0.001). Compared to low-volume centers, patients admitted to high-volume hospitals had a higher chance of DNT < 30 min (adjusted OR 3.13, 95% CI 1.70-5.75), a lower risk of symptomatic intracerebral hemorrhage (adjusted OR 0.39, 95% CI 0.16-0.92), and a lower mortality risk (adjusted OR 0.45, 95% CI 0.21-1.01). There was no difference in DNT between low- and medium-volume hospitals, and onset-to-needle times (ONT) did not differ between the groups. Hospitals in this Dutch province generally achieved short DNTs. Despite this overall good performance, higher IVT volumes were associated with shorter DNTs and lower complication risks; the ONT was not associated with IVT volume.
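A minimal illustration of the adjusted-odds-ratio analysis, with low-volume hospitals as the reference category. The admissions below are simulated, and the real analysis adjusted for patient-level covariates not modelled here.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 1822  # roughly the analysed cohort size
df = pd.DataFrame({
    "volume": rng.choice(["low", "medium", "high"], size=n, p=[0.06, 0.41, 0.53]),
})
# simulate probability of achieving DNT < 30 min by volume category
p = df["volume"].map({"low": 0.15, "medium": 0.18, "high": 0.35})
df["dnt_lt_30"] = (rng.random(n) < p).astype(int)

fit = smf.logit("dnt_lt_30 ~ C(volume, Treatment(reference='low'))", data=df).fit()
print(np.exp(fit.params))      # odds ratios vs. low-volume hospitals
print(np.exp(fit.conf_int()))  # 95% confidence intervals
```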
The nature of the self: Neural analyses and heritability estimates of self-evaluations in middle childhood
How neural correlates of self-concept are influenced by environmental versus genetic factors is not yet fully understood. We investigated heritability estimates of behavioral and neural correlates of self-concept in middle childhood, since this phase is an important time window for taking on new social roles in academic and social contexts. To do so, a validated self-concept fMRI task was administered to a twin sample of 345 participants aged 7 to 9 years. In the self-concept condition, participants indicated whether academic and social traits applied to them, whereas the control condition required trait categorization. The self-processing activation analyses (n = 234) revealed stronger medial prefrontal cortex (mPFC) activation for the self condition than for the control condition. This effect was more pronounced for social than for academic self-traits, whereas stronger dorsolateral prefrontal cortex (DLPFC) activation was observed for academic than for social self-evaluations. Behavioral genetic modeling (166 complete twin pairs) revealed that 25–52% of the variation in academic self-evaluations was explained by genetic factors, whereas 16–49% of the variation in social self-evaluations was explained by shared environmental factors. Neural genetic modeling (91 complete twin pairs) confirmed genetic and unique environmental influences on variation in mPFC and anterior prefrontal cortex (PFC) activation for academic self-evaluations, whereas anterior PFC activation for social self-evaluations was additionally influenced by shared environmental factors. This indicates that environmental context possibly has a larger impact on the behavioral and neural correlates of social self-concept at a young age. This is the first study to demonstrate in a young twin sample that self-concept depends on both genetic and environmental factors, with the relative contributions varying by domain.
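The heritability estimates above come from behavioural genetic (ACE) modelling; Falconer's formulas give a back-of-the-envelope version of the same variance decomposition from monozygotic and dizygotic twin correlations. The correlations below are illustrative placeholders, not the study's values.

```python
# Falconer's approximation of the ACE decomposition from twin correlations
r_mz = 0.55  # monozygotic twin correlation (placeholder)
r_dz = 0.35  # dizygotic twin correlation (placeholder)

a2 = 2 * (r_mz - r_dz)   # A: additive genetic variance
c2 = 2 * r_dz - r_mz     # C: shared environmental variance
e2 = 1 - r_mz            # E: unique environmental variance (plus error)
print(f"A = {a2:.2f}, C = {c2:.2f}, E = {e2:.2f}")
```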
Pros and cons of streamlining and use of computerised clinical decision support systems to future-proof oncological multidisciplinary team meetings.
INTRODUCTION: Nearly every patient with cancer is now discussed in a multidisciplinary team meeting (MDTM) to determine an optimal treatment plan, but the growth in the number of patients to be discussed is unsustainable. Streamlining and the use of computerised clinical decision support systems (CCDSSs) are two major ways to restructure MDTMs. Streamlining is the process of selecting which patients need to be discussed, and in which type of MDTM. With CCDSSs, patient data are automatically loaded into the minutes and a guideline-based treatment proposal is generated. We aimed to identify the pros and cons of streamlining and CCDSSs. METHODS: Semi-structured interviews were conducted with Dutch MDTM participants. With purposive sampling we maximised variation in participants' characteristics. Interview data were thematically analysed. RESULTS: Thirty-five interviews were analysed. All interviewees agreed on the need to change the current MDTM workflow. Streamlining suggestions were thematised based on standard versus complex cases and the location of the MDTM (i.e. local, regional or nationwide). Interviewees suggested easing the pressure on MDTMs by discussing standard cases briefly, not at all, or outside the MDTM with only two to three specialists. Complex cases should be discussed in tumour-type-specific regional MDTMs, and highly complex cases by regional or nationwide expert teams. Categorising patients as standard or complex was considered the greatest challenge of streamlining. CCDSSs were recognised as promising, although none of the interviewees had used one. Their assumed advantages were the capacity to generate protocolised treatment proposals based on automatically uploaded patient data, to unify treatment proposals, and to facilitate research; however, they were thought to limit the freedom to deviate from the treatment advice. CONCLUSION: To make oncological MDTMs sustainable, methods of streamlining should be developed and introduced. Physicians still have doubts about the value of CCDSSs.