Implementing SBIRT in a Critical Access Emergency Department
Purpose: Universal screening and brief intervention with referral to treatment (SBIRT) has become best practice for emergency departments (EDs) over the last two decades. Given the prevalence of alcohol use and the associated health impacts of drinking, EDs are well positioned to be on the front line of screening for risky drinking. The available literature is clear in its consensus that universal screening for alcohol use in the ED is critical to identifying people at high risk from drinking and to improving health outcomes.
Aims: This project aimed to implement an SBIRT process in a critical access ED. To achieve this global aim, the project team developed an SBIRT process and educated nurses and providers on its use in the department.
Methods: The project team performed a two-month retrospective chart review to determine the baseline rate of alcohol screening in the department. An SBIRT process was then implemented in the unit. After implementation, a second two-month chart review measured staff usage of the new procedure.
Results: Over the two-month implementation period, the percentage of patients in the ED screened for alcohol use increased from an average of sixty-five percent before the intervention to seventy-nine percent after.
Conclusions: Increased alcohol screening for patients in a critical access ED is possible with education and buy-in from clinical staff. The existing electronic screener tool was widely preferred to the newer, paper AUDIT-C tool. Embedding the new screener tool in the electronic chart may be a way to increase convenience and therefore its adoption.
Keywords: SBIRT, alcohol use disorder, emergency department alcohol screening
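As background to the screening tools discussed in the conclusions above, AUDIT-C totals are conventionally computed by summing three items scored 0-4 each. The sketch below is illustrative only: the function names are invented here, and the cut-offs (commonly cited as >=4 for men and >=3 for women) are general guidance rather than thresholds reported by this project.

```python
# Minimal sketch of AUDIT-C scoring (illustrative; not this project's actual tool).
# Each of the three AUDIT-C items is scored 0-4, giving a total of 0-12.
# Commonly cited positive-screen thresholds: >=4 for men, >=3 for women (assumption).

def audit_c_score(q1: int, q2: int, q3: int) -> int:
    """Sum the three AUDIT-C item scores (each 0-4)."""
    for item in (q1, q2, q3):
        if not 0 <= item <= 4:
            raise ValueError("Each AUDIT-C item must be scored 0-4")
    return q1 + q2 + q3

def positive_screen(total: int, sex: str) -> bool:
    """Apply commonly cited AUDIT-C cut-offs (>=4 men, >=3 women)."""
    threshold = 4 if sex.lower() == "male" else 3
    return total >= threshold

if __name__ == "__main__":
    total = audit_c_score(2, 1, 1)
    print(total, positive_screen(total, "female"))  # 4 True
```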
Computer-aided diagnosis for [123I]FP-CIT imaging: impact on clinical reporting
Background
For [123I]FP-CIT imaging, a number of algorithms have shown high performance in distinguishing normal patient images from those with disease, but none have yet been tested as part of reporting workflows. This study aims to evaluate the impact on reporters' performance of a computer-aided diagnosis (CADx) tool developed from established machine learning technology. Three experienced [123I]FP-CIT reporters (two radiologists and one clinical scientist) were asked to visually score 155 reconstructed clinical and research images on a 5-point diagnostic confidence scale (read 1). Once completed, the process was repeated (read 2). Immediately after submitting each image score for the second time, the CADx system output was displayed to the reporters alongside the image data. With this information available, the reporters scored each image a third time (read 3). Comparisons between reads 1 and 2 provided evidence of intra-operator reliability, and differences between reads 2 and 3 showed the impact of the CADx.
Results
The performance of all reporters showed a degree of variability when images were assessed by visual analysis alone. However, inclusion of CADx improved consistency between reporters for both clinical and research data. The introduction of CADx increased the accuracy of the radiologists when reporting (unfamiliar) research images but had less impact on the clinical scientist, and caused no significant change in accuracy for the clinical data.
Conclusions
The outcomes of this study indicate the value of CADx as a diagnostic aid in the clinic and encourage further development towards a more refined incorporation into clinical practice.
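The abstract does not state which statistic was used to summarise intra-operator reliability between reads 1 and 2; a common choice for ordinal 5-point confidence scores is a weighted Cohen's kappa, sketched below with hypothetical scores.

```python
# Sketch: intra-reader agreement between read 1 and read 2 on a 5-point confidence
# scale, quantified with a quadratic-weighted Cohen's kappa. The abstract does not
# name the statistic used; this is one conventional choice, on hypothetical data.
from sklearn.metrics import cohen_kappa_score

# Hypothetical scores for a handful of images (1 = definitely normal ... 5 = definitely abnormal)
read1 = [1, 2, 5, 4, 3, 1, 5, 2]
read2 = [1, 3, 5, 4, 3, 2, 5, 2]

kappa = cohen_kappa_score(read1, read2, weights="quadratic")
print(f"Quadratic-weighted kappa (read 1 vs read 2): {kappa:.2f}")
```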
Renin-angiotensin-aldosterone system polymorphisms: a role or a hole in occurrence and long-term prognosis of acute myocardial infarction at young age
Background
The renin-angiotensin-aldosterone system (RAAS) is involved in cardiovascular homeostasis, as shown by previous studies reporting a positive association between specific RAAS genotypes and an increased risk of myocardial infarction. However, its prognostic role over long-term follow-up has not yet been investigated. The aim of the study was to evaluate the influence of the most studied RAAS single nucleotide polymorphisms (SNPs) on the occurrence and long-term prognosis of acute myocardial infarction (AMI) at a young age in an Italian population.
Methods
The study population consisted of 201 patients and 201 controls, matched for age and sex (mean age 40 ± 4 years; 90.5% males). The most frequent conventional risk factors were smoking (p < 0.001), family history of coronary artery disease (p < 0.001), hypercholesterolemia (p = 0.001) and hypertension (p = 0.002). The tested genetic polymorphisms were angiotensin-converting enzyme insertion/deletion (ACE I/D), angiotensin II type 1 receptor (AGTR1) A1166C and aldosterone synthase (CYP11B2) C-344T. Over a long-term follow-up (9 ± 4 years), we compared the genetic polymorphisms of patients with and without events (cardiac death, myocardial infarction, revascularization procedures).
Results
We found a borderline significant association between the occurrence of AMI and the ACE I/D polymorphism (DD genotype, 42% in cases vs 31% in controls; p = 0.056). The DD genotype remained statistically associated with the incidence of AMI after adjustment for clinical confounders. During the 9-year follow-up (65 events, including 13 deaths), on the other hand, a role emerged for AGTR1: AC heterozygotes were more frequent in the event group (p = 0.016), although this association was not independent of clinical confounders. Nevertheless, the Kaplan-Meier event-free curves appear to confirm the unfavourable role of this polymorphism.
Conclusion
Polymorphisms in RAAS genes can be important in the onset of a first AMI in young patients (ACE, CYP11B2 polymorphisms), but not in disease progression over a long follow-up period. Larger collaborative studies are needed to confirm these results.
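To illustrate the kind of case-control comparison reported above, the sketch below runs a chi-square test on a 2x2 genotype table. The counts are only approximated from the quoted percentages (42% of 201 cases and 31% of 201 controls carrying DD), and the study's own analysis (including adjustment for confounders) may have differed, so the reported p = 0.056 will not be reproduced exactly.

```python
# Sketch: case-control association test for the ACE DD genotype.
# Counts are approximated from the percentages quoted in the abstract; the study's
# own test (and any covariate adjustment) may differ, so the p-value will not match
# the reported p = 0.056 exactly.
from scipy.stats import chi2_contingency

table = [
    [84, 117],   # cases:    ~42% of 201 with DD, remainder I/D or I/I
    [62, 139],   # controls: ~31% of 201 with DD, remainder I/D or I/I
]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")
```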
Comparison of machine learning and semi-quantification algorithms for [123I]FP-CIT classification: the beginning of the end for semi-quantification?
Background
Semi-quantification methods are well established in the clinic for assisted reporting of [123I]ioflupane images. Arguably, these are limited diagnostic tools. Recent research has demonstrated the potential for improved classification performance offered by machine learning algorithms. A direct comparison between methods is required to establish whether a move towards widespread clinical adoption of machine learning algorithms is justified.
This study compared the performance of three machine learning algorithms with that of a range of semi-quantification methods, using the Parkinson’s Progression Markers Initiative (PPMI) research database and a locally derived clinical database for validation. The machine learning algorithms were based on support vector machine classifiers with three different sets of features:
Voxel intensities
Principal components of image voxel intensities
Striatal binding ratios from the putamen and caudate.
Semi-quantification methods were based on striatal binding ratios (SBRs) from both putamina, with and without consideration of the caudates. Normal limits for the SBRs were defined through four different methods (a minimal sketch of two of these threshold calculations follows this list):
Minimum of age-matched controls
Mean minus 1/1.5/2 standard deviations from age-matched controls
Linear regression of normal patient data against age (minus 1/1.5/2 standard errors)
Selection of the optimum operating point on the receiver operator characteristic curve from normal and abnormal training data
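A minimal sketch of two of the normal-limit definitions listed above, assuming a table of control ages and SBRs; the variable names and synthetic data are illustrative, not the study's implementation.

```python
# Sketch of two semi-quantification normal-limit definitions:
# (1) mean minus k standard deviations of age-matched control SBRs, and
# (2) linear regression of control SBRs against age, lowered by k standard errors.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
control_age = rng.uniform(40, 80, size=50)                       # synthetic control ages
control_sbr = 3.0 - 0.02 * control_age + rng.normal(0, 0.2, 50)  # synthetic control SBRs

def fixed_lower_limit(sbr: np.ndarray, k: float) -> float:
    """Mean minus k standard deviations of the control SBRs."""
    return sbr.mean() - k * sbr.std(ddof=1)

def age_regressed_lower_limit(age: np.ndarray, sbr: np.ndarray, k: float, query_age: float) -> float:
    """Regression of SBR on age, lowered by k standard errors of the estimate."""
    fit = stats.linregress(age, sbr)
    residuals = sbr - (fit.intercept + fit.slope * age)
    see = np.sqrt(np.sum(residuals ** 2) / (len(sbr) - 2))  # standard error of the estimate
    return fit.intercept + fit.slope * query_age - k * see

print(fixed_lower_limit(control_sbr, k=1.5))
print(age_regressed_lower_limit(control_age, control_sbr, k=1.5, query_age=65))
```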
Each machine learning and semi-quantification technique was evaluated with stratified, nested 10-fold cross-validation, repeated 10 times.
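The evaluation scheme described above can be sketched as follows, assuming the features have already been extracted into a subjects-by-features SBR matrix; the classifier settings and synthetic data are illustrative rather than the study's exact configuration.

```python
# Sketch: a support vector machine on striatal binding ratio features, evaluated
# with stratified, nested 10-fold cross-validation repeated 10 times. Feature
# extraction from images is out of scope; X is a (subjects x features) SBR matrix.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import GridSearchCV, StratifiedKFold, cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 4))          # e.g. left/right putamen and caudate SBRs (synthetic)
y = rng.integers(0, 2, size=120)       # 0 = normal, 1 = Parkinsonian (synthetic labels)

model = make_pipeline(StandardScaler(), SVC(kernel="linear"))
param_grid = {"svc__C": [0.01, 0.1, 1, 10]}

accuracies = []
for repeat in range(10):                                     # repeated 10 times
    inner = StratifiedKFold(n_splits=10, shuffle=True, random_state=repeat)
    outer = StratifiedKFold(n_splits=10, shuffle=True, random_state=100 + repeat)
    search = GridSearchCV(model, param_grid, cv=inner)       # inner loop tunes C
    scores = cross_val_score(search, X, y, cv=outer)         # outer loop estimates accuracy
    accuracies.extend(scores)

print(f"mean accuracy: {np.mean(accuracies):.2f} +/- {np.std(accuracies):.2f}")
```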
Results
The mean accuracy of the semi-quantitative methods for classification of local data into Parkinsonian and non-Parkinsonian groups varied from 0.78 to 0.87, contrasting with 0.89 to 0.95 for classifying PPMI data into healthy control and Parkinson’s disease groups. The machine learning algorithms gave mean accuracies between 0.88 and 0.92 for local data and between 0.95 and 0.97 for PPMI data.
Conclusions
Classification performance was lower for the local database than for the research database for both the semi-quantitative and machine learning algorithms. However, for both databases, the machine learning methods generated equal or higher mean accuracies (with lower variance) than any of the semi-quantification approaches. The gain in performance from using machine learning algorithms compared with semi-quantification was relatively small and may be insufficient, when considered in isolation, to offer significant advantages in the clinical context.
Advantages of the single delay model for the assessment of insulin sensitivity from the intravenous glucose tolerance test
The Minimal Model (MM), used to assess insulin sensitivity (IS) from Intra-Venous Glucose Tolerance Test (IVGTT) data, suffers from frequent lack of identifiability (parameter estimates with coefficients of variation (CV) exceeding 52%). The recently proposed Single Delay Model (SDM) is evaluated here as a practical alternative.
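For orientation, the classical Bergman Minimal Model referred to above is usually written as the pair of differential equations below; this is the standard textbook form recalled from the general literature, and the paper's own notation (and the SDM equations) may differ.

```latex
% Standard Bergman Minimal Model for glucose kinetics during an IVGTT
% (general literature form; the paper's exact notation may differ).
\begin{align}
  \frac{dG(t)}{dt} &= -\bigl[p_1 + X(t)\bigr]\,G(t) + p_1 G_b, & G(0) &= G_0,\\
  \frac{dX(t)}{dt} &= -p_2\,X(t) + p_3\bigl[I(t) - I_b\bigr], & X(0) &= 0,
\end{align}
% with insulin sensitivity defined as S_I = p_3 / p_2.
```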
Critical Dietetics and Sustainable Food Systems
In this chapter, we invite readers to consider a food system that is based on values where individual health, the health of society (the social system) and ecosystem health are of equal importance. With this as a lens, there is a clear need to move beyond the biosciences and to consider transdisciplinary approaches as important for nutrition and dietetics in today's and tomorrow's reality.
Measurement of the Bottom-Strange Meson Mixing Phase in the Full CDF Data Set
We report a measurement of the bottom-strange meson mixing phase β_s using the time evolution of B0_s → J/ψ(→ μ+μ−) φ(→ K+K−) decays in which the quark-flavor content of the bottom-strange meson is identified at production. This measurement uses the full data set of proton-antiproton collisions at √s = 1.96 TeV collected by the Collider Detector experiment at the Fermilab Tevatron, corresponding to 9.6 fb⁻¹ of integrated luminosity. We report confidence regions in the two-dimensional space of β_s and the B0_s decay-width difference ΔΓ_s, and measure β_s ∈ [−π/2, −1.51] ∪ [−0.06, 0.30] ∪ [1.26, π/2] at the 68% confidence level, in agreement with the standard model expectation. Assuming the standard model value of β_s, we also determine ΔΓ_s = 0.068 ± 0.026 (stat) ± 0.009 (syst) ps⁻¹ and the mean B0_s lifetime τ_s = 1.528 ± 0.019 (stat) ± 0.009 (syst) ps, which are consistent and competitive with determinations by other experiments. (Phys. Rev. Lett. 109, 171802 (2012))
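For reference, the mixing phase measured above is conventionally defined in terms of CKM matrix elements as follows; the definition and the approximate standard model magnitude are taken from the general literature, not from the abstract itself.

```latex
% Conventional definition of the B0_s mixing phase in terms of CKM matrix elements
% (general convention; not quoted from the abstract above):
\beta_s \equiv \arg\!\left(-\,\frac{V_{ts} V_{tb}^{*}}{V_{cs} V_{cb}^{*}}\right),
% which the standard model predicts to be small, of order 0.02 rad.
```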
Biosignals reflect pair-dynamics in collaborative work : EDA and ECG study of pair-programming in a classroom environment
Collaboration is a complex phenomenon, where intersubjective dynamics can greatly affect the productive outcome. Evaluation of collaboration is thus of great interest, and can potentially help achieve better outcomes and performance. However, quantitative measurement of collaboration is difficult, because much of the interaction occurs in the intersubjective space between collaborators. Manual observation and/or self-reports are subjective, laborious, and have poor temporal resolution. The problem is compounded in natural settings where task activity and response compliance cannot be controlled. Physiological signals provide an objective means to quantify intersubjective rapport (as synchrony), but require novel methods to support broad deployment outside the lab. We studied 28 student dyads during a self-directed classroom pair-programming exercise. Sympathetic and parasympathetic nervous system activation was measured during task performance using electrodermal activity and electrocardiography. Results suggest that (a) we can isolate cognitive processes (mental workload) from confounding environmental effects, and (b) electrodermal signals show role-specific but correlated affective response profiles. We demonstrate the potential for social physiological compliance to quantify pair work in natural settings, with no experimental manipulation of participants required. Our objective approach has high temporal resolution, is scalable, non-intrusive, and robust.
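The abstract does not detail how physiological compliance was computed; one simple and widely used index is the windowed correlation between the two partners' electrodermal signals, sketched below on synthetic data. This is an assumed illustration, not the study's actual pipeline.

```python
# Sketch: a simple physiological-compliance index for a dyad, computed as the
# Pearson correlation of the two partners' electrodermal activity (EDA) within
# sliding windows. One common approach; not necessarily the measure used above.
import numpy as np

def windowed_synchrony(eda_a: np.ndarray, eda_b: np.ndarray,
                       fs: float, window_s: float = 30.0, step_s: float = 5.0) -> np.ndarray:
    """Return per-window Pearson correlations between two equally long EDA signals."""
    window = int(window_s * fs)
    step = int(step_s * fs)
    scores = []
    for start in range(0, len(eda_a) - window + 1, step):
        a = eda_a[start:start + window]
        b = eda_b[start:start + window]
        scores.append(np.corrcoef(a, b)[0, 1])
    return np.array(scores)

if __name__ == "__main__":
    fs = 4.0  # Hz, a typical wearable EDA sampling rate (assumption)
    t = np.arange(0, 600, 1 / fs)
    rng = np.random.default_rng(1)
    shared = np.cumsum(rng.normal(size=t.size))          # shared slow drift (synthetic)
    eda_a = shared + rng.normal(scale=2.0, size=t.size)
    eda_b = shared + rng.normal(scale=2.0, size=t.size)
    print(windowed_synchrony(eda_a, eda_b, fs).mean())
```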
Angiogenesis is associated with the onset of hyperplasia in human ductal breast disease
Background
The precise timing of the angiogenic switch and the role of angiogenesis in the development of breast malignancy are currently unknown.
Methods
The expression of CD31 (pan-endothelial cells (ECs)), endoglin (actively proliferating ECs), hypoxia-inducible factor-1alpha (HIF-1alpha), vascular endothelial growth factor-A (VEGF) and tissue factor (TF) was therefore quantified in 140 surgical specimens comprising normal human breast, benign and pre-malignant hyperplastic tissue, and in situ and invasive breast cancer specimens.
Results
Significant increases in angiogenesis (microvessel density) were observed between normal and benign hyperplastic breast tissue (P < 0.005), and between in situ and invasive carcinomas (P < 0.0005). In addition, significant increases in proliferating ECs were observed in benign hyperplastic breast compared with normal breast (P < 0.05), and in invasive compared with in situ cancers (P < 0.005). HIF-1alpha, VEGF and TF expression were significantly associated with increases in both angiogenesis and proliferating ECs (P < 0.05). Moreover, HIF-1alpha was expressed by 60-75% of the hyperplastic lesions, and a significant association was observed between VEGF and TF in ECs (P < 0.005) and invasive tumour cells (P < 0.01).
Conclusions
These findings are the first to suggest that the angiogenic switch, associated with increases in HIF-1alpha, VEGF and TF expression, occurs at the onset of hyperplasia in the mammary duct, although the greatest increase in angiogenesis occurs with the development of invasion.