133 research outputs found
Effects of oral L-Carnitine supplementation on insulin sensitivity indices in response to glucose feeding in lean and overweight/obese males
Infusion of carnitine has been observed to increase non-oxidative glucose disposal in several studies, but the effect of oral carnitine on glucose disposal in non-diabetic lean vs. overweight/obese humans has not been examined. This study examined the effects of 14 days of oral L-Carnitine L-Tartrate supplementation (LC) on blood glucose, insulin, NEFA and GLP-1 responses to an oral glucose tolerance test (OGTT). Sixteen male participants were recruited (lean (n=8) and overweight/obese (n=8)). After completing a submaximal predictive exercise test, participants were asked to attend three experimental sessions. These three visits were conducted in the morning to obtain fasting blood samples and to conduct 2-h OGTTs. The first visit was a familiarisation trial; the final two visits were conducted two weeks apart following 14 days of ingestion of placebo (PL, 3 g glucose/day) then LC (3 g LC/day), ingested as 2 capsules 3x/day with meals. On each visit blood was drawn at rest and at intervals during the OGTT for analysis of glucose, insulin, non-esterified fatty acids (NEFA) and total glucagon-like peptide-1 (GLP-1). Data obtained were used to determine the usual insulin sensitivity indices (HOMA-IR, AUC glucose, AUC insulin, 1st-phase and 2nd-phase β-cell function, estimated insulin sensitivity index, and estimated metabolic clearance rate). Data were analysed using RM-ANOVA with post-hoc comparisons where appropriate. There was a significant difference between groups in body mass, % fat and BMI, with no significant difference in age or height. Mean (SEM) plasma glucose concentration at 30 minutes was significantly lower (
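Of the indices named in this abstract, the two simplest have closed-form definitions: HOMA-IR from fasting values, and glucose/insulin AUC from the OGTT time course via the trapezoidal rule. A minimal sketch with hypothetical sampling times and values (illustrative only, not the study's data):

```python
def homa_ir(fasting_glucose_mmol_l, fasting_insulin_uU_ml):
    """Homeostatic Model Assessment of Insulin Resistance:
    glucose (mmol/L) x insulin (uU/mL) / 22.5."""
    return fasting_glucose_mmol_l * fasting_insulin_uU_ml / 22.5

def trapezoid_auc(times_min, values):
    """Total area under the curve via the trapezoidal rule."""
    return sum((t1 - t0) * (v0 + v1) / 2
               for t0, t1, v0, v1 in zip(times_min, times_min[1:],
                                         values, values[1:]))

# Hypothetical OGTT sampling points (0, 30, 60, 90, 120 min) and glucose values
times = [0, 30, 60, 90, 120]
glucose = [5.0, 8.2, 7.1, 6.0, 5.4]  # mmol/L

print(homa_ir(5.0, 10.0))            # fasting glucose 5.0 mmol/L, insulin 10 uU/mL
print(trapezoid_auc(times, glucose))
```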
Risk of cardiovascular disease and total mortality in adults with type 1 diabetes: Scottish registry linkage study
<p>Background: Randomized controlled trials have shown the importance of tight glucose control in type 1 diabetes (T1DM), but few recent studies have evaluated the risk of cardiovascular disease (CVD) and all-cause mortality among adults with T1DM. We evaluated these risks in adults with T1DM compared with the non-diabetic population in a nationwide study from Scotland and examined control of CVD risk factors in those with T1DM.</p>
<p>Methods and Findings: The Scottish Care Information-Diabetes Collaboration database was used to identify all people registered with T1DM and aged ≥20 years in 2005–2007 and to provide risk factor data. Major CVD events and deaths were obtained from the national hospital admissions database and death register. The age-adjusted incidence rate ratio (IRR) for CVD and mortality in T1DM (n = 21,789) versus the non-diabetic population (3.96 million) was estimated using Poisson regression. The age-adjusted IRR for first CVD event associated with T1DM versus the non-diabetic population was higher in women (3.0; 95% CI 2.4–3.8, p<0.001) than men (2.3; 2.0–2.7, p<0.001), while the IRR for all-cause mortality associated with T1DM was comparable at 2.6 (2.2–3.0, p<0.001) in men and 2.7 (2.2–3.4, p<0.001) in women. Between 2005 and 2007, among individuals with T1DM, 34 of 123 deaths among 10,173 who were <40 years and 37 of 907 deaths among 12,739 who were ≥40 years had an underlying cause of death of coma or diabetic ketoacidosis. Among individuals 60–69 years, approximately three extra deaths per 100 per year occurred among men with T1DM (28.51/1,000 person-years at risk), and two per 100 per year for women (17.99/1,000 person-years at risk). Overall, 28% of those with T1DM were current smokers, 13% achieved the target HbA1c of <7%, and 37% had very poor (≥9%) glycaemic control. Among those aged ≥40 years, 37% had blood pressures above even conservative targets (≥140/90 mmHg) and 39% were not on a statin. Although many of these risk factors were comparable to those previously reported in other developed countries, CVD and mortality rates may not be generalizable to other countries. Limitations included lack of information on the specific insulin therapy used.</p>
<p>Conclusions: Although the relative risks for CVD and total mortality associated with T1DM in this population have declined relative to earlier studies, T1DM continues to be associated with higher CVD and death rates than the non-diabetic population. Risk factor management should be improved to further reduce risk but better treatment approaches for achieving good glycaemic control are badly needed.</p>
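The study's IRRs were age-adjusted via Poisson regression; the crude rate-ratio arithmetic underlying them can be sketched in a few lines. The counts and person-years below are hypothetical, not the Scottish registry figures, and no age adjustment is performed:

```python
import math

def irr_with_ci(events_exposed, py_exposed, events_ref, py_ref, z=1.96):
    """Crude incidence rate ratio with a Wald 95% confidence interval
    (log-scale standard error: sqrt(1/a + 1/b))."""
    irr = (events_exposed / py_exposed) / (events_ref / py_ref)
    half = z * math.sqrt(1 / events_exposed + 1 / events_ref)
    return irr, (irr * math.exp(-half), irr * math.exp(half))

# Hypothetical event counts and person-years, not the registry data
irr, (lo, hi) = irr_with_ci(300, 100_000, 1_000, 900_000)
print(f"IRR = {irr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```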
A New Process for Organizing Assessments of Social, Economic, and Environmental Outcomes: Case Study of Wildland Fire Management in the USA
Ecological risk assessments typically are organized using the processes of planning (a discussion among managers, stakeholders, and analysts to clarify ecosystem management goals and assessment scope) and problem formulation (evaluation of existing information to generate hypotheses about adverse ecological effects, select assessment endpoints, and develop an analysis plan). These processes require modification to be applicable for integrated assessments that evaluate ecosystem management alternatives in terms of their ecological, economic, and social consequences. We present 8 questions that define the steps of a new process we term integrated problem formulation (IPF), and we illustrate the use of IPF through a retrospective case study comparing 2 recent phases of development of the Fire Program Analysis (FPA) system, a planning and budgeting system for the management of wildland fire throughout publicly managed lands in the United States. IPF extends traditional planning and problem formulation by including the explicit comparison of management alternatives, the valuation of ecological, economic and social endpoints, and the combination or integration of those endpoints. The phase 1, limited-prototype FPA system used a set of assessment endpoints of common form (i.e., probabilities of given flame heights over acres of selected land-resource types), which were specified and assigned relative weights at the local level in relation to a uniform national standard. This approach was chosen to permit system-wide optimization of fire management budget allocations according to a cost-effectiveness criterion. Before full development, however, the agencies abandoned this approach in favor of a phase 2 system that examined locally specified (rather than system-optimized) allocation alternatives and was more permissive as to endpoint form. 
We demonstrate how the IPF process illuminates the nature, rationale, and consequences of these differences, and argue that its early use for the FPA system may have enabled a smoother development path
A primary care, multi-disciplinary disease management program for opioid-treated patients with chronic non-cancer pain and a high burden of psychiatric comorbidity
BACKGROUND: Chronic non-cancer pain is a common problem that is often accompanied by psychiatric comorbidity and disability. The effectiveness of a multi-disciplinary pain management program was tested in a 3-month before-and-after trial. METHODS: Providers in an academic general medicine clinic referred patients with chronic non-cancer pain for participation in a program that combined the skills of internists, clinical pharmacists, and a psychiatrist. Patients were either receiving opioids or being considered for opioid therapy. The intervention consisted of structured clinical assessments, monthly follow-up, pain contracts, medication titration, and psychiatric consultation. Pain, mood, and function were assessed at baseline and 3 months using the Brief Pain Inventory (BPI), the Center for Epidemiological Studies Depression Scale (CESD) and the Pain Disability Index (PDI). Patients were monitored for substance misuse. RESULTS: Eighty-five patients were enrolled. Mean age was 51 years, 60% were male, 78% were Caucasian, and 93% were receiving opioids. Baseline average pain was 6.5 on an 11-point scale. The average CESD score was 24.0, and the mean PDI score was 47.0. Sixty-three patients (73%) completed 3-month follow-up. Fifteen withdrew from the program after identification of substance misuse. Among those completing 3-month follow-up, the average pain score improved to 5.5 (p = 0.003). The mean PDI score improved to 39.3 (p < 0.001). Mean CESD score was reduced to 18.0 (p < 0.001), and the proportion of depressed patients fell from 79% to 54% (p = 0.003). Substance misuse was identified in 27 patients (32%). CONCLUSIONS: A primary care disease management program improved pain, depression, and disability scores over three months in a cohort of opioid-treated patients with chronic non-cancer pain. Substance misuse and depression were common, and many patients in whom substance misuse was identified left the program when they were no longer prescribed opioids. Effective care of patients with chronic pain should include rigorous assessment and treatment of these comorbid disorders and intensive efforts to ensure follow-up.
Applications of aerospace technology
Highlights are presented for the Research Triangle Institute (RTI) Applications Team activities over the past quarter. Progress in fulfilling the requirements of the contract is summarized, along with the status of the eight add-on tasks. New problem statements are presented. Transfer activities for ongoing projects with the NASA Centers are included
Physician and Patient Predictors of Evidence-Based Prescribing in Heart Failure: A Multilevel Study
BACKGROUND: The management of patients with heart failure (HF) needs to account for changeable and complex individual clinical characteristics. Guidelines recommend titrating renin-angiotensin system inhibitors (RAAS-I) to target doses, but physicians do not appear to follow this recommendation sufficiently, and little is known about the physician and patient predictors of adherence. METHODS: The aim was to examine the coherence of primary care (PC) physicians' knowledge and self-perceived competencies regarding RAAS-I with their prescribing behavior, in relation to patient-associated barriers. Cross-sectional follow-up study after a randomized medical educational intervention trial with a seven-month observation period. PC physicians (n = 37) and patients with systolic HF (n = 168) were recruited from practices in Baden-Wuerttemberg. Measurements were knowledge (blueprint-based multiple choice test), self-perceived competencies (questionnaire on global confidence in the therapy and on frequency of use of RAAS-I), and patient variables (age, gender, NYHA functional status, blood pressure, potassium level, renal function). Prescribing data were collected from the trial's documentation. The target variable, prescription of ≥50% of the recommended RAAS-I dose, was investigated using two-level logistic regression models. RESULTS: Patients (69% male, mean age 68.8 years) showed symptomatic and objectified left ventricular (NYHA II vs. III/IV: 51% vs. 49%; mean LVEF 33.3%) and renal (GFR <50%: 22%) impairment. The mean percentage of the RAAS-I target dose was 47%, with 59% of patients receiving ≥50%. Determinants of improved prescribing of RAAS-I were patient age (OR 0.95, CI 0.92-0.99, p = 0.01), physician's global self-confidence at follow-up (OR 1.09, CI 1.02-1.05, p = 0.01) and NYHA class (II vs. III/IV) (OR 0.63, CI 0.38-1.05, p = 0.08). CONCLUSIONS: A change in physicians' confidence as a predictor of RAAS-I dose increase is a new finding that might reflect an intervention effect on physicians' intentions and that might foster novel strategies to improve safe, evidence-based prescribing. These strategies should include targeting knowledge, attitudes and skills
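The odds ratios above come from logistic regression; converting a fitted coefficient to an OR with a Wald interval is a standard step, sketched below. The coefficient and standard error are hypothetical illustrations, not the study's fitted values:

```python
import math

def odds_ratio(beta, se, z=1.96):
    """Convert a logistic-regression coefficient to an odds ratio
    with a Wald 95% confidence interval (exponentiate beta +/- z*se)."""
    return math.exp(beta), (math.exp(beta - z * se), math.exp(beta + z * se))

# Hypothetical coefficient for patient age (per additional year), not the study's fit
or_age, (lo, hi) = odds_ratio(beta=-0.051, se=0.018)
print(f"OR per year of age: {or_age:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

An OR below 1 per year of age, as reported in the abstract, means older patients were less likely to receive ≥50% of the recommended dose.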
Young and vulnerable: Spatial-temporal trends and risk factors for infant mortality in rural South Africa (Agincourt), 1992-2007
<p>Abstract</p> <p>Background</p> <p>Infant mortality is an important indicator of population health in a country. It is associated with several health determinants, such as maternal health, access to high-quality health care, socioeconomic conditions, and public health policy and practices.</p> <p>Methods</p> <p>A spatial-temporal analysis was performed to assess changes in infant mortality patterns between 1992 and 2007 and to identify factors associated with infant mortality risk in the Agincourt sub-district, rural northeast South Africa. Period, sex, refugee status, maternal and fertility-related factors, household mortality experience, distance to nearest primary health care facility, and socio-economic status were examined as possible risk factors. All-cause and cause-specific mortality maps were developed to identify high-risk areas within the study site. The analysis was carried out by fitting Bayesian hierarchical geostatistical negative binomial autoregressive models using Markov chain Monte Carlo simulation. Simulation-based Bayesian kriging was used to produce maps of all-cause and cause-specific mortality risk.</p> <p>Results</p> <p>Infant mortality increased significantly over the study period, largely due to the impact of the HIV epidemic. There was a high burden of neonatal mortality (especially perinatal) with several hot spots observed in close proximity to health facilities. Significant risk factors for all-cause infant mortality were mother's death in the first year (most commonly due to HIV), death of the previous sibling, and an increasing number of household deaths. Being born to a Mozambican mother posed a significant risk for infectious and parasitic deaths, particularly acute diarrhoea and malnutrition.</p> <p>Conclusions</p> <p>This study demonstrates the use of Bayesian geostatistical models in assessing risk factors and producing smooth maps of infant mortality risk in a health and socio-demographic surveillance system. 
Results showed marked geographical differences in mortality risk across a relatively small area. Prevention of vertical transmission of HIV and survival of mothers during the infants' first year in high-prevalence villages need to be urgently addressed, including expanded antenatal testing, prevention of mother-to-child transmission, and improved access to antiretroviral therapy. There is also a need to assess and improve the capacity of district hospitals for emergency obstetric and newborn care. Persisting risk factors, including inadequate provision of clean water and sanitation, are yet to be fully addressed.</p>
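Bayesian kriging as used in the study is considerably more involved, but the core idea of turning point observations into a smooth risk surface can be illustrated with simple Gaussian-kernel smoothing. Everything below (coordinates, rates, bandwidth) is a hypothetical illustration, not the Agincourt data or the paper's method:

```python
import math

def kernel_smooth(points, rates, query, bandwidth=5.0):
    """Gaussian-kernel weighted average of observed rates at `query`
    (coordinates in km); a crude stand-in for geostatistical smoothing."""
    qx, qy = query
    weights = [math.exp(-((qx - x) ** 2 + (qy - y) ** 2) / (2 * bandwidth ** 2))
               for x, y in points]
    return sum(w * r for w, r in zip(weights, rates)) / sum(weights)

# Hypothetical village coordinates (km) and infant mortality rates per 1,000
villages = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
deaths_per_1000 = [40.0, 60.0, 30.0]
smoothed = kernel_smooth(villages, deaths_per_1000, (2.0, 2.0))
print(round(smoothed, 1))
```

Evaluating this on a grid of query points yields a smooth map analogous in spirit to the kriged surfaces described above, though without the model-based uncertainty that Bayesian kriging provides.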
Online patient simulation training to improve clinical reasoning: a feasibility randomised controlled trial
Background Online patient simulations (OPS) are a novel method for teaching clinical reasoning skills to students and could contribute to reducing diagnostic errors. However, little is known about how best to implement and evaluate OPS in medical curricula. The aim of this study was to assess the feasibility, acceptability and potential effects of eCREST — the electronic Clinical Reasoning Educational Simulation Tool. Methods A feasibility randomised controlled trial was conducted with final year undergraduate students from three UK medical schools in academic year 2016/2017 (cohort one) and 2017/2018 (cohort two). Student volunteers were recruited in cohort one via email and on teaching days, and in cohort two eCREST was also integrated into a relevant module in the curriculum. The intervention group received three patient cases and the control group received teaching as usual; allocation ratio was 1:1. Researchers were blind to allocation. Clinical reasoning skills were measured using a survey after 1 week and a patient case after 1 month. Results Across schools, 264 students participated (18.2% of all eligible). Cohort two had greater uptake (183/833, 22%) than cohort one (81/621, 13%). After 1 week, 99/137 (72%) of the intervention and 86/127 (68%) of the control group remained in the study. eCREST improved students’ ability to gather essential information from patients over controls (OR = 1.4; 95% CI 1.1–1.7, n = 148). Of the intervention group, most (80/98, 82%) agreed eCREST helped them to learn clinical reasoning skills. Conclusions eCREST was highly acceptable and improved data gathering skills that could reduce diagnostic errors. Uptake was low but improved when integrated into course delivery. A summative trial is needed to estimate effectiveness
Burst-Time-Dependent Plasticity Robustly Guides ON/OFF Segregation in the Lateral Geniculate Nucleus
Spontaneous retinal activity (known as “waves”) remodels synaptic connectivity to the lateral geniculate nucleus (LGN) during development. Analysis of retinal waves recorded with multielectrode arrays in mouse suggested that a cue for the segregation of functionally distinct (ON and OFF) retinal ganglion cells (RGCs) in the LGN may be a desynchronization in their firing, where ON cells precede OFF cells by one second. Using the recorded retinal waves as input, with two different modeling approaches we explore timing-based plasticity rules for the evolution of synaptic weights to identify key features underlying ON/OFF segregation. First, we analytically derive a linear model for the evolution of ON and OFF weights, to understand how synaptic plasticity rules extract input firing properties to guide segregation. Second, we simulate postsynaptic activity with a nonlinear integrate-and-fire model to compare findings with the linear model. We find that spike-time-dependent plasticity, which modifies synaptic weights based on millisecond-long timing and order of pre- and postsynaptic spikes, fails to segregate ON and OFF retinal inputs in the absence of normalization. Implementing homeostatic mechanisms results in segregation, but only with carefully-tuned parameters. Furthermore, extending spike integration timescales to match the second-long input correlation timescales always leads to ON segregation because ON cells fire before OFF cells. We show that burst-time-dependent plasticity can robustly guide ON/OFF segregation in the LGN without normalization, by integrating pre- and postsynaptic bursts irrespective of their firing order and over second-long timescales. We predict that an LGN neuron will become ON- or OFF-responsive based on a local competition of the firing patterns of neighboring RGCs connecting to it. Finally, we demonstrate consistency with ON/OFF segregation in ferret, despite differences in the firing properties of retinal waves. 
Our model suggests that diverse input statistics of retinal waves can be robustly interpreted by a burst-based rule, which underlies retinogeniculate plasticity across different species
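The burst-based rule described above can be caricatured in a few lines: pairing is decided by a second-long window around each burst, irrespective of firing order, in contrast to millisecond, order-sensitive STDP. All parameter values below are illustrative assumptions, not taken from the paper:

```python
def burst_rule_update(w, pre_burst_times, post_burst_times,
                      window=1.0, lr_plus=0.05, lr_minus=0.02,
                      w_min=0.0, w_max=1.0):
    """Toy burst-time-dependent plasticity: potentiate a synapse when a
    presynaptic burst falls within `window` seconds of any postsynaptic
    burst (either order), otherwise depress it; weight stays in bounds."""
    for t_pre in pre_burst_times:
        paired = any(abs(t_pre - t_post) <= window for t_post in post_burst_times)
        w += lr_plus if paired else -lr_minus
        w = min(max(w, w_min), w_max)
    return w

# Pre bursts at 0 s and 2 s both lie within 1 s of a postsynaptic burst at
# 1 s, so both potentiate regardless of order; the burst at 10 s depresses.
w_new = burst_rule_update(0.5, [0.0, 2.0, 10.0], [1.0])
print(w_new)
```

Because the window is symmetric in time, an ON input leading an OFF input by about a second is treated the same as the reverse order, which is the order-insensitivity the abstract highlights.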