The right time to measure anti-Xa activity in critical illness: pharmacokinetics of therapeutic dose nadroparin
BACKGROUND: Peak anti-Xa activity of the low-molecular-weight heparin nadroparin is measured 3 to 5 hours after subcutaneous injection. In critically ill patients, physiological changes and medical therapies may result in peak activities before or after this interval, possibly impacting dosing. OBJECTIVES: The primary objective was to determine the percentage of critically ill patients with adequately estimated peak activities drawn 3 to 5 hours after subcutaneous administration of a therapeutic dose of nadroparin. Adequate was defined as a peak activity of ≥80% of the actual peak anti-Xa activity. If ≥80% of patients had adequately estimated peak activities in the 3- to 5-hour interval, measurement in this interval was regarded as acceptable. The secondary objective was to determine the pharmacokinetic profile of nadroparin. METHODS: In this single-center, prospective study, we evaluated anti-Xa activities in patients admitted to a general intensive care unit. After ≥4 equal doses of nadroparin, anti-Xa activity was measured according to a 12- to 24-hour sampling scheme. RESULTS: In 25 patients, anti-Xa activities drawn between 3 and 5 hours after administration ranged from 80% to 100% of the actual peak activity. Compared with the threshold of an adequate estimation in at least 20 patients (≥80%), measuring anti-Xa activities in the 3- to 5-hour interval is an acceptable method (1-tailed binomial test; P < .02). We found large interindividual variability in nadroparin exposure (mean ± SD area under the curve from 0 to 12 hours, 10.3 ± 4.8 IU·h/mL) and delayed elimination (t½ range, 4.0-120.9 hours) despite adequate renal function. CONCLUSION: In critically ill patients, measuring anti-Xa activity in a 3- to 5-hour interval after subcutaneous injection of therapeutic nadroparin is an acceptable method to estimate the actual peak anti-Xa activity.
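A minimal sketch of the kind of one-tailed binomial test described above, assuming all 25 observed peaks met the adequacy criterion and that the null proportion was set at the 80% threshold; these specifics are inferred from the abstract, not taken from the study protocol.

```python
from scipy.stats import binomtest

# Reported design: 25 patients, acceptance threshold of >=80% adequately
# estimated peaks (i.e. >=20 of 25); all 25 measured peaks fell within
# 80-100% of the actual peak. One-tailed test against the 80% threshold.
result = binomtest(k=25, n=25, p=0.80, alternative="greater")
print(f"one-tailed binomial P = {result.pvalue:.4f}")  # ~0.004, consistent with P < .02
```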
Whole-body insulin clearance in people with type 2 diabetes and normal kidney function: Relationship with glomerular filtration rate, renal plasma flow, and insulin sensitivity
OBJECTIVE: Kidney insulin clearance, proposed to be the main route of extra-hepatic insulin clearance, occurs in tubular cells following glomerular filtration and peritubular uptake, a process that may be impaired in people with type 2 diabetes (T2D) and/or impaired kidney function. Human studies that investigated kidney insulin clearance are limited by the invasive nature of the measurement. Instead, we evaluated relationships between whole-body insulin clearance and gold-standard measures of kidney function and insulin sensitivity in adults with T2D and normal kidney function. RESEARCH DESIGN AND METHODS: We determined insulin, inulin/iohexol, and para-aminohippuric acid (PAH) clearances during a hyperinsulinemic-euglycemic clamp to measure whole-body insulin clearance and kidney function. Insulin sensitivity was expressed as the glucose infusion rate (M value). Associations between whole-body insulin clearance, kidney function, and insulin sensitivity were examined using univariable and multivariable linear regression models. RESULTS: We investigated 44 predominantly male (77%) adults with T2D aged 63 ± 7 years, with fat mass 34.5 ± 9 kg, lean body mass 63.0 ± 11.8 kg, and HbA1c 7.4 ± 0.6%. Average whole-body insulin clearance was 1188 ± 358 mL/min. Mean glomerular filtration rate (GFR) was 110 ± 22 mL/min, mean effective renal plasma flow (ERPF) was 565 ± 141 mL/min, and the M value averaged 3.9 ± 2.3 mg/min. Whole-body insulin clearance was positively correlated with lean body mass, ERPF, and insulin sensitivity, but not with GFR. ERPF explained 6% of the variance when entered in a nested multivariable linear regression model on top of lean body mass (25%) and insulin sensitivity (15%). CONCLUSIONS: In adults with T2D and normal kidney function, whole-body insulin clearance was best predicted by lean body mass and insulin sensitivity, and to a lesser extent by ERPF. GFR was not associated with whole-body insulin clearance. In contrast to prior understanding, this suggests that in this population kidney insulin clearance may not play a dominant role in whole-body insulin clearance.
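A minimal sketch of the nested multivariable linear regression approach described above (adding one predictor at a time and tracking the incremental variance explained). The data and coefficients below are synthetic and purely illustrative; they are not the study dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic, illustrative data only: whole-body insulin clearance (mL/min)
# modelled from lean body mass (kg), insulin sensitivity (M value) and
# effective renal plasma flow (mL/min).
rng = np.random.default_rng(0)
n = 44
df = pd.DataFrame({
    "lean_body_mass": rng.normal(63, 12, n),
    "m_value": rng.normal(3.9, 2.3, n),
    "erpf": rng.normal(565, 141, n),
})
df["insulin_clearance"] = (
    600 + 7 * df["lean_body_mass"] + 60 * df["m_value"]
    + 0.3 * df["erpf"] + rng.normal(0, 250, n)
)

# Nested models: each step reports total R^2 and the increment over the
# previous model, mirroring the "variance explained on top of" description.
formulas = [
    "insulin_clearance ~ lean_body_mass",
    "insulin_clearance ~ lean_body_mass + m_value",
    "insulin_clearance ~ lean_body_mass + m_value + erpf",
]
previous_r2 = 0.0
for formula in formulas:
    fit = smf.ols(formula, data=df).fit()
    print(f"{formula}: R^2 = {fit.rsquared:.2f} (+{fit.rsquared - previous_r2:.2f})")
    previous_r2 = fit.rsquared
```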
A new paradigm to indicate antidepressant treatments
This article develops the idea that clinical depression can be seen as a typical human response, largely rooted in human culture, to events of loss or times of adversity. Various biological, psychological, and social factors may cause some individuals to have a depressive reaction that is not effectively limited in time and/or severity. Recovery occurs mainly through natural resilience mechanisms, which come into play spontaneously but are sometimes inhibited or blocked by specific pathological biopsychosocial mechanisms. One such mechanism could be the influence of the circuits that regulate pleasure and happiness, along the dorsal diencephalic connection (DDC) pathway from the forebrain to the midbrain via the habenula. Therapy works by undermining the biopsychosocial factors that prevent the natural recovery mechanism from working. Treatment should, therefore, be seen as facilitating rather than causing natural recovery. This approach is in line with the high recovery rate after placebo treatments and the positive influence of pharmacological treatments with completely different sites of action. Acceptance of this model means that when studying new treatments for depression, a new paradigm must be applied in which the relative value of antidepressant treatment is specifically weighted in terms of enabling the natural resilience process.
Influence of eight ABCB1 polymorphisms on antidepressant response in a prospective cohort of treatment-free Russian patients with moderate or severe depression: An explorative psychopharmacological study with naturalistic design
BACKGROUND: Many antidepressants are substrates of P-glycoprotein, an efflux transporter in the blood-brain barrier encoded by the ABCB1 gene. Genetic variations might influence the transport rate of antidepressants and hence their pharmacological effects. This study investigates the influence of eight polymorphisms in the ABCB1 gene on antidepressant treatment response. METHOD: 152 patients were included from psychiatric departments of the Mental Health Research Institute in Tomsk. The difference in Hamilton Depression Rating Scale (HAMD-17) scores between baseline and week two, between weeks two and four, and between baseline and week four was used to estimate the timing of improvement of depression. Associations between the ABCB1 gene polymorphisms and reduction in HAMD-17 score were assessed using the independent t-test and multiple linear regression. RESULTS: Tricyclic antidepressants were associated with a greater reduction in HAMD-17 score than SSRIs. The SNP rs2235040 A-allele had a significant positive influence on the ΔHAMD-17(0→2W) score but a significant negative influence on the ΔHAMD-17(2→4W) score. The rs4148739 G-allele had a significant negative influence on the ΔHAMD-17(0→2W) score but a significant positive influence on the ΔHAMD-17(2→4W) score. The SNP rs2235015 T-allele was significantly negatively related to the ΔHAMD-17(2→4W) score. CONCLUSION: Genetic variations in ABCB1 appear to affect the speed but not the magnitude of antidepressant drug response.
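A minimal sketch of the independent t-test used to compare HAMD-17 change scores between allele carriers and non-carriers, as mentioned above. The group sizes, effect sizes, and the allele label in the comments are synthetic assumptions for illustration only.

```python
import numpy as np
from scipy.stats import ttest_ind

# Synthetic, illustrative data only: change in HAMD-17 score from baseline to
# week 2 for carriers vs non-carriers of a given ABCB1 allele, analysed with
# the independent t-test described in the abstract.
rng = np.random.default_rng(1)
delta_carriers = rng.normal(8.0, 4.0, 30)       # hypothetical allele carriers
delta_noncarriers = rng.normal(6.0, 4.0, 122)   # non-carriers

t_stat, p_value = ttest_ind(delta_carriers, delta_noncarriers)
print(f"t = {t_stat:.2f}, P = {p_value:.3f}")
```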
SGLT2 inhibition versus sulfonylurea treatment effects on electrolyte and acid-base balance: secondary analysis of a clinical trial reaching glycemic equipoise: Tubular effects of SGLT2 inhibition in Type 2 diabetes
Sodium-glucose transporter 2 (SGLT2) inhibitors increase plasma magnesium and plasma phosphate and may cause ketoacidosis, but the contribution of improved glycemic control to these observations, as well as effects on other electrolytes and acid-base parameters, remains unknown. Therefore, our objective was to compare the effects of the SGLT2 inhibitor dapagliflozin and the sulfonylurea gliclazide on plasma electrolytes, urinary electrolyte excretion, and acid-base balance in people with Type 2 diabetes (T2D). We assessed the effects of dapagliflozin and gliclazide treatment on plasma electrolytes and bicarbonate, 24-hour urinary pH, and excretions of electrolytes, ammonium, citrate, and sulfate in 44 metformin-treated people with T2D and preserved kidney function. Compared with gliclazide, dapagliflozin increased plasma chloride by 1.4 mmol/l (95% CI 0.4-2.4), plasma magnesium by 0.03 mmol/l (95% CI 0.01-0.06), and plasma sulfate by 0.02 mmol/l (95% CI 0.01-0.04). Compared with baseline, dapagliflozin also significantly increased plasma phosphate, but the same trend was observed with gliclazide. From baseline to week 12, dapagliflozin increased the urinary excretion of citrate by 0.93 ± 1.72 mmol/day, acetoacetate by 48 μmol/day (IQR 17-138), and β-hydroxybutyrate by 59 μmol/day (IQR 0-336), without disturbing acid-base balance. In conclusion, dapagliflozin increases plasma magnesium, chloride, and sulfate compared with gliclazide, while achieving similar glucose lowering in people with T2D. Dapagliflozin also increases urinary ketone excretion without changing acid-base balance. The increase in urinary citrate excretion by dapagliflozin may therefore reflect an effect on cellular metabolism, including the tricarboxylic acid cycle, which potentially contributes to kidney protection.
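A minimal sketch of how a between-group difference with a 95% confidence interval, as reported above for plasma chloride, can be computed from change-from-baseline values. The group sizes and simulated values are assumptions for illustration; they are not the trial data.

```python
import numpy as np
from scipy import stats

# Synthetic, illustrative data only: 95% confidence interval for the
# between-group difference in the change of plasma chloride (dapagliflozin
# vs gliclazide), mirroring the style of comparison reported in the abstract.
rng = np.random.default_rng(2)
delta_dapagliflozin = rng.normal(1.4, 1.8, 22)   # change from baseline, mmol/l
delta_gliclazide = rng.normal(0.0, 1.8, 22)

n1, n2 = len(delta_dapagliflozin), len(delta_gliclazide)
diff = delta_dapagliflozin.mean() - delta_gliclazide.mean()
pooled_var = ((n1 - 1) * delta_dapagliflozin.var(ddof=1)
              + (n2 - 1) * delta_gliclazide.var(ddof=1)) / (n1 + n2 - 2)
se = np.sqrt(pooled_var * (1 / n1 + 1 / n2))
margin = stats.t.ppf(0.975, n1 + n2 - 2) * se
print(f"difference = {diff:.1f} mmol/l "
      f"(95% CI {diff - margin:.1f} to {diff + margin:.1f})")
```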
The authors reply
The efficacy and toxicity of aminoglycosides show a strong, direct positive relationship with blood drug concentrations; therefore, therapy with aminoglycosides in adults is usually guided by therapeutic drug monitoring. Dosing regimens in adults have evolved from multiple daily dosing to extended-interval dosing. This evolution has also taken place in neonates. Neonates, however, display large interindividual differences in the pharmacokinetics of aminoglycosides due to developmental differences early in life. The volume of distribution of aminoglycosides shows a strong relationship with bodyweight and tends to be larger (corrected for bodyweight) in more premature infants and in those with sepsis. Renal clearance of aminoglycosides increases with gestational age and accelerates immediately after birth. Because of these developmental influences, there is great inter- and intraindividual variability in the volume of distribution and clearance of these drugs, and investigators have established aminoglycoside dosing regimens based on bodyweight and/or gestational age. Widely practised dosing regimens comprise 4-5 mg/kg bodyweight of gentamicin every 24-48 hours as a first dose, followed by dose adjustment based on therapeutic drug monitoring. Although formal toxicity studies are scarce, there is no evidence that aminoglycoside toxicity in neonates differs from that in adults. Monitoring of blood drug concentrations and intelligent reconstruction of individual pharmacokinetic behaviour, using a population pharmacokinetic model, optimally chosen blood sampling times, and appropriate pharmacokinetic software, help clinicians to quickly optimize aminoglycoside dosing regimens to maximize the clinical effect and minimize the toxicity of these drugs.
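A minimal sketch of the kind of individual pharmacokinetic reconstruction this passage refers to, assuming a simple one-compartment bolus model and weight-based extended-interval gentamicin dosing. The volume of distribution, clearance, and interval values are illustrative assumptions, not validated neonatal population estimates.

```python
import numpy as np

# One-compartment bolus sketch of weight-based extended-interval gentamicin
# dosing in a neonate. Parameter values are illustrative assumptions only.
weight_kg = 3.0
dose_mg = 4.5 * weight_kg          # 4-5 mg/kg per dose, as in the text
vd_l = 0.5 * weight_kg             # assumed volume of distribution (L/kg)
cl_l_per_h = 0.05 * weight_kg      # assumed clearance (L/h/kg)
ke = cl_l_per_h / vd_l             # elimination rate constant (1/h)
tau_h = 36.0                       # dosing interval within the 24-48 h range

# Steady-state peak (just after a dose) and trough (end of the interval).
c_peak = (dose_mg / vd_l) / (1 - np.exp(-ke * tau_h))
c_trough = c_peak * np.exp(-ke * tau_h)
print(f"predicted peak ~{c_peak:.1f} mg/L, trough ~{c_trough:.2f} mg/L")
```

In practice, such a model would be embedded in a population pharmacokinetic framework and updated with the patient's measured concentrations before any dose adjustment.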