
    Pharmacokinetic/Pharmacodynamic (PK/PD) Modeling of Anti-Neoplastic Agents

    Development of tumor resistance to chemotherapeutics is related to inherent tumor variations in sensitivity to chemotherapeutics and to sub-optimal dosing regimens, including variation in patient pharmacokinetics that results in suboptimal exposure of tumor cells to anti-neoplastic drugs [1, 2]. The rate and extent of drug efficacy depend on the extent of drug exposure at the tumor site and the time above the effective concentration [3]. In vitro models that incorporate these pharmacokinetic and pharmacodynamic (PK/PD) principles to optimize therapeutic response may be considered the method of choice for optimizing dosing schedules before translating data from static assays to animals and clinical trials [4, 5]. The hollow fiber bioreactor was recently used to evaluate PK/PD effects of gemcitabine in lung and breast cancers and to model HIV treatments [4-6].
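
    The exposure metrics named above can be illustrated with a minimal, hypothetical sketch (not taken from the cited work): a one-compartment pharmacokinetic profile for an assumed IV bolus dose, from which total exposure (AUC) and time above an assumed minimum effective concentration are computed. All parameter values are illustrative assumptions.

```python
# Hypothetical one-compartment PK sketch: compute AUC and time above an
# assumed minimum effective concentration (MEC). Parameters are illustrative.
import numpy as np

dose_mg = 100.0        # assumed IV bolus dose
vd_l = 20.0            # assumed volume of distribution (L)
ke_per_h = 0.3         # assumed first-order elimination rate constant (1/h)
mec_mg_l = 1.0         # assumed minimum effective concentration (mg/L)

t = np.linspace(0, 24, 2401)                      # 24 h grid, 0.01 h steps
conc = (dose_mg / vd_l) * np.exp(-ke_per_h * t)   # concentration-time profile

auc = np.trapz(conc, t)                                        # total exposure (mg*h/L)
time_above_mec = np.trapz((conc > mec_mg_l).astype(float), t)  # hours above MEC

print(f"AUC(0-24 h): {auc:.1f} mg*h/L")
print(f"Time above MEC: {time_above_mec:.1f} h")
```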

    Metabolic networks in a porcine model of trauma and hemorrhagic shock demonstrate different control mechanism with carbohydrate pre-feed

    Background: Treatment with oral carbohydrate prior to trauma and hemorrhage confers a survival benefit in small animal models. The impact of fed states on survival in traumatically injured humans is unknown. This work uses regulatory networks to examine the effect of carbohydrate pre-feeding on the metabolic response to polytrauma and hemorrhagic shock in a clinically relevant large animal model. Methods: Male Yorkshire pigs were fasted overnight (n = 64). Pre-fed animals (n = 32) received an oral bolus of Karo® syrup before sedation. All animals underwent a standardized trauma, hemorrhage, and resuscitation protocol. Serum samples were obtained at set timepoints. Proton NMR was used to identify and quantify serum metabolites. Metabolic regulatory networks were constructed from metabolite concentrations and rates of change in those concentrations to identify controlled nodes and controlling nodes of the network. Results: Oral carbohydrate pre-treatment was not associated with a survival benefit. Six metabolites were identified as controlled nodes in both groups: adenosine, cytidine, glycerol, hypoxanthine, lactate, and uridine. Distinct groups of controlling nodes were associated with controlled nodes; however, the composition of these groups depended on feeding status. Conclusions: A common metabolic output, typically associated with injury and hypoxia, results from trauma and hemorrhagic shock. However, this output is directed by different metabolic inputs depending upon the feeding status of the subject. Nodes of the network that are related to mortality can potentially be manipulated for therapeutic effect; however, these nodes differ depending upon feeding status.
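
    The network construction step can be sketched in code, though the sketch below is only a plausible reading of the approach, not the authors' implementation: finite-difference rates of change are correlated with metabolite concentrations, strongly correlated pairs become directed edges, and node in-degree/out-degree serve as rough proxies for "controlled" and "controlling" nodes. The data, metabolite names, and correlation cutoff are all invented for illustration.

```python
# Hedged sketch of a concentration -> rate-of-change regulatory network.
# Data, sampling times, and the 0.8 correlation cutoff are illustrative only.
import numpy as np
import pandas as pd
import networkx as nx

rng = np.random.default_rng(0)
times = np.array([0, 45, 90, 135, 180])        # hypothetical sampling times (min)
metabolites = ["lactate", "adenosine", "glycerol", "hypoxanthine"]
conc = pd.DataFrame(rng.lognormal(size=(len(times), len(metabolites))),
                    index=times, columns=metabolites)

# finite-difference rates of change between successive timepoints
rates = conc.diff().iloc[1:].div(np.diff(times), axis=0)
conc_aligned = conc.iloc[1:]

g = nx.DiGraph()
for src in metabolites:
    for dst in metabolites:
        if src == dst:
            continue
        r = np.corrcoef(conc_aligned[src], rates[dst])[0, 1]
        if abs(r) > 0.8:                       # arbitrary illustrative cutoff
            g.add_edge(src, dst, weight=float(r))

controlled = sorted(g.nodes, key=g.in_degree, reverse=True)    # driven by many inputs
controlling = sorted(g.nodes, key=g.out_degree, reverse=True)  # drive many outputs
print("candidate controlled nodes:", controlled[:2])
print("candidate controlling nodes:", controlling[:2])
```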

    Infectious consequences of hematoma from cardiac implantable electronic device procedures and the role of the antibiotic envelope: A WRAP-IT trial analysis.

    Hematoma is a complication of cardiac implantable electronic device (CIED) procedures and may lead to device infection. The TYRX antibacterial envelope reduced major CIED infection by 40% in the randomized WRAP-IT (World-wide Randomized Antibiotic Envelope Infection Prevention Trial) study, but its effectiveness in the presence of hematoma is not well understood. The purpose of this study was to evaluate the incidence and infectious consequences of hematoma and the association between envelope use, hematomas, and major CIED infection among WRAP-IT patients. All 6800 study patients were included in this analysis (control 3429; envelope 3371). Hematomas occurring within 30 days postprocedure (acute) were characterized and grouped by study treatment and evaluated for subsequent infection risk. Data were analyzed using Cox proportional hazards regression modeling. Acute hematoma incidence was 2.2% at 30 days, with no significant difference between treatment groups (envelope vs control hazard ratio [HR] 1.15; 95% confidence interval [CI] 0.84-1.58; P = .39). Through all follow-up, the risk of major infection was significantly higher among control patients with hematoma vs those without (13.1% vs 1.6%; HR 11.3; 95% CI 5.5-23.2; P < .001). The risk of major infection was significantly lower in the envelope vs control patients with hematoma (2.5% vs 13.1%; HR 0.18; 95% CI 0.04-0.85; P = .03). The risk of hematoma was 2.2% among WRAP-IT patients. Among control patients, hematoma carried an 11-fold risk of developing a major CIED infection. This risk was significantly mitigated with antibacterial envelope use, with an 82% reduction in major CIED infection among envelope patients who developed hematoma compared to control patients.
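
    For readers unfamiliar with the modeling approach, the sketch below shows the general shape of a Cox proportional hazards analysis of time to major CIED infection with hematoma and envelope use as covariates. It is illustrative only: the rows are fabricated and the column names are assumptions, not the WRAP-IT dataset or analysis code.

```python
# Illustrative Cox proportional hazards fit with the lifelines library.
# The DataFrame is fabricated; column names are assumptions.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "days_followup":   [400, 120, 365, 90, 720, 30, 540, 200, 610, 45],
    "major_infection": [0,   1,   0,   1,  0,   1,  0,   0,   0,   1],  # 1 = infection
    "hematoma":        [0,   1,   0,   1,  1,   0,  0,   1,   0,   0],
    "envelope":        [1,   0,   1,   1,  0,   0,  1,   0,   1,   1],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="days_followup", event_col="major_infection")
cph.print_summary()   # hazard ratios and 95% CIs for hematoma and envelope use
```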

    Risk Factors for CIED Infection After Secondary Procedures

    OBJECTIVES This study aimed to identify risk factors for infection after secondary cardiac implantable electronic device (CIED) procedures. BACKGROUND Risk factors for CIED infection are not well defined, and techniques to minimize infection lack supportive evidence. WRAP-IT (World-wide Randomized Antibiotic Envelope Infection Prevention trial), a large study that assessed the safety and efficacy of an antibacterial envelope for CIED infection reduction, offers insight into procedural details and infection prevention strategies. METHODS This analysis included 2,803 control patients from the WRAP-IT trial who received standard preoperative antibiotics but not the envelope (44 patients with major infections through all follow-up). A multivariate least absolute shrinkage and selection operator (LASSO) machine learning model, controlling for patient characteristics and procedural variables, was used for risk factor selection and identification. Risk factors consistently retaining predictive value in the model (appearing >10 times) across 100 iterations of imputed data were deemed significant. RESULTS Of the 81 variables screened, 17 were identified as risk factors, with 6 being patient/device-related (nonmodifiable) and 11 being procedure-related (potentially modifiable). Patient/device-related factors included higher number of previous CIED procedures, history of atrial arrhythmia, geography (outside North America and Europe), device type, and lower body mass index. Procedural factors associated with increased risk included longer procedure time, implant location (non-left pectoral subcutaneous), perioperative glycopeptide antibiotic versus nonglycopeptide, anticoagulant and/or antiplatelet use, and capsulectomy. Factors associated with decreased risk of infection included chlorhexidine skin preparation and antibiotic pocket wash. CONCLUSIONS In WRAP-IT patients, we observed that several procedural risk factors correlated with infection risk. These results can help guide infection prevention strategies to minimize infections associated with secondary CIED procedures. (J Am Coll Cardiol EP 2022;8:101-111) (c) 2022 The Authors. Published by Elsevier on behalf of the American College of Cardiology Foundation. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).
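
    The selection procedure described above, LASSO fits repeated across imputed datasets with a retention-count cutoff, can be sketched as follows. This is a generic illustration under stated assumptions (random stand-in data, an arbitrary penalty strength, and the >10-iteration cutoff from the abstract), not the trial's statistical code.

```python
# Hedged sketch: L1-penalized (LASSO) logistic regression run over multiple
# imputed datasets; predictors with non-zero coefficients in more than
# `cutoff` iterations are retained. Data are random placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n_patients, n_predictors, n_imputations, cutoff = 2803, 81, 100, 10

selection_counts = np.zeros(n_predictors, dtype=int)
for _ in range(n_imputations):
    X = rng.normal(size=(n_patients, n_predictors))   # stand-in for one imputed dataset
    y = rng.binomial(1, 0.016, size=n_patients)       # ~44/2803 major infection rate
    model = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
    model.fit(X, y)
    selection_counts += (model.coef_.ravel() != 0).astype(int)

selected = np.where(selection_counts > cutoff)[0]
print(f"{len(selected)} predictors retained in more than {cutoff} imputations")
```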

    A four-compartment metabolomics analysis of the liver, muscle, serum, and urine response to polytrauma with hemorrhagic shock following carbohydrate prefeed.

    OBJECTIVE: Hemorrhagic shock accompanied by injury represents a major physiologic stress. Fasted animals are often used to study hemorrhagic shock (with injury). A fasted state is not guaranteed in the general human population. The objective of this study was to determine if fed animals would exhibit a different metabolic profile in response to hemorrhagic shock with trauma when compared to fasted animals. METHODS: Proton (1H) NMR spectroscopy was used to determine concentrations of metabolites from four different compartments (liver, muscle, serum, urine) taken at defined time points throughout shock/injury and resuscitation. PLS-DA was performed and VIP lists were established for baseline, shock, and resuscitation (10 metabolites for each compartment at each time interval) on metabolomics data from surviving animals. RESULTS: Fed status prior to the occurrence of hemorrhagic shock with injury alters the metabolic course of this trauma and potentially affects mortality. The death rate for carbohydrate pre-fed (CPF) animals is higher than that of fasted (FS) animals (47% vs 28%). The majority of deaths occur post-resuscitation, suggesting reperfusion injury. The metabolomics response to shock reflects priorities evident at baseline. FS animals raise the baseline degree of proteolysis to provide additional amino acids for energy production, while CPF animals rely on both glucose and, to a lesser extent, amino acids. During early resuscitation, levels of metabolites associated with energy production drop, suggesting diminished demand. CONCLUSIONS: Feeding status prior to the occurrence of hemorrhagic shock with injury alters the metabolic course of this trauma and potentially affects mortality. The response to shock reflects metabolic priorities at baseline.
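
    As background on the analysis method, the sketch below shows PLS-DA implemented as a PLS regression against a binary group label, followed by the standard VIP (variable importance in projection) calculation. The metabolite matrix and group coding are fabricated placeholders, not the study's data or pipeline.

```python
# Minimal PLS-DA + VIP sketch with scikit-learn. Data are random placeholders;
# the 0/1 group coding (FS vs CPF) is an assumption for illustration.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
X = rng.normal(size=(60, 30))        # 60 animals x 30 metabolites (fabricated)
y = np.repeat([0, 1], 30)            # 0 = FS, 1 = CPF (assumed coding)

pls = PLSRegression(n_components=2).fit(X, y)

def vip_scores(pls_model):
    """Standard VIP formula from PLS weights, scores, and Y loadings."""
    t, w, q = pls_model.x_scores_, pls_model.x_weights_, pls_model.y_loadings_
    p, _ = w.shape
    s = np.diag(t.T @ t @ q.T @ q)   # Y variance explained per component
    wnorm = (w / np.linalg.norm(w, axis=0)) ** 2
    return np.sqrt(p * (wnorm @ s) / s.sum())

vip = vip_scores(pls)
print("metabolite indices with VIP > 1.0:", np.where(vip > 1.0)[0])
```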

    Estimating the incidence of atrial fibrillation in single-chamber implantable cardioverter defibrillator patients

    BACKGROUND Atrial arrhythmias are associated with major adverse cardiovascular events. Recent reports among implantable cardioverter defibrillator (ICD) patients have demonstrated a high prevalence of atrial fibrillation (AF), predominantly in dual-chamber recipients. AF incidence among patients with single-chamber systems (approximately 50% of all ICDs) is currently unknown. The objective was to estimate the prevalence of new-onset AF among single-chamber ICD patients by observing the rates of new atrial tachycardia (AT)/AF among a propensity score-matched cohort of dual-chamber ICD patients from the PainFree SST study, to better inform screening initiatives. METHODS Among 2,770 patients enrolled, 1,862 single-chamber, dual-chamber, and cardiac resynchronization therapy (CRT) subjects with no prior history of atrial tachyarrhythmias were included. Daily AT/AF burden was estimated using a propensity score weighted model against data from dual-chamber ICDs. RESULTS Over 22 ± 9 months of follow-up, the estimated incidence of AT/AF lasting at least 6 minutes, 6 hours, and 24 hours per day in the single-chamber cohort was 22.0%, 9.8%, and 6.3%, respectively, whereas among dual-chamber patients the prevalence was 26.6%, 13.1%, and 7.1%. Initiation of oral anticoagulation (OAC) was estimated to occur in 9.8% of the propensity-matched single-chamber cohort, which was higher than the actual observed rate of 6.0%. Stroke and transient ischemic attack (TIA) occurred at low rates in all device subgroups. CONCLUSIONS Atrial arrhythmias occur frequently, and significant underutilization of anticoagulation is suggested in single-chamber ICD recipients. Routine screening for AF should be considered among single-chamber ICD recipients.
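
    The propensity-weighting idea, estimating what single-chamber devices would have detected by re-weighting dual-chamber patients (whose devices have an atrial lead) toward the single-chamber population, can be sketched as below. All variables, covariates, and rates are fabricated; this is not the study's model.

```python
# Hedged sketch of propensity-score (odds) weighting: re-weight dual-chamber
# patients to resemble the single-chamber cohort, then average their detected
# AT/AF indicator. All data and column names are fabricated.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n = 1862
df = pd.DataFrame({
    "age": rng.normal(65, 10, n),
    "lvef": rng.normal(30, 8, n),
    "dual_chamber": rng.binomial(1, 0.5, n),   # 1 = dual-chamber ICD (can detect AT/AF)
})
df["at_af_detected"] = np.where(df["dual_chamber"] == 1,
                                rng.binomial(1, 0.25, n), np.nan)

ps = LogisticRegression().fit(df[["age", "lvef"]], df["dual_chamber"]) \
                         .predict_proba(df[["age", "lvef"]])[:, 1]

dual = df["dual_chamber"].to_numpy().astype(bool)
weights = (1 - ps[dual]) / ps[dual]            # target population: single-chamber-like
est = np.average(df.loc[dual, "at_af_detected"], weights=weights)
print(f"estimated AT/AF incidence in single-chamber-like patients: {est:.1%}")
```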

    SVT discrimination algorithms significantly reduce the rate of inappropriate therapy in the setting of modern day delayed high-rate detection programming

    BACKGROUND Contemporary ICD programming involving delayed high-rate detection and use of SVT discriminators has significantly reduced the rate of inappropriate shocks. The extent to which SVT algorithms alone reduce inappropriate therapies is poorly understood. METHODS AND RESULTS PainFree SST enrolled 2,770 patients with a single- or dual-chamber ICD or cardiac resynchronization defibrillator. Patients were followed for 22 ± 9 months with SVT discriminators on in 96% of patients. Sustained ventricular tachyarrhythmias and SVT episodes were adjudicated by an independent physician committee. For this analysis, all episodes were subjected to post-processing computer simulation with SVT discriminators off, with and without delayed high-rate detection criteria (VF zone only, 30/40 @ 320 ms). There were 3,282 adjudicated SVT episodes, of which 115 resulted in an ICD shock and 113 received only ATP (2-year inappropriate shock and therapy rates of 3.1% and 4.1%). Therapy was appropriately withheld for the remaining 3,054 SVT episodes. With both SVT discriminators and delayed high-rate detection simulated off, the 2-year inappropriate therapy rate would have been 22.9% (Hazard Ratio [HR] = 6.24, 95% confidence interval [CI]: 5.20-7.49). With SVT discriminators simulated off and delayed high-rate detection simulated on in all patients, the 2-year rate would have been 6.4% (HR = 1.63, CI: 1.44-1.85). CONCLUSIONS Use of SVT discriminators has a significant role in reducing the rate of inappropriate ICD therapy, even in the setting of delayed high-rate detection programming. Deactivating SVT discriminators would have resulted in an overall increase in the inappropriate ICD therapy rate by 63% and 524% with and without delayed high-rate detection programming, respectively.
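
    To make the "30/40 @ 320 ms" delayed high-rate detection criterion concrete, the toy function below flags an episode once 30 of the last 40 R-R intervals fall under 320 ms. It is a deliberate simplification for illustration, not the device's actual detection or discrimination algorithm.

```python
# Toy delayed high-rate detection rule ("30 of the last 40 intervals < 320 ms").
# Simplified illustration only; real devices apply additional logic.
from collections import deque

def detects_episode(rr_intervals_ms, needed=30, window=40, cutoff_ms=320):
    recent = deque(maxlen=window)                # rolling window of fast/slow flags
    for rr in rr_intervals_ms:
        recent.append(rr < cutoff_ms)
        if sum(recent) >= needed:
            return True
    return False

print(detects_episode([300] * 60))   # fast, regular tachyarrhythmia -> True
print(detects_episode([400] * 60))   # slower SVT never meets the rate cutoff -> False
```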

    VIP (variable importance in projection) metabolites.

    Mean liver (2A) and muscle (2B) values are reported as mM/1 g lyophilized tissue. Mean serum (2C) values are reported as mmol/L. The lab values of urea (obtained with a Gem Premier 3000 blood gas analyzer) are reported instead of NMR values since the water suppression in the CPMG pulse sequence compromises the urea signal. Mean urine (2D) values are reported as nmol/hr/kg. Bolded numbers indicate metabolites that achieved VIP scores above 1.0 or were among the top 10 metabolites with VIP scores above 1.0. Only the Baseline serum urea (laboratory analysis) reached statistical significance (p = 0.003, Student's t test assuming unequal variance).

    PLS-DA scores plots for the S45-B time interval.

    PLS-DA scores plots show model discrimination between FS and CPF animals during the response to shock (S45-B) in each of the four compartments (liver, muscle, serum, urine). Models are of varying quality and statistical significance, as reported in Table 1 (http://www.plosone.org/article/info:doi/10.1371/journal.pone.0124467#pone.0124467.t001), but indicate that there is a difference in response to shock according to feeding status.

    Purine degradation/salvage pathway.

    Metabolomics evidence from four compartments suggests alternate routes for observed differences in purine abundances. ATP is degraded in a series of reactions to uric acid or allantoin. Under non-stress conditions, this degradation process proceeds at a low level. When physiologic conditions change, intermediates in the pathway can be diverted to meet these demands. For example, during fasting, IMP can be salvaged for the production of ATP, thus reducing the level of hypoxanthine (HX). In our study, HX levels are higher at baseline in the liver of CPF animals. We propose that this observation is a result of increased salvage in FS animals. During ischemia, 5'-nucleotidase acts on AMP to generate adenosine, a potential vasodilator. In our study, adenosine levels are ~3X higher in the liver of CPF animals when compared to FS animals, suggesting a greater need for vasodilation. Metabolites in red were identified as VIP metabolites by PLS-DA analysis. Superscript letters indicate the compartment in which the metabolite was observed: liver (L), muscle (M), serum (S), or urine (U).