8 research outputs found

    Dose banding as an alternative to body surface area-based dosing of chemotherapeutic agents

    Background: Dose banding is a recently suggested dosing method that uses predefined ranges (bands) of body surface area (BSA) to calculate each patient's dose from a single BSA value per band. Thus, drugs with sufficient long-term stability can be prepared in advance. The main advantages of dose banding are reduced patient waiting time and improved pharmacy capacity planning; additional benefits include reduced medication errors, reduced drug wastage, and prospective quality control. This study compares dose banding with individual BSA-based dosing and fixed dosing according to pharmacokinetic criteria.

    Methods: Three BSA bands were defined: BSA < 1.7 m², 1.7 m² ≤ BSA < 1.9 m², and BSA ≥ 1.9 m², and each patient's dose was calculated from a single BSA value per band (1.55, 1.80, and 2.05 m², respectively). Using individual clearance values of six drugs (cisplatin, docetaxel, paclitaxel, doxorubicin, irinotecan, and topotecan) from 1012 adult cancer patients in total, the AUCs corresponding to the three dosing methods (BSA dosing, dose banding, and fixed dose) were compared with a target AUC for each drug.

    Results: For all six drugs, the per cent variation in individual dose obtained with dose banding compared with BSA dosing ranged between 14% and 22%, and the distribution of AUC values was very similar with both dosing methods. In terms of reaching the target AUC, there was no significant difference in precision between dose banding and BSA dosing, except for paclitaxel (32.0% vs 30.7%, respectively; P=0.05). However, precision was significantly better for BSA dosing than for fixed dosing for four of the six drugs.

    Conclusion: For the studied drugs, implementation of dose banding should be considered, as it entails no significant increase in interindividual plasma exposure.
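    The banding rule described in the abstract is a simple mapping from an individual BSA to one of three fixed band values. A minimal sketch, assuming the band boundaries and band values stated above (the function names and the example drug dose are illustrative, not from the study):

    ```python
    def band_bsa(bsa: float) -> float:
        """Map an individual BSA (m^2) to the single banded BSA value.

        Bands as defined in the study: BSA < 1.7, 1.7 <= BSA < 1.9, BSA >= 1.9,
        with band values 1.55, 1.80, and 2.05 m^2 respectively.
        """
        if bsa < 1.7:
            return 1.55
        elif bsa < 1.9:
            return 1.80
        return 2.05


    def banded_dose(dose_per_m2: float, bsa: float) -> float:
        """Dose computed from the banded BSA instead of the individual BSA."""
        return dose_per_m2 * band_bsa(bsa)


    # Hypothetical example: a 75 mg/m^2 regimen for a patient with BSA 1.83 m^2.
    # Individual BSA dosing: 75 * 1.83 = 137.25 mg; banded: 75 * 1.80 = 135.0 mg.
    print(banded_dose(75, 1.83))  # 135.0
    ```

    Because every patient in a band receives the same dose, pharmacies can batch-prepare the three dose levels in advance, which is the capacity-planning advantage the abstract describes.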

    Association between Vitamin D Levels and Nonalcoholic Fatty Liver Disease: Potential Confounding Variables

    Nonalcoholic fatty liver disease (NAFLD), historically considered to be the hepatic component of the metabolic syndrome, is a spectrum of fat-associated liver conditions, in the absence of secondary causes, that may progress to nonalcoholic steatohepatitis (NASH), fibrosis, and cirrhosis. Disease progression is closely associated with body weight or fatness, dyslipidemia, insulin resistance, oxidative stress, and inflammation. Recently, vitamin D deficiency has been linked to the pathogenesis and severity of NAFLD because of vitamin D's "pleiotropic" functions, with roles in immune modulation, cell differentiation and proliferation, and regulation of inflammation. Indeed, several studies have reported an association between vitamin D and NAFLD/NASH. However, other studies have failed to find an association. Therefore, we sought to critically review the current evidence on the association between vitamin D deficiency and NAFLD/NASH, and to analyze and discuss some key variables that may interfere with this evaluation, such as host-, environment-, and heritability-related factors regulating vitamin D synthesis and metabolism; definitions of deficient or optimal vitamin D status with respect to skeletal and nonskeletal outcomes, including NAFLD/NASH; methods of measuring 25(OH)D; and methods of diagnosing NAFLD, as well as of quantifying adiposity, the cardinal link between vitamin D deficiency and NAFLD.

    Breast Imaging

    No full text

    Vorapaxar in the secondary prevention of atherothrombotic events

    BACKGROUND: Thrombin potently activates platelets through the protease-activated receptor PAR-1. Vorapaxar is a novel antiplatelet agent that selectively inhibits the cellular actions of thrombin through antagonism of PAR-1.

    METHODS: We randomly assigned 26,449 patients who had a history of myocardial infarction, ischemic stroke, or peripheral arterial disease to receive vorapaxar (2.5 mg daily) or matching placebo and followed them for a median of 30 months. The primary efficacy end point was the composite of death from cardiovascular causes, myocardial infarction, or stroke. After 2 years, the data and safety monitoring board recommended discontinuation of the study treatment in patients with a history of stroke owing to the risk of intracranial hemorrhage.

    RESULTS: At 3 years, the primary end point had occurred in 1028 patients (9.3%) in the vorapaxar group and in 1176 patients (10.5%) in the placebo group (hazard ratio for the vorapaxar group, 0.87; 95% confidence interval [CI], 0.80 to 0.94; P<0.001). Cardiovascular death, myocardial infarction, stroke, or recurrent ischemia leading to revascularization occurred in 1259 patients (11.2%) in the vorapaxar group and 1417 patients (12.4%) in the placebo group (hazard ratio, 0.88; 95% CI, 0.82 to 0.95; P=0.001). Moderate or severe bleeding occurred in 4.2% of patients who received vorapaxar and 2.5% of those who received placebo (hazard ratio, 1.66; 95% CI, 1.43 to 1.93; P<0.001). There was an increase in the rate of intracranial hemorrhage in the vorapaxar group (1.0%, vs. 0.5% in the placebo group; P<0.001).

    CONCLUSIONS: Inhibition of PAR-1 with vorapaxar reduced the risk of cardiovascular death or ischemic events in patients with stable atherosclerosis who were receiving standard therapy. However, it increased the risk of moderate or severe bleeding, including intracranial hemorrhage. (Funded by Merck; TRA 2P-TIMI 50 ClinicalTrials.gov number, NCT00526474.)

    Use of the WHO Access, Watch, and Reserve classification to define patterns of hospital antibiotic use (AWaRe): an analysis of paediatric survey data from 56 countries

    No full text
    Background: Improving the quality of hospital antibiotic use is a major goal of WHO's global action plan to combat antimicrobial resistance. The WHO Essential Medicines List Access, Watch, and Reserve (AWaRe) classification could facilitate simple stewardship interventions that are widely applicable globally. We aimed to present data on patterns of paediatric AWaRe antibiotic use that could be used for local and national stewardship interventions.

    Methods: 1-day point prevalence survey antibiotic prescription data were combined from two independent global networks: the Global Antimicrobial Resistance, Prescribing, and Efficacy in Neonates and Children network and the Global Point Prevalence Survey on Antimicrobial Consumption and Resistance network. We included hospital inpatients aged younger than 19 years receiving at least one antibiotic on the day of the survey. The WHO AWaRe classification was used to describe overall antibiotic use as assessed by the variation between use of Access, Watch, and Reserve antibiotics, for neonates and children and for the commonest clinical indications.

    Findings: Of the 23 572 patients included from 56 countries, 18 305 were children (77.7%) and 5267 were neonates (22.3%). Access antibiotic use in children ranged from 7.8% (China) to 61.2% (Slovenia) of all antibiotic prescriptions. The use of Watch antibiotics in children was highest in Iran (77.3%) and lowest in Finland (23.0%). In neonates, Access antibiotic use was highest in Singapore (100.0%) and lowest in China (24.2%). Reserve antibiotic use was low in all countries. Major differences in clinical syndrome-specific patterns of AWaRe antibiotic use in lower respiratory tract infection and neonatal sepsis were observed between WHO regions and countries.

    Interpretation: There is substantial global variation in the proportion of AWaRe antibiotics used in hospitalised neonates and children. The AWaRe classification could potentially be used as a simple traffic-light metric of appropriate antibiotic use. Future efforts should focus on developing and evaluating paediatric antibiotic stewardship programmes on the basis of the AWaRe index. Copyright (C) 2019 The Author(s). Published by Elsevier Ltd
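    The "traffic light" metric the authors describe amounts to tallying each surveyed prescription into its AWaRe group and reporting the proportions per country or site. A minimal sketch, assuming a small illustrative drug-to-group lookup (the real mapping is the full WHO EML AWaRe classification; `aware_pattern` is a hypothetical helper, not from the study):

    ```python
    from collections import Counter

    # Illustrative subset of the WHO AWaRe classification (not the full list).
    AWARE = {
        "amoxicillin": "Access",
        "gentamicin": "Access",
        "ceftriaxone": "Watch",
        "meropenem": "Watch",
        "colistin": "Reserve",
    }


    def aware_pattern(prescriptions: list[str]) -> dict[str, float]:
        """Percentage of prescriptions falling in each AWaRe group."""
        counts = Counter(AWARE.get(drug, "Unclassified") for drug in prescriptions)
        total = len(prescriptions)
        return {g: 100 * counts[g] / total for g in ("Access", "Watch", "Reserve")}


    # One hypothetical site's point-prevalence prescriptions:
    print(aware_pattern(["amoxicillin", "ceftriaxone", "amoxicillin", "meropenem"]))
    # {'Access': 50.0, 'Watch': 50.0, 'Reserve': 0.0}
    ```

    Comparing these proportions across sites gives the kind of country-level variation the survey reports (e.g. Access use from 7.8% to 61.2% in children).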

    Prostatakarzinom (Prostate carcinoma)

    No full text
    No full text