
    Higher risk of revision for infection using systemic clindamycin prophylaxis than with cloxacillin

    Background and purpose - Clindamycin has not been compared with other antibiotics for prophylaxis in arthroplasty. Since 2009, the Swedish Knee Arthroplasty Register (SKAR) has collected information on the prophylactic antibiotic regimen used at every individual operation. In Sweden, clindamycin has been the recommended alternative when there is allergy to penicillin. We examined whether the rate of revision due to infection differed depending on which antibiotic was used as systemic prophylaxis. Patients and methods - Patients who had a total knee arthroplasty (TKA) performed due to osteoarthritis (OA) during the years 2009-2015 were included in the study. Information on which antibiotic was used was available for 80,018 operations (55,530 patients). Survival statistics were used to calculate the rate of revision due to infection until the end of 2015, comparing patients who received cloxacillin with those who received clindamycin as systemic prophylaxis. Results - Cloxacillin was used in 90% of the cases, clindamycin in 7%, and cephalosporins in 2%. The risk of being revised due to infection was higher when clindamycin was used than when cloxacillin was used (RR = 1.5, 95% CI: 1.2-2.0; p = 0.001). There was no significant difference in the revision rate for other causes (p = 0.2). Interpretation - We advise that patients reporting an allergic reaction to penicillin have their allergy history explored. In the absence of a clear history of type-I allergic reaction (e.g. urticaria, anaphylaxis, or bronchospasm), we suggest the use of a third-generation cephalosporin instead of clindamycin as perioperative prophylaxis for TKA. No recommendation can be given for patients with type-I allergy.
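    The abstract's central figure, RR = 1.5 (95% CI 1.2-2.0), is a ratio of two revision rates with a Wald confidence interval computed on the log scale. A minimal sketch of that calculation follows; the revision and operation counts below are hypothetical illustrations, since the abstract does not report the underlying event numbers.

```python
import math

def rate_ratio_ci(events_a, n_a, events_b, n_b, z=1.96):
    """Rate ratio of group A vs. group B with a Wald CI on the log scale."""
    rr = (events_a / n_a) / (events_b / n_b)
    # Standard error of log(RR) for two independent proportions
    se = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical counts chosen only to illustrate the arithmetic
# (clindamycin group vs. cloxacillin group)
rr, lo, hi = rate_ratio_ci(60, 5600, 400, 56000)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

    Any CI of this form is asymmetric around the point estimate because the interval is symmetric only on the log scale.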

    Intravitreal ranibizumab versus isovolemic hemodilution in the treatment of macular edema secondary to central retinal vein occlusion: Twelve-month results of a prospective, randomized, multicenter trial

    PURPOSE This is a prospective, randomized, multicenter, investigator-initiated trial to evaluate the 12-month effectiveness of isovolemic hemodilution (IH) with prompt versus deferred intravitreal injections (IVI) of ranibizumab 0.5 mg for the treatment of macular edema secondary to early central retinal vein occlusion (CRVO). METHODS Eyes with macular edema due to CRVO that had occurred no more than 8 weeks previously received either monthly ranibizumab IVI in combination with IH (group I, n = 28) or IH alone (group II, n = 30). From month 2 to 12, patients in both groups could be treated with monthly intravitreal ranibizumab. The main outcome variables were gain in visual acuity and the course of central retinal thickness as measured with optical coherence tomography. RESULTS At 12 months, eyes in group I on average gained +28.1 (±19.3) letters compared with +25.2 (±20.9) letters in group II (p = 0.326). This result was achieved with significantly fewer injections in group II. Additionally, 30% of the eyes in group II did not need ranibizumab IVI during the 12 months of the trial. CONCLUSION Ranibizumab IVI in addition to IH proved highly effective in increasing visual acuity and reducing macular edema secondary to CRVO. Initial IH in early CRVO may be a first treatment option in patients anxious about IVI.

    Plasma neurofilament light protein correlates with diffusion tensor imaging metrics in frontotemporal dementia

    Neurofilaments are structural components of neurons and are particularly abundant in highly myelinated axons. The levels of neurofilament light chain (NfL) in both cerebrospinal fluid (CSF) and plasma have been related to degeneration in several neurodegenerative conditions, including frontotemporal dementia (FTD), and NfL is currently considered the most promising diagnostic and prognostic fluid biomarker in FTD. Although the location and function of filaments in the healthy nervous system suggest a link between increased NfL and white matter degeneration, such a claim has not been fully elucidated in vivo, especially in the context of FTD. The present study provides evidence of an association between the plasma levels of NfL and white matter involvement in behavioral variant FTD (bvFTD) by relating plasma concentration of NfL to diffusion tensor imaging (DTI) metrics in a group of 20 bvFTD patients. The results of both voxel-wise and tract-specific analyses showed that increased plasma NfL concentration is associated with a reduction in fractional anisotropy (FA) in a widespread set of white matter tracts, including the superior longitudinal fasciculus, the fronto-occipital fasciculus, the anterior thalamic radiation, and the dorsal cingulum bundle. Plasma NfL concentration also correlated with cortical thinning in a portion of the right medial prefrontal cortex and of the right lateral orbitofrontal cortex. These results support the hypothesis that blood NfL levels reflect the global level of neurodegeneration in bvFTD and help to advance our understanding of the association between this blood biomarker for FTD and the disease process.
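    The building block of such an analysis is a per-patient correlation between one scalar (plasma NfL) and another (FA in a given tract). A minimal sketch of Pearson's r in plain Python follows; the paired NfL and FA values below are hypothetical stand-ins, not the study's data, chosen only to show the expected direction (higher NfL, lower FA).

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical plasma NfL (pg/mL) and tract FA values for five "patients"
nfl = [20, 35, 50, 65, 80]
fa = [0.48, 0.45, 0.41, 0.40, 0.36]
print(round(pearson_r(nfl, fa), 3))  # strongly negative, as in the study
```

    In practice a voxel-wise analysis repeats this kind of association test at every voxel with multiple-comparison correction, which is why dedicated tools are used rather than a loop like this.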

    Parsing heterogeneity within dementia with Lewy bodies using clustering of biological, clinical, and demographic data

    Dementia with Lewy bodies (DLB) includes various core clinical features that result in different phenotypes. In addition, Alzheimer's disease (AD) and cerebrovascular pathologies are common in DLB. All this increases the heterogeneity within DLB and hampers clinical diagnosis. We addressed this heterogeneity by investigating subgroups of patients with similar biological, clinical, and demographic features. We studied 107 extensively phenotyped DLB patients from the European DLB consortium. Factorial analysis of mixed data (FAMD) was used to identify dimensions in the data, based on sex, age, years of education, disease duration, Mini-Mental State Examination (MMSE), cerebrospinal fluid (CSF) levels of AD biomarkers, core features of DLB, and regional brain atrophy. Subsequently, hierarchical clustering analysis was used to subgroup individuals based on the FAMD dimensions. We identified 3 dimensions using FAMD that explained 38% of the variance. Subsequent hierarchical clustering identified 4 clusters. Cluster 1 was characterized by amyloid-β and cerebrovascular pathologies, medial temporal atrophy, and cognitive fluctuations. Cluster 2 had posterior atrophy and showed the lowest frequency of visual hallucinations and cognitive fluctuations and the worst cognitive performance. Cluster 3 had the highest frequency of tau pathology, showed posterior atrophy, and had a low frequency of parkinsonism. Cluster 4 had virtually normal AD biomarkers, the least regional brain atrophy and cerebrovascular pathology, and the highest MMSE scores. This study demonstrates that there are subgroups of DLB patients with different biological, clinical, and demographic characteristics. These findings may have implications for the diagnosis and prognosis of DLB, as well as for treatment response in clinical trials. The online version contains supplementary material available at 10.1186/s13195-021-00946-w.
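    The two-stage pipeline described (dimension reduction with FAMD, then hierarchical clustering on the dimension scores) can be illustrated with a toy agglomerative clusterer. The sketch below uses single-linkage merging on six hypothetical two-dimensional "patient" scores; the study itself used 3 FAMD dimensions and 107 patients, and the abstract does not state which linkage was used, so this shows only the general mechanism.

```python
import math

def single_linkage(points, k):
    """Agglomerative clustering: repeatedly merge the two clusters whose
    closest pair of members is nearest, until k clusters remain."""
    clusters = [[i] for i in range(len(points))]

    def dist(a, b):
        return math.dist(points[a], points[b])

    while len(clusters) > k:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(dist(a, b) for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters.pop(j)  # j > i, so index i stays valid
    return clusters

# Hypothetical patient scores on two FAMD-style dimensions
scores = [(0.1, 0.2), (0.2, 0.1), (2.0, 2.1), (2.2, 1.9), (-1.5, 2.0), (-1.4, 2.2)]
print(sorted(sorted(c) for c in single_linkage(scores, 3)))
```

    Real analyses would typically use an optimized library implementation and choose the number of clusters from the dendrogram rather than fixing k in advance.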

    Multinational survey of osteoporotic fracture management

    Osteoporosis is characterized by decreased bone mass and increased bone fragility and susceptibility to fracture.

    Delay to celiac disease diagnosis and its implications for health-related quality of life

    Background: To determine how the delay in diagnosing celiac disease (CD) has developed during recent decades and how this affects the burden of disease in terms of health-related quality of life (HRQoL), and also to consider differences with respect to sex and age. Methods: In collaboration with the Swedish Society for Coeliacs, a questionnaire was sent to 1,560 randomly selected members, divided into equal-sized age and sex strata; 1,031 (66%) responded. HRQoL was measured with the EQ-5D descriptive system and was then translated to quality-adjusted life year (QALY) scores. A general population survey was used as comparison. Results: The mean delay to diagnosis was 9.7 years from the first symptoms and 5.8 years from the first doctor visit. The delay has been reduced over time for some age groups, but is still considerable. The mean QALY score during the year prior to initiated treatment was 0.66; it improved after diagnosis and treatment to 0.86, which was better than that of the general population (0.79). Conclusions: The delay from first symptoms to CD diagnosis is unacceptably long for many persons. Untreated CD results in poor HRQoL, which improves to the level of the general population if the disease is diagnosed and treated. Shortening the diagnostic delay would reduce this unnecessary burden of disease. Increased awareness of CD as a common health problem is needed, and active case finding should be intensified. Mass screening for CD might be an option in the future.
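    The QALY figures quoted above translate directly into the burden attributable to diagnostic delay. A sketch of that arithmetic, using the abstract's own mean scores; note that extrapolating the per-year gain over the whole 9.7-year mean delay assumes the pre-treatment score applies throughout the delay, which goes beyond what the abstract states.

```python
# Mean QALY weights reported in the abstract
before = 0.66   # year prior to initiated treatment
after = 0.86    # after diagnosis and treatment
general = 0.79  # general population comparison

gain = after - before
print(f"QALY gain per treated year: {gain:.2f}")

# Rough cumulative loss over the mean 9.7-year delay from first symptoms,
# under the simplifying assumption that the 0.66 score held throughout
print(f"approx. QALYs lost over the mean delay: {9.7 * gain:.2f}")
```

    The comparison with the general population (0.79) is what supports the abstract's claim that treated patients do at least as well as the population at large.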

    Field-adapted sampling of whole blood to determine the levels of amodiaquine and its metabolite in children with uncomplicated malaria treated with amodiaquine plus artesunate combination

    Background: Artemisinin combination therapy (ACT) has been widely adopted as first-line treatment for uncomplicated falciparum malaria. In Uganda, amodiaquine plus artesunate (AQ+AS) is the alternative first-line regimen to Coartem® (artemether + lumefantrine) for the treatment of uncomplicated falciparum malaria. Currently, there are few field-adapted analytical techniques for monitoring amodiaquine utilization in patients. This study evaluates the field applicability of a new method to determine amodiaquine and its metabolite concentrations in whole blood dried on filter paper. Methods: Twelve patients aged 1.5 to 8 years with uncomplicated malaria received three standard oral doses of AQ+AS. Filter paper blood samples were collected before drug intake and at six time points over a 28-day period. A new field-adapted sampling procedure and liquid chromatographic method were used for quantitative determination of amodiaquine and its metabolite in whole blood. Results: The sampling procedure was successfully applied in the field. Amodiaquine could be quantified for at least three days and the metabolite for up to 28 days. Parasites in all 12 patients cleared within the first three days of treatment, and no adverse drug effects were observed. Conclusion: The methodology is suitable for field studies. The possibility of determining the concentration of the active metabolite of amodiaquine up to 28 days after dosing suggests that the method is sensitive enough to monitor amodiaquine utilization in patients. Amodiaquine plus artesunate appears effective for the treatment of falciparum malaria.

    Effect of commercial breakfast fibre cereals compared with corn flakes on postprandial blood glucose, gastric emptying and satiety in healthy subjects: a randomized blinded crossover trial

    Background: Dietary fibre intake is related to a reduced risk of developing diabetes mellitus, but the mechanism of this effect is still not clear. The aim of this study was to evaluate the effect of commercial fibre cereals on the rate of gastric emptying, postprandial glucose response and satiety in healthy subjects. Methods: Gastric emptying rate (GER) was measured by standardized real-time ultrasonography. Twelve healthy subjects were assessed in a randomized blinded crossover trial. The subjects were examined after an 8-hour fast and after confirmation of a normal fasting blood glucose level. Satiety scores were estimated and blood glucose was measured before the meal and at 0, 20, 30, 40, 60, 80, 100 and 120 min after the end of the meal. GER was calculated as the percentage change in the antral cross-sectional area between 15 and 90 min after ingestion of sour milk with corn flakes (GER1), cereal bran flakes (GER2) or wholemeal oat flakes (GER3). Results: The median GER was 42% for GER1, 33% for GER2 and 51% for GER3. The difference between the GER after ingestion of bran flakes and that after wholemeal oat flakes was statistically significant (p = 0.023). The postprandial delta blood glucose level was statistically significantly lower at 40 min (p = 0.045) and 120 min (p = 0.023) after the cereal bran flakes meal. There were no statistically significant differences between the areas under the curve (AUCs) of the cereals for blood glucose or satiety. Conclusion: This study demonstrates that the intake of either bran flakes or wholemeal oat flakes has no effect on the total postprandial blood glucose response or satiety compared with corn flakes. However, the intake of cereal bran flakes slows the GER compared with oat flakes and corn flakes, probably owing to a higher fibre content. Since these products do not differ in terms of glucose response and satiety in healthy subjects, they should be considered equivalent in this respect. Trial registration: ISRCTN90535566
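    The GER defined in the methods is simply the percentage reduction in antral cross-sectional area between the 15-min and 90-min measurements. A minimal sketch of that calculation; the area values below are hypothetical, not the trial's measurements.

```python
def gastric_emptying_rate(area_15, area_90):
    """Percentage reduction in antral cross-sectional area from 15 to 90 min
    after the meal; a larger value means faster emptying."""
    return (area_15 - area_90) / area_15 * 100

# Hypothetical antral areas in cm^2, giving a corn-flakes-like median GER
print(round(gastric_emptying_rate(10.0, 5.8), 1))  # -> 42.0
```

    Under this definition the reported medians (bran flakes 33% < corn flakes 42% < oat flakes 51%) mean the bran-flakes meal left the antrum fullest at 90 min, i.e. emptied slowest.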

    Dapagliflozin and Diuretic Use in Patients With Heart Failure and Reduced Ejection Fraction in DAPA-HF

    Background: In the DAPA-HF trial (Dapagliflozin and Prevention of Adverse-Outcomes in Heart Failure), the sodium-glucose cotransporter 2 inhibitor dapagliflozin reduced the risk of worsening heart failure and death in patients with heart failure and reduced ejection fraction. We examined the efficacy and tolerability of dapagliflozin in relation to background diuretic treatment and change in diuretic therapy after randomization to dapagliflozin or placebo. Methods: We examined the effects of study treatment in the following subgroups: no diuretic and diuretic doses equivalent to furosemide <40 mg, 40 mg, and >40 mg daily at baseline. We examined the primary composite end point of cardiovascular death or a worsening heart failure event and its components, all-cause death and symptoms. Results: Of 4616 analyzable patients, 736 (15.9%) were on no diuretic, 1311 (28.4%) were on <40 mg, and the remainder were on 40 mg or >40 mg. Compared with placebo, dapagliflozin reduced the risk of the primary end point across each of these subgroups: hazard ratios were 0.57 (95% CI, 0.36-0.92), 0.83 (95% CI, 0.63-1.10), 0.77 (95% CI, 0.60-0.99), and 0.78 (95% CI, 0.63-0.97), respectively (P for interaction=0.61). The hazard ratio in patients taking any diuretic was 0.78 (95% CI, 0.68-0.90). Improvements in symptoms and treatment tolerability were consistent across the diuretic subgroups. Diuretic dose did not change in most patients during follow-up, and mean diuretic dose did not differ between the dapagliflozin and placebo groups after randomization. Conclusions: The efficacy and safety of dapagliflozin were consistent across the diuretic subgroups examined in DAPA-HF.

    Infection of the Central Nervous System, Sepsis and Amyotrophic Lateral Sclerosis

    Severe infections may lead to chronic inflammation in the central nervous system (CNS), which may in turn play a role in the etiopathogenesis of amyotrophic lateral sclerosis (ALS). The relentless progression and invasive supportive treatments of ALS may, on the other hand, induce severe infections among ALS patients. The present study included 4,004 ALS patients identified from the Swedish Patient Register during 1991-2007 and 20,020 age- and sex-matched general population controls. Conditional logistic regression was used to estimate the odds ratios (ORs) of ALS given a previous hospitalization for CNS infection or sepsis. Cox models were used to estimate the hazard ratios (HRs) of hospitalization for CNS infection or sepsis after ALS diagnosis. Overall, previous CNS infection (OR: 1.3, 95% confidence interval [CI]: 0.8, 2.4) or sepsis (OR: 1.2, 95% CI: 0.9, 1.6) was not associated with ALS risk. However, compared with ALS-free individuals, ALS cases were more likely to be hospitalized for sepsis after diagnosis (HR: 2.6, 95% CI: 1.9, 3.5). We did not observe a higher risk of CNS infection after ALS diagnosis. Our results suggest that acute and severe infections are unlikely to contribute to the development of ALS; however, ALS patients are at a higher risk of sepsis after diagnosis compared with ALS-free individuals.
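    The study's ORs come from conditional logistic regression on the matched sets; the unadjusted version of the same quantity is the 2x2-table odds ratio with a Woolf (log-scale) confidence interval. A sketch with hypothetical exposure counts, since the abstract does not report the underlying cells.

```python
import math

def odds_ratio_ci(exp_cases, unexp_cases, exp_ctrls, unexp_ctrls, z=1.96):
    """Unadjusted odds ratio from a 2x2 table with a Woolf (log-scale) CI."""
    or_ = (exp_cases * unexp_ctrls) / (unexp_cases * exp_ctrls)
    # Woolf standard error of log(OR): sqrt of summed reciprocal cell counts
    se = math.sqrt(1 / exp_cases + 1 / unexp_cases + 1 / exp_ctrls + 1 / unexp_ctrls)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: prior sepsis hospitalization among 4,004 cases
# and 20,020 controls (illustrative only, not the study's data)
or_, lo, hi = odds_ratio_ci(48, 3956, 200, 19820)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

    A CI that spans 1.0, as here, is what the abstract means by "not associated with ALS risk"; the matched design is why the published analysis conditions on the matching sets rather than pooling a single table.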