
    Estimating the number needed to treat from continuous outcomes in randomised controlled trials: methodological challenges and worked example using data from the UK Back Pain Exercise and Manipulation (BEAM) trial

    Background Reporting numbers needed to treat (NNT) improves the interpretability of trial results. Continuous outcomes are rarely converted into numbers of individual responders to treatment (i.e., those who reach a particular threshold of change), and deteriorations prevented are only rarely considered. We consider how numbers needed to treat can be derived from continuous outcomes, illustrated with a worked example showing the methods and challenges. Methods We used data from the UK BEAM trial (n = 1,334) of physical treatments for back pain, originally reported as showing, at best, small to moderate benefits. Participants were randomised to receive 'best care' in general practice, the comparator treatment, or one of three manual and/or exercise treatments: 'best care' plus manipulation, exercise, or manipulation followed by exercise. We used established consensus thresholds for improvement in Roland-Morris disability questionnaire scores at three and twelve months to derive NNTs for improvements and for benefits (improvements gained plus deteriorations prevented). Results At three months, NNT estimates ranged from 5.1 (95% CI 3.4 to 10.7) to 9.0 (5.0 to 45.5) for exercise, 5.0 (3.4 to 9.8) to 5.4 (3.8 to 9.9) for manipulation, and 3.3 (2.5 to 4.9) to 4.8 (3.5 to 7.8) for manipulation followed by exercise. Corresponding between-group mean differences in the Roland-Morris disability questionnaire were 1.6 (0.8 to 2.3), 1.4 (0.6 to 2.1), and 1.9 (1.2 to 2.6) points. Conclusion In contrast to the small mean differences originally reported, NNTs were small and could be attractive to clinicians, patients, and purchasers. NNTs can aid the interpretation of results of trials using continuous outcomes. Where possible, they should be reported alongside mean differences. Challenges remain in calculating NNTs for some continuous outcomes.
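
    The core calculation is the inversion of an absolute risk difference once the continuous outcome has been dichotomised at an improvement threshold. The sketch below is a minimal illustration of that step only, not the BEAM analysis itself; the responder counts, the Wald-type confidence interval, and the function name nnt_from_counts are hypothetical.

        # Illustrative sketch (not the BEAM analysis): deriving an NNT from
        # responder counts after dichotomising a continuous outcome at an
        # improvement threshold. All counts below are made up.
        import math

        def nnt_from_counts(resp_t, n_t, resp_c, n_c, z=1.96):
            """NNT = 1 / absolute risk difference; a Wald-type CI for the
            risk difference is inverted to give a CI for the NNT."""
            p_t, p_c = resp_t / n_t, resp_c / n_c
            ard = p_t - p_c                      # absolute risk difference
            se = math.sqrt(p_t * (1 - p_t) / n_t + p_c * (1 - p_c) / n_c)
            lo, hi = ard - z * se, ard + z * se  # CI for the risk difference
            # Inverting the CI assumes the difference is clearly positive;
            # if the CI crosses zero the NNT interval is not a simple range.
            return 1 / ard, 1 / hi, 1 / lo

        nnt, ci_low, ci_high = nnt_from_counts(resp_t=160, n_t=300, resp_c=100, n_c=300)
        print(f"NNT = {nnt:.1f} (95% CI {ci_low:.1f} to {ci_high:.1f})")

    With these hypothetical counts the risk difference is 0.20, giving an NNT of 5, with its interval obtained by inverting the interval for the risk difference.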

    Identifying chemokines as therapeutic targets in renal disease: Lessons from antagonist studies and knockout mice

    Chemokines, in concert with cytokines and adhesion molecules, play multiple roles in local and systemic immune responses. In the kidney, the temporal and spatial expression of chemokines correlates with local renal damage and the accumulation of chemokine receptor-bearing leukocytes. Chemokines play important roles in leukocyte trafficking, and blocking chemokines can effectively reduce renal leukocyte recruitment and subsequent renal damage. However, recent data indicate that blocking chemokine or chemokine receptor activity in renal disease may also exacerbate renal inflammation under certain conditions. An increasing amount of data indicates additional roles of chemokines in the regulation of innate and adaptive immune responses, which may adversely affect the outcome of interventional studies. This review summarizes available in vivo studies on the blockade of chemokines and chemokine receptors in kidney diseases, with a special focus on the therapeutic potential of anti-chemokine strategies, including potential side effects, in renal disease.

    Persistent currents in a circular array of Bose-Einstein condensates

    A ring-shaped array of Bose-Einstein condensed atomic gases can display circular currents if the relative phase of neighboring condensates becomes locked to certain values. It is shown that, irrespective of the mechanism responsible for generating these states, only a restricted set of currents is stable, depending on the number of condensates, on the interaction and tunneling energies, and on the total number of particles. Different instabilities due to quasiparticle excitations are characterized, and possible experimental setups for testing the stability prediction are also discussed.
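
    As background for the phase-locking picture, a minimal sketch of the standard Bose-Hubbard description of a ring of N coupled condensates is given below. This is generic material on ring arrays, not the paper's derivation, and the specific stability criterion obtained in the paper is not reproduced here.

        % Standard Bose-Hubbard ring of N sites with tunneling J and on-site
        % interaction U (periodic boundary conditions, a_{N+1} = a_1):
        \begin{align}
          H &= -J \sum_{j=1}^{N} \left( a_j^\dagger a_{j+1} + a_{j+1}^\dagger a_j \right)
               + \frac{U}{2} \sum_{j=1}^{N} a_j^\dagger a_j^\dagger a_j a_j , \\
          % Phase-locked current-carrying states: the relative phase between
          % neighboring condensates winds by an integer multiple of 2\pi/N,
          \phi_{j+1} - \phi_j &= \frac{2\pi q}{N}, \qquad q \in \mathbb{Z}, \\
          % giving a Josephson-like circulating particle current per link
          I_q &\propto J \sin\!\left(\frac{2\pi q}{N}\right).
        \end{align}

    Which winding numbers q correspond to stable persistent currents then depends on N, J, U, and the particle number, which is the question the paper addresses.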

    Fast and Reliable Differentiation of Eight Trichinella Species Using a High Resolution Melting Assay

    High resolution melting analysis (HRMA) is a single-tube method that can be carried out rapidly as an additional step following real-time quantitative PCR (qPCR). The method enables the differentiation of genetic variation (down to single nucleotide polymorphisms) in amplified DNA fragments without sequencing. HRMA has previously been adopted to determine variability in the amplified genes of a number of organisms; however, only one work to date has focused on pathogenic parasites, namely nematodes from the genus Trichinella. In this study, we employed a qPCR-HRMA assay specifically targeting two sequential gene fragments, cytochrome c oxidase subunit I (COI) and expansion segment V (ESV), in order to differentiate 37 single L1 muscle larvae samples of eight Trichinella species. We show that qPCR-HRMA based on the mitochondrial COI gene allows differentiation between the sequences of PCR products of the same length. This simple, rapid and reliable method can be used to identify single larvae of eight Trichinella taxa at the species level.
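
    HRMA differentiates amplicons by the position and shape of their melt curves; a common summary is the melting temperature taken at the peak of the negative first derivative of fluorescence with respect to temperature. The sketch below illustrates only that generic step on synthetic data; it is not the assay or the analysis software used in this study.

        # Illustrative sketch only (not the assay in the paper): estimating a
        # melting temperature (Tm) from a high-resolution melt curve as the peak
        # of -dF/dT. Temperatures and fluorescence values are synthetic.
        import numpy as np

        temps = np.arange(70.0, 90.0, 0.1)                 # deg C, fine steps
        # Synthetic sigmoidal melt curve with a transition around ~78 deg C
        fluor = 1.0 / (1.0 + np.exp((temps - 78.0) / 0.5))

        neg_dFdT = -np.gradient(fluor, temps)              # negative first derivative
        tm_estimate = temps[np.argmax(neg_dFdT)]           # peak of -dF/dT

        print(f"Estimated Tm: {tm_estimate:.1f} deg C")
        # In practice, normalised curves from different species/amplicons would
        # be compared or clustered by Tm and overall curve shape.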

    Manifold Elastic Net: A Unified Framework for Sparse Dimension Reduction

    It is difficult to find the optimal sparse solution of a manifold learning based dimensionality reduction algorithm. Manifold learning based dimensionality reduction with a lasso or elastic net penalty is not directly a lasso penalized least squares problem, and thus least angle regression (LARS) (Efron et al.), one of the most popular algorithms in sparse learning, cannot be applied. Therefore, most current approaches take indirect ways or impose strict settings, which can be inconvenient for applications. In this paper, we propose the manifold elastic net (MEN). MEN incorporates the merits of both manifold learning based and sparse learning based dimensionality reduction. By using a series of equivalent transformations, we show that MEN is equivalent to a lasso penalized least squares problem, and thus LARS can be adopted to obtain the optimal sparse solution of MEN. In particular, MEN has the following advantages for subsequent classification: 1) the local geometry of samples is well preserved in the low dimensional data representation, 2) both margin maximization and classification error minimization are considered in the sparse projection calculation, 3) the projection matrix of MEN improves parsimony in computation, 4) the elastic net penalty reduces over-fitting, and 5) the projection matrix of MEN can be interpreted psychologically and physiologically. Experimental evidence on face recognition over various popular datasets suggests that MEN is superior to top-level dimensionality reduction algorithms.
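
    The equivalence the abstract refers to rests on the standard device of absorbing the l2 (ridge) part of an elastic net penalty into an augmented least-squares problem, after which a LARS-type solver applies. The sketch below shows that generic augmentation on synthetic data; it is not the MEN objective itself, whose manifold regularisation involves the additional transformations described in the paper, and the data and penalty levels are illustrative.

        # Sketch of the generic elastic-net-to-lasso augmentation (in the style
        # of Zou & Hastie), NOT the MEN algorithm itself. Data and penalty
        # weights are illustrative.
        import numpy as np
        from sklearn.linear_model import LassoLars

        rng = np.random.default_rng(0)
        n, p = 100, 20
        X = rng.normal(size=(n, p))
        beta_true = np.zeros(p)
        beta_true[:3] = [2.0, -1.5, 1.0]
        y = X @ beta_true + 0.1 * rng.normal(size=n)

        lam2 = 1.0   # ridge (l2) weight of the elastic net, illustrative

        # Augmentation: fold the ridge term into the least-squares part so the
        # remaining problem is a pure lasso, which LARS can solve.
        X_aug = np.vstack([X, np.sqrt(lam2) * np.eye(p)]) / np.sqrt(1.0 + lam2)
        y_aug = np.concatenate([y, np.zeros(p)])

        fit = LassoLars(alpha=0.01, fit_intercept=False).fit(X_aug, y_aug)  # l1 level illustrative
        beta_hat = fit.coef_ / np.sqrt(1.0 + lam2)   # undo the augmentation scaling
        print(np.round(beta_hat, 2))

    In MEN, the design matrix and response fed into this kind of lasso problem are themselves constructed from the manifold-regularised objective through the equivalent transformations described in the paper.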

    Different definition of sarcopenia and mortality in cancer: A meta-analysis

    Objectives: Sarcopenia has been an emerging theme in clinical oncology. Various definitions of sarcopenia have been proposed, but their prognostic performance has yet to be evaluated and compared. The aim of this meta-analysis is to comprehensively evaluate the performance of different cutoff definitions of sarcopenia in cancer mortality prognostication. / Methods: This is a meta-analysis. Cohort studies on lean mass and mortality published before December 20, 2017 were obtained by systematic search of PubMed, the Cochrane Library, and Embase. Inclusion criteria were cohort studies reporting binary lean mass categorized according to clearly defined cutoffs, with all-cause mortality as the study outcome. Studies were stratified according to the cutoff(s) used in defining low lean mass. The cutoff-specific hazard ratios (HRs) and 95% confidence intervals (CIs) of low lean mass on cancer mortality were pooled with a random-effects model and compared. / Results: In total, 81 studies reporting binary lean mass were included. The pooled HRs on cancer mortality using the three most commonly used definitions were 1.74 (95% CI, 1.46–2.07) using the definition proposed by the International Consensus of Cancer Cachexia, 1.45 (95% CI, 1.21–1.75) using that by Martin, and 1.58 (95% CI, 1.35–1.84) using that by Prado. The associations between sarcopenia and cancer mortality using other definitions were all statistically significant, although the point estimates differed. / Conclusions: The association of low lean mass with increased mortality was consistent across different definitions; this provides further evidence of poorer survival in cancer patients with sarcopenia. However, further studies evaluating the performance of each definition are warranted.
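
    The pooled cutoff-specific estimates come from a random-effects model on the log hazard ratio scale. The sketch below shows a generic DerSimonian-Laird random-effects pooling of log HRs on invented inputs; it is not a reproduction of the meta-analysis and the study-level numbers are hypothetical.

        # Illustrative sketch (hypothetical numbers, not the studies pooled in
        # the meta-analysis): DerSimonian-Laird random-effects pooling of log
        # hazard ratios, the kind of model used for the cutoff-specific HRs.
        import numpy as np

        hr = np.array([1.9, 1.5, 1.7, 2.2, 1.3])          # hypothetical study HRs
        ci_low = np.array([1.2, 1.0, 1.1, 1.4, 0.9])      # hypothetical lower 95% limits
        ci_high = np.array([3.0, 2.3, 2.6, 3.5, 1.9])     # hypothetical upper 95% limits

        y = np.log(hr)                                     # log hazard ratios
        se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)

        # Fixed-effect weights and Cochran's Q
        w = 1.0 / se**2
        q = np.sum(w * (y - np.sum(w * y) / np.sum(w))**2)
        k = len(y)

        # DerSimonian-Laird between-study variance tau^2
        c = np.sum(w) - np.sum(w**2) / np.sum(w)
        tau2 = max(0.0, (q - (k - 1)) / c)

        # Random-effects pooled estimate on the log scale, back-transformed
        w_star = 1.0 / (se**2 + tau2)
        pooled = np.sum(w_star * y) / np.sum(w_star)
        se_pooled = np.sqrt(1.0 / np.sum(w_star))

        print(f"Pooled HR: {np.exp(pooled):.2f} "
              f"(95% CI {np.exp(pooled - 1.96 * se_pooled):.2f} "
              f"to {np.exp(pooled + 1.96 * se_pooled):.2f})")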

    Amyloid β induces early changes in the ribosomal machinery, cytoskeletal organization and oxidative phosphorylation in retinal photoreceptor cells

    Amyloid β (Aβ) accumulation and aggregation are characteristic molecular features of the development of Alzheimer's disease (AD). More recently, Aβ has been suggested to be associated with retinal pathology in AD, glaucoma and drusen deposits in age-related macular degeneration (AMD). In this study, we investigated the proteins and biochemical networks that are affected by Aβ in 661W photoreceptor cells in culture. Time- and dose-dependent effects of Aβ on the photoreceptor cells were determined using a tandem mass tag (TMT) labeling-based quantitative mass spectrometry approach. Bioinformatic analysis of the data revealed concentration- and time-dependent effects of Aβ peptide stimulation on various key biochemical pathways that might be involved in mediating the toxic effects of the peptide. We identified increased Tau phosphorylation, GSK3β dysregulation and reduced cell viability in cells treated with Aβ in a dose- and time-dependent manner. This study delineates for the first time the molecular networks in photoreceptor cells that are impacted early upon Aβ treatment and contrasts these findings with the effects of longer-term treatment. Proteins associated with ribosomal machinery homeostasis, mitochondrial function and cytoskeletal organization were affected in the initial stages of Aβ exposure, which may provide key insights into the effects of AD on photoreceptors and the specific molecular changes induced by the Aβ peptide.
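
    TMT-based quantitation compares reporter-ion intensities across labelled channels, typically summarised per protein as a fold change with a significance test. A much reduced version of that per-protein comparison is sketched below on synthetic values; the protein names, intensities and test are illustrative only and do not reflect the authors' actual workflow.

        # Generic illustration (not the authors' pipeline): a minimal per-protein
        # comparison of TMT reporter-ion intensities between Abeta-treated and
        # control channels, using log2 fold change and a two-sample t-test.
        # All intensities and protein names below are synthetic examples.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        proteins = ["RPL7", "TUBB", "ATP5A1"]                        # example identifiers
        control = rng.lognormal(mean=10, sigma=0.2, size=(3, 3))     # 3 proteins x 3 channels
        treated = control * np.array([[0.6], [1.0], [1.5]]) \
                  * rng.lognormal(mean=0, sigma=0.1, size=(3, 3))

        log_c, log_t = np.log2(control), np.log2(treated)
        for i, name in enumerate(proteins):
            lfc = log_t[i].mean() - log_c[i].mean()                  # log2 fold change
            t, p = stats.ttest_ind(log_t[i], log_c[i])
            print(f"{name}: log2FC = {lfc:+.2f}, p = {p:.3f}")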