
    Cytotoxic CD4+ T cells in patients with B cell chronic lymphocytic leukemia kill via a perforin-mediated pathway

    Background and Objectives: B-cell chronic lymphocytic leukemia (B-CLL) is a clonal expansion of CD5+ B cells that accumulate due to their uncontrolled growth and resistance to apoptosis. We have previously shown that up to 50% of blood CD4+ T cells in B-CLL patients have a cytotoxicity-related CD28-CD57+ phenotype and a high content of both granzyme B and perforin (PF). In this study we investigated the cytotoxic potential of these cells against autologous B-CLL cells. Design and Methods: Blood CD4+ or CD8+ T cells were positively isolated from B-CLL patients and cultured under a range of conditions with autologous purified B-CLL cells and with bispecific [anti-CD3 x anti-CD19] antibodies. Apoptosis of labeled B-CLL cells was assessed using the change in mitochondrial membrane potential measured with the fluorescent dye DiOC6 and confirmed by annexin V binding. Results: There was time- and dose-dependent killing of B-CLL cells by both CD8+ and CD4+ T cells, ranging from 6.6-68.0% for CD4+ cells and 6.4-57.8% for CD8+ cells. Almost complete inhibition by concanamycin A suggests that CD4+ T cells, like CD8+ T cells, induced apoptosis through a perforin-mediated pathway rather than via Fas/FasL (indicated by the lack of blocking with brefeldin A), tumor necrosis factor or TRAIL. Interpretation and Conclusions: This study shows that the blood CD4+PF+ T cells enriched in B-CLL patients are able to kill autologous B-CLL cells ex vivo, through bispecific antibodies, via a perforin-mediated mechanism.

    CHecklist for statistical Assessment of Medical Papers: the CHAMP statement

    Misuse of statistics in medical and sports science research is common and may lead to detrimental consequences to healthcare. Many authors, editors and peer reviewers of medical papers will not have expert knowledge of statistics or may be unconvinced about the importance of applying correct statistics in medical research. Although there are guidelines on reporting statistics in medical papers, a checklist on the more general and commonly seen aspects of statistics to assess when peer-reviewing an article is needed. In this article, we propose a CHecklist for statistical Assessment of Medical Papers (CHAMP) comprising 30 items related to the design and conduct, data analysis, reporting and presentation, and interpretation of a research paper. While CHAMP is primarily aimed at editors and peer reviewers during the statistical assessment of a medical paper, we believe it will serve as a useful reference to improve authors’ and readers’ practice in their use of statistics in medical research. We strongly encourage editors and peer reviewers to consult CHAMP when assessing manuscripts for potential publication. Authors may also apply CHAMP to ensure the validity of their statistical approach and reporting of medical research, and readers may consider using CHAMP to enhance their statistical assessment of a paper.

    A CHecklist for statistical Assessment of Medical Papers (the CHAMP statement): explanation and elaboration

    Misuse of statistics in medical and sports science research is common and may lead to detrimental consequences to healthcare. Many authors, editors and peer reviewers of medical papers will not have expert knowledge of statistics or may be unconvinced about the importance of applying correct statistics in medical research. Although there are guidelines on reporting statistics in medical papers, a checklist on the more general and commonly seen aspects of statistics to assess when peer-reviewing an article is needed. In this article, we propose a CHecklist for statistical Assessment of Medical Papers (CHAMP) comprising 30 items related to the design and conduct, data analysis, reporting and presentation, and interpretation of a research paper. While CHAMP is primarily aimed at editors and peer reviewers during the statistical assessment of a medical paper, we believe it will serve as a useful reference to improve authors’ and readers’ practice in their use of statistics in medical research. We strongly encourage editors and peer reviewers to consult CHAMP when assessing manuscripts for potential publication. Authors may also apply CHAMP to ensure the validity of their statistical approach and reporting of medical research, and readers may consider using CHAMP to enhance their statistical assessment of a paper.

    Random Convex Hulls and Extreme Value Statistics

    In this paper we study the statistical properties of convex hulls of N random points in a plane chosen according to a given distribution. The points may be chosen independently or they may be correlated. After a non-exhaustive survey of the somewhat sporadic literature and diverse methods used in the random convex hull problem, we present a unifying approach, based on the notion of the support function of a closed curve and the associated Cauchy formulae, that allows us to compute exactly the mean perimeter and the mean area enclosed by the convex polygon for both independent and correlated points. Our method demonstrates a beautiful link between the random convex hull problem and the subject of extreme value statistics. As an example of correlated points, we study here in detail the case when the points represent the vertices of n independent random walks. In the continuous-time limit this reduces to n independent planar Brownian trajectories, for which we compute exactly, for all n, the mean perimeter and the mean area of their global convex hull. Our results have relevant applications in ecology in estimating the home range of a herd of animals. Some of these results were announced recently in a short communication [Phys. Rev. Lett. 103, 140602 (2009)]. Comment: 61 pages (pedagogical review); invited contribution to the special issue of J. Stat. Phys. celebrating 50 years of the Yeshiva/Rutgers meeting.
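    Exact results of this kind are easy to sanity-check numerically: build the convex hull of N sampled points with Andrew's monotone chain, then read off the hull's perimeter and the enclosed area via the shoelace formula. A minimal sketch in pure Python; the Gaussian point cloud and the sample size are illustrative choices, not values from the paper.

```python
import math
import random

def convex_hull(points):
    """Andrew's monotone chain; returns hull vertices in counter-clockwise order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        # z-component of (a - o) x (b - o); > 0 means a left turn
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def perimeter(hull):
    return sum(math.dist(hull[i], hull[(i + 1) % len(hull)])
               for i in range(len(hull)))

def area(hull):
    # shoelace formula for a simple polygon
    return 0.5 * abs(sum(hull[i][0] * hull[(i + 1) % len(hull)][1]
                         - hull[(i + 1) % len(hull)][0] * hull[i][1]
                         for i in range(len(hull))))

random.seed(0)
pts = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(1000)]
h = convex_hull(pts)
print(len(h), round(perimeter(h), 2), round(area(h), 2))
```

Averaging perimeter(h) and area(h) over many such samples gives Monte Carlo estimates of the mean perimeter and mean area that the support-function approach computes exactly.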

    Generative Adversarial Networks for Scintillation Signal Simulation in EXO-200

    Generative Adversarial Networks trained on samples of simulated or actual events have been proposed as a way of generating large simulated datasets at a reduced computational cost. In this work, a novel approach to simulating photodetector signals from the time projection chamber of the EXO-200 experiment is demonstrated. The method is based on a Wasserstein Generative Adversarial Network, a deep learning technique that allows implicit non-parametric estimation of the population distribution for a given set of objects. Our network is trained on real calibration data using raw scintillation waveforms as input. We find that it is able to produce high-quality simulated waveforms an order of magnitude faster than the traditional simulation approach and, importantly, to generalize from the training sample and discern salient high-level features of the data. In particular, the network correctly deduces the position dependency of the scintillation light response in the detector and correctly recognizes dead photodetector channels. The network output is then integrated into the EXO-200 analysis framework to show that the standard EXO-200 reconstruction routine processes the simulated waveforms to produce energy distributions comparable to those of real waveforms. Finally, the remaining discrepancies and potential ways to improve the approach further are highlighted. Comment: 20 pages, 10 figures.
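    The Wasserstein GAN objective at the heart of the method is compact: a critic f maximizes E[f(real)] - E[f(fake)] subject to a Lipschitz constraint (enforced by weight clipping in the original WGAN formulation), while the generator minimizes the same quantity. Below is a toy sketch of the critic side only, with a one-parameter linear critic on 1-D samples; the Gaussian "real" and "fake" samples, learning rate, and clip value are all illustrative and are not the actual EXO-200 network.

```python
import random

def critic_loss(f, real, fake):
    # Wasserstein critic objective: E[f(real)] - E[f(fake)] (maximized by the critic)
    return sum(map(f, real)) / len(real) - sum(map(f, fake)) / len(fake)

# toy 1-D example: "real" samples near 1.0, "generated" samples near 0.0
random.seed(1)
real = [random.gauss(1.0, 0.1) for _ in range(500)]
fake = [random.gauss(0.0, 0.1) for _ in range(500)]

w = 0.0      # single critic weight, f(x) = w * x
clip = 1.0   # weight clipping enforces the Lipschitz bound
for _ in range(100):
    # gradient of the objective w.r.t. w is mean(real) - mean(fake)
    grad = sum(real) / len(real) - sum(fake) / len(fake)
    w = max(-clip, min(clip, w + 0.1 * grad))

# the clipped optimal critic's loss estimates the Wasserstein-1 distance
print(round(critic_loss(lambda x: w * x, real, fake), 2))
```

In the full method, f is a deep network and the generator is trained against it; this sketch only shows why the clipped critic's value tracks the distance between the two sample distributions.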

    Search for Neutrinoless Double-β Decay with the Complete EXO-200 Dataset

    A search for neutrinoless double-β decay (0νββ) in ¹³⁶Xe is performed with the full EXO-200 dataset, using a deep neural network to discriminate between 0νββ and background events. Relative to previous analyses, the signal detection efficiency has been raised from 80.8% to 96.4±3.0%, and the energy resolution of the detector at the Q value of ¹³⁶Xe 0νββ has been improved from σ/E = 1.23% to 1.15±0.02% with the upgraded detector. Accounting for the new data, the median 90% confidence level 0νββ half-life sensitivity for this analysis is 5.0×10²⁵ yr with a total ¹³⁶Xe exposure of 234.1 kg·yr. No statistically significant evidence for 0νββ is observed, leading to a lower limit on the 0νββ half-life of 3.5×10²⁵ yr at the 90% confidence level.
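    The order of magnitude of such a limit can be checked with the standard counting-experiment approximation T½ > ln2 · ε · Nt / S_up, where Nt is the exposure in atom-years, ε the signal efficiency, and S_up the upper limit on signal counts. The sketch below uses the exposure and efficiency quoted above, but S_up = 20 counts is an assumed illustrative value, not a number from the analysis (which uses a full likelihood fit rather than simple counting).

```python
import math

N_A = 6.02214076e23   # Avogadro's number, 1/mol
M_XE136 = 0.1359      # molar mass of Xe-136, kg/mol (approximate)

def halflife_limit(exposure_kg_yr, efficiency, signal_upper_limit):
    """Counting approximation: T_1/2 > ln2 * eps * (N * t) / S_up."""
    atom_years = exposure_kg_yr / M_XE136 * N_A   # exposure in atom-years
    return math.log(2) * efficiency * atom_years / signal_upper_limit

# S_up = 20 signal counts is an assumed illustrative value, not from the paper
print(f"{halflife_limit(234.1, 0.964, 20):.1e} yr")
```

With roughly 20 allowed signal counts, the 234.1 kg·yr exposure and 96.4% efficiency reproduce the quoted 10²⁵-yr scale of the half-life limit.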

    Search for Neutrinoless Double-Beta Decay with the Upgraded EXO-200 Detector

    Results from a search for neutrinoless double-beta decay (0νββ) of ¹³⁶Xe are presented using the first year of data taken with the upgraded EXO-200 detector. Relative to previous searches by EXO-200, the energy resolution of the

    Dimethyl fumarate in patients admitted to hospital with COVID-19 (RECOVERY): a randomised, controlled, open-label, platform trial

    Dimethyl fumarate (DMF) inhibits inflammasome-mediated inflammation and has been proposed as a treatment for patients hospitalised with COVID-19. This randomised, controlled, open-label platform trial (Randomised Evaluation of COVID-19 Therapy [RECOVERY]) is assessing multiple treatments in patients hospitalised for COVID-19 (NCT04381936, ISRCTN50189673). In this assessment of DMF, performed at 27 UK hospitals, adults were randomly allocated (1:1) to either usual standard of care alone or usual standard of care plus DMF. The primary outcome was clinical status on day 5, measured on a seven-point ordinal scale. Secondary outcomes were time to sustained improvement in clinical status, time to discharge, day 5 peripheral blood oxygenation, day 5 C-reactive protein, and improvement in day 10 clinical status. Between 2 March 2021 and 18 November 2021, 713 patients were enrolled in the DMF evaluation, of whom 356 were randomly allocated to receive usual care plus DMF and 357 to usual care alone. 95% of patients received corticosteroids as part of routine care. There was no evidence of a beneficial effect of DMF on clinical status at day 5 (common odds ratio of unfavourable outcome 1.12; 95% CI 0.86-1.47; p = 0.40). There was no significant effect of DMF on any secondary outcome.

    Anticipating future learning affects current control decisions: A comparison between passive and active adaptive management in an epidemiological setting

    Infectious disease epidemics present a difficult task for policymakers, requiring the implementation of control strategies under significant time constraints and uncertainty. Mathematical models can be used to predict the outcome of control interventions, providing useful information to policymakers in the event of such an epidemic. However, in the early stages of an outbreak these models suffer from a lack of accurate, relevant information regarding the dynamics and spread of the disease and the efficacy of control. As such, recommendations provided by these models are often incorporated in an ad hoc fashion, as and when more reliable information becomes available. In this work, we show that such trial-and-error-type approaches to management, which do not formally take into account the resolution of uncertainty and how control actions affect this, can lead to sub-optimal management outcomes. We compare three approaches to managing a theoretical epidemic: a non-adaptive approach that does not use real-time outbreak information to adapt control, a passive adaptive management (AM) approach that incorporates real-time information if and when it becomes available, and an active AM approach that explicitly incorporates into its initial recommendations the future resolution of uncertainty through the gathering of real-time information. The structured framework of active AM encourages the specification of quantifiable objectives, models of system behaviour, and possible control and monitoring actions, followed by an iterative learning and control phase that is able to employ complex control optimisations and resolve system uncertainty. The result is a management framework that is able to provide dynamic, long-term projections to help policymakers meet the objectives of management. We investigate in detail the effect of different methods of incorporating up-to-date outbreak information. We find that, even in a highly simplified system, the method of incorporating new data can lead to different results that may influence initial policy decisions, with an active AM approach providing better information that can lead to more desirable outcomes from an epidemic. © 2020 The Author.
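    The core contrast between committing to a policy under uncertainty and planning around future learning can be distilled into an expected-value-of-perfect-information calculation: a planner who anticipates learning which model of the system is true can prefer a different stance than one who never will. A minimal sketch; the two efficacy models and all payoff numbers are invented for illustration and do not come from the paper.

```python
# payoff[action][model]: infections averted under each action if model m is true
# (illustrative numbers only)
payoff = {
    "intensive": {"high_efficacy": 90, "low_efficacy": 10},
    "moderate":  {"high_efficacy": 50, "low_efficacy": 40},
}
prior = {"high_efficacy": 0.5, "low_efficacy": 0.5}  # equally likely a priori

def expected(action):
    """Prior-expected payoff of committing to one action forever."""
    return sum(prior[m] * payoff[action][m] for m in prior)

# Non-adaptive: commit now to the action with the best prior expectation
best_without_learning = max(expected(a) for a in payoff)

# Active AM: anticipate resolving the uncertainty, then act optimally per model
best_with_learning = sum(prior[m] * max(payoff[a][m] for a in payoff)
                         for m in prior)

# the gap is the expected value of the information that monitoring will provide
print(best_without_learning, best_with_learning)
```

Here the non-adaptive planner picks "intensive" (best on average), while a planner who expects to learn the true model does strictly better by matching the action to the model once it is known; that gap is what makes monitoring worth paying for up front.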

    Synergistic interventions to control COVID-19: Mass testing and isolation mitigates reliance on distancing

    Stay-at-home orders and shutdowns of non-essential businesses are powerful, but socially costly, tools to control the pandemic spread of SARS-CoV-2. Mass testing strategies, which rely on widely administered frequent and rapid diagnostics to identify and isolate infected individuals, could be a less disruptive management strategy, particularly where vaccine access is limited. In this paper, we assess the extent to which mass testing and isolation strategies can reduce reliance on socially costly non-pharmaceutical interventions (NPIs), such as distancing and shutdowns. We develop a multi-compartmental model of SARS-CoV-2 transmission incorporating both preventative NPIs and testing and isolation to evaluate their combined effect on public health outcomes. Our model is designed to be a policy-guiding tool that captures important realities of the testing system, including constraints on test administration and non-random testing allocation. We show how strategic changes in the characteristics of the testing system, including test administration, test delays, and test sensitivity, can reduce reliance on preventative NPIs without compromising future public health outcomes. The lowest NPI levels are possible only when many tests are administered and test delays are short, given limited immunity in the population. Reducing reliance on NPIs is highly dependent on the ability of a testing program to identify and isolate unreported, asymptomatic infections. Changes in NPIs, including the intensity of lockdowns and stay-at-home orders, should be coordinated with increases in testing to ensure epidemic control; otherwise even a small additional lifting of these NPIs can lead to dramatic increases in infections, hospitalizations and deaths. Importantly, our results can be used to guide the ramp-up of testing capacity in outbreak settings, allow for the flexible design of combined interventions based on social context, and inform future cost-benefit analyses to identify efficient pandemic management strategies.
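    The substitution between distancing and testing can be illustrated with a far simpler compartmental sketch than the paper's model: an SIR-style system in which infectious individuals are tested and isolated at rate tau, so that distancing (lower beta) and testing (higher tau) are interchangeable routes to epidemic control. All parameter values below are illustrative, not fitted values from the paper.

```python
# Minimal SIR-style sketch with testing and isolation (forward Euler steps).
# Compartments: S, I (infectious, undetected), Q (tested and isolated), R.
def simulate(beta, gamma, tau, days=200, dt=0.1, i0=1e-4):
    """tau = per-capita daily rate at which infectious people are tested and isolated."""
    s, i, q, r = 1.0 - i0, i0, 0.0, 0.0
    peak_i = i
    for _ in range(int(days / dt)):
        new_inf = beta * s * i          # isolated cases (Q) do not transmit
        ds = -new_inf
        di = new_inf - gamma * i - tau * i
        dq = tau * i - gamma * q
        dr = gamma * (i + q)
        s += ds * dt; i += di * dt; q += dq * dt; r += dr * dt
        peak_i = max(peak_i, i)
    return r, peak_i  # final epidemic size, peak undetected prevalence

# same distancing level (beta), with and without a testing/isolation program
no_testing = simulate(beta=0.30, gamma=0.10, tau=0.00)
testing    = simulate(beta=0.30, gamma=0.10, tau=0.15)
print(round(no_testing[0], 3), round(testing[0], 3))
```

Because isolation shortens the effective infectious period, the effective reproduction number becomes beta / (gamma + tau): raising tau controls the epidemic at a beta that would otherwise produce a large outbreak, which is the sense in which testing reduces reliance on distancing.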