
    Neutrophils are on the payroll

    Get PDF
    Viral hemorrhagic fever caused by hantaviruses is an emerging infectious disease for which suitable treatments are not available. In order to improve this situation, a better understanding of hantaviral pathogenesis is urgently required. Hantaviruses infect endothelial cell layers in vitro without causing any cytopathogenic effect and without increasing permeability. This implies that the mechanisms underlying vascular hyperpermeability in hantavirus-associated disease are more complex and that immune mechanisms play an important role. In this review we highlight the latest developments in hantavirus-induced immunopathogenesis. A possible contribution of neutrophils has been neglected so far. For this reason, we place special emphasis on the pathogenic role of neutrophils in disrupting the endothelial barrier.

    Tests of Bayesian Model Selection Techniques for Gravitational Wave Astronomy

    Full text link
    The analysis of gravitational wave data involves many model selection problems. The most important example is the detection problem of selecting between the data being consistent with instrument noise alone, or instrument noise and a gravitational wave signal. The analysis of data from ground based gravitational wave detectors is mostly conducted using classical statistics, and methods such as the Neyman-Pearson criteria are used for model selection. Future space based detectors, such as the Laser Interferometer Space Antenna (LISA), are expected to produce rich data streams containing the signals from many millions of sources. Determining the number of sources that are resolvable, and the most appropriate description of each source, poses a challenging model selection problem that may best be addressed in a Bayesian framework. An important class of LISA sources is the millions of low-mass binary systems within our own galaxy, tens of thousands of which will be detectable. Not only is the number of sources unknown, but so is the number of parameters required to model the waveforms. For example, a significant subset of the resolvable galactic binaries will exhibit orbital frequency evolution, while a smaller number will have measurable eccentricity. In the Bayesian approach to model selection one needs to compute the Bayes factor between competing models. Here we explore various methods for computing Bayes factors in the context of determining which galactic binaries have measurable frequency evolution. The methods explored include a Reversible Jump Markov Chain Monte Carlo (RJMCMC) algorithm, Savage-Dickey density ratios, the Schwarz-Bayes Information Criterion (BIC), and the Laplace approximation to the model evidence. We find good agreement between all of the approaches. Comment: 11 pages, 6 figures.
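
    To make the last step concrete, here is a small, self-contained sketch of one of the methods listed above: approximating a Bayes factor for frequency evolution with the Schwarz-Bayes Information Criterion, using an illustrative sinusoidal stand-in for a galactic-binary waveform (the signal parameters and noise level are assumptions, not numbers from the paper).

        # Minimal sketch (not the paper's pipeline): approximate the Bayes factor between a
        # monochromatic signal model and one with frequency evolution using the Schwarz-Bayes
        # Information Criterion. The sinusoidal "waveform", noise level and parameter values
        # are illustrative assumptions, not quantities taken from the paper.
        import numpy as np
        from scipy.optimize import curve_fit

        rng = np.random.default_rng(0)
        t = np.linspace(0.0, 1.0, 512)        # observation times (arbitrary units)
        sigma = 0.5                           # assumed Gaussian noise level

        def mono(t, A, f, phi):               # model 0: constant frequency
            return A * np.sin(2 * np.pi * f * t + phi)

        def drift(t, A, f, fdot, phi):        # model 1: linear frequency evolution
            return A * np.sin(2 * np.pi * (f + 0.5 * fdot * t) * t + phi)

        # Synthetic data containing a genuine frequency drift
        data = drift(t, 1.0, 10.0, 3.0, 0.3) + rng.normal(0.0, sigma, t.size)

        def bic(model, p0):
            """Maximum-likelihood (least-squares) fit, then BIC = -2 ln L_max + k ln N.
            The constant part of ln L is dropped; it cancels in the comparison."""
            popt, _ = curve_fit(model, t, data, p0=p0, maxfev=20000)
            resid = data - model(t, *popt)
            return np.sum(resid**2) / sigma**2 + len(p0) * np.log(t.size)

        # Positive values favour the frequency-evolution model; the BIC difference ~ 2 ln B_10
        print("approx 2 ln B_10 =", bic(mono, [1.0, 10.0, 0.0]) - bic(drift, [1.0, 10.0, 1.0, 0.0]))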

    Bayesian Hyperbolic Multidimensional Scaling

    Full text link
    Multidimensional scaling (MDS) is a widely used approach to representing high-dimensional, dependent data. MDS works by assigning each observation a location on a low-dimensional geometric manifold, with distance on the manifold representing similarity. We propose a Bayesian approach to multidimensional scaling when the low-dimensional manifold is hyperbolic. Using hyperbolic space facilitates representing tree-like structures common in many settings (e.g. text or genetic data with hierarchical structure). A Bayesian approach provides regularization that minimizes the impact of measurement error in the observed data and assesses uncertainty. We also propose a case-control likelihood approximation that allows for efficient sampling from the posterior distribution in larger data settings, reducing computational complexity from approximately O(n^2) to O(n). We evaluate the proposed method against state-of-the-art alternatives using simulations, canonical reference datasets, Indian village network data, and human gene expression data.
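
    As a rough illustration of the two ingredients named above, the sketch below implements the hyperbolic (Poincare-ball) distance and a case-control style subsampled log-likelihood; the Gaussian error model, the subsampling scheme and all numerical values are illustrative assumptions rather than details taken from the paper.

        # Minimal sketch (illustrative assumptions, not the paper's model): hyperbolic distance
        # in the Poincare ball, an exact Gaussian MDS log-likelihood over all pairs, and a
        # case-control style subsampled version that reweights a fixed number of random
        # partners per point, reducing the cost from O(n^2) to O(n*m).
        import numpy as np

        def poincare_dist(u, v, eps=1e-9):
            """Hyperbolic distance between two points inside the unit Poincare ball."""
            num = 2.0 * np.sum((u - v) ** 2)
            den = max((1.0 - np.sum(u * u)) * (1.0 - np.sum(v * v)), eps)
            return np.arccosh(1.0 + num / den)

        def full_loglik(Z, D, sigma):
            """Exact pairwise Gaussian log-likelihood (up to constants): O(n^2) terms."""
            n = Z.shape[0]
            return sum(-0.5 * (D[i, j] - poincare_dist(Z[i], Z[j])) ** 2 / sigma**2
                       for i in range(n) for j in range(i + 1, n))

        def case_control_loglik(Z, D, sigma, m, rng):
            """Unbiased approximation using m random partners per point: O(n*m) terms."""
            n = Z.shape[0]
            ll = 0.0
            for i in range(n):
                partners = rng.choice(np.delete(np.arange(n), i), size=m, replace=False)
                w = (n - 1) / m                       # reweight the subsample
                for j in partners:
                    ll += -0.5 * w * (D[i, j] - poincare_dist(Z[i], Z[j])) ** 2 / sigma**2
            return 0.5 * ll                           # each pair is seen from both endpoints

        rng = np.random.default_rng(1)
        n = 200
        Z = 0.2 * rng.normal(size=(n, 2))             # latent positions near the origin
        D = np.array([[poincare_dist(Z[i], Z[j]) for j in range(n)] for i in range(n)])
        E = 0.05 * rng.normal(size=(n, n))
        D = D + 0.5 * (E + E.T)                       # symmetric measurement noise
        print(full_loglik(Z, D, 0.05), case_control_loglik(Z, D, 0.05, m=20, rng=rng))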

    A Bayesian Approach to the Detection Problem in Gravitational Wave Astronomy

    Full text link
    The analysis of data from gravitational wave detectors can be divided into three phases: search, characterization, and evaluation. The evaluation of the detection - determining whether a candidate event is astrophysical in origin or some artifact created by instrument noise - is a crucial step in the analysis. The ongoing analyses of data from ground based detectors employ a frequentist approach to the detection problem. A detection statistic is chosen, for which background levels and detection efficiencies are estimated from Monte Carlo studies. This approach frames the detection problem in terms of an infinite collection of trials, with the actual measurement corresponding to some realization of this hypothetical set. Here we explore an alternative, Bayesian approach to the detection problem, that considers prior information and the actual data in hand. Our particular focus is on the computational techniques used to implement the Bayesian analysis. We find that the Parallel Tempered Markov Chain Monte Carlo (PTMCMC) algorithm is able to address all three phases of the analysis in a coherent framework. The signals are found by locating the posterior modes, the model parameters are characterized by mapping out the joint posterior distribution, and finally, the model evidence is computed by thermodynamic integration. As a demonstration, we consider the detection problem of selecting between models describing the data as instrument noise, or instrument noise plus the signal from a single compact galactic binary. The evidence ratios, or Bayes factors, computed by the PTMCMC algorithm are found to be in close agreement with those computed using a Reversible Jump Markov Chain Monte Carlo algorithm. Comment: 19 pages, 12 figures, revised to address referee's comments.
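
    The following sketch shows thermodynamic integration on a deliberately simple conjugate Gaussian model, chosen so the exact evidence is available for comparison; it illustrates the idea rather than the paper's LISA analysis.

        # Minimal sketch (toy conjugate-Gaussian model, not the paper's LISA analysis):
        # thermodynamic integration, ln Z = integral_0^1 <ln L>_beta d(beta), estimated with a
        # ladder of tempered Metropolis-Hastings chains. The prior width, data, temperature
        # ladder, chain length and step size are all illustrative assumptions.
        import numpy as np
        from scipy.stats import norm, multivariate_normal
        from scipy.integrate import trapezoid

        rng = np.random.default_rng(2)
        tau = 3.0                                   # prior std of the unknown mean
        y = rng.normal(1.5, 1.0, size=20)           # data: y_i ~ N(mu, 1) with mu = 1.5

        def loglike(mu):
            return np.sum(norm.logpdf(y, loc=mu, scale=1.0))

        def logprior(mu):
            return norm.logpdf(mu, loc=0.0, scale=tau)

        def tempered_chain(beta, n_steps=4000, step=0.5):
            """Metropolis-Hastings targeting prior(mu) * L(mu)^beta; returns <ln L>_beta."""
            mu = 0.0
            ll = loglike(mu)
            lp = logprior(mu) + beta * ll
            samples = []
            for _ in range(n_steps):
                prop = mu + step * rng.normal()
                ll_prop = loglike(prop)
                lp_prop = logprior(prop) + beta * ll_prop
                if np.log(rng.random()) < lp_prop - lp:
                    mu, lp, ll = prop, lp_prop, ll_prop
                samples.append(ll)
            return np.mean(samples[n_steps // 2:])  # discard the first half as burn-in

        betas = np.linspace(0.0, 1.0, 21) ** 3      # cluster temperature rungs near beta = 0
        logZ_ti = trapezoid([tempered_chain(b) for b in betas], betas)

        # Exact evidence for this conjugate model, for comparison
        cov = np.eye(y.size) + tau**2 * np.ones((y.size, y.size))
        logZ_exact = multivariate_normal(mean=np.zeros(y.size), cov=cov).logpdf(y)
        print(f"thermodynamic integration: {logZ_ti:.3f}   exact: {logZ_exact:.3f}")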

    ‘Not clinically effective but cost-effective’ - paradoxical conclusions in randomised controlled trials with ‘doubly null’ results: a cross-sectional study

    Get PDF
    Objectives: Randomised controlled trials in healthcare increasingly include economic evaluations. Some show small differences which are not statistically significant. Yet these sometimes come to paradoxical conclusions such as: 'the intervention is not clinically effective' but 'is probably cost-effective'. This study aims to quantify the extent of non-significant results and the types of conclusions drawn from them. Design: Cross-sectional retrospective analysis of randomised trials published by the UK's National Institute for Health Research (NIHR) Health Technology Assessment programme. We defined as 'doubly null' those trials that found non-statistically significant differences in both primary outcome and cost per patient. Paradoxical was defined as concluding in favour of an intervention, usually compared with placebo or usual care. No human participants were involved. Our sample was 226 randomised trial projects published by the Health Technology Assessment programme from 2004 to 2017. All are available free online. Results: The 226 projects contained 193 trials with a full economic evaluation. Of these, 76 (39%) had at least one 'doubly null' comparison. These 76 trials contained 94 comparisons, of which 30 (32%) drew economic conclusions in favour of an intervention. Overall report conclusions split roughly equally between those favouring the intervention (14), and those favouring either the control (7) or uncertainty (9). Discussion: Trials with 'doubly null' results and paradoxical conclusions are not uncommon. The differences observed in cost and quality-adjusted life years were small and not statistically significant. Almost all these trials were also published in leading peer-reviewed journals. Although some guidelines for reporting economic results require cost-effectiveness estimates regardless of statistical significance, the interpretability of paradoxical results has nowhere been addressed. Conclusions: Reconsideration is required of the interpretation of cost-effectiveness analyses in randomised controlled trials with 'doubly null' results, particularly when the economics favours a novel intervention.
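
    For readers unfamiliar with the classification, the short sketch below applies the 'doubly null' definition to some made-up comparisons and tallies the paradoxical conclusions; none of the numbers come from the study.

        # Minimal sketch with made-up numbers (not data from the study): flag comparisons as
        # "doubly null" when both the primary outcome and the cost difference are
        # non-significant, then count how many of those still favour the intervention.
        comparisons = [
            # (primary outcome p-value, cost difference p-value, reported conclusion)
            (0.42, 0.61, "favours intervention"),
            (0.03, 0.55, "favours intervention"),
            (0.71, 0.08, "favours control"),
            (0.64, 0.33, "uncertain"),
        ]

        def doubly_null(p_outcome, p_cost, alpha=0.05):
            return p_outcome >= alpha and p_cost >= alpha

        dn = [c for c in comparisons if doubly_null(c[0], c[1])]
        paradoxical = [c for c in dn if c[2] == "favours intervention"]
        print(f"doubly null: {len(dn)}/{len(comparisons)}; "
              f"paradoxical among them: {len(paradoxical)}/{len(dn)}")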

    Investigating the driving mechanisms of coronal mass ejections

    Full text link
    The objective of this investigation was to first examine the kinematics of coronal mass ejections (CMEs) using EUV and coronagraph images, and then to make a comparison with theoretical models in the hope of identifying the driving mechanisms of the CMEs. We have studied two CMEs which occurred on 2006 Dec. 17 (CME06) and 2007 Dec. 31 (CME07). The models studied in this work were the catastrophe, breakout, and toroidal instability models. We found that after the eruption, the accelerations of both events exhibited a drop before increasing again. Our comparisons with the theories suggested that CME06 can be best described by a hybrid of the catastrophe and breakout models, while CME07 is most consistent with the breakout model. Comment: 9 pages, 7 figures.
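
    The kinematic step described above amounts to differentiating leading-edge height-time measurements; the sketch below illustrates this on synthetic data rather than the observed events.

        # Minimal sketch (synthetic height-time points, not measurements from the two events):
        # derive the CME velocity and acceleration profiles from leading-edge heights by
        # numerical differentiation; these are the kinematic curves that get compared
        # against the catastrophe, breakout and toroidal-instability models.
        import numpy as np

        rng = np.random.default_rng(3)
        t = np.arange(0.0, 3600.0, 300.0)            # time since onset [s]
        h = 1.1e6 + 150.0 * t + 0.05 * t**2          # leading-edge height [km], illustrative
        h += rng.normal(0.0, 2e3, t.size)            # measurement scatter

        v = np.gradient(h, t)                        # velocity [km/s]
        a = np.gradient(v, t)                        # acceleration [km/s^2]
        for ti, vi, ai in zip(t, v, a):
            print(f"t = {ti:6.0f} s   v = {vi:7.1f} km/s   a = {1e3 * ai:8.2f} m/s^2")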

    Metformin Treatment Does Not Inhibit Growth of Pancreatic Cancer Patient-Derived Xenografts

    Get PDF
    There is currently tremendous interest in developing anti-cancer therapeutics targeting cell signaling pathways important for both cancer cell metabolism and growth. Several epidemiological studies have shown that diabetic patients taking metformin have a decreased incidence of pancreatic cancer. This has prompted efforts to evaluate metformin, a drug with negligible toxicity, as a therapeutic modality in pancreatic cancer. Preclinical studies in cell line xenografts and one study in patient-derived xenograft (PDX) models were promising, while recently published clinical trials showed no benefit to adding metformin to combination therapy regimens for locally advanced and metastatic pancreatic cancer. PDX models, in which patient tumors are directly engrafted into immunocompromised mice, have been shown to be excellent preclinical models for biomarker discovery and therapeutic development. We evaluated the response of four PDX tumor lines to metformin treatment and found that all four of our PDX lines were resistant to metformin. We found that the mechanisms of resistance may occur through lack of sustained activation of adenosine monophosphate-activated protein kinase (AMPK) or downstream reactivation of the mammalian target of rapamycin (mTOR). Moreover, combined treatment with metformin and mTOR inhibitors failed to improve responses in cell lines, which further indicates that metformin alone or in combination with mTOR inhibitors will be ineffective in patients, and that resistance to metformin may occur through multiple pathways. Further studies are required to better understand these mechanisms of resistance and inform potential combination therapies with metformin and existing or novel therapeutics.

    The SWAP EUV Imaging Telescope Part I: Instrument Overview and Pre-Flight Testing

    Full text link
    The Sun Watcher with Active Pixels and Image Processing (SWAP) is an EUV solar telescope on board ESA's Project for Onboard Autonomy 2 (PROBA2) mission launched on 2 November 2009. SWAP has a spectral bandpass centered on 17.4 nm and provides images of the low solar corona over a 54x54 arcmin field of view with 3.2 arcsec pixels and an imaging cadence of about two minutes. SWAP is designed to monitor all space-weather-relevant events and features in the low solar corona. Given the limited resources of the PROBA2 microsatellite, the SWAP telescope is designed with various innovative technologies, including an off-axis optical design and a CMOS-APS detector. This article provides reference documentation for users of the SWAP image data. Comment: 26 pages, 9 figures, 1 movie.
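
    As a quick consistency check of the quoted numbers (the implied detector format is an inference, not a figure from the article):

        # Quick arithmetic check of the quoted field of view and pixel scale; the implied
        # ~1024x1024 detector format is an inference, not a figure from the article.
        fov_arcsec = 54 * 60                # 54 arcmin per side, in arcsec
        pixel_scale = 3.2                   # arcsec per pixel
        print(f"{fov_arcsec / pixel_scale:.0f} pixels per side, consistent with a ~1024x1024 array")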