
    Editor's Note


    "Virus hunting" using radial distance weighted discrimination

    Motivated by the challenge of using DNA-seq data to identify viruses in human blood samples, we propose a novel classification algorithm called "Radial Distance Weighted Discrimination" (or Radial DWD). This classifier is designed for binary classification where one class is surrounded by the other class in very diverse radial directions, which is typical of our virus detection data. This separation of the two classes in multiple radial directions naturally motivates the development of Radial DWD. While classical machine learning methods such as the Support Vector Machine and linear Distance Weighted Discrimination can sometimes give reasonable answers for a given data set, their generalizability is severely compromised by the linear separating boundary. Radial DWD addresses this challenge by using a more appropriate (in this particular case) spherical separating boundary. Simulations show that for appropriate radial contexts, this gives much better generalizability than linear methods, and also much better than conventional kernel-based (nonlinear) Support Vector Machines, because the latter methods essentially use much of the information in the data for determining the shape of the separating boundary. The effectiveness of Radial DWD is demonstrated for real virus detection. Comment: Published at http://dx.doi.org/10.1214/15-AOAS869 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org)
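    The geometric idea behind a spherical separating boundary can be sketched as follows. This is a simplified illustration of a radial decision rule, not the published Radial DWD algorithm; the fitting rule (midpoint radius around the inner-class mean) and all names are assumptions for the sketch.

    ```python
    import numpy as np

    def fit_radial_boundary(inner, outer):
        """Center on the inner class; choose a radius between the two classes.

        Illustrative rule only: the real Radial DWD solves an optimization
        problem, not this simple midpoint construction.
        """
        center = inner.mean(axis=0)
        r_in = np.linalg.norm(inner - center, axis=1).max()   # farthest inner point
        r_out = np.linalg.norm(outer - center, axis=1).min()  # nearest outer point
        return center, 0.5 * (r_in + r_out)                   # midpoint radius

    def predict(x, center, radius):
        """+1 if x falls inside the sphere (inner class), -1 otherwise."""
        return 1 if np.linalg.norm(x - center) <= radius else -1

    rng = np.random.default_rng(0)
    inner = rng.normal(0, 1, size=(100, 5))                   # inner class near origin
    outer = rng.normal(0, 1, size=(100, 5))
    outer *= (6 / np.linalg.norm(outer, axis=1))[:, None]     # outer class on a shell

    center, radius = fit_radial_boundary(inner, outer)
    print(predict(np.zeros(5), center, radius))               # point at center → 1
    ```

    A linear boundary can separate these two classes locally, but only a closed (here spherical) boundary generalizes when the outer class surrounds the inner one in all radial directions.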

    The use of hybrid cellular automaton models for improving cancer therapy, In Proceedings, Cellular Automata: 6th International Conference on Cellular Automata for Research and Industry, ACRI 2004, Amsterdam, The Netherlands, eds P.M.A. Sloot, B. Chopard, A.G. Hoekstra

    The Hybrid Cellular Automata (HCA) modelling framework can be an efficient approach to a number of biological problems, particularly those which involve the integration of multiple spatial and temporal scales. As such, HCA may become a key modelling tool in the development of so-called integrative biology. In this paper, we first discuss HCA on a general level and then present results obtained when this approach was implemented in cancer research.

    A mathematical model of Doxorubicin treatment efficacy on non-Hodgkin’s lymphoma: Investigation of current protocol through theoretical modelling results

    Doxorubicin treatment outcomes for non-Hodgkin's lymphomas (NHL) are mathematically modelled and computationally analyzed. The NHL model includes a tumor structure incorporating mature and immature vessels, vascular structural adaptation and NHL cell-cycle kinetics, in addition to Doxorubicin pharmacokinetics (PK) and pharmacodynamics (PD). Simulations provide qualitative estimations of the effect of Doxorubicin on high-grade (HG), intermediate-grade (IG) and low-grade (LG) NHL. Simulation results imply that if the interval between successive drug applications is prolonged beyond a certain point, treatment will be inefficient due to effects caused by heterogeneous blood flow in the system.
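    The dependence of treatment efficacy on the dosing interval can be illustrated with a minimal one-compartment pharmacokinetic sketch. All parameters and the model form here are assumptions for illustration, not the paper's PK/PD model, which also includes vascular adaptation and cell-cycle kinetics.

    ```python
    import math

    def trough_concentration(dose, half_life_h, interval_h, n_doses):
        """Drug concentration just before the next dose, after repeated boluses.

        Assumes instantaneous absorption and first-order elimination
        (toy model, illustrative parameters only).
        """
        k = math.log(2) / half_life_h        # elimination rate constant
        c = 0.0
        for _ in range(n_doses):
            c += dose                        # instantaneous bolus dose
            c *= math.exp(-k * interval_h)   # exponential decay until next dose
        return c

    # Same total number of doses, two different dosing intervals:
    short = trough_concentration(dose=50, half_life_h=30, interval_h=24, n_doses=6)
    long_ = trough_concentration(dose=50, half_life_h=30, interval_h=96, n_doses=6)
    print(short > long_)  # True: longer intervals let exposure decay further
    ```

    In this toy setting, stretching the interval between applications lowers the trough drug exposure, consistent with the qualitative conclusion that overly long intervals make treatment inefficient.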

    Efficiency clustering for low-density microarrays and its application to QPCR

    Background: Pathway-targeted or low-density arrays are used more and more frequently in biomedical research, particularly those arrays that are based on quantitative real-time PCR. Typical QPCR arrays contain 96-1024 primer pairs or probes, and they bring with them the promise of being able to reliably measure differences in target levels without the need to establish absolute standard curves for each and every target. To achieve reliable quantification, all primer pairs or array probes must perform with the same efficiency. Results: Our results indicate that QPCR primer pairs differ significantly in both reliability and efficiency. They can only be used in an array format if the raw data (so-called CT values for real-time QPCR) are transformed to take these differences into account. We developed a novel method to obtain efficiency-adjusted CT values. We introduce transformed confidence intervals as a novel measure to identify unreliable primers. We introduce a robust clustering algorithm to combine efficiencies of groups of probes, and our results indicate that using n < 10 cluster-based mean efficiencies is comparable to using individually determined efficiency adjustments for each primer pair (N = 96-1024). Conclusions: Careful estimation of primer efficiency is necessary to avoid significant measurement inaccuracies. Transformed confidence intervals are a novel method to assess and interpret the reliability of an efficiency estimate in a high-throughput format. Efficiency clustering as developed here serves as a compromise between the imprecision of assuming uniform efficiency and the computational complexity and danger of over-fitting when using individually determined efficiencies.
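    Why efficiency adjustment matters can be sketched numerically. A common model (assumed here; not necessarily the paper's exact transform) is that the starting amount of a target scales as (1 + E)^(-CT), where E is the primer efficiency (E = 1 means perfect doubling per cycle) and CT is the threshold cycle.

    ```python
    def relative_quantity(ct, efficiency):
        """Relative starting amount implied by a CT value and primer efficiency.

        Assumed amplification model: amount doubles by a factor (1 + E)
        each cycle, so the initial quantity scales as (1 + E)**(-CT).
        """
        return (1.0 + efficiency) ** (-ct)

    # Under the uniform-efficiency assumption (E = 1 for both primers),
    # a 1-cycle CT difference implies exactly a 2-fold difference:
    fold = relative_quantity(24, 1.0) / relative_quantity(25, 1.0)
    print(fold)  # 2.0

    # If the two primers actually differ in efficiency, the naive
    # 2**(-dCT) estimate is badly biased (illustrative efficiencies):
    q_a = relative_quantity(24, 0.95)  # primer A, 95% efficient
    q_b = relative_quantity(25, 0.80)  # primer B, 80% efficient
    print(q_a / q_b)                   # far from the naive 2-fold estimate
    ```

    Small per-cycle efficiency differences compound over 20-30 cycles, which is why CT values from primers with unequal efficiencies cannot be compared directly in an array format.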

    Persistent Homology Analysis of Brain Artery Trees.

    New representations of tree-structured data objects, using ideas from topological data analysis, enable improved statistical analyses of a population of brain artery trees. A number of representations of each data tree arise from persistence diagrams that quantify branching and looping of vessels at multiple scales. Novel approaches to the statistical analysis, through various summaries of the persistence diagrams, lead to heightened correlations with covariates such as age and sex, relative to earlier analyses of this data set. The correlation with age continues to be significant even after controlling for correlations from earlier significant summaries.
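    The kind of scalar summary that can be correlated with covariates is easy to sketch. A persistence diagram is a set of (birth, death) pairs; the summaries below (total persistence, longest bar, sum of the k longest bars) are common choices for illustration and are not claimed to be the paper's specific summaries.

    ```python
    def persistence_summaries(diagram):
        """Scalar summaries of a persistence diagram given as (birth, death) pairs.

        Each bar's lifetime (death - birth) measures how long a topological
        feature (e.g. a branch or loop) persists across scales.
        """
        lifetimes = sorted((death - birth for birth, death in diagram), reverse=True)
        return {
            "total_persistence": sum(lifetimes),
            "max_lifetime": lifetimes[0],
            "top_k_sum": sum(lifetimes[:3]),  # sum of the 3 longest bars
        }

    # Hypothetical diagram for one artery tree:
    diagram = [(0.0, 5.0), (1.0, 2.5), (0.5, 0.9), (2.0, 6.0)]
    print(persistence_summaries(diagram))
    ```

    Each tree in the population is reduced to a few such numbers, which can then enter standard statistical analyses against covariates like age and sex.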

    A Fast and Compact Quantum Random Number Generator

    We present the realization of a physical quantum random number generator based on the process of splitting a beam of photons on a beam splitter, a quantum mechanical source of true randomness. By utilizing either a beam splitter or a polarizing beam splitter, single photon detectors and high speed electronics, the presented devices are capable of generating a binary random signal with an autocorrelation time of 11.8 ns and a continuous stream of random numbers at a rate of 1 Mbit/s. The randomness of the generated signals and numbers is shown by running a series of tests upon data samples. The devices described in this paper are built into compact housings and are simple to operate. Comment: 23 pages, 6 Figs. To appear in Rev. Sci. Inst
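    The beam-splitter principle and the kind of statistical check applied to the output can be sketched in software. This is a toy stand-in: an ideal 50/50 split with perfect detectors, using a pseudorandom source in place of real photon clicks, with the lag-1 autocorrelation of the bitstream as one simple randomness check.

    ```python
    import random

    random.seed(42)
    # Stand-in for detector clicks: each "photon" takes one of two paths.
    bits = [random.getrandbits(1) for _ in range(100_000)]

    # A fair split gives mean ≈ 0.5 ...
    mean = sum(bits) / len(bits)

    # ... and an ideal stream has near-zero lag-1 autocorrelation,
    # i.e. each bit carries no information about the next one.
    lag1 = sum((a - mean) * (b - mean) for a, b in zip(bits, bits[1:]))
    lag1 /= sum((b - mean) ** 2 for b in bits)

    print(round(mean, 3), round(lag1, 3))
    ```

    The physical device applies tests of this flavor (among others) to recorded samples; the essential difference is that its raw bits come from genuine quantum path choices rather than a deterministic pseudorandom generator.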