
    Detecting community structure in networks using edge prediction methods

    Community detection and edge prediction are both forms of link mining: they are concerned with discovering the relations between vertices in networks. Some of the vertex similarity measures used in edge prediction are closely related to the concept of community structure. We use this insight to propose a novel method for improving existing community detection algorithms by using a simple vertex similarity measure. We show that this new strategy can be more effective in detecting communities than the basic community detection algorithms.
    Comment: 5 pages, 2 figures
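
The abstract does not name the vertex similarity measure it uses, so as a hedged illustration the sketch below scores non-adjacent vertex pairs with Jaccard similarity over neighbor sets, a standard edge-prediction score; the toy graph (two triangles joined by one edge) is invented.

```python
# Hypothetical sketch: Jaccard similarity over neighbor sets stands in
# for the paper's unspecified vertex similarity measure. The toy graph
# is invented: two triangles joined by the single edge (2, 3).
from itertools import combinations

def jaccard_similarity(adj, u, v):
    """Jaccard similarity of the neighbor sets of vertices u and v."""
    union = adj[u] | adj[v]
    return len(adj[u] & adj[v]) / len(union) if union else 0.0

adj = {
    0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3},
    3: {2, 4, 5}, 4: {3, 5}, 5: {3, 4},
}

# Score each non-adjacent pair; in the paper's strategy, such scores
# would guide an existing community detection algorithm toward likely
# intra-community edges.
scores = {
    (u, v): jaccard_similarity(adj, u, v)
    for u, v in combinations(sorted(adj), 2) if v not in adj[u]
}
```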

    Effect of Statistical Fluctuation in Monte Carlo Based Photon Beam Dose Calculation on Gamma Index Evaluation

    The gamma-index test has been commonly adopted to quantify the degree of agreement between a reference dose distribution and an evaluation dose distribution. Monte Carlo (MC) simulation has been widely used for radiotherapy dose calculation for both clinical and research purposes. The goal of this work is to investigate both theoretically and experimentally the impact of the MC statistical fluctuation on the gamma-index test when the fluctuation exists in the reference, the evaluation, or both dose distributions. To the first order approximation, we theoretically demonstrated in a simplified model that the statistical fluctuation tends to overestimate gamma-index values when existing in the reference dose distribution and underestimate gamma-index values when existing in the evaluation dose distribution, given that the original gamma-index is relatively large compared to the statistical fluctuation. Our numerical experiments using clinical photon radiation therapy cases have shown that 1) when performing a gamma-index test between an MC reference dose and a non-MC evaluation dose, the average gamma-index is overestimated and the passing rate decreases with the increase of the noise level in the reference dose; 2) when performing a gamma-index test between a non-MC reference dose and an MC evaluation dose, the average gamma-index is underestimated when they are within the clinically relevant range and the passing rate increases with the increase of the noise level in the evaluation dose; 3) when performing a gamma-index test between an MC reference dose and an MC evaluation dose, the passing rate is overestimated due to the noise in the evaluation dose and underestimated due to the noise in the reference dose. We conclude that the gamma-index test should be used with caution when comparing dose distributions computed with Monte Carlo simulation.
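
A minimal 1D sketch of the gamma-index test described above, assuming the common global criteria of 3 mm distance-to-agreement and 3% of the maximum reference dose; the dose profiles below are synthetic, not the paper's data.

```python
import numpy as np

# Sketch of the gamma-index test on synthetic 1D profiles; the 3 mm /
# 3% criteria are a conventional assumption, not taken from the paper.
def gamma_index_1d(x, dose_ref, dose_eval, dta_mm=3.0, dd_frac=0.03):
    """For each reference point, the minimum combined distance/dose
    discrepancy over all evaluation points; gamma <= 1 passes."""
    d_max = dose_ref.max()
    gamma = np.empty_like(dose_ref)
    for i, (xi, di) in enumerate(zip(x, dose_ref)):
        dr = (x - xi) / dta_mm                     # distance term
        dd = (dose_eval - di) / (dd_frac * d_max)  # dose-difference term
        gamma[i] = np.sqrt(dr ** 2 + dd ** 2).min()
    return gamma

x = np.linspace(0.0, 50.0, 101)                # positions in mm
ref = np.exp(-(((x - 25.0) / 10.0) ** 2))      # toy reference profile
eva = np.exp(-(((x - 25.5) / 10.0) ** 2))      # evaluation, shifted 0.5 mm
gamma = gamma_index_1d(x, ref, eva)
passing_rate = float((gamma <= 1.0).mean())
```

A noise-free 0.5 mm shift passes everywhere; per the paper's findings, adding noise to `ref` or `eva` would bias `gamma` up or down respectively.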

    Prognostic and predictive value of circulating tumor cells and CXCR4 expression as biomarkers for a CXCR4 peptide antagonist in combination with carboplatin-etoposide in small cell lung cancer: exploratory analysis of a phase II study.

    Background Circulating tumor cells (CTCs) and chemokine (C-X-C motif) receptor 4 (CXCR4) expression in CTCs and tumor tissue were evaluated as prognostic or predictive markers of CXCR4 peptide antagonist LY2510924 plus carboplatin-etoposide (CE) versus CE in extensive-stage disease small cell lung cancer (ED-SCLC). Methods This exploratory analysis of a phase II study evaluated CXCR4 expression in baseline tumor tissue and peripheral blood CTCs and in post-treatment CTCs. Optimum cutoff values were determined for CTC counts and CXCR4 expression in tumors and CTCs as predictors of survival outcome. Kaplan-Meier estimates and hazard ratios were used to determine biomarker prognostic and predictive values. Results There was a weak positive correlation at baseline between CXCR4 expression in tumor tissue and CTCs. Optimum cutoff values were H-score ≥ 210 for CXCR4+ tumor, ≥7% CTCs with CXCR4 expression (CXCR4+ CTCs), and ≥6 CTCs/7.5 mL blood. Baseline H-score for CXCR4+ tumor was not prognostic of progression-free survival (PFS) or overall survival (OS). Baseline CXCR4+ CTCs ≥7% was prognostic of shorter PFS. CTCs ≥6 at baseline and at cycle 2, day 1 were prognostic of shorter PFS and OS. None of the biomarkers at their respective optimum cutoffs was predictive of treatment response of LY2510924 plus CE versus CE. Conclusions In patients with ED-SCLC, baseline CXCR4 expression in tumor tissue was not prognostic of survival or predictive of LY2510924 treatment response. Baseline CXCR4+ CTCs ≥7% was prognostic of shorter PFS. CTC counts ≥6 at baseline and after 1 cycle of treatment were prognostic of shorter PFS and OS.
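
Since the analysis above rests on Kaplan-Meier estimates, the following sketch shows that estimator in its simplest form; the follow-up times and censoring flags in the usage example are invented, not trial data.

```python
# Minimal Kaplan-Meier estimator sketch; the example data are invented.
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate after each distinct event time.
    times: follow-up times; events: 1 = event observed, 0 = censored."""
    s, curve = 1.0, []
    for t in sorted(set(times)):
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)  # events at t
        n = sum(1 for ti in times if ti >= t)                               # still at risk
        if d:
            s *= 1.0 - d / n
            curve.append((t, s))
    return curve

# Invented example: five subjects, the last one censored at t = 6.
curve = kaplan_meier([2, 4, 5, 6, 6], [1, 1, 1, 1, 0])
```

Dichotomizing patients at a cutoff (e.g. CTCs ≥6 versus <6) and comparing the two resulting curves is the usual way such a biomarker's prognostic value is assessed.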

    Evaluating the Suitability of Commercial Clouds for NASA's High Performance Computing Applications: A Trade Study

    NASA's High-End Computing Capability (HECC) Project is periodically asked if it could be more cost effective through the use of commercial cloud resources. To answer the question, HECC's Application Performance and Productivity (APP) team undertook a performance and cost evaluation comparing three domains: two commercial cloud providers, Amazon and Penguin, and HECC's in-house resources, the Pleiades and Electra systems. In the study, the APP team used a combination of the NAS Parallel Benchmarks (NPB) and six full applications from NASA's workload on Pleiades and Electra to compare the performance of nodes based on three different generations of Intel Xeon processors: Haswell, Broadwell, and Skylake. Because of export control limitations, the most heavily used applications on Pleiades and Electra could not be used in the cloud; therefore, only one of the applications, OpenFOAM, represents work from the Aeronautics Research Mission Directorate and the Human Exploration and Operations Mission Directorate. The other five applications are from the Science Mission Directorate.
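
The kind of cost/performance roll-up such a trade study produces can be sketched as below. Every number is invented for illustration; the study's actual measurements and prices are not reproduced here.

```python
# Hypothetical cost/performance comparison. All runtimes and rates are
# made up for illustration; they are not the trade study's results.
systems = {
    # name: (runtime relative to the Pleiades baseline, $ per node-hour)
    "Pleiades (Broadwell)": (1.00, 0.25),
    "Cloud A (Broadwell)":  (1.10, 0.90),
    "Cloud B (Skylake)":    (0.85, 1.20),
}

def cost_per_run(rel_runtime, rate_usd, baseline_hours=10.0):
    """Dollar cost of one benchmark run: scale a baseline wall-clock
    time by the system's relative runtime, then apply its hourly rate."""
    return rel_runtime * baseline_hours * rate_usd

costs = {name: cost_per_run(rt, rate) for name, (rt, rate) in systems.items()}
cheapest = min(costs, key=costs.get)
```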

    GPU-based Iterative Cone Beam CT Reconstruction Using Tight Frame Regularization

    X-ray imaging dose from serial cone-beam CT (CBCT) scans raises a clinical concern in most image guided radiation therapy procedures. It is the goal of this paper to develop a fast GPU-based algorithm to reconstruct high quality CBCT images from undersampled and noisy projection data so as to lower the imaging dose. For this purpose, we have developed an iterative tight frame (TF) based CBCT reconstruction algorithm. A condition that a real CBCT image has a sparse representation under a TF basis is imposed in the iteration process as regularization to the solution. To speed up the computation, a multi-grid method is employed. Our GPU implementation has achieved high computational efficiency, and a CBCT image of resolution 512×512×70 can be reconstructed in ~5 min. We have tested our algorithm on a digital NCAT phantom and a physical Catphan phantom. It is found that our TF-based algorithm is able to reconstruct CBCT images in the context of undersampling and low mAs levels. We have also quantitatively analyzed the reconstructed CBCT image quality in terms of modulation transfer function and contrast-to-noise ratio under various scanning conditions. The results confirm the high CBCT image quality obtained from our TF algorithm. Moreover, our algorithm has also been validated in a real clinical context using a head-and-neck patient case. Comparisons of the developed TF algorithm and the current state-of-the-art TV algorithm have also been made in the various cases studied, in terms of reconstructed image quality and computational efficiency.
    Comment: 24 pages, 8 figures, accepted by Phys. Med. Biol.

    Processing Images from the Zwicky Transient Facility

    The Zwicky Transient Facility is a new robotic-observing program, in which a newly engineered 600-MP digital camera with an exceptionally large field of view, 47 square degrees, will be installed into the 48-inch Samuel Oschin Telescope at the Palomar Observatory. The camera will generate ~1 petabyte of raw image data over three years of operations. In parallel related work, new hardware and software systems are being developed to process these data in real time and build a long-term archive for the processed products. The first public release of archived products is planned for early 2019, and will include processed images and astronomical-source catalogs of the northern sky in the g and r bands. Source catalogs based on two different methods will be generated for the archive: aperture photometry and point-spread-function fitting.
    Comment: 6 pages, 4 figures, submitted to RTSRE Proceedings (www.rtsre.org)
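
The two cataloging methods named above can be sketched on a synthetic star: aperture photometry sums background-subtracted pixels inside a radius, while PSF photometry solves a least-squares fit of a known, unit-normalized PSF model. The Gaussian PSF, flux, and sky level here are invented for illustration, not ZTF pipeline details.

```python
import numpy as np

# Toy comparison of aperture vs PSF photometry on a synthetic star;
# the PSF model, flux, and sky level are invented.
def aperture_photometry(img, x0, y0, r_ap=4.0, r_bg=(6.0, 9.0)):
    yy, xx = np.indices(img.shape)
    rr = np.hypot(xx - x0, yy - y0)
    sky = np.median(img[(rr >= r_bg[0]) & (rr < r_bg[1])])  # local background
    return float((img[rr <= r_ap] - sky).sum())

def psf_photometry(img_nosky, psf):
    """Least-squares flux for a unit-normalized PSF at a known position."""
    return float((img_nosky * psf).sum() / (psf * psf).sum())

size, x0, y0, sigma, flux, sky = 21, 10.0, 10.0, 1.5, 1000.0, 10.0
yy, xx = np.indices((size, size))
psf = np.exp(-((xx - x0) ** 2 + (yy - y0) ** 2) / (2.0 * sigma ** 2))
psf /= psf.sum()
img = flux * psf + sky                # star plus flat sky background

f_ap = aperture_photometry(img, x0, y0)
f_psf = psf_photometry(img - sky, psf)
```

The aperture estimate misses the flux outside the aperture radius, while the PSF fit recovers the full flux when the model is exact, which is one reason archives publish both catalogs.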

    Development of a GPU-based Monte Carlo dose calculation code for coupled electron-photon transport

    Monte Carlo simulation is the most accurate method for absorbed dose calculations in radiotherapy. Its efficiency still requires improvement for routine clinical applications, especially for online adaptive radiotherapy. In this paper, we report our recent development of a GPU-based Monte Carlo dose calculation code for coupled electron-photon transport. We have implemented the Dose Planning Method (DPM) Monte Carlo dose calculation package (Sempau et al., Phys. Med. Biol., 45 (2000) 2263-2291) on GPU architecture under the CUDA platform. The implementation has been tested against the original sequential DPM code on CPU in phantoms with water-lung-water or water-bone-water slab geometry. A 20 MeV mono-energetic electron point source or a 6 MV photon point source is used in our validation. The results demonstrate adequate accuracy of our GPU implementation for both electron and photon beams in the radiotherapy energy range. Speedup factors of about 5.0 to 6.6 have been observed, using an NVIDIA Tesla C1060 GPU card against a 2.27 GHz Intel Xeon CPU.
    Comment: 13 pages, 3 figures, and 1 table. Paper revised. Figures updated.
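
A heavily simplified, CPU-only sketch of the Monte Carlo idea the paper implements on a GPU: sample photon free paths from exponential attenuation (here via Woodcock tracking) and score energy locally at the first real interaction site. A production coupled electron-photon code such as DPM also transports secondary electrons and samples full interaction physics; the slab layout and attenuation coefficients below are illustrative only.

```python
import math
import random

# Toy photon-only Monte Carlo in a water-bone-water slab phantom.
# Absorbing photons at the first interaction and scoring locally is a
# deliberate oversimplification of real coupled electron-photon transport.
def simulate(n_photons, depth_cm=30.0, n_bins=60, seed=1):
    random.seed(seed)
    dz = depth_cm / n_bins

    def mu(z):  # attenuation coefficient, cm^-1 (made-up values)
        return 0.12 if 10.0 <= z < 15.0 else 0.07

    mu_max = 0.12                      # majorant for Woodcock tracking
    dose = [0.0] * n_bins
    for _ in range(n_photons):
        z = 0.0
        while True:
            z += -math.log(1.0 - random.random()) / mu_max  # sample free path
            if z >= depth_cm:
                break                  # photon escapes the phantom
            if random.random() < mu(z) / mu_max:
                dose[int(z / dz)] += 1.0   # real interaction: absorb and score
                break                  # otherwise it was a virtual interaction
    return dose

dose = simulate(100_000)
```

On a GPU, each photon history is independent, which is exactly what makes this loop parallelize well, the property the paper's CUDA implementation exploits.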