
    Learning from the Success of MPI

    The Message Passing Interface (MPI) has been extremely successful as a portable way to program high-performance parallel computers. This success has occurred in spite of the view of many that message passing is difficult and that other approaches, including automatic parallelization and directive-based parallelism, are easier to use. This paper argues that MPI has succeeded because it addresses all of the important issues in providing a parallel programming model. Comment: 12 pages, 1 figure

    Non-equilibrium Fluctuation Relations in a Quantum Coherent Conductor

    We experimentally demonstrate the validity of non-equilibrium fluctuation relations using a quantum coherent conductor. In equilibrium, the fluctuation-dissipation relation leads to a correlation between the current and the current noise at the conductor, namely the Johnson-Nyquist relation. When the conductor is voltage-biased into the non-linear regime, the fluctuation theorem predicts similar non-equilibrium fluctuation relations, which hold true even when the Onsager-Casimir relations are broken in magnetic fields. Our experiments qualitatively validate these predictions as the first evidence of this theorem in the non-equilibrium quantum regime. In the appendix, we give a simple derivation of the higher-order correlations between the current and the current noise based on the fluctuation theorem. Comment: 4 pages, 4 figures with 1-page appendix
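    For reference, the equilibrium relation the abstract invokes takes the standard Johnson-Nyquist form; this is textbook background, not a formula quoted from the paper:

    ```latex
    S_I = 4 k_B T G
    ```

    where S_I is the zero-frequency spectral density of the current noise, T the temperature, and G the linear conductance. The fluctuation theorem generalizes this link between response and noise to the voltage-biased, non-linear regime the experiment probes.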

    Improving the scalability of parallel N-body applications with an event driven constraint based execution model

    The scalability and efficiency of graph applications are significantly constrained by conventional systems and their supporting programming models. Technology trends such as multicore, manycore, and heterogeneous system architectures introduce further challenges, and opportunities, for emerging application domains such as graph applications. This paper explores the space of effective parallel execution of ephemeral graphs that are dynamically generated, using the Barnes-Hut algorithm to exemplify dynamic workloads. The workloads are expressed using the semantics of ParalleX, an execution model for Exascale computing. For comparison, results using conventional execution-model semantics are also presented. With the advanced semantics we find improved load balancing at runtime and automatic parallelism discovery, which together improve efficiency. Comment: 11 figures
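    For readers unfamiliar with the workload, the Barnes-Hut step can be sketched as follows. This is a generic, minimal 2D version (quadtree plus opening-angle criterion), not the ParalleX implementation from the paper; the THETA value and the softening length are illustrative choices.

    ```python
    import math

    THETA = 0.5  # opening-angle criterion; illustrative value, not from the paper

    class Node:
        """Quadtree node covering the square [x, x+size) x [y, y+size)."""
        def __init__(self, x, y, size):
            self.x, self.y, self.size = x, y, size
            self.mass = 0.0
            self.cx = self.cy = 0.0      # centre of mass of the subtree
            self.body = None             # single (x, y, m) tuple if leaf
            self.children = None         # four sub-quadrants if internal

        def insert(self, bx, by, bm):
            if self.mass == 0.0:         # empty leaf: store the body here
                self.body = (bx, by, bm)
            else:
                if self.children is None:  # occupied leaf: split, push body down
                    self._split()
                    ox, oy, om = self.body
                    self.body = None
                    self._child_for(ox, oy).insert(ox, oy, om)
                self._child_for(bx, by).insert(bx, by, bm)
            total = self.mass + bm       # update aggregate mass / centre of mass
            self.cx = (self.cx * self.mass + bx * bm) / total
            self.cy = (self.cy * self.mass + by * bm) / total
            self.mass = total

        def _split(self):
            h = self.size / 2
            self.children = [Node(self.x + dx * h, self.y + dy * h, h)
                             for dy in (0, 1) for dx in (0, 1)]

        def _child_for(self, bx, by):
            h = self.size / 2
            i = (1 if bx >= self.x + h else 0) + 2 * (1 if by >= self.y + h else 0)
            return self.children[i]

    def accel(node, px, py, eps=1e-3):
        """Approximate gravitational acceleration at (px, py), with G = 1.

        Distant subtrees (size / distance < THETA) are treated as point masses.
        """
        if node.mass == 0.0:
            return 0.0, 0.0
        dx, dy = node.cx - px, node.cy - py
        r = math.hypot(dx, dy) + eps     # softened to avoid self-interaction blowup
        if node.children is None or node.size / r < THETA:
            a = node.mass / r**3         # 1/r^2 magnitude times unit direction
            return a * dx, a * dy
        ax = ay = 0.0
        for child in node.children:
            cax, cay = accel(child, px, py, eps)
            ax += cax
            ay += cay
        return ax, ay
    ```

    The opening-angle test is what makes the tree walk data-dependent and the parallelism irregular, which is why this workload stresses conventional execution models.
    
    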

    A Microchip CD4 Counting Method for HIV Monitoring in Resource-Poor Settings

    BACKGROUND: More than 35 million people in developing countries are living with HIV infection. An enormous global effort is now underway to bring antiretroviral treatment to at least 3 million of those infected. While drug prices have dropped considerably, the cost and technical complexity of laboratory tests essential for the management of HIV disease, such as CD4 cell counts, remain prohibitive. New, simple, and affordable methods for measuring CD4 cells that can be implemented in resource-scarce settings are urgently needed. METHODS AND FINDINGS: Here we describe the development of a prototype for a simple, rapid, and affordable method for counting CD4 lymphocytes. Microliter volumes of blood, without further sample preparation, are stained with fluorescent antibodies, captured on a membrane within a miniaturized flow cell, and imaged through microscope optics with the type of charge-coupled device developed for digital camera technology. An associated computer algorithm converts the raw digital image into absolute CD4 counts and CD4 percentages in real time. The accuracy of this prototype system was validated through testing in the United States and Botswana; it showed close agreement with standard flow cytometry (r = 0.95) over a range of absolute CD4 counts, and the ability to discriminate clinically relevant CD4 count thresholds with high sensitivity and specificity. CONCLUSION: Advances in the adaptation of new technologies to biomedical detection systems, such as the one described here, promise to make complex diagnostics for HIV and other infectious diseases a practical global reality.
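    The image-to-count step can be illustrated with a generic blob-counting sketch: threshold a grayscale frame, then count connected bright regions as cells. The paper's actual algorithm, and its calibration to absolute CD4 counts and percentages, is not specified here; the threshold and connectivity choice are assumptions.

    ```python
    def count_cells(image, threshold):
        """Count 4-connected components of pixels brighter than `threshold`.

        `image` is a list of rows of grayscale intensity values; each connected
        bright blob is counted as one stained cell.
        """
        rows, cols = len(image), len(image[0])
        seen = [[False] * cols for _ in range(rows)]
        count = 0
        for r in range(rows):
            for c in range(cols):
                if image[r][c] > threshold and not seen[r][c]:
                    count += 1                 # new blob found
                    stack = [(r, c)]           # flood-fill to mark all its pixels
                    while stack:
                        y, x = stack.pop()
                        if (0 <= y < rows and 0 <= x < cols
                                and image[y][x] > threshold and not seen[y][x]):
                            seen[y][x] = True
                            stack.extend([(y + 1, x), (y - 1, x),
                                          (y, x + 1), (y, x - 1)])
        return count
    ```

    A real system would also need size filtering and intensity calibration to separate touching cells and debris, which is where most of the engineering effort lies.
    
    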

    Impact of the Introduction of Calcimimetics on Timing of Parathyroidectomy in Secondary and Tertiary Hyperparathyroidism

    Hyperparathyroidism (HPT), both secondary and tertiary, is common in patients with end-stage renal disease, and is associated with severe bone disorders, cardiovascular complications, and increased mortality. Since the introduction of calcimimetics in 2004, treatment of HPT has shifted from surgery to predominantly medical therapy. The aim of this study was to evaluate the impact of this change of management on the HPT patient population before undergoing (sub-)total parathyroidectomy (PTx). Overall, 119 patients with secondary or tertiary HPT undergoing PTx were included in a retrospective, single-center cohort. Group A, who underwent PTx before January 2005, was compared with group B, who underwent PTx after January 2005. Patient characteristics, time interval between HPT diagnosis and PTx, and postoperative complications were compared. Group A comprised 70 (58.8 %) patients and group B comprised 49 (41.2 %) patients. The median interval between HPT diagnosis and PTx was 27 (interquartile range [IQR] 12.5-48.0) and 49 (IQR 21.0-75.0) months for groups A and B, respectively (p = 0.007). Baseline characteristics were similar between the two groups. The median preoperative serum parathyroid hormone (PTH) level was 936 pg/mL (IQR 600-1273) for group A versus 1091 pg/mL (IQR 482-1373) for group B (p = 0.38). PTx resulted in a marked PTH reduction (to less than twofold the upper limit: A, 80.0 %; B, 85.4 %), and postoperative complication rates were low in both groups (A: 7.8 %; B: 10.2 %) (p = 0.66). The introduction of calcimimetics in 2004 is associated with a significant 2-year delay of surgery with continuously elevated preoperative PTH levels, while parathyroid surgery, even in a fragile population, is considered a safe and effective procedure.
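    The summary statistics reported above (median with interquartile range) can be sketched with the standard library; the waiting-time values below are invented for illustration and are not the study's data.

    ```python
    from statistics import quantiles

    def median_iqr(values):
        """Return (median, (Q1, Q3)) using Python's default 'exclusive' quartiles."""
        q1, q2, q3 = quantiles(values, n=4)
        return q2, (q1, q3)

    # e.g. months from HPT diagnosis to parathyroidectomy for five patients
    med, (q1, q3) = median_iqr([12, 20, 27, 40, 48])
    ```

    The Mann-Whitney-style comparison behind the reported p = 0.007 is not shown here; medians with IQRs are preferred over means because waiting-time distributions are typically right-skewed.
    
    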

    Immune infiltration in invasive lobular breast cancer

    Background: Invasive lobular breast cancer (ILC) is the second most common histological subtype of breast cancer after invasive ductal cancer (IDC). Here, we aimed to evaluate the prevalence, levels, and composition of tumor-infiltrating lymphocytes (TIL) and their association with clinico-pathological and outcome variables in ILC, and to compare these with IDC. Methods: We considered two patient series with TIL data: a multi-centric retrospective series (n=614) and the BIG 02-98 study (n=149 ILC and 807 IDC). We compared immune subsets identified by immunohistochemistry in the ILC (n=159) and IDC (n=468) patients from the Nottingham series, as well as the CIBERSORT immune profiling of the ILC (n=98) and IDC (n=388) METABRIC and TCGA patients. All ILC/IDC comparisons were done in ER-positive/HER2-negative tumors. All statistical tests were two-sided. Results: TIL levels were statistically significantly lower in ILC compared to IDC (fold change = 0.79; 95% CI: 0.70-0.88, P < .001). In ILC, high TIL levels were associated with young age, lymph node involvement, and highly proliferative tumors. In the univariable analysis, high TIL levels were associated with worse prognosis in the retrospective and BIG 02-98 lobular series, although this did not reach statistical significance in the latter. The Nottingham series revealed that the levels of intra-tumoral, but not total, CD8+ cells were statistically significantly lower in ILC compared to IDC. Comparison of the CIBERSORT profiles highlighted statistically significant differences in immune composition. Conclusion: This study shows differences between the immune infiltrates of ER-positive/HER2-negative ILC and IDC in terms of prevalence, levels, localization, composition, and clinical associations.
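    As a quick reading of the reported effect size: a fold change of 0.79 (ILC vs. IDC) corresponds to TIL levels roughly 21% lower in ILC, with the 95% CI translating to about 12-30% lower. This is arithmetic on the numbers quoted above, not a reanalysis.

    ```python
    # Convert the reported fold change and its CI into percent reductions.
    fold_change = 0.79
    ci_low, ci_high = 0.70, 0.88          # reported 95% CI for the fold change

    percent_lower = (1.0 - fold_change) * 100.0             # ~21% lower in ILC
    ci_percent = ((1.0 - ci_high) * 100.0,                  # ~12% lower (best case)
                  (1.0 - ci_low) * 100.0)                   # ~30% lower (worst case)
    ```
    
    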