15 research outputs found

    Learning from the Success of MPI

    Full text link
    The Message Passing Interface (MPI) has been extremely successful as a portable way to program high-performance parallel computers. This success has occurred in spite of the view of many that message passing is difficult and that other approaches, including automatic parallelization and directive-based parallelism, are easier to use. This paper argues that MPI has succeeded because it addresses all of the important issues in providing a parallel programming model. Comment: 12 pages, 1 figure
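    The message-passing model the abstract refers to can be sketched in a few lines. This is not MPI itself but a stand-in using Python's `multiprocessing.Pipe`, showing the core idea MPI builds on: processes share nothing and communicate only by explicit sends and receives.

    ```python
    # Minimal sketch of the message-passing style (a stand-in, not real MPI):
    # two processes with no shared memory exchange data via explicit
    # send/recv calls, analogous in spirit to MPI_Send / MPI_Recv.
    from multiprocessing import Process, Pipe

    def worker(conn):
        # Explicitly receive a message, compute, and send a reply back.
        data = conn.recv()
        conn.send([x * 2 for x in data])
        conn.close()

    if __name__ == "__main__":
        parent_conn, child_conn = Pipe()
        p = Process(target=worker, args=(child_conn,))
        p.start()
        parent_conn.send([1, 2, 3])   # explicit send
        result = parent_conn.recv()   # explicit, blocking receive
        p.join()
        print(result)                 # -> [2, 4, 6]
    ```

    In real MPI the same pattern is written with `MPI_Send` and `MPI_Recv` against ranks in a communicator; the explicitness of every data transfer is what the abstract's critics call difficult and what the paper argues made the model portable and predictable.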

    Co-array Python: A Parallel Extension to the Python Language

    No full text

    A Parallel Numerical Library for UPC

    No full text

    Multidimensional Blocking in UPC

    No full text

    Improving the Performance of X10 Programs by Clock Removal

    No full text

    Remote Store Programming

    No full text

    A Characterization of Shared Data Access Patterns in UPC Programs

    No full text