Learning from the Success of MPI
The Message Passing Interface (MPI) has been extremely successful as a
portable way to program high-performance parallel computers. This success has
occurred in spite of the view of many that message passing is difficult and
that other approaches, including automatic parallelization and directive-based
parallelism, are easier to use. This paper argues that MPI has succeeded
because it addresses all of the important issues in providing a parallel
programming model.
Comment: 12 pages, 1 figure
Complete image partitioning on spiral architecture
Uniform image partitioning has been achieved on Spiral Architecture, which plays an important role in parallel image processing in aspects such as uniform data partitioning, load balancing, and zero data exchange between processing nodes. However, when the number of partitions is not a power of seven (such as 49), every sub-image except one is split into fragments that are mixed together, so it cannot be determined which fragments belong to which sub-image. This is an unacceptable flaw for parallel image processing. This paper proposes a method to resolve the problem. The experimental results show that the proposed method correctly identifies the fragments belonging to the same sub-image and collects them into a complete sub-image. These sub-images can then be distributed to different processing nodes for further processing. © Springer-Verlag Berlin Heidelberg 2003
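The uniform case the abstract refers to (partition counts that are powers of seven) can be illustrated with a deliberately simplified sketch. Here spiral addresses are treated as plain integers and a pixel's partition is taken from its address modulo 7^n; the real Spiral Architecture uses base-7 spiral addressing and hexagonal geometry, so this only mirrors the counting argument, not the paper's actual algorithm. The function name and the modulo rule are illustrative assumptions.

```python
def partition_by_spiral_address(addresses, n):
    """Group pixel addresses into 7**n partitions.

    Simplified stand-in for Spiral Architecture partitioning:
    each address is assigned to the partition given by its value
    modulo 7**n, which yields equally sized partitions whenever
    the address space is a multiple of 7**n.
    """
    k = 7 ** n
    groups = {i: [] for i in range(k)}
    for a in addresses:
        groups[a % k].append(a)
    return groups

# 343 pixels (addresses 0..342) split into 7 partitions of 49 each.
parts = partition_by_spiral_address(range(343), 1)
sizes = [len(v) for v in parts.values()]
```

When the partition count is a power of seven, every partition comes out the same size; the paper addresses the harder case where the count is not a power of seven and fragments must be regrouped.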