Learning from the Success of MPI
Scientific paper
2001-09-13
Computer Science
Distributed, Parallel, and Cluster Computing
12 pages, 1 figure
The Message Passing Interface (MPI) has been extremely successful as a portable way to program high-performance parallel computers. This success has occurred in spite of the view of many that message passing is difficult and that other approaches, including automatic parallelization and directive-based parallelism, are easier to use. This paper argues that MPI has succeeded because it addresses all of the important issues in providing a parallel programming model.
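As a concrete illustration of the explicit message-passing style the abstract contrasts with automatic parallelization and directive-based approaches, a minimal MPI program (a sketch for illustration only, not drawn from the paper) might look like this:

    /* Minimal message-passing example: rank 0 sends one integer to rank 1.
       Illustrative only; not code from the paper. */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, size, value = 42;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        if (size >= 2) {
            if (rank == 0) {
                /* The programmer states explicitly what data moves and to whom. */
                MPI_Send(&value, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
            } else if (rank == 1) {
                MPI_Recv(&value, 1, MPI_INT, 0, 0, MPI_COMM_WORLD,
                         MPI_STATUS_IGNORE);
                printf("rank 1 received %d\n", value);
            }
        }

        MPI_Finalize();
        return 0;
    }

This explicitness is the point at issue: the programmer, not a compiler or runtime, decides where data lives and when it is communicated.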