The Message Passing Interface, or MPI, is a standard library of subroutines (Fortran) or function calls (C) that can be used to implement a message-passing program. MPI coordinates a program running as multiple processes in a distributed-memory environment, yet is flexible enough to be used in a shared-memory environment as well. MPI is the de facto standard for message passing, so MPI programs should compile and run on any platform that supports it, giving both ease of use and source-code portability. The standard also permits efficient implementations across a wide range of architectures and offers a great deal of functionality: multiple communication modes, special routines for common collective operations, support for user-defined data types and process topologies, and support for heterogeneous parallel architectures.
This tutorial provides an introduction to MPI so you can begin using it to develop message-passing programs in Fortran or C.
Target Audience: Programmers and researchers interested in using or writing parallel programs to solve complex problems.
Prerequisites: No prior experience with MPI or parallel programming is required to take this course. However, an understanding of computer programming is necessary.
Note: This course was previously offered on CI-Tutor.