Supercomputer Applications
MPI Labs, Fall 2003

  • MPI (Message Passing Interface) Programs, Parallel Programming
    1. MPI Program Lab00, "HelloCluster.c"
      Uses MPI_Init(), MPI_Comm_size(), MPI_Comm_rank(), MPI_Finalize(), and MPI_COMM_WORLD (scroll down to "Communicators")

    2. MPI Program Lab01, share.c
      --Input a number with scanf, then broadcast this value to the other nodes listed in the "lamhosts" file
      Uses MPI_Bcast(); also see MPI Data types
      NOTE: Place the call to MPI_Bcast outside of the "if (rank == 0)" block:
               if (rank == 0) {
                   scanf( ... );
               }
               MPI_Bcast( ... );  //MPI_Bcast is like a combination send/receive,
                                  //so every process must call it

    3. MPI Program Lab02, DinnerParty.c
      Use MPI_Send() and MPI_Recv() INSTEAD OF MPI_Bcast()
      Sample Output:
      mpirun N lab02
      Process 0, Sending out random int 406999 to process 1
      Process 2 received: 1252750, mysalary=380245, sending out 1632995 to process 3
      Process 3 received: 1632995, mysalary=600007, sending out 2233002 to process 4
      Process 1 received: 406999, mysalary=845751, sending out 1252750 to process 2
      Process 0 received 3014004, mySalary=157838, average salary=$460807.17
      Process 4 received: 2233002, mysalary=618007, sending out 2851009 to process 5
      Process 5 received: 2851009, mysalary=162995, sending out 3014004 to process 0

    4. MPI Program Lab03, IntervalSlicing.c - Calculate Pi,
      --parallelize the program "gregory.c"
      Use MPI_Reduce() to sum data that is local to each process

    5. MPI Program Lab04, rms.c - Calculate the "root mean square" of an array,
      Use MPI_Reduce() to sum data that is local to each process

    6. MPI Resources and Sample Programs

  • SAMPLE WEB PAGE with project listing; each project is a link to another page (or pages)