In the examples below, xyz is your group name and abc is your userid.
Compiling an MPI program:
See also Hyak Intel MPI
Load one of the modules below.
Intel:
Intel Mox:
module load icc_18-impi_2018
Intel Ikt:
module load icc_18-impi_2018
gcc:
gcc Mox:
module load gcc_4.8.5-impi_2017
gcc Ikt:
module load gcc_4.4.7-impi_5.1.2
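To confirm which modules are currently loaded, the standard module commands can be used, for example:
module list
(module avail shows the modules available on the node.)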
Compile your program using one of the compilers below:
intel C compiler is mpiicc
intel C++ compiler is mpiicpc
intel Fortran compiler is mpiifort
gnu C compiler is mpigcc
gnu C++ compiler is mpigxx
gnu Fortran compiler is mpifc
generic C compiler is mpicc
generic C++ compiler is mpiCC
generic Fortran compiler is mpif90
Example:
mpicc my_mpi_program.c -o my_mpi_program
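For reference, a minimal my_mpi_program.c for this example could look like the sketch below (an illustrative test program, not part of the cluster documentation):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    int rank, size;
    /* Initialize the MPI environment */
    MPI_Init(&argc, &argv);
    /* Get this process's rank and the total number of processes */
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    printf("Hello from rank %d of %d\n", rank, size);
    /* Shut down MPI cleanly */
    MPI_Finalize();
    return 0;
}

Each MPI rank prints its rank number, which is a quick way to verify that the compiler wrapper and mpirun are working.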
In the compiler list above, generic means the wrapper could invoke either the Intel or GNU compiler, depending on which module was loaded. The command below shows how mpicc is configured and which underlying compiler it uses:
mpicc -show
Use the command below to find the location of the MPI installation:
which mpirun
Running an MPI program:
(More details are at https://slurm.schedmd.com/mpi_guide.html)
Interactive MPI:
The module loaded below must be the same module that was used to compile the program.
module load icc_18-impi_2018
For ikt, use the line below. Do not use it on mox.
export MX_RCACHE=0
When running on an interactive node you must give the -np option. The -np option is not required for batch jobs.
Use the full path, or cd to the appropriate directory and run "mpirun -np 28 ./my_mpi_program".
mpirun -np 28 /gscratch/xyz/abc/my_mpi_dir/my_mpi_program
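If you do not already have an interactive node, one can typically be requested with srun; the example below is a generic Slurm invocation, so adjust the account, partition, and resources to match your allocation:
srun -p xyz -A xyz --nodes=1 --ntasks-per-node=28 --time=2:00:00 --mem=100G --pty /bin/bash
Once the interactive shell starts on the compute node, load the module and run mpirun as shown above.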
Batch MPI:
Below is a sample MPI job script, myscript.slurm. Change the options appropriately and submit it from the login node using "sbatch myscript.slurm".
#!/bin/bash
## Job Name
#SBATCH --job-name=my_mpi_job
## Allocation Definition
#SBATCH --account=xyz
#SBATCH --partition=xyz
## Resources
## Total number of nodes
#SBATCH --nodes=2
## Number of cores per node
#SBATCH --ntasks-per-node=28
## Walltime (2 hours)
#SBATCH --time=2:00:00
## Memory per node
#SBATCH --mem=100G
## Specify the working directory for this job
## Make this directory before submitting this job.
#SBATCH --chdir=/gscratch/xyz/abc/my_mpi_dir
## Turn on e-mail notification
#SBATCH --mail-type=ALL
#SBATCH --mail-user=your_email_address
# For ikt, keep the line below. Delete it for mox.
export MX_RCACHE=0
# The module below must be the same module that was used to compile the program.
module load icc_18-impi_2018
# Use the full path, or cd to the appropriate directory and use "mpirun ./my_mpi_program".
# In batch mode the -np option is not needed with mpirun.
mpirun /gscratch/xyz/abc/my_mpi_dir/my_mpi_program
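After saving the script, a typical submit-and-monitor sequence from the login node would be (abc is your userid, as above):
sbatch myscript.slurm
squeue -u abc
squeue lists your pending and running jobs; by default the job's standard output is written to a slurm-<jobid>.out file in the job's working directory.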