---
title: Intel MPI Support
last_updated: 13 March 2020
keywords: software, impi, slurm
summary: "This document describes how to use Intel MPI in the Merlin6 cluster"
sidebar: merlin6_sidebar
permalink: /merlin6/impi.html
---
## Introduction
This document describes which Intel MPI versions available in PModules are supported in the Merlin6 cluster.
## srun
We strongly recommend using 'srun' rather than 'mpirun' or 'mpiexec'. With 'srun', tasks are properly bound to cores and little extra configuration is needed, while 'mpirun' and 'mpiexec' usually require more advanced configuration and should only be used by advanced users. Please always adapt your scripts to use 'srun' before opening a support ticket (a minimal example script is shown below), and contact us if you run into problems when using a module.
{{site.data.alerts.tip}} Always run Intel MPI with the srun command. Exceptions should be limited to advanced users, and even then srun is still recommended. {{site.data.alerts.end}}
When running with srun, one should tell Intel MPI to use the PMI libraries provided by Slurm. For PMI-1:
```bash
export I_MPI_PMI_LIBRARY=/usr/lib64/libpmi.so
```
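As a reference, below is a minimal sketch of a Slurm batch script combining these settings. The module name, task count, and binary name are placeholders; adjust them to the Intel MPI PModule and application you actually use.

```bash
#!/bin/bash
#SBATCH --job-name=impi_test    # example job name
#SBATCH --ntasks=8              # number of MPI tasks
#SBATCH --time=00:10:00         # walltime limit

# Load an Intel MPI PModule (placeholder: use the module name/version available on Merlin6)
module load impi

# Point Intel MPI to Slurm's PMI-1 library
export I_MPI_PMI_LIBRARY=/usr/lib64/libpmi.so

# Launch the application with srun; ./my_mpi_app is a placeholder
srun ./my_mpi_app
```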
Alternatively, one can use PMI-2, but then one needs to specify it as follows:
```bash
export I_MPI_PMI_LIBRARY=/usr/lib64/libpmi2.so
export I_MPI_PMI2=yes
```
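In the example script above, the PMI-2 variant would simply replace the single PMI-1 export:

```bash
# PMI-2 variant of the environment setup (replaces the PMI-1 export)
export I_MPI_PMI_LIBRARY=/usr/lib64/libpmi2.so
export I_MPI_PMI2=yes

srun ./my_mpi_app    # placeholder binary, as in the sketch above
```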
For more information, please read the [Slurm MPI Guide](https://slurm.schedmd.com/mpi_guide.html).
Note: PMI-2 might not work properly with some Intel MPI versions. If you hit problems, either fall back to PMI-1 or contact the Merlin administrators.
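If it is unclear which PMI library Intel MPI is picking up, increasing Intel MPI's debug output can help. I_MPI_DEBUG is a standard Intel MPI variable, although the exact output differs between versions:

```bash
# Print additional Intel MPI startup information (process pinning, library details)
export I_MPI_DEBUG=4

srun ./my_mpi_app    # placeholder binary
```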