# Intel MPI Support
This document describes which Intel MPI versions available through PModules are supported on the Merlin6 cluster.
## Usage
### srun
We strongly recommend using **`srun`** instead of **`mpirun`** or **`mpiexec`**. With **`srun`**, tasks are
properly bound to cores and little customization is needed, while **`mpirun`** and **`mpiexec`** may require
more advanced configuration and should only be used by advanced users. Please ***always*** adapt your scripts
to use **`srun`** before opening a support ticket. Also, please contact us if you encounter any problem when
using a module.
!!! tip
    Always run Intel MPI with the **srun** command. The only exception is
    for advanced users, and even then **srun** is still recommended.
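To adapt an existing **`mpirun`**-based script, it is usually enough to replace the launcher line and let Slurm provide the task layout. The following is a minimal sketch; the resource values and application name are placeholders, and the PMI setup described below may also be required:

```bash
#!/bin/bash
#SBATCH --ntasks=8        # placeholder: total number of MPI tasks
#SBATCH --time=01:00:00   # placeholder: walltime limit

# Before: mpirun -np 8 ./app
# After: srun inherits the task count and binding from Slurm
srun ./app
```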
When running with **srun**, one should tell Intel MPI to use the PMI libraries provided by Slurm. For PMI-1:
```bash
# Tell Intel MPI to use Slurm's PMI-1 library
export I_MPI_PMI_LIBRARY=/usr/lib64/libpmi.so

srun ./app
```
Alternatively, one can use PMI-2, which must be enabled explicitly:
```bash
# Tell Intel MPI to use Slurm's PMI-2 library and enable PMI-2 support
export I_MPI_PMI_LIBRARY=/usr/lib64/libpmi2.so
export I_MPI_PMI2=yes

srun ./app
```
For more information, please read the [Slurm Intel MPI Guide](https://slurm.schedmd.com/mpi_guide.html#intel_mpi).
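If you are unsure which PMI plugin types your Slurm installation supports, you can list them with `srun` (the output depends on the Slurm version and configuration):

```bash
srun --mpi=list
```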
!!! note
    PMI-2 might not work properly with some Intel MPI versions. If so,
    you can either fall back to PMI-1 or contact the Merlin
    administrators.
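Putting it all together, a complete batch script could look like the following sketch. The module name and resource values are placeholders; check the PModules catalogue on Merlin6 for the Intel MPI versions actually provided.

```bash
#!/bin/bash
#SBATCH --job-name=impi-job   # placeholder job name
#SBATCH --ntasks=8            # placeholder: total number of MPI tasks
#SBATCH --time=00:30:00       # placeholder: walltime limit

# Load an Intel MPI PModule (hypothetical version string)
module load intel-mpi/2021.6.0

# Use Slurm's PMI-1 library, as described above
export I_MPI_PMI_LIBRARY=/usr/lib64/libpmi.so

srun ./app
```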