first stab at mkdocs migration

refactor CSCS and Meg content

add merlin6 quick start

update merlin6 nomachine docs

give the userdoc its own color scheme

we use the Materials default one

refactored slurm general docs merlin6

add merlin6 JB docs

add software support m6 docs

add all files to nav

vibed changes #1

add missing pages

further vibing #2

vibe #3

further fixes
2025-11-26 17:28:07 +01:00
parent 149de6fb18
commit bde174b726
313 changed files with 2608 additions and 11593 deletions

@@ -0,0 +1,40 @@
# Intel MPI Support

This document describes which Intel MPI versions from PModules are supported on the Merlin6 cluster and how to use them.

## Usage
### srun
We strongly recommend the use of **`srun`** over **`mpirun`** or **`mpiexec`**. With **`srun`**, tasks are
properly bound to cores and little extra configuration is needed, while **`mpirun`** and **`mpiexec`** may require
more advanced configuration and should only be used by advanced users. Please ***always*** adapt your scripts to
use **`srun`** before opening a support ticket, and contact us if you encounter any problem when using a module.
!!! tip
    Always run Intel MPI with the **srun** command. The only exception is
    for advanced users, and even then **srun** is still recommended.
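As an illustration, an `mpirun`-style launch and its `srun` equivalent might look like the sketch below (`./app` is a placeholder for your MPI binary; the PMI setup described next is still required for Intel MPI):

```bash
# Advanced-user path: mpirun derives its own process placement
mpirun -np 4 ./app

# Recommended path: srun inherits task count and binding from Slurm
srun -n 4 ./app
```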
When running with **srun**, one should tell Intel MPI to use the PMI libraries provided by Slurm. For PMI-1:
```bash
export I_MPI_PMI_LIBRARY=/usr/lib64/libpmi.so
srun ./app
```
Alternatively, one can use PMI-2, which must be requested explicitly:
```bash
export I_MPI_PMI_LIBRARY=/usr/lib64/libpmi2.so
export I_MPI_PMI2=yes
srun ./app
```
For more information, please read the [Slurm Intel MPI Guide](https://slurm.schedmd.com/mpi_guide.html#intel_mpi).
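Putting it all together, a minimal batch script might look like the following sketch. The module name is a placeholder; check `module avail` in PModules for the Intel MPI versions actually offered on Merlin6:

```bash
#!/bin/bash
#SBATCH --job-name=impi-job
#SBATCH --ntasks=8          # total number of MPI tasks
#SBATCH --time=00:10:00

# Load an Intel MPI module from PModules (placeholder name/version)
module load intel-mpi

# Point Intel MPI at Slurm's PMI-1 library, as described above
export I_MPI_PMI_LIBRARY=/usr/lib64/libpmi.so

# srun picks up the task count and binding from the #SBATCH directives
srun ./app
```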
!!! note
    Please note that PMI-2 might not work properly with some Intel MPI versions.
    If so, you can either fall back to PMI-1 or contact the Merlin
    administrators.
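When debugging such problems, two quick checks can help: `srun --mpi=list` shows which PMI plugin types your Slurm installation supports, and `I_MPI_DEBUG` (a standard Intel MPI variable) raises startup verbosity so you can see how ranks are launched and pinned:

```bash
# Show which MPI/PMI plugin types this Slurm installation supports
srun --mpi=list

# Re-run the application with increased Intel MPI startup verbosity
export I_MPI_PMI_LIBRARY=/usr/lib64/libpmi.so
export I_MPI_DEBUG=4
srun ./app
```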