initial formatting changes complete

---
title: ANSYS RSM (Remote Solve Manager)
#tags:
keywords: software, ansys, rsm, slurm, interactive, windows
last_updated: 23 August 2024
summary: "This document describes how to use the ANSYS Remote Solve Manager service in the Merlin7 cluster"
sidebar: merlin7_sidebar
permalink: /merlin7/ansys-rsm.html
---

# ANSYS RSM (Remote Solve Manager)

## ANSYS Remote Solve Manager

The different steps and settings required to make it work are the following:

2. Right-click the **HPC Resources** icon, followed by **Add HPC Resource...**

   

3. In the **HPC Resource** tab, fill in the corresponding fields as follows:

   
   

   * **"Name"**: Add here the preferred name for the cluster. For example: `Merlin7 cluster`
   * **"HPC Type"**: Select `SLURM`
   * **"Submit host"**: `service03.merlin7.psi.ch`
   * **"Slurm Job submission arguments (optional)"**: Add any required Slurm options for running your jobs (see the sketch after this list).
     * `--hint=nomultithread` must be present.
     * `--exclusive` must also be present for now, due to a bug in the `Slingshot` interconnect which does not allow running shared nodes.
   * Check **"Use SSH protocol for inter and intra-node communication (Linux only)"**.
   * Select **"Able to directly submit and monitor HPC jobs"**.
   * **"Apply"** changes.
4. In the **"File Management"** tab, fill in the corresponding fields as follows:

   
   

   * Select **"RSM internal file transfer mechanism"** and add **`/data/scratch/shared`** as the **"Staging directory path on Cluster"**.
   * Select **"Scratch directory local to the execution node(s)"** and add **`/scratch`** as the **HPC scratch directory**.
   * **Never check** the option "Keep job files in the staging directory when job is complete" if the previous
     option "Scratch directory local to the execution node(s)" was set.
   * **"Apply"** changes.
5. In the **"Queues"** tab, use the left button to auto-discover partitions.

   

   * If no authentication method was configured before, an authentication window will appear. Use your
     PSI account to authenticate. Notice that the **`PSICH\`** prefix **must not be added**.

   

   * From the partition list, select the ones you typically want to use.
     * In general, standard Merlin users must use **`hourly`**, **`daily`** and **`general`** only.
     * Other partitions are reserved for allowed users only.
   * **"Apply"** changes.

   

6. *[Optional]* You can perform a test by submitting a test job on each partition, by clicking the **Submit** button
   for each selected partition.
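
Before (or after) configuring the queues, you can verify from a Merlin7 login node which partitions are available to your account. A minimal sketch (the exact output depends on your account):

```bash
# Summarized partition overview (standard users: hourly, daily, general)
sinfo --summarize

# Minimal value for the RSM "Slurm Job submission arguments" field:
# --hint=nomultithread --exclusive
```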

## Using RSM in ANSYS

Using the RSM service in ANSYS is slightly different depending on the ANSYS software being used.
Please follow the official ANSYS documentation for details about how to use it with that specific software.

Alternatively, please refer to the examples shown in the following chapters (ANSYS-specific software).


---
title: ANSYS
#tags:
keywords: software, ansys, slurm, interactive, rsm, pmodules, overlay, overlays
last_updated: 23 August 2024
summary: "This document describes how to load and use ANSYS in the Merlin7 cluster"
sidebar: merlin7_sidebar
permalink: /merlin7/ansys.html
---

# ANSYS

This document provides generic information about how to load and run ANSYS software in the Merlin7 cluster.

The ANSYS software can be loaded through **[PModules](pmodules.md)**.

The default ANSYS versions are loaded from the central PModules repository.

However, we also provide local installations on Merlin7, which are mainly needed for some ANSYS packages, such as ANSYS RSM.
For this reason, and also to improve the interactive experience of the user, ANSYS has also been installed on the
Merlin high-performance storage and made available through PModules.


### Loading Merlin7 ANSYS

```bash
module purge
module use unstable # Optional
module load ANSYS/2025R2
```

<details>
<summary>[Example] Loading ANSYS from the Merlin7 PModules repository</summary>
<pre class="terminal code highlight js-syntax-highlight plaintext" lang="plaintext" markdown="false">
🔥 [caubet_m@login001:~]# module purge
🔥 [caubet_m@login001:~]# module use unstable
🔥 [caubet_m@login001:~]# module load cray

🔥 [caubet_m@login002:~]# module search ANSYS --verbose
ANSYS/2022R2:
...
ANSYS/2025R2:
</pre>
</details>

!!! tip
    Please always run **ANSYS/2024R2 or newer**.
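
For batch use, ANSYS solvers can also be submitted directly through Slurm. The following is only an illustrative sketch for ANSYS Fluent, not an official template: the journal file `input.jou`, the partition, and the task count are assumptions to adapt.

```bash
#!/bin/bash
#SBATCH --partition=daily
#SBATCH --ntasks=8
#SBATCH --hint=nomultithread
#SBATCH --exclusive

module use unstable
module load ANSYS/2025R2

# Run Fluent headless (-g) with a journal file; 3ddp = 3D double precision
fluent 3ddp -g -t${SLURM_NTASKS} -i input.jou
```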

---
title: CP2k
keywords: CP2k software, compile
summary: "CP2k is a quantum chemistry and solid state physics software package"
sidebar: merlin7_sidebar
toc: false
permalink: /merlin7/cp2k.html
---

# CP2k

## CP2k


```bash
module purge
module use Spack unstable
module load gcc/12.3 openmpi/5.0.8-r5lz-A100-gpu dbcsr/2.8.0-3r22-A100-gpu-omp cosma/2.7.0-y2tr-gpu cuda/12.6.0-3y6a dftd4/3.7.0-4k4c-omp elpa/2025.01.002-bovg-A100-gpu-omp fftw/3.3.10-syba-omp hdf5/1.14.6-pcsd libint/2.11.1-3lxv libxc/7.0.0-u556 libxsmm/1.17-2azz netlib-scalapack/2.2.2-rmcf openblas/0.3.30-ynou-omp plumed/2.9.2-47hk py-fypp/3.1-z25p py-numpy/2.3.2-45ay python/3.13.5-qivs sirius/develop-qz4c-A100-gpu-omp spglib/2.5.0-jl5l-omp spla/1.6.1-hrgf-gpu cmake/3.31.8-j47l ninja/1.12.1-afxy

git clone https://github.com/cp2k/cp2k.git
cd cp2k

mkdir build && cd build
CC=mpicc CXX=mpic++ FC=mpifort cmake -GNinja -DCMAKE_CUDA_HOST_COMPILER=mpicc -DCP2K_USE_LIBXC=ON -DCP2K_USE_LIBINT2=ON -DCP2K_USE_SPGLIB=ON -DCP2K_USE_ELPA=ON -DCP2K_USE_SPLA=ON -DCP2K_USE_SIRIUS=ON -DCP2K_USE_PLUMED=ON -DCP2K_USE_DFTD4=ON -DCP2K_USE_COSMA=ON -DCP2K_USE_ACCEL=CUDA -DCMAKE_CUDA_ARCHITECTURES=80 -DCP2K_USE_FFTW3=ON ..
ninja -j 16
```
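
After a successful build, a test run might look like the following. This is only a hedged sketch: `cp2k.psmp` is the usual name of the MPI+OpenMP binary produced in `build/bin`, and the input file `H2O-64.inp` is a hypothetical example.

```bash
# Hypothetical job step on an A100 node; adjust tasks/GPUs to your allocation
srun --ntasks=4 --gpus=4 ./bin/cp2k.psmp -i H2O-64.inp -o H2O-64.out
```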

#### GH200

[](https://gitea.psi.ch/HPCE/spack-psi)


---
title: Cray Programming Environment
#tags:
keywords: cray, module
last_updated: 24 May 2023
summary: "This document describes how to use the Cray Programming Environment on Merlin7."
sidebar: merlin7_sidebar
permalink: /merlin7/cray-module-env.html
---

# Cray Programming Environment

## Loading the Cray module


The Cray Programming Environment will load all the necessary dependencies. For example:

```
🔥 [caubet_m@login001:~]# module list
Currently Loaded Modules:
  1) craype-x86-rome                                  2) libfabric/1.15.2.0
  3) craype-network-ofi
  4) xpmem/2.9.6-1.1_20240510205610__g087dc11fc19d    5) PrgEnv-cray/8.5.0
  6) cce/17.0.0                                       7) cray-libsci/23.12.5
  8) cray-mpich/8.1.28                                9) craype/2.7.30
 10) perftools-base/23.12.0                          11) cpe/23.12
 12) cray/23.12
```

You will notice an unfamiliar `PrgEnv-cray/8.5.0` module was loaded. This is a meta-module that Cray provides to simplify switching between compilers and their associated dependencies and libraries,
collectively called a Programming Environment. The Cray Programming Environment contains four key modules:

* `cray-libsci` is a collection of numerical routines tuned for performance on Cray systems.
* `libfabric` is an important low-level library that allows you to take advantage of the high-performance Slingshot network.
* `cray-mpich` is a CUDA-aware MPI implementation, optimized for Cray systems.
* `cce` is the compiler suite from Cray. The C/C++ compilers are based on Clang/LLVM, while Fortran supports the Fortran 2018 standard. More info: <https://user.cscs.ch/computing/compilation/cray/>
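
With a programming environment loaded, compilation goes through the Cray compiler wrappers instead of direct compiler calls. A minimal sketch (the file names are placeholders):

```bash
# The wrappers pick the compiler of the active PrgEnv and link
# cray-mpich / cray-libsci automatically
cc  -o hello_c   hello.c    # C
CC  -o hello_cxx hello.cpp  # C++
ftn -o hello_f   hello.f90  # Fortran
```
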
You can switch between different programming environments. You can check the available modules with the `module avail` command, as follows:

```
🔥 [caubet_m@login001:~]# module avail PrgEnv
--------------------- /opt/cray/pe/lmod/modulefiles/core ---------------------

PrgEnv-cray/8.5.0    PrgEnv-gnu/8.5.0
PrgEnv-nvhpc/8.5.0   PrgEnv-nvidia/8.5.0
```

## Switching compiler suites

Compiler suites can be exchanged with the PrgEnv (Programming Environment) meta-modules provided by HPE Cray. The wrappers call the correct compiler with appropriate options to build
and link applications with the relevant libraries, as required by the loaded modules (only dynamic linking is supported), and should therefore replace direct calls to compiler
drivers in Makefiles and build scripts.

To swap the compiler suite from the default Cray compiler to the GNU compiler, one can run the following.
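
For example, using the standard HPE Cray meta-modules:

```bash
# Swap the default Cray compiler environment for the GNU one;
# the compiler wrappers (cc/CC/ftn) then invoke the GNU compilers
module swap PrgEnv-cray PrgEnv-gnu
```
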
---
title: GROMACS
keywords: GROMACS software, compile
summary: "GROMACS (GROningen Machine for Chemical Simulations) is a versatile and widely-used open source package to perform molecular dynamics"
sidebar: merlin7_sidebar
toc: false
permalink: /merlin7/gromacs.html
---

# GROMACS

## GROMACS

---
title: IPPL
keywords: IPPL software, compile
summary: "Independent Parallel Particle Layer (IPPL) is a performance portable C++ library for Particle-Mesh methods"
sidebar: merlin7_sidebar
toc: false
permalink: /merlin7/ippl.html
---

# IPPL

## IPPL

GNU GPLv3

## How to run on Merlin7

### A100 nodes

[](https://gitea.psi.ch/HPCE/spack-psi)

```bash
module use Spack unstable
module load gcc/13.2.0 openmpi/4.1.6-57rc-A100-gpu
module load boost/1.82.0-e7gp fftw/3.3.10 gnutls/3.8.3 googletest/1.14.0 gsl/2.8 h5hut/2.0.0rc7 openblas/0.3.26-omp cmake/3.31.6-oe7u

cd <path to IPPL source directory>
```
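
The remaining A100 build steps follow the usual CMake pattern. The following is only a hypothetical sketch (the CMake flags are assumptions, not verified IPPL options):

```bash
# Hypothetical continuation: out-of-source CMake build using the MPI wrappers
mkdir build && cd build
CC=mpicc CXX=mpicxx cmake .. -DCMAKE_BUILD_TYPE=Release
make -j 8
```
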
### GH200 nodes

```bash
# Request a GH200 allocation (adjust the resources to your needs)
salloc --partition=gh-daily --clusters=gmerlin7 --time=08:00:00 --ntasks=4 --nodes=1
ssh <allocated_gpu>

module use Spack unstable
module load gcc/13.2.0 openmpi/5.0.3-3lmi-GH200-gpu
module load boost/1.82.0-3ns6 fftw/3.3.10 gnutls/3.8.3 googletest/1.14.0 gsl/2.7.1 h5hut/2.0.0rc7 openblas/0.3.26 cmake/3.31.4-u2nm

cd <path to IPPL source directory>
mkdir build_gh
```

---
title: LAMMPS
keywords: LAMMPS software, compile
summary: "LAMMPS is a classical molecular dynamics code that models an ensemble of particles in a liquid, solid, or gaseous state"
sidebar: merlin7_sidebar
toc: false
permalink: /merlin7/lammps.html
---

# LAMMPS

## LAMMPS

---
title: OPAL-X
keywords: OPAL-X software, compile
summary: "OPAL (Object Oriented Particle Accelerator Library) is an open source C++ framework for general particle accelerator simulations including 3D space charge, short range wake fields and particle matter interaction."
sidebar: merlin7_sidebar
toc: false
permalink: /merlin7/opal-x.html
---

# OPAL-X

## OPAL

---
title: OpenMPI Support
#tags:
last_updated: 15 January 2025
keywords: software, openmpi, slurm
summary: "This document describes how to use OpenMPI in the Merlin7 cluster"
sidebar: merlin7_sidebar
permalink: /merlin7/openmpi.html
---

# OpenMPI Support

## Introduction


```
specific pmix plugin versions available: pmix_v5,pmix_v4,pmix_v3,pmix_v2
```

Important Notes:

* For OpenMPI, always use `pmix` by specifying the appropriate version (`pmix_$version`).
  When loading an OpenMPI module (via [PModules](pmodules.md) or [Spack](spack.md)), the corresponding PMIx version will be automatically loaded.
* Users do not need to manually manage PMIx compatibility.
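
To check which PMIx plugins your Slurm installation offers, and to launch an OpenMPI program with one of them explicitly, a minimal sketch (`./my_mpi_app` is a placeholder binary):

```bash
# List the MPI plugins known to Slurm
srun --mpi=list

# Launch with an explicit PMIx plugin picked from that list
srun --mpi=pmix_v5 --ntasks=4 ./my_mpi_app
```
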
---
title: PSI Modules
#tags:
keywords: Pmodules, software, stable, unstable, deprecated, overlay, overlays, release stage, module, package, packages, library, libraries
last_updated: 07 September 2022
#summary: ""
sidebar: merlin7_sidebar
permalink: /merlin7/pmodules.html
---

# PSI Modules

## PSI Environment Modules

On top of the operating system stack, we provide additional software using the PSI-developed PModules system.

PModules is the officially supported way of providing software, and each package is deployed by a specific expert. Usually, software
that is used by many people will be found in PModules.
If you miss any package/version, or a software with a specific missing feature, please contact the Merlin administrators.

To ensure proper software lifecycle management, PModules uses three release stages: unstable, stable, and deprecated.

1. **Unstable Release Stage:**
   * Contains experimental or under-development software versions.
   * Not visible to users by default. Use explicitly:

     ```bash
     module use unstable
     ```
   * Software is promoted to **stable** after validation.
2. **Stable Release Stage:**
   * Default stage, containing fully tested and supported software versions.
   * Recommended for all production workloads.
3. **Deprecated Release Stage:**
   * Contains software versions that are outdated or discontinued.
   * These versions are hidden by default but can be explicitly accessed:

     ```bash
     module use deprecated
     ```
   * Deprecated software can still be loaded directly without additional configuration to ensure user transparency.

## PModules commands

```bash
module purge            # unload all loaded packages and clean up the environment
```

Please refer to the **external [PSI Modules](https://pmodules.gitpages.psi.ch/chap3.html) document** for
detailed information about the `module` command.

### module use/unuse

Please run `module avail --help` for further listing options.

### module search

This is used to **search** for **software packages**. By default, if no **Release Stage** or **Software Group** is specified
in the options of the `module search` command, it will search from the already invoked *Software Groups* and *Release Stages*.
Direct package dependencies will also be shown.

```bash
module search ANSYS --verbose
```

---
title: Quantum Espresso
keywords: Quantum Espresso software, compile
summary: "Quantum Espresso code for electronic-structure calculations and materials modeling at the nanoscale"
sidebar: merlin7_sidebar
toc: false
permalink: /merlin7/quantum-espresso.html
---

# Quantum Espresso

## Quantum ESPRESSO


```bash
module purge
module use Spack unstable
module load nvhpc/25.3 openmpi/5.0.7-e3bf-GH200-gpu fftw/3.3.10-sfpw-omp hdf5/develop-2.0-ztvo nvpl-blas/0.4.0.1-3zpg nvpl-lapack/0.3.0-ymy5 netlib-scalapack/2.2.2-qrhq cmake/3.31.6-5dl7

cd <path to QE source directory>
mkdir build
cd build
```
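
A typical configure step inside the build directory might look as follows. This is only a hedged sketch: the flags are assumptions to check against the Quantum ESPRESSO build documentation.

```bash
# Hypothetical GPU-enabled configure/build using the loaded MPI wrappers
cmake -DCMAKE_C_COMPILER=mpicc -DCMAKE_Fortran_COMPILER=mpif90 -DQE_ENABLE_CUDA=ON ..
make -j 8
```
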
---
title: Spack
keywords: spack, python, software, compile
summary: "Spack the HPC package manager documentation"
sidebar: merlin7_sidebar
toc: false
permalink: /merlin7/spack.html
---

# Spack

For Merlin7, the *package manager for supercomputing* [Spack](https://spack.io/) is available. It is meant to complement the existing PModules
solution, giving users the opportunity to manage their own software environments.
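
Spack-built packages are exposed as environment modules. As used throughout these pages, they can be made visible and listed as follows:

```bash
# Make Spack-provided modules (including unstable ones) visible, then list them
module use Spack unstable
module avail
```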