---
title: ANSYS RSM (Remote Solve Manager)
#tags:
keywords: software, ansys, rsm, slurm, interactive, windows
last_updated: 23 August 2024
summary: "This document describes how to use the ANSYS Remote Solve Manager service in the Merlin7 cluster"
sidebar: merlin7_sidebar
permalink: /merlin7/ansys-rsm.html
---

{:style="display:block; margin-left:auto; margin-right:auto"}
|
||||
## ANSYS Remote Resolve Manager
|
||||
|
||||
{{site.data.alerts.warning}}The Merlin7 documentation is <b>Work In Progress</b>.
|
||||
Please do not use or rely on this documentation until this becomes official.
|
||||
This applies to any page under <b><a href="https://lsm-hpce.gitpages.psi.ch/merlin7/">https://lsm-hpce.gitpages.psi.ch/merlin7/</a></b>
|
||||
**ANSYS Remote Solve Manager (RSM)** is used by ANSYS Workbench to submit computational jobs to HPC clusters directly from Workbench on your desktop.
|
||||
|
||||
{{site.data.alerts.warning}} Merlin7 is running behind a firewall, however, there are firewall policies in place to access the Merlin7 ANSYS RSM service from the main PSI networks. If you can not connect to it, please contact us, and please provide the IP address for the corresponding workstation: we will check the PSI firewall rules in place and request for an update if necessary.
|
||||
{{site.data.alerts.end}}
|
||||

## The Merlin7 RSM service

An RSM service runs on a dedicated virtual machine. This service listens on a specific port and processes any request made through RSM (for example, from the workstations of ANSYS users).
The following nodes are configured with this service:
* `merlin7-ansys-rsm.psi.ch`

Submitting from Titan is also possible; for this, you have to set up SSH keys on your local workstation as described in: https://www.purdue.edu/science/scienceit/ssh-keys-windows.html
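For readers unfamiliar with SSH keys, the setup can be sketched as follows, assuming a standard OpenSSH client (the key file name and remote host are illustrative, not prescribed by RSM):

```shell
# Sketch: create an SSH key pair for passwordless submission (assumes OpenSSH).
# In practice the key pair usually lives in ~/.ssh; a temporary directory is
# used here only to keep the example self-contained.
keydir="$(mktemp -d)"
# -N "" means no passphrase; prefer a passphrase if your site policy requires one.
ssh-keygen -q -t ed25519 -N "" -f "$keydir/id_ed25519_merlin7"
ls "$keydir"   # id_ed25519_merlin7  id_ed25519_merlin7.pub
# Then install the public key on the remote side (hypothetical user/host):
# ssh-copy-id -i "$keydir/id_ed25519_merlin7.pub" "$USER@merlin7-ansys-rsm.psi.ch"
```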

The earliest version supported in the Merlin7 cluster is ANSYS/2022R2. Older versions are not supported due to existing bugs or missing functionality. In case you strongly need to run an older version, please do not hesitate to contact the Merlin admins.
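Since ANSYS release strings such as `2022R2` sort naturally, the minimum-version rule above can be checked with a small shell helper (purely illustrative, not part of ANSYS or RSM; assumes GNU `sort -V`):

```shell
# Hypothetical helper (not part of ANSYS): check whether an ANSYS release
# string meets the minimum supported on Merlin7 (ANSYS/2022R2).
min_release="2022R2"
is_supported() {
  # The older of the two strings sorts first; supported iff the minimum does.
  [ "$(printf '%s\n%s\n' "$min_release" "$1" | sort -V | head -n1)" = "$min_release" ]
}
is_supported "2023R1" && echo "2023R1: supported"
is_supported "2021R1" || echo "2021R1: not supported"
```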

## Configuring RSM client on Windows workstations

Users can set up ANSYS RSM on their workstations to connect to the Merlin7 cluster.
The different steps and settings required to make it work are the following:

1. Open the RSM Configuration service in Windows for the ANSYS release you want to configure.
2. Right-click the **HPC Resources** icon, followed by **Add HPC Resource...**

   

3. In the **HPC Resource** tab, fill in the corresponding fields as follows:

   

   * **"Name"**: Add here the preferred name for the cluster. For example: `Merlin7 cluster - merlin-l-001`
   * **"HPC Type"**: Select `SLURM`
   * **"Submit host"**: `merlin7-rsm01.psi.ch`
   * **"Slurm Job submission arguments (optional)"**: Add any required Slurm options for running your jobs.
     * `--hint=nomultithread` must be present.
     * `--exclusive` must also be present for now, due to a bug in the `Slingshot` interconnect which does not allow running shared nodes.
   * Check **"Use SSH protocol for inter and intra-node communication (Linux only)"**.
   * Select **"Able to directly submit and monitor HPC jobs"**.
   * **"Apply"** changes.
4. In the **"File Management"** tab, fill in the corresponding fields as follows:

   

   * Select **"RSM internal file transfer mechanism"** and add **`/data/scratch/shared`** as the **"Staging directory path on Cluster"**.
   * Select **"Scratch directory local to the execution node(s)"** and add **`/scratch`** as the **"HPC scratch directory"**.
   * **Never check** the option "Keep job files in the staging directory when job is complete" if the previous option "Scratch directory local to the execution node(s)" was set.
   * **"Apply"** changes.
5. In the **"Queues"** tab, use the left button to auto-discover partitions.

   

   * If no authentication method was configured before, an authentication window will appear. Use your PSI account to authenticate. Notice that the **`PSICH\`** prefix **must not be added**.

     

   * From the partition list, select the ones you typically want to use.
     * In general, standard Merlin users must use **`hourly`**, **`daily`** and **`general`** only.
     * Other partitions are reserved for allowed users only.
   * **"Apply"** changes.

   

6. *[Optional]* You can perform a test by submitting a test job on each partition by clicking the **Submit** button for each selected partition.
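As a quick sanity check of the argument string entered in step 3, a small shell function (illustrative, not part of RSM) can verify that the two required Slurm options are present:

```shell
# Illustrative check (not part of RSM): verify that a "Slurm job submission
# arguments" string contains the two options required on Merlin7.
has_required_slurm_args() {
  case " $* " in *" --hint=nomultithread "*) ;; *) return 1 ;; esac
  case " $* " in *" --exclusive "*) ;; *) return 1 ;; esac
}
has_required_slurm_args "--hint=nomultithread --exclusive" && echo "arguments OK"
# A comparable manual test job from a cluster shell would use the same options
# (partition name illustrative):
#   sbatch --partition=hourly --hint=nomultithread --exclusive --wrap hostname
```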

{{site.data.alerts.tip}}
In the future, we might provide this service also from the login nodes for better transfer performance.
{{site.data.alerts.end}}

## Using RSM in ANSYS

Using the RSM service in ANSYS differs slightly depending on the ANSYS software being used.
Please follow the official ANSYS documentation for details about how to use it with that specific software.

Alternatively, please refer to the examples shown in the following chapters (ANSYS specific software).

### Using RSM in ANSYS Fluent

For further information on using RSM with Fluent, please visit the **[ANSYS Fluent](/merlin7/ansys-fluent.html)** section.

### Using RSM in ANSYS CFX

For further information on using RSM with CFX, please visit the **[ANSYS CFX](/merlin7/ansys-cfx.html)** section.

### Using RSM in ANSYS MAPDL

For further information on using RSM with MAPDL, please visit the **[ANSYS MAPDL](/merlin7/ansys-mapdl.html)** section.