updating README

Commit af8a13cb1b, parent 11b3c38038 -- 2026-04-24 16:37:37 +02:00 (+11 -7)
@@ -2,9 +2,9 @@
Apocalypse (apo) is a library that provides SwissFEL users a platform to effortlessly run (and re-run) their scripts on the SwissFEL infrastructure -- the compute cluster "Eris" (login node: sf-eris) with [sf-daq](https://github.com/paulscherrerinstitute/sf_daq_broker) output files.
-More information about [Eris](https://hpc-sysadmins.gitpages.psi.ch/swissfel/compute.html)
+More information about [Eris](https://dari.pages.psi.ch/swissfel/compute.html)
-Apo is a rewrite/expansion of [Automatic Processing tool (ap)](https://gitlab.psi.ch/sf-daq/ap), giving users flexibility in how to handle the raw data and separating the handling of processed data into another step.
+Apo is a rewrite/expansion of [Automatic Processing tool (ap)](https://gitea.psi.ch/sf-daq/ap), giving users flexibility in how to handle the raw data and separating the handling of processed data into another step.
### Getting started
#### Before you start
@@ -15,8 +15,9 @@ salloc
ssh sf-cn-#
<pth_script> --input <filename>.h5
```
- A plain `salloc` will allocate with the default setup. If you would like more cores available, use `salloc -c #`, where `#` is the number of desired cores.
- `sf-cn-#` -- the node allocated for interactive use, displayed after `salloc`
-- `<pth_script>` -- the top-level script that should run exactly as stated. That means any additional actions (e.g. setting the environment, adding pmodules, additional parameter handling) should be included in it. See [examples](https://gitea.psi.ch/woznic_n/dummy_apocalypse/src/branch/main/examples)
+- `<pth_script>` -- the top-level script that should run exactly as stated. That means any additional actions (e.g. setting the environment, adding pmodules, additional parameter handling) should be included in it. See [examples](https://gitea.psi.ch/SwissFEL/apocalypse/src/branch/main/examples)
- `<filename>.h5` -- the path to the file you would like to process, under `/sf/<endstation>/data/<pgroup>/raw/`
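Since `<pth_script>` must run exactly as stated, everything the job needs has to live inside it. A minimal sketch of such a self-contained wrapper (all names are hypothetical, not taken from the examples):

```bash
#!/usr/bin/env bash
# Hypothetical <pth_script> sketch: all argument handling lives inside the script.
# On Eris you would also load pmodules / set up the environment in here.
run_job() {
    local input=""
    while [[ $# -gt 0 ]]; do
        case "$1" in
            --input) input="$2"; shift 2 ;;
            *) echo "unknown option: $1" >&2; return 1 ;;
        esac
    done
    # module load psi-python  # hypothetical pmodule line, kept commented in this sketch
    echo "processing ${input}"
}
out=$(run_job --input acq0001.h5)
echo "$out"
```

The point is that apo invokes the script with only `--input <filename>.h5`, so anything beyond that must be handled internally.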
Once your tests are successful, remember to cancel your allocation by:
@@ -25,6 +26,8 @@ scancel <JID>
```
- `<JID>` is Slurm's job ID, displayed when allocating resources; you can also check it with `squeue -u $USER`
or by closing/exiting your terminal.
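If the job ID is lost, it is the first column of the `squeue -u $USER` output. A sketch of extracting it, run here against simulated `squeue` text so it is runnable anywhere (values hypothetical):

```bash
# Simulated `squeue -u $USER` output; on Eris you would pipe the real command instead.
squeue_out="JOBID PARTITION     NAME  USER ST   TIME  NODES
123456    day  interact  user  R   0:42   1"
jid=$(echo "$squeue_out" | awk 'NR==2 {print $1}')   # JOBID is the first column
echo "$jid"   # the <JID> to pass to scancel
```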
#### Setup
Run apo from Eris:
```bash
@@ -67,7 +70,7 @@ Re-run will just emit "file written" message similar as when original run was ta
### Examples
#### Simple python script
-A simple script that writes meta-data can be found in [simple1](https://gitea.psi.ch/woznic_n/dummy_apocalypse/src/branch/main/examples/simple1)
+A simple script that writes meta-data can be found in [simple1](https://gitea.psi.ch/SwissFEL/apocalypse/src/branch/main/examples/simple1)
This example takes one camera file and saves an output file with projections of all channels that have more than one dimension, along with the apo meta file. A simple txt meta file is written as meta; it won't be parsed later by apo and will be passed as-is with the success message.
Since there is no filter for bs files, the best way to run this script is:
@@ -77,7 +80,7 @@ apocalypse -s ./examples/simple1/run_simple1.sh -e endstation -writer-type image
```
to make sure that jobs are submitted only for cam files.
#### Simple python script with bash wrapper
-A simple script that writes meta-data and handles an additional parameter can be found in [simple2](https://gitea.psi.ch/woznic_n/dummy_apocalypse/src/branch/main/examples/simple2)
+A simple script that writes meta-data and handles an additional parameter can be found in [simple2](https://gitea.psi.ch/SwissFEL/apocalypse/src/branch/main/examples/simple2)
This example takes one camera file and a defined background file, and saves an output file with projections of all channels that have more than one dimension, along with the apo meta file. In this example the meta file is written as .json and will be parsed into a dictionary before the data is passed with the success message.
Since there is no filter for bs files, the best way to run this script is:
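As a sketch, the script could emit that json meta file like this (key names are hypothetical; a temp directory stands in for the real meta path):

```bash
# Hypothetical: write the json meta file that apo will later parse into a dictionary.
meta_dir=$(mktemp -d)   # stand-in for /sf/<endstation>/data/<pgroup>/res/processed/<run>/meta
printf '{"background": "bkg_run0042.h5", "threshold": 2.5}\n' > "$meta_dir/apo_acq0001.json"
cat "$meta_dir/apo_acq0001.json"
```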
@@ -89,8 +92,9 @@ to make sure that the jobs are submitted only for cam files
It is convenient to make the background file a symlink and change it when needed.
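The symlink trick, sketched with hypothetical file names (a temp directory is used so the snippet runs anywhere):

```bash
# Hypothetical: a stable name points at whichever background file is current.
workdir=$(mktemp -d) && cd "$workdir"
touch bkg_run0042.h5                  # pretend processed background file
ln -sfn bkg_run0042.h5 background.h5  # the script always reads background.h5
readlink background.h5
```

Repointing later is a single `ln -sfn <new_background> background.h5`, with no change to the script.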
### General remarks
-- Make sure to write enough metadata in your files to make your work reproducible apo does not handle that for you
-- it is recommended to mirror the raw data structure in res, only metadata files in `/sf/<endstation>/data/<pgroup>/res/processed/<run>/meta` named `apo_acq<acq>` (.json, .yaml, .txt) will be processed and sent with the success message.
+- Make sure to write enough metadata in your files to make your work reproducible, as apo does not handle that for you
+- It is recommended to mirror the raw data structure in res; only metadata files in `/sf/<endstation>/data/<pgroup>/res/processed/<run>/meta` named `apo_acq<acq>` (.json, .yaml, .txt) will be processed and sent with the success message.
- Editing a script that is used by a running apo will affect jobs that have not yet executed. Take special care when changing parameters for a script that still has jobs in the queue -- unless it was an error, stop apo and run with a differently named copy of the script.
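The mirrored layout from the remarks above, sketched with hypothetical names and a temp directory in place of `/sf/<endstation>/data/<pgroup>`:

```bash
# Hypothetical: res mirrors raw, and meta files follow the apo_acq<acq> naming.
base=$(mktemp -d)   # stand-in for /sf/<endstation>/data/<pgroup>
mkdir -p "$base/res/processed/run0001/meta"
: > "$base/res/processed/run0001/meta/apo_acq0001.json"  # empty placeholder meta file
ls "$base/res/processed/run0001/meta"
```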
buffer - file type correspondence: