76 Commits

Author SHA1 Message Date
992f5a8f0d pgroup add to a get_scan.. 2026-02-03 13:25:28 +01:00
9f6b0f8290 Channel names feature 2026-02-03 11:26:17 +01:00
d2f6e7285a Channel names update 2026-02-03 11:20:06 +01:00
9beff10fe6 Removing cluster tools (now with own repo at /sf/cristallina/applications/it/cluster_tools) 2026-02-03 11:19:14 +01:00
82b6b681d0 No real changes, just file permissions it seems 2026-02-03 11:17:18 +01:00
25ee07977b Update tests (no obvious changes, just ownership it seems) 2026-02-03 11:08:36 +01:00
9f01f5ad03 Docs cleanup 2026-02-03 11:06:33 +01:00
9e54ae4bda added number of retries 2025-12-05 14:53:24 +01:00
5112e050d3 more cleanups 2025-11-24 23:37:55 +01:00
a9bfc61a25 cleanup in analysis, comments, black, and upper threshold. 2025-11-24 15:23:37 +01:00
69e311cb85 Added RSM, in a first iteration 2025-11-24 15:05:43 +01:00
e95b5a1670 added code for p22760 experiment 2025-11-14 22:25:10 +01:00
6406fea7ed added custom value-to-color mapper 2025-11-14 22:23:59 +01:00
aa51b3fc41 finalized JF_stacks and unfinished scan in scan_info 2025-11-12 16:53:59 +01:00
ed645874dc added JF_stack calculator 2025-11-11 15:52:03 +01:00
a70ef5caa2 Unfinished scan including tests 2025-11-11 15:09:54 +01:00
d397a9668b changes before migration 2025-09-03 18:17:39 +02:00
ed090be840 added expansion functionality to ROI 2025-07-13 15:27:53 +02:00
b671e2d8dc updated channels definitions 2025-07-12 23:12:11 +02:00
81e8c2b7c5 added shift function to ROI 2025-07-12 23:11:19 +02:00
fbb2cac698 added a pgroup module 2025-05-14 16:30:28 +02:00
63f810f42e made imports more robust 2025-05-14 16:09:15 +02:00
0f23e462a8 Upper cut threshold added to perform image calculations 2025-04-02 04:18:31 +02:00
Jakub Vonka 3ad42521f9 Line plot with colorbar added 2025-04-02 04:15:31 +02:00
Karina Kazarian 259212282b edit histogram code 2025-03-29 17:06:02 +01:00
34ddfb376f added partialjson to requirements 2025-02-04 11:29:10 +01:00
637d7ac955 added scikit-image to the requirements 2025-02-04 11:15:20 +01:00
ebdb161726 Update .gitlab-ci.yml file 2025-02-04 11:11:25 +01:00
dd5b119c55 Update .gitlab-ci.yml file 2025-02-04 11:08:39 +01:00
2321dc0487 trying with micromamba 2025-02-03 17:46:50 +01:00
6758f3cc6d Thresholding and sum part 2 2025-01-18 20:26:35 +01:00
f8eaf1ad2c added thresholding to sum and new detectors 2025-01-18 20:25:52 +01:00
f9f062d219 Adding cluster tools to this repo 2024-11-26 16:36:26 +01:00
7649164f88 Beamtime p22199-Puma commissioning Nov 2024 2024-11-26 16:36:25 +01:00
16b06a7445 Update .gitlab-ci.yml file 2024-09-26 17:55:02 +02:00
43c01d021a Update .gitlab-ci.yml file 2024-07-01 16:33:15 +02:00
aea3b1b306 unfinished scan and update for newer jungfrau_utils 2024-06-27 15:21:22 +02:00
e01c8050ee adapted jupyter_helper ttl 2024-06-24 17:30:21 +02:00
29a122209a added Unfinished Scan 2024-06-24 17:29:31 +02:00
688c6ed4c5 Updated channels and fixed typo in utils 2024-06-21 14:19:55 +02:00
02b9279da7 special case for RA cluster 2024-06-21 14:19:55 +02:00
7ce5015780 disabled deploy 2024-06-17 17:15:52 +02:00
2319fa9cbf collaborative helper extended 2024-06-17 17:03:17 +02:00
25b17435f8 small examples added 2024-04-23 18:08:31 +02:00
1d33e86f76 added JF frame checker, small cleanup 2024-04-23 18:05:52 +02:00
2efbf3a64a fixed regression tests, file was archived 2024-03-13 17:20:01 +01:00
45b71789c8 fixed roi cutoffs 2024-03-13 16:28:36 +01:00
bdeb37b5e2 improved error handling in pulse to timestamp 2024-02-13 17:36:54 +01:00
cc893dcb22 added pulse id to datetime conversion 2024-02-01 13:36:47 +01:00
619d88bb7a added copy constructor to ROI 2023-12-07 10:05:36 +01:00
e960d2a11f toggle plot legend 2023-11-30 18:30:11 +01:00
9ff4b485ca and another fix for pgroup 2023-11-30 17:44:45 +01:00
9173e2fb6a bug fix in heuristic pgroup extraction 2023-11-30 17:42:33 +01:00
dd51585580 Update README.rst 2023-11-26 19:28:32 +01:00
1457839114 Update .gitlab-ci.yml file 2023-11-26 18:47:18 +01:00
3b49e9f2d8 Update .gitlab-ci.yml file 2023-11-26 18:34:31 +01:00
737c5aa8c0 added timestamp extraction for runs 2023-11-26 17:53:13 +01:00
2a2d449339 plot style template added 2023-11-26 02:46:24 +01:00
b3001316ef channel names and more plots 2023-11-26 01:52:09 +01:00
087b4406b8 added 1D Gauss fit with LMFIT 2023-11-22 23:08:02 +01:00
19eeee0b37 fixed stack sum threshold 2023-11-22 03:41:16 +01:00
be78d68819 added ROI calculation including tests 2023-11-14 16:20:52 +01:00
4d5263e620 modified jupyter helper for collaboration >=2.x 2023-11-14 16:20:13 +01:00
aa6190e9dd Added checker for the pid offset between two channels 2023-11-13 18:18:53 +01:00
ff87dbdd60 Removing stand table from testing, because GitHub cannot read /sf/ 2023-11-13 15:25:26 +01:00
97f683b73c Added Stand table reading in utils 2023-11-13 15:14:51 +01:00
6b58d30f76 added jupyter helper module for collaborative features 2023-09-27 17:27:34 +02:00
be01a75c68 cleaned up caching and small fixes 2023-09-14 11:29:52 +02:00
787c863cd1 added image calculation tests 2023-08-17 12:40:17 +02:00
4735301f0d added image calculation tests 2023-08-17 12:28:57 +02:00
4ff6072056 added metadata collection routine and tests 2023-08-16 18:05:43 +02:00
546295ae77 added image analysis tools and small updates throughout 2023-07-24 18:14:41 +02:00
2e17a6f8b3 Update tests/test_utils.py 2023-04-21 15:57:43 +00:00
99a3f51c57 fixed for lmfit 1.2 2023-04-21 17:55:29 +02:00
f13e36ce6d Revert "enabled xraydb tests" (reverts commit 53f7953d24) 2023-04-21 08:05:19 +00:00
53f7953d24 enabled xraydb tests 2023-04-21 09:33:33 +02:00
122 changed files with 5727 additions and 2605 deletions

0  .coveragerc  Normal file → Executable file

1  .gitignore  vendored  Normal file → Executable file

@@ -37,6 +37,7 @@ htmlcov/*
 junit*.xml
 coverage.xml
 .pytest_cache/
+tests/cache_test/
 
 # Build and docs folder/files
 build/*

21  .gitlab-ci.yml  Normal file → Executable file

@@ -9,8 +9,10 @@ build-job:
 run_tests:
   stage: test
   before_script:
+    - echo "Running as $(whoami)"
     - echo "$CI_BUILDS_DIR"
-    - source /home/gitlab-runner/miniconda3/bin/activate cristallina
+    - eval "$(/home/gitlab-runner/bin/micromamba shell hook --shell=bash)"
+    - micromamba activate cristallina
     - echo `pwd`
     - pip install -e .
   script:
@@ -18,3 +20,20 @@ run_tests:
     - coverage run -m pytest
     - coverage report
   coverage: '/TOTAL.*\s+(\d+\%)/'
+
+pages:
+  stage: deploy
+  before_script:
+    - echo "$CI_BUILDS_DIR"
+    - eval "$(/home/gitlab-runner/bin/micromamba shell hook --shell=bash)"
+    - micromamba activate cristallina
+  script:
+    - tox -e docs
+    - mv docs/_build/html/ public/
+  artifacts:
+    paths:
+      - public
+  rules:
+    - if: $CI_COMMIT_REF_NAME == $CI_DEFAULT_BRANCH

0  .readthedocs.yml  Normal file → Executable file

1  AUTHORS.rst  Normal file → Executable file

@@ -3,3 +3,4 @@ Contributors
 ============
 * Alexander Steppke <alexander.steppke@psi.ch>
+* Jakub Vonka <jakub.vonka@psi.ch>

0  CHANGELOG.rst  Normal file → Executable file

0  CONTRIBUTING.rst  Normal file → Executable file

0  LICENSE.txt  Normal file → Executable file

8  README.rst  Normal file → Executable file

@@ -27,6 +27,14 @@
     :alt: Project generated with PyScaffold
     :target: https://pyscaffold.org/
 
+.. image:: https://gitlab.psi.ch/sf_cristallina/cristallina/badges/master/pipeline.svg
+    :alt: pipeline status
+    :target: https://gitlab.psi.ch/sf_cristallina/cristallina/-/commits/master
+
+.. image:: https://gitlab.psi.ch/sf_cristallina/cristallina/-/badges/release.svg
+    :alt: Latest Release
+    :target: https://gitlab.psi.ch/sf_cristallina/cristallina/-/releases
+
 .. image:: https://gitlab.psi.ch/sf_cristallina/cristallina/badges/master/coverage.svg
     :alt: Cristallina coverage report
     :target: https://gitlab.psi.ch/sf_cristallina/cristallina/-/commits/master

0
docs/Makefile Normal file → Executable file
View File

0
docs/_build/doctrees/api/cristallina.doctree vendored Normal file → Executable file
View File

0
docs/_build/doctrees/api/modules.doctree vendored Normal file → Executable file
View File

0
docs/_build/doctrees/authors.doctree vendored Normal file → Executable file
View File

0
docs/_build/doctrees/changelog.doctree vendored Normal file → Executable file
View File

0
docs/_build/doctrees/contributing.doctree vendored Normal file → Executable file
View File

0
docs/_build/doctrees/environment.pickle vendored Normal file → Executable file
View File

0
docs/_build/doctrees/index.doctree vendored Normal file → Executable file
View File

0
docs/_build/doctrees/license.doctree vendored Normal file → Executable file
View File

0
docs/_build/doctrees/readme.doctree vendored Normal file → Executable file
View File

0
docs/_build/html/.buildinfo vendored Normal file → Executable file
View File

0
docs/_build/html/_modules/cristallina/SEA_GraphClient.html vendored Normal file → Executable file
View File

0
docs/_build/html/_modules/cristallina/analysis.html vendored Normal file → Executable file
View File

0
docs/_build/html/_modules/cristallina/plot.html vendored Normal file → Executable file
View File

0
docs/_build/html/_modules/cristallina/skeleton.html vendored Normal file → Executable file
View File

0
docs/_build/html/_modules/cristallina/utils.html vendored Normal file → Executable file
View File

0
docs/_build/html/_modules/index.html vendored Normal file → Executable file
View File

0
docs/_build/html/_sources/api/cristallina.rst.txt vendored Normal file → Executable file
View File

0
docs/_build/html/_sources/api/modules.rst.txt vendored Normal file → Executable file
View File

0
docs/_build/html/_sources/authors.rst.txt vendored Normal file → Executable file
View File

0
docs/_build/html/_sources/changelog.rst.txt vendored Normal file → Executable file
View File

0
docs/_build/html/_sources/contributing.rst.txt vendored Normal file → Executable file
View File

0
docs/_build/html/_sources/index.rst.txt vendored Normal file → Executable file
View File

0
docs/_build/html/_sources/license.rst.txt vendored Normal file → Executable file
View File

0
docs/_build/html/_sources/readme.rst.txt vendored Normal file → Executable file
View File

0
docs/_build/html/_static/alabaster.css vendored Normal file → Executable file
View File

0
docs/_build/html/_static/basic.css vendored Normal file → Executable file
View File

0
docs/_build/html/_static/custom.css vendored Normal file → Executable file
View File

0
docs/_build/html/_static/doctools.js vendored Normal file → Executable file
View File

0
docs/_build/html/_static/documentation_options.js vendored Normal file → Executable file
View File

0
docs/_build/html/_static/file.png vendored Normal file → Executable file
View File

Before

Width:  |  Height:  |  Size: 286 B

After

Width:  |  Height:  |  Size: 286 B

0
docs/_build/html/_static/language_data.js vendored Normal file → Executable file
View File

0
docs/_build/html/_static/minus.png vendored Normal file → Executable file
View File

Before

Width:  |  Height:  |  Size: 90 B

After

Width:  |  Height:  |  Size: 90 B

0
docs/_build/html/_static/plus.png vendored Normal file → Executable file
View File

Before

Width:  |  Height:  |  Size: 90 B

After

Width:  |  Height:  |  Size: 90 B

0
docs/_build/html/_static/pygments.css vendored Normal file → Executable file
View File

0
docs/_build/html/_static/searchtools.js vendored Normal file → Executable file
View File

0
docs/_build/html/_static/sphinx_highlight.js vendored Normal file → Executable file
View File

0
docs/_build/html/api/cristallina.html vendored Normal file → Executable file
View File

0
docs/_build/html/api/modules.html vendored Normal file → Executable file
View File

0
docs/_build/html/authors.html vendored Normal file → Executable file
View File

0
docs/_build/html/changelog.html vendored Normal file → Executable file
View File

0
docs/_build/html/contributing.html vendored Normal file → Executable file
View File

0
docs/_build/html/genindex.html vendored Normal file → Executable file
View File

0
docs/_build/html/index.html vendored Normal file → Executable file
View File

0
docs/_build/html/license.html vendored Normal file → Executable file
View File

0
docs/_build/html/objects.inv vendored Normal file → Executable file
View File

0
docs/_build/html/py-modindex.html vendored Normal file → Executable file
View File

0
docs/_build/html/readme.html vendored Normal file → Executable file
View File

0
docs/_build/html/search.html vendored Normal file → Executable file
View File

0
docs/_build/html/searchindex.js vendored Normal file → Executable file
View File

0
docs/_static/.gitignore vendored Normal file → Executable file
View File

0
docs/api/cristallina.rst Normal file → Executable file
View File

0
docs/api/modules.rst Normal file → Executable file
View File

0
docs/authors.rst Normal file → Executable file
View File

0
docs/changelog.rst Normal file → Executable file
View File

0
docs/conf.py Normal file → Executable file
View File

0
docs/contributing.rst Normal file → Executable file
View File

0
docs/index.rst Normal file → Executable file
View File

0
docs/license.rst Normal file → Executable file
View File

0
docs/readme.rst Normal file → Executable file
View File

0
docs/requirements.txt Normal file → Executable file
View File

2519  examples/2d_gaussian_fitting_example.ipynb  Normal file → Executable file  (diff suppressed: lines too long)

763  examples/SmallExample.ipynb  Executable file  (diff suppressed: lines too long)

212  examples/data_examples.ipynb  Executable file  (diff suppressed: lines too long)

0  pyproject.toml  Normal file → Executable file

0  pytest.ini  Normal file → Executable file

14  requirements.txt  Executable file

@@ -0,0 +1,14 @@
#
importlib_metadata
joblib>=1.2.0
jungfrau_utils>=3.12.1
lmfit>=1.1.0
matplotlib
scikit-image # for the image histogram equalization
PyYAML
scipy
setuptools
Sphinx
tqdm
xraydb
partialjson # for parsing scan.json that are in progress

0  setup.cfg  Normal file → Executable file

0  setup.py  Normal file → Executable file

58  [deleted file; filename not shown]

@@ -1,58 +0,0 @@
#!/bin/bash
### Small script to create an SSH tunnel from outside of PSI network to access RA.
### Change to your PSI account username and run. Once running, going to https://localhost:5000 leads to jupytera.psi.ch without needing a VPN.
def_user=$USER
def_addr="jupytera"
def_port="5000"
def_debug=false
def_node=false
function show_usage {
echo "usage: $0 [-u USER] [-p PORT] [-s]"
echo " -u, --user USER set user (default: ${def_user})"
echo " -p, --port PORT set local port (default: ${def_port})"
echo " -s, --staging tunnel to jupytera staging (default: jupytera production)"
echo " -n, --node node at which jupyter is running"
echo " -d, --debug turn on debug mode (default: off)"
echo " -h, --help, -? show this help"
}
user=${def_user}
addr=${def_addr}
port=${def_port}
debug=${def_debug}
node=${def_node}
while [[ "$#" -gt 0 ]]; do
case $1 in
-u|--user) user="$2" ; shift ;;
-p|--port) port="$2" ; shift ;;
-n|--node) node="$2" ; shift ;;
-s|--staging) addr="jupytera-staging" ;;
-d|--debug) debug=true ;;
-h|--help|-\?) show_usage ; exit ;;
*) echo "unknown argument: $1" ; show_usage ; exit 1 ;;
esac
shift
done
echo "Creating tunnel to ${addr}.psi.ch on https://localhost:${port}/ ..."
echo
echo "Username: ${user}"
# If node not given, link to a generic jupyter from spawner. If node given and a notebook already started there, link there.
if [ "${node}" = false ]; then
cmd="ssh -J ${user}@hop.psi.ch ${user}@ra.psi.ch -L ${port}:${addr}.psi.ch:443"
else
echo "Establishing tunnel to ${node}"
cmd="ssh -J ${user}@hop.psi.ch ${user}@ra.psi.ch -L ${port}:${node}.psi.ch:8888"
fi
if ${debug} ; then
echo "DEBUG:" $cmd
else
$cmd
fi

0  src/cristallina/SEA_GraphClient.py  Normal file → Executable file

15  src/cristallina/__init__.py  Normal file → Executable file

@@ -15,7 +15,16 @@ except PackageNotFoundError:  # pragma: no cover
 finally:
     del version, PackageNotFoundError
 
-from . import utils
-from . import plot
-from . import analysis
+try:
+    from . import utils
+    from . import plot
+    from . import analysis
+    from . import image
+    from . import channels
+    from . import uscan
+except (ImportError, ModuleNotFoundError) as e:
+    # Handle the case where the package is not installed
+    # or if there are issues with the imports
+    print(f"Error importing modules: {e}")

381  src/cristallina/analysis.py  Normal file → Executable file

@@ -1,6 +1,8 @@
 import re
 from collections import defaultdict
+from pathlib import Path
 from typing import Optional
+import logging
 
 import numpy as np
 from matplotlib import pyplot as plt
@@ -9,13 +11,13 @@
 from sfdata import SFDataFiles, sfdatafile, SFScanInfo
-import joblib
-from joblib import Parallel, delayed, Memory
+from joblib import Memory
 
 from . import utils
+from . import channels
 from .utils import ROI
 
-memory = None
+logger = logging.getLogger(__name__)
 
 
 def setup_cachedirs(pgroup=None, cachedir=None):
@@ -26,11 +28,10 @@ def setup_cachedirs(pgroup=None, cachedir=None):
     If heuristics fail we use "/tmp" as a non-persistent alternative.
     """
-    global memory
 
     if cachedir is not None:
         # explicit directory given, use this choice
         memory = Memory(cachedir, verbose=0, compress=2)
-        return
+        return memory
 
     try:
         if pgroup is None:
@@ -38,34 +39,49 @@ def setup_cachedirs(pgroup=None, cachedir=None):
         else:
             parts = re.split(r"(\d.*)", pgroup)  # ['p', '2343', '']
             pgroup_no = parts[-2]
-        cachedir = f"/das/work/units/cristallina/p{pgroup_no}/cachedir"
+        candidates = [
+            f"/sf/cristallina/data/p{pgroup_no}/work",
+            f"/sf/cristallina/data/p{pgroup_no}/res",
+        ]
+        for cache_parent_dir in candidates:
+            if Path(cache_parent_dir).exists():
+                cachedir = Path(cache_parent_dir) / "cachedir"
+                break
     except KeyError as e:
-        print(e)
+        logger.warning(f"Could not determine p-group due to {e}. Using default cachedir.")
         cachedir = "/das/work/units/cristallina/p19739/cachedir"
 
     try:
         memory = Memory(cachedir, verbose=0, compress=2)
     except PermissionError as e:
+        logger.warning(f"Could not use cachedir {cachedir} due to {e}. Using /tmp instead.")
        cachedir = "/tmp"
         memory = Memory(cachedir, verbose=0, compress=2)
+    return memory
 
 
-setup_cachedirs()
+memory = setup_cachedirs()
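
# A minimal usage sketch of the reworked cache setup, assuming the package is
# importable; the cachedir path is hypothetical. setup_cachedirs() now returns
# a joblib.Memory instance instead of mutating a global.
from cristallina import analysis

mem = analysis.setup_cachedirs(cachedir="/tmp/my_cachedir")  # explicit dir takes precedence
print(mem.location)
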
 
 @memory.cache(ignore=["batch_size"])  # we ignore batch_size for caching purposes
 def perform_image_calculations(
-    fileset,
+    filesets,
     channel="JF16T03V01",
     alignment_channels=None,
     batch_size=10,
     roi: Optional[ROI] = None,
     preview=False,
     operations=["sum"],
+    lower_cutoff_threshold=None,
+    upper_cutoff_threshold=None,
 ):
     """
     Performs one or more calculations ("sum", "mean" or "std") for a given region of interest (roi)
-    for an image channel from a fileset (e.g. "run0352/data/acq0001.*.h5" or step.fnames from a SFScanInfo object).
+    for an image channel from a fileset (e.g. ["run0352/data/acq0001.*.h5"] or step.fnames from a SFScanInfo object).
 
     Allows alignment, i.e. reducing only to a common subset with other channels.
@@ -83,7 +99,7 @@ def perform_image_calculations(
         "std": ["mean", np.std],
     }
 
-    with SFDataFiles(*fileset) as data:
+    with SFDataFiles(*filesets) as data:
         if alignment_channels is not None:
             channels = [channel] + [ch for ch in alignment_channels]
         else:
@@ -106,6 +122,12 @@ def perform_image_calculations(
             else:
                 im_ROI = im[:, roi.rows, roi.cols]
 
+            # use cutoffs to set values outside the thresholds to 0
+            if lower_cutoff_threshold is not None:
+                im_ROI = np.where(im_ROI < lower_cutoff_threshold, 0, im_ROI)
+            if upper_cutoff_threshold is not None:
+                im_ROI = np.where(im_ROI > upper_cutoff_threshold, 0, im_ROI)
+
             # iterate over all operations
             for op in operations:
                 label, func = possible_operations[op]
@@ -122,12 +144,13 @@ def perform_image_calculations(
 
 @memory.cache(ignore=["batch_size"])  # we ignore batch_size for caching purposes
 def sum_images(
-    fileset,
+    filesets,
     channel="JF16T03V01",
     alignment_channels=None,
     batch_size=10,
     roi: Optional[ROI] = None,
     preview=False,
+    lower_cutoff_threshold=None,
 ):
     """
     Sums a given region of interest (roi) for an image channel from a
@@ -144,37 +167,299 @@ def sum_images(
     """
 
     return perform_image_calculations(
-        fileset,
+        filesets,
         channel=channel,
         alignment_channels=alignment_channels,
         batch_size=batch_size,
         roi=roi,
         preview=preview,
         operations=["sum"],
+        lower_cutoff_threshold=lower_cutoff_threshold,
     )
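
# A usage sketch for sum_images on one scan step, following the docstring's
# fileset convention; the scan path and threshold are hypothetical, and the
# exact structure of the return value is not shown in this diff.
from sfdata import SFScanInfo
from cristallina.analysis import sum_images

scan = SFScanInfo("run0352/meta/scan.json")  # hypothetical path
for step in scan:
    result = sum_images(step.fnames, channel="JF16T03V01", lower_cutoff_threshold=2.0)
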
-def get_contrast_images(
-    fileset,
+@memory.cache(ignore=["batch_size"])  # we ignore batch_size for caching purposes
+def perform_image_stack_sum(
+    filesets,
     channel="JF16T03V01",
     alignment_channels=None,
     batch_size=10,
     roi: Optional[ROI] = None,
     preview=False,
+    lower_cutoff_threshold=None,  # in keV
+    upper_cutoff_threshold=None,  # in keV
 ):
     """
-    See perform_image_calculations. Here calculates mean and standard deviation for a given set of images.
+    Performs summation along the pulse dimension for a given region of interest (roi) for an image channel
+    from a fileset (e.g. "run0352/data/acq0001.*.h5" or step.fnames from a SFScanInfo object).
+
+    Allows alignment, i.e. reducing only to a common subset with other channels.
+    Calculations are performed in batches to reduce maximum memory requirements.
+    Preview only applies calculation to first batch and returns.
+
+    Returns: A 2D array with the summed image over all pulses without missing data.
     """
-    return perform_image_calculations(
-        fileset,
-        channel=channel,
-        alignment_channels=alignment_channels,
-        batch_size=batch_size,
-        roi=roi,
-        preview=preview,
-        operations=["mean", "std"],
-    )
+    with SFDataFiles(*filesets) as data:
+        if alignment_channels is not None:
+            channels = [channel] + [ch for ch in alignment_channels]
+        else:
+            channels = [channel]
+
+        subset = data[channels]
+        subset.drop_missing()
+
+        Images = subset[channel]
+
+        # create empty array for stack sum with right shape
+        im = Images[0]
+        if roi is None:
+            im_ROI = im[:]
+        else:
+            im_ROI = im[roi.rows, roi.cols]
+        summed = np.zeros(im_ROI[0].shape)
+
+        for index_slice, im in Images.in_batches(batch_size):
+            if roi is None:
+                im_ROI = im
+            else:
+                im_ROI = im[:, roi.rows, roi.cols]
+
+            if lower_cutoff_threshold is not None:
+                im_ROI = np.where(im_ROI < lower_cutoff_threshold, 0, im_ROI)
+            if upper_cutoff_threshold is not None:
+                im_ROI = np.where(im_ROI > upper_cutoff_threshold, 0, im_ROI)
+
+            summed = summed + np.sum(im_ROI, axis=(0))
+
+            # only process the first batch
+            if preview:
+                break
+
+    return summed
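
# A sketch of the new stack sum on a single fileset; the file name pattern and
# the 7.6 keV threshold are illustrative, not prescribed by this diff.
from matplotlib import pyplot as plt
from cristallina.analysis import perform_image_stack_sum

summed = perform_image_stack_sum(["run0352/data/acq0001.JF16T03V02.h5"],
                                 channel="JF16T03V02", lower_cutoff_threshold=7.6)
plt.imshow(summed, origin="lower")
plt.colorbar()
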
+@memory.cache(ignore=["batch_size"])  # we ignore batch_size for caching purposes
+def perform_image_roi_crop(
+    filesets,
+    channel="JF16T03V01",
+    alignment_channels=None,
+    batch_size=10,
+    roi: ROI = None,
+    preview=False,
+    lower_cutoff=None,
+    upper_cutoff=np.inf,
+):
+    """
+    Cuts out a region of interest (ROI) for an image channel
+    from a fileset (e.g. "run0352/data/acq0001.*.h5" or step.fnames from a SFScanInfo object).
+
+    Drops missing data from output and allows alignment,
+    i.e. reducing only to a common subset with other channels.
+    Calculations are performed in batches to reduce maximum memory requirements.
+    Lower and upper cutoffs allow thresholding the data.
+    Preview only applies calculation to first batch and returns.
+
+    Returns: A 1D array (along the recorded pulses, missing data dropped) of 2D images.
+    Beware though: this can create a rather large array that exceeds available memory.
+
+    TODO: should we create a complete channel here instead of returning `raw` data?
+    """
+    with SFDataFiles(*filesets) as data:
+        if alignment_channels is not None:
+            channels = [channel] + [ch for ch in alignment_channels]
+        else:
+            channels = [channel]
+
+        subset = data[channels]
+        subset.drop_missing()
+
+        Images = subset[channel]
+
+        rois_within_batch = []
+        for image_slice in Images.in_batches(batch_size):
+            index_slice, im = image_slice
+            if roi is None:
+                im_ROI = im
+            else:
+                im_ROI = im[:, roi.rows, roi.cols]
+
+            if lower_cutoff is not None:
+                im_ROI = np.where((im_ROI < lower_cutoff) | (im_ROI > upper_cutoff), 0, im_ROI)
+
+            rois_within_batch.extend(im_ROI)
+
+            # only process the first batch
+            if preview:
+                break
+
+    return np.array(rois_within_batch)
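
# Sketch: the crop returns a (pulses, rows, cols) array, so a scalar I0 per
# acquisition can be reduced from it, mirroring calculate_JF_stacks below.
# The fileset is hypothetical.
import numpy as np
from cristallina.analysis import perform_image_roi_crop

files = ["run0042/data/acq0001.JF20T01V01.h5"]  # hypothetical JF fileset
crop = perform_image_roi_crop(files, channel="JF20T01V01", lower_cutoff=2)
I0_total = np.sum(crop, axis=(0, 1, 2))
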
+@memory.cache()
+def calculate_JF_stacks(
+    scan: SFScanInfo,
+    lower_cutoff_threshold=7.6,  # in keV
+    upper_cutoff_threshold=None,  # in keV
+    exclude_steps: list[int] | None = None,
+    recompute: bool = False,
+    detector: str = "JF16T03V02",
+):
+    """Calculate image stacks for JF detectors in a scan.
+
+    Args:
+        scan (SFScanInfo): The scan object containing scan steps.
+        lower_cutoff_threshold: Threshold to apply to pixel values before summation in keV.
+        upper_cutoff_threshold: Upper threshold to apply to pixel values before summation in keV.
+        exclude_steps: List of step indices to exclude from processing. Defaults to None.
+        recompute: If True, forces recomputation even if cached results are available. Defaults to False.
+        detector: The detector channel to process. Defaults to "JF16T03V02" (JF 1.5M).
+
+    Returns:
+        stacks: List of summed image stacks for each step.
+        I0_norms: List of I0 normalizations for each step.
+    """
+    stacks = []
+    I0_norms = []
+
+    for i, step in enumerate(scan):
+        if exclude_steps is not None and i in exclude_steps:
+            logger.info(f"skipping step {i}")
+            continue
+
+        JF_channels = [channels.JF, channels.JF_8M, channels.JF_I0, channels.GASMONITOR]
+        available_channels = [ch.name for ch in step.channels]
+        selected_channels = [ch for ch in JF_channels if ch in available_channels]
+
+        subset = step[*selected_channels]
+        pids_before = subset.pids.copy()
+        subset.drop_missing()
+        pids_after = subset.pids.copy()
+        if set(pids_before) != set(pids_after):
+            logger.warning(
+                f"Step {i}: dropped {set(pids_before) - set(pids_after)} pulse IDs due to missing data."
+            )
+
+        # we only consider the JF files here
+        files = [f.fname for f in step.files if "JF" in f.fname]
+
+        stack = perform_image_stack_sum(
+            files,
+            channel=detector,
+            lower_cutoff_threshold=lower_cutoff_threshold,
+            upper_cutoff_threshold=upper_cutoff_threshold,
+        )
+        stacks.append(stack)
+
+        # TODO: define roi for I0 detector
+        stack_I0 = perform_image_roi_crop(
+            files,
+            channel=channels.JF_I0,
+            lower_cutoff=2,
+        )
+        I0_norm = np.sum(stack_I0, axis=(0, 1, 2))
+        I0_norms.append(I0_norm)
+
+    return np.array(stacks), np.array(I0_norms)
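
# End-to-end sketch for the per-step stacks with I0 normalization; the run
# path is hypothetical, and the shapes follow from the return values above.
import numpy as np
from sfdata import SFScanInfo
from cristallina.analysis import calculate_JF_stacks

scan = SFScanInfo("run0042/meta/scan.json")  # hypothetical run
stacks, I0_norms = calculate_JF_stacks(scan, lower_cutoff_threshold=7.6, exclude_steps=[0])
normalized = stacks / I0_norms[:, None, None]  # one I0 scalar broadcast per step
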
+@memory.cache(ignore=["batch_size"])  # we ignore batch_size for caching purposes
+def calculate_image_histograms(
+    filesets,
+    channel="JF16T03V01",
+    alignment_channels=None,
+    batch_size=10,
+    roi: Optional[ROI] = None,
+    preview=False,
+    lower_cutoff_threshold=None,
+    bins=None,
+):
+    """
+    Calculates a histogram for a given region of interest (roi)
+    for an image channel from a fileset (e.g. "run0352/data/acq0001.*.h5" or step.fnames from a SFScanInfo object).
+
+    Allows alignment, i.e. reducing only to a common subset with other channels.
+    Calculations are performed in batches to reduce maximum memory requirements.
+    Preview only applies calculation to first batch and returns.
+
+    Returns:
+        (histogram, bins)
+    """
+    with SFDataFiles(*filesets) as data:
+        if alignment_channels is not None:
+            channels = [channel] + [ch for ch in alignment_channels]
+        else:
+            channels = [channel]
+
+        subset = data[channels]
+        subset.drop_missing()
+
+        Images = subset[channel]
+
+        if bins is None:
+            for image_slice in Images.in_batches(batch_size):
+                index_slice, im = image_slice
+                if roi is None:
+                    im_ROI = im
+                else:
+                    im_ROI = im[:, roi.rows, roi.cols]
+                if lower_cutoff_threshold is not None:
+                    im_ROI = np.where(im_ROI < lower_cutoff_threshold, 0, im_ROI)
+                bins = np.histogram_bin_edges(im_ROI.flatten(), bins="auto")
+                # only use the first batch to estimate the bins
+                break
+
+        all_hist = np.zeros(len(bins) - 1)
+        for image_slice in Images.in_batches(batch_size):
+            index_slice, im = image_slice
+            if roi is None:
+                im_ROI = im
+            else:
+                im_ROI = im[:, roi.rows, roi.cols]
+            if lower_cutoff_threshold is not None:
+                im_ROI = np.where(im_ROI < lower_cutoff_threshold, 0, im_ROI)
+
+            hist, _ = np.histogram(im_ROI.flatten(), bins=bins)
+            all_hist += hist
+
+            # only process the first batch
+            if preview:
+                break
+
+    return all_hist, bins
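
# The returned (histogram, bins) pair plots directly with matplotlib; the
# fileset is hypothetical.
from matplotlib import pyplot as plt
from cristallina.analysis import calculate_image_histograms

hist, bins = calculate_image_histograms(["run0352/data/acq0001.JF16T03V01.h5"])
plt.stairs(hist, bins)
plt.yscale("log")
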
@@ -204,7 +489,7 @@ def fit_2d_gaussian(image, roi: Optional[ROI] = None, plot=False):
     z = im.ravel()  # and this also as a 1D
 
     model = lmfit.models.Gaussian2dModel()
-    params = model.guess(z, x, y)
+    params = model.guess(z.astype("float"), x.astype("float"), y.astype("float"))
     result = model.fit(
         z,
         x=x,
@@ -235,7 +520,7 @@ def _plot_2d_gaussian_fit(im, z, model, result):
     """Plot helper function to use the current image data, model and fit result and
     plots them together.
     """
-    from matplotlib import pyplot as plt
     from scipy.interpolate import griddata
 
     len_y, len_x = im.shape
@@ -264,9 +549,7 @@ def _plot_2d_gaussian_fit(im, z, model, result):
     ax = axs[1, 0]
 
     fit = model.func(X, Y, **result.best_values)
-    art = ax.pcolormesh(
-        X, Y, Z - fit, vmin=-0.05 * vmax, vmax=0.05 * vmax, cmap="gray", shading="auto"
-    )
+    art = ax.pcolormesh(X, Y, Z - fit, vmin=-0.05 * vmax, vmax=0.05 * vmax, cmap="gray", shading="auto")
     fig.colorbar(art, ax=ax, label="z", shrink=0.5)
     ax.set_title("Data - Fit")
@@ -401,7 +684,43 @@ def fit_2d_gaussian_rotated(
     center_x = result.params["centerx"].value
     center_y = result.params["centery"].value
 
-    if plot == True:
+    if plot:
         _plot_2d_gaussian_fit(im, z, mod, result)
 
     return center_x, center_y, result
+def fit_1d_gaussian(x, y, use_offset=True, ax=None, print_results=False):
+    """
+    1D-Gaussian fit with optional constant offset using LMFIT.
+    Uses a heuristic guess for initial parameters.
+
+    Returns: lmfit.model.ModelResult
+    """
+    peak = lmfit.models.GaussianModel()
+    offset = lmfit.models.ConstantModel()
+    model = peak + offset
+
+    if use_offset:
+        pars = offset.make_params(c=np.median(y))
+    else:
+        pars = offset.make_params(c=0)
+        pars["c"].vary = False
+
+    pars += peak.guess(y, x, amplitude=(np.max(y) - np.min(y)) / 2)
+
+    result = model.fit(
+        y,
+        pars,
+        x=x,
+    )
+
+    if print_results:
+        print(result.fit_report())
+
+    if ax is not None:
+        ax.plot(x, result.best_fit, label="fit")
+
+    return result
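
# Quick check of fit_1d_gaussian on synthetic data (values arbitrary). lmfit's
# GaussianModel exposes amplitude/center/sigma and ConstantModel the offset c.
import numpy as np
from matplotlib import pyplot as plt
from cristallina.analysis import fit_1d_gaussian

rng = np.random.default_rng(0)
x = np.linspace(-5, 5, 201)
y = 3.0 * np.exp(-0.5 * ((x - 1.0) / 0.8) ** 2) + 0.2 + 0.05 * rng.normal(size=x.size)

fig, ax = plt.subplots()
ax.plot(x, y, ".", label="data")
result = fit_1d_gaussian(x, y, ax=ax, print_results=True)
print(result.params["center"].value)  # expected to be close to 1.0
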

36  src/cristallina/channels.py  Executable file

@@ -0,0 +1,36 @@
# Machine parameters
GASMONITOR = 'SARFE10-PBIG050-EVR0:CALCI'
PULSE_E_AVG = 'SARFE10-PBPG050:PHOTON-ENERGY-PER-PULSE-AVG'
# Event receiver BS channel
EVR = 'SAR-CVME-TIFALL6:EvtSet'
# Jungfrau 1.5M detector
JF = 'JF16T03V02'
# Jungfrau 8M detector
JF_8M = "JF17T16V01"
# JF I0 monitor and stripsel detector
JF_I0 = 'JF20T01V01'
JF_strip = 'JF05T01V01'
# PSSS
PSSS_X = 'SARFE10-PSSS059:SPECTRUM_X'
PSSS_Y = 'SARFE10-PSSS059:SPECTRUM_Y'
PSSS_SUM = 'SARFE10-PSSS059:SPECTRUM_Y_SUM'
PSSS_COM = 'SARFE10-PSSS059:SPECT-COM'
# Beam position monitors
PBPS149_intensity = "SAROP31-PBPS149:INTENSITY_UJ"
PBPS113_intensity = "SAROP31-PBPS113:INTENSITY_UJ"
PBPS053_intensity = "SARFE10-PBPS053:INTENSITY"
# X-ray eye
XEYE = "SARES30-CAMS156-XE:FPICTURE"
# Detector Up/Down stage
TDY = "SARES30-MOBI2:MOT_Z.RBV"
# Monochromator energy
MONO_ENERGY = "SAROP31-ODCC110:MOT_ENY.RBV"
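
# These constants are plain BS channel name strings, intended to be passed
# wherever a channel name is expected, e.g. with sfdata (fileset hypothetical):
from sfdata import SFDataFiles
from cristallina import channels

with SFDataFiles("run0352/data/acq0001.*.h5") as data:
    subset = data[[channels.JF, channels.GASMONITOR]]
    subset.drop_missing()
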

0  src/cristallina/config.py  Normal file → Executable file

532  [new file; filename not shown]
@@ -0,0 +1,532 @@
#### MATPLOTLIBRC FORMAT
## ***************************************************************************
## * LINES *
## ***************************************************************************
## See https://matplotlib.org/stable/api/artist_api.html#module-matplotlib.lines
## for more information on line properties.
#lines.linewidth: 1.5 # line width in points
#lines.linestyle: - # solid line
#lines.color: C0 # has no affect on plot(); see axes.prop_cycle
#lines.marker: None # the default marker
#lines.markerfacecolor: auto # the default marker face color
#lines.markeredgecolor: auto # the default marker edge color
#lines.markeredgewidth: 1.0 # the line width around the marker symbol
#lines.markersize: 6 # marker size, in points
#lines.dash_joinstyle: round # {miter, round, bevel}
#lines.dash_capstyle: butt # {butt, round, projecting}
#lines.solid_joinstyle: round # {miter, round, bevel}
#lines.solid_capstyle: projecting # {butt, round, projecting}
#lines.antialiased: True # render lines in antialiased (no jaggies)
## The three standard dash patterns. These are scaled by the linewidth.
#lines.dashed_pattern: 3.7, 1.6
#lines.dashdot_pattern: 6.4, 1.6, 1, 1.6
#lines.dotted_pattern: 1, 1.65
#lines.scale_dashes: True
#markers.fillstyle: full # {full, left, right, bottom, top, none}
#pcolor.shading: auto
#pcolormesh.snap: True # Whether to snap the mesh to pixel boundaries. This is
# provided solely to allow old test images to remain
# unchanged. Set to False to obtain the previous behavior.
## ***************************************************************************
## * PATCHES *
## ***************************************************************************
## Patches are graphical objects that fill 2D space, like polygons or circles.
## See https://matplotlib.org/stable/api/artist_api.html#module-matplotlib.patches
## for more information on patch properties.
#patch.linewidth: 1.0 # edge width in points.
#patch.facecolor: C0
#patch.edgecolor: black # if forced, or patch is not filled
#patch.force_edgecolor: False # True to always use edgecolor
#patch.antialiased: True # render patches in antialiased (no jaggies)
## ***************************************************************************
## * HATCHES *
## ***************************************************************************
#hatch.color: black
#hatch.linewidth: 1.0
## ***************************************************************************
## * FONT *
## ***************************************************************************
## The font properties used by `text.Text`.
## See https://matplotlib.org/stable/api/font_manager_api.html for more information
## on font properties. The 6 font properties used for font matching are
## given below with their default values.
##
## The font.family property can take either a single or multiple entries of any
## combination of concrete font names (not supported when rendering text with
## usetex) or the following five generic values:
## - 'serif' (e.g., Times),
## - 'sans-serif' (e.g., Helvetica),
## - 'cursive' (e.g., Zapf-Chancery),
## - 'fantasy' (e.g., Western), and
## - 'monospace' (e.g., Courier).
## Each of these values has a corresponding default list of font names
## (font.serif, etc.); the first available font in the list is used. Note that
## for font.serif, font.sans-serif, and font.monospace, the first element of
## the list (a DejaVu font) will always be used because DejaVu is shipped with
## Matplotlib and is thus guaranteed to be available; the other entries are
## left as examples of other possible values.
##
## The font.style property has three values: normal (or roman), italic
## or oblique. The oblique style will be used for italic, if it is not
## present.
##
## The font.variant property has two values: normal or small-caps. For
## TrueType fonts, which are scalable fonts, small-caps is equivalent
## to using a font size of 'smaller', or about 83 % of the current font
## size.
##
## The font.weight property has effectively 13 values: normal, bold,
## bolder, lighter, 100, 200, 300, ..., 900. Normal is the same as
## 400, and bold is 700. bolder and lighter are relative values with
## respect to the current weight.
##
## The font.stretch property has 11 values: ultra-condensed,
## extra-condensed, condensed, semi-condensed, normal, semi-expanded,
## expanded, extra-expanded, ultra-expanded, wider, and narrower. This
## property is not currently implemented.
##
## The font.size property is the default font size for text, given in points.
## 10 pt is the standard value.
##
## Note that font.size controls default text sizes. To configure
## special text sizes tick labels, axes, labels, title, etc., see the rc
## settings for axes and ticks. Special text sizes can be defined
## relative to font.size, using the following values: xx-small, x-small,
## small, medium, large, x-large, xx-large, larger, or smaller
font.family: sans-serif
#font.style: normal
#font.variant: normal
#font.weight: normal
#font.stretch: normal
#font.size: 10.0
#font.serif: DejaVu Serif, Bitstream Vera Serif, Computer Modern Roman, New Century Schoolbook, Century Schoolbook L, Utopia, ITC Bookman, Bookman, Nimbus Roman No9 L, Times New Roman, Times, Palatino, Charter, serif
font.sans-serif: Roboto, DejaVu Sans, Bitstream Vera Sans, Computer Modern Sans Serif, Lucida Grande, Verdana, Geneva, Lucid, Arial, Helvetica, Avant Garde, sans-serif
#font.cursive: Apple Chancery, Textile, Zapf Chancery, Sand, Script MT, Felipa, Comic Neue, Comic Sans MS, cursive
#font.fantasy: Chicago, Charcoal, Impact, Western, xkcd script, fantasy
#font.monospace: DejaVu Sans Mono, Bitstream Vera Sans Mono, Computer Modern Typewriter, Andale Mono, Nimbus Mono L, Courier New, Courier, Fixed, Terminal, monospace
## ***************************************************************************
## * TEXT *
## ***************************************************************************
## The text properties used by `text.Text`.
## See https://matplotlib.org/stable/api/artist_api.html#module-matplotlib.text
## for more information on text properties
#text.color: black
## FreeType hinting flag ("foo" corresponds to FT_LOAD_FOO); may be one of the
## following (Proprietary Matplotlib-specific synonyms are given in parentheses,
## but their use is discouraged):
## - default: Use the font's native hinter if possible, else FreeType's auto-hinter.
## ("either" is a synonym).
## - no_autohint: Use the font's native hinter if possible, else don't hint.
## ("native" is a synonym.)
## - force_autohint: Use FreeType's auto-hinter. ("auto" is a synonym.)
## - no_hinting: Disable hinting. ("none" is a synonym.)
#text.hinting: force_autohint
#text.hinting_factor: 8 # Specifies the amount of softness for hinting in the
# horizontal direction. A value of 1 will hint to full
# pixels. A value of 2 will hint to half pixels etc.
#text.kerning_factor: 0 # Specifies the scaling factor for kerning values. This
# is provided solely to allow old test images to remain
# unchanged. Set to 6 to obtain previous behavior.
# Values other than 0 or 6 have no defined meaning.
#text.antialiased: True # If True (default), the text will be antialiased.
# This only affects raster outputs.
#text.parse_math: True # Use mathtext if there is an even number of unescaped
# dollar signs.
## ***************************************************************************
## * LaTeX *
## ***************************************************************************
## For more information on LaTeX properties, see
## https://matplotlib.org/stable/users/explain/text/usetex.html
#text.usetex: False # use latex for all text handling. The following fonts
# are supported through the usual rc parameter settings:
# new century schoolbook, bookman, times, palatino,
# zapf chancery, charter, serif, sans-serif, helvetica,
# avant garde, courier, monospace, computer modern roman,
# computer modern sans serif, computer modern typewriter
#text.latex.preamble: # IMPROPER USE OF THIS FEATURE WILL LEAD TO LATEX FAILURES
# AND IS THEREFORE UNSUPPORTED. PLEASE DO NOT ASK FOR HELP
# IF THIS FEATURE DOES NOT DO WHAT YOU EXPECT IT TO.
# text.latex.preamble is a single line of LaTeX code that
# will be passed on to the LaTeX system. It may contain
# any code that is valid for the LaTeX "preamble", i.e.
# between the "\documentclass" and "\begin{document}"
# statements.
# Note that it has to be put on a single line, which may
# become quite long.
# The following packages are always loaded with usetex,
# so beware of package collisions:
# geometry, inputenc, type1cm.
# PostScript (PSNFSS) font packages may also be
# loaded, depending on your font settings.
## The following settings allow you to select the fonts in math mode.
#mathtext.fontset: dejavusans # Should be 'dejavusans' (default),
# 'dejavuserif', 'cm' (Computer Modern), 'stix',
# 'stixsans' or 'custom'
## "mathtext.fontset: custom" is defined by the mathtext.bf, .cal, .it, ...
## settings which map a TeX font name to a fontconfig font pattern. (These
## settings are not used for other font sets.)
#mathtext.bf: sans:bold
#mathtext.bfit: sans:italic:bold
#mathtext.cal: cursive
#mathtext.it: sans:italic
#mathtext.rm: sans
#mathtext.sf: sans
#mathtext.tt: monospace
#mathtext.fallback: cm # Select fallback font from ['cm' (Computer Modern), 'stix'
# 'stixsans'] when a symbol cannot be found in one of the
# custom math fonts. Select 'None' to not perform fallback
# and replace the missing character by a dummy symbol.
#mathtext.default: it # The default font to use for math.
# Can be any of the LaTeX font names, including
# the special name "regular" for the same font
# used in regular text.
## ***************************************************************************
## * AXES *
## ***************************************************************************
## Following are default face and edge colors, default tick sizes,
## default font sizes for tick labels, and so on. See
## https://matplotlib.org/stable/api/axes_api.html#module-matplotlib.axes
#axes.facecolor: white # axes background color
#axes.edgecolor: black # axes edge color
#axes.linewidth: 0.8 # edge line width
#axes.grid: False # display grid or not
#axes.grid.axis: both # which axis the grid should apply to
#axes.grid.which: major # grid lines at {major, minor, both} ticks
#axes.titlelocation: center # alignment of the title: {left, right, center}
#axes.titlesize: large # font size of the axes title
#axes.titleweight: normal # font weight of title
#axes.titlecolor: auto # color of the axes title, auto falls back to
# text.color as default value
#axes.titley: None # position title (axes relative units). None implies auto
#axes.titlepad: 6.0 # pad between axes and title in points
#axes.labelsize: medium # font size of the x and y labels
#axes.labelpad: 4.0 # space between label and axis
#axes.labelweight: normal # weight of the x and y labels
#axes.labelcolor: black
#axes.axisbelow: line # draw axis gridlines and ticks:
# - below patches (True)
# - above patches but below lines ('line')
# - above all (False)
#axes.formatter.limits: -5, 6 # use scientific notation if log10
# of the axis range is smaller than the
# first or larger than the second
#axes.formatter.use_locale: False # When True, format tick labels
# according to the user's locale.
# For example, use ',' as a decimal
# separator in the fr_FR locale.
#axes.formatter.use_mathtext: False # When True, use mathtext for scientific
# notation.
#axes.formatter.min_exponent: 0 # minimum exponent to format in scientific notation
#axes.formatter.useoffset: True # If True, the tick label formatter
# will default to labeling ticks relative
# to an offset when the data range is
# small compared to the minimum absolute
# value of the data.
#axes.formatter.offset_threshold: 4 # When useoffset is True, the offset
# will be used when it can remove
# at least this number of significant
# digits from tick labels.
#axes.spines.left: True # display axis spines
#axes.spines.bottom: True
#axes.spines.top: True
#axes.spines.right: True
#axes.unicode_minus: True # use Unicode for the minus symbol rather than hyphen. See
# https://en.wikipedia.org/wiki/Plus_and_minus_signs#Character_codes
#axes.prop_cycle: cycler('color', ['1f77b4', 'ff7f0e', '2ca02c', 'd62728', '9467bd', '8c564b', 'e377c2', '7f7f7f', 'bcbd22', '17becf'])
# color cycle for plot lines as list of string color specs:
# single letter, long name, or web-style hex
# As opposed to all other parameters in this file, the color
# values must be enclosed in quotes for this parameter,
# e.g. '1f77b4', instead of 1f77b4.
# See also https://matplotlib.org/stable/users/explain/artists/color_cycle.html
# for more details on prop_cycle usage.
#axes.xmargin: .05 # x margin. See `axes.Axes.margins`
#axes.ymargin: .05 # y margin. See `axes.Axes.margins`
#axes.zmargin: .05 # z margin. See `axes.Axes.margins`
#axes.autolimit_mode: data # If "data", use axes.xmargin and axes.ymargin as is.
# If "round_numbers", after application of margins, axis
# limits are further expanded to the nearest "round" number.
#polaraxes.grid: True # display grid on polar axes
#axes3d.grid: True # display grid on 3D axes
#axes3d.xaxis.panecolor: (0.95, 0.95, 0.95, 0.5) # background pane on 3D axes
#axes3d.yaxis.panecolor: (0.90, 0.90, 0.90, 0.5) # background pane on 3D axes
#axes3d.zaxis.panecolor: (0.925, 0.925, 0.925, 0.5) # background pane on 3D axes
## ***************************************************************************
## * AXIS *
## ***************************************************************************
#xaxis.labellocation: center # alignment of the xaxis label: {left, right, center}
#yaxis.labellocation: center # alignment of the yaxis label: {bottom, top, center}
## ***************************************************************************
## * DATES *
## ***************************************************************************
## These control the default format strings used in AutoDateFormatter.
## Any valid format datetime format string can be used (see the python
## `datetime` for details). For example, by using:
## - '%x' will use the locale date representation
## - '%X' will use the locale time representation
## - '%c' will use the full locale datetime representation
## These values map to the scales:
## {'year': 365, 'month': 30, 'day': 1, 'hour': 1/24, 'minute': 1 / (24 * 60)}
#date.autoformatter.year: %Y
#date.autoformatter.month: %Y-%m
#date.autoformatter.day: %Y-%m-%d
#date.autoformatter.hour: %m-%d %H
#date.autoformatter.minute: %d %H:%M
#date.autoformatter.second: %H:%M:%S
#date.autoformatter.microsecond: %M:%S.%f
## The reference date for Matplotlib's internal date representation
## See https://matplotlib.org/stable/gallery/ticks/date_precision_and_epochs.html
#date.epoch: 1970-01-01T00:00:00
## 'auto', 'concise':
#date.converter: auto
## For auto converter whether to use interval_multiples:
#date.interval_multiples: True
## ***************************************************************************
## * TICKS *
## ***************************************************************************
## See https://matplotlib.org/stable/api/axis_api.html#matplotlib.axis.Tick
#xtick.top: False # draw ticks on the top side
#xtick.bottom: True # draw ticks on the bottom side
#xtick.labeltop: True # draw label on the top
#xtick.labelbottom: True # draw label on the bottom
#xtick.major.size: 3.5 # major tick size in points
#xtick.minor.size: 2 # minor tick size in points
#xtick.major.width: 0.8 # major tick width in points
#xtick.minor.width: 0.6 # minor tick width in points
#xtick.major.pad: 3.5 # distance to major tick label in points
#xtick.minor.pad: 3.4 # distance to the minor tick label in points
#xtick.color: black # color of the ticks
#xtick.labelcolor: inherit # color of the tick labels or inherit from xtick.color
#xtick.labelsize: medium # font size of the tick labels
xtick.direction: in # direction: {in, out, inout}
#xtick.minor.visible: False # visibility of minor ticks on x-axis
xtick.major.top: True # draw x axis top major ticks
xtick.major.bottom: True # draw x axis bottom major ticks
#xtick.minor.top: True # draw x axis top minor ticks
#xtick.minor.bottom: True # draw x axis bottom minor ticks
#xtick.minor.ndivs: auto # number of minor ticks between the major ticks on x-axis
#xtick.alignment: center # alignment of xticks
ytick.left: True # draw ticks on the left side
ytick.right: True # draw ticks on the right side
#ytick.labelleft: True # draw tick labels on the left side
#ytick.labelright: False # draw tick labels on the right side
#ytick.major.size: 3.5 # major tick size in points
#ytick.minor.size: 2 # minor tick size in points
#ytick.major.width: 0.8 # major tick width in points
#ytick.minor.width: 0.6 # minor tick width in points
#ytick.major.pad: 3.5 # distance to major tick label in points
#ytick.minor.pad: 3.4 # distance to the minor tick label in points
#ytick.color: black # color of the ticks
#ytick.labelcolor: inherit # color of the tick labels or inherit from ytick.color
#ytick.labelsize: medium # font size of the tick labels
ytick.direction: in # direction: {in, out, inout}
#ytick.minor.visible: False # visibility of minor ticks on y-axis
#ytick.major.left: True # draw y axis left major ticks
#ytick.major.right: True # draw y axis right major ticks
#ytick.minor.left: True # draw y axis left minor ticks
#ytick.minor.right: True # draw y axis right minor ticks
#ytick.minor.ndivs: auto # number of minor ticks between the major ticks on y-axis
#ytick.alignment: center_baseline # alignment of yticks
## ***************************************************************************
## * GRIDS *
## ***************************************************************************
#grid.color: "#b0b0b0" # grid color
#grid.linestyle: - # solid
#grid.linewidth: 0.8 # in points
#grid.alpha: 1.0 # transparency, between 0.0 and 1.0
## ***************************************************************************
## * LEGEND *
## ***************************************************************************
#legend.loc: best
#legend.frameon: True # if True, draw the legend on a background patch
legend.framealpha: 1 # legend patch transparency
#legend.facecolor: inherit # inherit from axes.facecolor; or color spec
#legend.edgecolor: 0.8 # background patch boundary color
#legend.fancybox: True # if True, use a rounded box for the
# legend background, else a rectangle
#legend.shadow: False # if True, give background a shadow effect
#legend.numpoints: 1 # the number of marker points in the legend line
#legend.scatterpoints: 1 # number of scatter points
#legend.markerscale: 1.0 # the relative size of legend markers vs. original
legend.fontsize: small
#legend.labelcolor: None
#legend.title_fontsize: None # None sets to the same as the default axes.
## Dimensions as fraction of font size:
#legend.borderpad: 0.4 # border whitespace
#legend.labelspacing: 0.5 # the vertical space between the legend entries
#legend.handlelength: 2.0 # the length of the legend lines
#legend.handleheight: 0.7 # the height of the legend handle
#legend.handletextpad: 0.8 # the space between the legend line and legend text
#legend.borderaxespad: 0.5 # the border between the axes and legend edge
#legend.columnspacing: 2.0 # column separation
## ***************************************************************************
## * FIGURE *
## ***************************************************************************
## See https://matplotlib.org/stable/api/figure_api.html#matplotlib.figure.Figure
#figure.titlesize: large # size of the figure title (``Figure.suptitle()``)
#figure.titleweight: normal # weight of the figure title
#figure.labelsize: large # size of the figure label (``Figure.sup[x|y]label()``)
#figure.labelweight: normal # weight of the figure label
#figure.figsize: 6.4, 4.8 # figure size in inches
figure.dpi: 110 # figure dots per inch
#figure.facecolor: white # figure face color
#figure.edgecolor: white # figure edge color
#figure.frameon: True # enable figure frame
#figure.max_open_warning: 20 # The maximum number of figures to open through
# the pyplot interface before emitting a warning.
# If less than one this feature is disabled.
#figure.raise_window : True # Raise the GUI window to front when show() is called.
## The figure subplot parameters. All dimensions are a fraction of the figure width and height.
#figure.subplot.left: 0.125 # the left side of the subplots of the figure
#figure.subplot.right: 0.9 # the right side of the subplots of the figure
#figure.subplot.bottom: 0.11 # the bottom of the subplots of the figure
#figure.subplot.top: 0.88 # the top of the subplots of the figure
#figure.subplot.wspace: 0.2 # the amount of width reserved for space between subplots,
# expressed as a fraction of the average axis width
#figure.subplot.hspace: 0.2 # the amount of height reserved for space between subplots,
# expressed as a fraction of the average axis height
## Figure layout
#figure.autolayout: False # When True, automatically adjust subplot
# parameters to make the plot fit the figure
# using `tight_layout`
figure.constrained_layout.use: True # When True, automatically make plot
# elements fit on the figure. (Not
# compatible with `autolayout`, above).
## Padding (in inches) around axes; defaults to 3/72 inches, i.e. 3 points.
#figure.constrained_layout.h_pad: 0.04167
#figure.constrained_layout.w_pad: 0.04167
## Spacing between subplots, relative to the subplot sizes. Much smaller than for
## tight_layout (figure.subplot.hspace, figure.subplot.wspace) as constrained_layout
## already takes surrounding texts (titles, labels, # ticklabels) into account.
#figure.constrained_layout.hspace: 0.02
#figure.constrained_layout.wspace: 0.02
## ***************************************************************************
## * IMAGES *
## ***************************************************************************
#image.aspect: equal # {equal, auto} or a number
#image.interpolation: antialiased # see help(imshow) for options
#image.cmap: viridis # A colormap name (plasma, magma, etc.)
#image.lut: 256 # the size of the colormap lookup table
#image.origin: upper # {lower, upper}
#image.resample: True
#image.composite_image: True # When True, all the images on a set of axes are
# combined into a single composite image before
# saving a figure as a vector graphics file,
# such as a PDF.
## ***************************************************************************
## * CONTOUR PLOTS *
## ***************************************************************************
#contour.negative_linestyle: dashed # string or on-off ink sequence
#contour.corner_mask: True # {True, False}
#contour.linewidth: None # {float, None} Size of the contour line
# widths. If set to None, it falls back to
# `line.linewidth`.
#contour.algorithm: mpl2014 # {mpl2005, mpl2014, serial, threaded}
## ***************************************************************************
## * ERRORBAR PLOTS *
## ***************************************************************************
#errorbar.capsize: 0 # length of end cap on error bars in pixels
## ***************************************************************************
## * HISTOGRAM PLOTS *
## ***************************************************************************
#hist.bins: 10 # The default number of histogram bins or 'auto'.
## ***************************************************************************
## * SCATTER PLOTS *
## ***************************************************************************
#scatter.marker: o # The default marker type for scatter plots.
#scatter.edgecolors: face # The default edge colors for scatter plots.
## ***************************************************************************
## * PATHS *
## ***************************************************************************
#path.simplify: True # When True, simplify paths by removing "invisible"
# points to reduce file size and increase rendering
# speed
#path.simplify_threshold: 0.111111111111 # The threshold of similarity below
# which vertices will be removed in
# the simplification process.
#path.snap: True # When True, rectilinear axis-aligned paths will be snapped
# to the nearest pixel when certain criteria are met.
# When False, paths will never be snapped.
#path.sketch: None # May be None, or a 3-tuple of the form:
# (scale, length, randomness).
# - *scale* is the amplitude of the wiggle
# perpendicular to the line (in pixels).
# - *length* is the length of the wiggle along the
# line (in pixels).
# - *randomness* is the factor by which the length is
# randomly scaled.
#path.effects:
## ***************************************************************************
## * SAVING FIGURES *
## ***************************************************************************
## The default savefig parameters can be different from the display parameters
## e.g., you may want a higher resolution, or to make the figure
## background white
#savefig.dpi: figure # figure dots per inch or 'figure'
#savefig.facecolor: auto # figure face color when saving
#savefig.edgecolor: auto # figure edge color when saving
#savefig.format: png # {png, ps, pdf, svg}
#savefig.bbox: standard # {tight, standard}
# 'tight' is incompatible with generating frames
# for animation
#savefig.pad_inches: 0.1 # padding to be used, when bbox is set to 'tight'
#savefig.directory: ~ # default directory in savefig dialog, gets updated after
# interactive saves, unless set to the empty string (i.e.
# the current directory); use '.' to start at the current
# directory but update after interactive saves
#savefig.transparent: False # whether figures are saved with a transparent
# background by default
#savefig.orientation: portrait # orientation of saved figure, for PostScript output only

src/cristallina/image.py Executable file

@@ -0,0 +1,46 @@
import numpy as np
import matplotlib
from matplotlib import pyplot as plt
from skimage import exposure
from skimage.filters import rank
from skimage.morphology import disk


def plot_image(image, norm='linear', cmap=plt.cm.viridis, title="Image", ax=None):
    """Plots an image (array-like or PIL image) using some default settings."""
    if ax is None:
        fig, ax = plt.subplots(constrained_layout=True, dpi=150)
    ax.imshow(image, origin='lower', cmap=cmap, norm=norm)
    ax.set_title(title)


def enhance_image(image, algorithm='autolevel', radius=5):
    """Enhances a given image (2d array) using one of the
    'equalize_hist', 'entropy', 'autolevel' or 'global_equalize'
    algorithms with a given (pixel) radius.
    """
    arr_norm = np.clip(image, 0, np.max(image))
    arr_norm = arr_norm / (np.max(arr_norm) - np.min(arr_norm))
    if algorithm == 'equalize_hist':
        img_algo = exposure.equalize_adapthist(arr_norm, kernel_size=radius, clip_limit=0.99)
    elif algorithm == 'entropy':
        img_algo = rank.entropy(arr_norm, footprint=disk(radius * 2))
    elif algorithm == 'autolevel':
        img_algo = rank.autolevel(arr_norm, disk(radius * 2))
    elif algorithm == 'global_equalize':
        img_algo = rank.equalize(arr_norm, footprint=disk(radius * 2))
    else:
        raise ValueError(f"Unknown algorithm: {algorithm}")
    return img_algo
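

# Usage sketch (illustrative, not part of the original module): enhance a synthetic
# image and display it. The random test data is made up.
if __name__ == "__main__":
    test_img = np.random.poisson(5, size=(256, 256)).astype(float)
    enhanced = enhance_image(test_img, algorithm='equalize_hist', radius=8)
    plot_image(enhanced, title="Locally equalized test image")
    plt.show()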

src/cristallina/jupyter_helper.py

@@ -0,0 +1,60 @@
""" Helper module to address https://github.com/jupyterlab/jupyter-collaboration/issues/49 and similar issues.
To use:
jupyter lab --YDocExtension.ystore_class=cristallina.jupyter_helper.MySQLiteYStore
In 2.0.x this is replaced by:
# The Store class used for storing Y updates (default: SQLiteYStore).
jupyter lab --YDocExtension.ystore_class=jupyter_collaboration.stores.FileYStore
This should work on the local consoles and also on RA (yet untested).
"""
from packaging.version import Version
import random
import os
import jupyter_collaboration
if Version(jupyter_collaboration.__version__) < Version("1.9"):
from ypy_websocket.ystore import SQLiteYStore
elif Version(jupyter_collaboration.__version__) >= Version("2.0") and Version(jupyter_collaboration.__version__) < Version("3.0a1"):
# approach for >=2.0 but smaller than 3.0
from jupyter_collaboration.stores import SQLiteYStore
elif Version(jupyter_collaboration.__version__) >= Version("3.0a1"):
# approach for >=3.0
from jupyter_server_ydoc.stores import SQLiteYStore
class MySQLiteYStore(SQLiteYStore):
"""
Custom SQL location for jupyter collaboration to avoid NFS locking issues.
"""
suffix = random.randint(int(1E9), int(9E9))
if os.path.isdir('/scratch'):
# We are on RA and should use the scratch space here
db_path = f"/scratch/ystore_{suffix}.db"
else:
db_path = f"/tmp/ystore_{suffix}.db"
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self.db_path = MySQLiteYStore.db_path
print(f"Using custom SQLiteYStore at {self.db_path}")
self.log.info(f"Using custom SQLiteYStore with path {self.db_path}")
if self.document_ttl is not None:
print(f"Using database_ttl: {self.document_ttl}")
self.log.info(f"Using database_ttl: {self.document_ttl}")
self.document_ttl = int(self.document_ttl)
def __del__(self):
# Delete the temporary database file upon shutdown
try:
if os.path.exists(self.db_path):
os.remove(self.db_path)
except OSError as e:
self.log.warning(f"Error when removing temporary database: {e.strerror}")

src/cristallina/names.py Executable file

@@ -0,0 +1,18 @@
### Channels for slic
GAS_MONITOR = 'SARFE10-PBIG050-EVR0:CALCI'
PDIM113_INT = 'SAROP31-PBPS113:Lnk9Ch0-PP_VAL_PD4'
PSCD153_INT = 'SAROP31-PBPS149:Lnk9Ch0-PP_VAL_PD4'
PBPS053_INT = 'SARFE10-PBPS053:INTENSITY'
PBPS053_XPOS = 'SARFE10-PBPS053:XPOS'
PBPS053_YPOS = 'SARFE10-PBPS053:YPOS'
PBPS113_INT = 'SAROP31-PBPS113:INTENSITY'
PBPS113_XPOS = 'SAROP31-PBPS113:XPOS'
PBPS113_YPOS = 'SAROP31-PBPS113:YPOS'
PBPS149_INT = 'SAROP31-PBPS149:INTENSITY'
PBPS149_XPOS = 'SAROP31-PBPS149:XPOS'
PBPS149_YPOS = 'SAROP31-PBPS149:YPOS'
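
# Usage sketch (illustrative; the pgroup path below is a made-up placeholder):
#   from sfdata import SFDataFiles
#   from cristallina.names import PBPS053_INT, PBPS053_XPOS
#
#   with SFDataFiles("/sf/cristallina/data/pXXXXX/raw/run0001/data/acq0001.*.h5") as data:
#       i0 = data[PBPS053_INT].data    # per-pulse intensity
#       x0 = data[PBPS053_XPOS].data   # per-pulse horizontal beam position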

src/cristallina/p22478.py

@@ -0,0 +1,36 @@
import os
import json
import re
import time
import warnings
from pathlib import Path
from collections import defaultdict, deque
from copy import copy

import scipy
import numpy as np
import pandas as pd
from tqdm.notebook import tqdm
from matplotlib import pyplot as plt
import matplotlib as mpl
from loguru import logger
from joblib import Parallel, delayed, Memory

import sfdata
from sfdata import SFProcFile, SFDataFile, SFDataFiles, SFScanInfo

import cristallina.utils as cu
import cristallina as cr
from cristallina.uscan import UnfinishedScanInfo

memory = Memory(location="/sf/cristallina/data/p22478/work/joblib", compress=2, verbose=2)

pgroup = "p22478"

print("p22478 module imported.")

src/cristallina/p22760.py

@@ -0,0 +1,69 @@
import os
import json
import re
import time
import warnings
from pathlib import Path
from collections import defaultdict, deque
from copy import copy

import scipy
import scipy.interpolate
import numpy as np
import pandas as pd
from tqdm.notebook import tqdm
from matplotlib import pyplot as plt
import matplotlib as mpl
from loguru import logger
from joblib import Parallel, delayed, Memory

import sfdata
from sfdata import SFProcFile, SFDataFile, SFDataFiles, SFScanInfo

import cristallina.utils as cu
import cristallina as cr
from cristallina.uscan import UnfinishedScanInfo

memory = Memory(location="/sf/cristallina/data/p22760/work/joblib", compress=2, verbose=2)

pgroup = "p22760"

print("p22760 module imported.")


def get_theta_prediction(B_field):
    """Get the predicted theta angles from the B field using linear interpolation.

    Args:
        B_field (float): Magnetic field value.

    Returns:
        (theta_left, theta_right): predicted angles for the left and right branch.
    """
    B = np.array([0.    , 0.    , 0.05  , 0.1   , 0.15  , 0.2   , 0.25  , 0.3   ,
                  0.35  , 0.4   , 0.5   , 0.525 , 0.53  , 0.55  , 0.555 , 0.56  ,
                  0.565 , 0.5675, 0.57  , 0.5725, 0.575 , 0.6   ])
    left = np.array([-16.72402197, -16.72592595, -16.72266943, -16.71780642,
                     -16.70710748, -16.68879129, -16.67129814, -16.64430405,
                     -16.61258093, -16.57162955, -16.43958512, -16.36767873,
                     -16.35235521, -16.20000072, -16.14585335, -16.14986136,
                     -16.14953497, -16.15033841, -16.14907281, -16.14883662,
                     -16.14936827, -16.14544489])
    right = np.array([-15.82641889, -15.82810495, -15.82810142, -15.83362565,
                      -15.84191596, -15.85030174, -15.86875095, -15.88716295,
                      -15.91257856, -15.94358668, -16.0436225 , -16.07455302,
                      -16.08374928, -16.13199655, -16.14585335, -16.14986136,
                      -16.14953497, -16.15033841, -16.14907281, -16.14883662,
                      -16.14936827, -16.14544489])
    interpolation_right = scipy.interpolate.interp1d(B, right, kind='linear')
    interpolation_left = scipy.interpolate.interp1d(B, left, kind='linear')
    theta_right = interpolation_right(B_field)
    theta_left = interpolation_left(B_field)
    return theta_left, theta_right
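

# Usage sketch (illustrative): evaluate the calibration at an intermediate field.
# B_field must lie within the tabulated range [0, 0.6].
if __name__ == "__main__":
    theta_left, theta_right = get_theta_prediction(0.45)
    print(f"predicted theta, left: {theta_left:.3f}, right: {theta_right:.3f}")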


src/cristallina/plot.py Normal file → Executable file

@@ -1,17 +1,17 @@
from collections import defaultdict

import matplotlib
import matplotlib as mpl
from matplotlib import pyplot as plt

import warnings

# because of https://github.com/kornia/kornia/issues/1425
warnings.simplefilter("ignore", DeprecationWarning)

import numpy as np
from tqdm import tqdm
from matplotlib import patches
from pathlib import Path

from sfdata import SFDataFiles, sfdatafile, SFScanInfo
@@ -20,29 +20,37 @@ import jungfrau_utils as ju
from . import utils
from .utils import ROI

# setup style sheet
plt.style.use("cristallina.cristallina_style")


def ju_patch_less_verbose(ju_module):
    """Quick monkey patch to suppress verbose messages from the gain & pedestal file searcher.
    No longer required for newer versions of ju."""

    if hasattr(ju_module, "swissfel_helpers"):
        ju_module.swissfel_helpers._locate_gain_file = ju_module.swissfel_helpers.locate_gain_file
        ju_module.swissfel_helpers._locate_pedestal_file = ju_module.swissfel_helpers.locate_pedestal_file

        def less_verbose_gain(*args, **kwargs):
            kwargs["verbose"] = False
            return ju_module.swissfel_helpers._locate_gain_file(*args, **kwargs)

        def less_verbose_pedestal(*args, **kwargs):
            kwargs["verbose"] = False
            return ju_module.swissfel_helpers._locate_pedestal_file(*args, **kwargs)

        # ju_module.swissfel_helpers.locate_gain_file = less_verbose_gain
        # ju_module.swissfel_helpers.locate_pedestal_file = less_verbose_pedestal

        ju_module.file_adapter.locate_gain_file = less_verbose_gain
        ju_module.file_adapter.locate_pedestal_file = less_verbose_pedestal


ju_patch_less_verbose(ju)


def plot_correlation(x, y, ax=None, **ax_kwargs):
    """
    Plots the correlation of x and y in a normalized scatterplot.
@@ -59,7 +67,7 @@ def plot_correlation(x, y, ax=None, **ax_kwargs):
    ynorm = (y - np.mean(y)) / ystd
    n = len(y)
    r = 1 / n * sum(xnorm * ynorm)

    if ax is None:
@@ -73,10 +81,11 @@ def plot_correlation(x, y, ax=None, **ax_kwargs):
    return ax, r


def plot_channel(data: SFDataFiles, channel_name, ax=None):
    """
    Plots a given channel from an SFDataFiles object.

    Optionally: a matplotlib axis to plot into
    """
@@ -95,7 +104,6 @@ def plot_channel(data : SFDataFiles, channel_name, ax=None):
def axis_styling(ax, channel_name, description):
    ax.set_title(channel_name)
    # ax.set_xlabel('x')
    # ax.set_ylabel('a.u.')
@@ -110,7 +118,7 @@ def axis_styling(ax, channel_name, description):
    )


def plot_1d_channel(data: SFDataFiles, channel_name, ax=None):
    """
    Plots channel data for a channel that contains a single numeric value per pulse.
    """
@@ -131,7 +139,7 @@ def plot_1d_channel(data : SFDataFiles, channel_name, ax=None):
    axis_styling(ax, channel_name, description)


def plot_2d_channel(data: SFDataFiles, channel_name, ax=None):
    """
    Plots channel data for a channel that contains a 1d array of numeric values per pulse.
    """
@@ -153,25 +161,39 @@ def plot_2d_channel(data : SFDataFiles, channel_name, ax=None):
    axis_styling(ax, channel_name, description)


def plot_detector_image(
    image_data,
    title=None,
    comment=None,
    ax=None,
    rois=None,
    norms=None,
    log_colorscale=False,
    show_legend=True,
    ax_colormap=matplotlib.colormaps["viridis"],
    **fig_kw,
):
    """
    Plots channel data for a channel that contains an image (2d array) of numeric values per pulse.

    Optional:
    - rois: draw a rectangular patch for the given roi(s)
    - norms: [min, max] values for colormap
    - log_colorscale: True for a logarithmic colormap
    - title: Title of the plot
    - show_legend: True if the legend box should be drawn
    - ax_colormap: a matplotlib colormap (viridis by default)
    """
    im = image_data

    def log_transform(z):
        return np.log(np.clip(z, 1e-12, np.max(z)))

    if log_colorscale:
        im = log_transform(im)

    if ax is None:
        fig, ax = plt.subplots(constrained_layout=True, **fig_kw)

    std = im.std()
    mean = im.mean()
@@ -181,7 +203,8 @@ def plot_image_channel(data : SFDataFiles, channel_name, pulse=0, ax=None, rois=
    else:
        norm = matplotlib.colors.Normalize(vmin=norms[0], vmax=norms[1])

    ax.imshow(im, norm=norm, cmap=ax_colormap)
    ax.invert_yaxis()

    if rois is not None:
@@ -189,19 +212,42 @@ def plot_image_channel(data : SFDataFiles, channel_name, pulse=0, ax=None, rois=
        for i, roi in enumerate(rois):
            # Create a rectangle with ([bottom left corner coordinates], width, height)
            rect = patches.Rectangle(
                [roi.left, roi.bottom],
                roi.width,
                roi.height,
                linewidth=2,
                edgecolor=f"C{i}",
                facecolor="none",
                label=roi.name,
            )
            ax.add_patch(rect)

    if comment is not None:
        description = f"{comment}\nmean: {mean:.2e},\nstd: {std:.2e}"
    else:
        description = f"mean: {mean:.2e},\nstd: {std:.2e}"
    if not show_legend:
        description = ""
    axis_styling(ax, title, description)


def plot_image_channel(data: SFDataFiles, channel_name, pulse=0, ax=None, rois=None, norms=None, log_colorscale=False):
    """
    Plots channel data for a channel that contains an image (2d array) of numeric values per pulse.

    Optional:
    - rois: draw a rectangular patch for the given roi(s)
    - norms: [min, max] values for colormap
    - log_colorscale: True for a logarithmic colormap
    """
    image_data = data[channel_name][pulse]
    plot_detector_image(image_data, title=channel_name, ax=ax, rois=rois, norms=norms, log_colorscale=log_colorscale)


def plot_spectrum_channel(data: SFDataFiles, channel_name_x, channel_name_y, average=True, pulse=0, ax=None):
    """
    Plots channel data for two channels where the first is taken as the (constant) x-axis
    and the second as the y-axis (here we take by default the mean over the individual pulses).
@@ -217,12 +263,45 @@ def plot_spectrum_channel(data : SFDataFiles, channel_name_x, channel_name_y, av
        y_data = mean_over_frames
    else:
        y_data = data[channel_name_y].data[pulse]

    if ax is None:
        fig, ax = plt.subplots(constrained_layout=True)

    ax.plot(data[channel_name_x].data[0], y_data)

    description = None  # f"mean: {mean:.2e},\nstd: {std:.2e}"
    ax.set_xlabel(channel_name_x)
    axis_styling(ax, channel_name_y, description)


def line_plot_with_colorbar(xs, ys, colors, cmap=plt.cm.viridis,
                            markers='o', markersize=6, alpha=1,
                            title=None, xlabel=None, ylabel=None, cbar_label=None,
                            **fig_kw):
    '''Plot lines with a shared colorbar.
    xs, ys -> array of arrays
    colors -> array
    '''
    fig, ax = plt.subplots(1, 1, constrained_layout=True, **fig_kw)
    # normalise the color values to [0..1]
    norm = mpl.colors.Normalize(vmin=np.min(colors), vmax=np.max(colors))
    # create a ScalarMappable and initialize a data structure
    s_m = mpl.cm.ScalarMappable(cmap=cmap, norm=norm)
    s_m.set_array([])
    for x, y, col in zip(xs, ys, colors):
        ax.plot(x, y, color=s_m.to_rgba(col), marker=markers, markersize=markersize, alpha=alpha)
    if title:
        plt.suptitle(title)
    if xlabel:
        ax.set_xlabel(xlabel)
    if ylabel:
        ax.set_ylabel(ylabel)
    # add the colorbar
    fig.colorbar(s_m, ax=ax, ticks=colors, label=cbar_label, alpha=alpha)


def get_normalized_colormap(vmin, vmax, cmap=mpl.cm.viridis):
    norm = mpl.colors.Normalize(vmin=vmin, vmax=vmax)
    return lambda x: cmap(norm(x))
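

# Usage sketch (illustrative, not part of the original commit): synthetic traces
# colored by a scan parameter, and a fake detector image with a log colorscale.
if __name__ == "__main__":
    xs = [np.linspace(0, 10, 100)] * 3
    ys = [np.sin(x + phase) for x, phase in zip(xs, (0.0, 0.5, 1.0))]
    line_plot_with_colorbar(xs, ys, colors=np.array([0.0, 0.5, 1.0]),
                            xlabel="delay", ylabel="signal", cbar_label="phase")

    fake_image = np.random.poisson(3, size=(514, 1030)).astype(float)
    plot_detector_image(fake_image, title="synthetic detector image", log_colorscale=True)
    plt.show()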

src/cristallina/rsm.py Executable file

@@ -0,0 +1,901 @@
import re
from copy import copy
from collections import defaultdict, deque
from pathlib import Path
from typing import Optional
import logging
import time
import numpy as np
from matplotlib import pyplot as plt
from sfdata import SFDataFiles, sfdatafile, SFScanInfo
from joblib import Memory
from . import utils
from . import channels, analysis
from .utils import ROI
logger = logging.getLogger(__name__)
def setup_cachedirs(pgroup=None, cachedir=None):
    """
    Sets the path to a persistent cache directory either from the given p-group (e.g. "p20841")
    or an explicitly given directory.
    If the heuristics fail, we use "/tmp" as a non-persistent alternative.
    """
    if cachedir is not None:
        # explicit directory given, use this choice
        memory = Memory(cachedir, verbose=0, compress=2)
        return memory

    try:
        if pgroup is None:
            pgroup_no = utils.heuristic_extract_pgroup()
        else:
            parts = re.split(r"(\d.*)", pgroup)  # ['p', '2343', '']
            pgroup_no = parts[-2]

        candidates = [
            f"/sf/cristallina/data/p{pgroup_no}/work",
            f"/sf/cristallina/data/p{pgroup_no}/res",
        ]
        for cache_parent_dir in candidates:
            if Path(cache_parent_dir).exists():
                cachedir = Path(cache_parent_dir) / "cachedir"
                break
    except KeyError as e:
        print(e)
        cachedir = "/das/work/units/cristallina/p19739/cachedir"

    try:
        memory = Memory(cachedir, verbose=0, compress=2)
    except PermissionError:
        cachedir = "/tmp"
        memory = Memory(cachedir, verbose=0, compress=2)
    return memory


memory = setup_cachedirs()
@memory.cache()  # we ignore batch_size for caching purposes
def RSM(run_number, sam, det, roi, offset=0, lower_cutoff_threshold=10, detector=channels.JF):
    det = copy(det)
    sam = copy(sam)
    roi = copy(roi)

    scan = utils.scan_info(run_number, use_uscan=True)
    for i in range(2):
        try:
            channel_Exray = 'SAROP31-ODCC110:MOT_ENY.RBV'
            energy = np.array(scan[0][channel_Exray])[1][0]
            # detector 2 theta
            channel_detangle = 'SARES31-GPS:ROT2THETA-BS'
            tth = np.mean(np.array(scan[0][channel_detangle])[1])
            break
        except KeyError:
            print("Waiting")
            time.sleep(60)

    readbacks = scan.readbacks
    stacks, IO_norms = analysis.calculate_JF_stacks(
        scan,
        lower_cutoff_threshold=lower_cutoff_threshold,
        exclude_steps=None,
        recompute=False,
        detector=detector,
    )

    roi.values = np.array(stacks)[:, roi.rows, roi.cols]
    roi.values = np.where(roi.values < lower_cutoff_threshold, 0, roi.values)
    roi.values = roi.values / IO_norms[:, None, None]
    roi.intensities = np.sum(roi.values[:], axis=(1, 2))
    roi.intensities_normalized = roi.intensities / (roi.width * roi.height)
    roi.intensities_I0 = roi.intensities / IO_norms

    roicoord = [roi.left, roi.right, roi.bottom, roi.top]

    coors_h_all = []
    coors_k_all = []
    coors_l_all = []
    inten_all = []

    det.move_angles(gamma=tth)
    for i in range(len(readbacks)):
        sam.rotate_sample(alpha=readbacks[i] + offset)
        exp = Experiment(sam, det, energy)
        # change the coordinates into h,k,l
        hkl_temp = exp.ROI_hkl(roicoord)
        if len(inten_all) == 0:
            coors_h_all = np.reshape(hkl_temp[0], -1)
            coors_k_all = np.reshape(hkl_temp[1], -1)
            coors_l_all = np.reshape(hkl_temp[2], -1)
            inten_all = np.reshape(roi.values[i], -1)
        else:
            coors_h_all = np.concatenate((coors_h_all, np.reshape(hkl_temp[0], -1)))
            coors_k_all = np.concatenate((coors_k_all, np.reshape(hkl_temp[1], -1)))
            coors_l_all = np.concatenate((coors_l_all, np.reshape(hkl_temp[2], -1)))
            inten_all = np.concatenate((inten_all, np.reshape(roi.values[i], -1)))

    return coors_h_all, coors_k_all, coors_l_all, inten_all
@memory.cache()
def RSM_interpolate(coors_h_all, coors_k_all, coors_l_all, inten_all, hbound, kbound, gridsize):
    lmin, lmax = coors_l_all.min(), coors_l_all.max()
    l_grid = np.arange(lmin, lmax + gridsize, gridsize)
    k_grid = np.arange(kbound[0], kbound[1] + gridsize, gridsize)
    h_grid = np.arange(hbound[0], hbound[1] + gridsize, gridsize)
    [H, K, L] = np.meshgrid(h_grid, k_grid, l_grid)

    # interpolation
    grided_all = griddata(
        np.transpose(np.array([coors_h_all, coors_k_all, coors_l_all])),
        inten_all,
        np.transpose(np.array([H, K, L])),
        method='nearest',
    )
    return grided_all


@memory.cache()
def RSM_full(run_number, sam, det, roi, hbound, kbound, gridsize, offset=0, lower_cutoff_threshold=10, detector=channels.JF):
    det = copy(det)
    sam = copy(sam)
    roi = copy(roi)
    coors_h_all, coors_k_all, coors_l_all, inten_all = RSM(
        run_number, sam, det, roi, offset=offset,
        lower_cutoff_threshold=lower_cutoff_threshold, detector=detector,
    )
    return RSM_interpolate(coors_h_all, coors_k_all, coors_l_all, inten_all, hbound, kbound, gridsize)
# based on Henry's code

# numpy imports
import numpy as np
import uncertainties.unumpy as unumpy

# visualization
import matplotlib.pyplot as plt
from matplotlib import animation, rc, gridspec

# scipy imports
from scipy.spatial.transform import Rotation as Rot
from scipy.optimize import brentq, curve_fit, minimize
from scipy.interpolate import griddata

from tqdm import tqdm

# data handling
import re
import concurrent.futures
import sys
import os

# constants
hbar_eV = 6.582119569e-16
c_mps = 2.99792458e8

# for ease of use later
xhat = np.array((1, 0, 0))
yhat = np.array((0, 1, 0))
zhat = np.array((0, 0, 1))
# Helper functions
def real_space_vectors(sample):
    # This function takes alpha, beta, gamma and the unit cell sizes and returns the unit cell vectors a, b, c.
    # The equations were copied from the internet.
    # It works correctly for alp = bet = 90 deg (as in TaS2); the other angles were not checked.
    a = sample["a"] * xhat
    b = sample["b"] * np.array([np.cos(sample["gam"] * np.pi / 180), np.sin(sample["gam"] * np.pi / 180), 0])
    c1 = np.cos(sample["bet"] * np.pi / 180)
    c2 = (np.cos(sample["alp"] * np.pi / 180) - np.cos(sample["bet"] * np.pi / 180) * np.cos(sample["gam"] * np.pi / 180)) / np.sin(sample["gam"] * np.pi / 180)
    c3 = np.sqrt(1 - (np.cos(sample["bet"] * np.pi / 180)) ** 2 - c2 ** 2)
    c = sample["c"] * np.array([c1, c2, c3])
    return a, b, c


def reciprocal_space_vectors(a, b, c):
    # Takes the real space basis a, b, c and returns the reciprocal space basis arec, brec, crec.
    denom = a @ np.cross(b, c)
    arec = 2 * np.pi * np.cross(b, c) / denom
    brec = 2 * np.pi * np.cross(c, a) / denom
    crec = 2 * np.pi * np.cross(a, b) / denom
    return arec, brec, crec
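

# Sanity check (illustrative addition): the two bases satisfy a_i · b_j = 2π δ_ij,
# shown here for a hexagonal cell similar to the 1T-TaS2 parameters used elsewhere.
_a, _b, _c = real_space_vectors({"a": 0.336, "b": 0.336, "c": 0.5853, "alp": 90, "bet": 90, "gam": 120})
_ar, _br, _cr = reciprocal_space_vectors(_a, _b, _c)
assert np.isclose(_a @ _ar, 2 * np.pi) and np.isclose(_a @ _br, 0.0)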
def rotate(vector, axis, angle, degrees=True):
    # Rotates vector around axis by angle (degrees by default).
    axis = axis / np.sqrt(axis @ axis)  # normalize just in case
    if degrees:
        p = np.pi / 180  # conversion constant
    else:
        p = 1  # no conversion needed
    r = Rot.from_rotvec(axis * angle * p)
    return r.apply(vector)


def E_to_k(energy):
    """
    Converts energy to wavevector magnitude.
    Input:
        energy : float, energy of incoming radiation in eV
    Output:
        wavevector : wavevector magnitude in 1/nm
    """
    return 1e-9 * energy / (hbar_eV * c_mps)
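

# Worked example (illustrative addition): a 9 keV photon has
# |k| = 1e-9 * 9000 / (hbar_eV * c_mps) ≈ 45.61 1/nm,
# i.e. a wavelength of 2π/|k| ≈ 0.138 nm.
assert np.isclose(E_to_k(9000.0), 45.61, atol=0.01)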
class Detector():
    '''
    Description:
        Holds information about the detector, specifically for a Bernina experiment. Detector position and tilt
        can be given, as well as pixel size and offset between detector panels. Contains useful functions that
        return the real space position of each pixel on the detector and visualization tools that show the
        detector in real space.
    Inputs:
        Location:
            gamma : float, Angle in degrees
            delta : float, Angle in degrees
            R_m : float, Distance from sample in meters
        Pixels:
            center_pixel : tuple, Center pixel indices
            px_size_um : float, Side length of a pixel in um
        Orientation:
            psi : float, Angle representing the rotation of the detector around the detector normal
            beta : float, Angle representing rotation around the x-pixel axis
            xi : float, Angle representing rotation around the y-pixel axis
    '''

    ### Setup ###
    def __init__(self, R_m, px_size_um, center_pixel, detector_size_px=(1030, 514), gamma=0, delta=0, psi=0, beta=0, xi=0):
        self.px_size_um = px_size_um
        self.center_pixel = center_pixel
        self.detector_size_px = detector_size_px
        self.phi = psi
        self.beta = beta
        self.xi = xi
        self.R_m = R_m
        self.pixel_positions_real = np.array(())  # for plotting later
        self.move_angles(gamma, delta)
        self.get_cartesian_coordinates()

    def move_angles(self, gamma=None, delta=None, phi=None, beta=None, xi=None):
        '''
        Rotates the detector to gamma and delta. Redefines r_m, delta_hat and gamma_hat.
        Input:
            Location:
                gamma : float, Rotation angle in azimuthal direction
                delta : float, Rotation angle in polar direction
            Orientation:
                phi : float, Angle representing the rotation of the detector around the detector normal
                beta : float, Angle representing rotation around the x-pixel axis
                xi : float, Angle representing rotation around the y-pixel axis
        '''
        if gamma is not None:
            self.gamma = gamma
        if delta is not None:
            self.delta = delta
        if phi is not None:
            self.phi = phi
        if beta is not None:
            self.beta = beta
        if xi is not None:
            self.xi = xi

        self.get_cartesian_coordinates()
        gamma_rad = np.pi * self.gamma / 180
        delta_rad = np.pi * self.delta / 180

        # change delta_hat, gamma_hat perpendicular to R
        delta_hat_perp = np.array((np.sin(delta_rad) * np.cos(gamma_rad), np.sin(delta_rad) * np.sin(gamma_rad), np.cos(delta_rad)))
        gamma_hat_perp = np.array((np.sin(gamma_rad), -np.cos(gamma_rad), 0))

        # Rotate delta_hat, gamma_hat by phi
        delta_hat_p = rotate(delta_hat_perp, self.cart_vec, self.phi, degrees=True)
        gamma_hat_p = rotate(gamma_hat_perp, self.cart_vec, self.phi, degrees=True)

        # rotate gamma hat prime by beta around delta hat
        self.gamma_hat = rotate(gamma_hat_p, delta_hat_p, self.beta, degrees=True)
        # rotate delta hat prime by xi around gamma hat
        self.delta_hat = rotate(delta_hat_p, self.gamma_hat, self.xi, degrees=True)

    def move_to_cartesian(self, vec):
        '''
        Moves the detector to be in the path of a cartesian vector.
        Note: off angles don't work here.
        Input:
            vec : 1D np.ndarray of size 3, location to move the detector to, in cartesian coordinates
        '''
        projxy = (vec @ xhat) * xhat + (vec @ yhat) * yhat  # project k_out onto the xy plane for gamma
        projxy = projxy / np.sqrt(projxy @ projxy)  # normalize
        projz = (vec @ zhat) * zhat
        projz = projz / np.sqrt(projz @ projz)
        vec_normalized = vec / np.sqrt(vec @ vec)
        self.delta = np.sign(zhat @ vec) * np.arcsin(projz @ vec_normalized) * 180 / np.pi
        self.gamma = -np.sign(yhat @ vec) * np.arccos(-projxy @ xhat) * 180 / np.pi
        self.move_angles(self.gamma, self.delta)
    ### Computational Tools ###
    def get_cartesian_coordinates(self):
        """
        Computes the cartesian coordinates of the detector center as a vector (stored in self.cart_vec).
        """
        gamma_rad = np.pi * self.gamma / 180
        delta_rad = np.pi * self.delta / 180
        self.cart_vec = self.R_m * (-np.cos(gamma_rad) * np.cos(delta_rad) * xhat - np.sin(gamma_rad) * np.cos(delta_rad) * yhat + np.sin(delta_rad) * zhat)

    def pixel_cartesian_coordinate(self, i_y, i_x):
        """
        Finds the cartesian coordinate of a pixel with indices i_y and i_x.
        Inputs:
            i_y : int, pixel index along the y axis
            i_x : int, pixel index along the x axis
        """
        i_x = self.center_pixel[0] - i_x
        i_y = self.center_pixel[1] - i_y
        pixel_position_relative = 1e-6 * self.px_size_um * (np.array((i_x * self.gamma_hat[0], i_x * self.gamma_hat[1], i_x * self.gamma_hat[2])) + np.array((i_y * self.delta_hat[0], i_y * self.delta_hat[1], i_y * self.delta_hat[2])))
        pixel_position_real = self.cart_vec + pixel_position_relative
        return pixel_position_real

    def whole_detector_cartesian_coordinates(self):
        """
        Calculates the real space coordinates for each pixel of the detector in its defined position as a numpy array.
        """
        i_x = self.center_pixel[0] - np.arange(self.detector_size_px[0])
        i_y = self.center_pixel[1] - np.arange(self.detector_size_px[1])
        ii_x, ii_y = np.meshgrid(i_x, i_y)
        # probably a better way to do this part
        sum_one = 1e-6 * self.px_size_um * np.array((ii_x * self.gamma_hat[0], ii_x * self.gamma_hat[1], ii_x * self.gamma_hat[2]))
        sum_two = 1e-6 * self.px_size_um * np.array((ii_y * self.delta_hat[0], ii_y * self.delta_hat[1], ii_y * self.delta_hat[2]))
        pixel_positions_relative = sum_one + sum_two
        pixel_positions_real = self.cart_vec + np.transpose(pixel_positions_relative, [1, 2, 0])
        self.pixel_positions_real = np.transpose(pixel_positions_real, [2, 0, 1])

    def ROI_cartesian_coordinates(self, ROI):
        """
        Calculates the real space coordinates for each pixel of an ROI of the detector in its defined position as a numpy array.
        Input:
            ROI : tuple of size 4, in [xmin, xmax, ymin, ymax]
        Output:
            pixel_positions_ROI : 3D np.ndarray of shape (3, ymax-ymin, xmax-xmin) holding the real space position of each pixel in the ROI.
        """
        self.whole_detector_cartesian_coordinates()
        x_min, x_max, y_min, y_max = ROI
        pixel_positions_ROI = self.pixel_positions_real[:, y_min:y_max, x_min:x_max]
        return pixel_positions_ROI
class Sample():
    '''
    Description:
        Stores information about sample geometry and orientation, and contains useful computational tools as
        well as visualization of the real/reciprocal space lattice basis vectors.
    Inputs:
        Required:
            sample_dict : dictionary, containing lattice constants and angles.
                Example: {"a": 0.336, "b": 0.336, "c": 0.5853, "alp": 90, "bet": 90, "gam": 120}
            sample_name : string, Name of sample
                Example: "1T-TaS2"
            surface_normal : 1D np.ndarray of size 3, vector defining the sample normal in the real space vector basis.
        Not Required:
            alpha : float, Angle in degrees, angle according to experiment, roughly correct.
            ov : float, Angle in degrees, angle according to experiment, relatively correct between runs.
            ov0 : float, Angle in degrees, how much the sample is offset with respect to the experimental angles in ov
            chi0 : float, Angle in degrees, how much the sample is offset with respect to the experimental angles in chi
            alpha0 : float, Angle in degrees, how much the sample is offset with respect to the experimental angles in alpha
    '''

    ### SETUP ###
    def __init__(self, sample_dict, sample_name, surface_normal=np.array((0, 0, 1)), ov0=0, chi0=0, alpha0=0, alpha=0, ov=0, chi=0):
        self.sample_dict = sample_dict
        self.sample_name = sample_name
        self.surface_normal = surface_normal
        self.alpha0 = alpha0
        self.ov0 = ov0
        self.chi0 = chi0
        self.define_off_angles(alpha0, ov0, chi0)
        self.initialize_lattice_vectors()
        self.alpha = alpha
        self.ov = ov
        self.chi = chi
        self.alpha_axis = np.array([0, 0, 1])
        self.ov_axis = rotate(yhat, self.alpha_axis, self.alpha, degrees=True)
        self.rotate_sample(alpha, ov)

    def define_off_angles(self, alpha0=0, ov0=0, chi0=0):
        """
        Initialize the sample with off angles.
        """
        self.alpha0 = alpha0
        self.ov0 = ov0
        self.chi0 = chi0
        self.alpha0_axis = zhat
        self.ov0_axis = rotate(yhat, self.alpha0_axis, self.alpha0, degrees=True)
        chi0_axis = rotate(xhat, self.alpha0_axis, self.alpha0, degrees=True)
        self.chi0_axis = rotate(chi0_axis, self.ov0_axis, self.ov0, degrees=True)
        self.initialize_lattice_vectors()

    def change_lattice_constants(self, a=None, b=None, c=None):
        if a is not None:
            self.sample_dict["a"] = a
        if b is not None:
            self.sample_dict["b"] = b
        if c is not None:
            self.sample_dict["c"] = c
        self.initialize_lattice_vectors()
        self.rotate_sample(self.alpha, self.ov)

    def initialize_lattice_vectors(self):
        """
        Set up the real space vectors and reciprocal space vectors.
        """
        self.a_unrotated, self.b_unrotated, self.c_unrotated = real_space_vectors(self.sample_dict)  # basis vectors unrotated
        self.arec0_unrotated, self.brec0_unrotated, self.crec0_unrotated = reciprocal_space_vectors(self.a_unrotated, self.b_unrotated, self.c_unrotated)
        self.surface_normal_init = self.a_unrotated * self.surface_normal[0] + self.b_unrotated * self.surface_normal[1] + self.c_unrotated * self.surface_normal[2]
        self.surface_normal_init = self.surface_normal_init / np.linalg.norm(self.surface_normal_init)
        # get rotated basis vectors
        self.a0 = self.vector_setup(self.a_unrotated)
        self.b0 = self.vector_setup(self.b_unrotated)
        self.c0 = self.vector_setup(self.c_unrotated)
        self.surface_normal_init = self.vector_setup(self.surface_normal_init)
        # compute reciprocal space vectors
        self.arec0, self.brec0, self.crec0 = reciprocal_space_vectors(self.a0, self.b0, self.c0)
    def vector_setup(self, vector):
        """
        Input:
            vector : np.ndarray of size 3
        Output:
            vector : np.ndarray of size 3
        Helper function for initialize_lattice_vectors; rotates each basis vector to the correct position.
        """
        # first deal with the surface normal
        surface_normal_vec = self.a_unrotated * self.surface_normal[0] + self.b_unrotated * self.surface_normal[1] + self.c_unrotated * self.surface_normal[2]
        surface_normal_vec = surface_normal_vec / np.linalg.norm(surface_normal_vec)
        angle = np.arccos(surface_normal_vec @ zhat) * 180 / np.pi
        axis = np.cross(surface_normal_vec, zhat) / np.linalg.norm(np.cross(surface_normal_vec, zhat))
        vector = rotate(vector, axis, angle, degrees=True)
        # rotate to beamline coordinates (horizontal)
        vector = rotate(vector, xhat, 90, degrees=True)
        vector = rotate(vector, yhat, 90, degrees=True)
        # rotate to account for misalignment
        vector = rotate(vector, self.alpha0_axis, self.alpha0, degrees=True)
        vector = rotate(vector, self.ov0_axis, self.ov0, degrees=True)
        vector = rotate(vector, self.chi0_axis, self.chi0, degrees=True)
        return vector

    ### ROTATE SAMPLE ###
    def rotate_sample_vector(self, vector):
        """
        Input:
            vector : np.ndarray of size 3
        Output:
            vector : np.ndarray of size 3
        Description:
            Rotates a sample vector by self.alpha, self.ov and self.chi.
        """
        vector = rotate(vector, self.alpha_axis, self.alpha)
        vector = rotate(vector, self.ov_axis, self.ov)
        vector = rotate(vector, self.chi_axis, self.chi)
        return vector

    def rotate_sample(self, alpha=None, ov=None, chi=None):
        '''
        Rotates the sample by alpha, then by ov, then by chi.
        '''
        if alpha is not None:
            self.alpha = alpha
        if ov is not None:
            self.ov = ov
        if chi is not None:
            self.chi = chi
        self.ov_axis = rotate(yhat, self.alpha_axis, self.alpha, degrees=True)  # update the ov rotation axis
        chi_axis = rotate(xhat, self.alpha_axis, self.alpha, degrees=True)
        self.chi_axis = rotate(chi_axis, self.ov_axis, self.ov, degrees=True)
        self.arec = self.rotate_sample_vector(self.arec0)
        self.brec = self.rotate_sample_vector(self.brec0)
        self.crec = self.rotate_sample_vector(self.crec0)
        self.a = self.rotate_sample_vector(self.a0)
        self.b = self.rotate_sample_vector(self.b0)
        self.c = self.rotate_sample_vector(self.c0)
        self.surface_normal_updated = self.rotate_sample_vector(self.surface_normal_init)

    ### Computational Tools ###
    def peak_q(self, peak_indices):
        '''
        Calculates the coordinates for a specified peak (h, k, l) in reciprocal space.
        Inputs:
            peak_indices : tuple/array of size 3, (h, k, l) indices.
                Example: (0, 1, 4)
        Outputs:
            q : 1D np.ndarray of size 3, cartesian momentum space vector of the given peak indices
        '''
        q = peak_indices[0] * self.arec + peak_indices[1] * self.brec + peak_indices[2] * self.crec
        return q

    def get_basis_change_matrix(self, return_inverse=False, form="rotated"):
        """
        Calculates the matrix to go from the cartesian basis to the hkl basis.
        Inputs:
            return_inverse : boolean, whether to return the inverse of the default matrix.
        Options:
            form : string, whether to return the rotated or the unrotated basis. Either "rotated" or "unrotated".
        """
        if form == "rotated":
            A = np.transpose(np.array((self.arec, self.brec, self.crec)))
        if form == "unrotated":
            a = self.arec0_unrotated
            b = self.brec0_unrotated
            c = self.crec0_unrotated
            A = np.transpose(np.array((a, b, c)))
        if return_inverse:
            Ainv = np.linalg.inv(A)
            return Ainv
        else:
            return A

    def convert_hkl_to_Q(self, hkl):
        return hkl[0] * self.arec0_unrotated + hkl[1] * self.brec0_unrotated + hkl[2] * self.crec0_unrotated
    ### Visualization Tools ###
    def quiver_plot(self, space="Reciprocal", figax=None):
        """
        Plots real and reciprocal space crystal basis vectors in 3D. Good for debugging.
        Inputs:
            Not Required:
                space : string, either "Reciprocal" or "Real". By default, space = "Reciprocal".
        Returns:
            fig : matplotlib figure
            ax : matplotlib axis
        """
        if figax is None:
            fig = plt.figure(figsize=(8, 8))
            ax = fig.add_subplot(projection='3d')
        else:
            fig = figax[0]
            ax = figax[1]
        if space in ["Real", "All"]:
            ax.quiver(0, 0, 0, *self.a, color="red", length=0.05, normalize=True, label="Real Space Basis")
            ax.quiver(0, 0, 0, *self.b, color="red", length=0.05, normalize=True)
            ax.quiver(0, 0, 0, *self.c, color="red", length=0.05, normalize=True)
            # ax.quiver(0, 0, 0, *self.surface_normal_updated, color="green", length=0.05, normalize=True, label="Surface Normal")
        if space in ['Reciprocal', "All"]:
            ax.quiver(0, 0, 0, *self.arec, color="blue", length=np.linalg.norm(self.arec), normalize=True, label="Reciprocal Space Basis")
            ax.quiver(0, 0, 0, *self.brec, color="blue", length=np.linalg.norm(self.brec), normalize=True)
            ax.quiver(0, 0, 0, *self.crec, color="blue", length=np.linalg.norm(self.crec), normalize=True)
            ax.quiver(0, 0, 0, *self.surface_normal_updated, color="green", length=35, normalize=True, label="Surface Normal")
        ax.set_xlim((-50, 50))
        ax.set_ylim((-50, 50))
        ax.set_zlim((-50, 50))
        ax.set_aspect('auto')
        ax.set_xlabel("x")
        ax.set_ylabel("y")
        ax.set_zlabel("z")
        ax.legend()
        return fig, ax
class Experiment():
    '''
    Description:
        Given a sample, a detector, and the energy of an incoming beam, this class contains tools to calculate
        scattering conditions to find a specific peak, and functions to calculate the momentum transfer of each
        pixel for use in the Reconstruction class.
    Input:
        Sample : Sample object for the experiment
        Detector : Detector object for the experiment
        energy : float, Energy of the incoming beam
    '''

    ### Setup ###
    def __init__(self, Sample, Detector, energy):
        self.energy = energy
        self.Sample = Sample
        self.Detector = Detector
        self.set_energy(energy)  # incoming beam
        self.pixel_hkl = np.array(())

    ### Changes in Experimental Setup ###
    def make_angle_dict(self):
        """
        Creates a dictionary containing alpha, ov, delta, and gamma for the experiment in its current state.
        Output:
            angle_dict : dictionary, holds angle info of the experiment
        """
        angle_dict = {}
        angle_dict["alpha"] = self.Sample.alpha
        angle_dict["ov"] = self.Sample.ov
        angle_dict["delta"] = self.Detector.delta
        angle_dict["gamma"] = self.Detector.gamma
        return angle_dict

    def move_from_string(self, string, value):
        if string in ["energy", "energies"]:
            self.set_energy(value)
        elif string in ["alp", "alphas", "alpha"]:
            self.Sample.rotate_sample(alpha=value)
        elif string in ["ov", "phi"]:
            self.Sample.rotate_sample(ov=value)
        elif string in ["delta", "delt", "deltas"]:
            self.Detector.move_angles(delta=value)
        elif string in ["gamma", "gam", "gammas"]:
            self.Detector.move_angles(gamma=value)
        else:
            print("String not recognized, choose from ['gamma', 'delta', 'energy', 'alpha', 'phi']")

    def move_from_dict(self, angle_dict):
        """
        Takes in a dictionary containing angles and moves the sample and detector accordingly.
        Input:
            angle_dict : dictionary, holds angle info of the experiment
        """
        alpha = angle_dict["alpha"]
        ov = angle_dict["ov"]
        self.Sample.rotate_sample(ov=ov, alpha=alpha)
        delta = angle_dict["delta"]
        gamma = angle_dict["gamma"]
        self.Detector.move_angles(delta=delta, gamma=gamma)

    def set_energy(self, energy):
        self.energy = energy
        self.k_in = E_to_k(self.energy) * np.array([-1, 0, 0])  # incoming beam

    ### Computational Tools ###
    def whole_detector_hkl(self):
        """
        Calculate the momentum transfer in the HKL basis for the entire detector.
        Output:
            pixel_hkl : 3d np.ndarray of shape (3, y_size, x_size)
        """
        # real space coordinates
        self.Detector.whole_detector_cartesian_coordinates()
        # calculate momentum transfer
        pixel_k_out = np.linalg.norm(self.k_in) * self.Detector.pixel_positions_real / np.linalg.norm(self.Detector.pixel_positions_real, axis=0)
        self.pixel_Q = -np.transpose(self.k_in - np.transpose(pixel_k_out, [1, 2, 0]), [2, 0, 1])
        # solve for hkl of each pixel
        Ainv = self.Sample.get_basis_change_matrix(return_inverse=True)
        self.pixel_hkl = np.tensordot(Ainv, self.pixel_Q, axes=([1], [0]))
        return self.pixel_hkl
    def ROI_hkl(self, ROI):
        """
        Calculate the momentum transfer in the HKL basis for an ROI of the detector.
        Input:
            ROI : tuple of size 4, in [xmin, xmax, ymin, ymax]
        Output:
            pixel_hkl : 3d np.ndarray of shape (3, ymax-ymin, xmax-xmin)
        """
        real_space_coords = self.Detector.ROI_cartesian_coordinates(ROI)
        pixel_k_out = np.linalg.norm(self.k_in) * real_space_coords / np.linalg.norm(real_space_coords, axis=0)
        pixel_Q = -np.transpose(self.k_in - np.transpose(pixel_k_out, [1, 2, 0]), [2, 0, 1])
        # solve for hkl of each pixel
        Ainv = self.Sample.get_basis_change_matrix(return_inverse=True)
        pixel_hkl = np.tensordot(Ainv, pixel_Q, axes=([1], [0]))
        return pixel_hkl

    def ROI_Q(self, ROI):
        """
        Calculate the momentum transfer for an ROI of the detector.
        Input:
            ROI : tuple of size 4, in [xmin, xmax, ymin, ymax]
        Output:
            pixel_Q : 3d np.ndarray of shape (3, ymax-ymin, xmax-xmin)
        """
        real_space_coords = self.Detector.ROI_cartesian_coordinates(ROI)
        pixel_k_out = np.linalg.norm(self.k_in) * real_space_coords / np.linalg.norm(real_space_coords, axis=0)
        pixel_Q = -np.transpose(self.k_in - np.transpose(pixel_k_out, [1, 2, 0]), [2, 0, 1])
        return pixel_Q

    def ROI_Q_angle(self, ROI, theta=0):
        """
        Calculate the momentum transfer for an ROI of the detector, assuming there is a coordinate system which rotates with theta.
        Input:
            ROI : tuple of size 4, in [xmin, xmax, ymin, ymax]
            theta : sample rotation angle in degrees
        Output:
            pixel_Q_afterrotate : 3d np.ndarray of shape (3, ymax-ymin, xmax-xmin)
        """
        real_space_coords = self.Detector.ROI_cartesian_coordinates(ROI)
        pixel_k_out = np.linalg.norm(self.k_in) * real_space_coords / np.linalg.norm(real_space_coords, axis=0)
        pixel_Q = -np.transpose(self.k_in - np.transpose(pixel_k_out, [1, 2, 0]), [2, 0, 1])
        theta_rad = theta / 180 * np.pi
        pixel_Q_afterrotate = np.array(pixel_Q)
        pixel_Q_afterrotate[0] = pixel_Q[0] * np.cos(theta_rad) + pixel_Q[1] * np.sin(theta_rad)
        pixel_Q_afterrotate[1] = -pixel_Q[0] * np.sin(theta_rad) + pixel_Q[1] * np.cos(theta_rad)
        return pixel_Q_afterrotate

    def point_to_hkl(self, ind_y, ind_x):
        """
        Calculate the momentum transfer in the HKL basis for a single pixel of the detector given ind_y and ind_x.
        Inputs:
            ind_y : int, pixel index along the y axis
            ind_x : int, pixel index along the x axis
        """
        # Maps a single pixel to an hkl coordinate.
        pixel_coords_real = self.Detector.pixel_cartesian_coordinate(ind_y, ind_x)
        k_out = np.linalg.norm(self.k_in) * pixel_coords_real / np.linalg.norm(pixel_coords_real)
        Q = -(self.k_in - k_out)
        Ainv = self.Sample.get_basis_change_matrix(return_inverse=True)
        return Ainv @ Q

    def visualize_scattering_xyz(self):
        """
        Makes a plot showing the scattering vectors in reciprocal space in the xyz basis.
        """
        fig, ax = self.Sample.quiver_plot(space="Reciprocal")
        try:
            detector_position = self.Detector.cart_vec
        except AttributeError:
            detector_position = self.Detector.center_px_position
        ax.quiver(np.linalg.norm(self.k_in), 0, 0, *(self.k_in), length=np.linalg.norm(self.k_in), normalize=True, color="purple", label="Incoming beam (K_in)")
        k_out_plotting = np.linalg.norm(self.k_in) * detector_position / np.linalg.norm(detector_position)
        ax.quiver(0, 0, 0, *(k_out_plotting), length=np.linalg.norm(k_out_plotting), normalize=True, color="black", label="Detector Location (K_out)")
        Q_plotting = self.k_in - k_out_plotting
        ax.quiver(0, 0, 0, *(-Q_plotting), length=np.linalg.norm(Q_plotting), normalize=True, color="orange", label="Momentum Transfer (Q)")
        self.whole_detector_hkl()
        Q_pix = self.pixel_Q[:, ::50, ::50].reshape(3, -1)
        ax.scatter(Q_pix[0, :], Q_pix[1, :], Q_pix[2, :], marker=".", color="brown")
        center = self.pixel_Q[:, self.Detector.center_pixel[1], self.Detector.center_pixel[0]]
        ax.scatter(center[0], center[1], center[2], color="red")
        ax.legend()
        return fig, ax
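

# End-to-end usage sketch (illustrative; all numbers below are placeholders, not
# values from a real beamtime):
if __name__ == "__main__":
    tas2 = Sample({"a": 0.336, "b": 0.336, "c": 0.5853, "alp": 90, "bet": 90, "gam": 120}, "1T-TaS2")
    jf = Detector(R_m=0.6, px_size_um=75, center_pixel=(515, 257))
    exp = Experiment(tas2, jf, energy=9000.0)
    exp.move_from_dict({"alpha": 0.0, "ov": -16.4, "delta": 0.0, "gamma": -32.0})
    hkl = exp.ROI_hkl((400, 600, 100, 300))  # (h, k, l) for every pixel in the ROI
    print(hkl.shape)  # -> (3, 200, 200)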

src/cristallina/skeleton.py Normal file → Executable file

src/cristallina/uscan.py Executable file

@@ -0,0 +1,135 @@
import json
import time
from pathlib import Path

from sfdata.utils import print_skip_warning, json_load, adjust_shape
from sfdata.ign import remove_ignored_filetypes_scan
from sfdata import SFDataFiles, SFScanInfo
from partialjson.json_parser import JSONParser
from loguru import logger

MAX_STEP_WAIT = 300  # Maximum wait time in seconds for files to appear


class UnfinishedScanInfo(SFScanInfo):
    """
    Represents an unfinished scan from SwissFEL data, mimicking a finished SFScanInfo. It allows iterating
    over the already available scan steps and waits for new data if the scan is still ongoing.
    If the scan is already finished, it behaves like a regular SFScanInfo.

    Args:
        fname (str): The filepath of the JSON file to be processed for scan information (e.g. scan.json).
        refresh_interval (int): Time in seconds to wait before checking for new data again. Default is 10 seconds.
    """

    def __init__(self, fname, refresh_interval=10):
        self.fname = fname
        self.finished = False
        self.refresh_interval = refresh_interval

        if is_finished_scan(fname):
            # simple case, it is a finished scan and our parent can handle it
            super().__init__(fname)
            self.finished = True
        else:
            # try to parse it as a partial JSON file
            self._parse_partial_json(fname)

    def _parse_partial_json(self, fname):
        with open(fname) as f:
            content = f.read()
        parser = JSONParser()
        self.info = info = parser.parse(content)

        # we try to extract as much information as possible,
        # leaving missing parts out if not available
        fnames = info.get("scan_files", [])
        self.files = remove_ignored_filetypes_scan(fnames)
        self.parameters = info.get("scan_parameters", [])

        values = info.get("scan_values", [])
        readbacks = info.get("scan_readbacks", [])
        # filter for empty values which can occur in the partial json parsing
        values = [vals for vals in values if vals]
        readbacks = [rb for rb in readbacks if rb]
        self.values = adjust_shape(values)
        self.readbacks = adjust_shape(readbacks)

    def __iter__(self):
        if self.finished:
            return super().__iter__()
        return self._generate_data()

    def _generate_data(self):
        """Generator that yields scan data as it becomes available during the scan."""
        yielded_count = 0
        while True:
            retries = 0
            self._parse_partial_json(self.fname)

            # Check if we have new files to yield
            while self.files and len(self.files) > yielded_count:
                fns = self.files[yielded_count]
                if not files_available_on_disk(fns):
                    time.sleep(self.refresh_interval)
                    retries += 1
                    if (retries * self.refresh_interval) < MAX_STEP_WAIT:  # Wait up to 5 minutes for files to appear
                        continue  # Wait and recheck
                    else:
                        logger.error(f"Timeout waiting for files to become available for step {yielded_count} {fns}")
                        # we still yield the remaining files to avoid an infinite loop, but log an error
                        # and leave it to the caller to handle missing data

                yielded_count += 1
                retries = 0
                try:
                    with SFDataFiles(*fns) as data:
                        yield data
                except Exception as exc:
                    # TODO: Think about what could go wrong here and deal with it more specifically
                    sn = f"step {yielded_count - 1} {fns}"
                    print_skip_warning(exc, sn)
                    continue  # Try next file

            if is_finished_scan(self.fname) and (yielded_count >= len(self.files)):
                return  # Scan is finished, and we yielded all available files, stop iteration

            # Wait before checking again
            time.sleep(self.refresh_interval)


def is_finished_scan(fname):
    """If the scan.json file is complete and valid, and the files are on disk, the scan is finished.
    This does not cover the case where a scan is interrupted and continued later.
    It also relies on the behavior of RA that the scan.json is only fully written at the end of the scan.
    TODO: A clearer criterion for a finished scan would be that another, later scan exists."""
    try:
        content = json_load(fname)
        files = [file for step in content['scan_files'] for file in step]
        # Check if all files are available on disk
        if not files_available_on_disk(set(files)):
            return False
    except json.JSONDecodeError:
        return False
    return True


def files_available_on_disk(fnames):
    """Check if all files for this step are available on disk and contain some data."""
    # fnames = [fn for fn in fnames if not "PVDATA" in fn]  # PVDATA files are not written to disk at the moment!
    # logger.debug(f"Skipping PVDATA files for availability check as a workaround!")
    if all(Path(fn).exists() for fn in fnames):
        return all(Path(fn).stat().st_size > 0 for fn in fnames)
    return False
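

# Usage sketch (illustrative; the scan path is a made-up placeholder): iterate over
# the steps of a possibly still-running scan as the data becomes available.
if __name__ == "__main__":
    scan = UnfinishedScanInfo("/sf/cristallina/data/pXXXXX/raw/run0042/meta/scan.json")
    for i, step in enumerate(scan):
        # each step is an opened SFDataFiles object
        print(f"step {i}: {len(step.keys())} channels")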

src/cristallina/utils.py Normal file → Executable file

@@ -1,43 +1,152 @@
import yaml
import os
import json
import re
from pathlib import Path
import subprocess
from collections import defaultdict
import time
import datetime
from json import JSONDecodeError

import requests

import logging

logger = logging.getLogger()

import numpy as np
import pandas as pd

import sfdata
import sfdata.errors
from sfdata import SFDataFiles, sfdatafile, SFScanInfo, SFProcFile
from xraydb import material_mu
from joblib import Parallel, delayed, cpu_count
from scipy.stats import pearsonr
import matplotlib.pyplot as plt
from scipy.optimize import curve_fit

from . import channels
from .uscan import UnfinishedScanInfo


def collect_runs_metadata(pgroup_data_dir: str | os.PathLike):
    """
    Generates a metadata overview for all runs in a given p-group data directory (e.g. "/sf/cristallina/data/p21261").
    Not all available metadata is included; we skip e.g. lists of recorded channels.

    Returns: Pandas dataframe with the individual acquisitions and the respective metadata.
    """
    measurements = defaultdict(list)

    raw_dir = Path(pgroup_data_dir) / "raw"
    run_paths = sorted([run for run in Path(raw_dir).glob("run[0-9][0-9][0-9][0-9]")])

    for run_path in run_paths:
        # overall data in scan.json
        with open(run_path / "meta/scan.json", "r") as file:
            scan_meta = json.load(file)

        # individual metadata in acqXXXX.json
        num_acq = len(scan_meta["scan_files"])
        acq_jsons = sorted([acq for acq in Path(run_path).glob("meta/acq[0-9][0-9][0-9][0-9].json")])

        for i, acq in enumerate(acq_jsons):
            with open(acq) as file:
                acq_meta = json.load(file)

            for parameter in [
                "rate_multiplicator",
                "user_tag",
                "request_time",
                "unique_acquisition_run_number",
                "start_pulseid",
                "stop_pulseid",
                "client_name",
            ]:
                try:
                    measurements[parameter].append(acq_meta[parameter])
                except KeyError:
                    logger.info(f"No parameter {parameter} in metadata found, skipping.")
                    measurements[parameter] = None

            for scan_parameter in ["scan_readbacks", "scan_step_info", "scan_values", "scan_readbacks_raw"]:
                # in most scans we are only scanning one parameter at a time, so convert to a single object here instead of a list
                meta_parameter = (
                    scan_meta[scan_parameter][i][0]
                    if len(scan_meta[scan_parameter][i]) == 1
                    else scan_meta[scan_parameter][i]
                )
                measurements[scan_parameter].append(meta_parameter)

            # generate file pattern to cover all *.h5 of the acquisition (this assumes we have some PVDATA, which should almost always be the case).
            measurements["files"].append(str(scan_meta["scan_files"][i][0]).replace("PVDATA", "*"))

    df = pd.DataFrame(measurements)

    # add run and acquisition numbers from fileset
    # kind of added after the fact but now everything is in place
    run_no, acq_no = [], []
    for fileset in df["files"]:
        pattern = r".*run(\d{4})/data/acq(\d{4})"
        m = re.match(pattern, fileset)
        run_no.append(int(m.groups()[0]))
        acq_no.append(int(m.groups()[1]))

    df["run"] = run_no
    df["acq"] = acq_no
    return df
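

# Usage sketch (illustrative; the pgroup path is a made-up placeholder):
#   df = collect_runs_metadata("/sf/cristallina/data/pXXXXX")
#   df[["run", "acq", "user_tag", "start_pulseid"]].tail()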


def scan_info(run_number, base_path=None, small_data=True, pgroup=None, use_uscan=False):
    """Returns SFScanInfo object for a given run number.
    If there are small data channels, they will be added (small_data=False to suppress their loading).
    use_uscan=True will use UnfinishedScanInfo to load scans that are still running.
    """
    if base_path is None:
        base_path = heuristic_extract_base_path(pgroup)

    if use_uscan:
        scan = UnfinishedScanInfo(f"{base_path}run{run_number:04}/meta/scan.json")
        return scan

    scan = SFScanInfo(f"{base_path}run{run_number:04}/meta/scan.json")
    try:
        if small_data:
            for i in range(len(scan.readbacks)):
                sd_path_for_step = (
                    heuristic_extract_smalldata_path()
                    + "run"
                    + str(run_number).zfill(4)
                    + "/acq"
                    + str(i + 1).zfill(4)
                    + ".smalldata*.h5"
                )
                scan.info["scan_files"][i].append(sd_path_for_step)
    except KeyError:
        logger.warning("Failed to load smalldata.")
    return scan
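

# Usage sketch (illustrative): load run 42 including smalldata channels, or
# follow a still-running scan step by step with use_uscan=True.
#   scan = scan_info(42)
#   live_scan = scan_info(43, use_uscan=True)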
def get_scan_from_run_number_or_scan(run_number_or_scan, small_data=True, pgroup=None):
    """Returns SFScanInfo object from a run number or an SFScanInfo object (the latter is passed through unchanged)"""
    if isinstance(run_number_or_scan, SFScanInfo):
        scan = run_number_or_scan
    else:
        scan = scan_info(run_number_or_scan, small_data=small_data, pgroup=pgroup)
    return scan
def get_run_number_from_run_number_or_scan(run_number_or_scan, small_data=True):
    """Returns run number from a run number or SFScanInfo object"""
    if isinstance(run_number_or_scan, int):
        rn = run_number_or_scan
@@ -47,22 +156,27 @@ def get_run_number_from_run_number_or_scan(run_number_or_scan,small_data=True):
        raise ValueError("Input must be an int or SFScanInfo object")
    return rn
def channel_names(run_number, verbose=False):
    """Returns (and with verbose=True prints) channel names for a given run number or scan object"""
    if isinstance(run_number, SFScanInfo):
        scan = run_number
    else:
        scan = scan_info(run_number)
    channel_list = list(scan[0].keys())
    if verbose:
        print(channel_list)
    return channel_list
def print_run_info(
    run_number=42,
    print_channels=True,
    extra_verbose=False,
    base_path=None,
):
    """Prints overview of run information.
@@ -95,87 +209,186 @@ def print_run_info(
print(f"Total file size: {total_size/(1024*1024*1024):.1f} GB\n") print(f"Total file size: {total_size/(1024*1024*1024):.1f} GB\n")
try: if print_channels:
for step in scan: try:
ch = step.channels for step in scan:
print("\n".join([str(c) for c in ch])) ch = step.channels
# print only channels for first step print("\n".join([str(c) for c in ch]))
break # print only channels for first step
except sfdata.errors.NoUsableFileError: break
logger.warning("Cannot access files on /sf...") except sfdata.errors.NoUsableFileError:
logger.warning("Cannot access files on /sf...")
def number_of_steps(scan_number_or_scan):
    """Returns the number of steps for a scan object or run number"""
    scan = get_scan_from_run_number_or_scan(scan_number_or_scan)
    return len(scan.info["scan_readbacks"])
def process_run(
    run_number, rois, detector="JF16T03V01", calculate=None, only_shots=slice(None), n_jobs=cpu_count() - 2
):
    """Processes rois for a given detector. Saves the resulting small data in res/small_data/run...

    By default only the sum of each roi is calculated; "means", "std", "max" and "img" can be
    requested via the "calculate" optional parameter, e.g. calculate=["means", "std"].
    """
    rn = get_run_number_from_run_number_or_scan(run_number)
    # Load scan object with SFScanInfo
    scan = scan_info(rn, small_data=False)
    # Make the small data folder if it doesn't exist
    if not os.path.exists(heuristic_extract_smalldata_path()):
        os.mkdir(heuristic_extract_smalldata_path())
    # Set the path for later small data saving
    path_with_run_folder = heuristic_extract_smalldata_path() + "run" + str(rn).zfill(4)
    # Make the small data run folder if it doesn't exist
    if not os.path.exists(path_with_run_folder):
        os.mkdir(path_with_run_folder)
    # Add scan info into the small_data_object
    # for key in scan.info.keys():
    #     d[key] = scan.info[key]
    # Check if there is only one roi. If yes, make a list of it so it can be iterated over.
    if isinstance(rois, ROI):
        rois = [rois]

    def process_step(i):
        scan = scan_info(rn, small_data=False)
        step = scan[i]
        with step as data:
            with SFProcFile(f"{path_with_run_folder}/acq{str(i+1).zfill(4)}.smalldata.h5", mode="w") as sd:
                # Calculate everything related to JF_rois
                for roi in rois:
                    bottom, top, left, right = roi.bottom, roi.top, roi.left, roi.right
                    # Pulse ids for saving the new channels
                    det_pids = data[detector].pids
                    sd[roi.name] = (
                        det_pids[only_shots],
                        data[detector][only_shots, bottom:top, left:right].sum(axis=(1, 2)),
                    )
                    if calculate:
                        if "means" in calculate:
                            sd[roi.name + "_means"] = (
                                det_pids[only_shots],
                                data[detector][only_shots, bottom:top, left:right].mean(axis=(1, 2)),
                            )
                        if "std" in calculate:
                            sd[roi.name + "_std"] = (
                                det_pids[only_shots],
                                data[detector][only_shots, bottom:top, left:right].std(axis=(1, 2)),
                            )
                        if "max" in calculate:
                            sd[roi.name + "_max"] = (
                                det_pids[only_shots],
                                data[detector][only_shots, bottom:top, left:right].max(axis=(1, 2)),
                            )
                        if "img" in calculate:
                            sd[f"{roi.name}_img"] = (
                                det_pids[only_shots],
                                data[detector][only_shots, bottom:top, left:right].data,
                            )
                    # Currently meta files can't be read by SFData, this will be modified by Sven and then we can use it. For now saving in roi_info
                    # sd.meta[roi.name + "_info"] = f"roi {roi.name}: {left},{right}; {bottom},{top} (left, right, bottom, top)"
                    # These channels have only one dataset per step of the scan, so we take the first pulseID
                    sd[roi.name + "_info"] = (
                        [det_pids[0]],
                        [f"roi {roi.name}: {left},{right}; {bottom},{top} (left, right, bottom, top)"],
                    )
                    sd[roi.name + "_mean_img"] = (
                        [det_pids[0]],
                        [data[detector][only_shots, bottom:top, left:right].mean(axis=0)],
                    )
                    sd[roi.name + "_step_sum"] = (
                        [det_pids[0]],
                        [data[detector][only_shots, bottom:top, left:right].sum()],
                    )

    Parallel(n_jobs=n_jobs, verbose=10)(delayed(process_step)(i) for i in range(len(scan)))
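
# Hedged usage sketch for process_run; the ROI coordinates and run number are
# made-up placeholders. Per-shot ROI sums are always written, extra reductions opt in:
#
# >>> roi = ROI(left=100, right=200, bottom=50, top=150, name="signal")
# >>> process_run(42, roi)  # sums only
# >>> process_run(42, [roi], calculate=["means", "std", "max", "img"], n_jobs=4)
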
def is_processed_JF_file(filepath, detector="JF16T03V01"):
    """Checks whether a given .h5 file from a Jungfrau detector has been processed or only
    contains raw adc values.
    """
    import h5py

    with h5py.File(filepath, "r") as f:
        try:
            detector_group = f["data"][detector]
        except KeyError:
            raise ValueError(f"{filepath} does not seem to be a Jungfrau file from the detector {detector}.")
        return "meta" in detector_group.keys()
def check_JF_frames(JF_file_path):
    """Simple check whether all frames in a given Jungfrau .h5 file are OK or
    whether there are missing or invalid frames.
    Raises an error if invalid frames are found, else returns True.
    """
    pattern = r"\.(JF[0-9].*)\.h5"
    m = re.search(pattern, str(JF_file_path))
    if m is None:
        raise LookupError("Cannot match Jungfrau detector name from file path.")
    JF_name = m.groups()[0]
    with SFDataFiles(JF_file_path) as data:
        JF = data[JF_name]
        if JF.nvalid != JF.ntotal:
            raise ValueError(f"Only {JF.nvalid} of {JF.ntotal} Jungfrau frames are valid, this should not happen.")
        return True
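
# Hedged usage sketch for the two Jungfrau file checks; the file path is a
# made-up placeholder following the naming that the regex above expects:
#
# >>> fn = "/sf/cristallina/data/p12345/raw/run0185/data/acq0001.JF16T03V01.h5"
# >>> is_processed_JF_file(fn)  # True if processed, False if only raw adc values
# >>> check_JF_frames(fn)       # True, or raises ValueError on invalid frames
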
def get_step_time(step: SFDataFiles):
    """Returns the start and end times as unix timestamps (in seconds)
    of a given SFDataFiles object or scan step.
    Assumes the gasmonitor was running and being recorded.
    """
    t_start = step[channels.GASMONITOR].timestamps[0]
    t_last = step[channels.GASMONITOR].timestamps[-1]
    return (
        t_start.astype(np.timedelta64) / np.timedelta64(1, "s"),
        t_last.astype(np.timedelta64) / np.timedelta64(1, "s"),
    )
def wait_for_run(run_number, total_num_steps, snd_file_path="/tmp/CantinaBand3.wav"):
    """Busy-wait loop until the run has completed all steps. Plays a sound
    once all files are written to disk.
    """
base_path = heuristic_extract_base_path()
data_path = Path(base_path) / f"run{run_number:0>4}/data"
while True:
pvfiles = data_path.glob("acq*.PVDATA.h5")
length = len(list(pvfiles))
if length == total_num_steps:
break
else:
print(f"Waiting for run {run_number} to complete, only {length} steps so far ...")
time.sleep(5)
subprocess.run(["paplay", snd_file_path])
class ROI:
    """Definition of region of interest (ROI) in image coordinates.
    Example: ROI(left=10, right=20, bottom=100, top=200).
    Directions assume that the lower left corner of the image is at (x=0, y=0)
    and y is pointing upwards, x pointing to the right.
    """
    def __init__(
@@ -189,10 +402,15 @@ class ROI:
        width: int = None,
        height: int = None,
        name: str = None,
        from_ROI: "ROI" = None,
    ):
        if None not in (left, right, bottom, top):
            (
                self.left,
                self.right,
                self.bottom,
                self.top,
            ) = (
                left,
                right,
                bottom,
@@ -200,14 +418,20 @@
            )
        elif None not in (center_x, center_y, width, height):
            self.from_centers_widths(center_x, center_y, width, height)
        elif isinstance(from_ROI, ROI):
            self.left = from_ROI.left
            self.right = from_ROI.right
            self.top = from_ROI.top
            self.bottom = from_ROI.bottom
            name = from_ROI.name
        else:
            raise ValueError("No valid ROI definition.")
        # Check that ROI has a name or generate default
        if name is None:
            logger.warning(f"No ROI name given, generating: {self.__repr__()}")
            name = self.__repr__()
        self.name = name
    def from_centers_widths(self, center_x, center_y, width, height):
@@ -220,7 +444,7 @@
    @property
    def rows(self):
        return slice(self.bottom, self.top)

    @property
    def LeftRightBottomTop(self):
        return [self.left, self.right, self.bottom, self.top]
@@ -228,85 +452,206 @@ class ROI:
    @property
    def cols(self):
        return slice(self.left, self.right)

    @property
    def width(self):
        return self.right - self.left

    @property
    def height(self):
        return self.top - self.bottom
    def shift(self, x: float = 0, y: float = 0, inplace=False):
        """Returns a new ROI that is shifted by x pixels to the right
        and by y pixels up.
        If `inplace=True`, shifts this ROI itself instead.
        """
        if not inplace:
            return ROI(
                left=self.left + x,
                right=self.right + x,
                bottom=self.bottom + y,
                top=self.top + y,
                name=self.name,
            )
        else:
            self.left = self.left + x
            self.right = self.right + x
            self.bottom = self.bottom + y
            self.top = self.top + y

    def expand(self, x: float = 0, y: float = 0, inplace=False):
        """Returns a new ROI that is expanded by x pixels
        to the left and right, and by y pixels to the
        top and bottom.
        If `inplace=True`, expands this ROI itself instead.
        """
        if not inplace:
            return ROI(
                left=self.left - x,
                right=self.right + x,
                bottom=self.bottom - y,
                top=self.top + y,
                name=self.name,
            )
        else:
            self.left = self.left - x
            self.right = self.right + x
            self.bottom = self.bottom - y
            self.top = self.top + y
    def __repr__(self):
        if hasattr(self, "name"):
            return f"ROI(bottom={self.bottom},top={self.top},left={self.left},right={self.right},name='{self.name}')"
        else:
            return f"ROI(bottom={self.bottom},top={self.top},left={self.left},right={self.right})"

    def __eq__(self, other):
        # we disregard the name for comparisons
        return (self.left, self.right, self.bottom, self.top) == (
            other.left,
            other.right,
            other.bottom,
            other.top,
        )

    def __ne__(self, other):
        return not self == other
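
# Hedged usage sketch of the ROI helpers; all coordinates are arbitrary examples
# and `img` is a hypothetical 2D array:
#
# >>> roi = ROI(left=10, right=20, bottom=100, top=200, name="peak")
# >>> roi.shift(x=5, y=-10)    # new ROI: left=15, right=25, bottom=90, top=190
# >>> roi.expand(x=2)          # new ROI: left=8, right=22, same bottom/top
# >>> roi.width, roi.height    # (10, 100)
# >>> img[roi.rows, roi.cols]  # slice an image with the ROI
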
def check_pid_offset(step, ch1, ch2, offsets_to_try=[-2, -1, 0, 1, 2], plot=False):
"""Check pid offset between two channels. Offset is the value that should be added to the second channel. Returns the best offset.
ch1 should be something that can be trusted, from experience for example the SARFE10-PBPG050:HAMP-INTENSITY-CAL is reliable.
"""
# Take the first step if scan object is given
    if isinstance(step, sfdata.sfscaninfo.SFScanInfo):
        step = step[0]
    assert isinstance(
        step, sfdata.sfdatafiles.SFDataFiles
    ), "First parameter should be either a step or a scan (if a scan, the first step is tested)"
    assert isinstance(ch1, str), f"Channel {ch1} must be a string"
    assert isinstance(ch2, str), f"Channel {ch2} must be a string"
corrs = []
if plot:
fig, ax = plt.subplots(1, len(offsets_to_try), figsize=(12, 3))
for i, pid_offset in enumerate(offsets_to_try):
ss = step[ch1, ch2] # Make a subset with 2 channels
ss[ch2].offset = pid_offset # Add offset to the second channel
ss.drop_missing()
x = ss[ch1].data
y = ss[ch2].data
if i == 0:
            assert len(x.shape) == 1, f"Channel {ch1} has more than one dimension per pid"
            assert len(y.shape) == 1, f"Channel {ch2} has more than one dimension per pid"
            assert len(x) > 2, f"Channel {ch1} has fewer than 3 data points"
            assert len(y) > 2, f"Channel {ch2} has fewer than 3 data points"
corr = pearsonr(x, y) # Calculate the correlation value
corrs.append(corr.correlation) # Save in the array
if plot:
if i == 0:
ax[i].set_ylabel(ch2)
            ax[len(offsets_to_try) // 2].set_xlabel(ch1)  # label the middle panel
            ax[i].plot(ss[ch1].data, ss[ch2].data, "ok", alpha=0.1)
            ax[i].set_title(f"Corr = {np.round(corr.correlation, 2)}")
    best_offset = offsets_to_try[np.argmax(corrs)]  # Best offset is where the correlation is the highest
if plot:
plt.suptitle(f"Best correlation for pid offset {best_offset}")
plt.tight_layout()
return best_offset
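
# Hedged usage sketch for check_pid_offset; "SOME-DIODE:CHANNEL" is a placeholder
# name, while the first channel is the reliable one named in the docstring:
#
# >>> scan = scan_info(42)
# >>> best = check_pid_offset(scan, "SARFE10-PBPG050:HAMP-INTENSITY-CAL",
# ...                         "SOME-DIODE:CHANNEL", plot=True)
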
######################## Setting up paths ########################


def heuristic_extract_pgroup(path=None):
    """The function tries to guess the current p-group from the
    current working directory (default) or the contents of
    the given path.
    """
    path = path or os.getcwd()
    while "/p" in path:
        # Find the last /p in the path
        p_index = path.rfind("/p")
        # Cut the string and look at the next five characters after the last /p
        p_number = path[p_index + 2 : p_index + 7]
        if len(p_number) == 5 and p_number.isdigit():
            return p_number
        path = path[:p_index]  # not a p-group, cut the end off and try again
    raise KeyError("Automatic p-group extraction from the current working directory didn't work.")
def heuristic_extract_base_path(p_number=None):
    """The function tries to guess the full path where the raw data is saved."""
    if p_number is None:
        p_number = heuristic_extract_pgroup()
    base_path = f"/sf/cristallina/data/p{p_number}/raw/"
    return base_path


def heuristic_extract_smalldata_path():
    """The function tries to guess the full path where the small data is saved."""
    p_number = heuristic_extract_pgroup()
    small_data_path = f"/das/work/p{str(p_number)[0:2]}/p{p_number}/smalldata/"
    return small_data_path
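
# How the path heuristics behave, with illustrative (made-up) paths and pgroup:
#
# >>> heuristic_extract_pgroup("/das/work/p12/p12345/notebooks")
# '12345'
# >>> heuristic_extract_base_path("12345")
# '/sf/cristallina/data/p12345/raw/'
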
def stand_table(pgroup=None):
    """Reads the stand table. If no pgroup is given, it is guessed from the current path.
    For other pgroups, pass pgroup= as an optional parameter."""
    if pgroup:
        # The pgroup might be given as 'p12345' or just 12345, so handle both cases
        if "p" in str(pgroup):
            # Cut the string and look at the five characters after the last p
            p_index = pgroup.rfind("/p")
            p_number = pgroup[p_index + 2 : p_index + 7]
            base_path = f"/sf/cristallina/data/p{p_number}/raw/"
        else:
            base_path = f"/sf/cristallina/data/p{pgroup}/raw/"
    else:
        base_path = heuristic_extract_base_path()
    stand_file = base_path.replace("raw", "res") + "output.h5"
    stand_table = pd.read_hdf(stand_file)
    return stand_table
######################## Little useful functions ########################


def find_nearest(array, value):
    """Finds the index of the array element whose value is nearest to the given number"""
    array = np.asarray(array)
    idx = (np.abs(array - value)).argmin()
    return idx


def find_two_nearest(time_array, percentage):
    """Finds the indices of the two array values nearest to the value located at the
    given fractional position (percentage) between the array minimum and maximum"""
    array = np.asarray(time_array)
    value = (np.max(array) - np.min(array)) * percentage + np.min(array)
    order = np.argsort(np.abs(array - value))
    indices = np.sort([order[0], order[1]])
    return indices
def gauss(x, H, A, x0, sigma):
    """Returns gauss function value"""
    return H + A * np.exp(-((x - x0) ** 2) / (2 * sigma**2))
def gauss_fit(x, y, fit_details=None, plot=None):
    """Returns [baseline_offset, Amplitude, center, sigma, FWHM]"""
    # Initial guesses
    mean = sum(x * y) / sum(y)
@@ -317,73 +662,97 @@ def gauss_fit(x, y, fit_details=None, plot=None):
    popt, pcov = curve_fit(gauss, x, y, p0=[min(y), max(y), mean, sigma])
    # Add FWHM to the outputs
    popt = np.append(popt, 2.35482 * popt[3])
    # Print results
    if fit_details:
        print("The baseline offset is", popt[0])
        print("The center is", popt[2])
        print("The sigma of the fit is", popt[3])
        print("The maximum intensity is", popt[0] + popt[1])
        print("The Amplitude is", popt[1])
        print("The FWHM is", 2.35482 * popt[3])
    # Show plot
    if plot:
        plt.figure()
        plt.plot(x, y, ".k", label="data")
        plt.plot(x, gauss(x, *popt[0:4]), "--r", label="fit")  # reuse the fit instead of refitting
        plt.legend()
        plt.title("Gaussian fit")
        plt.xlabel("X")
        plt.ylabel("Y")
        plt.show()
    return popt
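
# Minimal self-check of the Gaussian fit on synthetic data (all values made up);
# on noise-free input the fit should recover the generating parameters:
#
# >>> x = np.linspace(-5, 5, 101)
# >>> y = gauss(x, H=1.0, A=3.0, x0=0.5, sigma=0.8)
# >>> offset, amp, center, sig, fwhm = gauss_fit(x, y)
# >>> round(center, 2), round(sig, 2)
# (0.5, 0.8)
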
def xray_transmission(energy, thickness, material="Si", density=None):
    """Calculates x-ray transmission for a given energy, thickness and material.
    Default material is Si; pass material=<element symbol> as a string for another material.
    Density can be given as an optional parameter."""
    if density is None:
        mu_array = material_mu(material, energy)
    else:
        mu_array = material_mu(material, energy, density=density)
    # 0.1 because mu is in 1/cm and the thickness is converted from m to mm
    trans = np.exp(-0.1 * (thickness * 1000) * mu_array)
    return trans
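
# Hedged example: transmission through a silicon attenuator. Relies on
# material_mu (imported elsewhere in this module); the numbers are illustrative:
#
# >>> xray_transmission(energy=9000, thickness=50e-6, material="Si")  # 50 um Si at 9 keV
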
######################## Unit conversions ########################


def joules_to_eV(joules):
    """Just a unit conversion"""
    eV = joules * 6.241509e18
    return eV


def eV_to_joules(eV):
    """Just a unit conversion"""
    joules = eV * 1.602176565e-19
    return joules


def photon_energy_from_wavelength(wavelength):
    """Returns photon energy in eV from wavelength in meters. Source https://www.kmlabs.com/en/wavelength-to-photon-energy-calculator"""
    Eph = 1239.8 / (wavelength * 1e9)
    return Eph


def wavelength_from_photon_energy(Eph):
    """Returns wavelength in meters from photon energy in eV. Source https://www.kmlabs.com/en/wavelength-to-photon-energy-calculator"""
    wavelength = 1239.8 / (Eph * 1e9)
    return wavelength


def sigma_to_FWHM(sigma):
    """Gaussian sigma to FWHM"""
    FWHM = sigma * 2.355
    return FWHM


def FWHM_to_sigma(FWHM):
    """FWHM to gaussian sigma"""
    sigma = FWHM / 2.355
    return sigma
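
# Worked examples tying the conversions together (results rounded):
#
# >>> photon_energy_from_wavelength(1e-10)  # 1 Angstrom -> ~12398 eV
# >>> sigma_to_FWHM(1.0)                    # 2.355
# >>> joules_to_eV(1.602176565e-19)         # ~1.0
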
def pulseid_to_timestamp(pulse_id):
    """Converts a pulse id to a datetime using the data-api server."""
    r = requests.get(f"https://data-api.psi.ch/api/4/map/pulse/sf-databuffer/{pulse_id}")
    try:
        ns_timestamp = r.json()
        timestamp = datetime.datetime.fromtimestamp(ns_timestamp / 1e9)
    except (JSONDecodeError, TypeError) as e:
        raise ValueError(f"Cannot convert pulse id {pulse_id} to timestamp. Cause: {e}")
    return timestamp
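
# Hedged usage sketch; the pulse id is a placeholder and the call requires
# network access to data-api.psi.ch:
#
# >>> pulseid_to_timestamp(21474836480)
# datetime.datetime(...)
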

tests/conftest.py: Normal file → Executable file (0 lines changed)
tests/data/JF16T03V01.res.h5: Normal file → Executable file (0 lines changed)
tests/data/gains.h5: Normal file → Executable file (0 lines changed)
tests/data/p20841/raw/run0185/data/acq0001.BSDATA.h5: Normal file → Executable file (0 lines changed)

Some files were not shown because too many files have changed in this diff.