14 Commits
v3.2.0 ... main

SHA1        Message  (checks)  Date

29d406a290  Add unit tests for filterinc capability and automatic spin state splitting
            (all checks successful)  2026-02-27 11:44:56 +01:00
5d9c0879b4  Implement automatic splitting of polarization states
            2026-02-27 10:59:59 +01:00
7f0e6f1026  Allow option to filter pulses where a switch occurred, implement updating of header
            information from filtered log-values for temp., field and polarization, don't report
            empty sample environment values
            (all checks successful)  2026-02-27 10:08:49 +01:00
30574cdf7e  try to fix failures on python <3.11 where z-format isotime was not supported
            (all checks successful)  2026-02-27 08:54:26 +01:00
6298487bf3  fix output name having colon by default, add 2026 dataset and test for logfilter with polarization
            (checks failed: 3.8, 3.9, 3.10 failing; 3.12 successful)  2026-02-27 08:35:56 +01:00
3a7f3cde53  fix sqrt invalid value warning in footprint correction
            (all checks successful)  2026-02-27 08:06:16 +01:00
dafff07e68  Update version for PyPI release
            (all checks successful)  2026-02-26 17:31:12 +01:00
9afd15bcb4  Deactivate creating artefact, as broken on gitea right now
            (all checks successful)  2026-02-26 17:13:42 +01:00
6a4f1c6205  Change artifact action version to work on gitea
            (all checks successful)  2026-02-26 17:08:15 +01:00
6aacbd5f22  Try to fix PyPI release
            (all checks successful)  2026-02-26 16:58:57 +01:00
dc7dd2a6f2  Remove unnecessary config class
            (all checks successful)  2026-02-26 16:44:53 +01:00
a22c23658f  Add eosls command to list files with some header info
            (checks failed: 3.8, 3.9 failing; 3.10, 3.12 successful)  2026-02-26 16:41:54 +01:00
246c179481  Change runner for actions to gitea
            (all checks successful)
            Co-authored-by: bruhn_b <basil.bruhn@psi.ch>
            Reviewed-on: #1
            Co-authored-by: Artur Glavic <artur.glavic@psi.ch>
            Co-committed-by: Artur Glavic <artur.glavic@psi.ch>
            2026-02-26 14:34:25 +01:00
30f616d3ab  Add option to append Rqz datasets
            (all checks cancelled)  2026-02-26 12:54:33 +01:00
19 changed files with 432 additions and 152 deletions

View File

@@ -24,16 +24,15 @@ on:
 jobs:
   test:
-    runs-on: ubuntu-22.04
+    runs-on: ubuntu-latest
     strategy:
       matrix:
-        python-version: ['3.8', '3.9', '3.10', '3.11', '3.12', '3.13']
+        python-version: ['3.8', '3.9', '3.10', '3.12']
       fail-fast: false
     steps:
-      - uses: actions/checkout@v4
-        with:
-          lfs: 'true'
+      - name: Checkout LFS objects
+        run: git clone https://${{secrets.GITHUB_TOKEN}}@gitea.psi.ch/${{ github.repository }}.git .
       - name: Set up Python ${{ matrix.python-version }}
         uses: actions/setup-python@v5
         with:
@@ -60,28 +59,25 @@ jobs:
       - uses: actions/checkout@v4
       - uses: actions/setup-python@v5
         with:
-          python-version: '3.11'
+          python-version: '3.12'
       - name: Install dependencies
         run: |
           python -m pip install --upgrade pip
           pip install build
           pip install -r requirements.txt
+          pip install wheel build twine
      - name: Build PyPI package
        run: |
          python3 -m build
-      - name: Archive distribution
-        uses: actions/upload-artifact@v4
-        with:
-          name: linux-dist
-          path: |
-            dist/*.tar.gz
+#      - name: Archive distribution
+#        uses: actions/upload-artifact@v3
+#        with:
+#          name: linux-dist
+#          path: |
+#            dist/*.tar.gz
       - name: Upload to PyPI
-        #if: github.event_name != 'workflow_dispatch'
-        uses: pypa/gh-action-pypi-publish@release/v1
-        with:
-          # user: __token__
-          # password: ${{ secrets.PYPI_TOKEN }}
-          skip-existing: true
+        run: |
+          twine upload dist/* -u __token__ -p ${{ secrets.PYPI_TOKEN }} --skip-existing

   build-windows:
     needs: [test]
@@ -104,7 +100,7 @@ jobs:
         cd dist\eos
         Compress-Archive -Path .\* -Destination ..\..\eos.zip
       - name: Archive distribution
-        uses: actions/upload-artifact@v4
+        uses: actions/upload-artifact@v3
         with:
           name: windows-dist
           path: |
@@ -119,10 +115,10 @@ jobs:
         with:
           fetch-depth: 0
           fetch-tags: true
-      - uses: actions/download-artifact@v4
+      - uses: actions/download-artifact@v3
        with:
          name: linux-dist
-      - uses: actions/download-artifact@v4
+      - uses: actions/download-artifact@v3
        with:
          name: windows-dist
      - name: get latest version tag

View File

@@ -1,38 +1,38 @@
name: Unit Testing name: Unit Testing
on: on:
push: push:
branches: branches:
- main - main
pull_request: pull_request:
# Allows you to run this workflow manually from the Actions tab # Allows you to run this workflow manually from the Actions tab
workflow_dispatch: workflow_dispatch:
jobs: jobs:
test: test:
runs-on: ubuntu-22.04 runs-on: ubuntu-latest
strategy: strategy:
matrix: matrix:
python-version: ['3.8', '3.9', '3.10', '3.11', '3.12'] python-version: ['3.8', '3.9', '3.10', '3.12']
fail-fast: false fail-fast: false
steps: steps:
- uses: actions/checkout@v4 - name: Checkout LFS objects
with: run: git clone https://${{secrets.GITHUB_TOKEN}}@gitea.psi.ch/${{ github.repository }}.git .
lfs: 'true'
- name: Set up Python ${{ matrix.python-version }} - name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v5 uses: actions/setup-python@v5
with: with:
python-version: ${{ matrix.python-version }} python-version: ${{ matrix.python-version }}
- name: Install dependencies - name: Install dependencies
run: | run: |
python -m pip install --upgrade pip python -m pip install --upgrade pip
pip install pytest pip install pytest
pip install -r requirements.txt pip install -r requirements.txt
- name: Test with pytest - name: Test with pytest
run: | run: |
python -m pytest tests python -m pytest tests

View File

@@ -2,5 +2,5 @@
 """
 Package to handle data redction at AMOR instrument to be used by __main__.py script.
 """
-__version__ = '3.2.0'
-__date__ = '2026-02-26'
+__version__ = '3.2.2'
+__date__ = '2026-02-27'

View File

@@ -4,7 +4,7 @@ from typing import List, Type
 from .options import ArgParsable

-def commandLineArgs(config_items: List[Type[ArgParsable]], program_name=None):
+def commandLineArgs(config_items: List[Type[ArgParsable]], program_name=None, extra_args=[]):
     """
     Process command line argument.
     The type of the default values is used for conversion and validation.
@@ -36,4 +36,7 @@ def commandLineArgs(config_items: List[Type[ArgParsable]], program_name=None):
             f'--{cpc.argument}', **cpc.add_argument_args
         )
+    for ma in extra_args:
+        clas.add_argument(**ma)
     return clas.parse_args()
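The new `extra_args` hook forwards a list of dicts straight into `argparse`'s `add_argument`, which is how the new eosls script adds its positional `path` argument. A self-contained sketch of the pattern; `command_line_args` here is a simplified stand-in for the real function, not its actual implementation:

```python
import argparse

def command_line_args(program_name=None, extra_args=[], argv=None):
    # each extra_args entry is a dict of add_argument() keyword arguments;
    # an entry with only dest= (no option string) becomes a positional argument
    parser = argparse.ArgumentParser(prog=program_name)
    for ma in extra_args:
        parser.add_argument(**ma)
    return parser.parse_args(argv)

# eosls-style use of the hook:
ns = command_line_args('eosls',
                       extra_args=[dict(dest='path', nargs='*', default=['.'],
                                        help='paths to list files in')],
                       argv=['data', 'more_data'])
print(ns.path)  # ['data', 'more_data']
```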

View File

@@ -1,7 +1,11 @@
""" """
Constants used in data reduction. Constants used in data reduction.
""" """
hdm = 6.626176e-34/1.674928e-27 # h / m hdm = 6.626176e-34/1.674928e-27 # h / m
lamdaCut = 2.5 # Aa lamdaCut = 2.5 # Aa
lamdaMax = 15.0 # Aa lamdaMax = 15.0 # Aa
polarizationConfigs = ['unpolarized', 'unpolarized', 'po', 'mo', 'op', 'pp', 'mp', 'om', 'pm', 'mm']
polarizationLabels = ['undetermined', 'unpolarized', 'spin-up', 'spin-down', 'op',
'up-up', 'down-up', 'om', 'up-down', 'down-down']
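The two new lists in `const` are indexed by the integer `polarization_config_label` reported by the instrument: one maps it to the ORSO polarization code, the other to a human-readable name. For example:

```python
# parallel lookup tables, indexed by polarization_config_label (from the diff above)
polarizationConfigs = ['unpolarized', 'unpolarized', 'po', 'mo', 'op', 'pp', 'mp', 'om', 'pm', 'mm']
polarizationLabels = ['undetermined', 'unpolarized', 'spin-up', 'spin-down', 'op',
                      'up-up', 'down-up', 'om', 'up-down', 'down-down']

label = 2  # value read from the polarization_config_label log
print(polarizationConfigs[label], polarizationLabels[label])  # po spin-up
```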

View File

@@ -9,7 +9,7 @@ from typing import Tuple
 from . import const
 from .event_data_types import EventDataAction, EventDatasetProtocol, append_fields, EVENT_BITMASKS
-from .helpers import filter_project_x, merge_frames, extract_walltime
+from .helpers import filter_project_x, merge_frames, extract_walltime, add_log_to_pulses
 from .instrument import Detector
 from .options import IncidentAngle
 from .header import Header
@@ -126,11 +126,14 @@ class FilterQzRange(EventDataAction):
         d.events.mask += EVENT_BITMASKS["qRange"]*((self.qzRange[0]>d.events.qz) | (d.events.qz>self.qzRange[1]))

 class FilterByLog(EventDataAction):
-    def __init__(self, filter_string):
+    def __init__(self, filter_string, remove_switchpulse=False):
+        if filter_string.startswith('!'):
+            filter_string = filter_string[1:]
+            remove_switchpulse = True
         self.filter_string = filter_string
+        self.remove_switchpulse = remove_switchpulse

     def perform_action(self, dataset: EventDatasetProtocol) -> None:
         filter_variable = None
@@ -150,34 +153,11 @@ class FilterByLog(EventDataAction):
             EVENT_BITMASKS[filter_variable] = max(EVENT_BITMASKS.values())*2
         if not filter_variable in dataset.data.pulses.dtype.names:
             # interpolate the parameter values for all existing pulses
-            self.add_log_to_pulses(filter_variable, dataset)
+            add_log_to_pulses(filter_variable, dataset)
         fltr_pulses = eval(self.filter_string, {filter_variable: dataset.data.pulses[filter_variable]})
+        if self.remove_switchpulse:
+            switched = fltr_pulses[:-1] & ~fltr_pulses[1:]
+            fltr_pulses[:-1] &= ~switched
         goodTimeS = dataset.data.pulses.time[fltr_pulses]
         filter_e = np.logical_not(np.isin(dataset.data.events.wallTime, goodTimeS))
         dataset.data.events.mask += EVENT_BITMASKS[filter_variable]*filter_e
-
-    def add_log_to_pulses(self, key, dataset: EventDatasetProtocol):
-        """
-        Add a log value for each pulse to the pulses array.
-        """
-        # TODO: perform some interpolation for the pulse where a change occured
-        pulses = dataset.data.pulses
-        log_data = dataset.data.device_logs[key]
-        if log_data.time[0]>0:
-            logTimeS = np.hstack([[0], log_data.time, [pulses.time[-1]+1]])
-            logValues = np.hstack([[log_data.value[0]], log_data.value])
-        else:
-            logTimeS = np.hstack([log_data.time, [pulses.time[-1]+1]])
-            logValues = log_data.value
-        pulseLogS = np.zeros(pulses.time.shape[0], dtype=log_data.value.dtype)
-        j = 0
-        for i, ti in enumerate(pulses.time):
-            # find the last current item that was before this pulse
-            while ti>=logTimeS[j+1]:
-                j += 1
-            pulseLogS[i] = logValues[j]
-        pulses = append_fields(pulses, [(key, pulseLogS.dtype)])
-        pulses[key] = pulseLogS
-        dataset.data.pulses = pulses
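The `remove_switchpulse` branch above masks the last pulse before each True-to-False transition of the pulse filter, since the state switch may have happened within that pulse. A minimal sketch of the boolean logic, with an illustrative filter array:

```python
import numpy as np

# pulse-level filter result: True where the log value passes the filter
flt = np.array([True, True, False, False, True, True])

# flag pulses whose successor fails the filter (True -> False edges);
# those pulses may contain the actual switch and are masked out too
switched = flt[:-1] & ~flt[1:]
flt[:-1] &= ~switched

print(flt.tolist())  # [True, False, False, False, True, True]
```

Note that only the pass-to-fail edge is treated this way; the first pulse after a fail-to-pass edge is kept unchanged.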

View File

@@ -48,7 +48,7 @@ class ApplyParameterOverwrites(EventDataAction):
                 with open(self.config.sampleModel, 'r') as model_yml:
                     model = yaml.safe_load(model_yml)
             else:
-                logging.warning(f'    ! the file {self.config.sampleModel}.yml does not exist. Ignored!')
+                logging.warning(f'    ! the file {self.config.sampleModel} does not exist. Ignored!')
                 return
         else:
             model = dict(stack=self.config.sampleModel)
@@ -168,7 +168,7 @@ class ApplyMask(EventDataAction):
         # TODO: why is this action time consuming?
         d = dataset.data
         pre_filter = d.events.shape[0]
-        if logging.getLogger().level == logging.DEBUG:
+        if logging.getLogger().level <= logging.DEBUG:
             # only run this calculation if debug level is actually active
             filtered_by_mask = {}
             for key, value in EVENT_BITMASKS.items():
@@ -183,3 +183,20 @@ class ApplyMask(EventDataAction):
         d.events = d.events[fltr]
         post_filter = d.events.shape[0]
         logging.info(f'    number of events: total = {pre_filter:7d}, filtered = {post_filter:7d}')
+        if d.device_logs == {} or not hasattr(dataset, 'update_info_from_logs'):
+            return
+        # filter pulses and logs to allow update of header information
+        from .helpers import add_log_to_pulses
+        times = np.unique(d.events.wallTime)
+        # make sure all log variables are associated with pulses
+        for key, log in d.device_logs.items():
+            if not key in d.pulses.dtype.names:
+                # interpolate the parameter values for all existing pulses
+                add_log_to_pulses(key, dataset)
+        # remove all pulses that have no more events
+        d.pulses = d.pulses[np.isin(d.pulses.time, times)]
+        for key, log in d.device_logs.items():
+            d.device_logs[key] = np.recarray(d.pulses.shape, dtype=log.dtype)
+            d.device_logs[key].time = d.pulses.time
+            d.device_logs[key].value = d.pulses[key]
+        dataset.update_info_from_logs()
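After masking, `ApplyMask` now drops pulses that no longer own any events before the header values are recomputed, using `np.isin` on the pulse time stamps. A small sketch of that step with made-up times:

```python
import numpy as np

# wall times of events that survived all mask filters
times = np.unique(np.array([10, 10, 30]))

# pulse time stamps; only pulses that still own at least one event are kept,
# so per-pulse logs reflect the filtered data when the header is updated
pulse_time = np.array([0, 10, 20, 30, 40])
kept = pulse_time[np.isin(pulse_time, times)]
print(kept.tolist())  # [10, 30]
```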

View File

@@ -3,6 +3,7 @@ Reading of Amor NeXus data files to extract metadata and event stream.
 """
 from typing import BinaryIO, List, Union
+import sys

 import h5py
 import numpy as np
 import logging
@@ -27,6 +28,7 @@ except ImportError:
 # Time zone used to interpret time strings
 AMOR_LOCAL_TIMEZONE = zoneinfo.ZoneInfo(key='Europe/Zurich')
+UTC = zoneinfo.ZoneInfo(key='UTC')

 class AmorHeader:
     """
@@ -47,8 +49,8 @@
         chopper_separation=('entry1/Amor/chopper/pair_separation', float),
         detector_distance=('entry1/Amor/detector/transformation/distance', float),
         chopper_distance=('entry1/Amor/chopper/distance', float),
-        sample_temperature=('entry1/sample/temperature', float, 'ignore'),
-        sample_magnetic_field=('entry1/sample/magnetic_field', float, 'ignore'),
+        sample_temperature=('entry1/sample/temperature', float),
+        sample_magnetic_field=('entry1/sample/magnetic_field', float),
         mu=('entry1/Amor/instrument_control_parameters/mu', float, 'mu'),
         nu=('entry1/Amor/instrument_control_parameters/nu', float, 'nu'),
@@ -94,7 +96,6 @@
         try:
             hdfgrp = self.hdf[hdf_path]
             if hdfgrp.attrs.get('NX_class', None) == 'NXlog':
-                self._log_keys.append(key)
                 # use the last value that was recoreded before the count started
                 time_column = hdfgrp['time'][:]
                 try:
@@ -102,9 +103,12 @@
                 except IndexError:
                     start_index = 0
                 if hdfgrp['value'].ndim==1:
-                    return dtype(hdfgrp['value'][start_index])
+                    output = dtype(hdfgrp['value'][start_index])
                 else:
-                    return dtype(hdfgrp['value'][start_index, 0])
+                    output = dtype(hdfgrp['value'][start_index, 0])
+                # make sure key is only appended if no exception was raised
+                self._log_keys.append(key)
+                return output
             elif dtype is str:
                 return self.read_string(hdf_path)
             else:
@@ -136,8 +140,14 @@
             start_time = self.rv('start_time_fallback')
         # extract start time as unix time, adding UTC offset of 1h to time string
+        if start_time.endswith('Z') and sys.version_info.minor<11:
+            # older python versions did not support Z format
+            start_time = start_time[:-1]
+            TZ = UTC
+        else:
+            TZ = AMOR_LOCAL_TIMEZONE
         start_date = datetime.fromisoformat(start_time)
-        self.fileDate = start_date.replace(tzinfo=AMOR_LOCAL_TIMEZONE)
+        self.fileDate = start_date.replace(tzinfo=TZ)
         self._start_time_ns = np.uint64(self.fileDate.timestamp()*1e9)
         # read general information and first data set
@@ -185,11 +195,17 @@
         )
         # while event times are not evaluated, use average_value reported in file for SEE
         if self.hdf['entry1/sample'].get('temperature', None) is not None:
-            sample_temperature = self.rv('sample_temperature')
-            self.sample.sample_parameters['temperature'] = fileio.Value(sample_temperature, unit='K')
+            try:
+                sample_temperature = self.rv('sample_temperature')
+            except IndexError: pass
+            else:
+                self.sample.sample_parameters['temperature'] = fileio.Value(sample_temperature, unit='K')
         if self.hdf['entry1/sample'].get('magnetic_field', None) is not None:
-            sample_magnetic_field = self.rv('sample_magnetic_field')
-            self.sample.sample_parameters['magnetic_field'] = fileio.Value(sample_magnetic_field, unit='T')
+            try:
+                sample_magnetic_field = self.rv('sample_magnetic_field')
+            except IndexError: pass
+            else:
+                self.sample.sample_parameters['magnetic_field'] = fileio.Value(sample_magnetic_field, unit='T')

     def read_instrument_configuration(self):
         chopperSeparation = self.rv('chopper_separation')
@@ -197,8 +213,6 @@
         chopperDistance = self.rv('chopper_distance')
         chopperDetectorDistance = detectorDistance - chopperDistance
-        polarizationConfigs = ['unpolarized', 'unpolarized', 'po', 'mo', 'op', 'pp', 'mp', 'om', 'pm', 'mm']
         mu = self.rv('mu')
         nu = self.rv('nu')
         kap = self.rv('kap')
@@ -227,7 +241,7 @@
         self.timing = AmorTiming(ch1TriggerPhase, ch2TriggerPhase, chopperSpeed, chopperPhase, tau)
         polarizationConfigLabel = self.rv('polarization_config_label')
-        polarizationConfig = fileio.Polarization(polarizationConfigs[polarizationConfigLabel])
+        polarizationConfig = fileio.Polarization(const.polarizationConfigs[polarizationConfigLabel])
         logging.debug(f'    polarization configuration: {polarizationConfig} (index {polarizationConfigLabel})')
@@ -387,7 +401,7 @@ class AmorEventData(AmorHeader):
         hdf_path, dtype, *_ = self.hdf_paths[key]
         hdfgroup = self.hdf[hdf_path]
         shape = hdfgroup['time'].shape
-        data = np.recarray(shape, dtype=LOG_TYPE)
+        data = np.recarray(shape, dtype=np.dtype([('value', self.hdf_paths[key][1]), ('time', np.int64)]))
         data.time = hdfgroup['time'][:]
         if len(hdfgroup['value'].shape)==1:
             data.value = hdfgroup['value'][:]
@@ -395,6 +409,29 @@
             data.value = hdfgroup['value'][:, 0]
         self.data.device_logs[key] = data

+    def update_info_from_logs(self):
+        RELEVANT_ITEMS = ['sample_temperature', 'sample_magnetic_field', 'polarization_config_label']
+        for key, log in self.data.device_logs.items():
+            if key not in RELEVANT_ITEMS:
+                continue
+            if log.value.dtype in [np.int8, np.int16, np.int32, np.int64]:
+                # for integer items (flags) report the most common one
+                value = np.bincount(log.value).argmax()
+                if logging.getLogger().getEffectiveLevel() <= logging.DEBUG \
+                        and np.unique(log.value).shape[0]>1:
+                    logging.debug(f'    filtered values for {key} not unique, '
+                                  f'has {np.unique(log.value).shape[0]} values')
+            else:
+                value = log.value.mean()
+            if key == 'polarization_config_label':
+                self.instrument_settings.polarization = fileio.Polarization(const.polarizationConfigs[value])
+            elif key == 'sample_temperature':
+                self.sample.sample_parameters['temperature'].magnitue = value
+            elif key == 'sample_magnetic_field':
+                self.sample.sample_parameters['magnetic_field'].magnitue = value

     def read_chopper_trigger_stream(self, packets):
         chopper1TriggerTime = np.array(self.hdf['entry1/Amor/chopper/ch2_trigger/event_time_zero'][:-2], dtype=np.int64)
         #self.chopper2TriggerTime = self.chopper1TriggerTime + np.array(self.hdf['entry1/Amor/chopper/ch2_trigger/event_time'][:-2], dtype=np.int64)
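The start-time change above works around `datetime.fromisoformat` rejecting a trailing `Z` on Python < 3.11, which is what the "z-format isotime" commit refers to. A standalone sketch of the same fallback; the function name and the local-timezone default here are illustrative, not the module's actual API:

```python
import sys
from datetime import datetime, timezone

def parse_start_time(start_time, local_tz=timezone.utc):
    """Parse an ISO time string, tolerating the 'Z' suffix on old Pythons."""
    if start_time.endswith('Z') and sys.version_info < (3, 11):
        # Python < 3.11: fromisoformat() rejects 'Z'; strip it and assume UTC
        return datetime.fromisoformat(start_time[:-1]).replace(tzinfo=timezone.utc)
    parsed = datetime.fromisoformat(start_time)
    if parsed.tzinfo is None:
        # naive time string: interpret it in the instrument's local time zone
        parsed = parsed.replace(tzinfo=local_tz)
    return parsed

dt = parse_start_time('2026-02-27T10:08:49Z')
print(dt.isoformat())  # 2026-02-27T10:08:49+00:00 on any supported version
```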

View File

@@ -1,10 +1,35 @@
""" """
Helper functions used during calculations. Uses numba enhanced functions if available, otherwise numpy based Helper functions used during calculations. Uses numba enhanced functions if available, otherwise numpy based
fallback is imported. fallback is imported.
""" """
import numpy as np
try: from .event_data_types import EventDatasetProtocol, append_fields
from .helpers_numba import merge_frames, extract_walltime, filter_project_x, calculate_derived_properties_focussing
except ImportError: try:
from .helpers_fallback import merge_frames, extract_walltime, filter_project_x, calculate_derived_properties_focussing from .helpers_numba import merge_frames, extract_walltime, filter_project_x, calculate_derived_properties_focussing
except ImportError:
from .helpers_fallback import merge_frames, extract_walltime, filter_project_x, calculate_derived_properties_focussing
def add_log_to_pulses(key, dataset: EventDatasetProtocol):
"""
Add a log value for each pulse to the pulses array.
"""
pulses = dataset.data.pulses
log_data = dataset.data.device_logs[key]
if log_data.time[0]>0:
logTimeS = np.hstack([[0], log_data.time, [pulses.time[-1]+1]])
logValues = np.hstack([[log_data.value[0]], log_data.value])
else:
logTimeS = np.hstack([log_data.time, [pulses.time[-1]+1]])
logValues = log_data.value
pulseLogS = np.zeros(pulses.time.shape[0], dtype=log_data.value.dtype)
j = 0
for i, ti in enumerate(pulses.time):
# find the last current item that was before this pulse
while ti>=logTimeS[j+1]:
j += 1
pulseLogS[i] = logValues[j]
pulses = append_fields(pulses, [(key, pulseLogS.dtype)])
pulses[key] = pulseLogS
dataset.data.pulses = pulses
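The loop in `add_log_to_pulses` is a step-function lookup: each pulse receives the last log value recorded at or before its time stamp. With the log times already sorted, the same lookup can be sketched in vectorized form with `np.searchsorted`; the arrays here are illustrative:

```python
import numpy as np

# device log: sample times and values (e.g. a temperature log)
log_time = np.array([0, 10, 25])
log_value = np.array([1.0, 2.0, 3.0])

# pulse times at which the most recent log value is wanted
pulse_time = np.array([0, 5, 10, 12, 30])

# index of the last log entry with time <= pulse time
idx = np.searchsorted(log_time, pulse_time, side='right') - 1
pulse_log = log_value[idx]
print(pulse_log.tolist())  # [1.0, 1.0, 2.0, 2.0, 3.0]
```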

eos/ls.py  (new file, +54)
View File

@@ -0,0 +1,54 @@
+"""
+eosls executable script to list available datafiles in current folder with some metadata information.
+
+Author: Jochen Stahn (algorithms, python draft),
+        Artur Glavic (structuring and optimisation of code)
+"""
+import os
+import logging
+
+from eos.command_line import commandLineArgs
+
+def main():
+    logging.getLogger().setLevel(logging.CRITICAL)
+    clas = commandLineArgs([], 'eosls', extra_args=[
+        dict(dest='path', nargs='*', default=['.'], help='paths to list file in')])
+
+    from glob import glob
+    import tabulate
+    from eos.file_reader import AmorHeader
+
+    files = []
+    for path in clas.path:
+        files += glob(os.path.join(path, 'amor*.hdf'))
+    files.sort()
+
+    data = {
+        'File name': [],
+        'Start Time': [],
+        'mu': [],
+        'nu': [],
+        'div': [],
+        'Sample': [],
+        'T [K]': [],
+        'H [T]': [],
+        }
+    for fi in files:
+        data['File name'].append(os.path.basename(fi))
+        ah = AmorHeader(fi)
+        data['Sample'].append(ah.sample.name)
+        data['Start Time'].append(ah.fileDate.strftime('%y %m-%d %H:%M:%S'))
+        data['mu'].append('%.3f' % ah.geometry.mu)
+        data['nu'].append('%.3f' % ah.geometry.nu)
+        data['div'].append('%.3f' % ah.geometry.div)
+        T = ah.sample.sample_parameters.get('temperature', None)
+        data['T [K]'].append(T.magnitude if T is not None else '-')
+        H = ah.sample.sample_parameters.get('magnetic_field', None)
+        data['H [T]'].append(H.magnitude if H is not None else '-')
+    print(tabulate.tabulate(data, headers="keys"))
+
+if __name__ == '__main__':
+    main()

View File

@@ -1,5 +1,5 @@
 """
-events2histogram vizualising data from Amor@SINQ, PSI
+amor-nicos vizualising data from Amor@SINQ, PSI
 Author: Jochen Stahn (algorithms, python draft),
         Artur Glavic (structuring and optimisation of code)

View File

@@ -543,6 +543,14 @@ class ReflectivityOutputConfig(ArgParsable):
         },
     )
+    append: bool = field(
+        default=False,
+        metadata={
+            'group': 'output',
+            'help': 'if file already exists, append result as additional ORSO dataset (only Rqz.ort)',
+        },
+    )

     def _output_format_list(self, outputFormat):
         format_list = []
         if OutputFomatOption.ort in outputFormat\

View File

@@ -224,6 +224,7 @@ class LZProjection(ProjectionInterface):
         # do not perform gravity correction for footprint, would require norm detector distance that is unknown here
         fp_corr_lz = np.where(np.absolute(delta_lz+norm.angle)>5e-3,
                               (delta_lz+self.angle)/(delta_lz+norm.angle), np.nan)
+        fp_corr_lz[fp_corr_lz<0] = np.nan
         self.data.mask &= np.logical_not(np.isnan(fp_corr_lz))
         self.data.norm = norm_lz*fp_corr_lz
         self.norm_monitor = norm.monitor
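The one-line fix above addresses the "sqrt invalid value warning" commit: a negative footprint correction factor is unphysical, and passing it further down the pipeline eventually feeds a square root. Marking it as nan lets the existing nan mask filter it out. A minimal sketch with made-up correction factors:

```python
import numpy as np

# footprint correction factors per pixel; a negative ratio is unphysical
fp_corr = np.array([1.2, 0.8, -0.5, 1.0])
fp_corr[fp_corr < 0] = np.nan

# the existing mask update then excludes the flagged pixel
mask = np.logical_not(np.isnan(fp_corr))
print(mask.tolist())  # [True, True, False, True]
```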

View File

@@ -5,6 +5,8 @@ import sys
import numpy as np import numpy as np
from orsopy import fileio from orsopy import fileio
from .event_analysis import FilterByLog
from .event_handling import ApplyMask
from .file_reader import AmorEventData from .file_reader import AmorEventData
from .header import Header from .header import Header
from .path_handling import PathResolver from .path_handling import PathResolver
@@ -109,7 +111,7 @@ class ReflectivityReduction:
# output # output
if self.config.output.is_default('outputName'): if self.config.output.is_default('outputName'):
import datetime import datetime
_date = datetime.datetime.now().replace(microsecond=0).isoformat() _date = datetime.datetime.now().replace(microsecond=0).isoformat().replace(':', '-')
if self.header.sample.name: if self.header.sample.name:
_sampleName = self.header.sample.name.replace(' ', '_') _sampleName = self.header.sample.name.replace(' ', '_')
else: else:
@@ -165,7 +167,31 @@ class ReflectivityReduction:
self.header.measurement_data_files.append(fileio.File( file=os.path.basename(fileName), self.header.measurement_data_files.append(fileio.File( file=os.path.basename(fileName),
timestamp=self.dataset.fileDate)) timestamp=self.dataset.fileDate))
if 'polarization_config_label' in self.dataset.data.device_logs:
pols = np.unique(self.dataset.data.device_logs['polarization_config_label'].value)
pols = pols[pols>0]
if len(pols)>1:
logging.warning(f' found {len(pols)} polarization configurations, splitting dataset accordingly')
from copy import deepcopy
from . import const
full_ds = deepcopy(self.dataset)
for pi in pols:
plabel = const.polarizationLabels[pi]
pol_filter = FilterByLog(f'polarization_config_label=={pi}',
remove_switchpulse=True) | ApplyMask()
logging.info(f' filter {plabel} using polarization_config_label=={pi}')
pol_filter(self.dataset)
self.dataset.update_header(self.header)
pol_filter.update_header(self.header)
if self.config.reduction.timeSlize:
if i>0:
logging.warning(
" time slizing should only be used for one set of datafiles, check parameters")
self.analyze_timeslices(i, polstr=f' : polarization = {plabel}')
else:
self.analyze_unsliced(i, polstr=f' : polarization = {plabel}')
self.dataset = deepcopy(full_ds)
return
if self.config.reduction.timeSlize: if self.config.reduction.timeSlize:
if i>0: if i>0:
logging.warning(" time slizing should only be used for one set of datafiles, check parameters") logging.warning(" time slizing should only be used for one set of datafiles, check parameters")
@@ -173,7 +199,7 @@ class ReflectivityReduction:
else: else:
self.analyze_unsliced(i) self.analyze_unsliced(i)
def analyze_unsliced(self, i): def analyze_unsliced(self, i, polstr=''):
self.monitor = self.dataset.data.pulses.monitor.sum() self.monitor = self.dataset.data.pulses.monitor.sum()
logging.info(f' monitor = {self.monitor:8.2f} {MONITOR_UNITS[self.config.experiment.monitorType]}') logging.info(f' monitor = {self.monitor:8.2f} {MONITOR_UNITS[self.config.experiment.monitorType]}')
@@ -186,7 +212,7 @@ class ReflectivityReduction:
 if 'Rqz.ort' in self.config.output.outputFormats:
     headerRqz = self.header.orso_header()
-    headerRqz.data_set = f'Nr {i} : mu = {self.dataset.geometry.mu:6.3f} deg'
+    headerRqz.data_set = f'Nr {i} : mu = {self.dataset.geometry.mu:6.3f} deg{polstr}'
 # projection on qz-grid
 result = proj.project_on_qz()
@@ -261,7 +287,7 @@ class ReflectivityReduction:
 proj.plot(colorbar=True, cmap=str(self.config.output.plot_colormap))
 plt.title(f'{self.config.reduction.fileIdentifier[i]}')

-def analyze_timeslices(self, i):
+def analyze_timeslices(self, i, polstr=''):
     wallTime_e = np.float64(self.dataset.data.events.wallTime)/1e9
     pulseTimeS = np.float64(self.dataset.data.pulses.time)/1e9
     interval = self.config.reduction.timeSlize[0]
@@ -311,7 +337,7 @@ class ReflectivityReduction:
 headerRqz = self.header.orso_header(
     extra_columns=[fileio.Column('time', 's', 'time relative to start of measurement series')])
-headerRqz.data_set = f'{i}_{ti}: time = {time:8.1f} s to {time+interval:8.1f} s'
+headerRqz.data_set = f'{i}_{ti}: time = {time:8.1f} s to {time+interval:8.1f} s{polstr}'

 orso_data = fileio.OrsoDataset(headerRqz, result.data_for_time(time))
 self.datasetsRqz.append(orso_data)
@@ -329,8 +355,23 @@ class ReflectivityReduction:
 def save_Rqz(self):
     fname = os.path.join(self.config.output.outputPath, f'{self.config.output.outputName}.Rqz.ort')
     logging.warning(f' {fname}')
-    theSecondLine = f' {self.header.experiment.title} | {self.header.experiment.start_date} | sample {self.header.sample.name} | R(q_z)'
-    fileio.save_orso(self.datasetsRqz, fname, data_separator='\n', comment=theSecondLine)
+    if os.path.exists(fname) and self.config.output.append:
+        logging.info(' file already exists, append as new dataset')
+        with open(fname, 'r') as f:
+            f.readline()
+            theSecondLine = f.readline()[3:]
+        prev_data = fileio.load_orso(fname)
+        prev_names = [di.info.data_set for di in prev_data]
+        for i, di in enumerate(self.datasetsRqz):
+            while di.info.data_set in prev_names:
+                if di.info.data_set.startswith('Nr '):
+                    di.info.data_set = f'Nr {i+len(prev_data)} :'+di.info.data_set.split(':', 1)[1]
+                    break
+                di.info.data_set = di.info.data_set+'_'
+        fileio.save_orso(prev_data+self.datasetsRqz, fname, data_separator='\n', comment=theSecondLine)
+    else:
+        theSecondLine = f' {self.header.experiment.title} | {self.header.experiment.start_date} | sample {self.header.sample.name} | R(q_z)'
+        fileio.save_orso(self.datasetsRqz, fname, data_separator='\n', comment=theSecondLine)

 def save_Rtl(self):
     fname = os.path.join(self.config.output.outputPath, f'{self.config.output.outputName}.Rlt.ort')
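The renaming loop in the new append branch keeps dataset names unique across files: `Nr …` sets are renumbered past the existing ones, anything else gets underscores appended until it no longer collides. A standalone sketch of that logic (hypothetical helper, not part of eos):

```python
def unique_dataset_name(name, existing, index):
    """Rename `name` until it is not in `existing`, mirroring save_Rqz's loop."""
    while name in existing:
        if name.startswith('Nr '):
            # numbered datasets: renumber past the already-stored ones
            name = f'Nr {index} :' + name.split(':', 1)[1]
            break
        # anything else: append underscores until the name is free
        name = name + '_'
    return name
```

Note the `break` after renumbering: the code trusts the new number to be unique rather than re-checking, which matches the behaviour shown in the diff.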


@@ -3,5 +3,6 @@ h5py
 orsopy
 numba
 matplotlib
+tabulate
 backports.strenum; python_version<"3.11"
 backports.zoneinfo; python_version<"3.9"


@@ -34,5 +34,6 @@ Homepage = "https://github.com/jochenstahn/amor"
 [options.entry_points]
 console_scripts =
     eos = eos.__main__:main
+    eosls = eos.ls:main
     events2histogram = eos.e2h:main
     amor-nicos = eos.nicos:main

BIN test_data/amor2026n000826.hdf (new LFS file, binary content not shown)

@@ -1,5 +1,6 @@
 import os
 import numpy as np
+import logging
 from unittest import TestCase
 from datetime import datetime
@@ -14,7 +15,7 @@ from eos.event_data_types import EVENT_BITMASKS, AmorGeometry, AmorTiming, AmorE
 from eos.event_handling import ApplyPhaseOffset, ApplyParameterOverwrites, CorrectChopperPhase, CorrectSeriesTime, \
     AssociatePulseWithMonitor, FilterMonitorThreshold, FilterStrangeTimes, TofTimeCorrection, ApplyMask
 from eos.event_analysis import ExtractWalltime, MergeFrames, AnalyzePixelIDs, CalculateWavelength, CalculateQ, \
-    FilterQzRange
+    FilterQzRange, FilterByLog
 from eos.options import MonitorType, IncidentAngle, ExperimentConfig
@@ -45,7 +46,7 @@ class MockEventData:
 # list of data packets containing previous events
 packets = np.recarray((1000,), dtype=PACKET_TYPE)
 packets.start_index = np.linspace(0, events.shape[0]-1, packets.shape[0], dtype=np.uint32)
-packets.time = np.linspace(1700000000000000000, 1700000000000000000+3_600_000,
+packets.time = np.linspace(1700000000000000000, 1700000000000000000+3_600_000_000,
                            packets.shape[0], dtype=np.int64)

 # chopper pulses within the measurement time
@@ -57,7 +58,7 @@ class MockEventData:
 proton_current = np.recarray((50,), dtype=PC_TYPE)
 proton_current.current = 1500.0
 proton_current[np.random.randint(0, proton_current.shape[0]-1, 10)] = 0.  # random time with no current
-proton_current.time = np.linspace(1700000000000000300, 1700000000000000000+3_600_000,
+proton_current.time = np.linspace(1700000000000000300, 1700000000000000000+3_600_000_000,
                                   proton_current.shape[0], dtype=np.int64)
 self.data = AmorEventStream(events, packets, pulses, proton_current)
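Both fixes above widen the mock time axis from 3.6 ms to 3.6 s of int64 epoch nanoseconds (the previous `3_600_000` was three orders of magnitude too short for the timing logic under test). A quick check of the resulting span (illustrative only, not the test code):

```python
import numpy as np

# packet/pulse times are int64 epoch nanoseconds, as in the mock data
start_ns = 1_700_000_000_000_000_000
times = np.linspace(start_ns, start_ns + 3_600_000_000, 50, dtype=np.int64)

# total span in seconds: 3_600_000_000 ns == 3.6 s
span_s = (times[-1] - times[0]) / 1e9
```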
@@ -77,6 +78,28 @@ class MockEventData:
 wavelength = ValueRange(3.0, 12.5, 'angstrom'),
 polarization = Polarization.unpolarized)
+def update_info_from_logs(self):
+    RELEVANT_ITEMS = ['sample_temperature', 'sample_magnetic_field', 'polarization_config_label']
+    for key, log in self.data.device_logs.items():
+        if key not in RELEVANT_ITEMS:
+            continue
+        if log.value.dtype in [np.int8, np.int16, np.int32, np.int64]:
+            # for integer items (flags) report the most common one
+            value = np.bincount(log.value).argmax()
+            if logging.getLogger().getEffectiveLevel() <= logging.DEBUG \
+                    and np.unique(log.value).shape[0]>1:
+                logging.debug(f' filtered values for {key} not unique, '
+                              f'has {np.unique(log.value).shape[0]} values')
+        else:
+            value = log.value.mean()
+        if key == 'polarization_config_label':
+            self.instrument_settings.polarization = Polarization(const.polarizationConfigs[value])
+        elif key == 'sample_temperature':
+            self.sample.sample_parameters['temperature'].magnitude = value
+        elif key == 'sample_magnetic_field':
+            self.sample.sample_parameters['magnetic_field'].magnitude = value

 class TestActionClass(TestCase):
     @classmethod
     def setUpClass(cls):
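`update_info_from_logs` summarises each filtered device log in one of two ways: integer logs (flags) report their most common value via `np.bincount`, float logs report their mean. A reduced sketch of that decision (hypothetical helper; `np.bincount` requires non-negative integers, which holds for the flag-style logs used here):

```python
import numpy as np

def summarize_log(values):
    """Mode for integer flag logs, mean for continuous float logs."""
    values = np.asarray(values)
    if np.issubdtype(values.dtype, np.integer):
        # bincount counts occurrences of each non-negative int; argmax is the mode
        return int(np.bincount(values).argmax())
    return float(values.mean())
```

The mode is the right summary for a flag that should be constant after filtering; the mean suits slowly drifting quantities such as a sample temperature.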
@@ -496,3 +519,44 @@ class TestSimpleActions(TestCase):
     self.d.data.events.mask,
     np.array([1, 0, 0, 0, 1], dtype=np.int32) * EVENT_BITMASKS['qRange']
 )

+def test_filter_by_log(self):
+    action = FilterByLog("test_log==0") | ApplyMask()
+
+    class LogWarnError(Exception):
+        ...
+
+    def warn_raise(*args, **kwargs):
+        raise LogWarnError()
+
+    _orig_warn = logging.warning
+    try:
+        logging.warning = warn_raise
+        with self.assertRaises(LogWarnError):
+            action.perform_action(self.d)
+    finally:
+        logging.warning = _orig_warn
+
+    self._extract_walltime()
+    test_log = np.recarray(shape=(2,), dtype=np.dtype([('value', np.int32),
+                                                       ('time', np.int64)]))
+    test_log.time = [-5, self.d.data.pulses.time[100]+123]
+    test_log.value = [0, 1]
+    self.d.data.device_logs['test_log'] = test_log
+    action.perform_action(self.d)
+    self.assertEqual(self.d.data.pulses.shape[0], 101)
+
+def test_filter_by_log_switchpulse(self):
+    action = FilterByLog("!test_log==0") | ApplyMask()
+    self._extract_walltime()
+    test_log = np.recarray(shape=(2,), dtype=np.dtype([('value', np.int32),
+                                                       ('time', np.int64)]))
+    test_log.time = [-5, self.d.data.pulses.time[100]+123]
+    test_log.value = [0, 1]
+    self.d.data.device_logs['test_log'] = test_log
+    self.d.data.device_logs['check_log'] = test_log.copy()
+    action.perform_action(self.d)
+    self.assertEqual(self.d.data.pulses.shape[0], 100)
+    np.testing.assert_array_equal(
+        self.d.data.device_logs['test_log'],
+        self.d.data.device_logs['check_log'],
+    )
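`test_filter_by_log` temporarily rebinds `logging.warning` to a raising stub so it can assert that filtering on a missing log emits a warning. `unittest.mock.patch.object` achieves the same with automatic restoration even on error; a sketch (the helper name is illustrative):

```python
import logging
from unittest import mock

def emits_warning(action):
    """Run `action` with logging.warning replaced by a mock; report if it fired."""
    with mock.patch.object(logging, 'warning') as warn:
        action()
    # the context manager restores the real logging.warning here
    return warn.called
```

Compared with the manual try/finally swap in the test, the mock also records call arguments, which lets a test assert on the warning text.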


@@ -1,8 +1,10 @@
 import os
 import cProfile
+import numpy as np
 from unittest import TestCase
 from dataclasses import fields, MISSING

 from eos import options, reduction_reflectivity, logconfig
+from orsopy import fileio

 logconfig.setup_logging()
 logconfig.update_loglevel(1)
@@ -38,28 +40,23 @@ class FullAmorTest(TestCase):
 def tearDown(self):
     self.pr.disable()
     for fi in ['test_results/test.Rqz.ort', 'test_results/5952.norm']:
         try:
             os.unlink(fi)
         except FileNotFoundError:
             pass

 def test_time_slicing(self):
     experiment_config = options.ExperimentConfig(
-        chopperSpeed=self._field_defaults['ExperimentConfig']['chopperSpeed'],
         chopperPhase=-13.5,
         chopperPhaseOffset=-5,
-        monitorType=self._field_defaults['ExperimentConfig']['monitorType'],
-        lowCurrentThreshold=self._field_defaults['ExperimentConfig']['lowCurrentThreshold'],
         yRange=(18, 48),
         lambdaRange=(3., 11.5),
-        incidentAngle=self._field_defaults['ExperimentConfig']['incidentAngle'],
         mu=0,
         nu=0,
         muOffset=0.0,
         sampleModel='air | 10 H2O | D2O'
     )
     reduction_config = options.ReflectivityReductionConfig(
-        normalisationMethod=self._field_defaults['ReflectivityReductionConfig']['normalisationMethod'],
         qResolution=0.01,
         qzRange=(0.01, 0.15),
         thetaRange=(-0.75, 0.75),
@@ -84,22 +81,16 @@ class FullAmorTest(TestCase):
 def test_noslicing(self):
     experiment_config = options.ExperimentConfig(
-        chopperSpeed=self._field_defaults['ExperimentConfig']['chopperSpeed'],
         chopperPhase=-13.5,
         chopperPhaseOffset=-5,
-        monitorType=self._field_defaults['ExperimentConfig']['monitorType'],
-        lowCurrentThreshold=self._field_defaults['ExperimentConfig']['lowCurrentThreshold'],
         yRange=(18, 48),
         lambdaRange=(3., 11.5),
-        incidentAngle=self._field_defaults['ExperimentConfig']['incidentAngle'],
         mu=0,
         nu=0,
         muOffset=0.0,
     )
     reduction_config = options.ReflectivityReductionConfig(
-        normalisationMethod=self._field_defaults['ReflectivityReductionConfig']['normalisationMethod'],
         qResolution=0.01,
-        qzRange=self._field_defaults['ReflectivityReductionConfig']['qzRange'],
         thetaRange=(-0.75, 0.75),
         fileIdentifier=["6003", "6004", "6005"],
         scale=[1],
@@ -117,3 +108,57 @@ class FullAmorTest(TestCase):
 # run second time to reuse norm file
 reducer = reduction_reflectivity.ReflectivityReduction(config)
 reducer.reduce()

+def test_eventfilter(self):
+    self.reader_config.year = 2026
+    experiment_config = options.ExperimentConfig()
+    reduction_config = options.ReflectivityReductionConfig(fileIdentifier=["826"],
+                                                           logfilter=['polarization_config_label==2'])
+    output_config = options.ReflectivityOutputConfig(
+        outputFormats=[options.OutputFomatOption.Rqz_ort],
+        outputName='test',
+        outputPath='test_results',
+    )
+    config = options.ReflectivityConfig(self.reader_config, experiment_config, reduction_config, output_config)
+    reducer = reduction_reflectivity.ReflectivityReduction(config)
+    reducer.reduce()
+    espin_up = reducer.dataset.data.events.shape[0]
+
+    reduction_config.logfilter = ['polarization_config_label==3']
+    output_config.append = True
+    reducer = reduction_reflectivity.ReflectivityReduction(config)
+    reducer.reduce()
+    espin_down = reducer.dataset.data.events.shape[0]
+    # measurement should have about 2x as many counts in spin_down
+    self.assertAlmostEqual(espin_down/espin_up, 2., 2)
+
+    # perform the same filter but remove pulses during which the switch occurred
+    reduction_config.logfilter = ['!polarization_config_label==3']
+    output_config.append = True
+    reducer = reduction_reflectivity.ReflectivityReduction(config)
+    reducer.reduce()
+    espin_down2 = reducer.dataset.data.events.shape[0]
+    # dropping the switch pulses should remove some counts
+    self.assertLess(espin_down2, espin_down)
+
+def test_polsplitting(self):
+    self.reader_config.year = 2026
+    experiment_config = options.ExperimentConfig()
+    reduction_config = options.ReflectivityReductionConfig(fileIdentifier=["826"])
+    output_config = options.ReflectivityOutputConfig(
+        outputFormats=[options.OutputFomatOption.Rqz_ort],
+        outputName='test',
+        outputPath='test_results',
+    )
+    config = options.ReflectivityConfig(self.reader_config, experiment_config, reduction_config, output_config)
+    reducer = reduction_reflectivity.ReflectivityReduction(config)
+    reducer.reduce()
+    results = fileio.load_orso(os.path.join(output_config.outputPath, output_config.outputName+'.Rqz.ort'))
+    self.assertEqual(len(results), 2)
+    self.assertEqual(results[0].info.data_source.measurement.instrument_settings.polarization, 'po')
+    self.assertEqual(results[1].info.data_source.measurement.instrument_settings.polarization, 'mo')
+    espin_up = np.nansum(results[0].data[:, 1])
+    espin_down = np.nansum(results[1].data[:, 1])
+    # total intensities should be about equal, since both events and monitor counts double
+    self.assertAlmostEqual(espin_down/espin_up, 1., 2)
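The final assertion compares the summed reflectivity of the two spin channels with `np.nansum`, which ignores the NaN bins of the q_z grid where no data was recorded. In miniature (the numbers are made up, not from the test dataset):

```python
import numpy as np

# column 1 of each .ort dataset: reflectivity per q_z bin, NaN where empty
up = np.array([1.0, 2.0, np.nan])
down = np.array([0.9, 2.1, np.nan])

# nansum skips the NaN bins, so empty bins do not poison the channel totals
ratio = np.nansum(down) / np.nansum(up)
```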