Dev/add simulator tests in GitHub workflows (#1337)
All checks were successful
Build on RHEL9 / build (push) Successful in 3m50s
Build on RHEL8 / build (push) Successful in 4m46s
Run Simulator Tests on local RHEL9 / build (push) Successful in 14m37s
Build on local RHEL9 / build (push) Successful in 1m26s
Run Simulator Tests on local RHEL8 / build (push) Successful in 16m57s
Build on local RHEL8 / build (push) Successful in 3m33s

* added simulator tests in github workflows

* indentation error

* typo

* debug

* Logging for debugging

* added more debug lines

* more debugging

* debug

* debug

* debug

* don't throw if the process does not exist

* debug

* added absolute path to sls_detector commands

* some refactoring in test scripts

* added absolute path to all slsdet commands

* typo

* ../tests/scripts/test_frame_synchronizer.py

* raise exception upon failure for github workflows

* removed hidden tags

* some refactoring in test scripts

* some refactoring

* fixed CMakeLists

* fixed unsuccessful merge

* updated python tests using simulators

* debug import error

* debug module import

* python -m runs the pytest module as a script - everything in the path is available

* removed integration tests

* enable file write not to log files

* run tests without log files

* increased sleep time for udp packets

* added log level variable to cmake

* added testing policies to documentation

* disabled check for num_frames for jungfrau & xilinx

* set log level as cmake cached variable

* disable tests for jungfrau and xilinx_ctb

* check frames for HDF5

* updated Documentation of Testing

* changed withdetectorsimulators to detectorintegration

* replaced [.cmdcall] with [.detectorintegration]

* check_file_size only disabled for jungfrau - disable for all roi tests

* changed time to wait after receive to 5 ms

* take into account half modules of eiger

* num udp interfaces needs to be consistent across modules

* suppressed warning enclosing if

* config adds 2 udp ports by default for moench and jungfrau

* write detector output to console

* allow jungfrau to test num frames; remove unused variable (numinterfaces); add a comment for the future to handle traceback so the calling function that threw the files-unmatched error is known; add documentation for tests (examples for .detectorintegration and how to disable marked tests); remove the additional arguments to disable for test_simulator as one can just use ~; remove the check for jungfrau verifying the number of frames in the master attributes and in the rx test; remove the unused advanced_test_settings in the test_simulator script; remove the num_mods check for multiple modules and set the default number of modules to 1 for test_simulator (to be increased later); back to raising an exception for killProcess

* removed integration tests from CMakeLists.txt and cmk.sh, modified the tests workflow command to reflect the disable argument, and removed xilinx_ctb from the tests (fix from developer merge still to be done)

* filtering by actual name to disable certain tests in the github workflow

* minor refactor

* wip

* wip

* changes to run on local rh9 runner instead of github workflow

* modified yml to remove some leftover from github workflow

* test

* fix build_dir in scripts (github workflow) and pytest dir in gitea workflow

* making the local machine use python3.13 binary

* pythonpath added

* changed build_dir back

* allowing ctb api tests

* allowed ctb api tests and set up the slsdetname environment variable so shared memory is reserved just for these tests

* added rh8 workflow for local runner on gitea

* remnants from rh9 local runner

* remnants from rh9 local runner

* conda env for all shells for the local runner

* allowing hdf5 to build on local runner

* run all tests for both the runners

* refactored fixtures a bit and merged some tests that use one session for entire server

* test fail

* test fix

* adding github workflow to test without data file checks and without logs

* documentation changes

* unnecessary import in conftest

* allowing the session_simulator to test for multiple modules and interfaces etc

* allow test_simulator script to run for 2 modules for all modules except ctb and xilinx ctb

* run upon push

* removing the disable file check on github workflow

* minor adjustment

* testing without synch

* reverting to previous

* with log file

* without the space

* summary from file and more error extracts from file to terminal

* minor

* trying nlf for more details

* updated with no log file to print everything to screen, also for det and rxr

* trying a no throw

* stoi was more about indentation in the yaml

* tries

* wip

* debug

* number of frames inconsistent fix=>just take first one, only test xilinx

* jungfrau tests without frames caught check

* extend the disable file check to everywhere that creates files

* specify path for test_simulator

* without printing ==

* wip

* back with printing ===, but not parsing the file for errors anymore

* lang?

* wip

* safe log?

* wip2

* wip

* don't split error as it's streaming live, just raise

* with log files

* lang?

* last resort

* wip

* test no det with general tests

* show tests live

* also include hidden integration tests

* without extra summary?

* revert

* last resort again

* tsquash on int64_t?

* tsquash on int64_t? more print

* writing to /tmp?

* all tests

* might be the fix?

* write to file

* fixed a few quiet mode no log file tests

* work on any branch for github tests, also on release candidates for gitea tests

* added frame synchronizer tests to github workflow

* moved tests to run_tests.yaml from cmake.yaml

* documentation

* disabled general tests

---------

Co-authored-by: Dhanya Thattil <dhanya.thattil@psi.ch>
2026-02-03 11:45:12 +01:00
committed by GitHub
parent a390e580d2
commit 28b2aa9673
42 changed files with 1243 additions and 1758 deletions
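
For reference, a minimal sketch of how the new integration tests can be driven by hand, assuming the package has been built and the scripts copied into the build bin directory by the configure_file rules below (paths and exact workflow wiring are assumptions; the flags come from ParseArguments in utils_for_test.py):

    # hidden Catch2 integration tests against virtual servers, quiet and without log files
    python build/bin/test_simulators.py -t "[.detectorintegration]" -nlf -q
    # frame synchronizer tests (these refuse to run without a log file)
    python build/bin/test_frame_synchronizer.py
    # pytest-based python tests, run as a module so sibling scripts stay importable
    python -m pytest build/bin/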

View File

@@ -60,7 +60,3 @@ catch_discover_tests(tests)
configure_file(scripts/test_simulators.py ${CMAKE_BINARY_DIR}/bin/test_simulators.py COPYONLY)
configure_file(scripts/test_frame_synchronizer.py ${CMAKE_BINARY_DIR}/bin/test_frame_synchronizer.py COPYONLY)
configure_file(scripts/utils_for_test.py ${CMAKE_BINARY_DIR}/bin/utils_for_test.py COPYONLY)
configure_file(scripts/test_roi.py ${CMAKE_BINARY_DIR}/bin/test_roi.py COPYONLY)
configure_file(scripts/test_free.py ${CMAKE_BINARY_DIR}/bin/test_free.py COPYONLY)
configure_file(scripts/test_commands.py ${CMAKE_BINARY_DIR}/bin/test_commands.py COPYONLY)

View File

@@ -1,100 +0,0 @@
# SPDX-License-Identifier: LGPL-3.0-or-other
# Copyright (C) 2021 Contributors to the SLS Detector Package
'''
This file is used to start up simulators and test loading parameters and include files from python.
Run this using: pytest -s test_free.py
'''
import pytest, sys, time
from slsdet import Detector, Ctb, freeSharedMemory
from slsdet.defines import DEFAULT_TCP_RX_PORTNO
from utils_for_test import (
Log,
LogLevel,
cleanup,
startDetectorVirtualServer,
startProcessInBackground,
loadConfig,
loadBasicSettings
)
def startReceiver(num_mods, fp):
if num_mods == 1:
cmd = ['slsReceiver']
else:
cmd = ['slsMultiReceiver', str(DEFAULT_TCP_RX_PORTNO), str(num_mods)]
# in 10.0.0
#cmd = ['slsMultiReceiver', '-p', str(DEFAULT_TCP_RX_PORTNO), '-n', str(num_mods)]
startProcessInBackground(cmd, fp)
time.sleep(1)
'''
scope = module =>Once per test file/module
to share expensive setup like startDetectorVirtualServer
'''
@pytest.fixture(scope="module")
def det_config():
return {
"name": "ctb",
"num_mods": 1
}
# autouse is false to pass explicitly
@pytest.fixture(scope="module", autouse=False)
def setup_simulator(det_config):
"""Fixture to start the detector server once and clean up at the end."""
fp = sys.stdout
cleanup(fp)
# server
startDetectorVirtualServer(det_config["name"], det_config["num_mods"], fp)
# receiver
startReceiver(det_config["num_mods"], fp)
# config and basic settings
d = loadConfig(name=det_config["name"], rx_hostname="localhost", settingsdir="", fp=fp, num_mods=det_config["num_mods"])
loadBasicSettings(name=det_config["name"], d=d, fp=fp)
yield d # tests run here
cleanup(fp)
def test_parameters_file(setup_simulator):
d = setup_simulator
Log(LogLevel.INFOBLUE, f'\nRunning test_parameters_file')
assert isinstance(d, Detector)
with open("/tmp/params.det", "w") as f:
f.write("frames 2\n")
f.write("fwrite 1\n")
# this should not throw
d.parameters = "/tmp/params.det"
assert d.frames == 2
assert d.fwrite == 1
Log(LogLevel.INFOGREEN, f"✅ Test passed. Command: parameters")
def test_include_file(setup_simulator):
d = setup_simulator
Log(LogLevel.INFOBLUE, f'\nRunning test_include_file')
assert isinstance(d, Detector)
with open("/tmp/params.det", "w") as f:
f.write("frames 3\n")
f.write("fwrite 0\n")
# this should not throw
d.include = "/tmp/params.det"
assert d.frames == 3
assert d.fwrite == 0
Log(LogLevel.INFOGREEN, f"✅ Test passed. Command: include")

View File

@@ -14,44 +14,43 @@ from utils_for_test import (
Log,
LogLevel,
RuntimeException,
checkIfProcessRunning,
killProcess,
cleanup,
cleanSharedmemory,
startProcessInBackground,
startProcessInBackgroundWithLogFile,
checkLogForErrors,
startDetectorVirtualServer,
loadConfig,
loadBasicSettings,
ParseArguments
ParseArguments,
build_dir,
optional_file
)
LOG_PREFIX_FNAME = '/tmp/slsFrameSynchronizer_test'
MAIN_LOG_FNAME = LOG_PREFIX_FNAME + '_log.txt'
PULL_SOCKET_PREFIX_FNAME = LOG_PREFIX_FNAME + '_pull_socket_'
SYNCHRONIZER_SUFFIX_FNAME = LOG_PREFIX_FNAME + '_synchronizer.txt'
def startFrameSynchronizerPullSocket(name, fp):
fname = PULL_SOCKET_PREFIX_FNAME + name + '.txt'
def startFrameSynchronizerPullSocket(name, fp, quiet_mode=False):
cmd = ['python', '-u', 'frameSynchronizerPullSocket.py']
startProcessInBackgroundWithLogFile(cmd, fp, fname)
fname = PULL_SOCKET_PREFIX_FNAME + name + '.txt'
startProcessInBackground(cmd, fp, fname, quiet_mode)
time.sleep(1)
checkLogForErrors(fp, fname)
def startFrameSynchronizer(num_mods, fp):
cmd = ['slsFrameSynchronizer', str(DEFAULT_TCP_RX_PORTNO), str(num_mods)]
def startFrameSynchronizer(num_mods, fp, quiet_mode=False):
cmd = [str(build_dir / 'slsFrameSynchronizer'), str(DEFAULT_TCP_RX_PORTNO), str(num_mods)]
# in 10.0.0
#cmd = ['slsFrameSynchronizer', '-p', str(DEFAULT_TCP_RX_PORTNO), '-n', str(num_mods)]
startProcessInBackground(cmd, fp)
fname = SYNCHRONIZER_SUFFIX_FNAME
startProcessInBackground(cmd, fp, fname, quiet_mode)
time.sleep(1)
def acquire(fp, det):
Log(LogLevel.INFO, 'Acquiring')
Log(LogLevel.INFO, 'Acquiring', fp)
Log(LogLevel.INFO, 'Acquiring', fp, True)
det.acquire()
@@ -60,14 +59,12 @@ def testFramesCaught(name, det, num_frames):
if fnum != num_frames:
raise RuntimeException(f"{name} caught only {fnum}. Expected {num_frames}")
Log(LogLevel.INFOGREEN, f'Frames caught test passed for {name}')
Log(LogLevel.INFOGREEN, f'Frames caught test passed for {name}', fp)
Log(LogLevel.INFOGREEN, f'Frames caught test passed for {name}', fp, True)
def testZmqHeadetTypeCount(name, det, num_mods, num_frames, fp):
Log(LogLevel.INFO, f"Testing Zmq Header type count for {name}")
Log(LogLevel.INFO, f"Testing Zmq Header type count for {name}", fp)
Log(LogLevel.INFO, f"Testing Zmq Header type count for {name}", fp, True)
htype_counts = {
"header": 0,
"series_end": 0,
@@ -89,7 +86,6 @@ def testZmqHeadetTypeCount(name, det, num_mods, num_frames, fp):
htype_counts[htype] += 1
except json.JSONDecodeError:
continue
# test if file contents matches expected counts
num_ports_per_module = 1 if name == "gotthard2" else det.numinterfaces
total_num_frame_parts = num_ports_per_module * num_mods * num_frames
@@ -100,19 +96,17 @@ def testZmqHeadetTypeCount(name, det, num_mods, num_frames, fp):
except Exception as e:
raise RuntimeException(f'Failed to get zmq header count type. Error:{str(e)}') from e
Log(LogLevel.INFOGREEN, f"Zmq Header type count test passed for {name}")
Log(LogLevel.INFOGREEN, f"Zmq Header type count test passed for {name}", fp)
Log(LogLevel.INFOGREEN, f"Zmq Header type count test passed for {name}", fp, True)
def startTestsForAll(args, fp):
for server in args.servers:
try:
Log(LogLevel.INFOBLUE, f'Synchronizer Tests for {server}')
Log(LogLevel.INFOBLUE, f'Synchronizer Tests for {server}', fp)
Log(LogLevel.INFOBLUE, f'Synchronizer Tests for {server}', fp, True)
cleanup(fp)
startDetectorVirtualServer(server, args.num_mods, fp)
startFrameSynchronizerPullSocket(server, fp)
startFrameSynchronizer(args.num_mods, fp)
startDetectorVirtualServer(server, args.num_mods, fp, args.quiet)
startFrameSynchronizerPullSocket(server, fp, args.quiet)
startFrameSynchronizer(args.num_mods, fp, args.quiet)
d = loadConfig(name=server, rx_hostname=args.rx_hostname, settingsdir=args.settingspath, log_file_fp=fp, num_mods=args.num_mods, num_frames=args.num_frames)
loadBasicSettings(name=server, d=d, fp=fp)
acquire(fp, d)
@@ -128,6 +122,9 @@ def startTestsForAll(args, fp):
if __name__ == '__main__':
args = ParseArguments(description='Automated tests to test frame synchronizer', default_num_mods=2)
if args.no_log_file:
raise RuntimeException("Cannot run frame synchronizer test without files")
Log(LogLevel.INFOBLUE, '\nLog File: ' + MAIN_LOG_FNAME + '\n')
with open(MAIN_LOG_FNAME, 'w') as fp:

View File

@@ -1,164 +0,0 @@
# SPDX-License-Identifier: LGPL-3.0-or-other
# Copyright (C) 2021 Contributors to the SLS Detector Package
'''
This file is used to start up simulators and test for freeing shm and accessing it from python.
Run this using: pytest -s test_free.py
'''
import pytest, sys
from slsdet import Detector, Ctb, freeSharedMemory
from utils_for_test import (
Log,
LogLevel,
cleanup,
startDetectorVirtualServer,
connectToVirtualServers,
SERVER_START_PORTNO
)
'''
scope = module =>Once per test file/module
to share expensive setup like startDetectorVirtualServer
'''
@pytest.fixture(scope="module")
def det_config():
return {
"name": "ctb",
"num_mods": 1
}
@pytest.fixture(scope="module", autouse=True)
def setup_simulator(det_config):
"""Fixture to start the detector server once and clean up at the end."""
fp = sys.stdout
cleanup(fp)
startDetectorVirtualServer(det_config["name"], det_config["num_mods"], fp)
Log(LogLevel.INFOBLUE, f'Waiting for server to start up and connect')
connectToVirtualServers(det_config["name"], det_config["num_mods"])
Log(LogLevel.INFOBLUE, f'Freeing shm before tests')
freeSharedMemory()
yield # tests run here
cleanup(fp)
def test_exptime_after_free_should_raise(setup_simulator):
Log(LogLevel.INFOBLUE, f'\nRunning test_exptime_after_free_should_raise')
d = Ctb() # creates multi shm (assuming no shm exists)
d.hostname = f"localhost:{SERVER_START_PORTNO}" # hostname command creates mod shm, d maps to it
d.free() # frees the shm, d should not map to it anymore
# accessing invalid shm should throw
with pytest.raises(Exception) as exc_info:
_ = d.exptime
Log(LogLevel.INFOGREEN, f"✅ Test passed, exception was: {exc_info.value}")
assert str(exc_info.value) == "Shared memory is invalid or freed. Close resources before access."
def free_and_create_shm():
k = Ctb() # opens existing shm if it exists
k.hostname = f"localhost:{SERVER_START_PORTNO}" # free and recreate shm, maps to local shm struct
def test_exptime_after_not_passing_var_should_raise(setup_simulator):
Log(LogLevel.INFOBLUE, f'\nRunning test_exptime_after_not_passing_var_should_raise')
d = Ctb() # creates multi shm (assuming no shm exists)
d.hostname = f"localhost:{SERVER_START_PORTNO}" # hostname command creates mod shm, d maps to it
free_and_create_shm() # ctb() opens multi shm, hostname command frees and recreates mod shm but shm struct is local. d still maps to old shm struct
# accessing invalid shm should throw
with pytest.raises(Exception) as exc_info:
_ = d.exptime
Log(LogLevel.INFOGREEN, f"✅ Test passed, exception was: {exc_info.value}")
assert str(exc_info.value) == "Shared memory is invalid or freed. Close resources before access."
def free_and_create_shm_passing_ctb_var(k):
k = Ctb() # opens existing shm if it exists (disregards k as its new Ctb only local to this function)
k.hostname = f"localhost:{SERVER_START_PORTNO}" # free and recreate shm, maps to local shm struct
def test_exptime_after_passing_ctb_var_should_raise(setup_simulator):
Log(LogLevel.INFOBLUE, f'\nRunning test_exptime_after_passing_ctb_var_should_raise')
d = Ctb() # creates multi shm (assuming no shm exists)
d.hostname = f"localhost:{SERVER_START_PORTNO}" # hostname command creates mod shm, d maps to it
free_and_create_shm_passing_ctb_var(d) # ctb() opens multi shm, hostname command frees and recreates mod shm but shm struct is local. d still maps to old shm struct
# accessing invalid shm should throw
with pytest.raises(Exception) as exc_info:
_ = d.exptime
Log(LogLevel.INFOGREEN, f"✅ Test passed, exception was: {exc_info.value}")
assert str(exc_info.value) == "Shared memory is invalid or freed. Close resources before access."
def free_and_create_shm_returning_ctb():
k = Ctb() # opens existing shm if it exists (disregards k as its new Ctb only local to this function)
k.hostname = f"localhost:{SERVER_START_PORTNO}" # free and recreate shm, maps to local shm struct
return k
def test_exptime_after_returning_ctb_should_raise(setup_simulator):
Log(LogLevel.INFOBLUE, f'\nRunning test_exptime_after_returning_ctb_should_raise')
d = Ctb() # creates multi shm (assuming no shm exists)
d = free_and_create_shm_returning_ctb() # ctb() opens multi shm, hostname command frees and recreates mod shm but shm struct is local and returned. d now maps to the new struct
# this should not throw
exptime_val = d.exptime
Log(LogLevel.INFOGREEN, f"✅ Test passed, exptime was: {exptime_val}")
assert isinstance(exptime_val, float)
free_and_create_shm_returning_ctb() # this time d is not updated, it maps to the old shm struct
# accessing invalid shm should throw
with pytest.raises(Exception) as exc_info:
_ = d.exptime
Log(LogLevel.INFOGREEN, f"✅ Test passed, exception was: {exc_info.value}")
assert str(exc_info.value) == "Shared memory is invalid or freed. Close resources before access."
def test_hostname_twice_acess_old_should_raise(setup_simulator):
Log(LogLevel.INFOBLUE, f'\nRunning test_hostname_twice_acess_old_should_raise')
d = Ctb() # creates multi shm (assuming no shm exists)
d.hostname = f"localhost:{SERVER_START_PORTNO}" # hostname command creates mod shm, d maps to it
d.hostname = f"localhost:{SERVER_START_PORTNO}" # Freeing and recreating shm while mapping d to it (old shm is out of scope)
# this should not throw
exptime_val = d.exptime
Log(LogLevel.INFOGREEN, f"✅ Test passed, exptime was: {exptime_val}")
assert isinstance(exptime_val, float)

View File

@@ -1,82 +0,0 @@
# SPDX-License-Identifier: LGPL-3.0-or-other
# Copyright (C) 2021 Contributors to the SLS Detector Package
'''
This file is used to start up simulators, receivers and test roi for every detector in many configurations.
'''
import sys, time
import traceback
from slsdet import Detector, burstMode
from slsdet.defines import DEFAULT_TCP_RX_PORTNO, DEFAULT_UDP_DST_PORTNO
from datetime import timedelta
from utils_for_test import (
Log,
LogLevel,
RuntimeException,
cleanup,
startProcessInBackground,
startReceiver,
startDetectorVirtualServer,
connectToVirtualServers,
loadBasicSettings,
loadConfig,
runProcessWithLogFile
)
LOG_PREFIX_FNAME = '/tmp/slsDetectorPackage_virtual_roi_test'
MAIN_LOG_FNAME = LOG_PREFIX_FNAME + '_log.txt'
ROI_TEST_FNAME = LOG_PREFIX_FNAME + '_results_'
def startTestsForAll(fp):
servers = [
'eiger',
'jungfrau',
'mythen3',
'gotthard2',
'moench',
]
nmods = 2
for server in servers:
for ninterfaces in range(1, 2):
if ninterfaces == 2 and server != 'jungfrau' and server != 'moench':
continue
try:
msg = f'Starting Roi Tests for {server}'
if server == 'jungfrau' or server == 'moench':
msg += f' with {ninterfaces} interfaces'
Log(LogLevel.INFOBLUE, msg)
Log(LogLevel.INFOBLUE, msg, fp)
cleanup(fp)
startDetectorVirtualServer(server, nmods, fp)
startReceiver(nmods, fp)
d = loadConfig(name=server, log_file_fp = fp, num_mods=nmods, num_frames=5, num_interfaces=ninterfaces)
loadBasicSettings(name=server, d=d, fp=fp)
fname = ROI_TEST_FNAME + server + '.txt'
cmd = ['tests', 'rx_roi', '--abort', '-s']
runProcessWithLogFile('Roi Tests for ' + server, cmd, fp, fname)
Log(LogLevel.INFO, '\n')
except Exception as e:
raise RuntimeException(f'Roi Tests failed') from e
Log(LogLevel.INFOGREEN, 'Passed all Roi tests for all detectors \n' + str(servers))
if __name__ == '__main__':
Log(LogLevel.INFOBLUE, '\nLog File: ' + MAIN_LOG_FNAME + '\n')
with open(MAIN_LOG_FNAME, 'w') as fp:
try:
startTestsForAll(fp)
#TODO: check master file as well for both json and hdf5 as well
cleanup(fp)
except Exception as e:
with open(MAIN_LOG_FNAME, 'a') as fp_error:
traceback.print_exc(file=fp_error)
cleanup(fp)
Log(LogLevel.ERROR, f'Tests Failed.')

View File

@@ -2,91 +2,98 @@
# Copyright (C) 2021 Contributors to the SLS Detector Package
'''
This file is used to start up simulators, receivers and run all the tests on them and finally kill the simulators and receivers.
It can be used to run all catch tests with tag [.detectorintegration].
Pass --tests <testname> to run specific tests only or --tests <testtag> to run all tests with that specific tag.
Pass --servers <server1> <server2> ... to run tests only for specific detector servers.
'''
import argparse
import sys, subprocess, time, traceback
from contextlib import contextmanager
from slsdet import Detector
from slsdet.defines import DEFAULT_TCP_RX_PORTNO
from utils_for_test import (
Log,
LogLevel,
RuntimeException,
checkIfProcessRunning,
killProcess,
cleanup,
cleanSharedmemory,
startProcessInBackground,
runProcessWithLogFile,
runProcess,
startReceiver,
runProcess,
startDetectorVirtualServer,
loadConfig,
loadBasicSettings,
ParseArguments
ParseArguments,
build_dir,
optional_file
)
LOG_PREFIX_FNAME = '/tmp/slsDetectorPackage_virtual_test'
MAIN_LOG_FNAME = LOG_PREFIX_FNAME + '_log.txt'
GENERAL_TESTS_LOG_FNAME = LOG_PREFIX_FNAME + '_results_general.txt'
CMD_TEST_LOG_PREFIX_FNAME = LOG_PREFIX_FNAME + '_results_cmd_'
def startReceiver(num_mods, fp):
if num_mods == 1:
cmd = ['slsReceiver']
else:
cmd = ['slsMultiReceiver', str(DEFAULT_TCP_RX_PORTNO), str(num_mods)]
# in 10.0.0
#cmd = ['slsMultiReceiver', '-p', str(DEFAULT_TCP_RX_PORTNO), '-n', str(num_mods)]
startProcessInBackground(cmd, fp)
time.sleep(1)
def startGeneralTests(fp):
fname = GENERAL_TESTS_LOG_FNAME
cmd = ['tests', '--abort', '-s']
cmd = [str(build_dir / 'tests'), '--abort', '-s']
try:
cleanup(fp)
runProcessWithLogFile('General Tests', cmd, fp, fname)
runProcess('General Tests', cmd, fp, fname)
except Exception as e:
raise RuntimeException(f'General tests failed.') from e
def startTestsForAll(args, fp):
fname_template = LOG_PREFIX_FNAME + "_{}_{}.txt"
test_filter = args.tests
cmd = [str(build_dir / 'tests'), '--abort', test_filter, '-s']
num_mods = args.num_mods
def startCmdTestsForAll(args, fp):
for server in args.servers:
try:
num_mods = 2 if server == 'eiger' else 1
fname = CMD_TEST_LOG_PREFIX_FNAME + server + '.txt'
cmd = ['tests', '--abort', args.markers, '-s']
for curMods in range(1, num_mods + 1):
if curMods == 2 and server in ['ctb', 'xilinx_ctb']:
continue
for ninterfaces in [1,2]: # always test both
if ninterfaces == 2 and server != 'jungfrau' and server != 'moench':
continue
Log(LogLevel.INFOBLUE, f'Starting Cmd Tests for {server}')
cleanup(fp)
startDetectorVirtualServer(name=server, num_mods=num_mods, fp=fp)
startReceiver(num_mods, fp)
d = loadConfig(name=server, rx_hostname=args.rx_hostname, settingsdir=args.settingspath, log_file_fp=fp, num_mods=num_mods)
loadBasicSettings(name=server, d=d, fp=fp)
runProcessWithLogFile('Cmd Tests (' + args.markers + ') for ' + server, cmd, fp, fname)
except Exception as e:
raise RuntimeException(f'Cmd Tests failed for {server}.') from e
if server == "eiger":
curMods *= 2 # top and bottom half module
try:
fname = fname_template.format(args.tests, server) if not args.no_log_file else None
Log(LogLevel.INFOBLUE, f'Starting {args.tests} Tests for {server}, {ninterfaces} interfaces, {curMods} modules', fp, True)
cleanup(fp)
startDetectorVirtualServer(name=server, num_mods=curMods, fp=fp, no_log_file=args.no_log_file, quiet_mode=args.quiet)
startReceiver(curMods, fp, args.no_log_file, args.quiet)
d = loadConfig(name=server, rx_hostname=args.rx_hostname, settingsdir=args.settingspath, log_file_fp=fp, num_mods=curMods, num_interfaces=ninterfaces)
loadBasicSettings(name=server, d=d, fp=fp)
runProcess('Tests (' + args.tests + ') for ' + server, cmd, fp, fname, args.quiet)
except Exception as e:
raise RuntimeException(f'Tests (' + args.tests + ') failed for ' + server + '.') from e
Log(LogLevel.INFOGREEN, 'Passed all tests for all detectors \n' + str(args.servers))
if __name__ == '__main__':
args = ParseArguments(description='Automated tests with the virtual detector servers', default_num_mods=1, markers=True, general_tests_option=True)
if args.num_mods > 1:
raise RuntimeException(f'Cannot support multiple modules at the moment (except Eiger).')
args = ParseArguments(description='Automated tests with the virtual detector servers', default_num_mods=2, specific_tests=True, general_tests_option=True)
Log(LogLevel.INFOBLUE, '\nLog File: ' + MAIN_LOG_FNAME + '\n')
with open(MAIN_LOG_FNAME, 'w') as fp:
with optional_file(MAIN_LOG_FNAME if not args.no_log_file else None, 'w', args.quiet) as fp:
try:
if args.general_tests:
startGeneralTests(fp)
startCmdTestsForAll(args, fp)
startTestsForAll(args, fp)
cleanup(fp)
except Exception as e:
with open(MAIN_LOG_FNAME, 'a') as fp_error:
traceback.print_exc(file=fp_error)
traceback.print_exc(file=fp)
cleanup(fp)
Log(LogLevel.ERROR, f'Tests Failed.')
raise e
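
As a side note on the --tests filter assembled above: Catch2 accepts either a tag or a test case name, so both forms below are valid values (an illustrative sketch; 'rx_roi' is the name used by the removed roi script, any other test name works the same way):

    # run every hidden integration test
    cmd = [str(build_dir / 'tests'), '--abort', '[.detectorintegration]', '-s']
    # run a single test case by name, as done to disable specific tests in the workflow
    cmd = [str(build_dir / 'tests'), '--abort', 'rx_roi', '-s']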

View File

@@ -3,19 +3,24 @@
'''
This file contains common utils used for integration tests between simulators and receivers.
'''
import os
from pathlib import Path
import sys, subprocess, time, argparse
from enum import Enum
from colorama import Fore, Style, init
from datetime import timedelta
from contextlib import contextmanager
from slsdet import Detector, Ctb, detectorSettings, burstMode
from slsdet.defines import DEFAULT_TCP_RX_PORTNO, DEFAULT_UDP_DST_PORTNO
SERVER_START_PORTNO=1900
LOG_PREFIX_FNAME = "/tmp/slsDetectorPackage_"
init(autoreset=True)
build_dir = Path(__file__).resolve().parents[2] / "build" / "bin"
class LogLevel(Enum):
INFO = 0
INFORED = 1
@@ -44,10 +49,13 @@ LOG_COLORS = {
}
def Log(level: LogLevel, message: str, stream=sys.stdout):
def Log(level: LogLevel, message: str, stream=sys.stdout, both=False):
color = LOG_COLORS.get(level, Fore.WHITE)
label = LOG_LABELS.get(level, "")
print(f"{color}{label}{message}{Style.RESET_ALL}", file=stream, flush=True)
if both and stream != sys.stdout:
print(f"{color}{label}{message}{Style.RESET_ALL}", file=sys.stdout, flush=True)
class RuntimeException (Exception):
@@ -56,9 +64,27 @@ class RuntimeException (Exception):
super().__init__(message)
@contextmanager
def optional_file(file_path=None, mode='w', quiet_mode=False):
if file_path:
f = open(file_path, mode)
try:
yield f
finally:
f.close()
else:
if quiet_mode:
f = open(os.devnull, mode)
try:
yield f
finally:
f.close()
else:
yield sys.stdout
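# Illustrative usage of optional_file (assumed call sites, for clarity only):
#   with optional_file('/tmp/run.txt', 'w') as fp:    -> writes to the given log file
#   with optional_file(None, 'w', quiet_mode=True):   -> discards output via os.devnull
#   with optional_file(None, 'w') as fp:              -> falls back to sys.stdout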
def checkIfProcessRunning(processName):
cmd = f"pgrep -f {processName}"
res = subprocess.getoutput(cmd)
res = subprocess.getoutput(f"pgrep -f {processName}")
return res.strip().splitlines()
@@ -80,18 +106,13 @@ def killProcess(name, fp):
def cleanSharedmemory(fp):
Log(LogLevel.INFO, 'Cleaning up shared memory', fp)
try:
p = subprocess.run(['sls_detector_get', 'free'], stdout=fp, stderr=fp, check = False)
except FileNotFoundError:
# Binary not available (e.g. on CI) → ignore
Log(LogLevel.INFO, 'sls_detector_get not found, skipping shared memory cleanup', fp)
p = subprocess.run([build_dir / 'sls_detector_get', 'free'], stdout=fp, stderr=fp)
except Exception as e:
# Any other cleanup failure should NEVER fail tests
Log(LogLevel.WARN, f'Ignoring shared memory cleanup error: {e}', fp)
raise RuntimeException(f'Could not free shared memory: {str(e)}')
def cleanup(fp):
Log(LogLevel.INFO, 'Cleaning up')
Log(LogLevel.INFO, 'Cleaning up', fp)
Log(LogLevel.INFO, 'Cleaning up', fp, True)
killProcess('DetectorServer_virtual', fp)
killProcess('slsReceiver', fp)
killProcess('slsMultiReceiver', fp)
@@ -100,20 +121,15 @@ def cleanup(fp):
cleanSharedmemory(fp)
def startProcessInBackground(cmd, fp):
Log(LogLevel.INFO, 'Starting up ' + ' '.join(cmd))
Log(LogLevel.INFO, 'Starting up ' + ' '.join(cmd), fp)
try:
p = subprocess.Popen(cmd, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, restore_signals=False)
except Exception as e:
raise RuntimeException(f'Failed to start {cmd}:{str(e)}') from e
def startProcessInBackground(cmd, fp, log_file_name: str, quiet_mode=False):
info_text = 'Starting up ' + ' '.join(cmd)
if log_file_name:
info_text += '. Log: ' + log_file_name
Log(LogLevel.INFO, f'{info_text}', fp, True)
def startProcessInBackgroundWithLogFile(cmd, fp, log_file_name: str):
Log(LogLevel.INFOBLUE, 'Starting up ' + ' '.join(cmd) + '. Log: ' + log_file_name)
Log(LogLevel.INFOBLUE, 'Starting up ' + ' '.join(cmd) + '. Log: ' + log_file_name, fp)
try:
with open(log_file_name, 'w') as log_fp:
with optional_file(log_file_name, 'w', quiet_mode) as log_fp:
subprocess.Popen(cmd, stdout=log_fp, stderr=log_fp, text=True)
except Exception as e:
raise RuntimeException(f'Failed to start {cmd}:{str(e)}') from e
@@ -124,8 +140,7 @@ def checkLogForErrors(fp, log_file_path: str):
with open(log_file_path, 'r') as log_file:
for line in log_file:
if 'Error' in line:
Log(LogLevel.ERROR, f"Error found in log: {line.strip()}")
Log(LogLevel.ERROR, f"Error found in log: {line.strip()}", fp)
Log(LogLevel.ERROR, f"Error found in log: {line.strip()}", fp, True)
raise RuntimeException("Error found in log file")
except FileNotFoundError:
print(f"Log file not found: {log_file_path}")
@@ -135,33 +150,92 @@ def checkLogForErrors(fp, log_file_path: str):
raise
def runProcessWithLogFile(name, cmd, fp, log_file_name):
Log(LogLevel.INFOBLUE, 'Running ' + name + '. Log: ' + log_file_name)
Log(LogLevel.INFOBLUE, 'Running ' + name + '. Log: ' + log_file_name, fp)
Log(LogLevel.INFOBLUE, 'Cmd: ' + ' '.join(cmd), fp)
def checkLogForErrorsOrSummary(fp, lines, source_name=""):
failed = False # if it found "failed" or "FAILED" in file
failed_msg = ""
printing_error = False # print every line in file after failure
printing_summary = False # print summary if no failure
for line in lines:
line_stripped = line.rstrip()
# Detect failure (case-insensitive)
if not failed and (": FAILED:" in line or " failed\nassertions" in line):
failed = True
failed_msg = line_stripped
printing_error = True
Log(LogLevel.ERROR, line_stripped, fp)
if source_name:
Log(LogLevel.ERROR, f"Error log from file: {source_name}")
Log(LogLevel.ERROR, "="*79)
# After failure, log everything as ERROR
if printing_error:
print(f"{line_stripped}")
continue
# Summary delimiter
if line_stripped.startswith("====="):
printing_summary = True
# No failure - print summary lines
if printing_summary:
print(f"{line_stripped}")
if failed:
Log(LogLevel.ERROR, "="*79)
raise RuntimeException(f'Test failed: {failed_msg}')
else:
print("="*79)
def runProcess(name, cmd, fp, log_file_name = None, quiet_mode=False):
info_text = 'Running ' + name + '.'
if log_file_name:
info_text += ' Log: ' + log_file_name
Log(LogLevel.INFOBLUE, info_text, fp, True)
Log(LogLevel.INFOBLUE, 'Cmd: ' + ' '.join(cmd), fp, True)
error_log = None
try:
with open(log_file_name, 'w') as log_fp:
subprocess.run(cmd, stdout=log_fp, stderr=log_fp, check=True, text=True)
if log_file_name:
with optional_file(log_file_name, 'w', quiet_mode) as log_fp:
subprocess.run(cmd, stdout=log_fp, stderr=log_fp, check=True, text=True)
else:
capture = subprocess.run(cmd, check=True, text=True, capture_output=True)
captured_log = capture.stdout.splitlines()
except subprocess.CalledProcessError as e:
pass
print("error: ", str(e))
captured_log = e.stdout.splitlines()
pass
except Exception as e:
print("something else failed")
Log(LogLevel.ERROR, f'Failed to run {name}:{str(e)}', fp)
raise RuntimeException(f'Failed to run {name}:{str(e)}')
with open (log_file_name, 'r') as f:
for line in f:
if "FAILED" in line:
raise RuntimeException(f'{line}')
if log_file_name:
with optional_file(log_file_name, 'r') as log_fp:
checkLogForErrorsOrSummary(fp, log_fp, log_file_name)
else:
checkLogForErrorsOrSummary(fp, captured_log)
Log(LogLevel.INFOGREEN, name + ' successful!\n')
Log(LogLevel.INFOGREEN, name + ' successful!\n', fp)
Log(LogLevel.INFOGREEN, name + ' successful!\n', fp, True)
def startDetectorVirtualServer(name :str, num_mods, fp):
def startDetectorVirtualServer(name :str, num_mods, fp, no_log_file = False, quiet_mode=False):
for i in range(num_mods):
port_no = SERVER_START_PORTNO + (i * 2)
cmd = [name + 'DetectorServer_virtual', '-p', str(port_no)]
startProcessInBackgroundWithLogFile(cmd, fp, "/tmp/virtual_det_" + name + "_" + str(i) + ".txt")
cmd = [str(build_dir / (name + 'DetectorServer_virtual')), '-p', str(port_no)]
fname = LOG_PREFIX_FNAME + "virtual_det_" + name + "_" + str(SERVER_START_PORTNO) + ".txt"
if no_log_file:
fname = None
startProcessInBackground(cmd, fp, fname, quiet_mode)
match name:
case 'jungfrau':
time.sleep(7)
@@ -196,20 +270,23 @@ def connectToVirtualServers(name, num_mods, ctb_object=False):
return d
def startReceiver(num_mods, fp):
def startReceiver(num_mods, fp, no_log_file = False, quiet_mode=False):
if num_mods == 1:
cmd = ['slsReceiver']
cmd = [str(build_dir / 'slsReceiver')]
fname = LOG_PREFIX_FNAME + "slsReceiver.txt"
else:
cmd = ['slsMultiReceiver', str(DEFAULT_TCP_RX_PORTNO), str(num_mods)]
cmd = [str(build_dir / 'slsMultiReceiver'), str(DEFAULT_TCP_RX_PORTNO), str(num_mods)]
fname = LOG_PREFIX_FNAME + "slsMultiReceiver.txt"
# in 10.0.0
#cmd = ['slsMultiReceiver', '-p', str(DEFAULT_TCP_RX_PORTNO), '-n', str(num_mods)]
startProcessInBackground(cmd, fp)
if no_log_file:
fname = None
startProcessInBackground(cmd, fp, fname, quiet_mode)
time.sleep(1)
def loadConfig(name, rx_hostname = 'localhost', settingsdir = None, log_file_fp = None, num_mods = 1, num_frames = 1, num_interfaces = 1):
Log(LogLevel.INFO, 'Loading config')
Log(LogLevel.INFO, 'Loading config', log_file_fp)
Log(LogLevel.INFO, 'Loading config', log_file_fp, True)
try:
d = connectToVirtualServers(name, num_mods)
@@ -217,17 +294,15 @@ def loadConfig(name, rx_hostname = 'localhost', settingsdir = None, log_file_fp
d.numinterfaces = num_interfaces
d.udp_dstport = DEFAULT_UDP_DST_PORTNO
if name == 'eiger' or name == 'jungfrau' or name == 'moench':
if name == 'eiger' or num_interfaces == 2:
d.udp_dstport2 = DEFAULT_UDP_DST_PORTNO + 1
d.rx_hostname = rx_hostname
d.udp_dstip = 'auto'
if name != "eiger":
d.udp_srcip = 'auto'
if name == "jungfrau" or name == "moench":
if num_interfaces == 2:
d.udp_dstip2 = 'auto'
if name == "jungfrau" or name == "moench" or name == "xilinx_ctb":
@@ -241,6 +316,7 @@ def loadConfig(name, rx_hostname = 'localhost', settingsdir = None, log_file_fp
d.trimen = [4500, 5400, 6400] if name == 'eiger' else [4000, 6000, 8000, 12000]
d.setThresholdEnergy(4500, detectorSettings.STANDARD)
d.frames = num_frames
except Exception as e:
@@ -248,10 +324,11 @@ def loadConfig(name, rx_hostname = 'localhost', settingsdir = None, log_file_fp
return d
# for easy acquire
def loadBasicSettings(name, d, fp):
Log(LogLevel.INFO, 'Loading basic settings for ' + name)
Log(LogLevel.INFO, 'Loading basic settings for ' + name, fp)
Log(LogLevel.INFO, 'Loading basic settings for ' + name, fp, True)
try:
# basic settings for easy acquire
if name == "jungfrau":
@@ -278,7 +355,7 @@ def loadBasicSettings(name, d, fp):
except Exception as e:
raise RuntimeException(f'Could not load config for {name}. Error: {str(e)}') from e
def ParseArguments(description, default_num_mods=1, markers=False, general_tests_option=False):
def ParseArguments(description, default_num_mods=2, specific_tests=False, general_tests_option=False):
parser = argparse.ArgumentParser(description)
default_settings_path = Path(__file__).resolve().parents[2] / "settingsdir"
@@ -293,16 +370,21 @@ def ParseArguments(description, default_num_mods=1, markers=False, general_tests
help='Number of frames to test with')
parser.add_argument('-s', '--servers', nargs='*',
help='Detector servers to run')
if markers:
parser.add_argument('-m', '--markers', nargs='?', default ='[.cmdcall]',
help = 'Markers to use for cmd tests, default: [.cmdcall]')
parser.add_argument('-nlf', '--no-log-file', action='store_true',
help='Dont write output to log file')
parser.add_argument('-q', '--quiet', action='store_true',
help='Dont write to stdout when possible.')
if specific_tests:
parser.add_argument('-t', '--tests', nargs='?', default ='[.detectorintegration]',
help = 'Test markers or specific test name to use for tests, default: [.detectorintegration]')
if general_tests_option:
parser.add_argument('-g', '--general_tests', action='store_true',
parser.add_argument('-g', '--general-tests', action='store_true',
help = 'Enable general tests (no value needed)')
args = parser.parse_args()
# Set default server list if not provided
if args.servers is None:
args.servers = [
@@ -323,8 +405,19 @@ def ParseArguments(description, default_num_mods=1, markers=False, general_tests
f"num_mods: '{args.num_mods}'\n"
f"num_frames: '{args.num_frames}'"
)
if markers:
msg += f"\nmarkers: '{args.markers}'"
if args.no_log_file:
msg += f"\nLog File: Disabled"
else:
msg += f"\nLog File: Enabled"
if args.quiet:
msg += f"\nQuiet mode: Enabled"
else:
msg += f"\nQuiet mode: Disabled"
if specific_tests:
msg += f"\ntests: '{args.tests}'"
if general_tests_option:
msg += f"\ngeneral_tests: '{args.general_tests}'"