Add job processing system with streaming endpoint

Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
GotthardG 2025-04-10 20:33:00 +02:00
parent 401f1e889a
commit e4740ec0b5
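The streaming flow the commit message describes can be sketched roughly as follows. This is an illustrative assumption, not the repository's actual code: the field names on `Job`, the `JobStatus` values, and the `format_sse`/`job_event_stream` helpers are all hypothetical stand-ins for the `Jobs`/`JobStatus` models and the `processing` router's server-sent-events endpoint.

```python
import json
from dataclasses import dataclass
from enum import Enum


class JobStatus(str, Enum):
    """Hypothetical lifecycle states for a job (illustrative, not the repo's enum)."""
    TODO = "todo"
    IN_PROGRESS = "in_progress"
    DONE = "done"


@dataclass
class Job:
    """Minimal stand-in for the `Jobs` model; field names are assumptions."""
    id: int
    sample_id: int
    status: JobStatus = JobStatus.TODO


def format_sse(event: str, data: dict) -> str:
    """Serialize one server-sent event: an `event:` line, a `data:` line,
    and a blank line terminating the event (per the SSE wire format)."""
    return f"event: {event}\ndata: {json.dumps(data)}\n\n"


def job_event_stream(jobs):
    """Generator yielding one SSE message per job; a streaming endpoint
    (e.g. a FastAPI StreamingResponse) would iterate over this."""
    for job in jobs:
        yield format_sse("job_update", {"id": job.id, "status": job.status.value})


# Example: two jobs rendered as SSE messages
events = list(job_event_stream([Job(1, 10), Job(2, 11, JobStatus.DONE)]))
```

In a real endpoint, the generator would poll or subscribe to job-state changes rather than iterate a fixed list, and the response would be served with the `text/event-stream` media type so clients can consume it with `EventSource`.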


@@ -60,12 +60,16 @@ release:
   TWINE_USERNAME: gitlab-ci-token
   TWINE_PASSWORD: $CI_JOB_TOKEN
   ENVIRONMENT: test
-  before_script:
+  before_script: # common setup for every job
     - python3.12 -m venv $VIRTUAL_ENV
     - source $VIRTUAL_ENV/bin/activate
     - pip install --upgrade pip
+    # Explicit clean-up commands
     - find "$CI_PROJECT_DIR" -name '__pycache__' -type d -exec rm -rf {} + || true
     - find "$CI_PROJECT_DIR" -name '*.pyc' -type f -delete || true
+    # Fix permissions (if necessary)
+    - chmod -R u+w "$CI_PROJECT_DIR"
   script:
     # build and run commands within docker container context
     - docker compose build backend