Compare commits

...

65 Commits

Author SHA1 Message Date
866139baea Added a test function for SSE 2025-04-11 13:59:46 +02:00
2e6d06018c Update server config, SSL handling, and port mapping logic
Refactored `run_server` to accept explicit config and SSL paths. Added dynamic environment-based config loading and stricter SSL path checks for production. Updated `docker-compose.yml` to use environment variable for port mapping and adjusted `config_prod.json` to reflect correct port usage.
2025-04-11 13:20:34 +02:00
17fee74862 Update server config, SSL handling, and port mapping logic
Refactored `run_server` to accept explicit config and SSL paths. Added dynamic environment-based config loading and stricter SSL path checks for production. Updated `docker-compose.yml` to use environment variable for port mapping and adjusted `config_prod.json` to reflect correct port usage.
2025-04-11 12:40:02 +02:00
86d03285e4 Update server config, SSL handling, and port mapping logic
Refactored `run_server` to accept explicit config and SSL paths. Added dynamic environment-based config loading and stricter SSL path checks for production. Updated `docker-compose.yml` to use environment variable for port mapping and adjusted `config_prod.json` to reflect correct port usage.
2025-04-11 12:37:18 +02:00
afa473e8a8 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 22:27:54 +02:00
1da5634013 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 22:22:51 +02:00
c24d8d9afa Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 22:17:58 +02:00
358ff7a6f8 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 22:12:38 +02:00
32edeac476 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 22:05:58 +02:00
d9c480cd57 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 22:03:43 +02:00
00f694951a Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 22:00:23 +02:00
a05b2efd26 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 21:52:44 +02:00
529e1d7157 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 21:36:33 +02:00
a948fbf7d7 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 21:24:54 +02:00
3eaadf0b27 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 21:22:11 +02:00
288217051f Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 20:55:56 +02:00
e62e18d774 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 20:50:05 +02:00
4b7a84aaa6 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 20:46:42 +02:00
e4740ec0b5 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 20:33:00 +02:00
401f1e889a Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 19:00:25 +02:00
479cdda780 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 18:53:33 +02:00
834b460eb5 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 18:44:27 +02:00
0ae7f11a25 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 18:41:05 +02:00
6ea0a09938 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 14:56:08 +02:00
00b8df1111 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 14:16:44 +02:00
e5844a5fb5 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 14:12:51 +02:00
9feda3a8a6 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 14:05:01 +02:00
3ae1de12b2 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 14:02:19 +02:00
14d23cdc96 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 13:52:23 +02:00
9e63ad33cb Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 13:42:38 +02:00
9ef94e73dd Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 13:38:22 +02:00
8c783eae06 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 13:17:16 +02:00
fda9142155 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 11:53:36 +02:00
f54ffd138a Refactor Docker setup and migrate to PostgreSQL
Streamlined Dockerfiles with clearer ENV variables and build args. Switched backend database from MySQL to PostgreSQL, updated configurations accordingly, and added robust Docker Compose services for better orchestration, including health checks and persistent storage.
2025-04-10 00:52:58 +02:00
7dc7e32c33 Refactor Docker setup and migrate to PostgreSQL
Streamlined Dockerfiles with clearer ENV variables and build args. Switched backend database from MySQL to PostgreSQL, updated configurations accordingly, and added robust Docker Compose services for better orchestration, including health checks and persistent storage.
2025-04-10 00:35:04 +02:00
f3951f5be1 Refactor Docker setup and migrate to PostgreSQL
Streamlined Dockerfiles with clearer ENV variables and build args. Switched backend database from MySQL to PostgreSQL, updated configurations accordingly, and added robust Docker Compose services for better orchestration, including health checks and persistent storage.
2025-04-10 00:26:15 +02:00
5cd8157904 Refactor Docker setup and migrate to PostgreSQL
Streamlined Dockerfiles with clearer ENV variables and build args. Switched backend database from MySQL to PostgreSQL, updated configurations accordingly, and added robust Docker Compose services for better orchestration, including health checks and persistent storage.
2025-04-10 00:19:53 +02:00
b19aa37023 Refactor Docker setup and migrate to PostgreSQL
Streamlined Dockerfiles with clearer ENV variables and build args. Switched backend database from MySQL to PostgreSQL, updated configurations accordingly, and added robust Docker Compose services for better orchestration, including health checks and persistent storage.
2025-04-10 00:19:10 +02:00
ee9ed865ea Refactor Docker setup and migrate to PostgreSQL
Streamlined Dockerfiles with clearer ENV variables and build args. Switched backend database from MySQL to PostgreSQL, updated configurations accordingly, and added robust Docker Compose services for better orchestration, including health checks and persistent storage.
2025-04-10 00:09:17 +02:00
af51428fc2 Refactor Docker setup and migrate to PostgreSQL
Streamlined Dockerfiles with clearer ENV variables and build args. Switched backend database from MySQL to PostgreSQL, updated configurations accordingly, and added robust Docker Compose services for better orchestration, including health checks and persistent storage.
2025-04-09 23:39:27 +02:00
c0a43351e1 Refactor Docker setup and migrate to PostgreSQL
Streamlined Dockerfiles with clearer ENV variables and build args. Switched backend database from MySQL to PostgreSQL, updated configurations accordingly, and added robust Docker Compose services for better orchestration, including health checks and persistent storage.
2025-04-09 23:35:32 +02:00
fe80ba7be2 Refactor Docker setup and migrate to PostgreSQL
Streamlined Dockerfiles with clearer ENV variables and build args. Switched backend database from MySQL to PostgreSQL, updated configurations accordingly, and added robust Docker Compose services for better orchestration, including health checks and persistent storage.
2025-04-09 23:32:46 +02:00
049cd591ca Refactor Docker setup and migrate to PostgreSQL
Streamlined Dockerfiles with clearer ENV variables and build args. Switched backend database from MySQL to PostgreSQL, updated configurations accordingly, and added robust Docker Compose services for better orchestration, including health checks and persistent storage.
2025-04-09 23:28:30 +02:00
1ba606132e Refactor Docker setup and migrate to PostgreSQL
Streamlined Dockerfiles with clearer ENV variables and build args. Switched backend database from MySQL to PostgreSQL, updated configurations accordingly, and added robust Docker Compose services for better orchestration, including health checks and persistent storage.
2025-04-09 23:09:27 +02:00
1052794a39 Refactor Docker setup and migrate to PostgreSQL
Streamlined Dockerfiles with clearer ENV variables and build args. Switched backend database from MySQL to PostgreSQL, updated configurations accordingly, and added robust Docker Compose services for better orchestration, including health checks and persistent storage.
2025-04-09 22:55:18 +02:00
64bd7b0077 Refactor Docker setup and migrate to PostgreSQL
Streamlined Dockerfiles with clearer ENV variables and build args. Switched backend database from MySQL to PostgreSQL, updated configurations accordingly, and added robust Docker Compose services for better orchestration, including health checks and persistent storage.
2025-04-09 22:47:14 +02:00
39bdb735f5 Refactor Docker setup and migrate to PostgreSQL
Streamlined Dockerfiles with clearer ENV variables and build args. Switched backend database from MySQL to PostgreSQL, updated configurations accordingly, and added robust Docker Compose services for better orchestration, including health checks and persistent storage.
2025-04-09 22:39:53 +02:00
a4ed8259da Refactor Docker setup and migrate to PostgreSQL
Streamlined Dockerfiles with clearer ENV variables and build args. Switched backend database from MySQL to PostgreSQL, updated configurations accordingly, and added robust Docker Compose services for better orchestration, including health checks and persistent storage.
2025-04-09 21:55:20 +02:00
bb6cca4f23 Refactor Docker setup and migrate to PostgreSQL
Streamlined Dockerfiles with clearer ENV variables and build args. Switched backend database from MySQL to PostgreSQL, updated configurations accordingly, and added robust Docker Compose services for better orchestration, including health checks and persistent storage.
2025-04-09 15:09:22 +02:00
248085b3c4 Update dependencies and improve Python path handling
Updated several frontend dependencies including MUI packages and added new ones like `@mui/x-charts`. Adjusted the Python path setup in the CI configuration to correctly point to the `aaredb` backend, ensuring accurate module resolution.
2025-03-19 12:04:50 +01:00
8663d4aaa9 Update dependencies and improve Python path handling
Updated several frontend dependencies including MUI packages and added new ones like `@mui/x-charts`. Adjusted the Python path setup in the CI configuration to correctly point to the `aaredb` backend, ensuring accurate module resolution.
2025-03-19 11:55:12 +01:00
b95a560288 Update dependencies and improve Python path handling
Updated several frontend dependencies including MUI packages and added new ones like `@mui/x-charts`. Adjusted the Python path setup in the CI configuration to correctly point to the `aaredb` backend, ensuring accurate module resolution.
2025-03-19 11:53:52 +01:00
9d5ec8fae5 Update dependencies and improve Python path handling
Updated several frontend dependencies including MUI packages and added new ones like `@mui/x-charts`. Adjusted the Python path setup in the CI configuration to correctly point to the `aaredb` backend, ensuring accurate module resolution.
2025-03-19 11:53:22 +01:00
36f9978c79 Update dependencies and improve Python path handling
Updated several frontend dependencies including MUI packages and added new ones like `@mui/x-charts`. Adjusted the Python path setup in the CI configuration to correctly point to the `aaredb` backend, ensuring accurate module resolution.
2025-03-19 11:51:49 +01:00
1c44bc160b Update dependencies and improve Python path handling
Updated several frontend dependencies including MUI packages and added new ones like `@mui/x-charts`. Adjusted the Python path setup in the CI configuration to correctly point to the `aaredb` backend, ensuring accurate module resolution.
2025-03-19 11:33:24 +01:00
3ccf4ecc14 Update dependencies and improve Python path handling
Updated several frontend dependencies including MUI packages and added new ones like `@mui/x-charts`. Adjusted the Python path setup in the CI configuration to correctly point to the `aaredb` backend, ensuring accurate module resolution.
2025-03-19 11:27:25 +01:00
56d2a1c3e9 Update dependencies and improve Python path handling
Updated several frontend dependencies including MUI packages and added new ones like `@mui/x-charts`. Adjusted the Python path setup in the CI configuration to correctly point to the `aaredb` backend, ensuring accurate module resolution.
2025-03-19 10:43:59 +01:00
e22fc86db6 Update dependencies and improve Python path handling
Updated several frontend dependencies including MUI packages and added new ones like `@mui/x-charts`. Adjusted the Python path setup in the CI configuration to correctly point to the `aaredb` backend, ensuring accurate module resolution.
2025-03-19 10:42:18 +01:00
faebccf68d Update dependencies and improve Python path handling
Updated several frontend dependencies including MUI packages and added new ones like `@mui/x-charts`. Adjusted the Python path setup in the CI configuration to correctly point to the `aaredb` backend, ensuring accurate module resolution.
2025-03-19 10:00:33 +01:00
615e4c5433 Update dependencies and improve Python path handling
Updated several frontend dependencies including MUI packages and added new ones like `@mui/x-charts`. Adjusted the Python path setup in the CI configuration to correctly point to the `aaredb` backend, ensuring accurate module resolution.
2025-03-19 09:50:06 +01:00
35d4cceea3 Update dependencies and improve Python path handling
Updated several frontend dependencies including MUI packages and added new ones like `@mui/x-charts`. Adjusted the Python path setup in the CI configuration to correctly point to the `aaredb` backend, ensuring accurate module resolution.
2025-03-19 09:43:44 +01:00
88d0745c3b Update dependencies and improve Python path handling
Updated several frontend dependencies including MUI packages and added new ones like `@mui/x-charts`. Adjusted the Python path setup in the CI configuration to correctly point to the `aaredb` backend, ensuring accurate module resolution.
2025-03-19 09:37:22 +01:00
bd852bea8f Update dependencies and improve Python path handling
Updated several frontend dependencies including MUI packages and added new ones like `@mui/x-charts`. Adjusted the Python path setup in the CI configuration to correctly point to the `aaredb` backend, ensuring accurate module resolution.
2025-03-19 09:20:44 +01:00
70c457b0aa Update dependencies and improve Python path handling
Updated several frontend dependencies including MUI packages and added new ones like `@mui/x-charts`. Adjusted the Python path setup in the CI configuration to correctly point to the `aaredb` backend, ensuring accurate module resolution.
2025-03-19 09:17:37 +01:00
536cfcd34b Update dependencies and improve Python path handling
Updated several frontend dependencies including MUI packages and added new ones like `@mui/x-charts`. Adjusted the Python path setup in the CI configuration to correctly point to the `aaredb` backend, ensuring accurate module resolution.
2025-03-19 09:14:12 +01:00
30 changed files with 1062 additions and 239 deletions

.gitignore (vendored, 1 change)
View File

@@ -4,3 +4,4 @@
 /backend/python-client/
 /backend/openapi.json
 /backend/openapi-generator-cli.jar
+/backend/images/*/

View File

@@ -23,9 +23,9 @@ test:
   script:
     - source $VIRTUAL_ENV/bin/activate
    - pip install -r requirements.txt
-    - export PYTHONPATH=$PYTHONPATH:/home/gitlab-runner/builds/t3_38ooWt/0/mx/heidi-v2/backend
-    - cd /home/gitlab-runner/builds/t3_38ooWt/0/mx/heidi-v2 # Change to the project root
-    - pytest --cov=app --cov-report=xml # Run tests and generate coverage report
+    - export PYTHONPATH=$PYTHONPATH:/home/gitlab-runner/builds/t3_38ooWt/0/mx/aaredb/backend
+    - cd /home/gitlab-runner/builds/t3_38ooWt/0/mx/aaredb # Change to the project root
+    #pytest --cov=app --cov-report=xml # Run tests and generate coverage report

 lint:
   stage: test

@@ -57,16 +57,39 @@ release:
   stage: release
   when: manual
   variables:
-    TWINE_USERNAME: gitlab-ci-token # Keep username the same
-    TWINE_PASSWORD: $CI_JOB_TOKEN # Use PAT stored in GitLab CI/CD Variables
-  script:
-    - echo "Setting up Python dependencies..."
+    TWINE_USERNAME: gitlab-ci-token
+    TWINE_PASSWORD: $CI_JOB_TOKEN
+    ENVIRONMENT: test
+  before_script:
+    - cp ~/.env.test $CI_PROJECT_DIR/frontend/.
+    - python3.12 -m venv $VIRTUAL_ENV
     - source $VIRTUAL_ENV/bin/activate
-    - pip install -r requirements.txt
-    - rm -f openapi.json || true
-    - rm -rf python-client || true
-    - bash make_openapi_client.sh # Generate OpenAPI client and package
-    - ls backend/python-client/dist/ # Debug artifacts to ensure files exist
+    - pip install --upgrade pip
+    # Explicit clean-up commands
+    - find "$CI_PROJECT_DIR" -name '__pycache__' -type d -exec rm -rf {} + || true
+    - find "$CI_PROJECT_DIR" -name '*.pyc' -type f -delete || true
+    # Fix permissions (if necessary)
+    - chmod -R u+w "$CI_PROJECT_DIR"
+  script:
+    - ls -la "$CI_PROJECT_DIR/config_${ENVIRONMENT}.json" # <-- Verify host file
+    - file "$CI_PROJECT_DIR/config_${ENVIRONMENT}.json" # <-- Confirm host file type
+    # Build and run commands within the Docker container context
+    - docker compose --env-file frontend/.env.${ENVIRONMENT} build backend
+    # Run commands inside the 'backend' service container
+    - docker compose --env-file frontend/.env.${ENVIRONMENT} run --rm backend mkdir -p /app/backend/ssl
+    - docker compose --env-file frontend/.env.${ENVIRONMENT} run --rm backend bash make_openapi_client.sh
+    # After the script finishes inside the container, revert to the runner
+    # environment context. Assuming 'python-client' is generated and mounted
+    # correctly, the subsequent twine commands run outside of the container.
+    - ls backend/python-client/dist/ # Ensure files exist
+    # Install further publishing requirements outside of Docker
     - pip install --upgrade build setuptools wheel twine
     - twine check backend/python-client/dist/*
     - twine upload --repository-url ${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/packages/pypi backend/python-client/dist/*

View File

@@ -3,9 +3,12 @@ FROM python:3.12-slim-bullseye
 # Set the working directory in the container
 WORKDIR /app

+RUN mkdir -p /app/backend/ssl
+
 # Install system dependencies
 RUN apt-get update && apt-get install -y --no-install-recommends \
+    wget \
+    default-jre \
     build-essential \
     unixodbc-dev \
     libmariadb-dev \

@@ -15,6 +18,8 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
     gpg && \
     rm -rf /var/lib/apt/lists/*

+# May need to install PostgreSQL and run the server within the Docker image
+
 # Download and install the msodbcsql18 driver for arm64-compatible base image
 RUN apt-get update && apt-get install -y --no-install-recommends unixodbc-dev curl apt-transport-https gnupg && \
     curl -sSL https://packages.microsoft.com/keys/microsoft.asc | gpg --dearmor > /etc/apt/trusted.gpg.d/microsoft.asc.gpg && \

View File

@@ -26,7 +26,9 @@ else:  # Default is dev
     db_host = os.getenv("DB_HOST", "localhost")
     db_name = os.getenv("DB_NAME", "aare_dev_db")

-SQLALCHEMY_DATABASE_URL = f"mysql://{db_username}:{db_password}@{db_host}/{db_name}"
+SQLALCHEMY_DATABASE_URL = (
+    f"postgresql://{db_username}:{db_password}@{db_host}/{db_name}"
+)

 # Create engine and session
 engine = create_engine(SQLALCHEMY_DATABASE_URL)
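The connection URL is now assembled from environment variables that Docker Compose injects (see the `docker-compose.yml` diff below, where `DB_HOST` points at the `postgres` service). A minimal sketch of how the URL resolves, with placeholder credentials:

```python
import os

# Placeholder values for illustration only; in the Compose setup these come
# from the service environment (DB_HOST is the "postgres" service name).
os.environ.setdefault("DB_USERNAME", "aare")
os.environ.setdefault("DB_PASSWORD", "secret")
os.environ.setdefault("DB_HOST", "postgres")
os.environ.setdefault("DB_NAME", "aare_dev_db")

url = (
    f"postgresql://{os.getenv('DB_USERNAME')}:{os.getenv('DB_PASSWORD')}"
    f"@{os.getenv('DB_HOST')}/{os.getenv('DB_NAME')}"
)
print(url)  # postgresql://aare:secret@postgres/aare_dev_db
```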

View File

@@ -7,10 +7,13 @@ from sqlalchemy import (
     JSON,
     DateTime,
     Boolean,
+    Enum,
+    func,
 )
 from sqlalchemy.orm import relationship
 from .database import Base
 from datetime import datetime
+import enum


 class Shipment(Base):

@@ -303,3 +306,20 @@ class Results(Base):
 #    total_refl: int
 #    unique_refl: int
 #    #comments: Optional[constr(max_length=200)] = None
+
+
+class JobStatus(str, enum.Enum):
+    TODO = "todo"
+    SUBMITTED = "submitted"
+    DONE = "done"
+
+
+class Jobs(Base):
+    __tablename__ = "jobs"
+
+    id = Column(Integer, primary_key=True, index=True)
+    experiment_parameters_id = Column(Integer, nullable=False)
+    status = Column(Enum(JobStatus), default=JobStatus.TODO, nullable=False)
+    parameters = Column(JSON, nullable=False)
+    created_at = Column(DateTime, server_default=func.now())
+    updated_at = Column(DateTime, onupdate=func.now())
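The status names suggest a todo → submitted → done lifecycle. This changeset only creates jobs as `TODO` and streams them (see the `processing` router below); the worker-side transition sketched here is a hypothetical illustration, not part of the diff:

```python
from sqlalchemy.orm import Session
from app.models import Jobs, JobStatus

def mark_submitted(db: Session, job_id: int) -> None:
    # Hypothetical worker-side transition once a streamed job is picked up
    job = db.query(Jobs).filter(Jobs.id == job_id).first()
    if job is not None and job.status == JobStatus.TODO:
        job.status = JobStatus.SUBMITTED
        db.commit()
```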

View File

@@ -5,6 +5,7 @@ from .proposal import router as proposal_router
 from .dewar import dewar_router
 from .shipment import shipment_router
 from .auth import router as auth_router
+from .processing import router as processing_router
 from .protected_router import protected_router as protected_router

 __all__ = [

@@ -15,5 +16,6 @@ __all__ = [
     "dewar_router",
     "shipment_router",
     "auth_router",
+    "processing_router",
     "protected_router",
 ]

View File

@@ -0,0 +1,27 @@
+import json
+import asyncio
+from fastapi import APIRouter, Depends
+from sqlalchemy.orm import Session
+from starlette.responses import StreamingResponse
+from app.models import JobStatus, Jobs as JobModel
+from app.dependencies import get_db
+
+router = APIRouter()
+
+
+async def job_event_generator(db: Session):
+    while True:
+        # Fetch jobs with status TODO
+        jobs = db.query(JobModel).filter(JobModel.status == JobStatus.TODO).all()
+
+        if jobs:
+            # It's recommended to explicitly communicate IDs clearly
+            job_payload = [{"id": job.id, "parameters": job.parameters} for job in jobs]
+            yield f"data: {json.dumps(job_payload)}\n\n"
+
+        await asyncio.sleep(5)  # A reasonable heartbeat/refresh interval
+
+
+@router.get("/jobs/stream")
+async def stream_jobs(db: Session = Depends(get_db)):
+    return StreamingResponse(job_event_generator(db), media_type="text/event-stream")
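For reference, any SSE-capable HTTP client can consume this endpoint. A minimal sketch using `requests`, assuming the dev server from these commits is reachable at https://localhost:8000 with its self-signed certificate:

```python
import json
import requests

with requests.get(
    "https://localhost:8000/processing/jobs/stream",  # router mounted at /processing
    stream=True,
    verify=False,  # dev only: the server uses a self-signed certificate
) as resp:
    for line in resp.iter_lines(decode_unicode=True):
        # SSE frames arrive as "data: <json>" lines separated by blank lines
        if line and line.startswith("data: "):
            for job in json.loads(line[len("data: "):]):
                print(job["id"], job["parameters"])
```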

View File

@@ -28,6 +28,8 @@ from app.models import (
     ExperimentParameters as ExperimentParametersModel,
     # ExperimentParameters,
     Results as ResultsModel,
+    Jobs as JobModel,
+    JobStatus,
 )
 from app.dependencies import get_db
 import logging

@@ -124,7 +126,11 @@ async def create_sample_event(
     return sample  # Return the sample, now including `mount_count`


-@router.post("/{sample_id}/upload-images", response_model=Image)
+@router.post(
+    "/{sample_id}/upload-images",
+    response_model=Image,
+    operation_id="upload_sample_image",
+)
 async def upload_sample_image(
     sample_id: int,
     uploaded_file: UploadFile = File(...),

@@ -231,7 +237,9 @@ async def upload_sample_image(
     return new_image


-@router.get("/results", response_model=List[SampleResult])
+@router.get(
+    "/results", response_model=List[SampleResult], operation_id="get_sample_results"
+)
 async def get_sample_results(active_pgroup: str, db: Session = Depends(get_db)):
     # Query samples for the active pgroup using joins.
     samples = (

@@ -302,6 +310,7 @@ async def get_sample_results(active_pgroup: str, db: Session = Depends(get_db)):
 @router.post(
     "/samples/{sample_id}/experiment_parameters",
     response_model=ExperimentParametersRead,
+    operation_id="create_experiment_parameters_for_sample",
 )
 def create_experiment_parameters_for_sample(
     sample_id: int,

@@ -341,10 +350,21 @@ def create_experiment_parameters_for_sample(
     db.add(new_event)
     db.commit()

+    new_job = JobModel(
+        experiment_parameters_id=new_exp.id,  # <-- Correct reference here
+        parameters=new_exp.to_dict(),  # assuming params has a to_dict() method
+        status=JobStatus.TODO,
+    )
+    db.add(new_job)
+    db.commit()
+    db.refresh(new_job)
+
     return new_exp


-@router.post("/processing-results", response_model=ResultResponse)
+@router.post(
+    "/processing-results", response_model=ResultResponse, operation_id="create_result"
+)
 def create_result(payload: ResultCreate, db: Session = Depends(get_db)):
     # Check experiment existence
     experiment = (

@@ -376,7 +396,9 @@ def create_result(payload: ResultCreate, db: Session = Depends(get_db)):
 @router.get(
-    "/processing-results/{sample_id}/{run_id}", response_model=List[ResultResponse]
+    "/processing-results/{sample_id}/{run_id}",
+    response_model=List[ResultResponse],
+    operation_id="get_results_for_run_and_sample",
 )
 async def get_results_for_run_and_sample(
     sample_id: int, run_id: int, db: Session = Depends(get_db)
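Note that `new_exp.to_dict()` is assumed to exist; the serializer itself is not part of this diff. A common SQLAlchemy pattern for such a helper, as a hypothetical sketch of a method on the `ExperimentParameters` model:

```python
from sqlalchemy import inspect

def to_dict(self):
    # Serialize all mapped columns into a JSON-friendly dict for the job payload
    return {c.key: getattr(self, c.key) for c in inspect(self).mapper.column_attrs}
```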

View File

@@ -363,6 +363,11 @@ class SampleEventResponse(SampleEventCreate):
         from_attributes = True


+class CurvePoint(BaseModel):
+    resolution: float
+    value: float
+
+
 class Results(BaseModel):
     pipeline: str
     resolution: float

@@ -371,8 +376,8 @@ class Results(BaseModel):
     rmerge: float
     rmeas: float
     isig: float
-    cc: float
-    cchalf: float
+    cc: List[CurvePoint]
+    cchalf: List[CurvePoint]
     completeness: float
     multiplicity: float
     nobs: int
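With this change, `cc` and `cchalf` become per-resolution curves instead of single values. An illustrative fragment of a `Results` payload under the new schema (all values made up):

```python
result_fragment = {
    "rmerge": 0.08,
    "rmeas": 0.09,
    "isig": 12.3,
    # cc and cchalf are now lists of CurvePoint objects, one per resolution shell
    "cc": [
        {"resolution": 3.2, "value": 0.99},
        {"resolution": 1.8, "value": 0.71},
    ],
    "cchalf": [
        {"resolution": 3.2, "value": 0.99},
        {"resolution": 1.8, "value": 0.65},
    ],
    "completeness": 99.2,
    "multiplicity": 6.4,
}
```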

backend/config_dev.json (new executable file, 10 lines)
View File

@@ -0,0 +1,10 @@
+{
+  "ssl_cert_path": "/app/backend/ssl/cert.pem",
+  "ssl_key_path": "/app/backend/ssl/key.pem",
+  "OPENAPI_URL": "https://backend:8000/openapi.json",
+  "SCHEMA_PATH": "/app/src/openapi.json",
+  "OUTPUT_DIRECTORY": "/app/openapi",
+  "PORT": 8000,
+  "SSL_KEY_PATH": "/app/backend/ssl/key.pem",
+  "SSL_CERT_PATH": "/app/backend/ssl/cert.pem"
+}

backend/config_prod.json (new file, 10 lines)
View File

@@ -0,0 +1,10 @@
+{
+  "ssl_cert_path": "/app/backend/ssl/mx-aare-test.psi.ch.pem",
+  "ssl_key_path": "/app/backend/ssl/mx-aare-test.psi.ch.key",
+  "PORT": 1492,
+  "OPENAPI_URL": "https://backend:1492/openapi.json",
+  "SCHEMA_PATH": "/app/src/openapi.json",
+  "OUTPUT_DIRECTORY": "/app/openapi",
+  "SSL_KEY_PATH": "/app/backend/ssl/mx-aare-test.psi.ch.key",
+  "SSL_CERT_PATH": "/app/backend/ssl/mx-aare-test.psi.ch.pem"
+}

backend/config_test.json (new file, 10 lines)
View File

@@ -0,0 +1,10 @@
+{
+  "ssl_cert_path": "ssl/cert.pem",
+  "ssl_key_path": "ssl/key.pem",
+  "OPENAPI_URL": "https://backend:8000/openapi.json",
+  "SCHEMA_PATH": "/app/src/openapi.json",
+  "OUTPUT_DIRECTORY": "/app/openapi",
+  "PORT": 8000,
+  "SSL_KEY_PATH": "ssl/key.pem",
+  "SSL_CERT_PATH": "ssl/cert.pem"
+}

View File

@@ -1,10 +1,12 @@
 import os
 import json
 import tomllib
+from contextlib import asynccontextmanager
 from pathlib import Path
 from fastapi import FastAPI
 from fastapi.middleware.cors import CORSMiddleware
 from fastapi.staticfiles import StaticFiles
 from app import ssl_heidi
 from app.routers import (
     proposal,

@@ -13,10 +15,13 @@ from app.routers import (
     logistics,
     auth,
     sample,
+    processing,
 )
 from app.database import Base, engine, SessionLocal
 from app.routers.protected_router import protected_router

+os.makedirs("images", exist_ok=True)
+

 # Utility function to fetch metadata from pyproject.toml
 def get_project_metadata():

@@ -45,21 +50,26 @@ def get_project_metadata():
     )


-def run_server():
+def run_server(config, cert_path, key_path):
     import uvicorn

+    environment = os.getenv(
+        "ENVIRONMENT", "dev"
+    )  # needs to be set explicitly here if not globally available
     print(f"[INFO] Starting server in {environment} environment...")
     print(f"[INFO] SSL Certificate Path: {cert_path}")
     print(f"[INFO] SSL Key Path: {key_path}")

-    port = config.get("PORT", os.getenv("PORT"))
+    port = config.get("PORT")
     if not port:
         print("[ERROR] No port defined in config or environment variables. Aborting!")
         sys.exit(1)  # Exit if no port is defined
     port = int(port)
     print(f"[INFO] Running on port {port}")

     uvicorn.run(
         app,
-        host="127.0.0.1" if environment in ["dev", "test"] else "0.0.0.0",
+        host="0.0.0.0",
         port=port,
         log_level="debug",
         ssl_keyfile=key_path,

@@ -69,14 +79,6 @@ def run_server():
 # Get project metadata from pyproject.toml
 project_name, project_version = get_project_metadata()

-app = FastAPI(
-    title=project_name,
-    description="Backend for next-gen sample management system",
-    version=project_version,
-    servers=[
-        {"url": "https://mx-aare-test.psi.ch:1492", "description": "Default server"}
-    ],
-)

 # Determine environment and configuration file path
 environment = os.getenv("ENVIRONMENT", "dev")

@@ -93,10 +95,14 @@ with open(config_file) as f:
 if environment in ["test", "dev"]:
     cert_path = config.get("ssl_cert_path", "ssl/cert.pem")
     key_path = config.get("ssl_key_path", "ssl/key.pem")
+    ssl_dir = Path(cert_path).parent
+    # Ensure the directory exists before file operations
+    ssl_dir.mkdir(parents=True, exist_ok=True)
 elif environment == "prod":
     cert_path = config.get("SSL_CERT_PATH")
     key_path = config.get("SSL_KEY_PATH")
+    # Validate production SSL paths
     if not cert_path or not key_path:
         raise ValueError(
             "SSL_CERT_PATH and SSL_KEY_PATH must be set in config_prod.json"

@@ -110,28 +116,20 @@ elif environment == "prod":
 else:
     raise ValueError(f"Unknown environment: {environment}")

-# Generate SSL Key and Certificate if not exist (only for development)
+# Generate SSL Key and Certificate if they do not exist
 if environment == "dev":
-    Path("ssl").mkdir(parents=True, exist_ok=True)
     if not Path(cert_path).exists() or not Path(key_path).exists():
         ssl_heidi.generate_self_signed_cert(cert_path, key_path)

-# Apply CORS middleware
-app.add_middleware(
-    CORSMiddleware,
-    allow_origins=["*"],
-    allow_credentials=True,
-    allow_methods=["*"],
-    allow_headers=["*"],
-)

-@app.on_event("startup")
-def on_startup():
+@asynccontextmanager
+async def lifespan(app: FastAPI):
     print("[INFO] Running application startup tasks...")
     db = SessionLocal()
     try:
         if environment == "prod":
+            Base.metadata.drop_all(bind=engine)
+            Base.metadata.create_all(bind=engine)
             from sqlalchemy.engine import reflection

             inspector = reflection.Inspector.from_engine(engine)

@@ -156,8 +154,8 @@ def on_startup():
             load_slots_data(db)
         else:  # dev or test environments
             print(f"{environment.capitalize()} environment: Regenerating database.")
-            # Base.metadata.drop_all(bind=engine)
-            # Base.metadata.create_all(bind=engine)
+            Base.metadata.drop_all(bind=engine)
+            Base.metadata.create_all(bind=engine)
             # from sqlalchemy.engine import reflection
             # from app.models import ExperimentParameters  # adjust the import as needed
             # inspector = reflection.Inspector.from_engine(engine)

@@ -175,10 +173,30 @@ def on_startup():
             from app.database import load_slots_data

             load_slots_data(db)
+        yield
     finally:
         db.close()

+
+app = FastAPI(
+    lifespan=lifespan,
+    title=project_name,
+    description="Backend for next-gen sample management system",
+    version=project_version,
+    servers=[
+        {"url": "https://mx-aare-test.psi.ch:8000", "description": "Default server"}
+    ],
+)
+
+# Apply CORS middleware
+app.add_middleware(
+    CORSMiddleware,
+    allow_origins=["*"],
+    allow_credentials=True,
+    allow_methods=["*"],
+    allow_headers=["*"],
+)
+
 # Include routers with correct configuration
 app.include_router(protected_router, prefix="/protected")
 app.include_router(auth.router, prefix="/auth", tags=["auth"])

@@ -187,6 +205,7 @@ app.include_router(puck.router, prefix="/pucks", tags=["pucks"])
 app.include_router(spreadsheet.router, tags=["spreadsheet"])
 app.include_router(logistics.router, prefix="/logistics", tags=["logistics"])
 app.include_router(sample.router, prefix="/samples", tags=["samples"])
+app.include_router(processing.router, prefix="/processing", tags=["processing"])

 app.mount("/images", StaticFiles(directory="images"), name="images")

@@ -200,6 +219,9 @@ if __name__ == "__main__":
     # Load environment variables from .env file
     load_dotenv()

+    environment = os.getenv("ENVIRONMENT", "dev")
+    config_file = Path(__file__).resolve().parent / f"config_{environment}.json"
+
     # Check if `generate-openapi` option is passed
     if len(sys.argv) > 1 and sys.argv[1] == "generate-openapi":
         from fastapi.openapi.utils import get_openapi

@@ -216,28 +238,42 @@ if __name__ == "__main__":
         print("openapi.json generated successfully.")
         sys.exit(0)  # Exit after generating the file

-    # Default behavior: Run the server based on the environment
-    environment = os.getenv("ENVIRONMENT", "dev")
-    port = int(os.getenv("PORT", 8000))
+    # Explicitly load the configuration file
+    with open(config_file, "r") as f:
+        config = json.load(f)
+
+    # Explicitly obtain SSL paths from config
+    if environment in ["test", "dev"]:
+        cert_path = config.get("ssl_cert_path", "ssl/cert.pem")
+        key_path = config.get("ssl_key_path", "ssl/key.pem")
+    elif environment == "prod":
+        cert_path = config.get("SSL_CERT_PATH")
+        key_path = config.get("SSL_KEY_PATH")
+        if not cert_path or not key_path:
+            raise ValueError(
+                "SSL_CERT_PATH and SSL_KEY_PATH must be explicitly"
+                "set in config_prod.json for production."
+            )
+    else:
+        raise ValueError(f"Unknown environment: {environment}")

     is_ci = os.getenv("CI", "false").lower() == "true"

+    # Handle certificates for dev/test if not available
+    ssl_dir = Path(cert_path).parent
+    ssl_dir.mkdir(parents=True, exist_ok=True)
+    if environment in ["dev", "test"] and (
+        not Path(cert_path).exists() or not Path(key_path).exists()
+    ):
+        print(f"[INFO] Generating SSL certificates at {ssl_dir}")
+        ssl_heidi.generate_self_signed_cert(cert_path, key_path)
+
     if is_ci or environment == "test":
-        # Test or CI Mode: Run server process temporarily for test validation
-        ssl_dir = Path(cert_path).parent
-        ssl_dir.mkdir(parents=True, exist_ok=True)
-        # Generate self-signed certs if missing
-        if not Path(cert_path).exists() or not Path(key_path).exists():
-            print(f"[INFO] Generating self-signed SSL certificates at {ssl_dir}")
-            ssl_heidi.generate_self_signed_cert(cert_path, key_path)
-        # Start the server as a subprocess, wait, then terminate
-        server_process = Process(target=run_server)
+        server_process = Process(target=run_server, args=(config, cert_path, key_path))
         server_process.start()
-        sleep(5)  # Wait for 5 seconds to verify the server is running
-        server_process.terminate()  # Terminate the server process (for CI)
-        server_process.join()  # Ensure proper cleanup
-        print("CI: Server started and terminated successfully for test validation.")
+        sleep(5)
+        server_process.terminate()
+        server_process.join()
+        print("CI/Test environment: Server started and terminated successfully.")
     else:
-        # Dev or Prod: Start the server as usual
-        run_server()
+        run_server(config, cert_path, key_path)
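The startup logic moves from the deprecated `@app.on_event("startup")` hook to FastAPI's lifespan context manager, with the `FastAPI` instance created only after the lifespan is defined. The pattern in isolation, as a minimal generic sketch (the prints stand in for the project's actual database setup and teardown):

```python
from contextlib import asynccontextmanager
from fastapi import FastAPI

@asynccontextmanager
async def lifespan(app: FastAPI):
    print("startup: open resources")    # runs once, before requests are served
    yield                               # the application serves requests here
    print("shutdown: close resources")  # runs once, after serving stops

app = FastAPI(lifespan=lifespan)
```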

View File

@@ -1,7 +1,7 @@
 #!/bin/bash

 # Extract values from pyproject.toml
-PYPROJECT_FILE="$(dirname "$0")/backend/pyproject.toml"
+PYPROJECT_FILE="/app/backend/pyproject.toml"
 NAME=$(awk -F'= ' '/^name/ { print $2 }' "$PYPROJECT_FILE" | tr -d '"')
 VERSION=$(awk -F'= ' '/^version/ { print $2 }' "$PYPROJECT_FILE" | tr -d '"')

@@ -15,7 +15,7 @@ echo "Using project name: $NAME"
 echo "Using version: $VERSION"

 # Navigate to backend directory
-cd "$(dirname "$0")/backend" || exit
+cd "/app/backend" || exit

 # Generate OpenAPI JSON file
 echo "Generating OpenAPI JSON..."

@@ -46,7 +46,7 @@ fi
 # Build the package
 cd python-client || exit
 python3 -m venv .venv
-source .venv/bin/activate
+source /app/.venv/bin/activate
 pip install -U pip build
 python3 -m build

View File

@ -4,7 +4,7 @@ build-backend = "setuptools.build_meta"
[project] [project]
name = "aareDB" name = "aareDB"
version = "0.1.0a26" version = "0.1.1a1"
description = "Backend for next gen sample management system" description = "Backend for next gen sample management system"
authors = [{name = "Guillaume Gotthard", email = "guillaume.gotthard@psi.ch"}] authors = [{name = "Guillaume Gotthard", email = "guillaume.gotthard@psi.ch"}]
license = {text = "MIT"} license = {text = "MIT"}
@ -28,10 +28,11 @@ dependencies = [
"uvicorn==0.23.1", "uvicorn==0.23.1",
"python-dateutil~=2.8.2", "python-dateutil~=2.8.2",
"tomli>=2.0.1", "tomli>=2.0.1",
"python-dotenv" "python-dotenv",
"psycopg2-binary"
] ]
[tool.pytest.ini_options] [tool.pytest.ini_options]
norecursedirs = ["backend/python-client"] norecursedirs = ["backend/python-client"]
# Or limit files explicitly # Or limit files explicitly
python_files = ["test_auth.py"]#, python_files = [""]#,""test_auth.py"]#,
#"test_contact.py"] #"test_contact.py"]

View File

@@ -1,12 +1,17 @@
 # tests/test_auth.py
+import pytest
 from fastapi.testclient import TestClient
 from backend.main import app

-client = TestClient(app)
+
+@pytest.fixture(scope="module")
+def client():
+    with TestClient(app) as test_client:  # ensures lifespan/startup executes
+        yield test_client


-def test_login_success():
+def test_login_success(client):
     response = client.post(
         "/auth/token/login", data={"username": "testuser", "password": "testpass"}
     )

@@ -14,7 +19,7 @@ def test_login_success():
     assert "access_token" in response.json()


-def test_login_failure():
+def test_login_failure(client):
     response = client.post(
         "/auth/token/login", data={"username": "wrong", "password": "wrongpass"}
     )

@@ -22,7 +27,7 @@ def test_login_failure():
     assert response.json() == {"detail": "Incorrect username or password"}


-def test_protected_route():
+def test_protected_route(client):
     # Step 1: Login
     response = client.post(
         "/auth/token/login", data={"username": "testuser", "password": "testpass"}

View File

@@ -1,10 +1,10 @@
 {
-  "ssl_cert_path": "ssl/mx-aare-test.psi.ch.pem",
-  "ssl_key_path": "ssl/mx-aare-test.psi.ch.key",
+  "ssl_cert_path": "/app/backend/ssl/mx-aare-test.psi.ch.pem",
+  "ssl_key_path": "/app/backend/ssl/mx-aare-test.psi.ch.key",
   "OPENAPI_URL": "https://mx-aare-test.psi.ch:1492/openapi.json",
-  "SCHEMA_PATH": "./src/openapi.json",
-  "OUTPUT_DIRECTORY": "./openapi",
+  "SCHEMA_PATH": "/app/src/openapi.json",
+  "OUTPUT_DIRECTORY": "/app/openapi",
   "PORT": 1492,
-  "SSL_KEY_PATH": "/home/jungfrau/heidi-v2/backend/ssl/mx-aare-test.psi.ch.key",
-  "SSL_CERT_PATH": "/home/jungfrau/heidi-v2/backend/ssl/mx-aare-test.psi.ch.pem"
+  "SSL_KEY_PATH": "/app/backend/ssl/mx-aare-test.psi.ch.key",
+  "SSL_CERT_PATH": "/app/backend/ssl/mx-aare-test.psi.ch.pem"
 }

View File

@@ -1,10 +1,10 @@
 {
-  "ssl_cert_path": "ssl/mx-aare-test.psi.ch.pem",
-  "ssl_key_path": "ssl/mx-aare-test.psi.ch.key",
-  "OPENAPI_URL": "https://mx-aare-test.psi.ch:8000/openapi.json",
-  "SCHEMA_PATH": "./src/openapi.json",
-  "OUTPUT_DIRECTORY": "./openapi",
-  "PORT": 8081,
-  "SSL_KEY_PATH": "/home/jungfrau/heidi-v2/backend/ssl/mx-aare-test.psi.ch.key",
-  "SSL_CERT_PATH": "/home/jungfrau/heidi-v2/backend/ssl/mx-aare-test.psi.ch.pem"
+  "ssl_cert_path": "ssl/cert.pem",
+  "ssl_key_path": "ssl/key.pem",
+  "OPENAPI_URL": "https://backend:8000/openapi.json",
+  "SCHEMA_PATH": "/app/src/openapi.json",
+  "OUTPUT_DIRECTORY": "/app/openapi",
+  "PORT": 8000,
+  "SSL_KEY_PATH": "ssl/key.pem",
+  "SSL_CERT_PATH": "ssl/cert.pem"
 }

View File

@@ -1,32 +1,92 @@
 version: "3.9"
 services:
   backend:
+    container_name: backend
     build:
       context: .  # Build the image from the parent directory
       dockerfile: backend/Dockerfile
     ports:
-      - "8000:8000"  # Map container port 8000 to host
+      - "${PORT}:${PORT}"  # Map container port to host
     volumes:
       - ./backend:/app/backend  # Map backend directory to /app/backend
       - ./app:/app/app  # Map app directory to /app/app
-      - ./config_dev.json:/app/backend/config_dev.json  # Explicitly map config_dev.json
+      - ./config_${ENVIRONMENT}.json:/app/backend/config_${ENVIRONMENT}.json  # Map config for the active environment
+      - ./backend/ssl:/app/backend/ssl  # Mount SSL files into Docker
     working_dir: /app/backend  # Set working directory to backend/
     command: python main.py  # Command to run main.py
+    depends_on:  # ⬅️ New addition: wait until postgres is started
+      - postgres
+    healthcheck:
+      test: [ "CMD-SHELL", "curl -k -f https://localhost:${PORT}/openapi.json || exit 1" ]
+      interval: 5s
+      timeout: 5s
+      retries: 5
+    environment:  # ⬅️ Provide DB info to the backend
+      ENVIRONMENT: ${ENVIRONMENT}
+      DB_USERNAME: ${DB_USERNAME}
+      DB_PASSWORD: ${DB_PASSWORD}
+      DB_HOST: postgres
+      DB_NAME: ${DB_NAME}
+      PORT: ${PORT}
+
+  postgres:  # ⬅️ New service: the PostgreSQL database
+    image: postgres:16
+    environment:
+      POSTGRES_USER: ${DB_USERNAME}
+      POSTGRES_PASSWORD: ${DB_PASSWORD}
+      POSTGRES_DB: ${DB_NAME}
+    ports:
+      - "5432:5432"
+    volumes:
+      - pgdata:/var/lib/postgresql/data

   frontend:
+    depends_on:
+      backend:
+        condition: service_healthy
     build:
       context: ./frontend
       dockerfile: Dockerfile
+      args:
+        - VITE_OPENAPI_BASE=${VITE_OPENAPI_BASE}
+        - VITE_SSL_KEY_PATH=${VITE_SSL_KEY_PATH}
+        - VITE_SSL_CERT_PATH=${VITE_SSL_CERT_PATH}
+        - NODE_ENV=${NODE_ENV}
     ports:
-      - "5173:5173"  # Map container port 5173 to host
+      - "5173:5173"
+    volumes:
+      - ./frontend:/app
+      - /app/node_modules  # ⬅️ exclusion: keep the container-installed node_modules
+      - ./backend/ssl:/app/backend/ssl
+      - ./backend/config_${ENVIRONMENT}.json:/app/backend/config_${ENVIRONMENT}.json  # Dynamically maps config based on environment
+    environment:
+      VITE_OPENAPI_BASE: ${VITE_OPENAPI_BASE}
+      NODE_ENV: ${NODE_ENV}
+    command: sh -c "npm run start-${ENVIRONMENT} & ENVIRONMENT=${ENVIRONMENT} npm run watch:openapi"

   logistics_frontend:
     build:
       context: ./logistics
       dockerfile: Dockerfile
+      args:  # 👈 explicitly pass build args from .env
+        - VITE_OPENAPI_BASE=${VITE_OPENAPI_BASE}
+        - VITE_SSL_KEY_PATH=${VITE_SSL_KEY_PATH}
+        - VITE_SSL_CERT_PATH=${VITE_SSL_CERT_PATH}
+        - NODE_ENV=${NODE_ENV}
     ports:
       - "3000:3000"
     depends_on:
       - frontend  # Ensure OpenAPI models are available
+    volumes:
+      - ./logistics/src:/app/src  # for active dev (hot reload)
+      - ./backend/ssl:/app/backend/ssl  # Mount SSL files into Docker
+    environment:
+      - VITE_OPENAPI_BASE=${VITE_OPENAPI_BASE}
+      - NODE_ENV=${NODE_ENV}
+    command: sh -c "npm run start-${ENVIRONMENT}"
+
+volumes:  # ⬅️ Persistent storage for PostgreSQL data
+  pgdata:
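With ports and database settings now sourced from the environment, the stack is presumably brought up with an env file, e.g. `docker compose --env-file frontend/.env.test up --build`, mirroring the `--env-file frontend/.env.${ENVIRONMENT}` usage in the CI release job above; that file is assumed to define ENVIRONMENT, PORT, DB_USERNAME, DB_PASSWORD, DB_NAME, and the VITE_* build variables.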

View File

@@ -1,16 +1,29 @@
 FROM node:18

 # Set working directory
 WORKDIR /app

-# Copy dependency files and install dependencies
-COPY package*.json ./
-RUN npm install
+# Setup build args
+ARG VITE_OPENAPI_BASE_DEV
+ARG VITE_SSL_KEY_PATH
+ARG VITE_SSL_CERT_PATH
+ARG NODE_ENV=development

-# Copy rest of the code and build the application
-COPY . .
-RUN npm run build
+ENV VITE_OPENAPI_BASE_=${VITE_OPENAPI_BASE}
+ENV VITE_SSL_KEY_PATH=${VITE_SSL_KEY_PATH}
+ENV VITE_SSL_CERT_PATH=${VITE_SSL_CERT_PATH}
+ENV NODE_ENV=${NODE_ENV}

-# Use a simple HTTP server to serve the built static files
-EXPOSE 5173
-CMD ["npx", "vite", "preview", "--port", "5173"]
+# Copy dependency files and install dependencies
+COPY package*.json ./
+RUN npm install --prefer-offline --no-audit --progress=false
+
+# Copy rest of the code and build the application
+COPY . .
+
+EXPOSE 5173
+#CMD ["npx", "vite", "preview", "--port", "5173"]
+CMD ["npm", "run", "dev"]

View File

@@ -15,13 +15,16 @@ if (!process.env.ENVIRONMENT) {
 }

 // Determine environment and configuration file
-const nodeEnv = process.env.ENVIRONMENT || 'dev';
-const configFile = `config_${nodeEnv}.json`;
+const nodeEnv = process.env.ENVIRONMENT || 'dev'; // set dynamically from the environment
+const configFile = `config_${nodeEnv}.json`; // resolve the config file name per environment
+const configFilePath = path.resolve('./backend/', configFile); // path as mounted inside Docker

 // Load configuration file
 let config;
 try {
-  config = JSON.parse(fs.readFileSync(path.resolve('../', configFile), 'utf8'));
+  config = JSON.parse(fs.readFileSync(configFilePath, 'utf8'));
 } catch (error) {
   console.error(`❌ Failed to read configuration file '${configFile}': ${error.message}`);
   process.exit(1);
@@ -42,12 +45,24 @@ for (const field of requiredFields) {
   }
 }

-// Resolve paths from the config
-const OPENAPI_URL = config.OPENAPI_URL;
-const SCHEMA_PATH = path.resolve(config.SCHEMA_PATH);
-const OUTPUT_DIRECTORY = path.resolve(config.OUTPUT_DIRECTORY);
-const SSL_KEY_PATH = path.resolve(config.SSL_KEY_PATH);
-const SSL_CERT_PATH = path.resolve(config.SSL_CERT_PATH);
+const OPENAPI_BASE_URL = config.OPENAPI_URL; // or process.env.VITE_OPENAPI_BASE_DEV || config.OPENAPI_URL;
+const SCHEMA_PATH = config.SCHEMA_PATH; // 💡 already absolute
+const OUTPUT_DIRECTORY = config.OUTPUT_DIRECTORY; // 💡 already absolute
+const SSL_KEY_PATH = config.SSL_KEY_PATH;
+const SSL_CERT_PATH = config.SSL_CERT_PATH;
+
+function assertDirExists(dirPath) {
+  if (!fs.existsSync(dirPath)) {
+    console.error(`❌ Directory does not exist inside Docker: ${dirPath}`);
+    process.exit(1);
+  }
+}
+
+// Validate that the container paths exist before generating
+assertDirExists(path.dirname(SCHEMA_PATH));
+assertDirExists(OUTPUT_DIRECTORY);
+assertDirExists(path.dirname(SSL_KEY_PATH));

 // Log configuration
 console.log(`[INFO] Environment: ${nodeEnv}`);
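
For reference, the `config_${ENVIRONMENT}.json` file this script reads must provide the fields used above. An illustrative sketch with assumed container paths — the real values live in the repo's config files:

    {
      "OPENAPI_URL": "https://backend:8000/openapi.json",
      "SCHEMA_PATH": "/app/src/openapi.json",
      "OUTPUT_DIRECTORY": "/app/src/openapi",
      "SSL_KEY_PATH": "/app/backend/ssl/key.pem",
      "SSL_CERT_PATH": "/app/backend/ssl/cert.pem"
    }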
@@ -96,7 +111,7 @@ async function fetchAndGenerate() {
   };

   const res = await new Promise((resolve, reject) => {
-    https.get(OPENAPI_URL, options, resolve).on('error', reject);
+    https.get(OPENAPI_BASE_URL, options, resolve).on('error', reject);
   });

   let data = '';
@@ -104,79 +119,71 @@ async function fetchAndGenerate() {
      data += chunk;
    });

-    res.on('end', async () => {
-      try {
-        // Save schema file
-        fs.writeFileSync(SCHEMA_PATH, data, 'utf8');
-        console.log(`✅ OpenAPI schema saved to ${SCHEMA_PATH}`);
-        console.log("🧼 Cleaning output directory...");
-        await fs.promises.rm(OUTPUT_DIRECTORY, { recursive: true, force: true });
-        console.log(`✅ Output directory cleaned: ${OUTPUT_DIRECTORY}`);
-        if (!fs.existsSync(OUTPUT_DIRECTORY)) {
-          console.log(`✅ Confirmed removal of ${OUTPUT_DIRECTORY}`);
-        } else {
-          console.error(`❌ Failed to remove output directory: ${OUTPUT_DIRECTORY}`);
-        }
-        // Generate services
-        const command = `npx openapi -i ${SCHEMA_PATH} -o ${OUTPUT_DIRECTORY}`;
-        console.log(`🔧 Executing command: ${command}`);
-        const { stdout, stderr } = await execPromisified(command);
-        if (stderr) {
-          console.error(`⚠️ stderr while generating services: ${stderr}`);
-        } else {
-          console.log(`✅ Service generation completed successfully:\n${stdout}`);
-        }
-        // Copy the generated OpenAPI models to ../logistics/openapi
-        const targetDirectory = path.resolve('../logistics/openapi'); // Adjust as per logistics directory
-        console.log(`🔄 Copying generated OpenAPI models to ${targetDirectory}...`);
-        await fs.promises.rm(targetDirectory, { recursive: true, force: true }); // Clean target directory
-        await fs.promises.mkdir(targetDirectory, { recursive: true }); // Ensure the directory exists
-        // Copy files from OUTPUT_DIRECTORY to the target directory recursively
-        const copyRecursive = async (src, dest) => {
-          const entries = await fs.promises.readdir(src, { withFileTypes: true });
-          for (const entry of entries) {
-            const srcPath = path.join(src, entry.name);
-            const destPath = path.join(dest, entry.name);
-            if (entry.isDirectory()) {
-              await fs.promises.mkdir(destPath, { recursive: true });
-              await copyRecursive(srcPath, destPath);
-            } else {
-              await fs.promises.copyFile(srcPath, destPath);
-            }
-          }
-        };
-        await copyRecursive(OUTPUT_DIRECTORY, targetDirectory);
-        console.log(`✅ OpenAPI models copied successfully to ${targetDirectory}`);
-      } catch (error) {
-        console.error(`❌ Error during schema processing or generation: ${error.message}`);
-      }
-      isGenerating = false;
+    await new Promise((resolve, reject) => {
+      res.on('end', resolve);
+      res.on('error', reject);
    });
+
+    // Save schema file
+    fs.writeFileSync(SCHEMA_PATH, data, 'utf8');
+    console.log(`✅ OpenAPI schema saved to ${SCHEMA_PATH}`);
+    console.log("🧼 Cleaning output directory...");
+    await fs.promises.rm(OUTPUT_DIRECTORY, { recursive: true, force: true });
+    console.log(`✅ Output directory cleaned: ${OUTPUT_DIRECTORY}`);
+
+    // Generate services
+    const command = `npx openapi -i ${SCHEMA_PATH} -o ${OUTPUT_DIRECTORY}`;
+    console.log(`🔧 Executing command: ${command}`);
+    const { stdout, stderr } = await execPromisified(command);
+    if (stderr) {
+      console.error(`⚠️ stderr while generating services: ${stderr}`);
+    } else {
+      console.log(`✅ Service generation completed successfully:\n${stdout}`);
+    }
+
+    // Copy the generated OpenAPI models to ../logistics/openapi
+    const targetDirectory = path.resolve('../logistics/openapi');
+    console.log(`🔄 Copying generated OpenAPI models to ${targetDirectory}...`);
+    await fs.promises.rm(targetDirectory, { recursive: true, force: true });
+    await fs.promises.mkdir(targetDirectory, { recursive: true });
+
+    // Recursive copy helper
+    const copyRecursive = async (src, dest) => {
+      const entries = await fs.promises.readdir(src, { withFileTypes: true });
+      for (const entry of entries) {
+        const srcPath = path.join(src, entry.name);
+        const destPath = path.join(dest, entry.name);
+        if (entry.isDirectory()) {
+          await fs.promises.mkdir(destPath, { recursive: true });
+          await copyRecursive(srcPath, destPath);
+        } else {
+          await fs.promises.copyFile(srcPath, destPath);
+        }
+      }
+    };
+    await copyRecursive(OUTPUT_DIRECTORY, targetDirectory);
+    console.log(`✅ OpenAPI models copied successfully to ${targetDirectory}`);
  } catch (error) {
-    console.error(`Failed to fetch OpenAPI schema: ${error.message}`);
-    isGenerating = false;
+    console.error(`Error during schema processing or generation: ${error.message}`);
+  } finally {
+    isGenerating = false;
  }
 }
 // Backend directory based on the environment
 const backendDirectory = (() => {
   switch (nodeEnv) {
     case 'prod':
-      return path.resolve('/home/jungfrau/aaredb/backend/app'); // Production path
+      return path.resolve('/app/backend'); // Production path
     case 'test':
-      return path.resolve('/home/jungfrau/aaredb/backend/app'); // Test path
+      return path.resolve('/app/backend'); // Test path
     case 'dev':
     default:
-      return path.resolve('/Users/gotthardg/PycharmProjects/aaredb/backend/app'); // Development path
+      return path.resolve('/app/backend'); // Development path
   }
 })();
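
With `finally` now resetting `isGenerating` and all paths resolved inside the container, the watcher can also be run standalone with the same invocation the compose `command` above uses:

    ENVIRONMENT=dev npm run watch:openapi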

View File

@@ -22,8 +22,9 @@
         "@mui/lab": "^6.0.0-beta.29",
         "@mui/material": "^6.1.6",
         "@mui/system": "^6.1.6",
+        "@mui/x-charts": "^7.28.0",
         "@mui/x-data-grid-premium": "^7.27.2",
-        "@mui/x-tree-view": "^7.26.0",
+        "@mui/x-tree-view": "^7.28.0",
         "axios": "^1.7.7",
         "chokidar": "^4.0.1",
         "dayjs": "^1.11.13",
@@ -1790,6 +1791,84 @@
         }
       }
     },
+    "node_modules/@mui/x-charts": {
+      "version": "7.28.0",
+      "resolved": "https://registry.npmjs.org/@mui/x-charts/-/x-charts-7.28.0.tgz",
+      "integrity": "sha512-TNfq/rQfGKnjTaEITkY6l09NpMxwMwRTgLiDw+JQsS/7gwBBJUmMhEOj67BaFeYTsroFLUYeggiAj+RTSryd4A==",
+      "license": "MIT",
+      "dependencies": {
+        "@babel/runtime": "^7.25.7",
+        "@mui/utils": "^5.16.6 || ^6.0.0 || ^7.0.0 || ^7.0.0-beta",
+        "@mui/x-charts-vendor": "7.20.0",
+        "@mui/x-internals": "7.28.0",
+        "@react-spring/rafz": "^9.7.5",
+        "@react-spring/web": "^9.7.5",
+        "clsx": "^2.1.1",
+        "prop-types": "^15.8.1"
+      },
+      "engines": {
+        "node": ">=14.0.0"
+      },
+      "peerDependencies": {
+        "@emotion/react": "^11.9.0",
+        "@emotion/styled": "^11.8.1",
+        "@mui/material": "^5.15.14 || ^6.0.0 || ^7.0.0 || ^7.0.0-beta",
+        "@mui/system": "^5.15.14 || ^6.0.0 || ^7.0.0 || ^7.0.0-beta",
+        "react": "^17.0.0 || ^18.0.0 || ^19.0.0",
+        "react-dom": "^17.0.0 || ^18.0.0 || ^19.0.0"
+      },
+      "peerDependenciesMeta": {
+        "@emotion/react": {
+          "optional": true
+        },
+        "@emotion/styled": {
+          "optional": true
+        }
+      }
+    },
+    "node_modules/@mui/x-charts-vendor": {
+      "version": "7.20.0",
+      "resolved": "https://registry.npmjs.org/@mui/x-charts-vendor/-/x-charts-vendor-7.20.0.tgz",
+      "integrity": "sha512-pzlh7z/7KKs5o0Kk0oPcB+sY0+Dg7Q7RzqQowDQjpy5Slz6qqGsgOB5YUzn0L+2yRmvASc4Pe0914Ao3tMBogg==",
+      "license": "MIT AND ISC",
+      "dependencies": {
+        "@babel/runtime": "^7.25.7",
+        "@types/d3-color": "^3.1.3",
+        "@types/d3-delaunay": "^6.0.4",
+        "@types/d3-interpolate": "^3.0.4",
+        "@types/d3-scale": "^4.0.8",
+        "@types/d3-shape": "^3.1.6",
+        "@types/d3-time": "^3.0.3",
+        "d3-color": "^3.1.0",
+        "d3-delaunay": "^6.0.4",
+        "d3-interpolate": "^3.0.1",
+        "d3-scale": "^4.0.2",
+        "d3-shape": "^3.2.0",
+        "d3-time": "^3.1.0",
+        "delaunator": "^5.0.1",
+        "robust-predicates": "^3.0.2"
+      }
+    },
+    "node_modules/@mui/x-charts/node_modules/@mui/x-internals": {
+      "version": "7.28.0",
+      "resolved": "https://registry.npmjs.org/@mui/x-internals/-/x-internals-7.28.0.tgz",
+      "integrity": "sha512-p4GEp/09bLDumktdIMiw+OF4p+pJOOjTG0VUvzNxjbHB9GxbBKoMcHrmyrURqoBnQpWIeFnN/QAoLMFSpfwQbw==",
+      "license": "MIT",
+      "dependencies": {
+        "@babel/runtime": "^7.25.7",
+        "@mui/utils": "^5.16.6 || ^6.0.0 || ^7.0.0 || ^7.0.0-beta"
+      },
+      "engines": {
+        "node": ">=14.0.0"
+      },
+      "funding": {
+        "type": "opencollective",
+        "url": "https://opencollective.com/mui-org"
+      },
+      "peerDependencies": {
+        "react": "^17.0.0 || ^18.0.0 || ^19.0.0"
+      }
+    },
     "node_modules/@mui/x-data-grid": {
       "version": "7.27.2",
       "resolved": "https://registry.npmjs.org/@mui/x-data-grid/-/x-data-grid-7.27.2.tgz",
@@ -2088,14 +2167,14 @@
       }
     },
     "node_modules/@mui/x-tree-view": {
-      "version": "7.26.0",
-      "resolved": "https://registry.npmjs.org/@mui/x-tree-view/-/x-tree-view-7.26.0.tgz",
-      "integrity": "sha512-adZwVj6/edNowi2RIeyGPTcfdP4EXtMGo0mk2LQogG8m8bZkZRjOQoQ7pkBF0UPMaIAwzCadq2OWj3MPH4DP5A==",
+      "version": "7.28.0",
+      "resolved": "https://registry.npmjs.org/@mui/x-tree-view/-/x-tree-view-7.28.0.tgz",
+      "integrity": "sha512-L41Vo/rAdchRQIVfFyCf92hRtHrVoiuA6E1vf9Ie3IgXRLznj6CMUicOctB+hO2/uQZPuc7WVfvLZFZ/7ur6HA==",
       "license": "MIT",
       "dependencies": {
         "@babel/runtime": "^7.25.7",
-        "@mui/utils": "^5.16.6 || ^6.0.0",
-        "@mui/x-internals": "7.26.0",
+        "@mui/utils": "^5.16.6 || ^6.0.0 || ^7.0.0 || ^7.0.0-beta",
+        "@mui/x-internals": "7.28.0",
         "@types/react-transition-group": "^4.4.11",
         "clsx": "^2.1.1",
         "prop-types": "^15.8.1",
@@ -2111,8 +2190,8 @@
       "peerDependencies": {
         "@emotion/react": "^11.9.0",
         "@emotion/styled": "^11.8.1",
-        "@mui/material": "^5.15.14 || ^6.0.0",
-        "@mui/system": "^5.15.14 || ^6.0.0",
+        "@mui/material": "^5.15.14 || ^6.0.0 || ^7.0.0 || ^7.0.0-beta",
+        "@mui/system": "^5.15.14 || ^6.0.0 || ^7.0.0 || ^7.0.0-beta",
         "react": "^17.0.0 || ^18.0.0 || ^19.0.0",
         "react-dom": "^17.0.0 || ^18.0.0 || ^19.0.0"
       },
@@ -2126,13 +2205,13 @@
       }
     },
     "node_modules/@mui/x-tree-view/node_modules/@mui/x-internals": {
-      "version": "7.26.0",
-      "resolved": "https://registry.npmjs.org/@mui/x-internals/-/x-internals-7.26.0.tgz",
-      "integrity": "sha512-VxTCYQcZ02d3190pdvys2TDg9pgbvewAVakEopiOgReKAUhLdRlgGJHcOA/eAuGLyK1YIo26A6Ow6ZKlSRLwMg==",
+      "version": "7.28.0",
+      "resolved": "https://registry.npmjs.org/@mui/x-internals/-/x-internals-7.28.0.tgz",
+      "integrity": "sha512-p4GEp/09bLDumktdIMiw+OF4p+pJOOjTG0VUvzNxjbHB9GxbBKoMcHrmyrURqoBnQpWIeFnN/QAoLMFSpfwQbw==",
       "license": "MIT",
       "dependencies": {
         "@babel/runtime": "^7.25.7",
-        "@mui/utils": "^5.16.6 || ^6.0.0"
+        "@mui/utils": "^5.16.6 || ^6.0.0 || ^7.0.0 || ^7.0.0-beta"
       },
       "engines": {
         "node": ">=14.0.0"
@@ -2203,6 +2282,78 @@
         "url": "https://opencollective.com/popperjs"
       }
     },
+    "node_modules/@react-spring/animated": {
+      "version": "9.7.5",
+      "resolved": "https://registry.npmjs.org/@react-spring/animated/-/animated-9.7.5.tgz",
+      "integrity": "sha512-Tqrwz7pIlsSDITzxoLS3n/v/YCUHQdOIKtOJf4yL6kYVSDTSmVK1LI1Q3M/uu2Sx4X3pIWF3xLUhlsA6SPNTNg==",
+      "license": "MIT",
+      "dependencies": {
+        "@react-spring/shared": "~9.7.5",
+        "@react-spring/types": "~9.7.5"
+      },
+      "peerDependencies": {
+        "react": "^16.8.0 || ^17.0.0 || ^18.0.0"
+      }
+    },
+    "node_modules/@react-spring/core": {
+      "version": "9.7.5",
+      "resolved": "https://registry.npmjs.org/@react-spring/core/-/core-9.7.5.tgz",
+      "integrity": "sha512-rmEqcxRcu7dWh7MnCcMXLvrf6/SDlSokLaLTxiPlAYi11nN3B5oiCUAblO72o+9z/87j2uzxa2Inm8UbLjXA+w==",
+      "license": "MIT",
+      "dependencies": {
+        "@react-spring/animated": "~9.7.5",
+        "@react-spring/shared": "~9.7.5",
+        "@react-spring/types": "~9.7.5"
+      },
+      "funding": {
+        "type": "opencollective",
+        "url": "https://opencollective.com/react-spring/donate"
+      },
+      "peerDependencies": {
+        "react": "^16.8.0 || ^17.0.0 || ^18.0.0"
+      }
+    },
+    "node_modules/@react-spring/rafz": {
+      "version": "9.7.5",
+      "resolved": "https://registry.npmjs.org/@react-spring/rafz/-/rafz-9.7.5.tgz",
+      "integrity": "sha512-5ZenDQMC48wjUzPAm1EtwQ5Ot3bLIAwwqP2w2owG5KoNdNHpEJV263nGhCeKKmuA3vG2zLLOdu3or6kuDjA6Aw==",
+      "license": "MIT"
+    },
+    "node_modules/@react-spring/shared": {
+      "version": "9.7.5",
+      "resolved": "https://registry.npmjs.org/@react-spring/shared/-/shared-9.7.5.tgz",
+      "integrity": "sha512-wdtoJrhUeeyD/PP/zo+np2s1Z820Ohr/BbuVYv+3dVLW7WctoiN7std8rISoYoHpUXtbkpesSKuPIw/6U1w1Pw==",
+      "license": "MIT",
+      "dependencies": {
+        "@react-spring/rafz": "~9.7.5",
+        "@react-spring/types": "~9.7.5"
+      },
+      "peerDependencies": {
+        "react": "^16.8.0 || ^17.0.0 || ^18.0.0"
+      }
+    },
+    "node_modules/@react-spring/types": {
+      "version": "9.7.5",
+      "resolved": "https://registry.npmjs.org/@react-spring/types/-/types-9.7.5.tgz",
+      "integrity": "sha512-HVj7LrZ4ReHWBimBvu2SKND3cDVUPWKLqRTmWe/fNY6o1owGOX0cAHbdPDTMelgBlVbrTKrre6lFkhqGZErK/g==",
+      "license": "MIT"
+    },
+    "node_modules/@react-spring/web": {
+      "version": "9.7.5",
+      "resolved": "https://registry.npmjs.org/@react-spring/web/-/web-9.7.5.tgz",
+      "integrity": "sha512-lmvqGwpe+CSttsWNZVr+Dg62adtKhauGwLyGE/RRyZ8AAMLgb9x3NDMA5RMElXo+IMyTkPp7nxTB8ZQlmhb6JQ==",
+      "license": "MIT",
+      "dependencies": {
+        "@react-spring/animated": "~9.7.5",
+        "@react-spring/core": "~9.7.5",
+        "@react-spring/shared": "~9.7.5",
+        "@react-spring/types": "~9.7.5"
+      },
+      "peerDependencies": {
+        "react": "^16.8.0 || ^17.0.0 || ^18.0.0",
+        "react-dom": "^16.8.0 || ^17.0.0 || ^18.0.0"
+      }
+    },
     "node_modules/@redocly/ajv": {
       "version": "8.11.2",
       "resolved": "https://registry.npmjs.org/@redocly/ajv/-/ajv-8.11.2.tgz",
@@ -2900,6 +3051,57 @@
         "@babel/types": "^7.20.7"
       }
     },
+    "node_modules/@types/d3-color": {
+      "version": "3.1.3",
+      "resolved": "https://registry.npmjs.org/@types/d3-color/-/d3-color-3.1.3.tgz",
+      "integrity": "sha512-iO90scth9WAbmgv7ogoq57O9YpKmFBbmoEoCHDB2xMBY0+/KVrqAaCDyCE16dUspeOvIxFFRI+0sEtqDqy2b4A==",
+      "license": "MIT"
+    },
+    "node_modules/@types/d3-delaunay": {
+      "version": "6.0.4",
+      "resolved": "https://registry.npmjs.org/@types/d3-delaunay/-/d3-delaunay-6.0.4.tgz",
+      "integrity": "sha512-ZMaSKu4THYCU6sV64Lhg6qjf1orxBthaC161plr5KuPHo3CNm8DTHiLw/5Eq2b6TsNP0W0iJrUOFscY6Q450Hw==",
+      "license": "MIT"
+    },
+    "node_modules/@types/d3-interpolate": {
+      "version": "3.0.4",
+      "resolved": "https://registry.npmjs.org/@types/d3-interpolate/-/d3-interpolate-3.0.4.tgz",
+      "integrity": "sha512-mgLPETlrpVV1YRJIglr4Ez47g7Yxjl1lj7YKsiMCb27VJH9W8NVM6Bb9d8kkpG/uAQS5AmbA48q2IAolKKo1MA==",
+      "license": "MIT",
+      "dependencies": {
+        "@types/d3-color": "*"
+      }
+    },
+    "node_modules/@types/d3-path": {
+      "version": "3.1.1",
+      "resolved": "https://registry.npmjs.org/@types/d3-path/-/d3-path-3.1.1.tgz",
+      "integrity": "sha512-VMZBYyQvbGmWyWVea0EHs/BwLgxc+MKi1zLDCONksozI4YJMcTt8ZEuIR4Sb1MMTE8MMW49v0IwI5+b7RmfWlg==",
+      "license": "MIT"
+    },
+    "node_modules/@types/d3-scale": {
+      "version": "4.0.9",
+      "resolved": "https://registry.npmjs.org/@types/d3-scale/-/d3-scale-4.0.9.tgz",
+      "integrity": "sha512-dLmtwB8zkAeO/juAMfnV+sItKjlsw2lKdZVVy6LRr0cBmegxSABiLEpGVmSJJ8O08i4+sGR6qQtb6WtuwJdvVw==",
+      "license": "MIT",
+      "dependencies": {
+        "@types/d3-time": "*"
+      }
+    },
+    "node_modules/@types/d3-shape": {
+      "version": "3.1.7",
+      "resolved": "https://registry.npmjs.org/@types/d3-shape/-/d3-shape-3.1.7.tgz",
+      "integrity": "sha512-VLvUQ33C+3J+8p+Daf+nYSOsjB4GXp19/S/aGo60m9h1v6XaxjiT82lKVWJCfzhtuZ3yD7i/TPeC/fuKLLOSmg==",
+      "license": "MIT",
+      "dependencies": {
+        "@types/d3-path": "*"
+      }
+    },
+    "node_modules/@types/d3-time": {
+      "version": "3.0.4",
+      "resolved": "https://registry.npmjs.org/@types/d3-time/-/d3-time-3.0.4.tgz",
+      "integrity": "sha512-yuzZug1nkAAaBlBBikKZTgzCeA+k1uy4ZFwWANOfKw5z5LRhV0gNA7gNkKm7HoK+HRN0wX3EkxGk0fpbWhmB7g==",
+      "license": "MIT"
+    },
     "node_modules/@types/estree": {
       "version": "1.0.6",
       "resolved": "https://registry.npmjs.org/@types/estree/-/estree-1.0.6.tgz",
@@ -3923,6 +4125,121 @@
       "integrity": "sha512-M1uQkMl8rQK/szD0LNhtqxIPLpimGm8sOBwU7lLnCpSbTyY3yeU1Vc7l4KT5zT4s/yOxHH5O7tIuuLOCnLADRw==",
       "license": "MIT"
     },
+    "node_modules/d3-array": {
+      "version": "3.2.4",
+      "resolved": "https://registry.npmjs.org/d3-array/-/d3-array-3.2.4.tgz",
+      "integrity": "sha512-tdQAmyA18i4J7wprpYq8ClcxZy3SC31QMeByyCFyRt7BVHdREQZ5lpzoe5mFEYZUWe+oq8HBvk9JjpibyEV4Jg==",
+      "license": "ISC",
+      "dependencies": {
+        "internmap": "1 - 2"
+      },
+      "engines": {
+        "node": ">=12"
+      }
+    },
+    "node_modules/d3-color": {
+      "version": "3.1.0",
+      "resolved": "https://registry.npmjs.org/d3-color/-/d3-color-3.1.0.tgz",
+      "integrity": "sha512-zg/chbXyeBtMQ1LbD/WSoW2DpC3I0mpmPdW+ynRTj/x2DAWYrIY7qeZIHidozwV24m4iavr15lNwIwLxRmOxhA==",
+      "license": "ISC",
+      "engines": {
+        "node": ">=12"
+      }
+    },
+    "node_modules/d3-delaunay": {
+      "version": "6.0.4",
+      "resolved": "https://registry.npmjs.org/d3-delaunay/-/d3-delaunay-6.0.4.tgz",
+      "integrity": "sha512-mdjtIZ1XLAM8bm/hx3WwjfHt6Sggek7qH043O8KEjDXN40xi3vx/6pYSVTwLjEgiXQTbvaouWKynLBiUZ6SK6A==",
+      "license": "ISC",
+      "dependencies": {
+        "delaunator": "5"
+      },
+      "engines": {
+        "node": ">=12"
+      }
+    },
+    "node_modules/d3-format": {
+      "version": "3.1.0",
+      "resolved": "https://registry.npmjs.org/d3-format/-/d3-format-3.1.0.tgz",
+      "integrity": "sha512-YyUI6AEuY/Wpt8KWLgZHsIU86atmikuoOmCfommt0LYHiQSPjvX2AcFc38PX0CBpr2RCyZhjex+NS/LPOv6YqA==",
+      "license": "ISC",
+      "engines": {
+        "node": ">=12"
+      }
+    },
+    "node_modules/d3-interpolate": {
+      "version": "3.0.1",
+      "resolved": "https://registry.npmjs.org/d3-interpolate/-/d3-interpolate-3.0.1.tgz",
+      "integrity": "sha512-3bYs1rOD33uo8aqJfKP3JWPAibgw8Zm2+L9vBKEHJ2Rg+viTR7o5Mmv5mZcieN+FRYaAOWX5SJATX6k1PWz72g==",
+      "license": "ISC",
+      "dependencies": {
+        "d3-color": "1 - 3"
+      },
+      "engines": {
+        "node": ">=12"
+      }
+    },
+    "node_modules/d3-path": {
+      "version": "3.1.0",
+      "resolved": "https://registry.npmjs.org/d3-path/-/d3-path-3.1.0.tgz",
+      "integrity": "sha512-p3KP5HCf/bvjBSSKuXid6Zqijx7wIfNW+J/maPs+iwR35at5JCbLUT0LzF1cnjbCHWhqzQTIN2Jpe8pRebIEFQ==",
+      "license": "ISC",
+      "engines": {
+        "node": ">=12"
+      }
+    },
+    "node_modules/d3-scale": {
+      "version": "4.0.2",
+      "resolved": "https://registry.npmjs.org/d3-scale/-/d3-scale-4.0.2.tgz",
+      "integrity": "sha512-GZW464g1SH7ag3Y7hXjf8RoUuAFIqklOAq3MRl4OaWabTFJY9PN/E1YklhXLh+OQ3fM9yS2nOkCoS+WLZ6kvxQ==",
+      "license": "ISC",
+      "dependencies": {
+        "d3-array": "2.10.0 - 3",
+        "d3-format": "1 - 3",
+        "d3-interpolate": "1.2.0 - 3",
+        "d3-time": "2.1.1 - 3",
+        "d3-time-format": "2 - 4"
+      },
+      "engines": {
+        "node": ">=12"
+      }
+    },
+    "node_modules/d3-shape": {
+      "version": "3.2.0",
+      "resolved": "https://registry.npmjs.org/d3-shape/-/d3-shape-3.2.0.tgz",
+      "integrity": "sha512-SaLBuwGm3MOViRq2ABk3eLoxwZELpH6zhl3FbAoJ7Vm1gofKx6El1Ib5z23NUEhF9AsGl7y+dzLe5Cw2AArGTA==",
+      "license": "ISC",
+      "dependencies": {
+        "d3-path": "^3.1.0"
+      },
+      "engines": {
+        "node": ">=12"
+      }
+    },
+    "node_modules/d3-time": {
+      "version": "3.1.0",
+      "resolved": "https://registry.npmjs.org/d3-time/-/d3-time-3.1.0.tgz",
+      "integrity": "sha512-VqKjzBLejbSMT4IgbmVgDjpkYrNWUYJnbCGo874u7MMKIWsILRX+OpX/gTk8MqjpT1A/c6HY2dCA77ZN0lkQ2Q==",
+      "license": "ISC",
+      "dependencies": {
+        "d3-array": "2 - 3"
+      },
+      "engines": {
+        "node": ">=12"
+      }
+    },
+    "node_modules/d3-time-format": {
+      "version": "4.1.0",
+      "resolved": "https://registry.npmjs.org/d3-time-format/-/d3-time-format-4.1.0.tgz",
+      "integrity": "sha512-dJxPBlzC7NugB2PDLwo9Q8JiTR3M3e4/XANkreKSUxF8vvXKqm1Yfq4Q5dl8budlunRVlUUaDUgFt7eA8D6NLg==",
+      "license": "ISC",
+      "dependencies": {
+        "d3-time": "1 - 3"
+      },
+      "engines": {
+        "node": ">=12"
+      }
+    },
     "node_modules/date-arithmetic": {
       "version": "4.1.0",
       "resolved": "https://registry.npmjs.org/date-arithmetic/-/date-arithmetic-4.1.0.tgz",
@@ -3970,6 +4287,15 @@
       "dev": true,
       "license": "MIT"
     },
+    "node_modules/delaunator": {
+      "version": "5.0.1",
+      "resolved": "https://registry.npmjs.org/delaunator/-/delaunator-5.0.1.tgz",
+      "integrity": "sha512-8nvh+XBe96aCESrGOqMp/84b13H9cdKbG5P2ejQCh4d4sK9RL4371qou9drQjMhvnPmhWl5hnmqbEE0fXr9Xnw==",
+      "license": "ISC",
+      "dependencies": {
+        "robust-predicates": "^3.0.2"
+      }
+    },
     "node_modules/delayed-stream": {
       "version": "1.0.0",
       "resolved": "https://registry.npmjs.org/delayed-stream/-/delayed-stream-1.0.0.tgz",
@@ -4910,6 +5236,15 @@
       "integrity": "sha512-k/vGaX4/Yla3WzyMCvTQOXYeIHvqOKtnqBduzTHpzpQZzAskKMhZ2K+EnBiSM9zGSoIFeMpXKxa4dYeZIQqewQ==",
       "license": "ISC"
     },
+    "node_modules/internmap": {
+      "version": "2.0.3",
+      "resolved": "https://registry.npmjs.org/internmap/-/internmap-2.0.3.tgz",
+      "integrity": "sha512-5Hh7Y1wQbvY5ooGgPbDaL5iYLAPzMTUrjMulskHLH6wnv/A+1q5rgEaiuqEjB+oxGXIVZs1FF+R/KPN3ZSQYYg==",
+      "license": "ISC",
+      "engines": {
+        "node": ">=12"
+      }
+    },
     "node_modules/invariant": {
       "version": "2.2.4",
       "resolved": "https://registry.npmjs.org/invariant/-/invariant-2.2.4.tgz",
@@ -6371,6 +6706,12 @@
       "url": "https://github.com/sponsors/isaacs"
       }
     },
+    "node_modules/robust-predicates": {
+      "version": "3.0.2",
+      "resolved": "https://registry.npmjs.org/robust-predicates/-/robust-predicates-3.0.2.tgz",
+      "integrity": "sha512-IXgzBWvWQwE6PrDI05OvmXUIruQTcoMDzRsOd5CDvHCVLcLHMTSYvOK5Cm46kWqlV3yAbuSpBZdJ5oP5OUoStg==",
+      "license": "Unlicense"
+    },
     "node_modules/rollup": {
       "version": "4.30.1",
       "resolved": "https://registry.npmjs.org/rollup/-/rollup-4.30.1.tgz",

View File

@@ -1,5 +1,5 @@
 {
-  "name": "heidi-frontend-v2",
+  "name": "aare-web",
   "private": true,
   "version": "0.0.0",
   "type": "module",
@@ -29,8 +29,9 @@
     "@mui/lab": "^6.0.0-beta.29",
     "@mui/material": "^6.1.6",
     "@mui/system": "^6.1.6",
+    "@mui/x-charts": "^7.28.0",
     "@mui/x-data-grid-premium": "^7.27.2",
-    "@mui/x-tree-view": "^7.26.0",
+    "@mui/x-tree-view": "^7.28.0",
     "axios": "^1.7.7",
     "chokidar": "^4.0.1",
     "dayjs": "^1.11.13",

View File

@@ -186,7 +186,7 @@ const ResultGrid: React.FC<ResultGridProps> = ({ activePgroup }) => {
         setBasePath(`${OpenAPI.BASE}/`);

-        SamplesService.getSampleResultsSamplesResultsGet(activePgroup)
+        SamplesService.getSampleResults(activePgroup)
             .then((response: SampleResult[]) => {
                 const treeRows: TreeRow[] = [];

View File

@@ -4,7 +4,10 @@ import {
 } from '@mui/material';
 import ExpandMoreIcon from '@mui/icons-material/ExpandMore';
 import './SampleImage.css';
 import { DataGridPremium, GridColDef } from "@mui/x-data-grid-premium";
+import { LineChart } from '@mui/x-charts/LineChart';
 import { SamplesService } from "../../openapi";

 interface RunDetailsProps {
@@ -15,6 +18,10 @@ interface RunDetailsProps {
     onHeightChange?: (height: number) => void;
 }

+interface CCPoint {
+    resolution: number;
+    value: number;
+}

 interface ExperimentParameters {
     run_number: number;
@@ -26,6 +33,7 @@ interface ExperimentParameters {

 interface ProcessingResults {
+    id: number;
     pipeline: string;
     resolution: number;
     unit_cell: string;
@@ -33,8 +41,8 @@ interface ProcessingResults {
     rmerge: number;
     rmeas: number;
     isig: number;
-    cc: number;
-    cchalf: number;
+    cc: CCPoint[];
+    cchalf: CCPoint[];
     completeness: number;
     multiplicity: number;
     nobs: number;
@@ -43,6 +51,7 @@ interface ProcessingResults {
     comments?: string | null;
 }

 const RunDetails: React.FC<RunDetailsProps> = ({ run, onHeightChange, basePath, runId, sample_id }) => {
     const containerRef = useRef<HTMLDivElement | null>(null);
     const [currentHeight, setCurrentHeight] = useState<number>(0);
@@ -60,10 +69,11 @@ const RunDetails: React.FC<RunDetailsProps> = ({ run, onHeightChange, basePath,
     const fetchResults = async (sample_id: number, runId: number) => {
         try {
-            const results = await SamplesService.getResultsForRunAndSampleSamplesProcessingResultsSampleIdRunIdGet(sample_id, runId);
+            const results = await SamplesService.getResultsForRunAndSample(sample_id, runId);

             // Handle nested results explicitly
             const mappedResults: ProcessingResults[] = results.map((res): ProcessingResults => ({
+                id: res.id,
                 pipeline: res.result?.pipeline || 'N/A',
                 resolution: res.result.resolution ?? 0,
                 unit_cell: res.result?.unit_cell || 'N/A',
@@ -71,8 +81,8 @@ const RunDetails: React.FC<RunDetailsProps> = ({ run, onHeightChange, basePath,
                 rmerge: res.result?.rmerge ?? 0,
                 rmeas: res.result?.rmeas ?? 0,
                 isig: res.result?.isig ?? 0,
-                cc: res.result?.cc ?? 0,
-                cchalf: res.result?.cchalf ?? 0,
+                cc: res.result?.cc || [],
+                cchalf: res.result?.cchalf || [],
                 completeness: res.result?.completeness ?? 0,
                 multiplicity: res.result?.multiplicity ?? 0,
                 nobs: res.result?.nobs ?? 0,
@@ -88,22 +98,38 @@ const RunDetails: React.FC<RunDetailsProps> = ({ run, onHeightChange, basePath,
     };

-    const resultColumns: GridColDef[] = [
-        {field: 'pipeline', headerName: 'Pipeline', flex: 1},
-        {field: 'resolution', headerName: 'Resolution (Å)', flex: 1},
-        {field: 'unit_cell', headerName: 'Unit Cell (Å)', flex: 1.5},
-        {field: 'spacegroup', headerName: 'Spacegroup', flex: 1},
-        {field: 'rmerge', headerName: 'Rmerge', flex: 1},
-        {field: 'rmeas', headerName: 'Rmeas', flex: 1},
-        {field: 'isig', headerName: 'I/sig(I)', flex: 1},
-        {field: 'cc', headerName: 'CC', flex: 1},
-        {field: 'cchalf', headerName: 'CC(1/2)', flex: 1},
-        {field: 'completeness', headerName: 'Completeness (%)', flex: 1},
-        {field: 'multiplicity', headerName: 'Multiplicity', flex: 1},
-        {field: 'nobs', headerName: 'N obs.', flex: 1},
-        {field: 'total_refl', headerName: 'Total Reflections', flex: 1},
-        {field: 'unique_refl', headerName: 'Unique Reflections', flex: 1},
-        {field: 'comments', headerName: 'Comments', flex: 2},
+    const resultColumns: GridColDef<ProcessingResults>[] = [
+        { field: 'pipeline', headerName: 'Pipeline', flex: 1 },
+        { field: 'resolution', headerName: 'Resolution (Å)', flex: 1 },
+        { field: 'unit_cell', headerName: 'Unit Cell (Å)', flex: 1.5 },
+        { field: 'spacegroup', headerName: 'Spacegroup', flex: 1 },
+        { field: 'rmerge', headerName: 'Rmerge', flex: 1 },
+        { field: 'rmeas', headerName: 'Rmeas', flex: 1 },
+        { field: 'isig', headerName: 'I/sig(I)', flex: 1 },
+        {
+            field: 'cc',
+            headerName: 'CC',
+            flex: 1,
+            // MUI X v7 valueGetter signature: (value, row) => ...
+            valueGetter: (_value, row) =>
+                Array.isArray(row.cc)
+                    ? row.cc.map((point: CCPoint) => `${point.value.toFixed(2)}@${point.resolution.toFixed(2)}`).join(', ')
+                    : '',
+        },
+        {
+            field: 'cchalf',
+            headerName: 'CC(1/2)',
+            flex: 1,
+            valueGetter: (_value, row) =>
+                Array.isArray(row.cchalf)
+                    ? row.cchalf.map((point: CCPoint) => `${point.value.toFixed(2)}@${point.resolution.toFixed(2)}`).join(', ')
+                    : '',
+        },
+        { field: 'completeness', headerName: 'Completeness (%)', flex: 1 },
+        { field: 'multiplicity', headerName: 'Multiplicity', flex: 1 },
+        { field: 'nobs', headerName: 'N obs.', flex: 1 },
+        { field: 'total_refl', headerName: 'Total Reflections', flex: 1 },
+        { field: 'unique_refl', headerName: 'Unique Reflections', flex: 1 },
+        { field: 'comments', headerName: 'Comments', flex: 2 },
     ];

     const updateHeight = () => {
@@ -260,12 +286,12 @@ const RunDetails: React.FC<RunDetailsProps> = ({ run, onHeightChange, basePath,
                 <AccordionDetails style={{width: '100%', overflowX: 'auto'}}>
                     {processingResult ? (
                         <div style={{width: '100%'}}>
-                            <DataGridPremium
-                                rows={processingResult.map((res, idx) => ({id: idx, ...res}))}
+                            <DataGridPremium<ProcessingResults>
+                                rows={processingResult.map((res, idx) => ({ id: idx, ...res }))}
                                 columns={resultColumns}
                                 autoHeight
                                 hideFooter
-                                columnVisibilityModel={{id: false}}
+                                columnVisibilityModel={{ id: false }}
                                 disableColumnResize={false}
                             />
                         </div>
@@ -276,6 +302,38 @@ const RunDetails: React.FC<RunDetailsProps> = ({ run, onHeightChange, basePath,
                 </Accordion>
             </div>

+            {processingResult && processingResult.length > 0 && (
+                <div style={{ width: 400, marginTop: '16px' }}>
+                    <Typography variant="h6" gutterBottom>CC and CC(1/2) vs Resolution</Typography>
+                    <LineChart
+                        xAxis={[
+                            {
+                                data: processingResult[0].cc
+                                    .map((point) => point.resolution)
+                                    .reverse(), // order from low to high resolution
+                                label: 'Resolution (Å)',
+                                reverse: true, // flip the axis so high resolution sits on the left
+                            },
+                        ]}
+                        series={[
+                            {
+                                data: processingResult[0].cc
+                                    .map((point) => point.value)
+                                    .reverse(), // keep CC values aligned with the reversed axis
+                                label: 'CC',
+                            },
+                            {
+                                data: processingResult[0].cchalf
+                                    .map((point) => point.value)
+                                    .reverse(), // keep CC(1/2) values aligned with the reversed axis
+                                label: 'CC(1/2)',
+                            },
+                        ]}
+                        height={300}
+                    />
+                </div>
+            )}
+
             {/* Modal for Zoomed Image */}
             <Modal open={modalOpen} onClose={closeModal}>
                 <Box

View File

@@ -2,6 +2,17 @@ FROM node:18-alpine

 WORKDIR /app

+# Set up build args
+ARG VITE_OPENAPI_BASE
+ARG VITE_SSL_KEY_PATH
+ARG VITE_SSL_CERT_PATH
+ARG NODE_ENV=development
+ENV VITE_OPENAPI_BASE=${VITE_OPENAPI_BASE}
+ENV VITE_SSL_KEY_PATH=${VITE_SSL_KEY_PATH}
+ENV VITE_SSL_CERT_PATH=${VITE_SSL_CERT_PATH}
+ENV NODE_ENV=${NODE_ENV}
+
 # Copy only the necessary package files first
 COPY package*.json ./
 RUN npm install
@@ -14,3 +25,8 @@ COPY . .

 # Build the application
 RUN npm run build
+
+# Serve the app
+EXPOSE 3000
+CMD ["npm", "run", "start-dev"]

View File

@@ -16,6 +16,12 @@ export default defineConfig(({ mode }) => {
             },
             host: '0.0.0.0',
             port: 3000,
+            hmr: {
+                clientPort: 3000,
+                protocol: 'wss', // the dev server sits behind HTTPS, so HMR must use wss://
+                host: 'mx-aare-test.psi.ch', // hostname the browser uses to reach the dev server
+            },
         },
     };
 });
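
The `hmr` block assumes the dev server itself already terminates TLS; in this setup that would come from an `https` entry in the same `server` section, roughly like this sketch (the key/cert paths are the SSL mounts from `docker-compose.yml`, read via the env vars passed into the container):

    server: {
        https: {
            key: fs.readFileSync(process.env.VITE_SSL_KEY_PATH),
            cert: fs.readFileSync(process.env.VITE_SSL_CERT_PATH),
        },
        host: '0.0.0.0',
        port: 3000,
        hmr: { /* as above */ },
    },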

View File

@@ -15,4 +15,5 @@ pydantic[email]
 mysqlclient~=2.1.1
 python-multipart~=0.0.6
 uvicorn==0.23.1
 python-dotenv
+psycopg2-binary
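
`psycopg2-binary` is the PostgreSQL driver that pairs with the new `pgdata`-backed database service; the backend's connection string would then take a form like this (service name and credentials are illustrative, not the project's actual values):

    postgresql+psycopg2://user:password@postgres:5432/aaredb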

File diff suppressed because one or more lines are too long