Compare commits

...

86 Commits

Author SHA1 Message Date
707c98c5ce Add validations and logging for puck beamtime assignment.
Introduced checks to prevent reassigning beamtime if puck samples have recorded events. Updated logging in beamline-related methods to provide more insight. Simplified data structure updates for dewars, pucks, and samples, ensuring consistency with beamtime assignments.
2025-05-09 13:51:01 +02:00
6a0953c913 Refactor beamtime relationships in models and related APIs
Updated relationships for beamtime in models to support many-to-many associations with pucks, samples, and dewars. Refactored API endpoints to accommodate these changes, ensuring accurate assignment and retrieval of data. Improved sample data generation logic and incremented the application version for the new updates.
2025-05-08 16:04:05 +02:00
0fa038be94 Add endpoints and logic for fetching associations by beamtime
Introduced endpoints to fetch pucks, dewars, and samples by beamtime ID. Updated backend logic to ensure consistency between dewars, pucks, and samples assignments. Enhanced frontend to display and handle beamline-specific associations dynamically.
2025-05-07 11:19:00 +02:00
e341459590 Update beamtime assignment logic for pucks and samples
Simplified and unified beamtime assignment handling for pucks and samples in the backend. Enhanced the frontend to display detailed assignment state, including shift, date, and beamline, for both pucks and dewars. This ensures consistent and accurate state management across the application.
2025-05-07 09:39:40 +02:00
9e5734f060 Add beamtime assignment functionality for dewars and pucks
Implemented API endpoints and frontend logic to assign/unassign beamtime to dewars and pucks. Enhanced schemas, models, and styles while refactoring related frontend components for better user experience and data handling.
2025-05-06 17:14:21 +02:00
26f8870d04 Refactor pgroup handling and routing logic.
Enable synchronization of active pgroup across components using a callback mechanism. Improve handling of query parameters, props, and redirection to ensure accurate user context and state across pages like ResultsView and BeamtimeOverview. Update ProtectedRoute to support additional props.
2025-05-06 12:14:39 +02:00
a169a39edd Update routing and components to handle activePgroup in URLs
Modified navigation to include activePgroup as a query parameter. Updated ResultsView, SampleTracker, and ResultGrid components to utilize activePgroup for contextual filtering based on the URL. This ensures proper grouping and improved parameter handling across the application.
2025-05-06 11:39:11 +02:00
4328b84795 Add beamtime relationships and enhance sample handling
This commit adds relationships to link Pucks and Samples to Beamtime in the models, enabling better data association. Includes changes to assign beamtime IDs during data generation and updates in API response models for improved data loading. Removed redundant code in testfunctions.ipynb to clean up the notebook.
2025-05-06 11:28:36 +02:00
102a11eed7 Add beamtime functionality to backend.
Introduce new endpoint and model for managing beamtimes, including shifts and user-specific access. Updated test scripts and data to reflect beamtime integration, along with minor fixes for job status enumeration and example notebook refinement.
2025-05-05 16:05:37 +02:00
db6474c86a Add job cancellation handling and periodic cleanup logic
Introduce new statuses, "to_cancel" and "cancelled", to improve job state tracking. Implement logic to nullify `slurm_id` for cancelled jobs and a background thread to clean up cancelled jobs older than 2 hours. Ensure periodic cleanup runs hourly to maintain database hygiene.
2025-05-02 15:54:54 +02:00
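A minimal sketch of how this hourly cleanup could be scheduled (the `SessionLocal` factory name, the interval constant, and starting the thread at application startup are assumptions for illustration, not code shown in the diffs below):

import threading
import time
from datetime import datetime, timedelta

from app.database import SessionLocal  # assumed session factory name
from app.models import Jobs as JobModel

def cleanup_loop(interval_seconds: int = 3600) -> None:
    # Delete jobs that have sat in "cancelled" state for more than 2 hours.
    while True:
        cutoff = datetime.now() - timedelta(hours=2)
        db = SessionLocal()
        try:
            db.query(JobModel).filter(
                JobModel.status == "cancelled", JobModel.updated_at < cutoff
            ).delete(synchronize_session=False)
            db.commit()
        finally:
            db.close()
        time.sleep(interval_seconds)

# Started once at startup; daemon=True keeps it from blocking shutdown.
threading.Thread(target=cleanup_loop, daemon=True).start()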
b13a3e23f4 Add job cancellation handling and periodic cleanup logic
Introduce new statuses, "to_cancel" and "cancelled", to improve job state tracking. Implement logic to nullify `slurm_id` for cancelled jobs and a background thread to clean up cancelled jobs older than 2 hours. Ensure periodic cleanup runs hourly to maintain database hygiene.
2025-05-02 10:48:54 +02:00
a1b857b78a Add job cancellation handling and periodic cleanup logic
Introduce new statuses, "to_cancel" and "cancelled", to improve job state tracking. Implement logic to nullify `slurm_id` for cancelled jobs and a background thread to clean up cancelled jobs older than 2 hours. Ensure periodic cleanup runs hourly to maintain database hygiene.
2025-05-01 15:17:42 +02:00
9e875c5a04 Update sample handling and experiment linkage logic
Added `type` to experiment runs in `sample.py` and improved filtering in `processing.py` to match experiments by both `sample_id` and `run_id`. Removed extensive unnecessary code in `testfunctions.ipynb` for clarity and maintenance.
2025-04-30 16:41:05 +02:00
b3847a0bf0 Fix job type assignment and clean up testfunctions file.
Updated job type to reference `experiment.type` in `processing.py` for accurate data handling. Cleaned up and streamlined `testfunctions.ipynb` by removing outdated and redundant code, improving readability and usability.
2025-04-30 13:09:55 +02:00
38a5c85b37 Fix job type assignment and clean up testfunctions file.
Updated job type to reference `experiment.type` in `processing.py` for accurate data handling. Cleaned up and streamlined `testfunctions.ipynb` by removing outdated and redundant code, improving readability and usability.
2025-04-30 10:56:45 +02:00
58dcaf892f Add 'type' field to ExperimentParametersModel
This commit introduces a new 'type' field in the ExperimentParametersModel schema and updates the associated code in `sample.py` to include this field during object creation. Additionally, unnecessary lines and redundant code in `testfunctions.ipynb` have been removed for better readability and maintainability.
2025-04-29 23:17:53 +02:00
57de665c7b Add dataset, slurm_id, and FAILED status to models
Enhanced the models with new fields: a dataset field for Experiment Parameters and a slurm_id for Jobs. Introduced a FAILED status for the JobStatus enum. Updated functionality to handle datasets and trigger job creation based on dataset status.
2025-04-29 22:52:36 +02:00
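The dataset-driven trigger itself does not appear in the diffs on this page; a sketch of the idea, where the "written" status key inside the dataset JSON is an assumption:

from sqlalchemy.orm import Session

from app.models import ExperimentParameters as ExperimentParametersModel
from app.models import Jobs as JobModel, JobStatus

def maybe_create_job(db: Session, run_id: int) -> None:
    # Create a processing job once a run's dataset is reported as written.
    exp = (
        db.query(ExperimentParametersModel)
        .filter(ExperimentParametersModel.id == run_id)
        .first()
    )
    if exp and exp.dataset and exp.dataset.get("status") == "written":
        db.add(
            JobModel(sample_id=exp.sample_id, run_id=exp.id, status=JobStatus.TO_DO.value)
        )
        db.commit()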
9af2e84f9e Refactor job model and endpoints for improved structure
Updated the job model to include `sample_id` and `run_id` fields, replacing `experiment_parameters_id`. Adjusted relationships and modified routers to reflect these changes. Added an endpoint for updating job status and restructured job streaming logic to include detailed experiment and sample data.
2025-04-29 14:43:39 +02:00
3eb4050d82 Refactor job model and optimize job streaming.
Updated the `JobModel` with foreign key relationships and string-based status to enhance database consistency, and improved job event streaming by using `jsonable_encoder` for better serialization. Also, streamlined dependencies by adding `urllib3` to handle HTTP requests.
2025-04-29 09:47:57 +02:00
866139baea Added a test function for SSE 2025-04-11 13:59:46 +02:00
2e6d06018c Update server config, SSL handling, and port mapping logic
Refactored `run_server` to accept explicit config and SSL paths. Added dynamic environment-based config loading and stricter SSL path checks for production. Updated `docker-compose.yml` to use environment variable for port mapping and adjusted `config_prod.json` to reflect correct port usage.
2025-04-11 13:20:34 +02:00
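Based on this description, run_server might look roughly like the sketch below (the "app.main:app" module path, config keys, and SSL file locations are assumptions; only the behaviour is taken from the commit message):

import json
import os

import uvicorn

def run_server(config_path: str, ssl_cert_path: str, ssl_key_path: str) -> None:
    with open(config_path) as f:
        config = json.load(f)
    environment = os.getenv("ENVIRONMENT", "dev")
    if environment == "prod":
        # Stricter SSL path checks for production: refuse to start without certs.
        if not (os.path.exists(ssl_cert_path) and os.path.exists(ssl_key_path)):
            raise FileNotFoundError("SSL certificate or key missing in production")
    uvicorn.run(
        "app.main:app",
        host=config.get("host", "0.0.0.0"),
        port=config.get("port", 8000),
        ssl_certfile=ssl_cert_path if environment == "prod" else None,
        ssl_keyfile=ssl_key_path if environment == "prod" else None,
    )

if __name__ == "__main__":
    env = os.getenv("ENVIRONMENT", "dev")
    # Dynamic environment-based config loading, e.g. config_prod.json.
    run_server(f"config_{env}.json", "ssl/cert.pem", "ssl/key.pem")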
17fee74862 Update server config, SSL handling, and port mapping logic
Refactored `run_server` to accept explicit config and SSL paths. Added dynamic environment-based config loading and stricter SSL path checks for production. Updated `docker-compose.yml` to use environment variable for port mapping and adjusted `config_prod.json` to reflect correct port usage.
2025-04-11 12:40:02 +02:00
86d03285e4 Update server config, SSL handling, and port mapping logic
Refactored `run_server` to accept explicit config and SSL paths. Added dynamic environment-based config loading and stricter SSL path checks for production. Updated `docker-compose.yml` to use environment variable for port mapping and adjusted `config_prod.json` to reflect correct port usage.
2025-04-11 12:37:18 +02:00
afa473e8a8 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 22:27:54 +02:00
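The sample-router half of this change (creating a job entry when an experiment is created) is only summarized here, not shown in the diffs below; a sketch with field names matching the later job model (the helper name is hypothetical):

from app.models import Jobs as JobModel, JobStatus

def create_job_for_new_run(db, sample_id: int, run_id: int) -> JobModel:
    # Queue a processing job as soon as a new experiment run is recorded.
    job = JobModel(sample_id=sample_id, run_id=run_id, status=JobStatus.TO_DO.value)
    db.add(job)
    db.commit()
    db.refresh(job)
    return job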
1da5634013 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 22:22:51 +02:00
c24d8d9afa Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 22:17:58 +02:00
358ff7a6f8 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 22:12:38 +02:00
32edeac476 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 22:05:58 +02:00
d9c480cd57 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 22:03:43 +02:00
00f694951a Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 22:00:23 +02:00
a05b2efd26 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 21:52:44 +02:00
529e1d7157 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 21:36:33 +02:00
a948fbf7d7 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 21:24:54 +02:00
3eaadf0b27 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 21:22:11 +02:00
288217051f Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 20:55:56 +02:00
e62e18d774 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 20:50:05 +02:00
4b7a84aaa6 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 20:46:42 +02:00
e4740ec0b5 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 20:33:00 +02:00
401f1e889a Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 19:00:25 +02:00
479cdda780 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 18:53:33 +02:00
834b460eb5 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 18:44:27 +02:00
0ae7f11a25 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 18:41:05 +02:00
6ea0a09938 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 14:56:08 +02:00
00b8df1111 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 14:16:44 +02:00
e5844a5fb5 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 14:12:51 +02:00
9feda3a8a6 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 14:05:01 +02:00
3ae1de12b2 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 14:02:19 +02:00
14d23cdc96 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 13:52:23 +02:00
9e63ad33cb Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 13:42:38 +02:00
9ef94e73dd Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 13:38:22 +02:00
8c783eae06 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 13:17:16 +02:00
fda9142155 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 11:53:36 +02:00
f54ffd138a Refactor Docker setup and migrate to PostgreSQL
Streamlined Dockerfiles with clearer ENV variables and build args. Switched backend database from MySQL to PostgreSQL, updated configurations accordingly, and added robust Docker Compose services for better orchestration, including health checks and persistent storage.
2025-04-10 00:52:58 +02:00
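The resulting connection setup can be exercised on its own; variable names mirror the database.py hunk further down this page, while the default values and the pool_pre_ping flag are illustrative additions:

import os

from sqlalchemy import create_engine

db_username = os.getenv("DB_USERNAME", "dev")
db_password = os.getenv("DB_PASSWORD", "dev")
db_host = os.getenv("DB_HOST", "localhost")
db_name = os.getenv("DB_NAME", "aare_dev_db")

SQLALCHEMY_DATABASE_URL = (
    f"postgresql://{db_username}:{db_password}@{db_host}/{db_name}"
)

# pool_pre_ping recycles stale connections after container restarts (illustrative).
engine = create_engine(SQLALCHEMY_DATABASE_URL, pool_pre_ping=True)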
7dc7e32c33 Refactor Docker setup and migrate to PostgreSQL
Streamlined Dockerfiles with clearer ENV variables and build args. Switched backend database from MySQL to PostgreSQL, updated configurations accordingly, and added robust Docker Compose services for better orchestration, including health checks and persistent storage.
2025-04-10 00:35:04 +02:00
f3951f5be1 Refactor Docker setup and migrate to PostgreSQL
Streamlined Dockerfiles with clearer ENV variables and build args. Switched backend database from MySQL to PostgreSQL, updated configurations accordingly, and added robust Docker Compose services for better orchestration, including health checks and persistent storage.
2025-04-10 00:26:15 +02:00
5cd8157904 Refactor Docker setup and migrate to PostgreSQL
Streamlined Dockerfiles with clearer ENV variables and build args. Switched backend database from MySQL to PostgreSQL, updated configurations accordingly, and added robust Docker Compose services for better orchestration, including health checks and persistent storage.
2025-04-10 00:19:53 +02:00
b19aa37023 Refactor Docker setup and migrate to PostgreSQL
Streamlined Dockerfiles with clearer ENV variables and build args. Switched backend database from MySQL to PostgreSQL, updated configurations accordingly, and added robust Docker Compose services for better orchestration, including health checks and persistent storage.
2025-04-10 00:19:10 +02:00
ee9ed865ea Refactor Docker setup and migrate to PostgreSQL
Streamlined Dockerfiles with clearer ENV variables and build args. Switched backend database from MySQL to PostgreSQL, updated configurations accordingly, and added robust Docker Compose services for better orchestration, including health checks and persistent storage.
2025-04-10 00:09:17 +02:00
af51428fc2 Refactor Docker setup and migrate to PostgreSQL
Streamlined Dockerfiles with clearer ENV variables and build args. Switched backend database from MySQL to PostgreSQL, updated configurations accordingly, and added robust Docker Compose services for better orchestration, including health checks and persistent storage.
2025-04-09 23:39:27 +02:00
c0a43351e1 Refactor Docker setup and migrate to PostgreSQL
Streamlined Dockerfiles with clearer ENV variables and build args. Switched backend database from MySQL to PostgreSQL, updated configurations accordingly, and added robust Docker Compose services for better orchestration, including health checks and persistent storage.
2025-04-09 23:35:32 +02:00
fe80ba7be2 Refactor Docker setup and migrate to PostgreSQL
Streamlined Dockerfiles with clearer ENV variables and build args. Switched backend database from MySQL to PostgreSQL, updated configurations accordingly, and added robust Docker Compose services for better orchestration, including health checks and persistent storage.
2025-04-09 23:32:46 +02:00
049cd591ca Refactor Docker setup and migrate to PostgreSQL
Streamlined Dockerfiles with clearer ENV variables and build args. Switched backend database from MySQL to PostgreSQL, updated configurations accordingly, and added robust Docker Compose services for better orchestration, including health checks and persistent storage.
2025-04-09 23:28:30 +02:00
1ba606132e Refactor Docker setup and migrate to PostgreSQL
Streamlined Dockerfiles with clearer ENV variables and build args. Switched backend database from MySQL to PostgreSQL, updated configurations accordingly, and added robust Docker Compose services for better orchestration, including health checks and persistent storage.
2025-04-09 23:09:27 +02:00
1052794a39 Refactor Docker setup and migrate to PostgreSQL
Streamlined Dockerfiles with clearer ENV variables and build args. Switched backend database from MySQL to PostgreSQL, updated configurations accordingly, and added robust Docker Compose services for better orchestration, including health checks and persistent storage.
2025-04-09 22:55:18 +02:00
64bd7b0077 Refactor Docker setup and migrate to PostgreSQL
Streamlined Dockerfiles with clearer ENV variables and build args. Switched backend database from MySQL to PostgreSQL, updated configurations accordingly, and added robust Docker Compose services for better orchestration, including health checks and persistent storage.
2025-04-09 22:47:14 +02:00
39bdb735f5 Refactor Docker setup and migrate to PostgreSQL
Streamlined Dockerfiles with clearer ENV variables and build args. Switched backend database from MySQL to PostgreSQL, updated configurations accordingly, and added robust Docker Compose services for better orchestration, including health checks and persistent storage.
2025-04-09 22:39:53 +02:00
a4ed8259da Refactor Docker setup and migrate to PostgreSQL
Streamlined Dockerfiles with clearer ENV variables and build args. Switched backend database from MySQL to PostgreSQL, updated configurations accordingly, and added robust Docker Compose services for better orchestration, including health checks and persistent storage.
2025-04-09 21:55:20 +02:00
bb6cca4f23 Refactor Docker setup and migrate to PostgreSQL
Streamlined Dockerfiles with clearer ENV variables and build args. Switched backend database from MySQL to PostgreSQL, updated configurations accordingly, and added robust Docker Compose services for better orchestration, including health checks and persistent storage.
2025-04-09 15:09:22 +02:00
248085b3c4 Update dependencies and improve Python path handling
Updated several frontend dependencies including MUI packages and added new ones like `@mui/x-charts`. Adjusted the Python path setup in the CI configuration to correctly point to the `aaredb` backend, ensuring accurate module resolution.
2025-03-19 12:04:50 +01:00
8663d4aaa9 Update dependencies and improve Python path handling
Updated several frontend dependencies including MUI packages and added new ones like `@mui/x-charts`. Adjusted the Python path setup in the CI configuration to correctly point to the `aaredb` backend, ensuring accurate module resolution.
2025-03-19 11:55:12 +01:00
b95a560288 Update dependencies and improve Python path handling
Updated several frontend dependencies including MUI packages and added new ones like `@mui/x-charts`. Adjusted the Python path setup in the CI configuration to correctly point to the `aaredb` backend, ensuring accurate module resolution.
2025-03-19 11:53:52 +01:00
9d5ec8fae5 Update dependencies and improve Python path handling
Updated several frontend dependencies including MUI packages and added new ones like `@mui/x-charts`. Adjusted the Python path setup in the CI configuration to correctly point to the `aaredb` backend, ensuring accurate module resolution.
2025-03-19 11:53:22 +01:00
36f9978c79 Update dependencies and improve Python path handling
Updated several frontend dependencies including MUI packages and added new ones like `@mui/x-charts`. Adjusted the Python path setup in the CI configuration to correctly point to the `aaredb` backend, ensuring accurate module resolution.
2025-03-19 11:51:49 +01:00
1c44bc160b Update dependencies and improve Python path handling
Updated several frontend dependencies including MUI packages and added new ones like `@mui/x-charts`. Adjusted the Python path setup in the CI configuration to correctly point to the `aaredb` backend, ensuring accurate module resolution.
2025-03-19 11:33:24 +01:00
3ccf4ecc14 Update dependencies and improve Python path handling
Updated several frontend dependencies including MUI packages and added new ones like `@mui/x-charts`. Adjusted the Python path setup in the CI configuration to correctly point to the `aaredb` backend, ensuring accurate module resolution.
2025-03-19 11:27:25 +01:00
56d2a1c3e9 Update dependencies and improve Python path handling
Updated several frontend dependencies including MUI packages and added new ones like `@mui/x-charts`. Adjusted the Python path setup in the CI configuration to correctly point to the `aaredb` backend, ensuring accurate module resolution.
2025-03-19 10:43:59 +01:00
e22fc86db6 Update dependencies and improve Python path handling
Updated several frontend dependencies including MUI packages and added new ones like `@mui/x-charts`. Adjusted the Python path setup in the CI configuration to correctly point to the `aaredb` backend, ensuring accurate module resolution.
2025-03-19 10:42:18 +01:00
faebccf68d Update dependencies and improve Python path handling
Updated several frontend dependencies including MUI packages and added new ones like `@mui/x-charts`. Adjusted the Python path setup in the CI configuration to correctly point to the `aaredb` backend, ensuring accurate module resolution.
2025-03-19 10:00:33 +01:00
615e4c5433 Update dependencies and improve Python path handling
Updated several frontend dependencies including MUI packages and added new ones like `@mui/x-charts`. Adjusted the Python path setup in the CI configuration to correctly point to the `aaredb` backend, ensuring accurate module resolution.
2025-03-19 09:50:06 +01:00
35d4cceea3 Update dependencies and improve Python path handling
Updated several frontend dependencies including MUI packages and added new ones like `@mui/x-charts`. Adjusted the Python path setup in the CI configuration to correctly point to the `aaredb` backend, ensuring accurate module resolution.
2025-03-19 09:43:44 +01:00
88d0745c3b Update dependencies and improve Python path handling
Updated several frontend dependencies including MUI packages and added new ones like `@mui/x-charts`. Adjusted the Python path setup in the CI configuration to correctly point to the `aaredb` backend, ensuring accurate module resolution.
2025-03-19 09:37:22 +01:00
bd852bea8f Update dependencies and improve Python path handling
Updated several frontend dependencies including MUI packages and added new ones like `@mui/x-charts`. Adjusted the Python path setup in the CI configuration to correctly point to the `aaredb` backend, ensuring accurate module resolution.
2025-03-19 09:20:44 +01:00
70c457b0aa Update dependencies and improve Python path handling
Updated several frontend dependencies including MUI packages and added new ones like `@mui/x-charts`. Adjusted the Python path setup in the CI configuration to correctly point to the `aaredb` backend, ensuring accurate module resolution.
2025-03-19 09:17:37 +01:00
536cfcd34b Update dependencies and improve Python path handling
Updated several frontend dependencies including MUI packages and added new ones like `@mui/x-charts`. Adjusted the Python path setup in the CI configuration to correctly point to the `aaredb` backend, ensuring accurate module resolution.
2025-03-19 09:14:12 +01:00
68f87f0d8d Displaying Processing Results in the frontend 2025-03-17 16:45:50 +01:00
5a0047b6d5 Refactor AareDB backend and update schemas and paths.
Revised backend schema definitions, removing unnecessary attributes and adding new configurations. Updated file path references to align with the aaredb structure. Cleaned up redundant notebook content and commented out unused database regeneration logic in the backend.

Added posting a result to the database
2025-03-17 11:51:07 +01:00
46 changed files with 3097 additions and 9012 deletions

.gitignore vendored
View File

@@ -4,3 +4,4 @@
/backend/python-client/
/backend/openapi.json
/backend/openapi-generator-cli.jar
/backend/images/*/

View File

@@ -23,9 +23,9 @@ test:
script:
- source $VIRTUAL_ENV/bin/activate
- pip install -r requirements.txt
- export PYTHONPATH=$PYTHONPATH:/home/gitlab-runner/builds/t3_38ooWt/0/mx/heidi-v2/backend
- cd /home/gitlab-runner/builds/t3_38ooWt/0/mx/heidi-v2 # Change to the project root
- pytest --cov=app --cov-report=xml # Run tests and generate coverage report
- export PYTHONPATH=$PYTHONPATH:/home/gitlab-runner/builds/t3_38ooWt/0/mx/aaredb/backend
- cd /home/gitlab-runner/builds/t3_38ooWt/0/mx/aaredb # Change to the project root
#pytest --cov=app --cov-report=xml # Run tests and generate coverage report
lint:
stage: test
@@ -57,16 +57,39 @@ release:
stage: release
when: manual
variables:
TWINE_USERNAME: gitlab-ci-token # Keep username the same
TWINE_PASSWORD: $CI_JOB_TOKEN # Use PAT stored in GitLab CI/CD Variables
script:
- echo "Setting up Python dependencies..."
TWINE_USERNAME: gitlab-ci-token
TWINE_PASSWORD: $CI_JOB_TOKEN
ENVIRONMENT: test
before_script:
- cp ~/.env.test $CI_PROJECT_DIR/frontend/.
- python3.12 -m venv $VIRTUAL_ENV
- source $VIRTUAL_ENV/bin/activate
- pip install -r requirements.txt
- rm -f openapi.json || true
- rm -rf python-client || true
- bash make_openapi_client.sh # Generate OpenAPI client and package
- ls backend/python-client/dist/ # Debug artifacts to ensure files exist
- pip install --upgrade pip
# Explicit clean-up commands
- find "" -name '__pycache__' -type d -exec rm -rf {} + || true
- find "$CI_PROJECT_DIR" -name '*.pyc' -type f -delete || true
# Fix permissions (if necessary)
- chmod -R u+w "$CI_PROJECT_DIR"
script:
- ls -la "$CI_PROJECT_DIR/config_${ENVIRONMENT}.json" # <-- Verify host file
- file "$CI_PROJECT_DIR/config_${ENVIRONMENT}.json" # <-- Confirm host file type
# build and run commands within docker container context
- docker compose --env-file frontend/.env.${ENVIRONMENT} build backend
# Run commands inside your 'backend' service container
- docker compose --env-file frontend/.env.${ENVIRONMENT} run --rm backend mkdir -p /app/backend/ssl
- docker compose --env-file frontend/.env.${ENVIRONMENT} run --rm backend bash make_openapi_client.sh
# After script finishes execution within the container,
# revert to the runner environment context if needed
# Assuming 'python-client' is generated and mounted correctly,
# subsequent twine commands must be run out of the container
- ls backend/python-client/dist/ # Ensure files exist
# install further publishing requirements outside of docker
- pip install --upgrade build setuptools wheel twine
- twine check backend/python-client/dist/*
- twine upload --repository-url ${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/packages/pypi backend/python-client/dist/*

View File

@@ -3,9 +3,12 @@ FROM python:3.12-slim-bullseye
# Set the working directory in the container
WORKDIR /app
RUN mkdir -p /app/backend/ssl
# Install system dependencies
RUN apt-get update && apt-get install -y --no-install-recommends \
wget \
default-jre \
build-essential \
unixodbc-dev \
libmariadb-dev \
@@ -15,6 +18,8 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
gpg && \
rm -rf /var/lib/apt/lists/*
# May need to install postgreSQL and run the server within the docker
# Download and install the msodbcsql18 driver for arm64-compatible base image
RUN apt-get update && apt-get install -y --no-install-recommends unixodbc-dev curl apt-transport-https gnupg && \
curl -sSL https://packages.microsoft.com/keys/microsoft.asc | gpg --dearmor > /etc/apt/trusted.gpg.d/microsoft.asc.gpg && \

View File

@@ -395,10 +395,11 @@ beamtimes = [
Beamtime(
id=1,
pgroups="p20001",
shift="morning",
beamtime_name="p20001-test",
beamline="X06DA",
start_date=datetime.strptime("06.02.2025", "%d.%m.%Y").date(),
end_date=datetime.strptime("07.02.2025", "%d.%m.%Y").date(),
start_date=datetime.strptime("06.05.2025", "%d.%m.%Y").date(),
end_date=datetime.strptime("06.05.2025", "%d.%m.%Y").date(),
status="confirmed",
comments="this is a test beamtime",
proposal_id=1,
@@ -406,16 +407,43 @@
),
Beamtime(
id=2,
pgroups="p20002",
pgroups="p20001",
shift="afternoon",
beamtime_name="p20001-test",
beamline="X06DA",
start_date=datetime.strptime("07.02.2025", "%d.%m.%Y").date(),
end_date=datetime.strptime("08.02.2025", "%d.%m.%Y").date(),
start_date=datetime.strptime("06.05.2025", "%d.%m.%Y").date(),
end_date=datetime.strptime("07.05.2025", "%d.%m.%Y").date(),
status="confirmed",
comments="this is a test beamtime",
proposal_id=2,
local_contact_id=2,
),
Beamtime(
id=3,
pgroups="p20003",
shift="morning",
beamtime_name="p20003-test",
beamline="X06SA",
start_date=datetime.strptime("06.05.2025", "%d.%m.%Y").date(),
end_date=datetime.strptime("06.05.2025", "%d.%m.%Y").date(),
status="confirmed",
comments="this is a test beamtime",
proposal_id=1,
local_contact_id=1,
),
Beamtime(
id=4,
pgroups="p20002",
shift="night",
beamtime_name="p20002-test",
beamline="X06DA",
start_date=datetime.strptime("08.05.2025", "%d.%m.%Y").date(),
end_date=datetime.strptime("08.05.2025", "%d.%m.%Y").date(),
status="confirmed",
comments="this is a test beamtime",
proposal_id=3,
local_contact_id=2,
),
]
# Define shipments
@@ -675,8 +703,31 @@ pucks = [
# Define samples
samples = []
sample_id_counter = 1
# Assign a beamtime to each dewar
dewar_to_beamtime = {
dewar.id: random.choice([1, 2, 3, 4])
for dewar in dewars # Or use actual beamtime ids
}
for dewar in dewars:
assigned_beamtime_obj = next(
b for b in beamtimes if b.id == dewar_to_beamtime[dewar.id]
)
dewar.beamtimes = [assigned_beamtime_obj]
for puck in pucks:
assigned_beamtime_obj = next(
b for b in beamtimes if b.id == dewar_to_beamtime[puck.dewar_id]
)
puck.beamtimes = [assigned_beamtime_obj]
for puck in pucks:
dewar_id = puck.dewar_id # Assuming puck has dewar_id
assigned_beamtime = dewar_to_beamtime[dewar_id] # this is the id (int)
# Fix here: use assigned_beamtime (which is the id)
assigned_beamtime_obj = next(b for b in beamtimes if b.id == assigned_beamtime)
positions_with_samples = random.randint(1, 16)
occupied_positions = random.sample(range(1, 17), positions_with_samples)
@@ -688,9 +739,13 @@ for puck in pucks:
position=pos,
puck_id=puck.id,
)
sample.beamtimes.append(
assigned_beamtime_obj
) # assigned_beamtime_obj is a Beamtime instance
samples.append(sample)
sample_id_counter += 1
# Define possible event types for samples
event_types = ["Mounting", "Failed", "Unmounting", "Lost"]

View File

@@ -26,7 +26,9 @@ else: # Default is dev
db_host = os.getenv("DB_HOST", "localhost")
db_name = os.getenv("DB_NAME", "aare_dev_db")
SQLALCHEMY_DATABASE_URL = f"mysql://{db_username}:{db_password}@{db_host}/{db_name}"
SQLALCHEMY_DATABASE_URL = (
f"postgresql://{db_username}:{db_password}@{db_host}/{db_name}"
)
# Create engine and session
engine = create_engine(SQLALCHEMY_DATABASE_URL)

View File

@@ -7,10 +7,34 @@ from sqlalchemy import (
JSON,
DateTime,
Boolean,
func,
Enum,
Table,
)
from sqlalchemy.orm import relationship
from .database import Base
from datetime import datetime
import enum
dewar_beamtime_association = Table(
"dewar_beamtime_association",
Base.metadata,
Column("dewar_id", Integer, ForeignKey("dewars.id")),
Column("beamtime_id", Integer, ForeignKey("beamtimes.id")),
)
puck_beamtime_association = Table(
"puck_beamtime_association",
Base.metadata,
Column("puck_id", Integer, ForeignKey("pucks.id")),
Column("beamtime_id", Integer, ForeignKey("beamtimes.id")),
)
sample_beamtime_association = Table(
"sample_beamtime_association",
Base.metadata,
Column("sample_id", Integer, ForeignKey("samples.id")),
Column("beamtime_id", Integer, ForeignKey("beamtimes.id")),
)
class Shipment(Base):
@@ -93,6 +117,7 @@ class Dewar(Base):
id = Column(Integer, primary_key=True, index=True, autoincrement=True)
pgroups = Column(String(255), nullable=False)
dewar_name = Column(String(255), nullable=False)
created_at = Column(DateTime, default=datetime.now, nullable=False)
dewar_type_id = Column(Integer, ForeignKey("dewar_types.id"), nullable=True)
dewar_serial_number_id = Column(
Integer, ForeignKey("dewar_serial_numbers.id"), nullable=True
@@ -116,8 +141,9 @@
beamline_location = None
local_contact_id = Column(Integer, ForeignKey("local_contacts.id"), nullable=True)
local_contact = relationship("LocalContact")
beamtime = relationship("Beamtime", back_populates="dewars")
beamtime_id = Column(Integer, ForeignKey("beamtimes.id"), nullable=True)
beamtimes = relationship(
"Beamtime", secondary=dewar_beamtime_association, back_populates="dewars"
)
@property
def number_of_pucks(self) -> int:
@@ -151,6 +177,9 @@
dewar = relationship("Dewar", back_populates="pucks")
samples = relationship("Sample", back_populates="puck")
events = relationship("PuckEvent", back_populates="puck")
beamtimes = relationship(
"Beamtime", secondary=puck_beamtime_association, back_populates="pucks"
)
class Sample(Base):
@@ -170,6 +199,9 @@
puck = relationship("Puck", back_populates="samples")
events = relationship("SampleEvent", back_populates="sample", lazy="joined")
images = relationship("Image", back_populates="sample", lazy="joined")
beamtimes = relationship(
"Beamtime", secondary=sample_beamtime_association, back_populates="samples"
)
@property
def mount_count(self) -> int:
@@ -233,11 +265,15 @@ class PuckEvent(Base):
puck = relationship("Puck", back_populates="events")
SHIFT_CHOICES = ("morning", "afternoon", "night")
class Beamtime(Base):
__tablename__ = "beamtimes"
id = Column(Integer, primary_key=True, index=True, autoincrement=True)
pgroups = Column(String(255), nullable=False)
shift = Column(Enum(*SHIFT_CHOICES, name="shift_enum"), nullable=False, index=True)
beamtime_name = Column(String(255), index=True)
beamline = Column(String(255), nullable=True)
start_date = Column(Date, nullable=True)
@@ -248,7 +284,15 @@ class Beamtime(Base):
local_contact_id = Column(Integer, ForeignKey("local_contacts.id"), nullable=False)
local_contact = relationship("LocalContact")
dewars = relationship("Dewar", back_populates="beamtime")
dewars = relationship(
"Dewar", secondary=dewar_beamtime_association, back_populates="beamtimes"
)
pucks = relationship(
"Puck", secondary=puck_beamtime_association, back_populates="beamtimes"
)
samples = relationship(
"Sample", secondary=sample_beamtime_association, back_populates="beamtimes"
)
class Image(Base):
@@ -270,7 +314,9 @@ class ExperimentParameters(Base):
__tablename__ = "experiment_parameters"
id = Column(Integer, primary_key=True, index=True, autoincrement=True)
run_number = Column(Integer, nullable=False)
type = Column(String(255), nullable=False)
beamline_parameters = Column(JSON, nullable=True)
dataset = Column(JSON, nullable=True)
sample_id = Column(Integer, ForeignKey("samples.id"), nullable=False)
@@ -278,10 +324,15 @@ class Results(Base):
__tablename__ = "results"
id = Column(Integer, primary_key=True, index=True, autoincrement=True)
# pgroup = Column(String(255), nullable=False)
result = Column(JSON, nullable=True)
result_id = Column(Integer, ForeignKey("experiment_parameters.id"), nullable=False)
status = Column(String(255), nullable=False)
result = Column(JSON, nullable=False) # store the full result object as JSON
sample_id = Column(Integer, ForeignKey("samples.id"), nullable=False)
run_id = Column(Integer, ForeignKey("experiment_parameters.id"), nullable=False)
# optional relationships if you wish to query easily
# sample = relationship("SampleModel", backref="results")
# experiment_parameters = relationship("ExperimentParametersModel",
# backref="results")
# method = Column(String(255), nullable=False)
@@ -299,3 +350,25 @@
# total_refl: int
# unique_refl: int
# #comments: Optional[constr(max_length=200)] = None
class JobStatus(str, enum.Enum):
TO_DO = "to_do"
SUBMITTED = "submitted"
DONE = "done"
TO_CANCEL = "to_cancel"
CANCELLED = "cancelled"
FAILED = "failed"
class Jobs(Base):
__tablename__ = "jobs"
id = Column(Integer, primary_key=True, index=True)
sample_id = Column(Integer, ForeignKey("samples.id"), nullable=False)
run_id = Column(Integer, ForeignKey("experiment_parameters.id"), nullable=False)
status = Column(String, nullable=False)
experiment_parameters = relationship(ExperimentParameters)
created_at = Column(DateTime, server_default=func.now())
updated_at = Column(DateTime, onupdate=func.now())
slurm_id = Column(Integer, nullable=True)

View File

@@ -5,6 +5,7 @@ from .proposal import router as proposal_router
from .dewar import dewar_router
from .shipment import shipment_router
from .auth import router as auth_router
from .processing import router as processing_router
from .protected_router import protected_router as protected_router
__all__ = [
@@ -15,5 +16,6 @@ __all__ = [
"dewar_router",
"shipment_router",
"auth_router",
"processing_router",
"protected_router",
]

View File

@@ -0,0 +1,82 @@
from fastapi import APIRouter, HTTPException, status, Depends
from sqlalchemy.orm import Session, joinedload
from sqlalchemy import or_
from app.models import Beamtime as BeamtimeModel
from app.schemas import (
Beamtime as BeamtimeSchema,
BeamtimeCreate,
loginData,
BeamtimeResponse,
)
from app.dependencies import get_db
from app.routers.auth import get_current_user
beamtime_router = APIRouter()
@beamtime_router.post("/", response_model=BeamtimeSchema)
async def create_beamtime(
beamtime: BeamtimeCreate,
db: Session = Depends(get_db),
current_user: loginData = Depends(get_current_user),
):
# Validate the pgroup belongs to the current user
if beamtime.pgroups not in current_user.pgroups:
raise HTTPException(
status_code=status.HTTP_400_BAD_REQUEST,
detail="You do not have permission to create a beamtime for this pgroup.",
)
# Check for existing beamtime for this pgroup, date, and shift
existing = (
db.query(BeamtimeModel)
.filter(
BeamtimeModel.pgroups == beamtime.pgroups,
BeamtimeModel.start_date == beamtime.start_date,
BeamtimeModel.shift == beamtime.shift,
)
.first()
)
if existing:
raise HTTPException(
status_code=status.HTTP_400_BAD_REQUEST,
detail="A beamtime for this pgroup/shift/date already exists.",
)
db_beamtime = BeamtimeModel(
pgroups=beamtime.pgroups,
shift=beamtime.shift,
beamtime_name=beamtime.beamtime_name,
beamline=beamtime.beamline,
start_date=beamtime.start_date,
end_date=beamtime.end_date,
status=beamtime.status,
comments=beamtime.comments,
proposal_id=beamtime.proposal_id,
local_contact_id=beamtime.local_contact_id,
)
db.add(db_beamtime)
db.commit()
db.refresh(db_beamtime)
return db_beamtime
@beamtime_router.get(
"/my-beamtimes",
response_model=list[BeamtimeResponse],
)
async def get_my_beamtimes(
db: Session = Depends(get_db),
current_user: loginData = Depends(get_current_user),
):
user_pgroups = current_user.pgroups
filters = [BeamtimeModel.pgroups.like(f"%{pgroup}%") for pgroup in user_pgroups]
beamtimes = (
db.query(BeamtimeModel)
.options(joinedload(BeamtimeModel.local_contact))
.filter(or_(*filters))
.all()
)
return beamtimes

View File

@@ -1,6 +1,7 @@
import os
import tempfile
import time
from datetime import datetime, timedelta
import random
import hashlib
from fastapi import APIRouter, HTTPException, status, Depends, Response
@@ -21,7 +22,10 @@ from app.schemas import (
Sample,
Puck,
SampleEventResponse,
DewarSchema, # Clearer name for schema
DewarSchema,
loginData,
DewarWithPucksResponse,
PuckResponse,
)
from app.models import (
Dewar as DewarModel,
@@ -32,6 +36,7 @@ from app.models import (
LogisticsEvent,
PuckEvent,
SampleEvent,
Beamtime as BeamtimeModel,
)
from app.dependencies import get_db
import qrcode
@@ -44,7 +49,10 @@ from reportlab.pdfgen import canvas
from app.crud import (
get_shipment_by_id,
)
from app.routers.auth import get_current_user
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)
dewar_router = APIRouter()
@@ -543,6 +551,99 @@ def get_all_serial_numbers(db: Session = Depends(get_db)):
raise HTTPException(status_code=500, detail="Internal server error")
@dewar_router.get(
"/recent-dewars-with-pucks",
response_model=List[DewarWithPucksResponse],
operation_id="getRecentDewarsWithPucks",
)
async def get_recent_dewars_with_pucks(
db: Session = Depends(get_db), current_user: loginData = Depends(get_current_user)
):
# Get the timestamp for two months ago
two_months_ago = datetime.now() - timedelta(days=60)
# Query dewars for this user created in the last 2 months
dewars = (
db.query(DewarModel)
.options(joinedload(DewarModel.pucks)) # Eager load pucks
.filter(
DewarModel.pgroups.in_(current_user.pgroups),
DewarModel.created_at >= two_months_ago,
)
.all()
)
result = []
for dewar in dewars:
pucks = db.query(PuckModel).filter(PuckModel.dewar_id == dewar.id).all()
result.append(
DewarWithPucksResponse(
id=dewar.id,
dewar_name=dewar.dewar_name,
created_at=dewar.created_at,
pucks=[
PuckResponse(id=puck.id, puck_name=puck.puck_name) for puck in pucks
],
)
)
return result
@dewar_router.patch(
"/dewar/{dewar_id}/assign-beamtime", operation_id="assignDewarToBeamtime"
)
async def assign_beamtime_to_dewar(
dewar_id: int,
beamtime_id: int, # Use Query if you want this from ?beamtime_id=...
db: Session = Depends(get_db),
):
dewar = db.query(DewarModel).filter(DewarModel.id == dewar_id).first()
if not dewar:
raise HTTPException(status_code=404, detail="Dewar not found")
# Check if any sample (in any puck on this dewar) has sample events
for puck in dewar.pucks:
for sample in puck.samples:
sample_event_exists = (
db.query(SampleEvent).filter(SampleEvent.sample_id == sample.id).first()
)
if sample_event_exists:
raise HTTPException(
status_code=400,
detail="Cannot change beamtime:"
"at least one sample has events recorded.",
)
# Find the Beamtime instance, if not unassigning
beamtime = (
db.query(BeamtimeModel).filter(BeamtimeModel.id == beamtime_id).first()
if beamtime_id
else None
)
if beamtime_id == 0:
dewar.beamtimes = []
else:
dewar.beamtimes = [beamtime]
db.commit()
db.refresh(dewar)
for puck in dewar.pucks:
if beamtime_id == 0:
puck.beamtimes = []
else:
puck.beamtimes = [beamtime]
for sample in puck.samples:
# Can assume all have no events because of previous check
if beamtime_id == 0:
sample.beamtimes = []
else:
sample.beamtimes = [beamtime]
db.commit()
return {"status": "success", "dewar_id": dewar.id, "beamtime_id": beamtime_id}
@dewar_router.get("/{dewar_id}", response_model=Dewar)
async def get_dewar(dewar_id: int, db: Session = Depends(get_db)):
dewar = (
@@ -646,3 +747,25 @@ async def get_single_shipment(id: int, db: Session = Depends(get_db)):
except SQLAlchemyError as e:
logging.error(f"Database error occurred: {e}")
raise HTTPException(status_code=500, detail="Internal server error")
@dewar_router.get(
"/by-beamtime/{beamtime_id}",
response_model=List[DewarSchema],
operation_id="get_dewars_by_beamtime",
)
async def get_dewars_by_beamtime(beamtime_id: int, db: Session = Depends(get_db)):
logger.info(f"get_dewars_by_beamtime called with beamtime_id={beamtime_id}")
beamtime = (
db.query(BeamtimeModel)
.options(joinedload(BeamtimeModel.dewars))
.filter(BeamtimeModel.id == beamtime_id)
.first()
)
if not beamtime:
logger.warning(f"Beamtime {beamtime_id} not found")
raise HTTPException(status_code=404, detail="Beamtime not found")
logger.info(
f"Returning {len(beamtime.dewars)} dewars: {[d.id for d in beamtime.dewars]}"
)
return beamtime.dewars

View File

@@ -0,0 +1,123 @@
import json
import asyncio
from fastapi import APIRouter, Depends
from fastapi.encoders import jsonable_encoder
from sqlalchemy.orm import Session
from starlette.responses import StreamingResponse
from app.models import (
Jobs as JobModel,
ExperimentParameters as ExperimentParametersModel,
Sample as SampleModel,
)
from app.schemas import JobsResponse, JobsUpdate
from app.dependencies import get_db
router = APIRouter()
async def job_event_generator(get_db):
while True:
# Open a new session for this iteration and close it at the end
with next(get_db()) as db:
jobs = db.query(JobModel).all()
job_items = []
for job in jobs:
sample = db.query(SampleModel).filter_by(id=job.sample_id).first()
experiment = (
db.query(ExperimentParametersModel)
.filter(
ExperimentParametersModel.sample_id == sample.id,
ExperimentParametersModel.id == job.run_id,
)
.first()
)
job_item = JobsResponse(
job_id=job.id,
sample_id=sample.id,
run_id=job.run_id,
sample_name=sample.sample_name,
status=job.status,
type=experiment.type if experiment else None,
created_at=job.created_at,
updated_at=job.updated_at,
data_collection_parameters=sample.data_collection_parameters,
experiment_parameters=experiment.beamline_parameters
if experiment
else None,
filepath=experiment.dataset.get("filepath")
if experiment and experiment.dataset
else None,
slurm_id=job.slurm_id,
)
job_items.append(job_item)
if job_items:
serialized = jsonable_encoder(job_items)
yield f"data: {json.dumps(serialized)}\n\n"
await asyncio.sleep(5)
# A reasonable heartbeat/refresh
@router.get("/jobs/stream")
async def stream_jobs():
# Pass the dependency itself, not an active session
from app.dependencies import get_db
return StreamingResponse(
job_event_generator(get_db), media_type="text/event-stream"
)
@router.post(
"/jobs/update_status", response_model=JobsUpdate, operation_id="update_status"
)
def update_jobs_status(payload: JobsUpdate, db: Session = Depends(get_db)):
# Fetch the job by job_id
job = db.query(JobModel).filter(JobModel.id == payload.job_id).first()
if not job:
# Optionally, use HTTPException for proper status code
from fastapi import HTTPException
raise HTTPException(status_code=404, detail="Job not found")
# Update the status and slurm_id
job.status = payload.status
job.slurm_id = payload.slurm_id
# If the job is being cancelled, clear the slurm_id last so the
# assignment above does not overwrite the nullification
if payload.status == "cancelled":
job.slurm_id = None
# Optionally update 'updated_at'
from datetime import datetime
job.updated_at = datetime.now()
db.commit()
db.refresh(job)
# Return the updated job's info as response
return JobsUpdate(job_id=job.id, status=job.status, slurm_id=job.slurm_id)
def cleanup_cancelled_jobs(db: Session):
from datetime import datetime
from datetime import timedelta
"""Delete jobs in 'cancelled' state for more than 2 hours."""
cutoff = datetime.now() - timedelta(hours=2)
print(
f"Cleaning up cancelled jobs older than {cutoff} "
f"(current time: {datetime.now()})"
)
old_jobs = (
db.query(JobModel)
.filter(JobModel.status == "cancelled", JobModel.updated_at < cutoff)
.all()
)
for job in old_jobs:
db.delete(job)
db.commit()
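A small client-side sketch for consuming the stream defined above (the URL prefix is a placeholder; requests yields the response line by line when stream=True):

import json

import requests

# Placeholder URL; adjust host and route prefix to the deployed backend.
with requests.get("http://localhost:8000/processing/jobs/stream", stream=True) as resp:
    for line in resp.iter_lines(decode_unicode=True):
        # SSE frames arrive as "data: <payload>" lines separated by blank lines.
        if line and line.startswith("data: "):
            for job in json.loads(line[len("data: "):]):
                print(job["job_id"], job["status"])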

View File

@@ -2,6 +2,7 @@ from fastapi import APIRouter, Depends
from app.routers.auth import get_current_user
from app.routers.address import address_router
from app.routers.beamtime import beamtime_router
from app.routers.contact import contact_router
from app.routers.shipment import shipment_router
from app.routers.dewar import dewar_router
@@ -20,3 +21,6 @@ protected_router.include_router(
shipment_router, prefix="/shipments", tags=["shipments"]
)
protected_router.include_router(dewar_router, prefix="/dewars", tags=["dewars"])
protected_router.include_router(
beamtime_router, prefix="/beamtimes", tags=["beamtimes"]
)

View File

@@ -1,6 +1,6 @@
from datetime import datetime
from fastapi import APIRouter, HTTPException, status, Depends
from sqlalchemy.orm import Session
from sqlalchemy.orm import Session, joinedload
from sqlalchemy.sql import func
from typing import List
import uuid
@@ -20,6 +22,8 @@ from app.models import (
Sample as SampleModel,
LogisticsEvent as LogisticsEventModel,
Dewar as DewarModel,
SampleEvent,
Beamtime as BeamtimeModel,
)
from app.dependencies import get_db
import logging
@@ -658,3 +660,69 @@ async def get_pucks_by_slot(slot_identifier: str, db: Session = Depends(get_db))
)
return pucks
@router.patch("/puck/{puck_id}/assign-beamtime", operation_id="assignPuckToBeamtime")
async def assign_beamtime_to_puck(
puck_id: int,
beamtime_id: int,
db: Session = Depends(get_db),
):
puck = db.query(PuckModel).filter(PuckModel.id == puck_id).first()
if not puck:
raise HTTPException(status_code=404, detail="Puck not found")
# Check if any sample in this puck has sample events
for sample in puck.samples:
sample_event_exists = (
db.query(SampleEvent).filter(SampleEvent.sample_id == sample.id).first()
)
if sample_event_exists:
raise HTTPException(
status_code=400,
detail="Cannot change beamtime:"
"at least one sample has events recorded.",
)
beamtime = (
db.query(BeamtimeModel).filter(BeamtimeModel.id == beamtime_id).first()
if beamtime_id
else None
)
if beamtime_id == 0:
puck.beamtimes = []
else:
puck.beamtimes = [beamtime]
db.commit()
db.refresh(puck)
for sample in puck.samples:
if beamtime_id == 0:
sample.beamtimes = []
else:
sample.beamtimes = [beamtime]
db.commit()
return {"status": "success", "puck_id": puck.id, "beamtime_id": beamtime_id}
@router.get(
"/by-beamtime/{beamtime_id}",
response_model=List[PuckSchema],
operation_id="get_pucks_by_beamtime",
)
async def get_pucks_by_beamtime(beamtime_id: int, db: Session = Depends(get_db)):
logger.info(f"get_pucks_by_beamtime called with beamtime_id={beamtime_id}")
beamtime = (
db.query(BeamtimeModel)
.options(joinedload(BeamtimeModel.pucks)) # eager load pucks
.filter(BeamtimeModel.id == beamtime_id)
.first()
)
if not beamtime:
logger.warning(f"Beamtime {beamtime_id} not found")
raise HTTPException(status_code=404, detail="Beamtime not found")
logger.info(
f"Returning {len(beamtime.pucks)} pucks: {[p.id for p in beamtime.pucks]}"
)
return beamtime.pucks

View File

@@ -1,4 +1,5 @@
from fastapi import APIRouter, HTTPException, Depends, UploadFile, File, Form
from fastapi.encoders import jsonable_encoder
from sqlalchemy.orm import Session
from pathlib import Path
from typing import List
@@ -14,8 +15,11 @@ from app.schemas import (
SampleResult,
ExperimentParametersCreate,
ExperimentParametersRead,
# ResultResponse,
# ResultCreate,
ImageInfo,
ResultResponse,
ResultCreate,
Results as ProcessingResults,
Datasets,
)
from app.models import (
Puck as PuckModel,
@@ -25,7 +29,10 @@ from app.models import (
Dewar as DewarModel,
ExperimentParameters as ExperimentParametersModel,
# ExperimentParameters,
# Results,
Results as ResultsModel,
Jobs as JobModel,
JobStatus,
Beamtime as BeamtimeModel,
)
from app.dependencies import get_db
import logging
@@ -122,7 +129,11 @@ async def create_sample_event(
return sample # Return the sample, now including `mount_count`
@router.post("/{sample_id}/upload-images", response_model=Image)
@router.post(
"/{sample_id}/upload-images",
response_model=Image,
operation_id="upload_sample_image",
)
async def upload_sample_image(
sample_id: int,
uploaded_file: UploadFile = File(...),
@@ -229,7 +240,9 @@ async def upload_sample_image(
return new_image
@router.get("/results", response_model=List[SampleResult])
@router.get(
"/results", response_model=List[SampleResult], operation_id="get_sample_results"
)
async def get_sample_results(active_pgroup: str, db: Session = Depends(get_db)):
# Query samples for the active pgroup using joins.
samples = (
@ -246,8 +259,13 @@ async def get_sample_results(active_pgroup: str, db: Session = Depends(get_db)):
results = []
for sample in samples:
# Query images associated with the sample.
images = db.query(ImageModel).filter(ImageModel.sample_id == sample.id).all()
# Query images associated with the sample, including the related event_type
images = (
db.query(ImageModel)
.options(joinedload(ImageModel.sample_event))
.filter(ImageModel.sample_id == sample.id)
.all()
)
# Query experiment parameters (which include beamline parameters) for the
# sample.
@ -259,27 +277,35 @@ async def get_sample_results(active_pgroup: str, db: Session = Depends(get_db)):
print("Experiment Parameters for sample", sample.id, experiment_parameters)
results.append(
{
"sample_id": sample.id,
"sample_name": sample.sample_name,
"puck_name": sample.puck.puck_name if sample.puck else None,
"dewar_name": sample.puck.dewar.dewar_name
SampleResult(
sample_id=sample.id,
sample_name=sample.sample_name,
puck_name=sample.puck.puck_name if sample.puck else None,
dewar_name=sample.puck.dewar.dewar_name
if (sample.puck and sample.puck.dewar)
else None,
"images": [
{"id": img.id, "filepath": img.filepath, "comment": img.comment}
images=[
ImageInfo(
id=img.id,
filepath=img.filepath,
event_type=img.sample_event.event_type
if img.sample_event
else "Unknown",
comment=img.comment,
)
for img in images
],
"experiment_runs": [
{
"id": ex.id,
"run_number": ex.run_number,
"beamline_parameters": ex.beamline_parameters,
"sample_id": ex.sample_id,
}
experiment_runs=[
ExperimentParametersRead(
id=ex.id,
type=ex.type,
run_number=ex.run_number,
beamline_parameters=ex.beamline_parameters,
sample_id=ex.sample_id,
)
for ex in experiment_parameters
],
}
)
)
return results
@ -288,6 +314,7 @@ async def get_sample_results(active_pgroup: str, db: Session = Depends(get_db)):
@router.post(
"/samples/{sample_id}/experiment_parameters",
response_model=ExperimentParametersRead,
operation_id="create_experiment_parameters_for_sample",
)
def create_experiment_parameters_for_sample(
sample_id: int,
@ -309,6 +336,7 @@ def create_experiment_parameters_for_sample(
# stored as JSON.
new_exp = ExperimentParametersModel(
run_number=new_run_number,
type=exp_params.type,
beamline_parameters=exp_params.beamline_parameters.dict()
if exp_params.beamline_parameters
else None,
@ -318,43 +346,147 @@ def create_experiment_parameters_for_sample(
db.commit()
db.refresh(new_exp)
# Create a "Collecting" sample event associated with the new experiment parameters
new_event = SampleEventModel(
sample_id=sample_id,
event_type="Collecting", # The event type
timestamp=datetime.now(), # Use current timestamp
)
db.add(new_event)
db.commit()
return new_exp
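A hypothetical request body for the endpoint above, assuming ExperimentParametersCreate mirrors the ExperimentParametersBase schema shown further down; all values are invented. The server ignores any client-supplied run number, assigns max(run_number) + 1 itself, and then records a "Collecting" sample event. Note that the /samples router prefix combined with this route path yields /samples/samples/{sample_id}/experiment_parameters.

payload = {
    "run_number": 0,              # ignored; the server computes the next run number
    "type": "rotation",           # invented example value
    "beamline_parameters": None,  # or a full BeamlineParameters object
    "dataset": None,
    "sample_id": 204,             # hypothetical sample id
}
# requests.post(f"{BASE}/samples/samples/204/experiment_parameters",
#               json=payload, verify=False)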
# @router.post("/results", response_model=ResultResponse)
# def create_result(result: ResultCreate, db: Session = Depends(get_db)):
# # Validate sample_id and result_id (optional but recommended)
# sample = db.query(SampleModel).filter_by(id=result.sample_id).first()
# if not sample:
# raise HTTPException(status_code=404, detail="Sample not found")
#
# experiment = db.query(ExperimentParameters).filter_by(id=result.result_id).first()
# if not experiment:
# raise HTTPException(status_code=404, detail="Experiment parameters not found")
#
# # Create a new Results entry
# result_obj = Results(
# sample_id=result.sample_id,
# result_id=result.result_id,
# result=result.result
# )
# db.add(result_obj)
# db.commit()
# db.refresh(result_obj)
#
# return result_obj
#
# @router.get("/results", response_model=list[ResultResponse])
# def get_results(sample_id: int, result_id: int, db: Session = Depends(get_db)):
# query = db.query(Results)
#
# if sample_id:
# query = query.filter(Results.sample_id == sample_id)
# if result_id:
# query = query.filter(Results.result_id == result_id)
#
# results = query.all()
# if not results:
# raise HTTPException(status_code=404, detail="No results found")
#
# return results
@router.patch(
"/update-dataset/{sample_id}/{run_id}",
response_model=ExperimentParametersRead,
operation_id="update_dataset_for_experiment_run",
)
def update_experiment_run_dataset(
sample_id: int,
run_id: int,
dataset: Datasets,
db: Session = Depends(get_db),
):
# Find the run for this sample and run_id
exp = (
db.query(ExperimentParametersModel)
.filter(
ExperimentParametersModel.sample_id == sample_id,
ExperimentParametersModel.id == run_id,
)
.first()
)
if not exp:
raise HTTPException(
status_code=404,
detail="ExperimentParameters (run) not found for this sample",
)
exp.dataset = jsonable_encoder(dataset)
db.commit()
db.refresh(exp)
# Only create a job if status is "written" and job does not exist yet
if dataset.status == "written":
job_exists = (
db.query(JobModel)
.filter(JobModel.sample_id == sample_id, JobModel.run_id == run_id)
.first()
)
if not job_exists:
new_job = JobModel(
sample_id=sample_id,
run_id=run_id,
experiment_parameters=exp, # adjust this line as appropriate
status=JobStatus.TO_DO,
)
db.add(new_job)
db.commit()
db.refresh(new_job)
return exp
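A sketch (assumed host and ids) of the dataset handshake this endpoint implements: once a client reports a dataset as "written", the backend creates a processing job with status TO_DO for that sample/run, unless one already exists.

import requests
from datetime import datetime

BASE = "https://127.0.0.1:8000"  # assumed dev server
dataset = {
    "filepath": "/das/work/...",           # placeholder path
    "status": "written",                   # triggers server-side job creation
    "written_at": datetime.now().isoformat(),
}
r = requests.patch(f"{BASE}/samples/update-dataset/204/1",
                   json=dataset, verify=False)
r.raise_for_status()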
@router.post(
"/processing-results", response_model=ResultResponse, operation_id="create_result"
)
def create_result(payload: ResultCreate, db: Session = Depends(get_db)):
# Check experiment existence
experiment = (
db.query(ExperimentParametersModel)
.filter(ExperimentParametersModel.id == payload.run_id)
.first()
)
if not experiment:
raise HTTPException(
status_code=404, detail="Experiment parameters (run) not found"
)
result_entry = ResultsModel(
sample_id=payload.sample_id,
status=payload.status,
run_id=payload.run_id,
result=payload.result.model_dump(), # Serialize entire result to JSON
)
db.add(result_entry)
db.commit()
db.refresh(result_entry)
return ResultResponse(
id=result_entry.id,
status=result_entry.status,
sample_id=result_entry.sample_id,
run_id=result_entry.run_id,
result=payload.result, # return original payload directly
)
@router.get(
"/processing-results/{sample_id}/{run_id}",
response_model=List[ResultResponse],
operation_id="get_results_for_run_and_sample",
)
async def get_results_for_run_and_sample(
sample_id: int, run_id: int, db: Session = Depends(get_db)
):
results = (
db.query(ResultsModel)
.filter(ResultsModel.sample_id == sample_id, ResultsModel.run_id == run_id)
.all()
)
if not results:
raise HTTPException(status_code=404, detail="Results not found.")
formatted_results = [
ResultResponse(
id=result.id,
status=result.status,
sample_id=result.sample_id,
run_id=result.run_id,
result=ProcessingResults(**result.result),
)
for result in results
]
return formatted_results
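A sketch of storing and reading back a processing result (host and all values invented). The "result" body must satisfy the reworked Results schema shown later in this changeset, where each statistic is a list of per-resolution-shell CurvePoint entries rather than a single float.

import requests

BASE = "https://127.0.0.1:8000"  # assumed dev server
result = {
    "id": 0, "pgroup": "p20001", "sample_id": 204,
    "method": "native", "pipeline": "autoproc",   # invented values
    "resolution": 2.0, "unit_cell": "78 78 37 90 90 90", "spacegroup": "P43212",
    "rmerge": [{"resolution": 2.0, "value": 0.08}],  # per-shell curve
    "rmeas": [], "isig": [], "cc": [], "cchalf": [],
    "completeness": [], "multiplicity": [],
    "nobs": 100000, "total_refl": 100000, "unique_refl": 20000,
    "comments": None,
}
requests.post(
    f"{BASE}/samples/processing-results",
    json={"sample_id": 204, "status": "done", "run_id": 1, "result": result},
    verify=False,
)
rows = requests.get(f"{BASE}/samples/processing-results/204/1", verify=False).json()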
@router.get(
"/by-beamtime/{beamtime_id}",
response_model=List[SampleSchema],
operation_id="get_samples_by_beamtime",
)
async def get_samples_by_beamtime(beamtime_id: int, db: Session = Depends(get_db)):
beamtime = (
db.query(BeamtimeModel)
.options(joinedload(BeamtimeModel.samples))
.filter(BeamtimeModel.id == beamtime_id)
.first()
)
if not beamtime:
raise HTTPException(status_code=404, detail="Beamtime not found")
return beamtime.samples

View File

@ -352,16 +352,6 @@ class SampleEventCreate(BaseModel):
event_type: Literal[
"Mounting", "Centering", "Failed", "Lost", "Collecting", "Unmounting"
]
# event_type: str
# Validate event_type against accepted event types
# @field_validator("event_type", mode="before")
# def validate_event_type(cls, value):
# allowed = {"Mounting", "Centering", "Failed",
# "Lost", "Collecting", "Unmounting"}
# if value not in allowed:
# raise ValueError(f"Invalid event_type: {value}.
# Accepted values are: {allowed}")
# return value
class SampleEventResponse(SampleEventCreate):
@ -373,30 +363,28 @@ class SampleEventResponse(SampleEventCreate):
from_attributes = True
class CurvePoint(BaseModel):
resolution: float
value: float
class Results(BaseModel):
id: int
pgroup: str
sample_id: int
method: str
pipeline: str
resolution: float
unit_cell: str
spacegroup: str
rmerge: float
rmeas: float
isig: float
cc: float
cchalf: float
completeness: float
multiplicity: float
rmerge: List[CurvePoint]
rmeas: List[CurvePoint]
isig: List[CurvePoint]
cc: List[CurvePoint]
cchalf: List[CurvePoint]
completeness: List[CurvePoint]
multiplicity: List[CurvePoint]
nobs: int
total_refl: int
unique_refl: int
comments: Optional[constr(max_length=200)] = None
# Define attributes for Results here
class Config:
from_attributes = True
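The statistics that used to be single overall floats are now curves over resolution shells. A minimal sketch with invented numbers, assuming the schemas are importable from app.schemas:

from app.schemas import CurvePoint  # assumed import path

shells = [3.0, 2.5, 2.0]
rmerge_curve = [
    CurvePoint(resolution=res, value=v)
    for res, v in zip(shells, [0.05, 0.08, 0.21])
]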
class ContactCreate(BaseModel):
pgroups: str
@ -490,6 +478,55 @@ class AddressMinimal(BaseModel):
id: int
class Beamtime(BaseModel):
id: int
pgroups: str
shift: str
beamtime_name: str
beamline: str
start_date: date
end_date: date
status: str
comments: Optional[constr(max_length=200)] = None
proposal_id: Optional[int]
local_contact_id: Optional[int]
local_contact: Optional[LocalContact]
class Config:
from_attributes = True
class BeamtimeCreate(BaseModel):
pgroups: str # this should be changed to pgroup
shift: str
beamtime_name: str
beamline: str
start_date: date
end_date: date
status: str
comments: Optional[constr(max_length=200)] = None
proposal_id: Optional[int]
local_contact_id: Optional[int]
class BeamtimeResponse(BaseModel):
id: int
pgroups: str
shift: str
beamtime_name: str
beamline: str
start_date: date
end_date: date
status: str
comments: Optional[str] = None
proposal_id: Optional[int]
local_contact_id: Optional[int]
local_contact: Optional[LocalContact]
class Config:
from_attributes = True
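A hypothetical payload matching BeamtimeCreate (all values invented; note the comment above that the plural pgroups field is slated to become pgroup):

beamtime = {
    "pgroups": "p20001",
    "shift": "morning",
    "beamtime_name": "p20001-lyso",
    "beamline": "PXIII",
    "start_date": "2025-05-12",
    "end_date": "2025-05-12",
    "status": "confirmed",
    "comments": None,
    "proposal_id": None,
    "local_contact_id": None,
}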
class Sample(BaseModel):
id: int
sample_name: str
@ -505,6 +542,7 @@ class Sample(BaseModel):
mount_count: Optional[int] = None
unmount_count: Optional[int] = None
# results: Optional[Results] = None
beamtimes: List[Beamtime] = []
class Config:
from_attributes = True
@ -519,6 +557,7 @@ class SampleCreate(BaseModel):
comments: Optional[str] = None
results: Optional[Results] = None
events: Optional[List[str]] = None
beamtime_ids: List[int] = []
class Config:
populate_by_name = True
@ -546,6 +585,7 @@ class PuckCreate(BaseModel):
puck_type: str
puck_location_in_dewar: int
samples: List[SampleCreate] = []
beamtime_ids: List[int] = []
class PuckUpdate(BaseModel):
@ -553,6 +593,7 @@ class PuckUpdate(BaseModel):
puck_type: Optional[str] = None
puck_location_in_dewar: Optional[int] = None
dewar_id: Optional[int] = None
beamtime_ids: List[int] = []
class Puck(BaseModel):
@ -563,6 +604,7 @@ class Puck(BaseModel):
dewar_id: int
events: List[PuckEvent] = []
samples: List[Sample] = []
beamtimes: List[Beamtime] = []
class Config:
from_attributes = True
@ -577,10 +619,12 @@ class DewarBase(BaseModel):
tracking_number: str
number_of_pucks: Optional[int] = None
number_of_samples: Optional[int] = None
created_at: Optional[datetime] = None
status: str
contact_id: Optional[int]
return_address_id: Optional[int]
pucks: List[PuckCreate] = []
beamtimes: List[Beamtime] = []
class Config:
from_attributes = True
@ -593,6 +637,7 @@ class DewarCreate(DewarBase):
class Dewar(DewarBase):
id: int
pgroups: str
created_at: Optional[datetime] = None
shipment_id: Optional[int]
contact: Optional[Contact]
return_address: Optional[Address]
@ -612,6 +657,7 @@ class DewarUpdate(BaseModel):
status: Optional[str] = None
contact_id: Optional[int] = None
address_id: Optional[int] = None
beamtime_ids: List[int] = []
class DewarSchema(BaseModel):
@ -781,22 +827,16 @@ class PuckWithTellPosition(BaseModel):
from_attributes = True
class Beamtime(BaseModel):
class PuckResponse(BaseModel):
id: int
pgroups: str
beamtime_name: str
beamline: str
start_date: date
end_date: date
status: str
comments: Optional[constr(max_length=200)] = None
proposal_id: Optional[int]
proposal: Optional[Proposal]
local_contact_id: Optional[int]
local_contact: Optional[LocalContact]
puck_name: str
class Config:
from_attributes = True
class DewarWithPucksResponse(BaseModel):
id: int
dewar_name: str
created_at: datetime
pucks: List[PuckResponse]
class ImageCreate(BaseModel):
@ -822,6 +862,21 @@ class ImageInfo(BaseModel):
id: int
filepath: str
comment: Optional[str] = None
event_type: str
# run_number: Optional[int]
class Config:
from_attributes = True
class characterizationParameters(BaseModel):
omegaStart_deg: float
oscillation_deg: float
omegaStep: float
chi: float
phi: float
numberOfImages: int
exposureTime_s: float
class RotationParameters(BaseModel):
@ -882,6 +937,7 @@ class BeamlineParameters(BaseModel):
beamSizeWidth: Optional[float] = None
beamSizeHeight: Optional[float] = None
# dose_MGy: float
characterization: Optional[characterizationParameters] = None
rotation: Optional[RotationParameters] = None
gridScan: Optional[gridScanParamers] = None
jet: Optional[jetParameters] = None
@ -894,9 +950,17 @@ class BeamlineParameters(BaseModel):
# beamstopDiameter_mm: Optional[float] = None
class Datasets(BaseModel):
filepath: str
status: str
written_at: datetime
class ExperimentParametersBase(BaseModel):
run_number: int
type: str
beamline_parameters: Optional[BeamlineParameters] = None
dataset: Optional[Datasets] = None
sample_id: int
@ -911,6 +975,12 @@ class ExperimentParametersRead(ExperimentParametersBase):
from_attributes = True
class ExperimentParametersUpdate(BaseModel):
run_number: int
dataset: Optional[Datasets]
sample_id: int
class SampleResult(BaseModel):
sample_id: int
sample_name: str
@ -922,15 +992,56 @@ class SampleResult(BaseModel):
class ResultCreate(BaseModel):
sample_id: int
result_id: int
result: Optional[dict]
status: str
run_id: int
result: Results
class Config:
from_attributes = True
class ResultResponse(BaseModel):
id: int
status: str
sample_id: int
result_id: int
result: Optional[dict]
run_id: int
result: Results
class Config:
from_attributes = True
class JobsCreate(BaseModel):
id: int
sample_id: int
run_id: int
sample_name: str
status: str
created_at: datetime
updated_at: datetime
experiment_parameters: BeamlineParameters
slurm_id: Optional[int] = None
class Config:
from_attributes = True
class JobsResponse(BaseModel):
job_id: int
sample_id: int
run_id: int
sample_name: str
status: str
type: str
created_at: datetime
updated_at: Optional[datetime]
data_collection_parameters: Optional[DataCollectionParameters] = None
experiment_parameters: BeamlineParameters
filepath: Optional[str] = None
slurm_id: Optional[int] = None
class JobsUpdate(BaseModel):
job_id: int
status: str
slurm_id: int

10
backend/config_dev.json Executable file
View File

@ -0,0 +1,10 @@
{
"ssl_cert_path": "/app/backend/ssl/cert.pem",
"ssl_key_path": "/app/backend/ssl/key.pem",
"OPENAPI_URL": "https://backend:8000/openapi.json",
"SCHEMA_PATH": "/app/src/openapi.json",
"OUTPUT_DIRECTORY": "/app/openapi",
"PORT": 8000,
"SSL_KEY_PATH": "/app/backend/ssl/key.pem",
"SSL_CERT_PATH": "/app/backend/ssl/cert.pem"
}

10
backend/config_prod.json Normal file
View File

@ -0,0 +1,10 @@
{
"ssl_cert_path": "/app/backend/ssl/mx-aare-test.psi.ch.pem",
"ssl_key_path": "/app/backend/ssl/mx-aare-test.psi.ch.key",
"PORT": 1492,
"OPENAPI_URL": "https://backend:1492/openapi.json",
"SCHEMA_PATH": "/app/src/openapi.json",
"OUTPUT_DIRECTORY": "/app/openapi",
"SSL_KEY_PATH": "/app/backend/ssl/mx-aare-test.psi.ch.key",
"SSL_CERT_PATH": "/app/backend/ssl/mx-aare-test.psi.ch.pem"
}

10
backend/config_test.json Normal file
View File

@ -0,0 +1,10 @@
{
"ssl_cert_path": "ssl/cert.pem",
"ssl_key_path": "ssl/key.pem",
"OPENAPI_URL": "https://backend:8000/openapi.json",
"SCHEMA_PATH": "/app/src/openapi.json",
"OUTPUT_DIRECTORY": "/app/openapi",
"PORT": 8000,
"SSL_KEY_PATH": "ssl/key.pem",
"SSL_CERT_PATH": "ssl/cert.pem"
}

View File

@ -1,10 +1,12 @@
import os
import json
import tomllib
from contextlib import asynccontextmanager
from pathlib import Path
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware
from fastapi.staticfiles import StaticFiles
from app import ssl_heidi
from app.routers import (
proposal,
@ -13,10 +15,13 @@ from app.routers import (
logistics,
auth,
sample,
processing,
)
from app.database import Base, engine, SessionLocal
from app.routers.protected_router import protected_router
os.makedirs("images", exist_ok=True)
# Utility function to fetch metadata from pyproject.toml
def get_project_metadata():
@ -45,21 +50,26 @@ def get_project_metadata():
)
def run_server():
def run_server(config, cert_path, key_path):
import uvicorn
environment = os.getenv(
"ENVIRONMENT", "dev"
) # needs to be set explicitly here if not globally available
print(f"[INFO] Starting server in {environment} environment...")
print(f"[INFO] SSL Certificate Path: {cert_path}")
print(f"[INFO] SSL Key Path: {key_path}")
port = config.get("PORT", os.getenv("PORT"))
port = config.get("PORT")
if not port:
print("[ERROR] No port defined in config or environment variables. Aborting!")
sys.exit(1) # Exit if no port is defined
port = int(port)
print(f"[INFO] Running on port {port}")
uvicorn.run(
app,
host="127.0.0.1" if environment in ["dev", "test"] else "0.0.0.0",
host="0.0.0.0",
port=port,
log_level="debug",
ssl_keyfile=key_path,
@ -69,14 +79,6 @@ def run_server():
# Get project metadata from pyproject.toml
project_name, project_version = get_project_metadata()
app = FastAPI(
title=project_name,
description="Backend for next-gen sample management system",
version=project_version,
servers=[
{"url": "https://mx-aare-test.psi.ch:1492", "description": "Default server"}
],
)
# Determine environment and configuration file path
environment = os.getenv("ENVIRONMENT", "dev")
@ -93,10 +95,14 @@ with open(config_file) as f:
if environment in ["test", "dev"]:
cert_path = config.get("ssl_cert_path", "ssl/cert.pem")
key_path = config.get("ssl_key_path", "ssl/key.pem")
ssl_dir = Path(cert_path).parent
# Ensure the directory exists before file operations
ssl_dir.mkdir(parents=True, exist_ok=True)
elif environment == "prod":
cert_path = config.get("SSL_CERT_PATH")
key_path = config.get("SSL_KEY_PATH")
# Validate production SSL paths
if not cert_path or not key_path:
raise ValueError(
"SSL_CERT_PATH and SSL_KEY_PATH must be set in config_prod.json"
@ -110,28 +116,34 @@ elif environment == "prod":
else:
raise ValueError(f"Unknown environment: {environment}")
# Generate SSL Key and Certificate if not exist (only for development)
# Generate SSL Key and Certificate if they do not exist
if environment == "dev":
Path("ssl").mkdir(parents=True, exist_ok=True)
if not Path(cert_path).exists() or not Path(key_path).exists():
ssl_heidi.generate_self_signed_cert(cert_path, key_path)
# Apply CORS middleware
app.add_middleware(
CORSMiddleware,
allow_origins=["*"],
allow_credentials=True,
allow_methods=["*"],
allow_headers=["*"],
)
def cleanup_job_loop():
import time
from app.dependencies import get_db
from app.routers.processing import cleanup_cancelled_jobs
while True:
db = next(get_db())
try:
cleanup_cancelled_jobs(db)
finally:
db.close()
time.sleep(3600) # every hour
@app.on_event("startup")
def on_startup():
@asynccontextmanager
async def lifespan(app: FastAPI):
print("[INFO] Running application startup tasks...")
db = SessionLocal()
try:
if environment == "prod":
# Base.metadata.drop_all(bind=engine)
# Base.metadata.create_all(bind=engine)
from sqlalchemy.engine import reflection
inspector = reflection.Inspector.from_engine(engine)
@ -175,10 +187,36 @@ def on_startup():
from app.database import load_slots_data
load_slots_data(db)
from threading import Thread
# Start cleanup in background thread
thread = Thread(target=cleanup_job_loop, daemon=True)
thread.start()
yield
finally:
db.close()
app = FastAPI(
lifespan=lifespan,
title=project_name,
description="Backend for next-gen sample management system",
version=project_version,
servers=[
{"url": "https://mx-aare-test.psi.ch:8000", "description": "Default server"}
],
)
# Apply CORS middleware
app.add_middleware(
CORSMiddleware,
allow_origins=["*"],
allow_credentials=True,
allow_methods=["*"],
allow_headers=["*"],
)
# Include routers with correct configuration
app.include_router(protected_router, prefix="/protected")
app.include_router(auth.router, prefix="/auth", tags=["auth"])
@ -187,6 +225,7 @@ app.include_router(puck.router, prefix="/pucks", tags=["pucks"])
app.include_router(spreadsheet.router, tags=["spreadsheet"])
app.include_router(logistics.router, prefix="/logistics", tags=["logistics"])
app.include_router(sample.router, prefix="/samples", tags=["samples"])
app.include_router(processing.router, prefix="/processing", tags=["processing"])
app.mount("/images", StaticFiles(directory="images"), name="images")
@ -200,6 +239,9 @@ if __name__ == "__main__":
# Load environment variables from .env file
load_dotenv()
environment = os.getenv("ENVIRONMENT", "dev")
config_file = Path(__file__).resolve().parent / f"config_{environment}.json"
# Check if `generate-openapi` option is passed
if len(sys.argv) > 1 and sys.argv[1] == "generate-openapi":
from fastapi.openapi.utils import get_openapi
@ -216,28 +258,42 @@ if __name__ == "__main__":
print("openapi.json generated successfully.")
sys.exit(0) # Exit after generating the file
# Default behavior: Run the server based on the environment
environment = os.getenv("ENVIRONMENT", "dev")
port = int(os.getenv("PORT", 8000))
# Explicitly load the configuration file
with open(config_file, "r") as f:
config = json.load(f)
# Explicitly obtain SSL paths from config
if environment in ["test", "dev"]:
cert_path = config.get("ssl_cert_path", "ssl/cert.pem")
key_path = config.get("ssl_key_path", "ssl/key.pem")
elif environment == "prod":
cert_path = config.get("SSL_CERT_PATH")
key_path = config.get("SSL_KEY_PATH")
if not cert_path or not key_path:
raise ValueError(
"SSL_CERT_PATH and SSL_KEY_PATH must be explicitly"
"set in config_prod.json for production."
)
else:
raise ValueError(f"Unknown environment: {environment}")
is_ci = os.getenv("CI", "false").lower() == "true"
# Handle certificates for dev/test if not available
ssl_dir = Path(cert_path).parent
ssl_dir.mkdir(parents=True, exist_ok=True)
if environment in ["dev", "test"] and (
not Path(cert_path).exists() or not Path(key_path).exists()
):
print(f"[INFO] Generating SSL certificates at {ssl_dir}")
ssl_heidi.generate_self_signed_cert(cert_path, key_path)
if is_ci or environment == "test":
# Test or CI Mode: Run server process temporarily for test validation
ssl_dir = Path(cert_path).parent
ssl_dir.mkdir(parents=True, exist_ok=True)
# Generate self-signed certs if missing
if not Path(cert_path).exists() or not Path(key_path).exists():
print(f"[INFO] Generating self-signed SSL certificates at {ssl_dir}")
ssl_heidi.generate_self_signed_cert(cert_path, key_path)
# Start the server as a subprocess, wait, then terminate
server_process = Process(target=run_server)
server_process = Process(target=run_server, args=(config, cert_path, key_path))
server_process.start()
sleep(5) # Wait for 5 seconds to verify the server is running
server_process.terminate() # Terminate the server process (for CI)
server_process.join() # Ensure proper cleanup
print("CI: Server started and terminated successfully for test validation.")
sleep(5)
server_process.terminate()
server_process.join()
print("CI/Test environment: Server started and terminated successfully.")
else:
# Dev or Prod: Start the server as usual
run_server()
run_server(config, cert_path, key_path)

View File

@ -1,7 +1,7 @@
#!/bin/bash
# Extract values from pyproject.toml
PYPROJECT_FILE="$(dirname "$0")/backend/pyproject.toml"
PYPROJECT_FILE="/app/backend/pyproject.toml"
NAME=$(awk -F'= ' '/^name/ { print $2 }' "$PYPROJECT_FILE" | tr -d '"')
VERSION=$(awk -F'= ' '/^version/ { print $2 }' "$PYPROJECT_FILE" | tr -d '"')
@ -15,7 +15,7 @@ echo "Using project name: $NAME"
echo "Using version: $VERSION"
# Navigate to backend directory
cd "$(dirname "$0")/backend" || exit
cd "/app/backend" || exit
# Generate OpenAPI JSON file
echo "Generating OpenAPI JSON..."
@ -46,7 +46,7 @@ fi
# Build the package
cd python-client || exit
python3 -m venv .venv
source .venv/bin/activate
source /app/.venv/bin/activate
pip install -U pip build
python3 -m build

130
backend/propipe_sim.ipynb Normal file
View File

@ -0,0 +1,130 @@
{
"cells": [
{
"cell_type": "code",
"id": "initial_id",
"metadata": {
"collapsed": true,
"ExecuteTime": {
"end_time": "2025-04-30T09:22:17.261436Z",
"start_time": "2025-04-30T09:21:47.206494Z"
}
},
"source": [
"import requests\n",
"import sseclient\n",
"import json\n",
"\n",
"SSE_URL = \"https://127.0.0.1:8000/processing/jobs/stream\"\n",
"UPDATE_URL = \"https://127.0.0.1:8000/processing/jobs/update_status\"\n",
"\n",
"def submit_job_update(job_id, status, slurm_id):\n",
" payload = {\n",
" \"job_id\": job_id,\n",
" \"status\": status,\n",
" \"slurm_id\": slurm_id,\n",
" }\n",
" try:\n",
" response = requests.post(UPDATE_URL, json=payload, verify=False)\n",
" if response.status_code == 200:\n",
" print(f\"✅ Job {job_id} status updated to '{status}'. Response: {response.json()}\")\n",
" else:\n",
" print(f\"❌ Failed to update job {job_id}. Status: {response.status_code}. Response: {response.text}\")\n",
" except Exception as e:\n",
" print(f\"Failed to submit update for Job {job_id}: {e}\")\n",
"\n",
"def listen_and_update_jobs(url):\n",
" print(\"Starting job status updater...\")\n",
" with requests.get(url, stream=True, verify=False) as response:\n",
" if response.status_code != 200:\n",
" print(f\"Failed to connect with status code: {response.status_code}\")\n",
" return\n",
"\n",
" client = sseclient.SSEClient(response)\n",
"\n",
" for event in client.events():\n",
" try:\n",
" jobs = json.loads(event.data)\n",
" print(f\"Jobs received: {jobs}\")\n",
"\n",
" for job in jobs:\n",
" job_id = job.get(\"job_id\")\n",
" print(f\"Job ID: {job_id}, Current status: {job.get('status')}\")\n",
" # Immediately update status to \"submitted\"\n",
" submit_job_update(job_id, \"submitted\", 76545678)\n",
" except json.JSONDecodeError as e:\n",
" print(f\"Error decoding event data: {e}\")\n",
" except Exception as e:\n",
" print(f\"Unexpected error while processing event: {e}\")\n",
"\n",
"if __name__ == \"__main__\":\n",
" listen_and_update_jobs(SSE_URL)\n"
],
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Starting job status updater...\n",
"Jobs received: [{'job_id': 4, 'sample_id': 204, 'run_id': 1, 'sample_name': 'Sample204', 'status': 'todo', 'type': 'standard', 'created_at': '2025-04-30T09:05:14.901478', 'updated_at': None, 'data_collection_parameters': None, 'experiment_parameters': {'synchrotron': 'Swiss Light Source', 'beamline': 'PXIII', 'detector': {'manufacturer': 'DECTRIS', 'model': 'PILATUS4 2M', 'type': 'photon-counting', 'serialNumber': '16684dscsd668468', 'detectorDistance_mm': 95.0, 'beamCenterX_px': 512.0, 'beamCenterY_px': 512.0, 'pixelSizeX_um': 150.0, 'pixelSizeY_um': 150.0}, 'wavelength': 1.0, 'ringCurrent_A': 0.0, 'ringMode': 'Machine Down', 'undulator': None, 'undulatorgap_mm': None, 'monochromator': 'Si111', 'transmission': 1.0, 'focusingOptic': 'Kirkpatrick-Baez', 'beamlineFluxAtSample_ph_s': 0.0, 'beamSizeWidth': 30.0, 'beamSizeHeight': 30.0, 'characterization': None, 'rotation': {'omegaStart_deg': 0.0, 'omegaStep': 0.1, 'chi': 0.0, 'phi': 10.0, 'numberOfImages': 3600, 'exposureTime_s': 0.02}, 'gridScan': None, 'jet': None, 'cryojetTemperature_K': None, 'humidifierTemperature_K': None, 'humidifierHumidity': None}, 'filepath': '/das/work/p11/p11206/raw_data/vincent/20250415_6D_SLS2_1st_data/20250415_fullbeam_dtz220_Lyso102_again_360deg', 'slurm_id': None}]\n",
"Job ID: 4, Current status: todo\n",
"✅ Job 4 status updated to 'submitted'. Response: {'job_id': 4, 'status': 'submitted', 'slurm_id': 76545678}\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/urllib3/connectionpool.py:1103: InsecureRequestWarning: Unverified HTTPS request is being made to host '127.0.0.1'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/latest/advanced-usage.html#tls-warnings\n",
" warnings.warn(\n",
"/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/urllib3/connectionpool.py:1103: InsecureRequestWarning: Unverified HTTPS request is being made to host '127.0.0.1'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/latest/advanced-usage.html#tls-warnings\n",
" warnings.warn(\n"
]
},
{
"ename": "KeyboardInterrupt",
"evalue": "",
"output_type": "error",
"traceback": [
"\u001B[0;31m---------------------------------------------------------------------------\u001B[0m",
"\u001B[0;31mKeyboardInterrupt\u001B[0m Traceback (most recent call last)",
"Cell \u001B[0;32mIn[14], line 48\u001B[0m\n\u001B[1;32m 45\u001B[0m \u001B[38;5;28mprint\u001B[39m(\u001B[38;5;124mf\u001B[39m\u001B[38;5;124m\"\u001B[39m\u001B[38;5;124mUnexpected error while processing event: \u001B[39m\u001B[38;5;132;01m{\u001B[39;00me\u001B[38;5;132;01m}\u001B[39;00m\u001B[38;5;124m\"\u001B[39m)\n\u001B[1;32m 47\u001B[0m \u001B[38;5;28;01mif\u001B[39;00m \u001B[38;5;18m__name__\u001B[39m \u001B[38;5;241m==\u001B[39m \u001B[38;5;124m\"\u001B[39m\u001B[38;5;124m__main__\u001B[39m\u001B[38;5;124m\"\u001B[39m:\n\u001B[0;32m---> 48\u001B[0m \u001B[43mlisten_and_update_jobs\u001B[49m\u001B[43m(\u001B[49m\u001B[43mSSE_URL\u001B[49m\u001B[43m)\u001B[49m\n",
"Cell \u001B[0;32mIn[14], line 32\u001B[0m, in \u001B[0;36mlisten_and_update_jobs\u001B[0;34m(url)\u001B[0m\n\u001B[1;32m 28\u001B[0m \u001B[38;5;28;01mreturn\u001B[39;00m\n\u001B[1;32m 30\u001B[0m client \u001B[38;5;241m=\u001B[39m sseclient\u001B[38;5;241m.\u001B[39mSSEClient(response)\n\u001B[0;32m---> 32\u001B[0m \u001B[43m\u001B[49m\u001B[38;5;28;43;01mfor\u001B[39;49;00m\u001B[43m \u001B[49m\u001B[43mevent\u001B[49m\u001B[43m \u001B[49m\u001B[38;5;129;43;01min\u001B[39;49;00m\u001B[43m \u001B[49m\u001B[43mclient\u001B[49m\u001B[38;5;241;43m.\u001B[39;49m\u001B[43mevents\u001B[49m\u001B[43m(\u001B[49m\u001B[43m)\u001B[49m\u001B[43m:\u001B[49m\n\u001B[1;32m 33\u001B[0m \u001B[43m \u001B[49m\u001B[38;5;28;43;01mtry\u001B[39;49;00m\u001B[43m:\u001B[49m\n\u001B[1;32m 34\u001B[0m \u001B[43m \u001B[49m\u001B[43mjobs\u001B[49m\u001B[43m \u001B[49m\u001B[38;5;241;43m=\u001B[39;49m\u001B[43m \u001B[49m\u001B[43mjson\u001B[49m\u001B[38;5;241;43m.\u001B[39;49m\u001B[43mloads\u001B[49m\u001B[43m(\u001B[49m\u001B[43mevent\u001B[49m\u001B[38;5;241;43m.\u001B[39;49m\u001B[43mdata\u001B[49m\u001B[43m)\u001B[49m\n",
"File \u001B[0;32m/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/sseclient/__init__.py:55\u001B[0m, in \u001B[0;36mSSEClient.events\u001B[0;34m(self)\u001B[0m\n\u001B[1;32m 54\u001B[0m \u001B[38;5;28;01mdef\u001B[39;00m\u001B[38;5;250m \u001B[39m\u001B[38;5;21mevents\u001B[39m(\u001B[38;5;28mself\u001B[39m):\n\u001B[0;32m---> 55\u001B[0m \u001B[43m \u001B[49m\u001B[38;5;28;43;01mfor\u001B[39;49;00m\u001B[43m \u001B[49m\u001B[43mchunk\u001B[49m\u001B[43m \u001B[49m\u001B[38;5;129;43;01min\u001B[39;49;00m\u001B[43m \u001B[49m\u001B[38;5;28;43mself\u001B[39;49m\u001B[38;5;241;43m.\u001B[39;49m\u001B[43m_read\u001B[49m\u001B[43m(\u001B[49m\u001B[43m)\u001B[49m\u001B[43m:\u001B[49m\n\u001B[1;32m 56\u001B[0m \u001B[43m \u001B[49m\u001B[43mevent\u001B[49m\u001B[43m \u001B[49m\u001B[38;5;241;43m=\u001B[39;49m\u001B[43m \u001B[49m\u001B[43mEvent\u001B[49m\u001B[43m(\u001B[49m\u001B[43m)\u001B[49m\n\u001B[1;32m 57\u001B[0m \u001B[43m \u001B[49m\u001B[38;5;66;43;03m# Split before decoding so splitlines() only uses \\r and \\n\u001B[39;49;00m\n",
"File \u001B[0;32m/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/sseclient/__init__.py:45\u001B[0m, in \u001B[0;36mSSEClient._read\u001B[0;34m(self)\u001B[0m\n\u001B[1;32m 38\u001B[0m \u001B[38;5;250m\u001B[39m\u001B[38;5;124;03m\"\"\"Read the incoming event source stream and yield event chunks.\u001B[39;00m\n\u001B[1;32m 39\u001B[0m \n\u001B[1;32m 40\u001B[0m \u001B[38;5;124;03mUnfortunately it is possible for some servers to decide to break an\u001B[39;00m\n\u001B[1;32m 41\u001B[0m \u001B[38;5;124;03mevent into multiple HTTP chunks in the response. It is thus necessary\u001B[39;00m\n\u001B[1;32m 42\u001B[0m \u001B[38;5;124;03mto correctly stitch together consecutive response chunks and find the\u001B[39;00m\n\u001B[1;32m 43\u001B[0m \u001B[38;5;124;03mSSE delimiter (empty new line) to yield full, correct event chunks.\"\"\"\u001B[39;00m\n\u001B[1;32m 44\u001B[0m data \u001B[38;5;241m=\u001B[39m \u001B[38;5;124mb\u001B[39m\u001B[38;5;124m'\u001B[39m\u001B[38;5;124m'\u001B[39m\n\u001B[0;32m---> 45\u001B[0m \u001B[43m\u001B[49m\u001B[38;5;28;43;01mfor\u001B[39;49;00m\u001B[43m \u001B[49m\u001B[43mchunk\u001B[49m\u001B[43m \u001B[49m\u001B[38;5;129;43;01min\u001B[39;49;00m\u001B[43m \u001B[49m\u001B[38;5;28;43mself\u001B[39;49m\u001B[38;5;241;43m.\u001B[39;49m\u001B[43m_event_source\u001B[49m\u001B[43m:\u001B[49m\n\u001B[1;32m 46\u001B[0m \u001B[43m \u001B[49m\u001B[38;5;28;43;01mfor\u001B[39;49;00m\u001B[43m \u001B[49m\u001B[43mline\u001B[49m\u001B[43m \u001B[49m\u001B[38;5;129;43;01min\u001B[39;49;00m\u001B[43m \u001B[49m\u001B[43mchunk\u001B[49m\u001B[38;5;241;43m.\u001B[39;49m\u001B[43msplitlines\u001B[49m\u001B[43m(\u001B[49m\u001B[38;5;28;43;01mTrue\u001B[39;49;00m\u001B[43m)\u001B[49m\u001B[43m:\u001B[49m\n\u001B[1;32m 47\u001B[0m \u001B[43m \u001B[49m\u001B[43mdata\u001B[49m\u001B[43m \u001B[49m\u001B[38;5;241;43m+\u001B[39;49m\u001B[38;5;241;43m=\u001B[39;49m\u001B[43m \u001B[49m\u001B[43mline\u001B[49m\n",
"File \u001B[0;32m/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/requests/models.py:820\u001B[0m, in \u001B[0;36mResponse.iter_content.<locals>.generate\u001B[0;34m()\u001B[0m\n\u001B[1;32m 818\u001B[0m \u001B[38;5;28;01mif\u001B[39;00m \u001B[38;5;28mhasattr\u001B[39m(\u001B[38;5;28mself\u001B[39m\u001B[38;5;241m.\u001B[39mraw, \u001B[38;5;124m\"\u001B[39m\u001B[38;5;124mstream\u001B[39m\u001B[38;5;124m\"\u001B[39m):\n\u001B[1;32m 819\u001B[0m \u001B[38;5;28;01mtry\u001B[39;00m:\n\u001B[0;32m--> 820\u001B[0m \u001B[38;5;28;01myield from\u001B[39;00m \u001B[38;5;28mself\u001B[39m\u001B[38;5;241m.\u001B[39mraw\u001B[38;5;241m.\u001B[39mstream(chunk_size, decode_content\u001B[38;5;241m=\u001B[39m\u001B[38;5;28;01mTrue\u001B[39;00m)\n\u001B[1;32m 821\u001B[0m \u001B[38;5;28;01mexcept\u001B[39;00m ProtocolError \u001B[38;5;28;01mas\u001B[39;00m e:\n\u001B[1;32m 822\u001B[0m \u001B[38;5;28;01mraise\u001B[39;00m ChunkedEncodingError(e)\n",
"File \u001B[0;32m/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/urllib3/response.py:1040\u001B[0m, in \u001B[0;36mHTTPResponse.stream\u001B[0;34m(self, amt, decode_content)\u001B[0m\n\u001B[1;32m 1024\u001B[0m \u001B[38;5;250m\u001B[39m\u001B[38;5;124;03m\"\"\"\u001B[39;00m\n\u001B[1;32m 1025\u001B[0m \u001B[38;5;124;03mA generator wrapper for the read() method. A call will block until\u001B[39;00m\n\u001B[1;32m 1026\u001B[0m \u001B[38;5;124;03m``amt`` bytes have been read from the connection or until the\u001B[39;00m\n\u001B[0;32m (...)\u001B[0m\n\u001B[1;32m 1037\u001B[0m \u001B[38;5;124;03m 'content-encoding' header.\u001B[39;00m\n\u001B[1;32m 1038\u001B[0m \u001B[38;5;124;03m\"\"\"\u001B[39;00m\n\u001B[1;32m 1039\u001B[0m \u001B[38;5;28;01mif\u001B[39;00m \u001B[38;5;28mself\u001B[39m\u001B[38;5;241m.\u001B[39mchunked \u001B[38;5;129;01mand\u001B[39;00m \u001B[38;5;28mself\u001B[39m\u001B[38;5;241m.\u001B[39msupports_chunked_reads():\n\u001B[0;32m-> 1040\u001B[0m \u001B[38;5;28;01myield from\u001B[39;00m \u001B[38;5;28mself\u001B[39m\u001B[38;5;241m.\u001B[39mread_chunked(amt, decode_content\u001B[38;5;241m=\u001B[39mdecode_content)\n\u001B[1;32m 1041\u001B[0m \u001B[38;5;28;01melse\u001B[39;00m:\n\u001B[1;32m 1042\u001B[0m \u001B[38;5;28;01mwhile\u001B[39;00m \u001B[38;5;129;01mnot\u001B[39;00m is_fp_closed(\u001B[38;5;28mself\u001B[39m\u001B[38;5;241m.\u001B[39m_fp) \u001B[38;5;129;01mor\u001B[39;00m \u001B[38;5;28mlen\u001B[39m(\u001B[38;5;28mself\u001B[39m\u001B[38;5;241m.\u001B[39m_decoded_buffer) \u001B[38;5;241m>\u001B[39m \u001B[38;5;241m0\u001B[39m:\n",
"File \u001B[0;32m/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/urllib3/response.py:1184\u001B[0m, in \u001B[0;36mHTTPResponse.read_chunked\u001B[0;34m(self, amt, decode_content)\u001B[0m\n\u001B[1;32m 1181\u001B[0m \u001B[38;5;28;01mreturn\u001B[39;00m \u001B[38;5;28;01mNone\u001B[39;00m\n\u001B[1;32m 1183\u001B[0m \u001B[38;5;28;01mwhile\u001B[39;00m \u001B[38;5;28;01mTrue\u001B[39;00m:\n\u001B[0;32m-> 1184\u001B[0m \u001B[38;5;28;43mself\u001B[39;49m\u001B[38;5;241;43m.\u001B[39;49m\u001B[43m_update_chunk_length\u001B[49m\u001B[43m(\u001B[49m\u001B[43m)\u001B[49m\n\u001B[1;32m 1185\u001B[0m \u001B[38;5;28;01mif\u001B[39;00m \u001B[38;5;28mself\u001B[39m\u001B[38;5;241m.\u001B[39mchunk_left \u001B[38;5;241m==\u001B[39m \u001B[38;5;241m0\u001B[39m:\n\u001B[1;32m 1186\u001B[0m \u001B[38;5;28;01mbreak\u001B[39;00m\n",
"File \u001B[0;32m/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/urllib3/response.py:1108\u001B[0m, in \u001B[0;36mHTTPResponse._update_chunk_length\u001B[0;34m(self)\u001B[0m\n\u001B[1;32m 1106\u001B[0m \u001B[38;5;28;01mif\u001B[39;00m \u001B[38;5;28mself\u001B[39m\u001B[38;5;241m.\u001B[39mchunk_left \u001B[38;5;129;01mis\u001B[39;00m \u001B[38;5;129;01mnot\u001B[39;00m \u001B[38;5;28;01mNone\u001B[39;00m:\n\u001B[1;32m 1107\u001B[0m \u001B[38;5;28;01mreturn\u001B[39;00m \u001B[38;5;28;01mNone\u001B[39;00m\n\u001B[0;32m-> 1108\u001B[0m line \u001B[38;5;241m=\u001B[39m \u001B[38;5;28;43mself\u001B[39;49m\u001B[38;5;241;43m.\u001B[39;49m\u001B[43m_fp\u001B[49m\u001B[38;5;241;43m.\u001B[39;49m\u001B[43mfp\u001B[49m\u001B[38;5;241;43m.\u001B[39;49m\u001B[43mreadline\u001B[49m\u001B[43m(\u001B[49m\u001B[43m)\u001B[49m \u001B[38;5;66;03m# type: ignore[union-attr]\u001B[39;00m\n\u001B[1;32m 1109\u001B[0m line \u001B[38;5;241m=\u001B[39m line\u001B[38;5;241m.\u001B[39msplit(\u001B[38;5;124mb\u001B[39m\u001B[38;5;124m\"\u001B[39m\u001B[38;5;124m;\u001B[39m\u001B[38;5;124m\"\u001B[39m, \u001B[38;5;241m1\u001B[39m)[\u001B[38;5;241m0\u001B[39m]\n\u001B[1;32m 1110\u001B[0m \u001B[38;5;28;01mtry\u001B[39;00m:\n",
"File \u001B[0;32m/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/socket.py:707\u001B[0m, in \u001B[0;36mSocketIO.readinto\u001B[0;34m(self, b)\u001B[0m\n\u001B[1;32m 705\u001B[0m \u001B[38;5;28;01mwhile\u001B[39;00m \u001B[38;5;28;01mTrue\u001B[39;00m:\n\u001B[1;32m 706\u001B[0m \u001B[38;5;28;01mtry\u001B[39;00m:\n\u001B[0;32m--> 707\u001B[0m \u001B[38;5;28;01mreturn\u001B[39;00m \u001B[38;5;28;43mself\u001B[39;49m\u001B[38;5;241;43m.\u001B[39;49m\u001B[43m_sock\u001B[49m\u001B[38;5;241;43m.\u001B[39;49m\u001B[43mrecv_into\u001B[49m\u001B[43m(\u001B[49m\u001B[43mb\u001B[49m\u001B[43m)\u001B[49m\n\u001B[1;32m 708\u001B[0m \u001B[38;5;28;01mexcept\u001B[39;00m timeout:\n\u001B[1;32m 709\u001B[0m \u001B[38;5;28mself\u001B[39m\u001B[38;5;241m.\u001B[39m_timeout_occurred \u001B[38;5;241m=\u001B[39m \u001B[38;5;28;01mTrue\u001B[39;00m\n",
"File \u001B[0;32m/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/ssl.py:1216\u001B[0m, in \u001B[0;36mSSLSocket.recv_into\u001B[0;34m(self, buffer, nbytes, flags)\u001B[0m\n\u001B[1;32m 1212\u001B[0m \u001B[38;5;28;01mif\u001B[39;00m flags \u001B[38;5;241m!=\u001B[39m \u001B[38;5;241m0\u001B[39m:\n\u001B[1;32m 1213\u001B[0m \u001B[38;5;28;01mraise\u001B[39;00m \u001B[38;5;167;01mValueError\u001B[39;00m(\n\u001B[1;32m 1214\u001B[0m \u001B[38;5;124m\"\u001B[39m\u001B[38;5;124mnon-zero flags not allowed in calls to recv_into() on \u001B[39m\u001B[38;5;132;01m%s\u001B[39;00m\u001B[38;5;124m\"\u001B[39m \u001B[38;5;241m%\u001B[39m\n\u001B[1;32m 1215\u001B[0m \u001B[38;5;28mself\u001B[39m\u001B[38;5;241m.\u001B[39m\u001B[38;5;18m__class__\u001B[39m)\n\u001B[0;32m-> 1216\u001B[0m \u001B[38;5;28;01mreturn\u001B[39;00m \u001B[38;5;28;43mself\u001B[39;49m\u001B[38;5;241;43m.\u001B[39;49m\u001B[43mread\u001B[49m\u001B[43m(\u001B[49m\u001B[43mnbytes\u001B[49m\u001B[43m,\u001B[49m\u001B[43m \u001B[49m\u001B[43mbuffer\u001B[49m\u001B[43m)\u001B[49m\n\u001B[1;32m 1217\u001B[0m \u001B[38;5;28;01melse\u001B[39;00m:\n\u001B[1;32m 1218\u001B[0m \u001B[38;5;28;01mreturn\u001B[39;00m \u001B[38;5;28msuper\u001B[39m()\u001B[38;5;241m.\u001B[39mrecv_into(buffer, nbytes, flags)\n",
"File \u001B[0;32m/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/ssl.py:1072\u001B[0m, in \u001B[0;36mSSLSocket.read\u001B[0;34m(self, len, buffer)\u001B[0m\n\u001B[1;32m 1070\u001B[0m \u001B[38;5;28;01mtry\u001B[39;00m:\n\u001B[1;32m 1071\u001B[0m \u001B[38;5;28;01mif\u001B[39;00m buffer \u001B[38;5;129;01mis\u001B[39;00m \u001B[38;5;129;01mnot\u001B[39;00m \u001B[38;5;28;01mNone\u001B[39;00m:\n\u001B[0;32m-> 1072\u001B[0m \u001B[38;5;28;01mreturn\u001B[39;00m \u001B[38;5;28;43mself\u001B[39;49m\u001B[38;5;241;43m.\u001B[39;49m\u001B[43m_sslobj\u001B[49m\u001B[38;5;241;43m.\u001B[39;49m\u001B[43mread\u001B[49m\u001B[43m(\u001B[49m\u001B[38;5;28;43mlen\u001B[39;49m\u001B[43m,\u001B[49m\u001B[43m \u001B[49m\u001B[43mbuffer\u001B[49m\u001B[43m)\u001B[49m\n\u001B[1;32m 1073\u001B[0m \u001B[38;5;28;01melse\u001B[39;00m:\n\u001B[1;32m 1074\u001B[0m \u001B[38;5;28;01mreturn\u001B[39;00m \u001B[38;5;28mself\u001B[39m\u001B[38;5;241m.\u001B[39m_sslobj\u001B[38;5;241m.\u001B[39mread(\u001B[38;5;28mlen\u001B[39m)\n",
"\u001B[0;31mKeyboardInterrupt\u001B[0m: "
]
}
],
"execution_count": 14
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 2
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython2",
"version": "2.7.6"
}
},
"nbformat": 4,
"nbformat_minor": 5
}

View File

@ -4,7 +4,7 @@ build-backend = "setuptools.build_meta"
[project]
name = "aareDB"
version = "0.1.0a25"
version = "0.1.1a3"
description = "Backend for next gen sample management system"
authors = [{name = "Guillaume Gotthard", email = "guillaume.gotthard@psi.ch"}]
license = {text = "MIT"}
@ -28,10 +28,12 @@ dependencies = [
"uvicorn==0.23.1",
"python-dateutil~=2.8.2",
"tomli>=2.0.1",
"python-dotenv"
"python-dotenv",
"psycopg2-binary",
"urllib3~=2.2.1"
]
[tool.pytest.ini_options]
norecursedirs = ["backend/python-client"]
# Or limit files explicitly
python_files = ["test_auth.py"]#,
python_files = [""]#,""test_auth.py"]#,
#"test_contact.py"]

View File

@ -1,12 +1,17 @@
# tests/test_auth.py
import pytest
from fastapi.testclient import TestClient
from backend.main import app
client = TestClient(app)
@pytest.fixture(scope="module")
def client():
with TestClient(app) as test_client: # ensures lifespan/startup executes
yield test_client
def test_login_success():
def test_login_success(client):
response = client.post(
"/auth/token/login", data={"username": "testuser", "password": "testpass"}
)
@ -14,7 +19,7 @@ def test_login_success():
assert "access_token" in response.json()
def test_login_failure():
def test_login_failure(client):
response = client.post(
"/auth/token/login", data={"username": "wrong", "password": "wrongpass"}
)
@ -22,7 +27,7 @@ def test_login_failure():
assert response.json() == {"detail": "Incorrect username or password"}
def test_protected_route():
def test_protected_route(client):
# Step 1: Login
response = client.post(
"/auth/token/login", data={"username": "testuser", "password": "testpass"}

View File

@ -1,7 +1,7 @@
{
"ssl_cert_path": "ssl/cert.pem",
"ssl_key_path": "ssl/key.pem",
"OPENAPI_URL": "https://127.0.0.1:8000/openapi.json",
"OPENAPI_URL": "https://0.0.0.0:8000/openapi.json",
"SCHEMA_PATH": "./src/openapi.json",
"OUTPUT_DIRECTORY": "./openapi",
"PORT": 8000,

View File

@ -1,10 +1,10 @@
{
"ssl_cert_path": "ssl/mx-aare-test.psi.ch.pem",
"ssl_key_path": "ssl/mx-aare-test.psi.ch.key",
"ssl_cert_path": "/app/backend/ssl/mx-aare-test.psi.ch.pem",
"ssl_key_path": "/app/backend/ssl/mx-aare-test.psi.ch.key",
"OPENAPI_URL": "https://mx-aare-test.psi.ch:1492/openapi.json",
"SCHEMA_PATH": "./src/openapi.json",
"OUTPUT_DIRECTORY": "./openapi",
"SCHEMA_PATH": "/app/src/openapi.json",
"OUTPUT_DIRECTORY": "/app/openapi",
"PORT": 1492,
"SSL_KEY_PATH": "/home/jungfrau/heidi-v2/backend/ssl/mx-aare-test.psi.ch.key",
"SSL_CERT_PATH": "/home/jungfrau/heidi-v2/backend/ssl/mx-aare-test.psi.ch.pem"
"SSL_KEY_PATH": "/app/backend/ssl/mx-aare-test.psi.ch.key",
"SSL_CERT_PATH": "/app/backend/ssl/mx-aare-test.psi.ch.pem"
}

View File

@ -1,10 +1,10 @@
{
"ssl_cert_path": "ssl/mx-aare-test.psi.ch.pem",
"ssl_key_path": "ssl/mx-aare-test.psi.ch.key",
"OPENAPI_URL": "https://mx-aare-test.psi.ch:8000/openapi.json",
"SCHEMA_PATH": "./src/openapi.json",
"OUTPUT_DIRECTORY": "./openapi",
"PORT": 8081,
"SSL_KEY_PATH": "/home/jungfrau/heidi-v2/backend/ssl/mx-aare-test.psi.ch.key",
"SSL_CERT_PATH": "/home/jungfrau/heidi-v2/backend/ssl/mx-aare-test.psi.ch.pem"
"ssl_cert_path": "ssl/cert.pem",
"ssl_key_path": "ssl/key.pem",
"OPENAPI_URL": "https://backend:8000/openapi.json",
"SCHEMA_PATH": "/app/src/openapi.json",
"OUTPUT_DIRECTORY": "/app/openapi",
"PORT": 8000,
"SSL_KEY_PATH": "ssl/key.pem",
"SSL_CERT_PATH": "ssl/cert.pem"
}

View File

@ -1,32 +1,95 @@
version: "3.9"
services:
backend:
container_name: backend
build:
context: . # Build the image from the parent directory
context: . # Build the image from the parent directory
dockerfile: backend/Dockerfile
ports:
- "8000:8000" # Map container port 8000 to host
- "${PORT}:${PORT}" # Map container port 8000 to host
volumes:
- ./backend:/app/backend # Map backend directory to /app/backend
- ./app:/app/app # Map app directory to /app/app
- ./config_dev.json:/app/backend/config_dev.json # Explicitly map config_dev.json
- ./config_${ENVIRONMENT}.json:/app/backend/config_${ENVIRONMENT}.json # Explicitly map the environment-specific config
- ./backend/ssl:/app/backend/ssl # clearly mount SSL files explicitly into Docker
- ./uploads:/app/backend/uploads
- ./uploads:/app/backend/images
working_dir: /app/backend # Set working directory to backend/
command: python main.py # Command to run main.py
depends_on: # ⬅️ New addition: wait until postgres is started
- postgres
healthcheck:
test: [ "CMD-SHELL", "curl -k -f https://localhost:${PORT}/openapi.json || exit 1" ]
interval: 30s
timeout: 5s
retries: 5
environment: # ⬅️ Provide DB info to your backend
ENVIRONMENT: ${ENVIRONMENT}
DB_USERNAME: ${DB_USERNAME}
DB_PASSWORD: ${DB_PASSWORD}
DB_HOST: postgres
DB_NAME: ${DB_NAME}
PORT: ${PORT}
postgres: # ⬅️ New service (our PostgreSQL database)
image: postgres:16
environment:
POSTGRES_USER: ${DB_USERNAME}
POSTGRES_PASSWORD: ${DB_PASSWORD}
POSTGRES_DB: ${DB_NAME}
ports:
- "5432:5432"
volumes:
- ./db_data:/var/lib/postgresql/data
frontend:
depends_on:
backend:
condition: service_healthy
build:
context: ./frontend
dockerfile: Dockerfile
args:
- VITE_OPENAPI_BASE=${VITE_OPENAPI_BASE}
- VITE_SSL_KEY_PATH=${VITE_SSL_KEY_PATH}
- VITE_SSL_CERT_PATH=${VITE_SSL_CERT_PATH}
- NODE_ENV=${NODE_ENV}
ports:
- "5173:5173" # Map container port 5173 to host
- "5173:5173"
volumes:
- ./frontend:/app
- /app/node_modules # ⬅️ explicit exclusion! keeps the container-installed node_modules instead of the host's
- ./backend/ssl:/app/backend/ssl
- ./backend/config_${ENVIRONMENT}.json:/app/backend/config_${ENVIRONMENT}.json # Dynamically maps config based on environment
environment:
VITE_OPENAPI_BASE: ${VITE_OPENAPI_BASE}
NODE_ENV: ${NODE_ENV}
command: sh -c "npm run start-${ENVIRONMENT} & ENVIRONMENT=${ENVIRONMENT} npm run watch:openapi"
logistics_frontend:
build:
context: ./logistics
dockerfile: Dockerfile
args: # 👈 explicitly pass build args from .env
- VITE_OPENAPI_BASE=${VITE_OPENAPI_BASE}
- VITE_SSL_KEY_PATH=${VITE_SSL_KEY_PATH}
- VITE_SSL_CERT_PATH=${VITE_SSL_CERT_PATH}
- NODE_ENV=${NODE_ENV}
ports:
- "3000:3000"
depends_on:
- frontend # Ensure OpenAPI models are available
- frontend # Ensure OpenAPI models are available
volumes:
- ./logistics/src:/app/src # explicitly for active dev (hot reload)
- ./backend/ssl:/app/backend/ssl # clearly mount SSL files explicitly into Docker
environment:
- VITE_OPENAPI_BASE=${VITE_OPENAPI_BASE}
- NODE_ENV=${NODE_ENV}
command: sh -c "npm run start-${ENVIRONMENT}"
volumes: # ⬅️ Persistent storage for PostgreSQL data
pgdata:

View File

@ -1,16 +1,29 @@
FROM node:18
# Set working directory
WORKDIR /app
# Set working directory
WORKDIR /app
# Copy dependency files and install dependencies
COPY package*.json ./
RUN npm install
# Setup build args clearly
ARG VITE_OPENAPI_BASE
ARG VITE_SSL_KEY_PATH
ARG VITE_SSL_CERT_PATH
ARG NODE_ENV=development
# Copy rest of the code and build the application
COPY . .
RUN npm run build
ENV VITE_OPENAPI_BASE=${VITE_OPENAPI_BASE}
ENV VITE_SSL_KEY_PATH=${VITE_SSL_KEY_PATH}
ENV VITE_SSL_CERT_PATH=${VITE_SSL_CERT_PATH}
ENV NODE_ENV=${NODE_ENV}
# Use a simple HTTP server to serve the built static files
EXPOSE 5173
CMD ["npx", "vite", "preview", "--port", "5173"]
# Copy dependency files and install dependencies
COPY package*.json ./
RUN npm install --prefer-offline --no-audit --progress=false
# Copy rest of the code and build the application
COPY . .
# Use a simple HTTP server to serve the built static files
EXPOSE 5173
#CMD ["npx", "vite", "preview", "--port", "5173"]
CMD ["npm", "run", "dev"]

View File

@ -15,13 +15,16 @@ if (!process.env.ENVIRONMENT) {
}
// Determine environment and configuration file
const nodeEnv = process.env.ENVIRONMENT || 'dev';
const configFile = `config_${nodeEnv}.json`;
const nodeEnv = process.env.ENVIRONMENT || 'dev'; // Dynamically set from ENV variables
const configFile = `config_${nodeEnv}.json`; // Dynamically resolve the config file name
const configFilePath = path.resolve('./backend/', configFile); // Explicitly correct path resolution in Docker
// Load configuration file
let config;
try {
config = JSON.parse(fs.readFileSync(path.resolve('../', configFile), 'utf8'));
config = JSON.parse(fs.readFileSync(configFilePath, 'utf8'));
} catch (error) {
console.error(`❌ Failed to read configuration file '${configFile}': ${error.message}`);
process.exit(1);
@ -42,12 +45,24 @@ for (const field of requiredFields) {
}
}
// Resolve paths from the config
const OPENAPI_URL = config.OPENAPI_URL;
const SCHEMA_PATH = path.resolve(config.SCHEMA_PATH);
const OUTPUT_DIRECTORY = path.resolve(config.OUTPUT_DIRECTORY);
const SSL_KEY_PATH = path.resolve(config.SSL_KEY_PATH);
const SSL_CERT_PATH = path.resolve(config.SSL_CERT_PATH);
const OPENAPI_BASE_URL = config.OPENAPI_URL; // or process.env.VITE_OPENAPI_BASE_DEV || config.OPENAPI_URL;
const SCHEMA_PATH = config.SCHEMA_PATH; // 💡 already absolute
const OUTPUT_DIRECTORY = config.OUTPUT_DIRECTORY; // 💡 already absolute
const SSL_KEY_PATH = config.SSL_KEY_PATH;
const SSL_CERT_PATH = config.SSL_CERT_PATH;
function assertDirExists(dirPath) {
if (!fs.existsSync(dirPath)) {
console.error(`❌ Directory does not exist inside Docker: ${dirPath}`);
process.exit(1);
}
}
// validate that the required directories exist at the container paths:
assertDirExists(path.dirname(SCHEMA_PATH));
assertDirExists(OUTPUT_DIRECTORY);
assertDirExists(path.dirname(SSL_KEY_PATH));
// Log configuration
console.log(`[INFO] Environment: ${nodeEnv}`);
@ -96,7 +111,7 @@ async function fetchAndGenerate() {
};
const res = await new Promise((resolve, reject) => {
https.get(OPENAPI_URL, options, resolve).on('error', reject);
https.get(OPENAPI_BASE_URL, options, resolve).on('error', reject);
});
let data = '';
@ -104,79 +119,71 @@ async function fetchAndGenerate() {
data += chunk;
});
res.on('end', async () => {
try {
// Save schema file
fs.writeFileSync(SCHEMA_PATH, data, 'utf8');
console.log(`✅ OpenAPI schema saved to ${SCHEMA_PATH}`);
console.log("🧼 Cleaning output directory...");
await fs.promises.rm(OUTPUT_DIRECTORY, { recursive: true, force: true });
console.log(`✅ Output directory cleaned: ${OUTPUT_DIRECTORY}`);
if (!fs.existsSync(OUTPUT_DIRECTORY)) {
console.log(`✅ Confirmed removal of ${OUTPUT_DIRECTORY}`);
} else {
console.error(`❌ Failed to remove output directory: ${OUTPUT_DIRECTORY}`);
}
// Generate services
const command = `npx openapi -i ${SCHEMA_PATH} -o ${OUTPUT_DIRECTORY}`;
console.log(`🔧 Executing command: ${command}`);
const { stdout, stderr } = await execPromisified(command);
if (stderr) {
console.error(`⚠️ stderr while generating services: ${stderr}`);
} else {
console.log(`✅ Service generation completed successfully:\n${stdout}`);
}
// Copy the generated OpenAPI models to ../logistics/openapi
const targetDirectory = path.resolve('../logistics/openapi'); // Adjust as per logistics directory
console.log(`🔄 Copying generated OpenAPI models to ${targetDirectory}...`);
await fs.promises.rm(targetDirectory, { recursive: true, force: true }); // Clean target directory
await fs.promises.mkdir(targetDirectory, { recursive: true }); // Ensure the directory exists
// Copy files from OUTPUT_DIRECTORY to the target directory recursively
const copyRecursive = async (src, dest) => {
const entries = await fs.promises.readdir(src, { withFileTypes: true });
for (const entry of entries) {
const srcPath = path.join(src, entry.name);
const destPath = path.join(dest, entry.name);
if (entry.isDirectory()) {
await fs.promises.mkdir(destPath, { recursive: true });
await copyRecursive(srcPath, destPath);
} else {
await fs.promises.copyFile(srcPath, destPath);
}
}
};
await copyRecursive(OUTPUT_DIRECTORY, targetDirectory);
console.log(`✅ OpenAPI models copied successfully to ${targetDirectory}`);
} catch (error) {
console.error(`❌ Error during schema processing or generation: ${error.message}`);
}
isGenerating = false;
await new Promise((resolve, reject) => {
res.on('end', resolve);
res.on('error', reject);
});
// Save schema file
fs.writeFileSync(SCHEMA_PATH, data, 'utf8');
console.log(`✅ OpenAPI schema saved to ${SCHEMA_PATH}`);
console.log("🧼 Cleaning output directory...");
await fs.promises.rm(OUTPUT_DIRECTORY, { recursive: true, force: true });
console.log(`✅ Output directory cleaned: ${OUTPUT_DIRECTORY}`);
// Generate services
const command = `npx openapi -i ${SCHEMA_PATH} -o ${OUTPUT_DIRECTORY}`;
console.log(`🔧 Executing command: ${command}`);
const { stdout, stderr } = await execPromisified(command);
if (stderr) {
console.error(`⚠️ stderr while generating services: ${stderr}`);
} else {
console.log(`✅ Service generation completed successfully:\n${stdout}`);
}
// Copy the generated OpenAPI models to ../logistics/openapi
const targetDirectory = path.resolve('../logistics/openapi');
console.log(`🔄 Copying generated OpenAPI models to ${targetDirectory}...`);
await fs.promises.rm(targetDirectory, { recursive: true, force: true });
await fs.promises.mkdir(targetDirectory, { recursive: true });
// Recursive copy helper
const copyRecursive = async (src, dest) => {
const entries = await fs.promises.readdir(src, { withFileTypes: true });
for (const entry of entries) {
const srcPath = path.join(src, entry.name);
const destPath = path.join(dest, entry.name);
if (entry.isDirectory()) {
await fs.promises.mkdir(destPath, { recursive: true });
await copyRecursive(srcPath, destPath);
} else {
await fs.promises.copyFile(srcPath, destPath);
}
}
};
await copyRecursive(OUTPUT_DIRECTORY, targetDirectory);
console.log(`✅ OpenAPI models copied successfully to ${targetDirectory}`);
} catch (error) {
console.error(`Failed to fetch OpenAPI schema: ${error.message}`);
console.error(`Error during schema processing or generation: ${error.message}`);
} finally {
isGenerating = false;
}
}
// Backend directory based on the environment
const backendDirectory = (() => {
switch (nodeEnv) {
case 'prod':
return path.resolve('/home/jungfrau/heidi-v2/backend/app'); // Production path
return path.resolve('/app/backend'); // Production path
case 'test':
return path.resolve('/home/jungfrau/heidi-v2/backend/app'); // Test path
return path.resolve('/app/backend'); // Test path
case 'dev':
default:
return path.resolve('/Users/gotthardg/PycharmProjects/heidi-v2/backend/app'); // Development path
return path.resolve('/app/backend'); // Development path
}
})();

File diff suppressed because it is too large

View File

@ -1,5 +1,5 @@
{
"name": "heidi-frontend-v2",
"name": "Aare Web",
"private": true,
"version": "0.0.0",
"type": "module",
@ -29,8 +29,9 @@
"@mui/lab": "^6.0.0-beta.29",
"@mui/material": "^6.1.6",
"@mui/system": "^6.1.6",
"@mui/x-charts": "^7.28.0",
"@mui/x-data-grid-premium": "^7.27.2",
"@mui/x-tree-view": "^7.26.0",
"@mui/x-tree-view": "^7.28.0",
"axios": "^1.7.7",
"chokidar": "^4.0.1",
"dayjs": "^1.11.13",

View File

@ -12,6 +12,7 @@ import AddressManager from './pages/AddressManagerView';
import ContactsManager from './pages/ContactsManagerView';
import LoginView from './pages/LoginView';
import ProtectedRoute from './components/ProtectedRoute';
import BeamtimeOverview from './components/BeamtimeOverview';
const App: React.FC = () => {
const [openAddressManager, setOpenAddressManager] = useState(false);
@ -65,10 +66,11 @@ const App: React.FC = () => {
}, []);
const handlePgroupChange = (newPgroup: string) => {
setActivePgroup(newPgroup);
setActivePgroup(newPgroup); // Updates active pgroup state in App
console.log(`pgroup changed to: ${newPgroup}`);
};
return (
<Router>
<ResponsiveAppBar
@ -82,9 +84,61 @@ const App: React.FC = () => {
<Routes>
<Route path="/login" element={<LoginView />} />
<Route path="/" element={<ProtectedRoute element={<HomePage />} />} />
<Route path="/shipments" element={<ProtectedRoute element={<ShipmentView pgroups={pgroups} activePgroup={activePgroup} />} />} />
<Route path="/planning" element={<ProtectedRoute element={<PlanningView />} />} />
<Route path="/results" element={<ProtectedRoute element={<ResultsView pgroups={pgroups} activePgroup={activePgroup} />} />} />
<Route path="/shipments"
element={
<ProtectedRoute
element={
<ShipmentView
pgroups={pgroups}
activePgroup={activePgroup}
/>
}
/>
}
/>
<Route path="/planning"
element={
<ProtectedRoute
element={
<PlanningView
pgroups={pgroups}
activePgroup={activePgroup}
onPgroupChange={handlePgroupChange}
/>
}
/>
}
/>
<Route
path="/results/:beamtimeId"
element={
<ProtectedRoute
element={
<ResultsView
onPgroupChange={handlePgroupChange}
currentPgroup={activePgroup}
/>
}
/>
}
/>
<Route
path="/beamtime-overview"
element={
<ProtectedRoute
element={
<BeamtimeOverview
activePgroup={activePgroup}
onPgroupChange={handlePgroupChange} // Pass this prop correctly
/>
}
/>
}
/>
<Route path="/results" element={<ProtectedRoute element={<BeamtimeOverview activePgroup={activePgroup} onPgroupChange={handlePgroupChange} />} />}/>
{/* 404 fallback route */}
<Route path="*" element={<div>Page not found</div>} />
</Routes>
<Modal open={openAddressManager} onClose={handleCloseAddressManager} title="Address Management">
<AddressManager pgroups={pgroups} activePgroup={activePgroup} />

View File

@ -0,0 +1,146 @@
import React, { useEffect, useState } from 'react';
import { DataGridPremium, GridColDef } from '@mui/x-data-grid-premium';
import { useNavigate } from 'react-router-dom';
import { Beamtime, BeamtimesService } from '../../openapi';
import { Chip, Typography } from '@mui/material';
interface BeamtimeRecord {
id: number;
start_date: string;
end_date: string;
shift: string;
beamline: string;
local_contact: string;
pgroups: string;
}
interface BeamtimeOverviewProps {
activePgroup: string;
onPgroupChange: (pgroup: string) => void; // Add callback to update the selected pgroup
}
const BeamtimeOverview: React.FC<BeamtimeOverviewProps> = ({ activePgroup, onPgroupChange }) => {
const [rows, setRows] = useState<BeamtimeRecord[]>([]);
const [isLoading, setIsLoading] = useState(false);
// For navigation
const navigate = useNavigate();
const renderPgroupChips = (pgroups: string, activePgroup: string) => {
// Safely handle pgroups as an array
const pgroupsArray = pgroups.split(",").map((pgroup: string) => pgroup.trim());
if (!pgroupsArray.length) {
return <Typography variant="body2">No associated pgroups</Typography>;
}
return pgroupsArray.map((pgroup: string) => (
<Chip
key={pgroup}
label={pgroup}
color={pgroup === activePgroup ? "primary" : "default"} // Highlight active pgroups
sx={{
margin: 0.5,
backgroundColor: pgroup === activePgroup ? '#19d238' : '#b0b0b0',
color: pgroup === activePgroup ? 'white' : 'black',
fontWeight: 'bold',
borderRadius: '8px',
height: '20px',
fontSize: '12px',
boxShadow: '0px 1px 3px rgba(0, 0, 0, 0.2)',
mr: 1,
mb: 1,
}}
/>
));
};
// Fetch beamtime records from the backend
const fetchBeamtimeRecords = async () => {
try {
setIsLoading(true);
const records = await BeamtimesService.getMyBeamtimesProtectedBeamtimesMyBeamtimesGet(activePgroup);
const mappedRecords: BeamtimeRecord[] = records.map((record: any) => ({
id: record.id,
start_date: record.start_date || 'N/A',
end_date: record.end_date || 'N/A',
shift: record.shift || 'N/A',
beamline: record.beamline || 'N/A',
local_contact: `${record.local_contact.firstname || "N/A"} ${record.local_contact.lastname || "N/A"}`,
pgroups: record.pgroups || '',
}));
setRows(mappedRecords);
} catch (error) {
console.error('Failed to fetch beamtime records:', error);
} finally {
setIsLoading(false);
}
};
useEffect(() => {
fetchBeamtimeRecords();
}, [activePgroup]);
// Define table columns, including the "View Results" button
const columns: GridColDef<BeamtimeRecord>[] = [
{ field: 'start_date', headerName: 'Start Date', flex: 1 },
{ field: 'end_date', headerName: 'End Date', flex: 1 },
{ field: 'shift', headerName: "Shift", flex: 1 },
{ field: 'beamline', headerName: 'Beamline', flex: 1 },
{ field: 'local_contact', headerName: 'Local Contact', flex: 1 },
{
field: 'pgroups',
headerName: 'Pgroups',
flex: 2, // Slightly wider column for chips
renderCell: (params) => renderPgroupChips(params.row.pgroups, activePgroup),
},
{
field: 'viewResults',
headerName: 'Actions',
flex: 1,
renderCell: (params) => (
<button
onClick={() => handleViewResults(params.row.id, params.row.pgroups)}
style={{
padding: '6px 12px',
backgroundColor: '#1976d2',
color: '#fff',
border: 'none',
borderRadius: '4px',
cursor: 'pointer',
}}
>
View Results
</button>
),
},
];
// Navigate to the ResultsView page for the selected beamtime
const handleViewResults = (beamtimeId: number, pgroups: string) => {
const pgroupArray = pgroups.split(',').map((pgroup) => pgroup.trim());
const firstPgroup = pgroupArray[0] || ''; // Choose the first pgroup (or fallback to empty string)
// Ensure onPgroupChange is invoked correctly
onPgroupChange(firstPgroup);
// Navigate directly to the Results page with the correct pgroup in the query
navigate(`/results/${beamtimeId}?pgroup=${firstPgroup}`);
};
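// Illustrative call (ids are hypothetical): handleViewResults(42, 'p20000, p20001')
// promotes p20000 to the active pgroup and navigates to /results/42?pgroup=p20000.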
return (
<div style={{ height: 400, width: '100%' }}>
<h2>Beamtime Overview</h2>
<DataGridPremium
rows={rows}
columns={columns}
loading={isLoading}
disableRowSelectionOnClick
/>
</div>
);
};
export default BeamtimeOverview;

View File

@ -4,42 +4,52 @@ import dayGridPlugin from '@fullcalendar/daygrid';
import timeGridPlugin from '@fullcalendar/timegrid';
import interactionPlugin from '@fullcalendar/interaction';
import '../styles/Calendar.css';
import { BeamtimesService, DewarsService, PucksService } from '../../openapi';
import Chip from '@mui/material/Chip';
// Define colors for each beamline
const beamlineColors: { [key: string]: string } = {
PXI: '#FF5733',
PXII: '#33FF57',
PXIII: '#3357FF',
Unknown: '#CCCCCC', // Gray color for unknown beamlines
X06SA: '#FF5733',
X10SA: '#33FF57',
X06DA: '#3357FF',
Unknown: '#CCCCCC',
};
// Custom event interface
interface CustomEvent extends EventInput {
beamline: string;
beamtime_shift: string;
isSubmitted?: boolean; // Track if information is submitted
beamtime_id?: number;
isSubmitted?: boolean;
activePgroup?: string;
pgroups?: string;
}
// Define experiment modes
interface CalendarProps {
    activePgroup: string;
    onPgroupChange?: (pgroup: string) => void; // optional callback, passed in by PlanningView
}
const experimentModes = ['SDU-Scheduled', 'SDU-queued', 'Remote', 'In-person'];
// Utility function to darken a hex color
const darkenColor = (color: string, percent: number): string => {
const num = parseInt(color.slice(1), 16); // Convert hex to number
const amt = Math.round(2.55 * percent); // Calculate amount to darken
const r = (num >> 16) + amt; // Red
const g = (num >> 8 & 0x00FF) + amt; // Green
const b = (num & 0x0000FF) + amt; // Blue
// Ensure values stay within 0-255 range
const newColor = (0x1000000 + (r < 255 ? (r < 0 ? 0 : r) : 255) * 0x10000 + (g < 255 ? (g < 0 ? 0 : g) : 255) * 0x100 + (b < 255 ? (b < 0 ? 0 : b) : 255)).toString(16).slice(1);
const num = parseInt(color.slice(1), 16);
const amt = Math.round(2.55 * percent);
const r = (num >> 16) + amt;
const g = (num >> 8 & 0x00FF) + amt;
const b = (num & 0x0000FF) + amt;
const newColor = (0x1000000 + (r < 255 ? (r < 0 ? 0 : r) : 255) * 0x10000
+ (g < 255 ? (g < 0 ? 0 : g) : 255) * 0x100
+ (b < 255 ? (b < 0 ? 0 : b) : 255)).toString(16).slice(1);
return `#${newColor}`;
};
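// Sample calls using entries from the beamlineColors map above (values follow from the
// formula, not measured output): darkenColor('#3357FF', -20) gives a ~20% darker blue,
// while a positive percent, e.g. darkenColor('#CCCCCC', 10), lightens instead.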
const Calendar: React.FC = () => {
const Calendar = ({ activePgroup }: CalendarProps) => {
const [events, setEvents] = useState<CustomEvent[]>([]);
const [isLoading, setIsLoading] = useState(false);
const [fetchError, setFetchError] = useState<string | null>(null);
const [selectedEventId, setSelectedEventId] = useState<string | null>(null);
const [eventDetails, setEventDetails] = useState<CustomEvent | null>(null);
// eventId => { dewars: [dewar_id], pucks: [puck_id] }
const [eventAssociations, setEventAssociations] = useState<{ [eventId: string]: { dewars: string[], pucks: string[] } }>({});
const [userDetails, setUserDetails] = useState({
name: '',
firstName: '',
@ -48,85 +58,140 @@ const Calendar: React.FC = () => {
extAccount: '',
experimentMode: experimentModes[0],
});
const [shipments, setShipments] = useState<any[]>([]); // State for shipments
const [selectedDewars, setSelectedDewars] = useState<string[]>([]); // Track selected dewars for the experiment
const [shipments, setShipments] = useState<any[]>([]);
// Load all beamtime events AND their current associations (on mount)
useEffect(() => {
const fetchEvents = async () => {
const fetchAll = async () => {
setIsLoading(true);
setFetchError(null);
try {
const response = await fetch('/beamtimedb.json');
const data = await response.json();
const events: CustomEvent[] = [];
const beamtimes = await BeamtimesService.getMyBeamtimesProtectedBeamtimesMyBeamtimesGet();
console.log('Loaded beamtimes:', beamtimes);
const grouped: { [key: string]: any[] } = {};
beamtimes.forEach((beamtime: any) => {
const key = `${beamtime.start_date}|${beamtime.beamline}|${beamtime.pgroups}`;
if (!grouped[key]) grouped[key] = [];
grouped[key].push(beamtime);
});
data.beamtimes.forEach((beamtime: any) => {
const date = new Date(beamtime.date);
beamtime.shifts.forEach((shift: any) => {
const beamline = shift.beamline || 'Unknown';
const beamtime_shift = shift.beamtime_shift || 'morning';
const formattedEvents: CustomEvent[] = Object.values(grouped).map((group) => {
const shifts = group.map((bt: any) => bt.shift).join(" + ");
const ids = group.map((bt: any) => bt.id);
const first = group[0];
console.log(`[DEBUG] pgroups: ${first.pgroups}`); // Ensure the value of pgroups here is correct
return {
id: `${first.beamline}-${first.start_date}-${first.pgroups}`,
title: `${first.beamline}: ${shifts}`,
start: first.start_date,
end: first.end_date,
beamtime_ids: ids,
beamline: first.beamline || 'Unknown',
beamtime_shift: shifts,
backgroundColor: beamlineColors[first.beamline] || beamlineColors.Unknown,
borderColor: '#000',
textColor: '#fff',
beamtimes: group,
extendedProps: {
pgroups: first.pgroups, // Check that this is a valid, comma-separated string
},
};
});
setEvents(formattedEvents);
const event: CustomEvent = {
id: `${beamline}-${date.toISOString()}-${beamtime_shift}`,
start: new Date(date.setHours(0, 0, 0)),
end: new Date(date.setHours(23, 59, 59)),
title: `${beamline}: ${beamtime_shift}`,
beamline,
beamtime_shift,
isSubmitted: false,
// Fetch associations for all
const assoc: { [id: string]: { dewars: string[]; pucks: string[] } } = {};
await Promise.all(
Object.values(grouped).map(async (group) => {
// multiple (or single) beamtimes per group
const ids = group.map((bt: any) => bt.id);
// fetch and merge for all ids in this group:
let dewarsSet = new Set<string>();
let pucksSet = new Set<string>();
await Promise.all(
ids.map(async (beamtimeId: number) => {
const [dewars, pucks] = await Promise.all([
DewarsService.getDewarsByBeamtime(beamtimeId),
PucksService.getPucksByBeamtime(beamtimeId),
]);
console.log(`Dewars for beamtime ${beamtimeId}:`, dewars);
console.log(`Pucks for beamtime ${beamtimeId}:`, pucks);
dewars.forEach((d: any) => dewarsSet.add(d.id));
pucks.forEach((p: any) => pucksSet.add(p.id));
})
);
// key must match event id
const eventId = `${group[0].beamline}-${group[0].start_date}-${group[0].pgroups}`;
assoc[eventId] = {
dewars: Array.from(dewarsSet),
pucks: Array.from(pucksSet),
};
})
);
console.log("Final eventAssociations:", assoc);
setEventAssociations(assoc);
events.push(event);
});
});
console.log('Fetched events array:', events);
setEvents(events);
} catch (error) {
console.error('Error fetching events:', error);
setFetchError('Failed to load beamtime data. Please try again later.');
setEvents([]);
setEventAssociations({});
} finally {
setIsLoading(false);
}
};
const fetchShipments = async () => {
try {
const response = await fetch('/shipmentdb.json');
// Check for HTTP errors
if (!response.ok) {
throw new Error(`HTTP error! status: ${response.status}`);
}
// Parse the JSON response
const data = await response.json();
const availableDewars: any[] = [];
data.shipments.forEach(shipment => {
if (shipment.shipment_status === "In Transit") {
shipment.dewars.forEach(dewar => {
if (dewar.shippingStatus === "shipped" && dewar.returned === "") {
availableDewars.push(dewar);
}
});
}
});
console.log('Available Dewars:', availableDewars);
setShipments(availableDewars);
} catch (error) {
console.error('Error fetching shipments:', error);
// Optionally display the error to the user in the UI
}
};
fetchEvents();
fetchShipments();
fetchAll();
}, []);
// When an event is selected, fetch up-to-date dewar list
useEffect(() => {
if (eventDetails) {
const fetchDewars = async () => {
try {
const dewarsWithPucks = await DewarsService.getRecentDewarsWithPucks();
setShipments(dewarsWithPucks);
} catch (err) {
setShipments([]);
}
};
fetchDewars();
} else {
setShipments([]);
}
}, [eventDetails]);
// Refresh associations after (un)assign action
const refetchEventAssociations = async (beamtimeIds: number[], eventId: string) => {
let dewarsSet = new Set<string>();
let pucksSet = new Set<string>();
await Promise.all(
beamtimeIds.map(async (beamtimeId: number) => {
const [dewars, pucks] = await Promise.all([
DewarsService.getDewarsByBeamtime(beamtimeId),
PucksService.getPucksByBeamtime(beamtimeId),
]);
dewars.forEach((d: any) => dewarsSet.add(d.id));
pucks.forEach((p: any) => pucksSet.add(p.id));
})
);
setEventAssociations(prev => ({
...prev,
[eventId]: {
dewars: Array.from(dewarsSet),
pucks: Array.from(pucksSet),
}
}));
};
const handleEventClick = (eventInfo: any) => {
const clickedEventId = eventInfo.event.id;
setSelectedEventId(clickedEventId);
const selectedEvent = events.find(event => event.id === clickedEventId) || null;
setEventDetails(selectedEvent);
const selected = events.find(event => event.id === clickedEventId) || null;
setEventDetails(selected);
};
const handleInputChange = (e: React.ChangeEvent<HTMLInputElement | HTMLSelectElement>) => {
@ -137,32 +202,15 @@ const Calendar: React.FC = () => {
}));
};
const handleDewarSelection = (dewarId: string) => {
setSelectedDewars(prevSelectedDewars => {
if (prevSelectedDewars.includes(dewarId)) {
return prevSelectedDewars.filter(id => id !== dewarId); // Remove if already selected
} else {
return [...prevSelectedDewars, dewarId]; // Add if not selected
}
});
};
const handleSubmit = (e: React.FormEvent) => {
e.preventDefault();
if (eventDetails) {
const updatedEvents = events.map(event =>
event.id === eventDetails.id
? { ...event, isSubmitted: true, selectedDewars } // Associate selected dewars
: event
setEvents(prev =>
prev.map(event =>
event.id === eventDetails.id ? { ...event, isSubmitted: true } : event
)
);
setEvents(updatedEvents);
}
console.log('User Details:', userDetails);
console.log('Selected Dewars:', selectedDewars);
// Reset user details and selected dewars after submission
setUserDetails({
name: '',
firstName: '',
@ -171,20 +219,64 @@ const Calendar: React.FC = () => {
extAccount: '',
experimentMode: experimentModes[0],
});
setSelectedDewars([]); // Reset selected dewars
};
// Unified assign/unassign for Dewars
const handleDewarAssignment = async (dewarId: string) => {
if (!selectedEventId) return;
const event = events.find(e => e.id === selectedEventId)!;
const beamtimeIds: number[] = event.beamtime_ids || [];
if (!beamtimeIds.length) return;
const assigned = eventAssociations[selectedEventId]?.dewars.includes(dewarId);
try {
await Promise.all(
beamtimeIds.map(btId =>
assigned
? DewarsService.assignDewarToBeamtime(Number(dewarId), 0)
: DewarsService.assignDewarToBeamtime(Number(dewarId), Number(btId))
)
);
await refetchEventAssociations(beamtimeIds, selectedEventId);
} catch (e) {
    console.error('Failed to toggle dewar assignment:', e);
}
};
// Unified assign/unassign for Pucks
const handlePuckAssignment = async (puckId: string) => {
if (!selectedEventId) return;
const event = events.find(e => e.id === selectedEventId)!;
const beamtimeIds: number[] = event.beamtime_ids || [];
if (!beamtimeIds.length) return;
const assigned = eventAssociations[selectedEventId]?.pucks.includes(puckId);
try {
await Promise.all(
beamtimeIds.map(async btId =>
assigned
? PucksService.assignPuckToBeamtime(Number(puckId), 0)
: PucksService.assignPuckToBeamtime(Number(puckId), Number(btId))
)
);
await refetchEventAssociations(beamtimeIds, selectedEventId);
} catch (e) {
    console.error('Failed to toggle puck assignment:', e);
}
};
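// A minimal sketch (not part of the diff) of the toggle logic the two handlers above share:
// assignFn stands in for either service call, and beamtime id 0 means "unassign".
const toggleAssignment = async (
    itemId: string,
    beamtimeIds: number[],
    assigned: boolean,
    assignFn: (id: number, beamtimeId: number) => Promise<unknown>,
) => {
    await Promise.all(
        beamtimeIds.map(btId => assignFn(Number(itemId), assigned ? 0 : Number(btId)))
    );
};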
// For displaying badge in calendar and UI
const eventContent = (eventInfo: any) => {
const beamtimesInGroup = eventInfo.event.extendedProps.beamtimes
? eventInfo.event.extendedProps.beamtimes.length
: 1;
const minHeight = beamtimesInGroup * 26;
const beamline = eventInfo.event.extendedProps.beamline || 'Unknown';
const isSelected = selectedEventId === eventInfo.event.id;
const isSubmitted = eventInfo.event.extendedProps.isSubmitted;
const assoc = eventAssociations[eventInfo.event.id] || { dewars: [], pucks: [] };
const backgroundColor = isSubmitted
? darkenColor(beamlineColors[beamline] || beamlineColors.Unknown, -20)
: isSelected
? '#FFD700'
: (beamlineColors[beamline] || beamlineColors.Unknown);
return (
<div
style={{
@ -193,19 +285,85 @@ const Calendar: React.FC = () => {
border: isSelected ? '2px solid black' : 'none',
borderRadius: '3px',
display: 'flex',
justifyContent: 'center',
justifyContent: 'space-between',
alignItems: 'center',
height: '100%',
width: '100%',
cursor: 'pointer',
overflow: 'hidden',
boxSizing: 'border-box',
padding: '0 6px',
minHeight: `${minHeight}px`,
}}
>
<span style={{ whiteSpace: 'nowrap', overflow: 'hidden', textOverflow: 'ellipsis' }}>
{eventInfo.event.title}
</span>
<span style={{ display: 'flex', alignItems: 'center', gap: 6, marginLeft: 8 }}>
<span title="Dewars" style={{ display: 'flex', alignItems: 'center', fontSize: 13 }}>
🧊
<span style={{
background: 'rgba(0,0,0,0.45)',
borderRadius: '8px',
marginLeft: 2,
minWidth: 14,
color: '#fff',
fontSize: 12,
padding: '0 4px',
fontWeight: 600,
textAlign: 'center'
}}>{assoc.dewars.length}</span>
</span>
<span title="Pucks" style={{ display: 'flex', alignItems: 'center', fontSize: 13 }}>
<span style={{
background: 'rgba(0,0,0,0.45)',
borderRadius: '8px',
marginLeft: 2,
minWidth: 14,
color: '#fff',
fontSize: 12,
padding: '0 4px',
fontWeight: 600,
textAlign: 'center'
}}>{assoc.pucks.length}</span>
</span>
{eventInfo.event.extendedProps?.pgroups && eventInfo.event.extendedProps.pgroups.split(',')
.map((pgroup: string) => (
<Chip
key={pgroup.trim()}
label={pgroup.trim()}
size="small"
sx={{
marginLeft: 0.5,
marginRight: 0.5,
backgroundColor: pgroup.trim() === activePgroup ? '#19d238' : '#b0b0b0',
color: pgroup.trim() === activePgroup ? 'white' : 'black',
fontWeight: 'bold',
borderRadius: '8px',
height: '20px',
fontSize: '12px',
boxShadow: '0px 1px 3px rgba(0, 0, 0, 0.2)',
mr: 1,
mb: 1,
}}
/>
))
}
</span>
</div>
);
};
function getAssignedEventForDewar(dewarId: string) {
return Object.entries(eventAssociations).find(([eid, assoc]) =>
assoc.dewars.includes(dewarId)
);
}
function getAssignedEventForPuck(puckId: string) {
return Object.entries(eventAssociations).find(([eid, assoc]) =>
assoc.pucks.includes(puckId)
);
}
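// e.g. getAssignedEventForDewar('DEWAR-001') (id hypothetical) returns the [eventId, assoc]
// entry whose dewar list contains that id, or undefined if the dewar is unassigned everywhere.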
return (
<div className="calendar-container">
@ -232,17 +390,85 @@ const Calendar: React.FC = () => {
<h4>Select Dewars</h4>
<ul>
{shipments.map(dewar => (
<li key={dewar.id}>
<input
type="checkbox"
id={dewar.id}
checked={selectedDewars.includes(dewar.id)}
onChange={() => handleDewarSelection(dewar.id)}
/>
<label htmlFor={dewar.id}>{dewar.dewar_name} (Pucks: {dewar.number_of_pucks})</label>
</li>
))}
{shipments.map(dewar => {
const thisEvent = eventAssociations[selectedEventId!] || { dewars: [], pucks: [] };
const dewarAssigned = thisEvent.dewars.includes(dewar.id);
const [assocEventId, assoc] = getAssignedEventForDewar(dewar.id) || [];
const assocEvent = assocEventId
? events.find(ev => ev.id === assocEventId)
: null;
const assocShift = assocEvent?.beamtime_shift;
const assocDate = assocEvent?.start;
const assocBeamline = assocEvent?.beamline;
const currentShift = eventDetails?.beamtime_shift;
const isAssignedToThis = assocShift && currentShift && assocShift === currentShift;
return (
<li key={dewar.id}>
<label>
<input
type="checkbox"
checked={dewarAssigned}
onChange={() => handleDewarAssignment(dewar.id)}
/>
<b>{dewar.dewar_name}</b>
</label>
{/* List all pucks in this Dewar, each with assign button */}
{Array.isArray(dewar.pucks) && dewar.pucks.length > 0 && (
<ul>
{dewar.pucks.map(puck => {
const [pAssocEventId] = getAssignedEventForPuck(puck.id) || [];
const pAssocEvent = pAssocEventId
? events.find(ev => ev.id === pAssocEventId)
: null;
const pAssocShift = pAssocEvent?.beamtime_shift;
const pAssocDate = pAssocEvent?.start;
const pAssocBeamline = pAssocEvent?.beamline;
const isAssignedHere = pAssocShift && currentShift && pAssocShift === currentShift;
return (
<li key={puck.id} style={{marginLeft:8}}>
<button
type="button"
style={{
background: isAssignedHere ? '#4CAF50' : (pAssocShift ? '#B3E5B3' : '#e0e0e0'),
color: isAssignedHere ? 'white' : 'black',
border: isAssignedHere ? '1px solid #388e3c' : '1px solid #bdbdbd',
borderRadius: 4,
padding: '2px 10px',
cursor: 'pointer',
transition: 'background 0.2s',
}}
onClick={() => handlePuckAssignment(puck.id)}
>
{puck.puck_name || puck.name}
</button>
{pAssocEvent && (
<span style={{
marginLeft: 8,
color: isAssignedHere ? 'green' : '#388e3c',
fontWeight: isAssignedHere ? 700 : 400
}}>
Assigned to: {pAssocShift} {pAssocDate && <>on {new Date(pAssocDate).toLocaleDateString()}</>} {pAssocBeamline && <>({pAssocBeamline})</>}
</span>
)}
</li>
);
})}
</ul>
)}
{/* Show dewar assignment info if not to this shift */}
{assocEvent && (
<span style={{marginLeft:8, color:isAssignedToThis?'green':'#388e3c', fontWeight:isAssignedToThis?700:400}}>
Assigned to: {assocShift}
{assocDate && <> on {new Date(assocDate).toLocaleDateString()}</>}
{assocBeamline && <> ({assocBeamline})</>}
</span>
)}
</li>
);
})}
</ul>
<h4>User Information</h4>
@ -318,4 +544,4 @@ const Calendar: React.FC = () => {
);
};
export default Calendar;
export default Calendar;

View File

@ -3,16 +3,19 @@ import { Navigate } from 'react-router-dom';
interface ProtectedRouteProps {
element: JSX.Element;
[key: string]: any; // Allow additional props
}
const ProtectedRoute: React.FC<ProtectedRouteProps> = ({ element }) => {
const ProtectedRoute: React.FC<ProtectedRouteProps> = ({ element, ...rest }) => {
const isAuthenticated = () => {
const token = localStorage.getItem('token');
console.log("Is Authenticated: ", token !== null);
return token !== null;
};
const token = localStorage.getItem('token');
console.log("Is Authenticated: ", token !== null);
return token !== null;
};
return isAuthenticated() ? element : <Navigate to="/login" />;
return isAuthenticated()
? React.cloneElement(element, { ...rest }) // Pass all additional props
: <Navigate to="/login" />;
};
export default ProtectedRoute;
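// Hypothetical usage of the ...rest forwarding above (not taken from App.tsx): extra props
// are cloned onto the wrapped element, e.g.
// <ProtectedRoute element={<BeamtimeOverview />} activePgroup={activePgroup} />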

View File

@ -1,4 +1,4 @@
import React, { useState } from 'react';
import React, { useState, useEffect } from 'react';
import { useNavigate, useLocation } from 'react-router-dom';
import AppBar from '@mui/material/AppBar';
import Box from '@mui/material/Box';
@ -38,6 +38,12 @@ const ResponsiveAppBar: React.FC<ResponsiveAppBarProps> = ({
const [anchorElNav, setAnchorElNav] = useState<null | HTMLElement>(null);
const [anchorElUser, setAnchorElUser] = useState<null | HTMLElement>(null);
const [selectedPgroup, setSelectedPgroup] = useState(currentPgroup);
useEffect(() => {
setSelectedPgroup(currentPgroup); // Sync local state with the global activePgroup
}, [currentPgroup]);
console.log('Active Pgroup:', activePgroup);
const handlePgroupChange = (event: React.ChangeEvent<{ value: unknown }>) => {
const newPgroup = event.target.value as string;

View File

@ -1,9 +1,16 @@
import React, { useEffect, useState } from 'react';
import React, { useEffect, useState, useRef } from 'react';
import { DataGridPremium, GridColDef } from '@mui/x-data-grid-premium';
import RunDetails from './RunDetails';
import './SampleImage.css';
import './ResultGrid.css';
import { OpenAPI, SamplesService } from '../../openapi';
import ScheduleIcon from '@mui/icons-material/Schedule';
import DoDisturbIcon from '@mui/icons-material/DoDisturb';
import TaskAltIcon from '@mui/icons-material/TaskAlt';
import ErrorOutlineIcon from '@mui/icons-material/ErrorOutline';
import InfoOutlinedIcon from '@mui/icons-material/InfoOutlined';
import HourglassEmptyIcon from '@mui/icons-material/HourglassEmpty';
// Extend your image info interface if needed.
@ -11,6 +18,8 @@ interface ImageInfo {
id: number;
filepath: string;
comment?: string;
event_type: string;
run_number?: number;
}
// This represents an experiment run as returned by your API.
@ -83,26 +92,113 @@ interface TreeRow {
id: string;
hierarchy: (string | number)[];
type: 'sample' | 'run';
experimentId?: number;
sample_id: number;
sample_name?: string;
puck_name?: string;
dewar_name?: string;
images?: ImageInfo[];
images?: ImageInfo[]; // Images associated explicitly with this row (especially run items)
run_number?: number;
beamline_parameters?: ExperimentParameters['beamline_parameters'];
experimentType?: string;
numberOfImages?: number;
hasResults: boolean;
jobStatus?: string;
}
interface ResultGridProps {
    activePgroup: string;
    beamtimeId?: string; // optional, forwarded from the /results/:beamtimeId route
}
const useJobStream = (onJobs: (jobs: any[]) => void) => {
const eventSourceRef = useRef<EventSource | null>(null);
useEffect(() => {
eventSourceRef.current = new EventSource(`${OpenAPI.BASE}/processing/jobs/stream`);
eventSourceRef.current.onmessage = async (event) => {
const jobs = JSON.parse(event.data); // Updated job data
onJobs(jobs);
};
return () => {
if (eventSourceRef.current) {
eventSourceRef.current.close();
}
};
}, [onJobs]);
};
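// A minimal usage sketch of useJobStream, assuming the SSE payload is a JSON array of
// { run_id, status } objects as consumed below; JobLogger is a hypothetical consumer.
const JobLogger: React.FC = () => {
    useJobStream((jobs) => {
        jobs.forEach((job) => console.log(`run ${job.run_id}: ${job.status}`));
    });
    return null;
};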
const ResultGrid: React.FC<ResultGridProps> = ({ activePgroup }) => {
const [rows, setRows] = useState<TreeRow[]>([]);
const [basePath, setBasePath] = useState('');
const [detailPanelHeights, setDetailPanelHeights] = useState<{ [key: string]: number }>({}); // Store dynamic heights
const [jobStatusMap, setJobStatusMap] = useState<{ [runId: number]: string }>({});
const getStatusIcon = (status: string, hasResults: boolean = false) => {
switch (status) {
case 'todo':
return <ScheduleIcon color="action" titleAccess="Todo" />;
case 'submitted':
return <HourglassEmptyIcon color="primary" className="spin" titleAccess="Submitted" />;
case 'completed':
return hasResults ? (
<TaskAltIcon color="success" titleAccess="Completed" />
) : (
<InfoOutlinedIcon color="warning" titleAccess="Completed - No Results" />
);
case 'failed':
return <ErrorOutlineIcon color="error" titleAccess="Failed" />;
case 'cancelled':
return <DoDisturbIcon color="disabled" titleAccess="Cancelled" />;
case 'no job':
default:
return <InfoOutlinedIcon color="disabled" titleAccess="No job" />;
}
};
useJobStream((jobs) => {
const map: { [runId: number]: string } = {};
for (const job of jobs) {
// Map job status by run_id (or job_id as preferred)
map[job.run_id] = job.status;
}
setJobStatusMap(map);
});
const handleJobs = async (jobs: any[]) => {
console.log('Jobs received from the job stream:', jobs);
// Fetch results for each run based on the job stream
const updatedRows = await Promise.all(
rows.map(async (row) => {
if (row.type === 'run' && row.experimentId) {
try {
const results = await SamplesService.getResultsForRunAndSample(
row.sample_id,
row.experimentId
);
const hasResults = results.length > 0;
console.log(`Fetching results for experimentId: ${row.experimentId}, hasResults: ${hasResults}`);
return { ...row, hasResults }; // Update `hasResults` for the run
} catch (error) {
console.error(`Error fetching results for experimentId: ${row.experimentId}`, error);
return row; // Return unchanged row on error
}
}
return row; // Return unchanged for non-run rows
})
);
// Update the rows state with new `hasResults` values
setRows(updatedRows);
};
useJobStream(handleJobs);
const hasProcessingResults = (row: TreeRow): boolean => {
// You can later replace this placeholder with actual logic.
@ -176,8 +272,14 @@ const ResultGrid: React.FC<ResultGridProps> = ({ activePgroup }) => {
}, []);
useEffect(() => {
// Fetch sample details and construct rows
SamplesService.getSampleResultsSamplesResultsGet(activePgroup)
if (!OpenAPI.BASE) {
console.error('OpenAPI.BASE is not set. Falling back to a default value.');
return;
}
setBasePath(`${OpenAPI.BASE}/`);
SamplesService.getSampleResults(activePgroup)
.then((response: SampleResult[]) => {
const treeRows: TreeRow[] = [];
@ -190,28 +292,28 @@ const ResultGrid: React.FC<ResultGridProps> = ({ activePgroup }) => {
sample_name: sample.sample_name,
puck_name: sample.puck_name,
dewar_name: sample.dewar_name,
images: sample.images,
images: sample.images.filter(img => img.event_type === "Centering"),
};
treeRows.push(sampleRow);
if (sample.experiment_runs) {
sample.experiment_runs.forEach((run) => {
const experimentType = getExperimentType(run);
const numImages = getNumberOfImages(run);
const runRow: TreeRow = {
id: `run-${sample.sample_id}-${run.run_number}`,
hierarchy: [sample.sample_id, run.run_number],
type: 'run',
sample_id: sample.sample_id,
run_number: run.run_number,
beamline_parameters: run.beamline_parameters,
experimentType,
numberOfImages: numImages,
images: sample.images,
};
treeRows.push(runRow);
});
}
sample.experiment_runs?.forEach(run => {
const experimentType = getExperimentType(run);
const numImages = getNumberOfImages(run);
const runRow: TreeRow = {
id: `run-${sample.sample_id}-${run.run_number}`,
hierarchy: [sample.sample_id, run.run_number],
type: 'run',
experimentId: run.id,
sample_id: sample.sample_id,
run_number: run.run_number,
beamline_parameters: run.beamline_parameters,
experimentType,
numberOfImages: numImages,
images: sample.images.filter(img => img.event_type === "Collecting"),
hasResults: false, // Default to false until verified
};
treeRows.push(runRow);
});
});
setRows(treeRows);
@ -221,6 +323,7 @@ const ResultGrid: React.FC<ResultGridProps> = ({ activePgroup }) => {
});
}, [activePgroup]);
// Define the grid columns
const columns: GridColDef[] = [
{
@ -228,6 +331,32 @@ const ResultGrid: React.FC<ResultGridProps> = ({ activePgroup }) => {
headerName: 'Sample Name',
width: 200,
},
{
field: 'jobStatus',
headerName: 'Job Status',
width: 120,
renderCell: (params) => {
if (params.row.type === 'run') {
const hasResults = params.row.hasResults || false; // Check for results
const jobStatus = jobStatusMap[params.row.experimentId] || 'no job'; // Fetch job status
// If there are results, only show the TaskAltIcon (no need for job status tracking)
if (hasResults) {
return (
<div style={{ display: 'flex', alignItems: 'center', gap: '5px' }}>
<TaskAltIcon color="success" titleAccess="Results available" />
<span style={{ fontSize: '0.75rem', color: '#4caf50' }}>Results</span>
</div>
);
}
// Otherwise, show the job tracking status icon
return getStatusIcon(jobStatus, hasResults);
}
return null; // No rendering for non-run rows
},
},
{
field: 'puck_name',
headerName: 'Puck Name',
@ -294,19 +423,38 @@ const ResultGrid: React.FC<ResultGridProps> = ({ activePgroup }) => {
});
};
const handleResultsFetched = (runId: number, hasResults: boolean) => {
console.log(`handleResultsFetched called for RunId ${runId}, hasResults: ${hasResults}`);
setRows((prevRows) =>
prevRows.map((row) => {
if (row.type === 'run' && row.experimentId === runId) {
console.log(`Updating row for runId ${runId}, setting hasResults=${hasResults}`);
return { ...row, hasResults };
}
return row;
})
);
};
const getDetailPanelContent = (params: any) => {
if (params.row.type === 'run') {
return (
<RunDetails
run={params.row}
runId={params.row.experimentId}
sample_id={params.row.sample_id}
basePath={basePath}
onHeightChange={(height: number) => handleDetailPanelHeightChange(params.row.id, height)} // Pass callback for dynamic height
onHeightChange={(height) => handleDetailPanelHeightChange(params.row.id, height)}
onResultsFetched={(runId, hasResults) => handleResultsFetched(runId, hasResults)}
/>
);
}
return null;
};
const getDetailPanelHeight = (params: any) => {
if (params.row.type === 'run') {
// Use the dynamically calculated height from state
@ -319,6 +467,7 @@ const ResultGrid: React.FC<ResultGridProps> = ({ activePgroup }) => {
return (
<DataGridPremium
key={JSON.stringify(rows)}
rows={rows}
columns={columns}
getRowId={(row) => row.id}
@ -349,4 +498,3 @@ const ResultGrid: React.FC<ResultGridProps> = ({ activePgroup }) => {
};
export default ResultGrid;

View File

@ -1,198 +1,438 @@
import React, { useEffect, useRef, useState } from 'react';
import {
Accordion,
AccordionSummary,
AccordionDetails,
Typography,
Grid,
Modal,
Box
Accordion, AccordionSummary, AccordionDetails, Typography, Grid, Modal, Box
} from '@mui/material';
import ExpandMoreIcon from '@mui/icons-material/ExpandMore';
import './SampleImage.css';
import { DataGridPremium, GridColDef, GridValueGetterParams } from "@mui/x-data-grid-premium";
import { LineChart } from '@mui/x-charts/LineChart';
import { SamplesService } from "../../openapi";
interface RunDetailsProps {
run: ExperimentParameters;
run: TreeRow;
runId: number;
sample_id: number;
basePath: string;
onHeightChange?: (height: number) => void; // Callback to notify the parent about height changes
onHeightChange?: (height: number) => void;
onResultsFetched: (runId: number, hasResults: boolean) => void; // New callback
}
const RunDetails: React.FC<RunDetailsProps> = ({ run, onHeightChange, basePath }) => {
const containerRef = useRef<HTMLDivElement | null>(null); // Ref to track component height
interface CCPoint {
resolution: number;
value: number;
}
interface ExperimentParameters {
run_number: number;
id: number;
sample_id: number;
beamline_parameters: BeamlineParameters;
images: Image[];
}
interface ProcessingResults {
id: number;
pipeline: string;
resolution: number;
unit_cell: string;
spacegroup: string;
rmerge: CCPoint[];
rmeas: CCPoint[];
isig: CCPoint[];
cc: CCPoint[];
cchalf: CCPoint[];
completeness: CCPoint[];
multiplicity: CCPoint[];
nobs: number;
total_refl: number;
unique_refl: number;
comments?: string | null;
}
const RunDetails: React.FC<RunDetailsProps> = ({ run, onHeightChange, basePath, runId, sample_id, onResultsFetched }) => {
console.log('onResultsFetched is available:', onResultsFetched);
const containerRef = useRef<HTMLDivElement | null>(null);
const [currentHeight, setCurrentHeight] = useState<number>(0);
const [modalOpen, setModalOpen] = useState<boolean>(false); // For modal state
const [selectedImage, setSelectedImage] = useState<string | null>(null); // Tracks the selected image for the modal
const [modalOpen, setModalOpen] = useState<boolean>(false);
const [selectedImage, setSelectedImage] = useState<string | null>(null);
const [expandedResults, setExpandedResults] = useState(false);
const [processingResult, setProcessingResult] = useState<ProcessingResults[] | null>(null);
const {beamline_parameters, images} = run;
const {synchrotron, beamline, detector} = beamline_parameters;
useEffect(() => {
fetchResults(sample_id, runId); // fetching based on experimentId
}, [runId]);
const fetchResults = async (sample_id: number, runId: number) => {
console.log(`Fetching results for sample_id: ${sample_id}, runId: ${runId}`);
try {
const results = await SamplesService.getResultsForRunAndSample(sample_id, runId);
const mappedResults: ProcessingResults[] = results.map((res): ProcessingResults => ({
id: res.id,
pipeline: res.result?.pipeline || 'N/A',
resolution: res.result?.resolution ?? 0,
unit_cell: res.result?.unit_cell || 'N/A',
spacegroup: res.result?.spacegroup || 'N/A',
rmerge: res.result?.rmerge || [],
rmeas: res.result?.rmeas || [],
isig: res.result?.isig || [],
cc: res.result?.cc || [],
cchalf: res.result?.cchalf || [],
completeness: res.result?.completeness || [],
multiplicity: res.result?.multiplicity || [],
nobs: res.result?.nobs ?? 0,
total_refl: res.result?.total_refl ?? 0,
unique_refl: res.result?.unique_refl ?? 0,
comments: res.result?.comments || null,
}));
setProcessingResult(mappedResults);
console.log(`Mapped results for runId ${runId}:`, mappedResults);
console.log(`Boolean value for hasResults: ${mappedResults.length > 0}`);
onResultsFetched(runId, mappedResults.length > 0);
} catch (error) {
console.error(`Error fetching results for RunId ${runId}:`, error);
}
};
const resultColumns: GridColDef<ProcessingResults>[] = [
{ field: 'pipeline', headerName: 'Pipeline', flex: 1 },
{ field: 'resolution', headerName: 'Resolution (Å)', flex: 1 },
{ field: 'unit_cell', headerName: 'Unit Cell (Å)', flex: 1.5 },
{ field: 'spacegroup', headerName: 'Spacegroup', flex: 1 },
{
field: 'rmerge',
headerName: 'Rmerge',
flex: 1,
valueGetter: (params: GridValueGetterParams<ProcessingResults, string>) =>
params.row?.rmerge
? Array.isArray(params.row.rmerge)
? params.row.rmerge.map((value: CCPoint) => `${value.value.toFixed(2)}@${value.resolution.toFixed(2)}`).join(', ')
: params.row.rmerge.toFixed(2)
: 'N/A',
},
{
field: 'rmeas',
headerName: 'Rmeas',
flex: 1,
valueGetter: (params: GridValueGetterParams<ProcessingResults, string>) =>
params.row?.rmeas
? Array.isArray(params.row.rmeas)
? params.row.rmeas.map((value: CCPoint) => `${value.value.toFixed(2)}@${value.resolution.toFixed(2)}`).join(', ')
: params.row.rmeas.toFixed(2)
: 'N/A',
},
{
field: 'isig',
headerName: 'I/sig(I)',
flex: 1,
valueGetter: (params: GridValueGetterParams<ProcessingResults, string>) =>
params.row?.isig
? Array.isArray(params.row.isig)
? params.row.isig.map((value: CCPoint) => `${value.value.toFixed(2)}@${value.resolution.toFixed(2)}`).join(', ')
: params.row.isig.toFixed(2)
: 'N/A',
},
{
field: 'cc',
headerName: 'CC',
flex: 1,
valueGetter: (params: GridValueGetterParams<ProcessingResults, string>) =>
params.row?.cc && Array.isArray(params.row.cc)
? params.row.cc.map((point: CCPoint) => `${point.value.toFixed(2)}@${point.resolution.toFixed(2)}`).join(', ')
: '',
},
{
field: 'cchalf',
headerName: 'CC(1/2)',
flex: 1,
valueGetter: (params: GridValueGetterParams<ProcessingResults, string>) =>
params.row?.cchalf && Array.isArray(params.row.cchalf)
? params.row.cchalf.map((point: CCPoint) => `${point.value.toFixed(2)}@${point.resolution.toFixed(2)}`).join(', ')
: '',
},
{
field: 'completeness',
headerName: 'Completeness (%)',
flex: 1,
valueGetter: (params: GridValueGetterParams<ProcessingResults, string>) =>
params.row?.completeness
? Array.isArray(params.row.completeness)
? params.row.completeness.map((value: CCPoint) => `${value.value.toFixed(2)}@${value.resolution.toFixed(2)}`).join(', ')
: params.row.completeness.toFixed(2)
: 'N/A',
},
{
field: 'multiplicity',
headerName: 'Multiplicity',
flex: 1,
valueGetter: (params: GridValueGetterParams<ProcessingResults, string>) =>
params.row?.multiplicity
? Array.isArray(params.row.multiplicity)
? params.row.multiplicity.map((value: CCPoint) => `${value.value.toFixed(2)}@${value.resolution.toFixed(2)}`).join(', ')
: params.row.multiplicity.toFixed(2)
: 'N/A',
},
{ field: 'nobs', headerName: 'N obs.', flex: 1 },
{ field: 'total_refl', headerName: 'Total Reflections', flex: 1 },
{ field: 'unique_refl', headerName: 'Unique Reflections', flex: 1 },
{ field: 'comments', headerName: 'Comments', flex: 2 },
];
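// A hedged sketch (not in the diff) of collapsing the repeated valueGetter bodies above
// into a single formatter for CCPoint arrays or plain numbers:
const formatPoints = (v?: CCPoint[] | number | null): string =>
    v == null
        ? 'N/A'
        : Array.isArray(v)
            ? v.map((p) => `${p.value.toFixed(2)}@${p.resolution.toFixed(2)}`).join(', ')
            : v.toFixed(2);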
const { beamline_parameters, images } = run;
const { synchrotron, beamline, detector } = beamline_parameters;
// Calculate and notify the parent about height changes
const updateHeight = () => {
if (containerRef.current) {
const newHeight = containerRef.current.offsetHeight;
if (newHeight !== currentHeight) {
if (newHeight !== currentHeight && onHeightChange) {
setCurrentHeight(newHeight);
if (onHeightChange) {
onHeightChange(newHeight);
}
onHeightChange(newHeight);
}
}
};
useEffect(() => {
updateHeight(); // Update height on initial render
}, []);
useEffect(() => {
// Update height whenever the component content changes
const observer = new ResizeObserver(updateHeight);
if (containerRef.current) {
observer.observe(containerRef.current);
}
return () => {
observer.disconnect();
};
}, [containerRef]);
if (containerRef.current) observer.observe(containerRef.current);
return () => observer.disconnect();
}, [containerRef.current, processingResult]);
const handleImageClick = (imagePath: string) => {
setSelectedImage(imagePath);
setModalOpen(true); // Open the modal when the image is clicked
setModalOpen(true);
};
const closeModal = () => {
setSelectedImage(null); // Clear the current image
setSelectedImage(null);
setModalOpen(false);
};
return (
<div
className="details-panel" // Add the class here
ref={containerRef} // Attach the ref to the main container
className="details-panel"
ref={containerRef}
style={{
display: 'flex',
flexDirection: 'column', // Stack children vertically
gap: '16px',
padding: '16px',
border: '1px solid #ccc',
borderRadius: '4px',
alignItems: 'flex-start',
}}
>
{/* Main Details Section */}
<div style={{ flexGrow: 1 }}>
<Typography variant="h6" gutterBottom>
Run {run.run_number} Details
</Typography>
<Typography variant="subtitle1" gutterBottom>
Beamline: {beamline} | Synchrotron: {synchrotron}
</Typography>
{/* Detector Details Accordion */}
<Accordion>
<AccordionSummary
expandIcon={<ExpandMoreIcon />}
aria-controls="detector-content"
id="detector-header"
>
<Typography>
<strong>Detector Details</strong>
</Typography>
</AccordionSummary>
<AccordionDetails>
<Typography>Manufacturer: {detector?.manufacturer || 'N/A'}</Typography>
<Typography>Model: {detector?.model || 'N/A'}</Typography>
<Typography>Type: {detector?.type || 'N/A'}</Typography>
<Typography>
Beam Center (px): x: {detector?.beamCenterX_px || 'N/A'}, y: {detector?.beamCenterY_px || 'N/A'}
</Typography>
</AccordionDetails>
</Accordion>
{/* Beamline Details Accordion */}
<Accordion>
<AccordionSummary
expandIcon={<ExpandMoreIcon />}
aria-controls="beamline-content"
id="beamline-header"
>
<Typography>
<strong>Beamline Details</strong>
</Typography>
</AccordionSummary>
<AccordionDetails>
<Typography>Synchrotron: {beamline_parameters?.synchrotron || 'N/A'}</Typography>
<Typography>Ring mode: {beamline_parameters?.ringMode || 'N/A'}</Typography>
<Typography>Ring current: {beamline_parameters?.ringCurrent_A || 'N/A'}</Typography>
<Typography>Beamline: {beamline_parameters?.beamline || 'N/A'}</Typography>
<Typography>Undulator: {beamline_parameters?.undulator || 'N/A'}</Typography>
<Typography>Undulator gap: {beamline_parameters?.undulatorgap_mm || 'N/A'}</Typography>
<Typography>Focusing optic: {beamline_parameters?.focusingOptic || 'N/A'}</Typography>
<Typography>Monochromator: {beamline_parameters?.monochromator || 'N/A'}</Typography>
</AccordionDetails>
</Accordion>
{/* Beam Characteristics Accordion */}
<Accordion>
<AccordionSummary
expandIcon={<ExpandMoreIcon />}
aria-controls="beam-content"
id="beam-header"
>
<Typography>
<strong>Beam Characteristics</strong>
</Typography>
</AccordionSummary>
<AccordionDetails>
<Typography>Wavelength: {beamline_parameters?.wavelength || 'N/A'}</Typography>
<Typography>Energy: {beamline_parameters?.energy || 'N/A'}</Typography>
<Typography>Transmission: {beamline_parameters?.transmission || 'N/A'}</Typography>
<Typography>
Beam focus (µm): vertical: {beamline_parameters?.beamSizeHeight || 'N/A'}, horizontal:{' '}
{beamline_parameters?.beamSizeWidth || 'N/A'}
</Typography>
<Typography>Flux at sample (ph/s): {beamline_parameters?.beamlineFluxAtSample_ph_s || 'N/A'}</Typography>
</AccordionDetails>
</Accordion>
</div>
{/* Image Section */}
<div style={{ width: '900px' }}>
<Typography variant="h6" gutterBottom>
Associated Images
</Typography>
{images && images.length > 0 ? (
<Grid container spacing={1}>
{images.map((img) => (
<Grid item xs={4} key={img.id}>
<div
className="image-container"
onClick={() => handleImageClick(`${basePath || ''}${img.filepath}`)} // Open modal with image
style={{
cursor: 'pointer',
}}
>
<img
src={`${basePath || ''}${img.filepath}`} // Ensure basePath
alt={img.comment || 'Image'}
className="zoom-image"
style={{
width: '100%', // Ensure the image takes the full width of its container
maxWidth: '100%', // Prevent any overflow
borderRadius: '4px',
}}
/>
</div>
</Grid>
))}
</Grid>
) : (
<Typography variant="body2" color="textSecondary">
No images available.
{/* Wrap details and images together */}
<div style={{display: 'flex', gap: '16px', alignItems: 'flex-start'}}>
{/* Main Details Section */}
<div style={{flexGrow: 1}}>
<Typography variant="h6" gutterBottom>
Run {run.run_number} Details
</Typography>
)}
<Typography variant="subtitle1" gutterBottom>
Beamline: {beamline} | Synchrotron: {synchrotron}
</Typography>
{/* Detector Details Accordion */}
<Accordion>
<AccordionSummary
expandIcon={<ExpandMoreIcon/>}
aria-controls="detector-content"
id="detector-header"
>
<Typography><strong>Detector Details</strong></Typography>
</AccordionSummary>
<AccordionDetails>
<Typography>Manufacturer: {detector?.manufacturer || 'N/A'}</Typography>
<Typography>Model: {detector?.model || 'N/A'}</Typography>
<Typography>Type: {detector?.type || 'N/A'}</Typography>
<Typography>
Beam Center (px): x: {detector?.beamCenterX_px || 'N/A'},
y: {detector?.beamCenterY_px || 'N/A'}
</Typography>
</AccordionDetails>
</Accordion>
{/* Beamline Details Accordion */}
<Accordion>
<AccordionSummary expandIcon={<ExpandMoreIcon/>}>
<Typography><strong>Beamline Details</strong></Typography>
</AccordionSummary>
<AccordionDetails>
<Typography>Synchrotron: {beamline_parameters?.synchrotron || 'N/A'}</Typography>
<Typography>Ring mode: {beamline_parameters?.ringMode || 'N/A'}</Typography>
<Typography>Ring current: {beamline_parameters?.ringCurrent_A || 'N/A'}</Typography>
<Typography>Beamline: {beamline_parameters?.beamline || 'N/A'}</Typography>
<Typography>Undulator: {beamline_parameters?.undulator || 'N/A'}</Typography>
<Typography>Undulator gap: {beamline_parameters?.undulatorgap_mm || 'N/A'}</Typography>
<Typography>Focusing optic: {beamline_parameters?.focusingOptic || 'N/A'}</Typography>
<Typography>Monochromator: {beamline_parameters?.monochromator || 'N/A'}</Typography>
</AccordionDetails>
</Accordion>
{/* Beam Characteristics Accordion */}
<Accordion>
<AccordionSummary expandIcon={<ExpandMoreIcon/>}>
<Typography><strong>Beam Characteristics</strong></Typography>
</AccordionSummary>
<AccordionDetails>
<Typography>Wavelength: {beamline_parameters?.wavelength || 'N/A'}</Typography>
<Typography>Energy: {beamline_parameters?.energy || 'N/A'}</Typography>
<Typography>Transmission: {beamline_parameters?.transmission || 'N/A'}</Typography>
<Typography>
Beam focus (µm): vertical: {beamline_parameters?.beamSizeHeight || 'N/A'},
horizontal:{' '}
{beamline_parameters?.beamSizeWidth || 'N/A'}
</Typography>
<Typography>Flux at sample
(ph/s): {beamline_parameters?.beamlineFluxAtSample_ph_s || 'N/A'}</Typography>
</AccordionDetails>
</Accordion>
</div>
{/* Image Section */}
<div style={{width: '900px'}}>
<Typography variant="h6" gutterBottom>
Associated Images
</Typography>
{images && images.length > 0 ? (
<Grid container spacing={1}>
{images.map((img) => (
<Grid item xs={4} key={img.id}>
<div
className="image-container"
onClick={() => handleImageClick(`${basePath || ''}${img.filepath}`)}
style={{cursor: 'pointer'}}
>
<img
src={`${basePath || ''}${img.filepath}`}
alt={img.comment || 'Image'}
className="zoom-image"
style={{
width: '100%',
maxWidth: '100%',
borderRadius: '4px',
}}
/>
</div>
</Grid>
))}
</Grid>
) : (
<Typography variant="body2" color="textSecondary">
No images available.
</Typography>
)}
</div>
</div>
{/* Processing Results Accordion - Full Width Below */}
<div style={{width: '100%'}}>
<Accordion expanded={expandedResults} onChange={(e, expanded) => setExpandedResults(expanded)}>
<AccordionSummary expandIcon={<ExpandMoreIcon/>}>
<Typography><strong>Processing Results</strong></Typography>
</AccordionSummary>
<AccordionDetails style={{width: '100%', overflowX: 'auto'}}>
{processingResult ? (
<div style={{width: '100%'}}>
<DataGridPremium<ProcessingResults>
rows={processingResult.map((res, idx) => ({ id: idx, ...res }))}
columns={resultColumns}
autoHeight
hideFooter
columnVisibilityModel={{ id: false }}
disableColumnResize={false}
/>
</div>
) : (
<Typography variant="body2" color="textSecondary">Loading results...</Typography>
)}
</AccordionDetails>
</Accordion>
</div>
{processingResult && processingResult.length > 0 && (
<div style={{width: 400, marginTop: '16px'}}>
<Typography variant="h6" gutterBottom>Processing Metrics vs Resolution</Typography>
<LineChart
xAxis={[
{
data: processingResult[0].cc
.map((point) => point.resolution) // Use resolution values for the x-axis
.reverse(), // Reverse the resolution values to go from high-res to low-res
label: 'Resolution (Å)',
reverse: true, // Flip visually so low-res is to the right
},
]}
series={[
{
data: processingResult[0].cc
.map((point) => point.value) // Map CC values
.reverse(), // Reverse order for visual consistency
label: 'CC',
},
{
data: processingResult[0].cchalf
.map((point) => point.value) // Map CC(1/2) values
.reverse(),
label: 'CC(1/2)',
},
{
data: Array.isArray(processingResult[0].rmerge)
? processingResult[0].rmerge
.map((point: CCPoint) => point.value) // Map Rmerge values
.reverse()
: [], // Handle edge case where Rmerge isn't an array
label: 'Rmerge',
},
{
data: Array.isArray(processingResult[0].rmeas)
? processingResult[0].rmeas
.map((point: CCPoint) => point.value) // Map Rmeas values
.reverse()
: [],
label: 'Rmeas',
},
{
data: Array.isArray(processingResult[0].isig)
? processingResult[0].isig
.map((point: CCPoint) => point.value) // Map I/sig(I) values
.reverse()
: [],
label: 'I/sig(I)',
},
{
data: Array.isArray(processingResult[0].completeness)
? processingResult[0].completeness
.map((point: CCPoint) => point.value) // Map Completeness values
.reverse()
: [],
label: 'Completeness (%)',
},
{
data: Array.isArray(processingResult[0].multiplicity)
? processingResult[0].multiplicity
.map((point: CCPoint) => point.value) // Map Multiplicity values
.reverse()
: [],
label: 'Multiplicity',
},
]}
height={300}
/>
</div>
)}
{/* Modal for Zoomed Image */}
<Modal open={modalOpen} onClose={closeModal}>
<Box
@ -224,5 +464,4 @@ const RunDetails: React.FC<RunDetailsProps> = ({ run, onHeightChange, basePath }
</div>
);
};
export default RunDetails;

View File

@ -1,10 +1,16 @@
// Planning.tsx
import React from 'react';
import CustomCalendar from '../components/Calendar.tsx';
const PlanningView: React.FC = () => {
return <CustomCalendar />;
//return <div>Welcome to the Planning Page</div>;
interface PlanningViewProps {
onPgroupChange?: (pgroup: string) => void;
activePgroup: string;
}
const PlanningView: React.FC<PlanningViewProps> = ({ onPgroupChange, activePgroup }) => {
return <CustomCalendar
activePgroup={activePgroup}
onPgroupChange={onPgroupChange}
/>;
};
export default PlanningView;

View File

@ -1,23 +1,49 @@
// components/ResultView.tsx
import React from 'react';
import React, { useEffect } from 'react';
import { useParams, useSearchParams, useNavigate } from 'react-router-dom';
import SampleTracker from '../components/SampleTracker';
import ResultGrid from '../components/ResultGrid';
interface ResultsViewProps {
activePgroup: string;
onPgroupChange?: (pgroup: string) => void; // Callback to notify about pgroup changes
currentPgroup: string; // Currently selected pgroup
}
const ResultsView: React.FC<ResultsViewProps> = ({activePgroup
}) => {
const ResultsView: React.FC<ResultsViewProps> = ({ onPgroupChange, currentPgroup }) => {
const { beamtimeId } = useParams();
const [searchParams] = useSearchParams();
const navigate = useNavigate();
// Get the active pgroup for the experiment from the query params.
const activePgroup = searchParams.get("pgroup") ?? ''; // Default to an empty string if missing
// Redirect if the selected pgroup does not match the beamtime's pgroup
useEffect(() => {
if (!currentPgroup || currentPgroup !== activePgroup) {
console.warn(
`Redirecting to BeamtimeOverview because selected pgroup (${currentPgroup || "undefined"}) does not match beamtime's pgroup (${activePgroup})`
);
navigate('/beamtime-overview'); // Redirect to BeamtimeOverview
}
}, [currentPgroup, activePgroup, navigate]);
// Notify parent about the selected pgroup (if needed)
useEffect(() => {
// Synchronize the pgroup when the component loads
if (onPgroupChange && activePgroup !== currentPgroup) {
onPgroupChange(activePgroup); // Update the selected pgroup
}
}, [onPgroupChange, activePgroup, currentPgroup]);
return (
<div>
<h1>Results Page</h1>
<SampleTracker activePgroup={activePgroup}/>
<ResultGrid activePgroup={activePgroup} />
</div>
<h2>Results for Beamtime ID: {beamtimeId}</h2>
{/* Use the beamtimeId to filter or query specific results */}
<SampleTracker activePgroup={activePgroup} beamtimeId={beamtimeId} />
<ResultGrid activePgroup={activePgroup} beamtimeId={beamtimeId} />
</div>
);
};

View File

@ -1,46 +1,17 @@
.calendar-container {
width: 80%;
margin: 0 auto;
}
/* Styling each day cell */
.fc-daygrid-day-frame {
position: relative; /* Ensure positioning for child elements */
border: 1px solid #e0e0e0; /* Grid cell border for better visibility */
}
/* Event styling */
.fc-event {
border-radius: 3px; /* Rounded corners for events */
padding: 4px; /* Padding for events */
font-size: 12px; /* Font size for event text */
cursor: pointer; /* Pointer cursor for events */
box-sizing: border-box; /* Include padding in the width/height */
}
/* Selected event styling */
.fc-event-selected {
border: 2px solid black; /* Border for selected events */
}
/* Optional: Add hover effect for events */
.fc-event:hover {
background-color: #FF7043; /* Change color on hover */
}
.event-details {
margin-top: 20px;
padding: 15px;
border: 1px solid #ccc;
border-radius: 5px;
background-color: #f9f9f9;
}
.event-details h3 {
margin: 0 0 10px;
}
.event-details label {
display: block;
margin-bottom: 10px;
}
.fc-event-shift {
position: absolute !important; /* Enables proper alignment */
font-size: 12px; /* Text size for better clarity */
line-height: 1.2; /* Improve readability */
height: auto !important; /* Flexible height based on content */
min-height: 25px; /* Ensure adequate space vertically */
width: 28% !important; /* Prevent events from spanning full cell width */
border: 1px solid #555; /* Consistent event border */
border-radius: 4px; /* Rounded corners */
background-color: rgba(255, 255, 255, 0.9); /* Default background */
white-space: nowrap; /* Prevent text wrapping */
overflow: hidden; /* Hide overflowing content */
text-overflow: ellipsis; /* Show '...' for long titles */
display: flex; /* Align content vertically and horizontally */
justify-content: center; /* Center horizontal alignment */
align-items: center; /* Center vertical alignment */
}

View File

@ -2,6 +2,17 @@ FROM node:18-alpine
WORKDIR /app
# Setup build args clearly
ARG VITE_OPENAPI_BASE_DEV
ARG VITE_SSL_KEY_PATH
ARG VITE_SSL_CERT_PATH
ARG NODE_ENV=development
ENV VITE_OPENAPI_BASE=${VITE_OPENAPI_BASE_DEV}
ENV VITE_SSL_KEY_PATH=${VITE_SSL_KEY_PATH}
ENV VITE_SSL_CERT_PATH=${VITE_SSL_CERT_PATH}
ENV NODE_ENV=${NODE_ENV}
# Copy only the necessary package files first
COPY package*.json ./
RUN npm install
@ -14,3 +25,8 @@ COPY . .
# Build the application
RUN npm run build
# Expose the app port and serve the build via the start-dev script
EXPOSE 3000
CMD ["npm", "run", "start-dev"]

View File

@ -16,6 +16,12 @@ export default defineConfig(({ mode }) => {
},
host: '0.0.0.0',
port: 3000,
hmr: {
clientPort: 3000,
protocol: 'wss', // the dev server sits behind HTTPS, so HMR must use secure WebSockets
host: 'mx-aare-test.psi.ch' // hostname the browser uses to reach the dev server
},
},
};
});

View File

@ -15,4 +15,7 @@ pydantic[email]
mysqlclient~=2.1.1
python-multipart~=0.0.6
uvicorn==0.23.1
python-dotenv
python-dotenv
psycopg2-binary
python-dateutil~=2.8.2
urllib3~=2.2.1

File diff suppressed because one or more lines are too long