Introduced checks to prevent reassigning beamtime if puck samples have recorded events. Updated logging in beamline-related methods to provide more insight. Simplified data structure updates for dewars, pucks, and samples, ensuring consistency with beamtime assignments.
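A minimal sketch of what such a reassignment guard could look like; the plain dataclasses and field names stand in for the real models and are assumptions, not the actual implementation:

```python
# Hedged sketch: block beamtime reassignment when any puck sample has recorded events.
from dataclasses import dataclass, field


@dataclass
class Sample:
    id: int
    events: list[str] = field(default_factory=list)  # recorded sample events


@dataclass
class Puck:
    id: int
    beamtime_id: int | None = None
    samples: list[Sample] = field(default_factory=list)


def assign_beamtime(puck: Puck, beamtime_id: int) -> None:
    """Assign a beamtime only if no sample on the puck has recorded events."""
    if any(sample.events for sample in puck.samples):
        raise ValueError(
            "Puck has samples with recorded events; beamtime cannot be reassigned"
        )
    puck.beamtime_id = beamtime_id
```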
Updated relationships for beamtime in models to support many-to-many associations with pucks, samples, and dewars. Refactored API endpoints to accommodate these changes, ensuring accurate assignment and retrieval of data. Improved sample data generation logic and incremented the application version.
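A SQLAlchemy sketch of one of these many-to-many links (beamtime–puck); the table and column names are assumptions, and the sample and dewar associations would follow the same pattern:

```python
# Hedged sketch of a beamtime <-> puck many-to-many association.
from sqlalchemy import Column, ForeignKey, Integer, String, Table, create_engine
from sqlalchemy.orm import Session, declarative_base, relationship

Base = declarative_base()

# Association table linking beamtimes and pucks.
beamtime_pucks = Table(
    "beamtime_pucks",
    Base.metadata,
    Column("beamtime_id", ForeignKey("beamtimes.id"), primary_key=True),
    Column("puck_id", ForeignKey("pucks.id"), primary_key=True),
)


class Beamtime(Base):
    __tablename__ = "beamtimes"
    id = Column(Integer, primary_key=True)
    beamline = Column(String)
    pucks = relationship("Puck", secondary=beamtime_pucks, back_populates="beamtimes")


class Puck(Base):
    __tablename__ = "pucks"
    id = Column(Integer, primary_key=True)
    name = Column(String)
    beamtimes = relationship("Beamtime", secondary=beamtime_pucks, back_populates="pucks")


if __name__ == "__main__":
    engine = create_engine("sqlite://")
    Base.metadata.create_all(engine)
    with Session(engine) as session:
        bt = Beamtime(beamline="X06DA")
        bt.pucks.append(Puck(name="PUCK-001"))
        session.add(bt)
        session.commit()
        print(session.get(Beamtime, bt.id).pucks[0].name)  # PUCK-001
```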
Introduced endpoints to fetch pucks, dewars, and samples by beamtime ID. Updated backend logic to ensure consistency between dewars, pucks, and samples assignments. Enhanced frontend to display and handle beamline-specific associations dynamically.
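A hedged sketch of the shape of one retrieval endpoint; the path and the in-memory store are assumptions standing in for the real database query, and analogous routes would exist for dewars and samples:

```python
# Hedged sketch of fetching pucks by beamtime ID.
from fastapi import FastAPI, HTTPException

app = FastAPI()

# Stand-in for the database: beamtime id -> names of associated pucks.
PUCKS_BY_BEAMTIME: dict[int, list[str]] = {1: ["PUCK-001", "PUCK-002"]}


@app.get("/beamtimes/{beamtime_id}/pucks")
def get_pucks_by_beamtime(beamtime_id: int) -> list[str]:
    if beamtime_id not in PUCKS_BY_BEAMTIME:
        raise HTTPException(status_code=404, detail="Beamtime not found")
    return PUCKS_BY_BEAMTIME[beamtime_id]
```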
Simplified and unified beamtime assignment handling for pucks and samples in the backend. Enhanced the frontend to display detailed assignment state, including shift, date, and beamline, for both pucks and dewars. This ensures consistent and accurate state management across the application.
Implemented API endpoints and frontend logic to assign/unassign beamtime to dewars and pucks. Enhanced schemas, models, and styles while refactoring related frontend components for better user experience and data handling.
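A minimal sketch of how assign/unassign routes could look for dewars; paths, verbs, and the in-memory store are assumptions, and pucks would follow the same pattern:

```python
# Hedged sketch of assigning and unassigning a beamtime on a dewar.
from fastapi import FastAPI, HTTPException

app = FastAPI()

# Stand-in store: dewar id -> assigned beamtime id (None = unassigned).
DEWAR_BEAMTIME: dict[int, int | None] = {1: None}


@app.patch("/dewars/{dewar_id}/assign-beamtime/{beamtime_id}")
def assign_beamtime(dewar_id: int, beamtime_id: int) -> dict:
    if dewar_id not in DEWAR_BEAMTIME:
        raise HTTPException(status_code=404, detail="Dewar not found")
    DEWAR_BEAMTIME[dewar_id] = beamtime_id
    return {"dewar_id": dewar_id, "beamtime_id": beamtime_id}


@app.patch("/dewars/{dewar_id}/unassign-beamtime")
def unassign_beamtime(dewar_id: int) -> dict:
    if dewar_id not in DEWAR_BEAMTIME:
        raise HTTPException(status_code=404, detail="Dewar not found")
    DEWAR_BEAMTIME[dewar_id] = None
    return {"dewar_id": dewar_id, "beamtime_id": None}
```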
Enabled synchronization of the active pgroup across components using a callback mechanism. Improved handling of query parameters, props, and redirection to ensure accurate user context and state across pages like ResultsView and BeamtimeOverview. Updated ProtectedRoute to support additional props.
Modified navigation to include activePgroup as a query parameter. Updated the ResultsView, SampleTracker, and ResultGrid components to use activePgroup for contextual filtering based on the URL, ensuring proper grouping and improved parameter handling across the application.
Added relationships linking Pucks and Samples to Beamtime in the models, enabling better data association. Assigned beamtime IDs during data generation and updated API response models for improved data loading. Removed redundant code in `testfunctions.ipynb` to clean up the notebook.
Introduced a new endpoint and model for managing beamtimes, including shifts and user-specific access. Updated test scripts and data to reflect the beamtime integration, along with minor fixes for job status enumeration and example notebook refinements.
Introduce new statuses, "to_cancel" and "cancelled", to improve job state tracking. Implement logic to nullify `slurm_id` for cancelled jobs and a background thread to clean up cancelled jobs older than 2 hours. Ensure periodic cleanup runs hourly to maintain database hygiene.
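A minimal sketch of the cancellation handling and hourly cleanup loop, using an in-memory stand-in for the jobs table; field and function names are illustrative, not the actual implementation:

```python
# Hedged sketch: nullify slurm_id on cancellation and purge old cancelled jobs hourly.
import threading
import time
from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass
class Job:
    id: int
    status: str              # e.g. "to_cancel" or "cancelled"
    slurm_id: int | None
    updated_at: datetime


JOBS: dict[int, Job] = {}


def cancel_job(job: Job) -> None:
    """Mark a job as cancelled and drop its SLURM reference."""
    job.status = "cancelled"
    job.slurm_id = None
    job.updated_at = datetime.utcnow()


def cleanup_cancelled_jobs(max_age: timedelta = timedelta(hours=2)) -> None:
    """Delete cancelled jobs that have been in that state longer than max_age."""
    cutoff = datetime.utcnow() - max_age
    for job_id, job in list(JOBS.items()):
        if job.status == "cancelled" and job.updated_at < cutoff:
            del JOBS[job_id]


def _cleanup_loop(interval_seconds: int = 3600) -> None:
    while True:  # hourly background cleanup
        cleanup_cancelled_jobs()
        time.sleep(interval_seconds)


threading.Thread(target=_cleanup_loop, daemon=True).start()
```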
Added `type` to experiment runs in `sample.py` and improved filtering in `processing.py` to match experiments by both `sample_id` and `run_id`. Removed a large amount of unnecessary code from `testfunctions.ipynb` to improve clarity and maintainability.
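An illustrative sketch of matching an experiment run by both keys; the dataclass stands in for the real ORM model and query:

```python
# Hedged sketch: select the run that matches both sample_id and run_id.
from dataclasses import dataclass


@dataclass
class ExperimentRun:
    sample_id: int
    run_id: int
    type: str  # e.g. "standard" or "screening"


runs = [
    ExperimentRun(sample_id=7, run_id=1, type="screening"),
    ExperimentRun(sample_id=7, run_id=2, type="standard"),
]


def find_run(sample_id: int, run_id: int) -> ExperimentRun | None:
    """Return the run matching both keys, or None if no such run exists."""
    return next(
        (r for r in runs if r.sample_id == sample_id and r.run_id == run_id), None
    )


print(find_run(sample_id=7, run_id=2))  # ExperimentRun(sample_id=7, run_id=2, type='standard')
```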
Updated job type to reference `experiment.type` in `processing.py` for accurate data handling. Cleaned up and streamlined `testfunctions.ipynb` by removing outdated and redundant code, improving readability and usability.
Introduced a new `type` field in the `ExperimentParametersModel` schema and updated the associated code in `sample.py` to include this field during object creation. Additionally, removed unnecessary lines and redundant code from `testfunctions.ipynb` for better readability and maintainability.
Enhanced the models with new fields: a `dataset` field for Experiment Parameters and a `slurm_id` for Jobs. Introduced a `FAILED` status for the `JobStatus` enum. Updated functionality to handle datasets and trigger job creation based on dataset status.
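A hedged sketch of the new fields and status; the base status names and column types are assumptions mirroring the description, not the actual models:

```python
# Hedged sketch of the updated job/experiment-parameter models.
import enum

from sqlalchemy import JSON, Column, Enum, Integer
from sqlalchemy.orm import declarative_base

Base = declarative_base()


class JobStatus(str, enum.Enum):
    TODO = "todo"
    RUNNING = "running"
    DONE = "done"
    FAILED = "failed"  # newly introduced status


class ExperimentParameters(Base):
    __tablename__ = "experiment_parameters"
    id = Column(Integer, primary_key=True)
    dataset = Column(JSON, nullable=True)  # new dataset field


class Jobs(Base):
    __tablename__ = "jobs"
    id = Column(Integer, primary_key=True)
    slurm_id = Column(Integer, nullable=True)  # new SLURM job reference
    status = Column(Enum(JobStatus), default=JobStatus.TODO)
```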
Updated the job model to include `sample_id` and `run_id` fields, replacing `experiment_parameters_id`. Adjusted relationships and modified routers to reflect these changes. Added an endpoint for updating job status and restructured job streaming logic to include detailed experiment and sample data.
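A small sketch of what the job-status update endpoint could look like; the path, payload shape, and in-memory store are assumptions:

```python
# Hedged sketch of updating a job's status (and optional SLURM id).
from enum import Enum

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()


class JobStatus(str, Enum):
    TODO = "todo"
    RUNNING = "running"
    DONE = "done"
    FAILED = "failed"


class JobStatusUpdate(BaseModel):
    status: JobStatus
    slurm_id: int | None = None


# Stand-in store: job id -> current state.
JOBS: dict[int, dict] = {42: {"status": JobStatus.TODO, "slurm_id": None}}


@app.post("/jobs/{job_id}/status")
def update_job_status(job_id: int, update: JobStatusUpdate) -> dict:
    if job_id not in JOBS:
        raise HTTPException(status_code=404, detail="Job not found")
    JOBS[job_id] = {"status": update.status, "slurm_id": update.slurm_id}
    return {"job_id": job_id, **JOBS[job_id]}
```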
Updated the `JobModel` with foreign key relationships and string-based status to enhance database consistency, and improved job event streaming by using `jsonable_encoder` for better serialization. Also added `urllib3` to the dependencies to handle HTTP requests.
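A small sketch of serializing a job event with `jsonable_encoder` before streaming it; the `JobEvent` model is illustrative, not the project's actual schema:

```python
# Hedged sketch: jsonable_encoder turns datetimes (and nested models) into JSON-safe data.
import json
from datetime import datetime

from fastapi.encoders import jsonable_encoder
from pydantic import BaseModel


class JobEvent(BaseModel):
    job_id: int
    status: str
    created_at: datetime


event = JobEvent(job_id=7, status="running", created_at=datetime.utcnow())
payload = json.dumps(jsonable_encoder(event))  # datetimes become ISO strings
print(f"data: {payload}\n\n")                  # one server-sent event frame
```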
Refactored `run_server` to accept explicit config and SSL paths. Added dynamic environment-based config loading and stricter SSL path checks for production. Updated `docker-compose.yml` to use environment variable for port mapping and adjusted `config_prod.json` to reflect correct port usage.
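A hedged sketch of environment-driven config loading with stricter SSL checks for production; the file names and environment-variable names are assumptions:

```python
# Hedged sketch of config selection and SSL validation for run_server.
import json
import os
from pathlib import Path


def load_config(environment: str | None = None) -> dict:
    """Pick config_<env>.json based on an ENVIRONMENT variable (default: dev)."""
    env = environment or os.getenv("ENVIRONMENT", "dev")
    return json.loads(Path(f"config_{env}.json").read_text())


def run_server(config_path: str, ssl_cert: str | None = None, ssl_key: str | None = None) -> None:
    config = json.loads(Path(config_path).read_text())
    if os.getenv("ENVIRONMENT") == "prod":
        # Production refuses to start without an existing certificate/key pair.
        if not (ssl_cert and ssl_key and Path(ssl_cert).exists() and Path(ssl_key).exists()):
            raise FileNotFoundError("SSL certificate and key are required in production")
    print(f"Starting server on port {config.get('port', 8000)}")
```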
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
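A minimal sketch of a server-sent-events job stream; the endpoint path, polling interval, and payload shape are assumptions rather than the actual router:

```python
# Hedged sketch of an SSE endpoint streaming job updates.
import asyncio
import json

from fastapi import APIRouter
from fastapi.responses import StreamingResponse

router = APIRouter(prefix="/processing")


async def job_event_stream():
    """Periodically emit the current job list as SSE frames."""
    while True:
        jobs = [{"id": 1, "status": "todo"}]  # stand-in for a database poll
        yield f"data: {json.dumps(jobs)}\n\n"
        await asyncio.sleep(5)


@router.get("/jobs/stream")
async def stream_jobs():
    return StreamingResponse(job_event_stream(), media_type="text/event-stream")
```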