178 Commits

Author SHA1 Message Date
Mose Müller
a76035f443 Merge pull request #68 from tiqi-group/fix/only_load_state_properties_can_be_updated
fix: only load state properties can be updated
2023-11-09 17:35:28 +01:00
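The PR above restricts state restoration to properties explicitly opted in with the `@load_state` decorator. As a generic sketch of that pattern (hypothetical code, not pydase's actual implementation — the decorator name is taken from the commit, everything else is illustrative): tag a property's setter, and have the loader skip untagged properties.

```python
def load_state(func):
    # Tag the setter so a loader can tell it apart from untagged ones.
    func._load_state = True
    return func


class Service:
    def __init__(self) -> None:
        self._voltage = 0.0
        self._mode = "idle"

    @property
    def voltage(self) -> float:
        return self._voltage

    @voltage.setter
    @load_state
    def voltage(self, value: float) -> None:
        self._voltage = value

    @property
    def mode(self) -> str:
        return self._mode

    @mode.setter
    def mode(self, value: str) -> None:
        self._mode = value


def restore(service: object, saved: dict) -> None:
    """Only write back values whose setter carries the load_state tag."""
    for name, value in saved.items():
        prop = getattr(type(service), name, None)
        if isinstance(prop, property) and getattr(prop.fset, "_load_state", False):
            setattr(service, name, value)


svc = Service()
restore(svc, {"voltage": 5.0, "mode": "running"})
print(svc.voltage)  # 5.0 -- decorated setter was restored
print(svc.mode)     # 'idle' -- undecorated property was skipped
```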
Mose Müller
2ab4d1c00a updates to v0.3.1 2023-11-09 17:33:03 +01:00
Mose Müller
a9d577820f updates tests 2023-11-09 17:32:35 +01:00
Mose Müller
f5e6dca16a moves check for load_state decorator to load_state method in StateManager 2023-11-09 17:32:30 +01:00
Mose Müller
4a45d0d438 npm run build 2023-11-09 17:10:56 +01:00
Mose Müller
3cc6399f60 frontend: update style (fix button appearance) 2023-11-09 17:10:21 +01:00
Mose Müller
dc1c7e80f4 docs: updates Readme TOC 2023-11-09 16:05:31 +01:00
Mose Müller
95b5907a8d update to version v0.3.0 2023-11-09 15:56:00 +01:00
Mose Müller
675fe86e7e Merge pull request #67 from tiqi-group/46-setter-functions-being-called-at-startup-when-loading-json-file
46 setter functions being called at startup when loading json file
2023-11-09 15:54:05 +01:00
Mose Müller
60c2cca8f5 updates Readme.md 2023-11-09 15:52:25 +01:00
Mose Müller
e4fb1c66a1 updates StateManager tests 2023-11-09 15:52:25 +01:00
Mose Müller
1af4f98a48 updates has_load_state_decorator logic 2023-11-09 15:52:22 +01:00
Mose Müller
eddf3dd2fc adds docstring 2023-11-09 15:52:06 +01:00
Mose Müller
c2a22d4456 adds tests for load_state decorator 2023-11-09 15:29:47 +01:00
Mose Müller
aa9f1ba35a adds load_state decorator 2023-11-09 15:29:39 +01:00
Mose Müller
2208e5f66e npm run build 2023-11-09 15:16:53 +01:00
Mose Müller
96f1ee16b7 docs: updates Adding_Components.md 2023-11-09 14:36:45 +01:00
Mose Müller
4f7c6ccde4 docs: updates Adding_Components.md 2023-11-09 14:15:30 +01:00
Mose Müller
856f5d0c79 update flake8 and pyright configs 2023-11-09 14:10:58 +01:00
Mose Müller
b60995d218 removes unnecessary log msg 2023-11-09 14:10:27 +01:00
Mose Müller
380f98edb5 adds type hint 2023-11-09 14:10:14 +01:00
Mose Müller
30e4ebb670 Merge pull request #66 from tiqi-group/fix/executing_methods_through_frontend
Fix/executing methods through frontend
2023-11-09 14:04:48 +01:00
Mose Müller
bdf5512bcc adds run_method socketio event to web server 2023-11-09 14:03:20 +01:00
Mose Müller
a323ce169e renames frontend_update socketio event to set_attribute 2023-11-09 13:53:13 +01:00
Mose Müller
d18be54284 updates frontend components to use new methods from socket.ts 2023-11-09 13:52:23 +01:00
Mose Müller
a750644c20 updates socket.ts (renames and add method) 2023-11-09 13:52:00 +01:00
Mose Müller
45ede860d9 removes JSDoc types (already in typescript) 2023-11-09 13:51:26 +01:00
Mose Müller
a060836304 updates DataServiceList tests 2023-11-09 11:59:48 +01:00
Mose Müller
963e449adb moves DataServiceList test file 2023-11-09 11:50:08 +01:00
Mose Müller
1776fc8623 converts values (ints and quantities) when setting list entries 2023-11-09 11:49:42 +01:00
Mose Müller
aed0dd9493 updates StateManager tests 2023-11-09 11:37:07 +01:00
Mose Müller
784d49d90c refactores __update_attribute_by_path of StateManager 2023-11-09 11:36:45 +01:00
Mose Müller
8dd05ac5e3 renames helper function 2023-11-09 11:35:04 +01:00
Mose Müller
27bb73a2da adds docstring 2023-11-09 08:20:51 +01:00
Mose Müller
6b643210d7 adds tests for StateManager 2023-11-08 17:09:05 +01:00
Mose Müller
24f1574168 web server now uses StateManager method to update DataService attributes 2023-11-08 17:08:31 +01:00
Mose Müller
b594a91a18 refactors load_state method 2023-11-08 17:08:00 +01:00
Mose Müller
e708d6f1c3 adds logic of updating DataService attributes to StateManager 2023-11-08 17:07:37 +01:00
Mose Müller
6c2c5d4ad1 deprecates update_DataService_attribute function 2023-11-08 17:05:55 +01:00
Mose Müller
d0377be455 Merge pull request #63 from tiqi-group/52-add-a-cache-storing-the-state-of-the-service
52 add a cache storing the state of the service
2023-11-07 18:34:46 +01:00
Mose Müller
5e136c2784 renames test file 2023-11-07 18:26:13 +01:00
Mose Müller
0a94b32011 updates serializer tests 2023-11-07 18:25:57 +01:00
Mose Müller
14b5219915 refactoring serializer module methods 2023-11-07 18:23:24 +01:00
Mose Müller
7c573cdc10 updating docstring 2023-11-07 17:16:02 +01:00
Mose Müller
393b025648 renaming function, updating docstring 2023-11-07 17:15:54 +01:00
Mose Müller
03fee3f88c moves generate_paths_from_DataService_dict to serializer module 2023-11-07 17:06:35 +01:00
Mose Müller
59c7d7bb6f formatting 2023-11-07 17:03:36 +01:00
Mose Müller
dc70f3cfcf renames functions, adds docstrings 2023-11-07 16:59:59 +01:00
Mose Müller
cdd657f895 adds tests for update_serialization_dict method 2023-11-07 16:43:09 +01:00
Mose Müller
c9b5547831 refactoring serializer.py 2023-11-07 16:41:22 +01:00
Mose Müller
615bf294e1 moves get_attribute_doc to helpers 2023-11-07 16:14:41 +01:00
Mose Müller
b6953251b9 updating helper function 2023-11-06 18:27:41 +01:00
Mose Müller
3440a632ad moving set_nested_value_in_dict to Serializer, renaming module 2023-11-06 18:27:00 +01:00
Mose Müller
4ef4bab36e fixing mypy issues 2023-11-06 18:25:40 +01:00
Mose Müller
567617f4e6 docs: Updating Readme 2023-11-06 17:35:47 +01:00
Mose Müller
76545b88de removes _filename attribute from DataService (unless specified) 2023-11-06 17:30:22 +01:00
Mose Müller
f38df58842 updates logging message formatting 2023-11-06 17:22:42 +01:00
Mose Müller
d057710b60 adds StateManager tests 2023-11-06 17:22:26 +01:00
Mose Müller
f071bda35f adds data service cache tests 2023-11-06 15:08:36 +01:00
Mose Müller
2b304cba03 docs: updating docstrings 2023-11-06 15:08:15 +01:00
Mose Müller
f88493d97c fix: removes monkey path of emit_notification, adapts affected tests 2023-11-06 13:46:08 +01:00
Mose Müller
53ce51991f initializes cache in the constructor of the DataServiceCache 2023-11-06 11:13:44 +01:00
Mose Müller
0385e5732e adds docstring 2023-11-06 10:49:41 +01:00
Mose Müller
20a64099a4 updates protocol for additional servers 2023-11-06 10:49:35 +01:00
Mose Müller
16b284da45 adds state manager to additional servers 2023-11-06 10:49:16 +01:00
Mose Müller
2833284239 chore: updating types, removes unused imports 2023-11-06 10:06:08 +01:00
Mose Müller
8d9160d660 adds docstring 2023-11-06 09:58:48 +01:00
Mose Müller
c196c82c52 refactor StateManager: adds cache property for direct access 2023-11-06 09:58:06 +01:00
Mose Müller
d66a3ad015 updates comments and docstrings 2023-11-06 09:54:50 +01:00
Mose Müller
08512e945b adds deprecation warnings to DataService 2023-11-06 09:54:33 +01:00
Mose Müller
e4796102be removes filename argument from DataService constructor 2023-11-06 09:54:23 +01:00
Mose Müller
2fd4d94dbb moves cache from StateManager to DataServiceCache 2023-11-06 09:53:09 +01:00
Mose Müller
78c055acf0 adds DataServiceCache class 2023-11-06 09:50:38 +01:00
Mose Müller
75a69204b5 moves state manager from DataService to Server 2023-11-06 09:32:25 +01:00
Mose Müller
f852dea9e5 Removes state manager from all service instances that have no filename set or are not exposed 2023-11-03 09:55:40 +01:00
Mose Müller
49070a7f38 removing unused imports
removing unused import
2023-11-03 09:55:40 +01:00
Mose Müller
fc7092f14c using StateManger in DataService 2023-11-03 09:55:40 +01:00
Mose Müller
b0254daa17 adds StateManager 2023-11-03 09:55:40 +01:00
Mose Müller
b08a976d2a removes tests from pyright includes, updates poetry.lock 2023-11-03 09:54:46 +01:00
Mose Müller
fccd5a7c36 Merge pull request #62 from tiqi-group/fix/connection_toast_timeout
Fix/connection toast timeout
2023-11-03 09:18:13 +01:00
Mose Müller
d643923fd3 fix: only update connection toast to reconnecting when still disconnected 2023-11-03 09:14:36 +01:00
Mose Müller
3132680c50 removing unnecessary console log commands 2023-11-03 09:14:36 +01:00
Mose Müller
f47a5524b3 Merge pull request #61 from tiqi-group/revert-60-fix/connection_toast_timeout
Revert "Fix/connection toast timeout"
2023-11-03 09:11:59 +01:00
Mose Müller
b32bdabfca Revert "Fix/connection toast timeout" 2023-11-03 09:11:40 +01:00
Mose Müller
c5beee5d50 Merge pull request #60 from tiqi-group/fix/connection_toast_timeout
Fix/connection toast timeout
2023-11-03 08:52:17 +01:00
Mose Müller
55ce32e105 fix: only update connection toast to reconnecting when still disconnected 2023-11-03 08:50:33 +01:00
Mose Müller
621bed94af removing unnecessary console log commands 2023-11-03 08:50:03 +01:00
Mose Müller
a837e1bce8 removing unused imports 2023-11-02 18:25:55 +01:00
Mose Müller
6ab11394fa using StateManger in DataService 2023-11-02 18:22:32 +01:00
Mose Müller
51c4e2f971 adds StateManager 2023-11-02 18:21:43 +01:00
Mose Müller
c5a2b38914 removing duplicate test 2023-11-02 17:50:43 +01:00
Mose Müller
d45b835ea2 docs: correcting docstring 2023-11-02 16:00:54 +01:00
Mose Müller
d2c0b6968e Merge pull request #58 from tiqi-group/37-update-task-status-in-frontend-when-restarting-the-service
Service data will be fetched as soon as the client connects to the websocket server
2023-11-02 15:47:30 +01:00
Mose Müller
728fe958cb npm run build 2023-11-02 15:43:18 +01:00
Mose Müller
69c5e0397b fetch data as soon as the client connects to the websocket server 2023-11-02 15:43:11 +01:00
Mose Müller
7f402b45e7 docs: adding docstring to ConnectionToast 2023-11-02 15:34:07 +01:00
Mose Müller
c4056d3ca8 chore: formatting, renaming 2023-11-02 15:31:46 +01:00
Mose Müller
c13166dddb Merge pull request #57 from tiqi-group/feat/adding_connection_toast
adds connection toast component to app
2023-11-02 15:26:47 +01:00
Mose Müller
47d64243c3 adds connection toast component to app 2023-11-02 15:23:31 +01:00
Mose Müller
f01ef057bf Merge pull request #56 from tiqi-group/cleanup/refactoring_serialization
Refactors DataService serialization
2023-11-02 14:36:12 +01:00
Mose Müller
6804cdf3b1 refactoring Serializer class 2023-11-02 14:33:16 +01:00
Mose Müller
2b57df5aac adds tests for serialization (and moves tests from test_data_service) 2023-11-02 14:11:08 +01:00
Mose Müller
2eb0eb84cf moves serialization into separate class in the utils module 2023-11-02 14:10:33 +01:00
Mose Müller
f8495dc949 Merge pull request #55 from tiqi-group/50-problem-with-negative-number
feat: pressing "-" at the start of a number component toggles the sign
2023-10-30 14:39:59 +01:00
Mose Müller
9ac6e2c56a npm run build 2023-10-30 14:37:25 +01:00
Mose Müller
8ae0b7818b feat (frontend): pressing "-" at the beginning of a number component will add a minus sign 2023-10-30 14:36:52 +01:00
Mose Müller
61c6585ac6 Merge pull request #54 from tiqi-group/fix/frontend-div-ids
Fix: frontend div ids adhere to html guidelines now
2023-10-30 14:21:55 +01:00
Mose Müller
b6c956fab8 docs: updating Adding_Components description 2023-10-30 14:17:30 +01:00
Mose Müller
743531c434 updating frontend packages and config 2023-10-30 14:15:53 +01:00
Mose Müller
3ecb6384ad npm run build 2023-10-30 14:15:38 +01:00
Mose Müller
1d2325171b fixing eslint errors 2023-10-30 14:14:32 +01:00
Mose Müller
b149c1b411 fix: component ids adhere to html guidelines now 2023-10-30 14:05:39 +01:00
Mose Müller
7e5861ec22 feat: adding utils module (string manipulation function) 2023-10-30 14:04:25 +01:00
Mose Müller
5b4c74f1c2 npm run build 2023-10-30 13:27:02 +01:00
Mose Müller
7dcec88c9a frontend: updating addNotification type hints 2023-10-30 13:26:25 +01:00
Mose Müller
3d42366ada Merge pull request #53 from tiqi-group/27-task-autostart-not-working-in-nested-classes
27 task autostart not working in nested classes
2023-10-25 16:50:55 +02:00
Mose Müller
eb46a088ee chore: refactoring method 2023-10-25 16:48:33 +02:00
Mose Müller
69cd86b601 feat: adds autostart_tasks test 2023-10-25 16:39:11 +02:00
Mose Müller
81f2281002 fix: autostart_tasks capbability in sub-classes 2023-10-25 16:33:29 +02:00
Mose Müller
f7f64bbe92 adding callback_manager tests 2023-10-25 16:23:06 +02:00
Mose Müller
0504a50a08 fix: creates property functions to avoid closure and late binding issue
When having multiple tasks, they all pointed to the one defined last.
2023-10-25 16:23:06 +02:00
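The closure/late-binding issue this commit fixes is a classic Python pitfall; the sketch below is a generic illustration (not pydase's actual code) of why callbacks created in a loop all ended up pointing at the task defined last, and how binding the loop variable as a default argument fixes it.

```python
def make_callbacks_buggy(names):
    callbacks = []
    for name in names:
        # `name` is looked up when the lambda is CALLED, not when it is
        # created, so every callback sees the loop variable's final value.
        callbacks.append(lambda: name)
    return callbacks


def make_callbacks_fixed(names):
    callbacks = []
    for name in names:
        # Binding `name` as a default argument freezes its current value.
        callbacks.append(lambda name=name: name)
    return callbacks


buggy = [cb() for cb in make_callbacks_buggy(["task_a", "task_b"])]
fixed = [cb() for cb in make_callbacks_fixed(["task_a", "task_b"])]
print(buggy)  # ['task_b', 'task_b'] -- all point to the one defined last
print(fixed)  # ['task_a', 'task_b']
```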
Mose Müller
8564df5adc fix: adds start_stop_task callbacks to lists 2023-10-25 16:23:06 +02:00
Mose Müller
a24eb928a8 Updating launch.json (nvim compatible) 2023-10-25 16:22:57 +02:00
Mose Müller
2713dad423 Merge pull request #51 from tiqi-group/49-wrong-types-for-non-standard-saved-variables
fix: loading of ColouredEnum and Quantity from settings file
2023-10-25 16:16:43 +02:00
Mose Müller
6ea4cf3eb7 adds test for loading units from json 2023-10-25 16:15:31 +02:00
Mose Müller
9054f05f30 fix: convert quantity dict to quantity when loading from json 2023-10-25 16:15:19 +02:00
Mose Müller
b790b6a6ca fix: adds ColouredEnum to STANDARD_TYPES 2023-10-25 10:47:15 +02:00
Mose Müller
22f832054e Updating logging message 2023-10-19 17:48:05 +02:00
Mose Müller
2e9ced4e5e Merge pull request #43 from tiqi-group/42-enhanced-signal-handling-for-asyncio-loop
Enhances signal handling, adds force exit capability
2023-10-19 11:14:06 +02:00
Mose Müller
b654c7d176 adds pytest-mock to python dependencies 2023-10-19 11:12:32 +02:00
Mose Müller
b5b2fb8c35 adding signal-handling test 2023-10-19 11:11:56 +02:00
Mose Müller
1bc2bb3605 Enhances signal handling, adds force exit capability 2023-10-19 10:59:00 +02:00
Mose Müller
0a77cc1f36 pytest coverage: do not omit logging.py anymore (after switch from loguru to logging) 2023-10-19 08:02:23 +02:00
Mose Müller
d334ec5284 Merge pull request #41 from tiqi-group/39-feat-add-customcss-option-to-pydaseserver
adds custom css option to pydase.Server
2023-10-17 17:04:30 +02:00
Mose Müller
d3a74a734a updating Readme 2023-10-17 13:13:13 +02:00
Mose Müller
43e0c72018 updating comments 2023-10-17 12:52:08 +02:00
Mose Müller
27b430333a ignoring flake8 error 2023-10-17 12:52:00 +02:00
Mose Müller
e25acb7e59 removing unused web server functions 2023-10-17 12:49:18 +02:00
Mose Müller
89f281bd3b Merge pull request #40 from tiqi-group/turn-of-frontend-notifications-by-default
turns of frontend notifications by default
2023-10-17 12:45:22 +02:00
Mose Müller
829e73e2e7 npm run build 2023-10-17 11:50:35 +02:00
Mose Müller
04b9976a3b turns of frontend notifications by default 2023-10-17 11:49:42 +02:00
Mose Müller
785ed92b45 adds link element to frontend header if services exposes /custom.css endpoint 2023-10-17 11:48:07 +02:00
Mose Müller
6e14837e15 adding custom.css endpoint to web server 2023-10-17 11:47:34 +02:00
Mose Müller
5ad15c1cae frontend: fix div ids 2023-10-17 11:45:50 +02:00
Mose Müller
c1f0b7b74d using logger instead of print statement 2023-10-16 17:29:00 +02:00
Mose Müller
5badd86d5a chore: formatting 2023-10-16 17:26:26 +02:00
Mose Müller
b5953f13f7 fix: updating uvicorn logger config 2023-10-16 17:24:51 +02:00
Mose Müller
a3c2672458 docs: updating readme 2023-10-16 17:17:39 +02:00
Mose Müller
7a78713388 Merge pull request #35 from tiqi-group/34-remove-loguru-dependency-and-use-std-logging
34 remove loguru dependency and use std logging
2023-10-16 17:16:33 +02:00
Mose Müller
8a8375735a extend simple logging example 2023-10-16 17:11:46 +02:00
Mose Müller
e61b2a4969 fix: pyright issue 2023-10-16 17:06:11 +02:00
Mose Müller
453076da86 removing loguru python dependency 2023-10-16 16:58:16 +02:00
Mose Müller
886b086180 docs: update Readme 2023-10-16 15:52:45 +02:00
Mose Müller
7b04298ead add logging tests 2023-10-16 15:52:09 +02:00
Mose Müller
c6a96ba6c0 update tests 2023-10-16 15:52:04 +02:00
Mose Müller
5d7a7c6bdb update logging module 2023-10-16 15:51:52 +02:00
Mose Müller
1241d7a128 using logging instead of loguru 2023-10-16 15:51:37 +02:00
Mose Müller
cdd60190a7 Merge pull request #33 from tiqi-group/32-configure-pint-to-autoconvert-offset-unit-to-base-unit
configures pint to autoconvert offset units to base units
2023-10-16 12:07:53 +02:00
Mose Müller
d144b6c42b configures pint to autoconvert offset units to base units 2023-10-16 12:05:07 +02:00
Mose Müller
4abea8785c Merge pull request #30 from tiqi-group/29-protected-lists-crash-pydase
fix: removes notification for updating protected lists
2023-10-12 14:25:23 +02:00
Mose Müller
dbc975bd85 updating version 2023-10-12 14:24:46 +02:00
Mose Müller
b04ad0c6a3 fix: removes notification for updating protected lists 2023-10-12 14:12:48 +02:00
Mose Müller
48e8b7dbaf Update docs 2023-10-11 14:30:49 +02:00
Mose Müller
aa85f6453f update version to v0.2.0 2023-10-11 14:20:35 +02:00
Mose Müller
343354e0ee Update README.md 2023-10-11 14:17:40 +02:00
Mose Müller
b38bb05c69 Merge pull request #28 from tiqi-group/5-adding-status-component
5 adding coloured enum component
2023-10-11 14:15:34 +02:00
Mose Müller
a0dab630f9 Update README.md 2023-10-11 14:03:07 +02:00
Mose Müller
a9db7848f7 fix: pytest failed after moving from StrEnum to Enum 2023-10-11 14:03:07 +02:00
Mose Müller
a8b14180ad fix: using Enum instead of StrEnum (>=3.11 only) 2023-10-11 14:03:07 +02:00
Mose Müller
26a366842a frontend: npm run build 2023-10-11 14:03:07 +02:00
Mose Müller
b0e7de2d2c docs: updating Readme 2023-10-11 14:03:07 +02:00
Mose Müller
bbcba8b39f test: adding test for ColouredEnum component 2023-10-11 14:03:07 +02:00
Mose Müller
34e46e05ee feat: adding ColouredEnum component 2023-10-11 14:03:07 +02:00
Mose Müller
93c2f5ab70 docs: updating mkdocs documentation
- adding user guide section
- removing "baselevel: 4"
2023-10-11 13:58:55 +02:00
Mose Müller
106ffbfc40 removing .python-version 2023-10-11 13:52:13 +02:00
Mose Müller
5702adbdbd update version 2023-10-10 14:54:17 +02:00
Mose Müller
e3a7932ac4 Merge pull request #26 from tiqi-group/24-issue-pydase-checks-if-protected-variables-inherit-from-dataservice
fix: only check inheritance of public attributes
2023-10-10 12:54:45 +02:00
Mose Müller
21cd039610 fix: only check inheritance of public attributes 2023-10-10 12:51:50 +02:00
79 changed files with 6283 additions and 6255 deletions

@@ -2,5 +2,3 @@
 exclude_lines =
     pragma: no cover
     if TYPE_CHECKING:
-omit =
-    src/pydase/utils/logging.py

@@ -4,5 +4,5 @@ include = src
 max-line-length = 88
 max-doc-length = 88
 max-complexity = 7
-max-expression-complexity = 5.5
+max-expression-complexity = 7
 use_class_attributes_order_strict_mode=True

.vscode/launch.json

@@ -1,7 +1,4 @@
 {
-    // Use IntelliSense to learn about possible attributes.
-    // Hover to view descriptions of existing attributes.
-    // For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
     "version": "0.2.0",
     "configurations": [
         {
@@ -19,7 +16,7 @@
             "type": "python",
             "request": "launch",
             "module": "bar",
-            "justMyCode": true,
+            "justMyCode": false,
             "env": {
                 "ENVIRONMENT": "development"
             }
@@ -29,7 +26,7 @@
             "request": "launch",
             "name": "react: firefox",
             "url": "http://localhost:3000",
-            "webRoot": "${workspaceFolder}/frontend",
+            "webRoot": "${workspaceFolder}/frontend"
         }
     ]
 }

README.md

@@ -17,8 +17,13 @@
 - [Method Components](#method-components)
 - [DataService Instances (Nested Classes)](#dataservice-instances-nested-classes)
 - [Custom Components (`pydase.components`)](#custom-components-pydasecomponents)
+- [`Image`](#image)
+- [`NumberSlider`](#numberslider)
+- [`ColouredEnum`](#colouredenum)
 - [Extending with New Components](#extending-with-new-components)
+- [Customizing Web Interface Style](#customizing-web-interface-style)
 - [Understanding Service Persistence](#understanding-service-persistence)
+- [Controlling Property State Loading with `@load_state`](#controlling-property-state-loading-with-load_state)
 - [Understanding Tasks in pydase](#understanding-tasks-in-pydase)
 - [Understanding Units in pydase](#understanding-units-in-pydase)
 - [Changing the Log Level](#changing-the-log-level)
@@ -29,18 +34,21 @@
 ## Features

 <!-- no toc -->
-* [Simple data service definition through class-based interface](#defining-a-dataService)
-* [Integrated web interface for interactive access and control of your data service](#accessing-the-web-interface)
-* [Support for `rpyc` connections, allowing for programmatic control and interaction with your service](#connecting-to-the-service-using-rpyc)
-* [Component system bridging Python backend with frontend visual representation](#understanding-the-component-system)
-* [Saving and restoring the service state for service persistence](#understanding-service-persistence)
-* [Automated task management with built-in start/stop controls and optional autostart](#understanding-tasks-in-pydase)
-* [Support for units](#understanding-units-in-pydase)
+- [Simple data service definition through class-based interface](#defining-a-dataService)
+- [Integrated web interface for interactive access and control of your data service](#accessing-the-web-interface)
+- [Support for `rpyc` connections, allowing for programmatic control and interaction with your service](#connecting-to-the-service-using-rpyc)
+- [Component system bridging Python backend with frontend visual representation](#understanding-the-component-system)
+- [Customizable styling for the web interface through user-defined CSS](#customizing-web-interface-style)
+- [Saving and restoring the service state for service persistence](#understanding-service-persistence)
+- [Automated task management with built-in start/stop controls and optional autostart](#understanding-tasks-in-pydase)
+- [Support for units](#understanding-units-in-pydase)
 <!-- * Event-based callback functionality for real-time updates
-* Support for additional servers for specific use-cases -->
+- Support for additional servers for specific use-cases -->

 ## Installation

 <!--installation-start-->

 Install pydase using [`poetry`](https://python-poetry.org/):

 ```bash
@@ -52,10 +60,13 @@ or `pip`:
 ```bash
 pip install pydase
 ```

 <!--installation-end-->

 ## Usage

 <!--usage-start-->

 Using `pydase` involves three main steps: defining a `DataService` subclass, running the server, and then connecting to the service either programmatically using `rpyc` or through the web interface.

 ### Defining a DataService
@@ -129,7 +140,7 @@ if __name__ == "__main__":
     Server(service).run()
 ```

-This will start the server, making your Device service accessible via RPC and a web server at http://localhost:8001.
+This will start the server, making your Device service accessible via RPC and a web server at [http://localhost:8001](http://localhost:8001).

 ### Accessing the Web Interface
@@ -156,14 +167,19 @@ print(client.voltage)  # prints 5.0
 ```

 In this example, replace `<ip_addr>` with the IP address of the machine where the service is running. After establishing a connection, you can interact with the service attributes as if they were local attributes.

 <!--usage-end-->

 ## Understanding the Component System

+<!-- Component User Guide Start -->
+
 In `pydase`, components are fundamental building blocks that bridge the Python backend logic with frontend visual representation and interactions. This system can be understood based on the following categories:

 ### Built-in Type and Enum Components

 `pydase` automatically maps standard Python data types to their corresponding frontend components:

 - `str`: Translated into a `StringComponent` on the frontend.
 - `int` and `float`: Manifested as the `NumberComponent`.
 - `bool`: Rendered as a `ButtonComponent`.
@@ -173,6 +189,7 @@ In `pydase`, components are fundamental building blocks that bridge the Python b
 ### Method Components

 Methods within the `DataService` class have frontend representations:

 - Regular Methods: These are rendered as a `MethodComponent` in the frontend, allowing users to execute the method via an "execute" button.
 - Asynchronous Methods: These are manifested as the `AsyncMethodComponent` with "start"/"stop" buttons to manage the execution of [tasks](#understanding-tasks-in-pydase).
@@ -221,6 +238,7 @@ if __name__ == "__main__":
 **Note** that defining classes within `DataService` classes is not supported (see [this issue](https://github.com/tiqi-group/pydase/issues/16)).

 ### Custom Components (`pydase.components`)

 The custom components in `pydase` have two main parts:

 - A **Python Component Class** in the backend, implementing the logic needed to set, update, and manage the component's state and data.
@@ -228,7 +246,11 @@ The custom components in `pydase` have two main parts:
 Below are the components available in the `pydase.components` module, accompanied by their Python usage:

-- `Image`: This component allows users to display and update images within the application.
+#### `Image`
+
+This component provides a versatile interface for displaying images within the application. Users can update and manage images from various sources, including local paths, URLs, and even matplotlib figures.
+
+The component offers methods to load images seamlessly, ensuring that visual content is easily integrated and displayed within the data service.

 ```python
 import matplotlib.pyplot as plt
@@ -262,7 +284,11 @@ Below are the components available in the `pydase.components` module, accompanie
 ![Image Component](docs/images/Image_component.png)

-- `NumberSlider`: An interactive slider component to adjust numerical values, including floats and integers, on the frontend while synchronizing the data with the backend in real-time.
+#### `NumberSlider`
+
+This component provides an interactive slider interface for adjusting numerical values on the frontend. It supports both floats and integers. The values adjusted on the frontend are synchronized with the backend in real-time, ensuring consistent data representation.
+
+The slider can be customized with initial values, minimum and maximum limits, and step sizes to fit various use cases.

 ```python
 import pydase
@@ -270,7 +296,7 @@ Below are the components available in the `pydase.components` module, accompanie
 class MyService(pydase.DataService):
-    slider = NumberSlider(value=3.5, min=0, max=10, step_size=0.1)
+    slider = NumberSlider(value=3.5, min=0, max=10, step_size=0.1, type="float")

 if __name__ == "__main__":
@@ -280,15 +306,82 @@ Below are the components available in the `pydase.components` module, accompanie
![Slider Component](docs/images/Slider_component.png) ![Slider Component](docs/images/Slider_component.png)
#### `ColouredEnum`
This component provides a way to visually represent different states or categories in a data service using colour-coded options. It behaves similarly to a standard `Enum`, but the values encode colours in a format understood by CSS. The colours can be defined using various methods like Hexadecimal, RGB, HSL, and more.
If the property associated with the `ColouredEnum` has a setter function, the keys of the enum will be rendered as a dropdown menu, allowing users to interact and select different options. Without a setter function, the selected key will simply be displayed as a coloured box with text inside, serving as a visual indicator.
```python
import pydase
import pydase.components as pyc
class MyStatus(pyc.ColouredEnum):
PENDING = "#FFA500" # Hexadecimal colour (Orange)
RUNNING = "#0000FF80" # Hexadecimal colour with transparency (Blue)
PAUSED = "rgb(169, 169, 169)" # RGB colour (Dark Gray)
RETRYING = "rgba(255, 255, 0, 0.3)" # RGB colour with transparency (Yellow)
COMPLETED = "hsl(120, 100%, 50%)" # HSL colour (Green)
FAILED = "hsla(0, 100%, 50%, 0.7)" # HSL colour with transparency (Red)
CANCELLED = "SlateGray" # Cross-browser colour name (Slate Gray)
class StatusTest(pydase.DataService):
_status = MyStatus.RUNNING
@property
def status(self) -> MyStatus:
return self._status
@status.setter
def status(self, value: MyStatus) -> None:
# do something ...
self._status = value
# Modifying or accessing the status value:
my_service = StatusExample()
my_service.status = MyStatus.FAILED
```
![ColouredEnum Component](docs/images/ColouredEnum_component.png)
#### Extending with New Components #### Extending with New Components
Users can also extend the library by creating custom components. This involves defining the behavior on the Python backend and the visual representation on the frontend. For those looking to introduce new components, the [guide on adding components](https://pydase.readthedocs.io/en/latest/dev-guide/Adding_Components/) provides detailed steps on achieving this. Users can also extend the library by creating custom components. This involves defining the behavior on the Python backend and the visual representation on the frontend. For those looking to introduce new components, the [guide on adding components](https://pydase.readthedocs.io/en/latest/dev-guide/Adding_Components/) provides detailed steps on achieving this.
<!-- Component User Guide End -->
## Customizing Web Interface Style
`pydase` allows you to enhance the user experience by customizing the web interface's appearance. You can apply your own styles globally across the web interface by passing a custom CSS file to the server during initialization.
Here's how you can use this feature:
1. Prepare your custom CSS file with the desired styles.
2. When initializing your server, use the `css` parameter of the `Server` class to specify the path to your custom CSS file.
```python
from pydase import Server, DataService
class Device(DataService):
    # ... your service definition ...


if __name__ == "__main__":
    service = Device()
    Server(service, css="path/to/your/custom.css").run()
```
This will apply the styles defined in `custom.css` to the web interface, allowing you to maintain branding consistency or improve visual accessibility.
Please ensure that the CSS file path is accessible from the server's running location. Relative or absolute paths can be used depending on your setup.
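Since the server simply reads the stylesheet from the given path, it can help to resolve that path explicitly before starting the service. A small stdlib-only sketch (the file name `custom.css` here is only an example):

```python
from pathlib import Path

# Create a tiny stylesheet for illustration purposes.
css_file = Path("custom.css")
css_file.write_text("body { background-color: #f5f5f5; }\n")

# Resolve to an absolute path; this string is what you would pass as the
# `css` argument of Server.
css_path = str(css_file.resolve())
print(css_path)
```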
## Understanding Service Persistence
`pydase` allows you to easily persist the state of your service by saving it to a file. This is especially useful when you want to maintain the service's state across different runs.
To save the state of your service, pass a `filename` keyword argument to the constructor of the `pydase.Server` class. If the file specified by `filename` does not exist, the state manager will create this file and store its state in it when the service is shut down. If the file already exists, the state manager will load the state from this file, setting the values of its attributes to the values stored in the file.
Here's an example:
```python
from pydase import DataService, Server


class Device(DataService):
    # ... defining the Device class ...


if __name__ == "__main__":
    service = Device()
    Server(service, filename="device_state.json").run()
```
In this example, the state of the `Device` service will be saved to `device_state.json` when the service is shut down. If `device_state.json` exists when the server is started, the state manager will restore the state of the service from this file.
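Conceptually, the state file is a serialised snapshot of the service's attribute values. A generic stdlib sketch of the save-on-shutdown / load-on-startup round trip (illustrative only; the attribute names are hypothetical and pydase's actual file layout may differ):

```python
import json
from pathlib import Path

state_file = Path("device_state.json")

# Save: serialise the current attribute values on shutdown.
state = {"voltage": 5.0, "enabled": True}  # hypothetical attributes
state_file.write_text(json.dumps(state))

# Load: restore the values on the next startup.
restored = json.loads(state_file.read_text())
print(restored["voltage"])
```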
### Controlling Property State Loading with `@load_state`
By default, the state manager only restores values for public attributes of your service. If you also want a property's value to be restored, apply the `@load_state` decorator to its setter. This tells the state manager that the value of the property should be loaded from the state file.
Here is how you can apply the `@load_state` decorator:
```python
from pydase import DataService
from pydase.data_service.state_manager import load_state


class Device(DataService):
    _name = "Default Device Name"

    @property
    def name(self) -> str:
        return self._name

    @name.setter
    @load_state
    def name(self, value: str) -> None:
        self._name = value
```
With the `@load_state` decorator applied to the `name` property setter, the state manager will load and apply the `name` property's value from the file storing the state upon server startup, assuming it exists.
Note: If the service class structure has changed since the last time its state was saved, only the attributes and properties decorated with `@load_state` that have remained the same will be restored from the settings file.
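Conceptually, such a decorator only needs to tag the setter so the state manager can detect it later. The following stand-alone sketch mimics that mechanism in plain Python; it is an illustration of the idea, not `pydase`'s actual implementation:

```python
# Sketch of the mechanism behind a marker decorator like @load_state
# (illustrative only -- NOT pydase's implementation).
def load_state(func):
    func._load_state = True  # tag the setter so a state manager can find it
    return func


class Demo:
    _name = "Default Device Name"

    @property
    def name(self) -> str:
        return self._name

    @name.setter
    @load_state
    def name(self, value: str) -> None:
        self._name = value


# A state manager can now inspect the property's setter for the marker:
print(getattr(Demo.name.fset, "_load_state", False))  # True
```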
## Understanding Tasks in pydase
## Changing the Log Level
You can change the log level of the logger by either
1. (RECOMMENDED) setting the `ENVIRONMENT` environment variable to "production" or "development"
```python
# <your_script.py>
import logging

from pydase.utils.logging import setup_logging

setup_logging("INFO")  # or setup_logging(logging.INFO)
logger = logging.getLogger()

# ... and your log
logger.info("My info message.")
```
## Documentation

For example, for the `Image` component, a template could look like this:
```tsx
import { setAttribute, runMethod } from '../socket'; // use this when your component sets
// values of attributes or runs a method, respectively
import { DocStringComponent } from './DocStringComponent';
import React, { useEffect, useRef, useState } from 'react';
import { Card, Collapse, Image } from 'react-bootstrap';
import { ChevronDown, ChevronRight } from 'react-bootstrap-icons';
import { getIdFromFullAccessPath } from '../utils/stringUtils';
interface ImageComponentProps {
  name: string;
  parentPath: string;
  readOnly: boolean;
  docString: string;
  addNotification: (message: string) => void;
  // Define your component specific props here
  value: string;
  format: string;
export const ImageComponent = React.memo((props: ImageComponentProps) => {
  const renderCount = useRef(0);
  const [open, setOpen] = useState(true); // add this if you want to expand/collapse your component
  const fullAccessPath = parentPath.concat('.' + name);
  const id = getIdFromFullAccessPath(fullAccessPath);
  useEffect(() => {
    renderCount.current++;
  // Your component logic here

  return (
    <div className={'imageComponent'} id={id}>
      {/* Add the Card and Collapse components here if you want to be able to expand and
        collapse your component. */}
      <Card>
```
### Step 3: Emitting Updates to the Backend
React components in the frontend often need to send updates to the backend, particularly when user interactions modify the component's state or data. In `pydase`, we use `socketio` for smooth communication of these changes. To handle updates, we primarily use two events: `setAttribute` for updating attributes, and `runMethod` for executing backend methods. Below is a detailed guide on how to emit these events from your frontend component:
1. **Setup for emitting events**:

   First, ensure you've imported the necessary functions from the `socket` module for both updating attributes and executing methods:

   ```tsx
   import { setAttribute, runMethod } from '../socket';
   ```

2. **Event Parameters**:

   - When using **`setAttribute`**, we send three main pieces of data:
     - `name`: The name of the attribute within the `DataService` instance to update.
     - `parentPath`: The access path for the parent object of the attribute to be updated.
     - `value`: The new value for the attribute, which must match the backend attribute type.
   - For **`runMethod`**, the parameters are slightly different:
     - `name`: The name of the method to be executed in the backend.
     - `parentPath`: Similar to `setAttribute`, it's the access path to the object containing the method.
     - `kwargs`: A dictionary of keyword arguments that the method requires.

3. **Implementation**:

   For illustration, take the `ButtonComponent`. When the button state changes, we want to send this update to the backend:
```tsx
import { setAttribute } from '../socket';
// ... (other imports)

export const ButtonComponent = React.memo((props: ButtonComponentProps) => {
  const { name, parentPath, value } = props;

  const setChecked = (checked: boolean) => {
    setAttribute(name, parentPath, checked);
  };

  return (
});
```
In this example, whenever the button's checked state changes (`onChange` event), we invoke the `setChecked` method, which in turn emits the new state to the backend using `setAttribute`.
### Step 4: Add the New Component to the GenericComponent

# Components Guide
{%
include-markdown "../../README.md"
start="<!-- Component User Guide Start -->"
end="<!-- Component User Guide End -->"
%}

  ],
  "extends": [
    "eslint:recommended",
    "plugin:@typescript-eslint/recommended",
    "prettier"
  ],
  "rules": {
    "prettier/prettier": "error"
  }
}

    "@types/node": "^20.0.0",
    "@types/react": "^18.0.0",
    "@types/react-dom": "^18.0.0",
    "@typescript-eslint/eslint-plugin": "^6.9.0",
    "@typescript-eslint/parser": "^6.9.0",
    "eslint": "^8.52.0",
    "eslint-config-prettier": "^9.0.0",
    "eslint-plugin-prettier": "^5.0.1",
    "prettier": "^3.0.3",
    "typescript": "^4.9.0"
  }
}

input.instantUpdate {
}

.navbarOffset {
  padding-top: 60px !important;
}

.toastContainer {
  position: fixed !important;
  padding: 5px;
}
.notificationToast {
  background-color: rgba(114, 214, 253, 0.5) !important;
}

.exceptionToast {
  background-color: rgba(216, 41, 18, 0.678) !important;
}
.buttonComponent {
  float: left !important;
  margin-right: 10px !important;
}

.stringComponent {
  float: left !important;
  margin-right: 10px !important;
}

.numberComponent {
  float: left !important;
  margin-right: 10px !important;
  width: 270px !important;
}

import {
} from './components/DataServiceComponent';
import './App.css';
import { Notifications } from './components/NotificationsComponent';
import { ConnectionToast } from './components/ConnectionToast';

type ValueType = boolean | string | number | object;
 *
 * If the property to be updated is an object or an array, it is updated
 * recursively.
 */
function updateNestedObject(path: Array<string>, obj: object, value: ValueType) {
  // Base case: If the path is empty, return the new value.
    throw new Error();
  }
};
const App = () => {
  const [state, dispatch] = useReducer(reducer, null);
  const stateRef = useRef(state); // Declare a reference to hold the current state
  const [isInstantUpdate, setIsInstantUpdate] = useState(false);
  const [showSettings, setShowSettings] = useState(false);
  const [showNotification, setShowNotification] = useState(false);
  const [notifications, setNotifications] = useState([]);
  const [exceptions, setExceptions] = useState([]);
  const [connectionStatus, setConnectionStatus] = useState('connecting');
  // Keep the state reference up to date
  useEffect(() => {
  }, [state]);
  useEffect(() => {
    // Allow the user to add a custom css file
    fetch(`http://${hostname}:${port}/custom.css`)
      .then((response) => {
        if (response.ok) {
          // If the file exists, create a link element for the custom CSS
          const link = document.createElement('link');
          link.href = `http://${hostname}:${port}/custom.css`;
          link.type = 'text/css';
          link.rel = 'stylesheet';
          document.head.appendChild(link);
        }
      })
      .catch(console.error); // Handle the error appropriately
    socket.on('connect', () => {
      // Fetch data from the API when the client connects
      fetch(`http://${hostname}:${port}/service-properties`)
        .then((response) => response.json())
        .then((data: DataServiceJSON) => dispatch({ type: 'SET_DATA', data }));
      setConnectionStatus('connected');
    });
    socket.on('disconnect', () => {
      setConnectionStatus('disconnected');
      setTimeout(() => {
        // Only set "reconnecting" if the state is still "disconnected",
        // e.g. not when the client has already reconnected
        setConnectionStatus((currentState) =>
          currentState === 'disconnected' ? 'reconnecting' : currentState
        );
      }, 2000);
    });
    socket.on('notify', onNotify);
    socket.on('exception', onException);
  // While the data is loading
  if (!state) {
    return <ConnectionToast connectionStatus={connectionStatus} />;
  }

  return (
    <>
      <div className="App navbarOffset">
        <DataServiceComponent
          props={state as DataServiceJSON}
          isInstantUpdate={isInstantUpdate}
          addNotification={addNotification}
        />
      </div>
      <ConnectionToast connectionStatus={connectionStatus} />
    </>
  );
};

import React, { useEffect, useRef } from 'react';
import { runMethod } from '../socket';
import { InputGroup, Form, Button } from 'react-bootstrap';
import { DocStringComponent } from './DocStringComponent';
import { getIdFromFullAccessPath } from '../utils/stringUtils';
interface AsyncMethodProps {
  name: string;
  value: Record<string, string>;
  docString?: string;
  hideOutput?: boolean;
  addNotification: (message: string) => void;
}
export const AsyncMethodComponent = React.memo((props: AsyncMethodProps) => {
  const { name, parentPath, docString, value: runningTask, addNotification } = props;
  const renderCount = useRef(0);
  const formRef = useRef(null);
  const id = getIdFromFullAccessPath(parentPath.concat('.' + name));
  useEffect(() => {
    renderCount.current++;
  const execute = async (event: React.FormEvent) => {
    event.preventDefault();
    let method_name: string;
    const kwargs: Record<string, unknown> = {};

    if (runningTask !== undefined && runningTask !== null) {
      method_name = `stop_${name}`;
    } else {
      Object.keys(props.parameters).forEach(
        (name) => (kwargs[name] = event.target[name].value)
      );
      method_name = `start_${name}`;
    }
    runMethod(method_name, parentPath, kwargs);
  };

  const args = Object.entries(props.parameters).map(([name, type], index) => {
  });

  return (
    <div className="align-items-center asyncMethodComponent" id={id}>
      {process.env.NODE_ENV === 'development' && (
        <div>Render count: {renderCount.current}</div>
      )}
      <h5>
        Function: {name}
      </h5>
      <Form onSubmit={execute} ref={formRef}>
        {args}
        <Button id={`button-${id}`} name={name} value={parentPath} type="submit">
          {runningTask ? 'Stop' : 'Start'}
        </Button>
      </Form>

import React, { useEffect, useRef } from 'react';
import { ToggleButton } from 'react-bootstrap';
import { setAttribute } from '../socket';
import { DocStringComponent } from './DocStringComponent';
import { getIdFromFullAccessPath } from '../utils/stringUtils';
interface ButtonComponentProps {
  name: string;
  readOnly: boolean;
  docString: string;
  mapping?: [string, string]; // Enforce a tuple of two strings
  addNotification: (message: string) => void;
}
export const ButtonComponent = React.memo((props: ButtonComponentProps) => {
  const { name, parentPath, value, readOnly, docString, mapping, addNotification } =
    props;
  const buttonName = mapping ? (value ? mapping[0] : mapping[1]) : name;
  const id = getIdFromFullAccessPath(parentPath.concat('.' + name));

  const renderCount = useRef(0);
  }, [props.value]);

  const setChecked = (checked: boolean) => {
    setAttribute(name, parentPath, checked);
  };

  return (
    <div className={'buttonComponent'} id={id}>
      {process.env.NODE_ENV === 'development' && (
        <div>Render count: {renderCount.current}</div>
      )}
      <DocStringComponent docString={docString} />
      <ToggleButton
        id={`toggle-check-${id}`}
        type="checkbox"
        variant={value ? 'success' : 'secondary'}
        checked={value}
        value={parentPath}
        disabled={readOnly}
        onChange={(e) => setChecked(e.currentTarget.checked)}>
        {buttonName}
      </ToggleButton>
    </div>
  );

import React, { useEffect, useRef } from 'react';
import { InputGroup, Form, Row, Col } from 'react-bootstrap';
import { setAttribute } from '../socket';
import { DocStringComponent } from './DocStringComponent';
import { getIdFromFullAccessPath } from '../utils/stringUtils';

interface ColouredEnumComponentProps {
  name: string;
  parentPath: string;
  value: string;
  docString?: string;
  readOnly: boolean;
  enumDict: Record<string, string>;
  addNotification: (message: string) => void;
}

export const ColouredEnumComponent = React.memo((props: ColouredEnumComponentProps) => {
  const { name, parentPath, value, docString, enumDict, readOnly, addNotification } =
    props;
  const renderCount = useRef(0);
  const id = getIdFromFullAccessPath(parentPath.concat('.' + name));

  useEffect(() => {
    renderCount.current++;
  });

  useEffect(() => {
    addNotification(`${parentPath}.${name} changed to ${value}.`);
  }, [props.value]);

  const handleValueChange = (newValue: string) => {
    setAttribute(name, parentPath, newValue);
  };

  return (
    <div className={'enumComponent'} id={id}>
      {process.env.NODE_ENV === 'development' && (
        <div>Render count: {renderCount.current}</div>
      )}
      <DocStringComponent docString={docString} />
      <Row>
        <Col className="d-flex align-items-center">
          <InputGroup.Text>{name}</InputGroup.Text>
          {readOnly ? (
            // Display the Form.Control when readOnly is true
            <Form.Control
              value={value}
              disabled={true}
              style={{ backgroundColor: enumDict[value] }}
            />
          ) : (
            // Display the Form.Select when readOnly is false
            <Form.Select
              aria-label="coloured-enum-select"
              value={value}
              style={{ backgroundColor: enumDict[value] }}
              onChange={(event) => handleValueChange(event.target.value)}>
              {Object.entries(enumDict).map(([key]) => (
                <option key={key} value={key}>
                  {key}
                </option>
              ))}
            </Form.Select>
          )}
        </Col>
      </Row>
    </div>
  );
});

import React, { useEffect, useState } from 'react';
import { Toast, Button, ToastContainer } from 'react-bootstrap';

type ConnectionToastProps = {
  connectionStatus: string;
};

/**
 * ConnectionToast Component
 *
 * Displays a toast notification that reflects the current connection status.
 *
 * Props:
 * - connectionStatus (string): The current status of the connection, which can be
 *   'connecting', 'connected', 'disconnected', or 'reconnecting'. The component uses
 *   this status to determine the message, background colour (`bg`), and auto-hide
 *   delay of the toast.
 *
 * The toast automatically appears on changes to the `connectionStatus` prop and
 * provides a close button to dismiss it manually. It uses `react-bootstrap`'s Toast
 * component to show the connection status in a stylized format, and Bootstrap's
 * utility classes for alignment and spacing.
 */
export const ConnectionToast = React.memo(
  ({ connectionStatus }: ConnectionToastProps) => {
    const [show, setShow] = useState(true);

    useEffect(() => {
      setShow(true);
    }, [connectionStatus]);

    const handleClose = () => setShow(false);

    const getToastContent = (): {
      message: string;
      bg: string; // bootstrap uses the `bg` prop for the background colour
      delay: number | undefined;
    } => {
      switch (connectionStatus) {
        case 'connecting':
          return { message: 'Connecting...', bg: 'info', delay: undefined };
        case 'connected':
          return { message: 'Connected', bg: 'success', delay: 1000 };
        case 'disconnected':
          return { message: 'Disconnected', bg: 'danger', delay: undefined };
        case 'reconnecting':
          return { message: 'Reconnecting...', bg: 'info', delay: undefined };
        default:
          return { message: '', bg: 'info', delay: undefined };
      }
    };

    const { message, bg, delay } = getToastContent();

    return (
      <ToastContainer position="bottom-center" className="toastContainer">
        <Toast
          show={show}
          onClose={handleClose}
          delay={delay}
          autohide={delay !== undefined}
          bg={bg}>
          <Toast.Body className="d-flex justify-content-between">
            {message}
            <Button variant="close" size="sm" onClick={handleClose} />
          </Toast.Body>
        </Toast>
      </ToastContainer>
    );
  }
);

import React from 'react';
import { Card, Collapse } from 'react-bootstrap';
import { ChevronDown, ChevronRight } from 'react-bootstrap-icons';
import { Attribute, GenericComponent } from './GenericComponent';
import { getIdFromFullAccessPath } from '../utils/stringUtils';
type DataServiceProps = {
  name: string;
  props: DataServiceJSON;
  parentPath?: string;
  isInstantUpdate: boolean;
  addNotification: (message: string) => void;
};

export type DataServiceJSON = Record<string, Attribute>;
export const DataServiceComponent = React.memo(
  ({
    name,
    props,
    parentPath = 'DataService',
    isInstantUpdate,
    addNotification
  }: DataServiceProps) => {
    const [open, setOpen] = useState(true);
    let fullAccessPath = parentPath;
    if (name) {
      fullAccessPath = parentPath.concat('.' + name);
    }
    const id = getIdFromFullAccessPath(fullAccessPath);
    return (
      <div className="dataServiceComponent" id={id}>
        <Card className="mb-3">
          <Card.Header
            onClick={() => setOpen(!open)}
            style={{ cursor: 'pointer' }} // Change cursor style on hover
          >
            {fullAccessPath} {open ? <ChevronDown /> : <ChevronRight />}
          </Card.Header>
          <Collapse in={open}>
            <Card.Body>
              key={key}
              attribute={value}
              name={key}
              parentPath={fullAccessPath}
              isInstantUpdate={isInstantUpdate}
              addNotification={addNotification}
            />

import React, { useEffect, useRef } from 'react';
import { InputGroup, Form, Row, Col } from 'react-bootstrap';
import { setAttribute } from '../socket';
import { DocStringComponent } from './DocStringComponent';
interface EnumComponentProps { interface EnumComponentProps {
@@ -9,7 +9,7 @@ interface EnumComponentProps {
value: string; value: string;
docString?: string; docString?: string;
enumDict: Record<string, string>; enumDict: Record<string, string>;
addNotification: (string) => void; addNotification: (message: string) => void;
} }
export const EnumComponent = React.memo((props: EnumComponentProps) => { export const EnumComponent = React.memo((props: EnumComponentProps) => {
@@ -33,13 +33,13 @@ export const EnumComponent = React.memo((props: EnumComponentProps) => {
}, [props.value]); }, [props.value]);
const handleValueChange = (newValue: string) => { const handleValueChange = (newValue: string) => {
emit_update(name, parentPath, newValue); setAttribute(name, parentPath, newValue);
}; };
return ( return (
<div className={'enumComponent'} id={parentPath.concat('.' + name)}> <div className={'enumComponent'} id={parentPath.concat('.' + name)}>
{process.env.NODE_ENV === 'development' && ( {process.env.NODE_ENV === 'development' && (
<p>Render count: {renderCount.current}</p> <div>Render count: {renderCount.current}</div>
)} )}
<DocStringComponent docString={docString} /> <DocStringComponent docString={docString} />
<Row> <Row>


@@ -9,6 +9,7 @@ import { StringComponent } from './StringComponent';
 import { ListComponent } from './ListComponent';
 import { DataServiceComponent, DataServiceJSON } from './DataServiceComponent';
 import { ImageComponent } from './ImageComponent';
+import { ColouredEnumComponent } from './ColouredEnumComponent';

 type AttributeType =
   | 'str'
@@ -21,7 +22,8 @@ type AttributeType =
   | 'DataService'
   | 'Enum'
   | 'NumberSlider'
-  | 'Image';
+  | 'Image'
+  | 'ColouredEnum';

 type ValueType = boolean | string | number | object;
 export interface Attribute {
@@ -38,7 +40,7 @@ type GenericComponentProps = {
   name: string;
   parentPath: string;
   isInstantUpdate: boolean;
-  addNotification: (string) => void;
+  addNotification: (message: string) => void;
 };

 export const GenericComponent = React.memo(
@@ -151,8 +153,9 @@ export const GenericComponent = React.memo(
     } else if (attribute.type === 'DataService') {
       return (
         <DataServiceComponent
+          name={name}
           props={attribute.value as DataServiceJSON}
-          parentPath={parentPath.concat('.', name)}
+          parentPath={parentPath}
           isInstantUpdate={isInstantUpdate}
           addNotification={addNotification}
         />
@@ -181,6 +184,19 @@ export const GenericComponent = React.memo(
           addNotification={addNotification}
         />
       );
+    } else if (attribute.type === 'ColouredEnum') {
+      console.log(attribute);
+      return (
+        <ColouredEnumComponent
+          name={name}
+          parentPath={parentPath}
+          docString={attribute.doc}
+          value={String(attribute.value)}
+          readOnly={attribute.readonly}
+          enumDict={attribute.enum}
+          addNotification={addNotification}
+        />
+      );
     } else {
       return <div key={name}>{name}</div>;
     }


@@ -2,6 +2,7 @@ import React, { useEffect, useRef, useState } from 'react';
 import { Card, Collapse, Image } from 'react-bootstrap';
 import { DocStringComponent } from './DocStringComponent';
 import { ChevronDown, ChevronRight } from 'react-bootstrap-icons';
+import { getIdFromFullAccessPath } from '../utils/stringUtils';

 interface ImageComponentProps {
   name: string;
@@ -10,7 +11,7 @@ interface ImageComponentProps {
   readOnly: boolean;
   docString: string;
   format: string;
-  addNotification: (string) => void;
+  addNotification: (message: string) => void;
 }

 export const ImageComponent = React.memo((props: ImageComponentProps) => {
@@ -18,6 +19,7 @@ export const ImageComponent = React.memo((props: ImageComponentProps) => {
   const renderCount = useRef(0);
   const [open, setOpen] = useState(true);
+  const id = getIdFromFullAccessPath(parentPath.concat('.' + name));

   useEffect(() => {
     renderCount.current++;
@@ -28,7 +30,10 @@ export const ImageComponent = React.memo((props: ImageComponentProps) => {
   }, [props.value]);

   return (
-    <div className={'imageComponent'} id={parentPath.concat('.' + name)}>
+    <div className={'imageComponent'} id={id}>
+      {process.env.NODE_ENV === 'development' && (
+        <div>Render count: {renderCount.current}</div>
+      )}
       <Card>
         <Card.Header
           onClick={() => setOpen(!open)}


@@ -1,6 +1,7 @@
 import React, { useEffect, useRef } from 'react';
 import { DocStringComponent } from './DocStringComponent';
 import { Attribute, GenericComponent } from './GenericComponent';
+import { getIdFromFullAccessPath } from '../utils/stringUtils';

 interface ListComponentProps {
   name: string;
@@ -8,7 +9,7 @@ interface ListComponentProps {
   value: Attribute[];
   docString: string;
   isInstantUpdate: boolean;
-  addNotification: (string) => void;
+  addNotification: (message: string) => void;
 }

 export const ListComponent = React.memo((props: ListComponentProps) => {
@@ -16,15 +17,16 @@ export const ListComponent = React.memo((props: ListComponentProps) => {
     props;
   const renderCount = useRef(0);
+  const id = getIdFromFullAccessPath(parentPath.concat('.' + name));

   useEffect(() => {
     renderCount.current++;
   }, [props]);

   return (
-    <div className={'listComponent'} id={parentPath.concat(name)}>
+    <div className={'listComponent'} id={id}>
       {process.env.NODE_ENV === 'development' && (
-        <p>Render count: {renderCount.current}</p>
+        <div>Render count: {renderCount.current}</div>
       )}
       <DocStringComponent docString={docString} />
       {value.map((item, index) => {


@@ -1,7 +1,8 @@
 import React, { useState, useEffect, useRef } from 'react';
-import { emit_update } from '../socket';
+import { runMethod } from '../socket';
 import { Button, InputGroup, Form, Collapse } from 'react-bootstrap';
 import { DocStringComponent } from './DocStringComponent';
+import { getIdFromFullAccessPath } from '../utils/stringUtils';

 interface MethodProps {
   name: string;
@@ -9,7 +10,7 @@ interface MethodProps {
   parameters: Record<string, string>;
   docString?: string;
   hideOutput?: boolean;
-  addNotification: (string) => void;
+  addNotification: (message: string) => void;
 }

 export const MethodComponent = React.memo((props: MethodProps) => {
@@ -19,6 +20,7 @@ export const MethodComponent = React.memo((props: MethodProps) => {
   const [hideOutput, setHideOutput] = useState(false);
   // Add a new state variable to hold the list of function calls
   const [functionCalls, setFunctionCalls] = useState([]);
+  const id = getIdFromFullAccessPath(parentPath.concat('.' + name));

   useEffect(() => {
     renderCount.current++;
@@ -44,18 +46,21 @@ export const MethodComponent = React.memo((props: MethodProps) => {
   const execute = async (event: React.FormEvent) => {
     event.preventDefault();
-    const args = {};
+    const kwargs = {};
     Object.keys(props.parameters).forEach(
-      (name) => (args[name] = event.target[name].value)
+      (name) => (kwargs[name] = event.target[name].value)
     );
-    emit_update(name, parentPath, { args: args }, (ack) => {
+    runMethod(name, parentPath, kwargs, (ack) => {
       // Update the functionCalls state with the new call if we get an acknowledge msg
       if (ack !== undefined) {
-        setFunctionCalls((prevCalls) => [...prevCalls, { name, args, result: ack }]);
+        setFunctionCalls((prevCalls) => [
+          ...prevCalls,
+          { name, args: kwargs, result: ack }
+        ]);
       }
     });
-    triggerNotification(args);
+    triggerNotification(kwargs);
   };

   const args = Object.entries(props.parameters).map(([name, type], index) => {
@@ -69,11 +74,9 @@ export const MethodComponent = React.memo((props: MethodProps) => {
   });

   return (
-    <div
-      className="align-items-center methodComponent"
-      id={parentPath.concat('.' + name)}>
+    <div className="align-items-center methodComponent" id={id}>
       {process.env.NODE_ENV === 'development' && (
-        <p>Render count: {renderCount.current}</p>
+        <div>Render count: {renderCount.current}</div>
       )}
       <h5 onClick={() => setHideOutput(!hideOutput)} style={{ cursor: 'pointer' }}>
         Function: {name}
@@ -81,11 +84,9 @@ export const MethodComponent = React.memo((props: MethodProps) => {
       </h5>
       <Form onSubmit={execute}>
         {args}
-        <div>
         <Button variant="primary" type="submit">
           Execute
         </Button>
-        </div>
       </Form>
       <Collapse in={!hideOutput}>
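Aside: the `execute` refactor above renames the collected form values from `args` to `kwargs` before handing them to `runMethod`. The collection step amounts to the following standalone sketch, where a plain object stands in for the submitted form's named inputs (the parameter names and values here are hypothetical):

```typescript
// Hypothetical stand-in for the form's named inputs (event.target[name]).
const formValues: Record<string, { value: string }> = {
  voltage: { value: '5' },
  channel: { value: '2' }
};

// Parameter names as declared on the method (types are ignored here).
const parameters: Record<string, string> = { voltage: 'float', channel: 'int' };

// Mirror of the component's kwargs collection: one entry per declared parameter.
const kwargs: Record<string, string> = {};
Object.keys(parameters).forEach((name) => {
  kwargs[name] = formValues[name].value;
});

console.log(kwargs); // { voltage: '5', channel: '2' }
```

Only parameters declared on the method are collected, so stray form fields never reach the backend.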


@@ -25,10 +25,7 @@ export const Notifications = React.memo((props: NotificationProps) => {
   } = props;

   return (
-    <ToastContainer
-      className="navbarOffset toastContainer"
-      position="top-end"
-      style={{ position: 'fixed' }}>
+    <ToastContainer className="navbarOffset toastContainer" position="top-end">
       {showNotification &&
         notifications.map((notification) => (
           <Toast


@@ -1,8 +1,9 @@
 import React, { useEffect, useRef, useState } from 'react';
 import { Form, InputGroup } from 'react-bootstrap';
-import { emit_update } from '../socket';
+import { setAttribute } from '../socket';
 import { DocStringComponent } from './DocStringComponent';
 import '../App.css';
+import { getIdFromFullAccessPath } from '../utils/stringUtils';

 // TODO: add button functionality
@@ -22,7 +23,7 @@ interface NumberComponentProps {
     value: number,
     callback?: (ack: unknown) => void
   ) => void;
-  addNotification: (string) => void;
+  addNotification: (message: string) => void;
 }

 // TODO: highlight the digit that is being changed by setting both selectionStart and
@@ -30,8 +31,8 @@ interface NumberComponentProps {
 const handleArrowKey = (
   key: string,
   value: string,
-  selectionStart: number,
-  selectionEnd: number
+  selectionStart: number
+  // selectionEnd: number
 ) => {
   // Split the input value into the integer part and decimal part
   const parts = value.split('.');
@@ -124,20 +125,22 @@ export const NumberComponent = React.memo((props: NumberComponentProps) => {
   // If emitUpdate is passed, use this instead of the emit_update from the socket
   // Also used when used with a slider
   const emitUpdate =
-    props.customEmitUpdate !== undefined ? props.customEmitUpdate : emit_update;
+    props.customEmitUpdate !== undefined ? props.customEmitUpdate : setAttribute;

   const renderCount = useRef(0);
   // Create a state for the cursor position
   const [cursorPosition, setCursorPosition] = useState(null);
   // Create a state for the input string
   const [inputString, setInputString] = useState(props.value.toString());
+  const fullAccessPath = parentPath.concat('.' + name);
+  const id = getIdFromFullAccessPath(fullAccessPath);

   useEffect(() => {
     renderCount.current++;

     // Set the cursor position after the component re-renders
     const inputElement = document.getElementsByName(
-      parentPath.concat(name)
+      fullAccessPath
     )[0] as HTMLInputElement;
     if (inputElement && cursorPosition !== null) {
       inputElement.setSelectionRange(cursorPosition, cursorPosition);
@@ -214,6 +217,16 @@ export const NumberComponent = React.memo((props: NumberComponentProps) => {
       // Select everything when pressing Ctrl + a
       target.setSelectionRange(0, target.value.length);
       return;
+    } else if (key === '-') {
+      if (selectionStart === 0 && !value.startsWith('-')) {
+        newValue = '-' + value;
+        selectionStart++;
+      } else if (value.startsWith('-') && selectionStart === 1) {
+        newValue = value.substring(1); // remove minus sign
+        selectionStart--;
+      } else {
+        return; // Ignore "-" pressed in other positions
+      }
     } else if (!isNaN(key) && key !== ' ') {
       // Check if a number key or a decimal point key is pressed
       ({ value: newValue, selectionStart } = handleNumericKey(
@@ -233,8 +246,8 @@ export const NumberComponent = React.memo((props: NumberComponentProps) => {
       ({ value: newValue, selectionStart } = handleArrowKey(
         key,
         value,
-        selectionStart,
-        selectionEnd
+        selectionStart
+        // selectionEnd
       ));
     } else if (key === 'Backspace') {
       ({ value: newValue, selectionStart } = handleBackspaceKey(
@@ -275,9 +288,9 @@ export const NumberComponent = React.memo((props: NumberComponentProps) => {
   };

   return (
-    <div className="numberComponent" id={parentPath.concat('.' + name)}>
-      {process.env.NODE_ENV === 'development' && showName && (
-        <p>Render count: {renderCount.current}</p>
+    <div className="numberComponent" id={id}>
+      {process.env.NODE_ENV === 'development' && (
+        <div>Render count: {renderCount.current}</div>
       )}
       <DocStringComponent docString={docString} />
       <div className="d-flex">
@@ -287,7 +300,7 @@ export const NumberComponent = React.memo((props: NumberComponentProps) => {
               type="text"
               value={inputString}
               disabled={readOnly}
-              name={parentPath.concat(name)}
+              name={fullAccessPath}
               onKeyDown={handleKeyDown}
               onBlur={handleBlur}
               className={isInstantUpdate && !readOnly ? 'instantUpdate' : ''}


@@ -1,9 +1,10 @@
 import React, { useEffect, useRef, useState } from 'react';
 import { InputGroup, Form, Row, Col, Collapse, ToggleButton } from 'react-bootstrap';
-import { emit_update } from '../socket';
+import { setAttribute } from '../socket';
 import { DocStringComponent } from './DocStringComponent';
 import { Slider } from '@mui/material';
 import { NumberComponent } from './NumberComponent';
+import { getIdFromFullAccessPath } from '../utils/stringUtils';

 interface SliderComponentProps {
   name: string;
@@ -15,17 +16,12 @@ interface SliderComponentProps {
   docString: string;
   stepSize: number;
   isInstantUpdate: boolean;
-  addNotification: (string) => void;
+  addNotification: (message: string) => void;
 }

 export const SliderComponent = React.memo((props: SliderComponentProps) => {
   const renderCount = useRef(0);
   const [open, setOpen] = useState(false);
-
-  useEffect(() => {
-    renderCount.current++;
-  });
-
   const {
     name,
     parentPath,
@@ -38,6 +34,12 @@ export const SliderComponent = React.memo((props: SliderComponentProps) => {
     isInstantUpdate,
     addNotification
   } = props;
+  const fullAccessPath = parentPath.concat('.' + name);
+  const id = getIdFromFullAccessPath(fullAccessPath);
+
+  useEffect(() => {
+    renderCount.current++;
+  });

   useEffect(() => {
     addNotification(`${parentPath}.${name} changed to ${value}.`);
@@ -64,7 +66,7 @@ export const SliderComponent = React.memo((props: SliderComponentProps) => {
     max: number = props.max,
     stepSize: number = props.stepSize
   ) => {
-    emit_update(
+    setAttribute(
       name,
       parentPath,
       {
@@ -102,9 +104,9 @@ export const SliderComponent = React.memo((props: SliderComponentProps) => {
   };

   return (
-    <div className="sliderComponent" id={parentPath.concat('.' + name)}>
+    <div className="sliderComponent" id={id}>
       {process.env.NODE_ENV === 'development' && (
-        <p>Render count: {renderCount.current}</p>
+        <div>Render count: {renderCount.current}</div>
       )}
       <DocStringComponent docString={docString} />
@@ -145,6 +147,7 @@ export const SliderComponent = React.memo((props: SliderComponentProps) => {
         </Col>
         <Col xs="auto">
           <ToggleButton
+            id={`button-${id}`}
             onClick={() => setOpen(!open)}
             type="checkbox"
             checked={open}


@@ -1,8 +1,9 @@
 import React, { useEffect, useRef, useState } from 'react';
 import { Form, InputGroup } from 'react-bootstrap';
-import { emit_update } from '../socket';
+import { setAttribute } from '../socket';
 import { DocStringComponent } from './DocStringComponent';
 import '../App.css';
+import { getIdFromFullAccessPath } from '../utils/stringUtils';

 // TODO: add button functionality
@@ -13,7 +14,7 @@ interface StringComponentProps {
   readOnly: boolean;
   docString: string;
   isInstantUpdate: boolean;
-  addNotification: (string) => void;
+  addNotification: (message: string) => void;
 }

 export const StringComponent = React.memo((props: StringComponentProps) => {
@@ -22,6 +23,8 @@ export const StringComponent = React.memo((props: StringComponentProps) => {
   const renderCount = useRef(0);
   const [inputString, setInputString] = useState(props.value);
+  const fullAccessPath = parentPath.concat('.' + name);
+  const id = getIdFromFullAccessPath(fullAccessPath);

   useEffect(() => {
     renderCount.current++;
@@ -38,26 +41,26 @@ export const StringComponent = React.memo((props: StringComponentProps) => {
   const handleChange = (event) => {
     setInputString(event.target.value);
     if (isInstantUpdate) {
-      emit_update(name, parentPath, event.target.value);
+      setAttribute(name, parentPath, event.target.value);
     }
   };

   const handleKeyDown = (event) => {
     if (event.key === 'Enter' && !isInstantUpdate) {
-      emit_update(name, parentPath, inputString);
+      setAttribute(name, parentPath, inputString);
     }
   };

   const handleBlur = () => {
     if (!isInstantUpdate) {
-      emit_update(name, parentPath, inputString);
+      setAttribute(name, parentPath, inputString);
     }
   };

   return (
-    <div className={'stringComponent'} id={parentPath.concat(name)}>
+    <div className={'stringComponent'} id={id}>
       {process.env.NODE_ENV === 'development' && (
-        <p>Render count: {renderCount.current}</p>
+        <div>Render count: {renderCount.current}</div>
       )}
       <DocStringComponent docString={docString} />
       <InputGroup>


@@ -9,15 +9,28 @@ console.debug('Websocket: ', URL);

 export const socket = io(URL, { path: '/ws/socket.io', transports: ['websocket'] });

-export const emit_update = (
+export const setAttribute = (
   name: string,
   parentPath: string,
   value: unknown,
   callback?: (ack: unknown) => void
 ) => {
   if (callback) {
-    socket.emit('frontend_update', { name, parent_path: parentPath, value }, callback);
+    socket.emit('set_attribute', { name, parent_path: parentPath, value }, callback);
   } else {
-    socket.emit('frontend_update', { name, parent_path: parentPath, value });
+    socket.emit('set_attribute', { name, parent_path: parentPath, value });
+  }
+};
+
+export const runMethod = (
+  name: string,
+  parentPath: string,
+  kwargs: Record<string, unknown>,
+  callback?: (ack: unknown) => void
+) => {
+  if (callback) {
+    socket.emit('run_method', { name, parent_path: parentPath, kwargs }, callback);
+  } else {
+    socket.emit('run_method', { name, parent_path: parentPath, kwargs });
   }
 };
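Aside: the two emitters differ only in the event name (`set_attribute` vs `run_method`) and the payload key (`value` vs `kwargs`). The payload shapes can be sketched as plain objects without a live socket (the builder functions and example values below are illustrative, not part of the library):

```typescript
// Illustrative builders mirroring the objects passed to socket.emit above.
function setAttributePayload(name: string, parentPath: string, value: unknown) {
  return { name, parent_path: parentPath, value };
}

function runMethodPayload(
  name: string,
  parentPath: string,
  kwargs: Record<string, unknown>
) {
  return { name, parent_path: parentPath, kwargs };
}

const p1 = setAttributePayload('voltage', 'DataService.device', 4.2);
const p2 = runMethodPayload('reset', 'DataService.device', { hard: true });
console.log(p1, p2);
```

Note the snake_case `parent_path` key: the Python backend expects it, so the camelCase `parentPath` is renamed at the boundary.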


@@ -0,0 +1,12 @@
+export function getIdFromFullAccessPath(fullAccessPath: string) {
+  // Replace '].' with a single dash
+  let id = fullAccessPath.replace(/\]\./g, '-');
+  // Replace any character that is not a word character or underscore with a dash
+  id = id.replace(/[^\w_]+/g, '-');
+  // Remove any trailing dashes
+  id = id.replace(/-+$/, '');
+  return id;
+}
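For orientation, this new helper turns a full access path into a DOM-safe element id used by the components above. A standalone sketch of its behaviour (the example path is hypothetical):

```typescript
// Standalone copy of the helper, for illustration only.
function getIdFromFullAccessPath(fullAccessPath: string): string {
  let id = fullAccessPath.replace(/\]\./g, '-'); // '].' after an index access -> '-'
  id = id.replace(/[^\w]+/g, '-');               // any other non-word run -> '-'
  id = id.replace(/-+$/, '');                    // drop trailing dashes
  return id;
}

// e.g. a nested list attribute:
console.log(getIdFromFullAccessPath('service.channels[0].voltage'));
// -> 'service-channels-0-voltage'
```

Dots and brackets are illegal in CSS id selectors, which is why the raw access path cannot be used directly as an `id` attribute.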


@@ -4,6 +4,8 @@ edit_uri: blob/docs/docs/
 nav:
   - Home: index.md
   - Getting Started: getting-started.md
+  - User Guide:
+      - Components Guide: user-guide/Components.md
   - Developer Guide:
       - Developer Guide: dev-guide/README.md
       - API Reference: dev-guide/api.md
@@ -22,7 +24,6 @@ markdown_extensions:
   - smarty
   - toc:
       permalink: true
-      baselevel: 4
   - pymdownx.highlight:
       anchor_linenums: true
   - pymdownx.snippets
@@ -38,5 +39,3 @@ plugins:

 watch:
   - src/pydase

poetry.lock (generated): diff suppressed because it is too large (1160 changed lines).


@@ -1,6 +1,6 @@
 [tool.poetry]
 name = "pydase"
-version = "0.1.1"
+version = "0.3.1"
 description = "A flexible and robust Python library for creating, managing, and interacting with data services, with built-in support for web and RPC servers, and customizable features for diverse use cases."
 authors = ["Mose Mueller <mosmuell@ethz.ch>"]
 readme = "README.md"
@@ -10,7 +10,6 @@ packages = [{ include = "pydase", from = "src" }]
 [tool.poetry.dependencies]
 python = "^3.10"
 rpyc = "^5.3.1"
-loguru = "^0.7.0"
 fastapi = "^0.100.0"
 uvicorn = "^0.22.0"
 toml = "^0.10.2"
@@ -36,6 +35,7 @@ flake8-pep604 = "^0.1.0"
 flake8-eradicate = "^1.4.0"
 matplotlib = "^3.7.2"
 pyright = "^1.1.323"
+pytest-mock = "^3.11.1"

 [tool.poetry.group.docs.dependencies]
@@ -49,12 +49,13 @@ requires = ["poetry-core"]
 build-backend = "poetry.core.masonry.api"

 [tool.pyright]
-include = ["src/pydase", "tests"]
-exclude = ["**/node_modules", "**/__pycache__", "docs", "frontend"]
+include = ["src/pydase"]
+exclude = ["**/node_modules", "**/__pycache__", "docs", "frontend", "tests"]
 venvPath = "."
 venv = ".venv"
 typeCheckingMode = "basic"
 reportUnknownMemberType = true
+reportUnknownParameterType = true

 [tool.black]
 line-length = 88


@@ -27,10 +27,12 @@ print(my_service.voltage.value)  # Output: 5
 ```
 """
+from pydase.components.coloured_enum import ColouredEnum
 from pydase.components.image import Image
 from pydase.components.number_slider import NumberSlider

 __all__ = [
     "NumberSlider",
     "Image",
+    "ColouredEnum",
 ]


@@ -0,0 +1,61 @@
+from enum import Enum
+
+
+class ColouredEnum(Enum):
+    """
+    Represents a UI element that can display colour-coded text based on its value.
+
+    This class extends the standard Enum but requires its values to be valid CSS
+    colour codes. Supported colour formats include:
+
+    - Hexadecimal colours
+    - Hexadecimal colours with transparency
+    - RGB colours
+    - RGBA colours
+    - HSL colours
+    - HSLA colours
+    - Predefined/Cross-browser colour names
+
+    Refer to this website for more details on colour formats:
+    https://www.w3schools.com/cssref/css_colours_legal.php
+
+    The behavior of this component in the UI depends on how it's defined in the
+    data service:
+
+    - As a property with a setter or as an attribute: renders as a dropdown menu,
+      allowing users to select and change its value from the frontend.
+    - As a property without a setter: displays as a coloured box with the key of
+      the `ColouredEnum` as text inside, serving as a visual indicator without
+      user interaction.
+
+    Example:
+    --------
+    ```python
+    import pydase.components as pyc
+    import pydase
+
+    class MyStatus(pyc.ColouredEnum):
+        PENDING = "#FFA500"  # Orange
+        RUNNING = "#0000FF80"  # Transparent Blue
+        PAUSED = "rgb(169, 169, 169)"  # Dark Gray
+        RETRYING = "rgba(255, 255, 0, 0.3)"  # Transparent Yellow
+        COMPLETED = "hsl(120, 100%, 50%)"  # Green
+        FAILED = "hsla(0, 100%, 50%, 0.7)"  # Transparent Red
+        CANCELLED = "SlateGray"  # Slate Gray
+
+    class StatusExample(pydase.DataService):
+        _status = MyStatus.RUNNING
+
+        @property
+        def status(self) -> MyStatus:
+            return self._status
+
+        @status.setter
+        def status(self, value: MyStatus) -> None:
+            # Custom logic here...
+            self._status = value
+
+    # Example usage:
+    my_service = StatusExample()
+    my_service.status = MyStatus.FAILED
+    ```
+    """
+
+    pass


@@ -1,17 +1,19 @@
 import base64
 import io
+import logging
 from pathlib import Path
 from typing import TYPE_CHECKING, Optional
 from urllib.request import urlopen

-import PIL.Image
-from loguru import logger
+import PIL.Image  # type: ignore

 from pydase.data_service.data_service import DataService

 if TYPE_CHECKING:
     from matplotlib.figure import Figure

+logger = logging.getLogger(__name__)

 class Image(DataService):
     def __init__(
@@ -54,7 +56,7 @@ class Image(DataService):
         self._load_from_base64(value_, format_)

     def _load_from_base64(self, value_: bytes, format_: str) -> None:
-        value = value_.decode("utf-8") if isinstance(value_, bytes) else value_
+        value = value_.decode("utf-8")
         self._value = value
         self._format = format_


@@ -1,9 +1,10 @@
+import logging
 from typing import Any, Literal

-from loguru import logger
-
 from pydase.data_service.data_service import DataService

+logger = logging.getLogger(__name__)

 class NumberSlider(DataService):
     """


@@ -4,9 +4,9 @@ from abc import ABC
 from typing import TYPE_CHECKING, Any

 if TYPE_CHECKING:
-    from .callback_manager import CallbackManager
-    from .data_service import DataService
-    from .task_manager import TaskManager
+    from pydase.data_service.callback_manager import CallbackManager
+    from pydase.data_service.data_service import DataService
+    from pydase.data_service.task_manager import TaskManager

 class AbstractDataService(ABC):


@@ -1,10 +1,9 @@
 from __future__ import annotations

 import inspect
+import logging
 from collections.abc import Callable
-from typing import TYPE_CHECKING, Any
+from typing import TYPE_CHECKING, Any, cast

-from loguru import logger
 from pydase.data_service.abstract_data_service import AbstractDataService
 from pydase.utils.helpers import get_class_and_instance_attributes
@@ -14,6 +13,8 @@ from .data_service_list import DataServiceList
 if TYPE_CHECKING:
     from .data_service import DataService

+logger = logging.getLogger(__name__)

 class CallbackManager:
     _notification_callbacks: list[Callable[[str, str, Any], Any]] = []
@@ -55,9 +56,10 @@ class CallbackManager:
         self, obj: "AbstractDataService", parent_path: str
     ) -> None:
         """
-        This method ensures that notifications are emitted whenever a list attribute of
-        a DataService instance changes. These notifications pertain solely to the list
-        item changes, not to changes in attributes of objects within the list.
+        This method ensures that notifications are emitted whenever a public list
+        attribute of a DataService instance changes. These notifications pertain solely
+        to the list item changes, not to changes in attributes of objects within the
+        list.

         The method works by converting all list attributes (both at the class and
         instance levels) into DataServiceList objects. Each DataServiceList is then
@@ -101,6 +103,8 @@ class CallbackManager:
                 value=value,
             )
             if self.service == self.service.__root__
+            # Skip private and protected lists
+            and not cast(str, attr_name).startswith("_")
             else None
         )
@@ -109,9 +113,12 @@ class CallbackManager:
                 attr_value.add_callback(callback)
                 continue

             if id(attr_value) in self._list_mapping:
+                # If the list `attr_value` was already referenced somewhere else
                 notifying_list = self._list_mapping[id(attr_value)]
                 notifying_list.add_callback(callback)
             else:
+                # convert the builtin list into a DataServiceList and add the
+                # callback
                 notifying_list = DataServiceList(attr_value, callback=[callback])
                 self._list_mapping[id(attr_value)] = notifying_list
@@ -352,6 +359,12 @@ class CallbackManager:
         attrs: dict[str, Any] = get_class_and_instance_attributes(obj)

         for nested_attr_name, nested_attr in attrs.items():
+            if isinstance(nested_attr, DataServiceList):
+                for i, item in enumerate(nested_attr):
+                    if isinstance(item, AbstractDataService):
+                        self._register_start_stop_task_callbacks(
+                            item, parent_path=f"{parent_path}.{nested_attr_name}[{i}]"
+                        )
             if isinstance(nested_attr, AbstractDataService):
                 self._register_start_stop_task_callbacks(
                     nested_attr, parent_path=f"{parent_path}.{nested_attr_name}"
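The `_list_mapping` keyed by `id(attr_value)` ensures that two attributes referencing the same plain list share a single notifying wrapper, so callbacks registered for either attribute fire on a change through any alias. A minimal sketch of that idea, using simplified stand-ins rather than the real `DataServiceList`:

```python
class NotifyingList(list):
    """Toy stand-in for DataServiceList: fires callbacks on item assignment."""

    def __init__(self, *args, callbacks=None):
        super().__init__(*args)
        self.callbacks = list(callbacks or [])

    def __setitem__(self, index, value):
        super().__setitem__(index, value)
        for cb in self.callbacks:
            cb(index, value)


list_mapping = {}  # id(raw list) -> NotifyingList, mirroring _list_mapping
events = []


def register(raw, callback):
    # Reuse the existing wrapper when this exact list object was seen before,
    # so aliased attributes accumulate callbacks on one shared object.
    if id(raw) in list_mapping:
        wrapper = list_mapping[id(raw)]
        wrapper.callbacks.append(callback)
    else:
        wrapper = NotifyingList(raw, callbacks=[callback])
        list_mapping[id(raw)] = wrapper
    return wrapper


shared = [1, 2, 3]
a = register(shared, lambda i, v: events.append(("a", i, v)))
b = register(shared, lambda i, v: events.append(("b", i, v)))
a[0] = 99  # both callbacks fire, since a and b wrap the same list
```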

View File

@@ -1,12 +1,10 @@
-import asyncio
-import inspect
-import json
-import os
+import logging
+import warnings
 from enum import Enum
-from typing import Any, Optional, cast, get_type_hints
+from pathlib import Path
+from typing import Any, Optional, get_type_hints

-import rpyc
-from loguru import logger
+import rpyc  # type: ignore

 import pydase.units as u
 from pydase.data_service.abstract_data_service import AbstractDataService
@@ -14,19 +12,23 @@ from pydase.data_service.callback_manager import CallbackManager
 from pydase.data_service.task_manager import TaskManager
 from pydase.utils.helpers import (
     convert_arguments_to_hinted_types,
-    generate_paths_from_DataService_dict,
     get_class_and_instance_attributes,
-    get_component_class_names,
-    get_nested_value_from_DataService_by_path_and_key,
-    get_object_attr_from_path,
+    get_object_attr_from_path_list,
     is_property_attribute,
     parse_list_attr_and_index,
     update_value_if_changed,
 )
+from pydase.utils.serializer import (
+    Serializer,
+    generate_serialized_data_paths,
+    get_nested_dict_by_path,
+)
 from pydase.utils.warnings import (
     warn_if_instance_class_does_not_inherit_from_DataService,
 )

+logger = logging.getLogger(__name__)

 def process_callable_attribute(attr: Any, args: dict[str, Any]) -> Any:
     converted_args_or_error_msg = convert_arguments_to_hinted_types(
@@ -40,7 +42,7 @@ def process_callable_attribute(attr: Any, args: dict[str, Any]) -> Any:

 class DataService(rpyc.Service, AbstractDataService):
-    def __init__(self, filename: Optional[str] = None) -> None:
+    def __init__(self, **kwargs: Any) -> None:
         self._callback_manager: CallbackManager = CallbackManager(self)
         self._task_manager = TaskManager(self)
@@ -49,14 +51,21 @@ class DataService(rpyc.Service, AbstractDataService):
         self.__root__: "DataService" = self
         """Keep track of the root object. This helps to filter the emission of
-        notifications. This overwrite the TaksManager's __root__ attribute."""
+        notifications."""

-        self._filename: Optional[str] = filename
+        filename = kwargs.pop("filename", None)
+        if filename is not None:
+            warnings.warn(
+                "The 'filename' argument is deprecated and will be removed in a future version. "
+                "Please pass the 'filename' argument to `pydase.Server`.",
+                DeprecationWarning,
+                stacklevel=2,
+            )
+        self._filename: str | Path = filename

         self._callback_manager.register_callbacks()
         self.__check_instance_classes()
         self._initialised = True
-        self._load_values_from_json()

     def __setattr__(self, __name: str, __value: Any) -> None:
         # converting attributes that are not properties
@@ -83,8 +92,9 @@ class DataService(rpyc.Service, AbstractDataService):
     def __check_instance_classes(self) -> None:
         for attr_name, attr_value in get_class_and_instance_attributes(self).items():
-            # every class defined by the user should inherit from DataService
-            if not attr_name.startswith("_DataService__"):
+            # every class defined by the user should inherit from DataService if it is
+            # assigned to a public attribute
+            if not attr_name.startswith("_"):
                 warn_if_instance_class_does_not_inherit_from_DataService(attr_value)

     def __set_attribute_based_on_type(  # noqa:CFQ002
@@ -128,50 +138,44 @@ class DataService(rpyc.Service, AbstractDataService):
             # allow all other attributes
             setattr(self, name, value)

-    def _load_values_from_json(self) -> None:
-        if self._filename is not None:
-            # Check if the file specified by the filename exists
-            if os.path.exists(self._filename):
-                with open(self._filename, "r") as f:
-                    # Load JSON data from file and update class attributes with these
-                    # values
-                    self.load_DataService_from_JSON(cast(dict[str, Any], json.load(f)))
-
     def write_to_file(self) -> None:
         """
         Serialize the DataService instance and write it to a JSON file.

-        Args:
-            filename (str): The name of the file to write to.
+        This method is deprecated and will be removed in a future version.
+        Service persistence is handled by `pydase.Server` now, instead.
         """
-        if self._filename is not None:
-            with open(self._filename, "w") as f:
-                json.dump(self.serialize(), f, indent=4)
-        else:
-            logger.error(
-                f"Class {self.__class__.__name__} was not initialised with a filename. "
-                'Skipping "write_to_file"...'
-            )
+        warnings.warn(
+            "'write_to_file' is deprecated and will be removed in a future version. "
+            "Service persistence is handled by `pydase.Server` now, instead.",
+            DeprecationWarning,
+            stacklevel=2,
+        )
+        if hasattr(self, "_state_manager"):
+            getattr(self, "_state_manager").save_state()

     def load_DataService_from_JSON(self, json_dict: dict[str, Any]) -> None:
+        warnings.warn(
+            "'load_DataService_from_JSON' is deprecated and will be removed in a "
+            "future version. "
+            "Service persistence is handled by `pydase.Server` now, instead.",
+            DeprecationWarning,
+            stacklevel=2,
+        )
         # Traverse the serialized representation and set the attributes of the class
         serialized_class = self.serialize()
-        for path in generate_paths_from_DataService_dict(json_dict):
-            value = get_nested_value_from_DataService_by_path_and_key(
-                json_dict, path=path
-            )
-            value_type = get_nested_value_from_DataService_by_path_and_key(
-                json_dict, path=path, key="type"
-            )
-            class_value_type = get_nested_value_from_DataService_by_path_and_key(
-                serialized_class, path=path, key="type"
-            )
+        for path in generate_serialized_data_paths(json_dict):
+            nested_json_dict = get_nested_dict_by_path(json_dict, path)
+            value = nested_json_dict["value"]
+            value_type = nested_json_dict["type"]
+
+            nested_class_dict = get_nested_dict_by_path(serialized_class, path)
+            class_value_type = nested_class_dict.get("type", None)
             if class_value_type == value_type:
-                class_attr_is_read_only = (
-                    get_nested_value_from_DataService_by_path_and_key(
-                        serialized_class, path=path, key="readonly"
-                    )
-                )
+                class_attr_is_read_only = nested_class_dict["readonly"]
                 if class_attr_is_read_only:
                     logger.debug(
                         f'Attribute "{path}" is read-only. Ignoring value from JSON '
@@ -182,6 +186,10 @@ class DataService(rpyc.Service, AbstractDataService):
                 parts = path.split(".")
                 attr_name = parts[-1]
+
+                # Convert dictionary into Quantity
+                if class_value_type == "Quantity":
+                    value = u.convert_to_quantity(value)
+
                 self.update_DataService_attribute(parts[:-1], attr_name, value)
             else:
                 logger.info(
@@ -206,136 +214,7 @@ class DataService(rpyc.Service, AbstractDataService):
         Returns:
             dict: The serialized instance.
         """
-        result: dict[str, dict[str, Any]] = {}
-
-        # Get the dictionary of the base class
-        base_set = set(type(super()).__dict__)
-        # Get the dictionary of the derived class
-        derived_set = set(type(self).__dict__)
-        # Get the difference between the two dictionaries
-        derived_only_set = derived_set - base_set
-        instance_dict = set(self.__dict__)
-        # Merge the class and instance dictionaries
-        merged_set = derived_only_set | instance_dict
-
-        def get_attribute_doc(attr: Any) -> Optional[str]:
-            """This function takes an input attribute attr and returns its documentation
-            string if it's different from the documentation of its type, otherwise,
-            it returns None.
-            """
-            attr_doc = inspect.getdoc(attr)
-            attr_class_doc = inspect.getdoc(type(attr))
-            if attr_class_doc != attr_doc:
-                return attr_doc
-            else:
-                return None
-
-        # Iterate over attributes, properties, class attributes, and methods
-        for key in sorted(merged_set):
-            if key.startswith("_"):
-                continue  # Skip attributes that start with underscore
-
-            # Skip keys that start with "start_" or "stop_" and end with an async method
-            # name
-            if (key.startswith("start_") or key.startswith("stop_")) and key.split(
-                "_", 1
-            )[1] in {
-                name
-                for name, _ in inspect.getmembers(
-                    self, predicate=inspect.iscoroutinefunction
-                )
-            }:
-                continue
-
-            # Get the value of the current attribute or method
-            value = getattr(self, key)
-
-            if isinstance(value, DataService):
-                result[key] = {
-                    "type": type(value).__name__
-                    if type(value).__name__ in get_component_class_names()
-                    else "DataService",
-                    "value": value.serialize(),
-                    "readonly": False,
-                    "doc": get_attribute_doc(value),
-                }
-            elif isinstance(value, list):
-                result[key] = {
-                    "type": "list",
-                    "value": [
-                        {
-                            "type": type(item).__name__
-                            if not isinstance(item, DataService)
-                            or type(item).__name__ in get_component_class_names()
-                            else "DataService",
-                            "value": item.serialize()
-                            if isinstance(item, DataService)
-                            else item,
-                            "readonly": False,
-                            "doc": get_attribute_doc(value),
-                        }
-                        for item in value
-                    ],
-                    "readonly": False,
-                }
-            elif inspect.isfunction(value) or inspect.ismethod(value):
-                sig = inspect.signature(value)
-
-                # Store parameters and their anotations in a dictionary
-                parameters: dict[str, Optional[str]] = {}
-                for k, v in sig.parameters.items():
-                    annotation = v.annotation
-                    if annotation is not inspect._empty:
-                        if isinstance(annotation, type):
-                            # Handle regular types
-                            parameters[k] = annotation.__name__
-                        else:
-                            parameters[k] = str(annotation)
-                    else:
-                        parameters[k] = None
-                running_task_info = None
-                if (
-                    key in self._task_manager.tasks
-                ):  # If there's a running task for this method
-                    task_info = self._task_manager.tasks[key]
-                    running_task_info = task_info["kwargs"]
-
-                result[key] = {
-                    "type": "method",
-                    "async": asyncio.iscoroutinefunction(value),
-                    "parameters": parameters,
-                    "doc": get_attribute_doc(value),
-                    "readonly": True,
-                    "value": running_task_info,
-                }
-            elif isinstance(value, Enum):
-                result[key] = {
-                    "type": "Enum",
-                    "value": value.name,
-                    "enum": {
-                        name: member.value
-                        for name, member in value.__class__.__members__.items()
-                    },
-                    "readonly": False,
-                    "doc": get_attribute_doc(value),
-                }
-            else:
-                result[key] = {
-                    "type": type(value).__name__,
-                    "value": value
-                    if not isinstance(value, u.Quantity)
-                    else {"magnitude": value.m, "unit": str(value.u)},
-                    "readonly": False,
-                    "doc": get_attribute_doc(value),
-                }
-
-            if isinstance(getattr(self.__class__, key, None), property):
-                prop: property = getattr(self.__class__, key)
-                result[key]["readonly"] = prop.fset is None
-                result[key]["doc"] = get_attribute_doc(prop)
-
-        return result
+        return Serializer.serialize_object(self)["value"]

     def update_DataService_attribute(
         self,
@@ -343,10 +222,19 @@ class DataService(rpyc.Service, AbstractDataService):
         attr_name: str,
         value: Any,
     ) -> None:
+        warnings.warn(
+            "'update_DataService_attribute' is deprecated and will be removed in a "
+            "future version. "
+            "Service state management is handled by `pydase.data_service.state_manager`"
+            "now, instead.",
+            DeprecationWarning,
+            stacklevel=2,
+        )
         # If attr_name corresponds to a list entry, extract the attr_name and the index
         attr_name, index = parse_list_attr_and_index(attr_name)
         # Traverse the object according to the path parts
-        target_obj = get_object_attr_from_path(self, path_list)
+        target_obj = get_object_attr_from_path_list(self, path_list)

         # If the attribute is a property, change it using the setter without getting the
         # property value (would otherwise be bad for expensive getter methods)
@@ -354,7 +242,7 @@ class DataService(rpyc.Service, AbstractDataService):
             setattr(target_obj, attr_name, value)
             return

-        attr = get_object_attr_from_path(target_obj, [attr_name])
+        attr = get_object_attr_from_path_list(target_obj, [attr_name])
         if attr is None:
             return
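The deprecation pattern used throughout this commit pairs `warnings.warn` with `DeprecationWarning` and `stacklevel=2`, so the warning points at the caller rather than the deprecated method itself. A small self-contained sketch of that pattern (the function name is a hypothetical stand-in, not the pydase method):

```python
import warnings


def write_to_file():
    # Warn, then delegate to the new mechanism instead of failing outright.
    warnings.warn(
        "'write_to_file' is deprecated; persistence is handled elsewhere.",
        DeprecationWarning,
        stacklevel=2,  # attribute the warning to the caller's line, not this one
    )
    return "delegated"


with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")  # make sure DeprecationWarning is not suppressed
    result = write_to_file()
```

Note that `DeprecationWarning` is ignored by default outside of `__main__` and test runners, which is why the capture above forces `"always"`.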

View File

@@ -0,0 +1,35 @@
+import logging
+from typing import TYPE_CHECKING, Any
+
+from pydase.utils.serializer import set_nested_value_by_path
+
+if TYPE_CHECKING:
+    from pydase import DataService
+
+logger = logging.getLogger(__name__)
+
+
+class DataServiceCache:
+    def __init__(self, service: "DataService") -> None:
+        self._cache: dict[str, Any] = {}
+        self.service = service
+        self._initialize_cache()
+
+    @property
+    def cache(self) -> dict[str, Any]:
+        return self._cache
+
+    def _initialize_cache(self) -> None:
+        """Initializes the cache and sets up the callback."""
+        logger.debug("Initializing cache.")
+        self._cache = self.service.serialize()
+        self.service._callback_manager.add_notification_callback(self.update_cache)
+
+    def update_cache(self, parent_path: str, name: str, value: Any) -> None:
+        # Remove the part before the first "." in the parent_path
+        parent_path = ".".join(parent_path.split(".")[1:])
+
+        # Construct the full path
+        full_path = f"{parent_path}.{name}" if parent_path else name
+
+        set_nested_value_by_path(self._cache, full_path, value)
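The path handling in `update_cache` strips the leading root segment from the notification's `parent_path` before joining it with the attribute name. That logic can be isolated into a pure function (a sketch for illustration; the example paths are hypothetical):

```python
def full_path_from_notification(parent_path: str, name: str) -> str:
    # Drop the leading root segment (e.g. "ServiceName.") from parent_path,
    # then append the attribute name. A root-level attribute has an empty
    # remainder, so the name alone is returned.
    parent_path = ".".join(parent_path.split(".")[1:])
    return f"{parent_path}.{name}" if parent_path else name
```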

View File

@@ -1,6 +1,7 @@
 from collections.abc import Callable
 from typing import Any

+import pydase.units as u
 from pydase.utils.warnings import (
     warn_if_instance_class_does_not_inherit_from_DataService,
 )
@@ -47,6 +48,14 @@ class DataServiceList(list):
         super().__init__(*args, **kwargs)  # type: ignore

     def __setitem__(self, key: int, value: Any) -> None:  # type: ignore
+        current_value = self.__getitem__(key)
+
+        # parse ints into floats if current value is a float
+        if isinstance(current_value, float) and isinstance(value, int):
+            value = float(value)
+
+        if isinstance(current_value, u.Quantity):
+            value = u.convert_to_quantity(value, str(current_value.u))
+
         super().__setitem__(key, value)  # type: ignore

         for callback in self.callbacks:
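The new `__setitem__` coercion keeps an element's type stable when a frontend sends an `int` for a slot that holds a `float`. A minimal sketch of just that branch (omitting the `Quantity` handling, which needs pydase's units module):

```python
class CoercingList(list):
    """Toy list subclass: promote ints to floats when replacing a float slot."""

    def __setitem__(self, key, value):
        current = self[key]
        # If the slot currently holds a float and the new value is an int,
        # convert it so the element stays a float after assignment.
        if isinstance(current, float) and isinstance(value, int):
            value = float(value)
        super().__setitem__(key, value)


values = CoercingList([1.0, 2.0])
values[0] = 3       # int is promoted to 3.0
values[1] = 4.5     # floats pass through unchanged
```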

View File

@@ -0,0 +1,269 @@
+import json
+import logging
+import os
+from collections.abc import Callable
+from pathlib import Path
+from typing import TYPE_CHECKING, Any, Optional, cast
+
+import pydase.units as u
+from pydase.data_service.data_service_cache import DataServiceCache
+from pydase.utils.helpers import (
+    get_object_attr_from_path_list,
+    parse_list_attr_and_index,
+)
+from pydase.utils.serializer import (
+    dump,
+    generate_serialized_data_paths,
+    get_nested_dict_by_path,
+)
+
+if TYPE_CHECKING:
+    from pydase import DataService
+
+logger = logging.getLogger(__name__)
+
+
+def load_state(func: Callable[..., Any]) -> Callable[..., Any]:
+    """This function should be used as a decorator on property setters to indicate that
+    the value should be loaded from the JSON file.
+
+    Example:
+        >>> class Service(pydase.DataService):
+        ...     _name = "Service"
+        ...
+        ...     @property
+        ...     def name(self) -> str:
+        ...         return self._name
+        ...
+        ...     @name.setter
+        ...     @load_state
+        ...     def name(self, value: str) -> None:
+        ...         self._name = value
+    """
+
+    func._load_state = True
+    return func
+
+
+def has_load_state_decorator(prop: property):
+    """Determines if the property's setter method is decorated with the `@load_state`
+    decorator.
+    """
+
+    try:
+        return getattr(prop.fset, "_load_state")
+    except AttributeError:
+        return False
+
+
+class StateManager:
+    """
+    Manages the state of a DataService instance, serving as both a cache and a
+    persistence layer. It is designed to provide quick access to the latest known state
+    for newly connecting web clients without the need for expensive property accesses
+    that may involve complex calculations or I/O operations.
+
+    The StateManager listens for state change notifications from the DataService's
+    callback manager and updates its cache accordingly. This cache does not always
+    reflect the most current complex property states but rather retains the value from
+    the last known state, optimizing for performance and reducing the load on the
+    system.
+
+    While the StateManager ensures that the cached state is as up-to-date as possible,
+    it does not autonomously update complex properties of the DataService. Such
+    properties must be updated programmatically, for instance, by invoking specific
+    tasks or methods that trigger the necessary operations to refresh their state.
+
+    The cached state maintained by the StateManager is particularly useful for web
+    clients that connect to the system and need immediate access to the current state of
+    the DataService. By avoiding direct and potentially costly property accesses, the
+    StateManager provides a snapshot of the DataService's state that is sufficiently
+    accurate for initial rendering and interaction.
+
+    Attributes:
+        cache (dict[str, Any]):
+            A dictionary cache of the DataService's state.
+        filename (str):
+            The file name used for storing the DataService's state.
+        service (DataService):
+            The DataService instance whose state is being managed.
+
+    Note:
+        The StateManager's cache updates are triggered by notifications and do not
+        include autonomous updates of complex DataService properties, which must be
+        managed programmatically. The cache serves the purpose of providing immediate
+        state information to web clients, reflecting the state after the last property
+        update.
+    """
+
+    def __init__(self, service: "DataService", filename: Optional[str | Path] = None):
+        self.filename = getattr(service, "_filename", None)
+
+        if filename is not None:
+            if self.filename is not None:
+                logger.warning(
+                    f"Overwriting filename {self.filename!r} with {filename!r}."
+                )
+            self.filename = filename
+
+        self.service = service
+        self._data_service_cache = DataServiceCache(self.service)
+
+    @property
+    def cache(self) -> dict[str, Any]:
+        """Returns the cached DataService state."""
+        return self._data_service_cache.cache
+
+    def save_state(self) -> None:
+        """
+        Saves the DataService's current state to a JSON file defined by `self.filename`.
+        Logs an error if `self.filename` is not set.
+        """
+
+        if self.filename is not None:
+            with open(self.filename, "w") as f:
+                json.dump(self.cache, f, indent=4)
+        else:
+            logger.error(
+                "State manager was not initialised with a filename. Skipping "
+                "'save_state'..."
+            )
+
+    def load_state(self) -> None:
+        """
+        Loads the DataService's state from a JSON file defined by `self.filename`.
+        Updates the service's attributes, respecting type and read-only constraints.
+        """
+
+        # Traverse the serialized representation and set the attributes of the class
+        json_dict = self._get_state_dict_from_JSON_file()
+        if json_dict == {}:
+            logger.debug("Could not load the service state.")
+            return
+
+        for path in generate_serialized_data_paths(json_dict):
+            nested_json_dict = get_nested_dict_by_path(json_dict, path)
+            nested_class_dict = get_nested_dict_by_path(self.cache, path)
+
+            value, value_type = nested_json_dict["value"], nested_json_dict["type"]
+            class_attr_value_type = nested_class_dict.get("type", None)
+
+            if (
+                class_attr_value_type == value_type
+                and self.__is_loadable_state_attribute(path)
+            ):
+                self.set_service_attribute_value_by_path(path, value)
+            else:
+                logger.info(
+                    f"Attribute type of {path!r} changed from {value_type!r} to "
+                    f"{class_attr_value_type!r}. Ignoring value from JSON file..."
+                )
+
+    def _get_state_dict_from_JSON_file(self) -> dict[str, Any]:
+        if self.filename is not None:
+            # Check if the file specified by the filename exists
+            if os.path.exists(self.filename):
+                with open(self.filename, "r") as f:
+                    # Load JSON data from file and update class attributes with these
+                    # values
+                    return cast(dict[str, Any], json.load(f))
+        return {}
+
+    def set_service_attribute_value_by_path(
+        self,
+        path: str,
+        value: Any,
+    ) -> None:
+        """
+        Sets the value of an attribute in the service managed by the `StateManager`
+        given its path as a dot-separated string.
+
+        This method updates the attribute specified by 'path' with 'value' only if the
+        attribute is not read-only and the new value differs from the current one.
+        It also handles type-specific conversions for the new value before setting it.
+
+        Args:
+            path: A dot-separated string indicating the hierarchical path to the
+                attribute.
+            value: The new value to set for the attribute.
+        """
+
+        current_value_dict = get_nested_dict_by_path(self.cache, path)
+
+        # This will also filter out methods as they are 'read-only'
+        if current_value_dict["readonly"]:
+            logger.debug(f"Attribute {path!r} is read-only. Ignoring new value...")
+            return
+
+        converted_value = self.__convert_value_if_needed(value, current_value_dict)
+
+        # only set value when it has changed
+        if self.__attr_value_has_changed(converted_value, current_value_dict["value"]):
+            self.__update_attribute_by_path(path, converted_value)
+        else:
+            logger.debug(f"Value of attribute {path!r} has not changed...")
+
+    def __attr_value_has_changed(self, value_object: Any, current_value: Any) -> bool:
+        """Check if the serialized value of `value_object` differs from `current_value`.
+
+        The method serializes `value_object` to compare it, which is mainly
+        necessary for handling Quantity objects.
+        """
+
+        return dump(value_object)["value"] != current_value
+
+    def __convert_value_if_needed(
+        self, value: Any, current_value_dict: dict[str, Any]
+    ) -> Any:
+        if current_value_dict["type"] == "Quantity":
+            return u.convert_to_quantity(value, current_value_dict["value"]["unit"])
+        return value
+
+    def __update_attribute_by_path(self, path: str, value: Any) -> None:
+        parent_path_list, attr_name = path.split(".")[:-1], path.split(".")[-1]
+
+        # If attr_name corresponds to a list entry, extract the attr_name and the
+        # index
+        attr_name, index = parse_list_attr_and_index(attr_name)
+
+        # Update path to reflect the attribute without list indices
+        path = ".".join([*parent_path_list, attr_name])
+
+        attr_cache_type = get_nested_dict_by_path(self.cache, path)["type"]
+
+        # Traverse the object according to the path parts
+        target_obj = get_object_attr_from_path_list(self.service, parent_path_list)
+
+        if attr_cache_type in ("ColouredEnum", "Enum"):
+            enum_attr = get_object_attr_from_path_list(target_obj, [attr_name])
+            setattr(target_obj, attr_name, enum_attr.__class__[value])
+        elif attr_cache_type == "list":
+            list_obj = get_object_attr_from_path_list(target_obj, [attr_name])
+            list_obj[index] = value
+        else:
+            setattr(target_obj, attr_name, value)
+
+    def __is_loadable_state_attribute(self, property_path: str) -> bool:
+        """Checks if an attribute defined by a dot-separated path should be loaded from
+        storage.
+
+        For properties, it verifies the presence of the '@load_state' decorator. Regular
+        attributes default to being loadable.
+        """
+
+        parent_object = get_object_attr_from_path_list(
+            self.service, property_path.split(".")[:-1]
+        )
+        attr_name = property_path.split(".")[-1]
+
+        prop = getattr(type(parent_object), attr_name, None)
+        if isinstance(prop, property):
+            has_decorator = has_load_state_decorator(prop)
+            if not has_decorator:
+                logger.debug(
+                    f"Property {attr_name!r} has no '@load_state' decorator. "
+                    "Ignoring value from JSON file..."
+                )
+            return has_decorator
+        return True

View File

@@ -2,15 +2,20 @@ from __future__ import annotations
import asyncio import asyncio
import inspect import inspect
import logging
from collections.abc import Callable from collections.abc import Callable
from functools import wraps from functools import wraps
from typing import TYPE_CHECKING, Any, TypedDict from typing import TYPE_CHECKING, Any, TypedDict
from loguru import logger from pydase.data_service.abstract_data_service import AbstractDataService
from pydase.data_service.data_service_list import DataServiceList
from pydase.utils.helpers import get_class_and_instance_attributes
if TYPE_CHECKING: if TYPE_CHECKING:
from .data_service import DataService from .data_service import DataService
logger = logging.getLogger(__name__)
class TaskDict(TypedDict): class TaskDict(TypedDict):
task: asyncio.Task[None] task: asyncio.Task[None]
@@ -94,10 +99,70 @@ class TaskManager:
for name, method in inspect.getmembers( for name, method in inspect.getmembers(
self.service, predicate=inspect.iscoroutinefunction self.service, predicate=inspect.iscoroutinefunction
): ):
# create start and stop methods for each coroutine
setattr(self.service, f"start_{name}", self._make_start_task(name, method))
setattr(self.service, f"stop_{name}", self._make_stop_task(name))
def _initiate_task_startup(self) -> None:
if self.service._autostart_tasks is not None:
for service_name, args in self.service._autostart_tasks.items():
start_method = getattr(self.service, f"start_{service_name}", None)
if start_method is not None and callable(start_method):
start_method(*args)
else:
logger.warning(
f"No start method found for service '{service_name}'"
)
def start_autostart_tasks(self) -> None:
self._initiate_task_startup()
attrs = get_class_and_instance_attributes(self.service)
for _, attr_value in attrs.items():
if isinstance(attr_value, AbstractDataService):
attr_value._task_manager.start_autostart_tasks()
elif isinstance(attr_value, DataServiceList):
for i, item in enumerate(attr_value):
if isinstance(item, AbstractDataService):
item._task_manager.start_autostart_tasks()
def _make_stop_task(self, name: str) -> Callable[..., Any]:
"""
Factory function to create a 'stop_task' function for a running task.
The generated function cancels the associated asyncio task using 'name' for
identification, ensuring proper cleanup. Avoids closure and late binding issues.
Args:
name (str): The name of the coroutine task, used for its identification.
"""
def stop_task() -> None:
# cancel the task
task = self.tasks.get(name, None)
if task is not None:
self._loop.call_soon_threadsafe(task["task"].cancel)
return stop_task
def _make_start_task( # noqa
self, name: str, method: Callable[..., Any]
) -> Callable[..., Any]:
"""
Factory function to create a 'start_task' function for a coroutine.
The generated function starts the coroutine as an asyncio task, handling
registration and monitoring.
It uses 'name' and 'method' to avoid the closure and late binding issue.
Args:
name (str): The name of the coroutine, used for task management.
method (callable): The coroutine to be turned into an asyncio task.
"""
        @wraps(method)
        def start_task(*args: Any, **kwargs: Any) -> None:
-           def task_done_callback(task: asyncio.Task, name: str) -> None:
+           def task_done_callback(task: asyncio.Task[None], name: str) -> None:
                """Handles tasks that have finished.
                Removes a task from the tasks dictionary, calls the defined
@@ -123,7 +188,7 @@ class TaskManager:
                try:
                    await method(*args, **kwargs)
                except asyncio.CancelledError:
-                   print(f"Task {name} was cancelled")
+                   logger.info(f"Task {name} was cancelled")
            if not self.tasks.get(name):
                # Get the signature of the coroutine method to start
@@ -135,9 +200,7 @@ class TaskManager:
                # Extend the list of positional arguments with None values to match
                # the length of the parameter names list. This is done to ensure
                # that zip can pair each parameter name with a corresponding value.
-               args_padded = list(args) + [None] * (
-                   len(parameter_names) - len(args)
-               )
+               args_padded = list(args) + [None] * (len(parameter_names) - len(args))
                # Create a dictionary of keyword arguments by pairing the parameter
                # names with the values in 'args_padded'. Then merge this dictionary
@@ -169,23 +232,4 @@ class TaskManager:
            else:
                logger.error(f"Task `{name}` is already running!")
-       def stop_task() -> None:
-           # cancel the task
-           task = self.tasks.get(name, None)
-           if task is not None:
-               self._loop.call_soon_threadsafe(task["task"].cancel)
-       # create start and stop methods for each coroutine
-       setattr(self.service, f"start_{name}", start_task)
-       setattr(self.service, f"stop_{name}", stop_task)
-   def start_autostart_tasks(self) -> None:
-       if self.service._autostart_tasks is not None:
-           for service_name, args in self.service._autostart_tasks.items():
-               start_method = getattr(self.service, f"start_{service_name}", None)
-               if start_method is not None and callable(start_method):
-                   start_method(*args)
-               else:
-                   logger.warning(
-                       f"No start method found for service '{service_name}'"
-                   )
+       return start_task


@@ -1,13 +1,13 @@
 {
   "files": {
-    "main.css": "/static/css/main.398bc7f8.css",
-    "main.js": "/static/js/main.c348625e.js",
+    "main.css": "/static/css/main.32559665.css",
+    "main.js": "/static/js/main.6d4f9d3a.js",
     "index.html": "/index.html",
-    "main.398bc7f8.css.map": "/static/css/main.398bc7f8.css.map",
-    "main.c348625e.js.map": "/static/js/main.c348625e.js.map"
+    "main.32559665.css.map": "/static/css/main.32559665.css.map",
+    "main.6d4f9d3a.js.map": "/static/js/main.6d4f9d3a.js.map"
   },
   "entrypoints": [
-    "static/css/main.398bc7f8.css",
-    "static/js/main.c348625e.js"
+    "static/css/main.32559665.css",
+    "static/js/main.6d4f9d3a.js"
   ]
 }


@@ -1 +1 @@
-<!doctype html><html lang="en"><head><meta charset="utf-8"/><link rel="icon" href="/favicon.ico"/><meta name="viewport" content="width=device-width,initial-scale=1"/><meta name="theme-color" content="#000000"/><meta name="description" content="Web site displaying a pydase UI."/><link rel="apple-touch-icon" href="/logo192.png"/><link rel="manifest" href="/manifest.json"/><title>pydase App</title><script defer="defer" src="/static/js/main.c348625e.js"></script><link href="/static/css/main.398bc7f8.css" rel="stylesheet"></head><body><noscript>You need to enable JavaScript to run this app.</noscript><div id="root"></div></body></html>
+<!doctype html><html lang="en"><head><meta charset="utf-8"/><link rel="icon" href="/favicon.ico"/><meta name="viewport" content="width=device-width,initial-scale=1"/><meta name="theme-color" content="#000000"/><meta name="description" content="Web site displaying a pydase UI."/><link rel="apple-touch-icon" href="/logo192.png"/><link rel="manifest" href="/manifest.json"/><title>pydase App</title><script defer="defer" src="/static/js/main.6d4f9d3a.js"></script><link href="/static/css/main.32559665.css" rel="stylesheet"></head><body><noscript>You need to enable JavaScript to run this app.</noscript><div id="root"></div></body></html>

File diff suppressed because one or more lines are too long


@@ -45,11 +45,3 @@
  * This source code is licensed under the MIT license found in the
  * LICENSE file in the root directory of this source tree.
  */
-/**
- * @mui/styled-engine v5.13.2
- *
- * @license MIT
- * This source code is licensed under the MIT license found in the
- * LICENSE file in the root directory of this source tree.
- */

File diff suppressed because one or more lines are too long


@@ -1,26 +1,27 @@
 import asyncio
+import logging
 import os
 import signal
 import threading
 from concurrent.futures import ThreadPoolExecutor
 from enum import Enum
+from pathlib import Path
 from types import FrameType
 from typing import Any, Optional, Protocol, TypedDict
 import uvicorn
-from loguru import logger
-from rpyc import (
-    ForkingServer,  # can be used for multiprocessing, e.g. a database interface server
-)
-from rpyc import ThreadedServer
+from rpyc import ForkingServer, ThreadedServer  # type: ignore
 from uvicorn.server import HANDLED_SIGNALS
 import pydase.units as u
 from pydase import DataService
+from pydase.data_service.state_manager import StateManager
 from pydase.version import __version__
 from .web_server import WebAPI
+logger = logging.getLogger(__name__)
 class AdditionalServerProtocol(Protocol):
     """
@@ -45,13 +46,22 @@ class AdditionalServerProtocol(Protocol):
     The hostname or IP address at which the server will be hosted. This could be a
     local address (like '127.0.0.1' for localhost) or a public IP address.
+    state_manager: StateManager
+        The state manager managing the state cache and persistence of the exposed
+        service.
     **kwargs: Any
         Any additional parameters required for initializing the server. These parameters
         are specific to the server's implementation.
     """
     def __init__(
-        self, service: DataService, port: int, host: str, **kwargs: Any
+        self,
+        service: DataService,
+        port: int,
+        host: str,
+        state_manager: StateManager,
+        **kwargs: Any,
     ) -> None:
         ...
@@ -96,9 +106,10 @@ class Server:
     Whether to enable the RPC server. Default is True.
 enable_web: bool
     Whether to enable the web server. Default is True.
+filename: str | Path | None
+    Filename of the file managing the service state persistence. Defaults to None.
 use_forking_server: bool
-    Whether to use ForkingServer for multiprocessing (e.g. for a database interface
-    server). Default is False.
+    Whether to use ForkingServer for multiprocessing. Default is False.
 web_settings: dict[str, Any]
     Additional settings for the web server. Default is {} (an empty dictionary).
 additional_servers : list[AdditionalServer]
@@ -118,9 +129,15 @@ class Server:
 >>> class MyCustomServer:
 ...     def __init__(
-...         self, service: DataService, port: int, host: str, **kwargs: Any
+...         self,
+...         service: DataService,
+...         port: int,
+...         host: str,
+...         state_manager: StateManager,
+...         **kwargs: Any
 ...     ):
 ...         self.service = service
+...         self.state_manager = state_manager
 ...         self.port = port
 ...         self.host = host
 ...         # handle any additional arguments...
@@ -155,6 +172,7 @@ class Server:
 web_port: int = 8001,
 enable_rpc: bool = True,
 enable_web: bool = True,
+filename: Optional[str | Path] = None,
 use_forking_server: bool = False,
 web_settings: dict[str, Any] = {},
 additional_servers: list[AdditionalServer] = [],
@@ -185,6 +203,10 @@ class Server:
     "additional_servers": [],
     **kwargs,
 }
+self._state_manager = StateManager(self._service, filename)
+if getattr(self._service, "_filename", None) is not None:
+    self._service._state_manager = self._state_manager
+self._state_manager.load_state()
 def run(self) -> None:
     """
@@ -247,6 +269,7 @@ class Server:
     self._service,
     port=server["port"],
     host=self._host,
+    state_manager=self._state_manager,
     info=self._info,
     **server["kwargs"],
 )
@@ -269,6 +292,7 @@ class Server:
 self._wapi: WebAPI = WebAPI(
     service=self._service,
     info=self._info,
+    state_manager=self._state_manager,
     **self._kwargs,
 )
 web_server = uvicorn.Server(
@@ -320,9 +344,9 @@ class Server:
 async def shutdown(self) -> None:
     logger.info("Shutting down")
-    logger.info(f"Saving data to {self._service._filename}.")
-    if self._service._filename is not None:
-        self._service.write_to_file()
+    logger.info(f"Saving data to {self._state_manager.filename}.")
+    if self._state_manager is not None:
+        self._state_manager.save_state()
     await self.__cancel_servers()
     await self.__cancel_tasks()
@@ -356,20 +380,16 @@ class Server:
     # Signals can only be listened to from the main thread.
     return
-try:
-    for sig in HANDLED_SIGNALS:
-        self._loop.add_signal_handler(sig, self.handle_exit, sig, None)
-except NotImplementedError:
-    # Windows
 for sig in HANDLED_SIGNALS:
     signal.signal(sig, self.handle_exit)
 def handle_exit(self, sig: int = 0, frame: Optional[FrameType] = None) -> None:
-    logger.info("Handling exit")
     if self.should_exit and sig == signal.SIGINT:
-        self.force_exit = True
+        logger.warning(f"Received signal {sig}, forcing exit...")
+        os._exit(1)
     else:
         self.should_exit = True
+        logger.warning(f"Received signal {sig}, exiting... (CTRL+C to force quit)")
 def custom_exception_handler(
     self, loop: asyncio.AbstractEventLoop, context: dict[str, Any]
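The two-stage shutdown implemented by `handle_exit` can be sketched independently of the Server class. `GracefulExit` is an illustrative wrapper, and `SystemExit` stands in for the `os._exit(1)` call so the sketch stays testable:

```python
import signal
from types import FrameType
from typing import Optional

class GracefulExit:
    """Two-stage shutdown: the first SIGINT requests a graceful stop,
    a second SIGINT forces exit."""

    def __init__(self) -> None:
        self.should_exit = False

    def handle_exit(self, sig: int = 0, frame: Optional[FrameType] = None) -> None:
        if self.should_exit and sig == signal.SIGINT:
            # stand-in for os._exit(1) in the real handler
            raise SystemExit(1)
        self.should_exit = True
```

Registering the handler with `signal.signal(sig, obj.handle_exit)` for each handled signal reproduces the loop shown in the diff.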


@@ -1,15 +1,21 @@
+import logging
 from pathlib import Path
 from typing import Any, TypedDict
-import socketio
+import socketio  # type: ignore
 from fastapi import FastAPI
 from fastapi.middleware.cors import CORSMiddleware
+from fastapi.responses import FileResponse
 from fastapi.staticfiles import StaticFiles
-from loguru import logger
 from pydase import DataService
+from pydase.data_service.data_service import process_callable_attribute
+from pydase.data_service.state_manager import StateManager
+from pydase.utils.helpers import get_object_attr_from_path_list
 from pydase.version import __version__
+logger = logging.getLogger(__name__)
 class UpdateDict(TypedDict):
     """
@@ -40,6 +46,25 @@ class UpdateDict(TypedDict):
     value: Any
+class RunMethodDict(TypedDict):
+    """
+    A TypedDict subclass representing a dictionary used for running methods from the
+    exposed DataService.
+    Attributes:
+        name (str): The name of the method to be run.
+        parent_path (str): The access path for the parent object of the method to be
+            run. This is used to construct the full access path for the method. For
+            example, for a method with access path 'attr1.list_attr[0].method_name',
+            'attr1.list_attr[0]' would be the parent_path.
+        kwargs (dict[str, Any]): The arguments passed to the method.
+    """
+    name: str
+    parent_path: str
+    kwargs: dict[str, Any]
 class WebAPI:
     __sio_app: socketio.ASGIApp
     __fastapi_app: FastAPI
@@ -47,6 +72,7 @@ class WebAPI:
 def __init__(  # noqa: CFQ002
     self,
     service: DataService,
+    state_manager: StateManager,
     frontend: str | Path | None = None,
     css: str | Path | None = None,
     enable_CORS: bool = True,
@@ -55,6 +81,7 @@ class WebAPI:
     **kwargs: Any,
 ):
     self.service = service
+    self.state_manager = state_manager
     self.frontend = frontend
     self.css = css
     self.enable_CORS = enable_CORS
@@ -73,18 +100,27 @@ class WebAPI:
 sio = socketio.AsyncServer(async_mode="asgi")
 @sio.event  # type: ignore
-def frontend_update(sid: str, data: UpdateDict) -> Any:
+def set_attribute(sid: str, data: UpdateDict) -> Any:
     logger.debug(f"Received frontend update: {data}")
-    path_list, attr_name = data["parent_path"].split("."), data["name"]
+    path_list = [*data["parent_path"].split("."), data["name"]]
     path_list.remove("DataService")  # always at the start, does not do anything
-    return self.service.update_DataService_attribute(
-        path_list=path_list, attr_name=attr_name, value=data["value"]
+    path = ".".join(path_list)
+    return self.state_manager.set_service_attribute_value_by_path(
+        path=path, value=data["value"]
     )
+@sio.event  # type: ignore
+def run_method(sid: str, data: RunMethodDict) -> Any:
+    logger.debug(f"Running method: {data}")
+    path_list = [*data["parent_path"].split("."), data["name"]]
+    path_list.remove("DataService")  # always at the start, does not do anything
+    method = get_object_attr_from_path_list(self.service, path_list)
+    return process_callable_attribute(method, data["kwargs"])
 self.__sio = sio
 self.__sio_app = socketio.ASGIApp(self.__sio)
-def setup_fastapi_app(self) -> None:  # noqa: CFQ004
+def setup_fastapi_app(self) -> None:  # noqa
 app = FastAPI()
 if self.enable_CORS:
@@ -97,7 +133,6 @@ class WebAPI:
 )
 app.mount("/ws", self.__sio_app)
-# @app.get("/version", include_in_schema=False)
 @app.get("/version")
 def version() -> str:
     return __version__
@@ -112,7 +147,14 @@ class WebAPI:
 @app.get("/service-properties")
 def service_properties() -> dict[str, Any]:
-    return self.service.serialize()
+    return self.state_manager.cache
+# exposing custom.css file provided by user
+if self.css is not None:
+    @app.get("/custom.css")
+    async def styles() -> FileResponse:
+        return FileResponse(str(self.css))
 app.mount(
     "/",
@@ -124,14 +166,6 @@ class WebAPI:
 self.__fastapi_app = app
-def add_endpoint(self, name: str) -> None:
-    # your endpoint creation code
-    pass
-def get_custom_openapi(self) -> None:
-    # your custom openapi generation code
-    pass
 @property
 def sio(self) -> socketio.AsyncServer:
     return self.__sio
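The path handling shared by the new `set_attribute` and `run_method` handlers boils down to a few lines; here is a minimal sketch with hypothetical attribute names (`attr1`, `voltage`):

```python
# Hypothetical frontend payload as the socket.io handlers receive it.
data = {"parent_path": "DataService.attr1", "name": "voltage", "value": 10}

# Build the full access path, then drop the root "DataService" prefix,
# which is always present and carries no information.
path_list = [*data["parent_path"].split("."), data["name"]]
path_list.remove("DataService")
path = ".".join(path_list)
print(path)  # attr1.voltage
```

The resulting dotted path is what `set_service_attribute_value_by_path` consumes on the StateManager side.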


@@ -2,7 +2,7 @@ from typing import TypedDict
 import pint
-units: pint.UnitRegistry = pint.UnitRegistry()
+units: pint.UnitRegistry = pint.UnitRegistry(autoconvert_offset_to_baseunit=True)
 units.default_format = "~P"  # pretty and short format
 Quantity = pint.Quantity


@@ -1,10 +1,19 @@
-import re
+import inspect
+import logging
 from itertools import chain
-from typing import Any, Optional, cast
-from loguru import logger
+from typing import Any, Optional
+logger = logging.getLogger(__name__)
-STANDARD_TYPES = ("int", "float", "bool", "str", "Enum", "NoneType", "Quantity")
+def get_attribute_doc(attr: Any) -> Optional[str]:
+    """This function takes an input attribute attr and returns its documentation
+    string if it's different from the documentation of its type, otherwise,
+    it returns None.
+    """
+    attr_doc = inspect.getdoc(attr)
+    attr_class_doc = inspect.getdoc(type(attr))
+    return attr_doc if attr_class_doc != attr_doc else None
 def get_class_and_instance_attributes(obj: object) -> dict[str, Any]:
@@ -22,7 +31,7 @@ def get_class_and_instance_attributes(obj: object) -> dict[str, Any]:
     return attrs
-def get_object_attr_from_path(target_obj: Any, path: list[str]) -> Any:
+def get_object_attr_from_path_list(target_obj: Any, path: list[str]) -> Any:
     """
     Traverse the object tree according to the given path.
@@ -55,213 +64,6 @@ def get_object_attr_from_path(target_obj: Any, path: list[str]) -> Any:
     return target_obj
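The behaviour of the new `get_attribute_doc` helper can be demonstrated in isolation (the `add` function below is an illustrative stand-in for a service attribute):

```python
import inspect
from typing import Any, Optional

def get_attribute_doc(attr: Any) -> Optional[str]:
    # Keep the docstring only if it differs from the docstring of the
    # attribute's type, i.e. it was written for this attribute itself.
    attr_doc = inspect.getdoc(attr)
    attr_class_doc = inspect.getdoc(type(attr))
    return attr_doc if attr_class_doc != attr_doc else None

def add(a: float, b: int) -> float:
    """Returns the sum of the numbers a and b."""
    return a + b

print(get_attribute_doc(add))  # Returns the sum of the numbers a and b.
print(get_attribute_doc(5))    # None (an int only carries int's own docstring)
```

This filters out the boilerplate docstrings that every instance inherits from its class, so only author-written documentation reaches the serialized service representation.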
def generate_paths_from_DataService_dict(
data: dict, parent_path: str = ""
) -> list[str]:
"""
Recursively generate paths from a dictionary representing a DataService object.
This function traverses through a nested dictionary, which is typically obtained
from serializing a DataService object. The function generates a list where each
element is a string representing the path to each terminal value in the original
dictionary.
The paths are represented as strings, with dots ('.') denoting nesting levels and
square brackets ('[]') denoting list indices.
Args:
data (dict): The input dictionary to generate paths from. This is typically
obtained from serializing a DataService object.
parent_path (str, optional): The current path up to the current level of
recursion. Defaults to ''.
Returns:
list[str]: A list with paths as elements.
Note:
The function ignores keys whose "type" is "method", as these represent methods
of the DataService object and not its state.
Example:
-------
>>> {
... "attr1": {"type": "int", "value": 10},
... "attr2": {
... "type": "list",
... "value": [{"type": "int", "value": 1}, {"type": "int", "value": 2}],
... },
... "add": {
... "type": "method",
... "async": False,
... "parameters": {"a": "float", "b": "int"},
... "doc": "Returns the sum of the numbers a and b.",
... },
... }
>>> print(generate_paths_from_DataService_dict(nested_dict))
[attr1, attr2[0], attr2[1]]
"""
paths = []
for key, value in data.items():
if value["type"] == "method":
# ignoring methods
continue
new_path = f"{parent_path}.{key}" if parent_path else key
if isinstance(value["value"], dict) and value["type"] != "Quantity":
paths.extend(generate_paths_from_DataService_dict(value["value"], new_path)) # type: ignore
elif isinstance(value["value"], list):
for index, item in enumerate(value["value"]):
indexed_key_path = f"{new_path}[{index}]"
if isinstance(item["value"], dict):
paths.extend( # type: ignore
generate_paths_from_DataService_dict(
item["value"], indexed_key_path
)
)
else:
paths.append(indexed_key_path) # type: ignore
else:
paths.append(new_path) # type: ignore
return paths
def extract_dict_or_list_entry(data: dict[str, Any], key: str) -> dict[str, Any] | None:
"""
Extract a nested dictionary or list entry based on the provided key.
Given a dictionary and a key, this function retrieves the corresponding nested
dictionary or list entry. If the key includes an index in the format "[<index>]",
the function assumes that the corresponding entry in the dictionary is a list, and
it will attempt to retrieve the indexed item from that list.
Args:
data (dict): The input dictionary containing nested dictionaries or lists.
key (str): The key specifying the desired entry within the dictionary. The key
can be a regular dictionary key or can include an index in the format
"[<index>]" to retrieve an item from a nested list.
Returns:
dict | None: The nested dictionary or list item found for the given key. If the
key is invalid, or if the specified index is out of bounds for a list, it
returns None.
Example:
>>> data = {
... "attr1": [
... {"type": "int", "value": 10}, {"type": "string", "value": "hello"}
... ],
... "attr2": {
... "type": "MyClass",
... "value": {"sub_attr": {"type": "float", "value": 20.5}}
... }
... }
>>> extract_dict_or_list_entry(data, "attr1[1]")
{"type": "string", "value": "hello"}
>>> extract_dict_or_list_entry(data, "attr2")
{"type": "MyClass", "value": {"sub_attr": {"type": "float", "value": 20.5}}}
"""
attr_name = key
index: Optional[int] = None
# Check if the key contains an index part like '[<index>]'
if "[" in key and key.endswith("]"):
attr_name, index_part = key.split("[", 1)
index_part = index_part.rstrip("]") # remove the closing bracket
# Convert the index part to an integer
if index_part.isdigit():
index = int(index_part)
else:
logger.error(f"Invalid index format in key: {key}")
current_data: dict[str, Any] | list[dict[str, Any]] | None = data.get(
attr_name, None
)
if not isinstance(current_data, dict):
# key does not exist in dictionary, e.g. when class does not have this
# attribute
return None
if isinstance(current_data["value"], list):
current_data = current_data["value"]
if index is not None and 0 <= index < len(current_data):
current_data = current_data[index]
else:
return None
# When the attribute is a class instance, the attributes are nested in the
# "value" key
if current_data["type"] not in STANDARD_TYPES:
current_data = cast(dict[str, Any], current_data.get("value", None)) # type: ignore
assert isinstance(current_data, dict)
return current_data
def get_nested_value_from_DataService_by_path_and_key(
data: dict[str, Any], path: str, key: str = "value"
) -> Any:
"""
Get the value associated with a specific key from a dictionary given a path.
This function traverses the dictionary according to the path provided and
returns the value associated with the specified key at that path. The path is
a string with dots connecting the levels and brackets indicating list indices.
The function can handle complex dictionaries where data is nested within different
types of objects. It checks the type of each object it encounters and correctly
descends into the object if it is not a standard type (i.e., int, float, bool, str,
Enum).
Args:
data (dict): The input dictionary to get the value from.
path (str): The path to the value in the dictionary.
key (str, optional): The key associated with the value to be returned.
Default is "value".
Returns:
Any: The value associated with the specified key at the given path in the
dictionary.
Examples:
Let's consider the following dictionary:
>>> data = {
>>> "attr1": {"type": "int", "value": 10},
>>> "attr2": {
"type": "MyClass",
"value": {"attr3": {"type": "float", "value": 20.5}}
}
>>> }
The function can be used to get the value of 'attr1' as follows:
>>> get_nested_value_by_path_and_key(data, "attr1")
10
It can also be used to get the value of 'attr3', which is nested within 'attr2',
as follows:
>>> get_nested_value_by_path_and_key(data, "attr2.attr3", "type")
float
"""
# Split the path into parts
parts: list[str] = re.split(r"\.", path) # Split by '.'
current_data: dict[str, Any] | None = data
for part in parts:
if current_data is None:
return
current_data = extract_dict_or_list_entry(current_data, part)
if isinstance(current_data, dict):
return current_data.get(key, None)
def convert_arguments_to_hinted_types(
    args: dict[str, Any], type_hints: dict[str, Any]
) -> dict[str, Any] | str:
@@ -346,38 +148,34 @@ def parse_list_attr_and_index(attr_string: str) -> tuple[str, Optional[int]]:
     """
     Parses an attribute string and extracts a potential list attribute name and its
     index.
+    Logs an error if the index is not a valid digit.
-    This function examines the provided attribute string. If the string contains square
-    brackets, it assumes that it's a list attribute and the string within brackets is
-    the index of an element. It then returns the attribute name and the index as an
-    integer. If no brackets are present, the function assumes it's a regular attribute
-    and returns the attribute name and None as the index.
-    Parameters:
-    -----------
-    attr_string: str
-        The attribute string to parse. Can be a regular attribute name (e.g.
-        'attr_name') or a list attribute with an index (e.g. 'list_attr[2]').
+    Args:
+        attr_string (str):
+            The attribute string to parse. Can be a regular attribute name (e.g.,
+            'attr_name') or a list attribute with an index (e.g., 'list_attr[2]').
     Returns:
-    --------
-    tuple: (str, Optional[int])
-        A tuple containing the attribute name as a string and the index as an integer if
-        present, otherwise None.
+        tuple[str, Optional[int]]:
+            A tuple containing the attribute name as a string and the index as an
+            integer if present, otherwise None.
-    Example:
-    --------
-    >>> parse_list_attr_and_index('list_attr[2]')
-    ('list_attr', 2)
-    >>> parse_list_attr_and_index('attr_name')
-    ('attr_name', None)
+    Examples:
+        >>> parse_attribute_and_index('list_attr[2]')
+        ('list_attr', 2)
+        >>> parse_attribute_and_index('attr_name')
+        ('attr_name', None)
     """
-    attr_name = attr_string
     index = None
-    if "[" in attr_string and "]" in attr_string:
-        attr_name, idx = attr_string[:-1].split("[")
-        index = int(idx)
+    attr_name = attr_string
+    if "[" in attr_string and attr_string.endswith("]"):
+        attr_name, index_part = attr_string.split("[", 1)
+        index_part = index_part.rstrip("]")
+        if index_part.isdigit():
+            index = int(index_part)
+        else:
+            logger.error(f"Invalid index format in key: {attr_name}")
     return attr_name, index
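The rewritten parsing logic is self-contained and can be exercised directly; the sketch below mirrors the new function body:

```python
import logging
from typing import Optional

logger = logging.getLogger(__name__)

def parse_list_attr_and_index(attr_string: str) -> tuple[str, Optional[int]]:
    # Mirrors the rewritten body: only digit indices are accepted; anything
    # else is logged as an error and reported as a missing index (None).
    index: Optional[int] = None
    attr_name = attr_string
    if "[" in attr_string and attr_string.endswith("]"):
        attr_name, index_part = attr_string.split("[", 1)
        index_part = index_part.rstrip("]")
        if index_part.isdigit():
            index = int(index_part)
        else:
            logger.error(f"Invalid index format in key: {attr_name}")
    return attr_name, index

print(parse_list_attr_and_index("list_attr[2]"))  # ('list_attr', 2)
print(parse_list_attr_and_index("attr_name"))     # ('attr_name', None)
```

Unlike the old `int(idx)` version, a malformed index such as `list_attr[x]` no longer raises ValueError; it is logged and the index comes back as None.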


@@ -1,82 +1,115 @@
 import logging
 import sys
-from types import FrameType
+from copy import copy
 from typing import Optional
-import loguru
-import rpyc
+import uvicorn.logging
 from uvicorn.config import LOGGING_CONFIG
 import pydase.config
-ALLOWED_LOG_LEVELS = ["DEBUG", "INFO", "ERROR"]
-# from: https://github.com/Delgan/loguru section
-# "Entirely compatible with standard logging"
-class InterceptHandler(logging.Handler):
-    def emit(self, record: logging.LogRecord) -> None:
-        # Ignore "asyncio.CancelledError" raised by uvicorn
-        if record.name == "uvicorn.error" and "CancelledError" in record.msg:
-            return
-        # Get corresponding Loguru level if it exists.
-        level: int | str
-        try:
-            level = loguru.logger.level(record.levelname).name
-        except ValueError:
-            level = record.levelno
-        # Find caller from where originated the logged message.
-        frame: Optional[FrameType] = sys._getframe(6)
-        depth = 6
-        while frame and frame.f_code.co_filename == logging.__file__:
-            frame = frame.f_back
-            depth += 1
-        try:
-            msg = record.getMessage()
-        except TypeError:
-            # A `TypeError` is raised when the `msg` string expects more arguments
-            # than are provided by `args`. This can happen when intercepting log
-            # messages with a certain format, like
-            # > logger.debug("call: %s%r", method_name, *args)  # in tiqi_rpc
-            # where `*args` unpacks a sequence of values that should replace
-            # placeholders in the string.
-            msg = record.msg % (record.args[0], record.args[2:])  # type: ignore
-        loguru.logger.opt(depth=depth, exception=record.exc_info).log(level, msg)
+class DefaultFormatter(uvicorn.logging.ColourizedFormatter):
+    """
+    A custom log formatter class that:
+    * Outputs the LOG_LEVEL with an appropriate color.
+    * If a log call includes an `extras={"color_message": ...}` it will be used
+      for formatting the output, instead of the plain text message.
+    """
+    def formatMessage(self, record: logging.LogRecord) -> str:
+        recordcopy = copy(record)
+        levelname = recordcopy.levelname
+        seperator = " " * (8 - len(recordcopy.levelname))
+        if self.use_colors:
+            levelname = self.color_level_name(levelname, recordcopy.levelno)
+            if "color_message" in recordcopy.__dict__:
+                recordcopy.msg = recordcopy.__dict__["color_message"]
+                recordcopy.__dict__["message"] = recordcopy.getMessage()
+        recordcopy.__dict__["levelprefix"] = levelname + seperator
+        return logging.Formatter.formatMessage(self, recordcopy)
+    def should_use_colors(self) -> bool:
+        return sys.stderr.isatty()  # pragma: no cover
-def setup_logging(level: Optional[str] = None) -> None:
-    loguru.logger.debug("Configuring service logging.")
+def setup_logging(level: Optional[str | int] = None) -> None:
+    """
+    Configures the logging settings for the application.
+    This function sets up logging with specific formatting and colorization of log
+    messages. The log level is determined based on the application's operation mode,
+    with an option to override the level. By default, in a development environment, the
+    log level is set to DEBUG, whereas in other environments, it is set to INFO.
+    Parameters:
+        level (Optional[str | int]):
+            A specific log level to set for the application. If None, the log level is
+            determined based on the application's operation mode. Accepts standard log
+            level names ('DEBUG', 'INFO', etc.) and corresponding numerical values.
+    Example:
+        >>> import logging
+        >>> setup_logging(logging.DEBUG)
+        >>> setup_logging("INFO")
+    """
+    logger = logging.getLogger()
     if pydase.config.OperationMode().environment == "development":
-        log_level = "DEBUG"
+        log_level = logging.DEBUG
     else:
-        log_level = "INFO"
+        log_level = logging.INFO
-    if level is not None and level in ALLOWED_LOG_LEVELS:
-        log_level = level
+    # If a level is specified, check whether it's a string or an integer.
+    if level is not None:
+        if isinstance(level, str):
+            # Convert known log level strings directly to their corresponding logging
+            # module constants.
+            level_name = level.upper()  # Ensure level names are uppercase
+            if hasattr(logging, level_name):
+                log_level = getattr(logging, level_name)
+            else:
+                raise ValueError(
+                    f"Invalid log level: {level}. Must be one of 'DEBUG', 'INFO', "
+                    "'WARNING', 'ERROR', etc."
+                )
+        elif isinstance(level, int):
+            log_level = level  # Directly use integer levels
+        else:
+            raise ValueError("Log level must be a string or an integer.")
-    loguru.logger.remove()
-    loguru.logger.add(sys.stderr, level=log_level)
-    # set up the rpyc logger *before* adding the InterceptHandler to the logging module
-    rpyc.setup_logger(quiet=True)  # type: ignore
-    logging.basicConfig(handlers=[InterceptHandler()], level=0)
+    # Set the logger's level.
+    logger.setLevel(log_level)
+    # create console handler and set level to debug
+    ch = logging.StreamHandler()
+    # add formatter to ch
+    ch.setFormatter(
+        DefaultFormatter(
+            fmt="%(asctime)s.%(msecs)03d | %(levelprefix)s | %(name)s:%(funcName)s:%(lineno)d - %(message)s",
+            datefmt="%Y-%m-%d %H:%M:%S",
+        )
+    )
+    # add ch to logger
+    logger.addHandler(ch)
+    logger.debug("Configuring service logging.")
     logging.getLogger("asyncio").setLevel(logging.INFO)
     logging.getLogger("urllib3").setLevel(logging.INFO)
-    # overwriting the uvicorn logging config to use the loguru intercept handler
-    LOGGING_CONFIG["handlers"] = {
-        "default": {
-            "()": InterceptHandler,
-            "formatter": "default",
-        },
-        "access": {
-            "()": InterceptHandler,
-            "formatter": "access",
-        },
-    }
+    # configuring uvicorn logger
+    LOGGING_CONFIG["formatters"]["default"][
+        "fmt"
+    ] = "%(asctime)s.%(msecs)03d | %(levelprefix)s %(message)s"
+    LOGGING_CONFIG["formatters"]["default"]["datefmt"] = "%Y-%m-%d %H:%M:%S"
+    LOGGING_CONFIG["formatters"]["access"][
+        "fmt"
+    ] = '%(asctime)s.%(msecs)03d | %(levelprefix)s %(client_addr)s - "%(request_line)s" %(status_code)s'
+    LOGGING_CONFIG["formatters"]["access"]["datefmt"] = "%Y-%m-%d %H:%M:%S"

View File

@@ -0,0 +1,387 @@
import inspect
import logging
from collections.abc import Callable
from enum import Enum
from typing import Any, Optional
import pydase.units as u
from pydase.data_service.abstract_data_service import AbstractDataService
from pydase.utils.helpers import (
get_attribute_doc,
get_component_class_names,
parse_list_attr_and_index,
)
logger = logging.getLogger(__name__)
class SerializationPathError(Exception):
pass
class SerializationValueError(Exception):
pass
class Serializer:
@staticmethod
def serialize_object(obj: Any) -> dict[str, Any]:
result: dict[str, Any] = {}
if isinstance(obj, AbstractDataService):
result = Serializer._serialize_DataService(obj)
elif isinstance(obj, list):
result = Serializer._serialize_list(obj)
elif isinstance(obj, dict):
result = Serializer._serialize_dict(obj)
# Special handling for u.Quantity
elif isinstance(obj, u.Quantity):
result = Serializer._serialize_Quantity(obj)
# Handling for Enums
elif isinstance(obj, Enum):
result = Serializer._serialize_enum(obj)
# Methods and coroutines
elif inspect.isfunction(obj) or inspect.ismethod(obj):
result = Serializer._serialize_method(obj)
else:
obj_type = type(obj).__name__
value = obj
readonly = False
doc = get_attribute_doc(obj)
result = {
"type": obj_type,
"value": value,
"readonly": readonly,
"doc": doc,
}
return result
@staticmethod
def _serialize_enum(obj: Enum) -> dict[str, Any]:
value = obj.name
readonly = False
doc = get_attribute_doc(obj)
if type(obj).__base__.__name__ == "ColouredEnum":
obj_type = "ColouredEnum"
else:
obj_type = "Enum"
return {
"type": obj_type,
"value": value,
"readonly": readonly,
"doc": doc,
"enum": {
name: member.value for name, member in obj.__class__.__members__.items()
},
}
@staticmethod
def _serialize_Quantity(obj: u.Quantity) -> dict[str, Any]:
obj_type = "Quantity"
readonly = False
doc = get_attribute_doc(obj)
value = {"magnitude": obj.m, "unit": str(obj.u)}
return {
"type": obj_type,
"value": value,
"readonly": readonly,
"doc": doc,
}
@staticmethod
def _serialize_dict(obj: dict[str, Any]) -> dict[str, Any]:
obj_type = "dict"
readonly = False
doc = get_attribute_doc(obj)
value = {key: Serializer.serialize_object(val) for key, val in obj.items()}
return {
"type": obj_type,
"value": value,
"readonly": readonly,
"doc": doc,
}
@staticmethod
def _serialize_list(obj: list[Any]) -> dict[str, Any]:
obj_type = "list"
readonly = False
doc = get_attribute_doc(obj)
value = [Serializer.serialize_object(o) for o in obj]
return {
"type": obj_type,
"value": value,
"readonly": readonly,
"doc": doc,
}
@staticmethod
def _serialize_method(obj: Callable[..., Any]) -> dict[str, Any]:
obj_type = "method"
value = None
readonly = True
doc = get_attribute_doc(obj)
# Store parameters and their annotations in a dictionary
sig = inspect.signature(obj)
parameters: dict[str, Optional[str]] = {}
for k, v in sig.parameters.items():
annotation = v.annotation
if annotation is not inspect._empty:
if isinstance(annotation, type):
# Handle regular types
parameters[k] = annotation.__name__
else:
# Union, string annotation, Literal types, ...
parameters[k] = str(annotation)
else:
parameters[k] = None
return {
"type": obj_type,
"value": value,
"readonly": readonly,
"doc": doc,
"async": inspect.iscoroutinefunction(obj),
"parameters": parameters,
}
@staticmethod
def _serialize_DataService(obj: AbstractDataService) -> dict[str, Any]:
readonly = False
doc = get_attribute_doc(obj)
obj_type = type(obj).__name__
if type(obj).__name__ not in get_component_class_names():
obj_type = "DataService"
# Get the dictionary of the base class
base_set = set(type(obj).__base__.__dict__)
# Get the dictionary of the derived class
derived_set = set(type(obj).__dict__)
# Get the difference between the two dictionaries
derived_only_set = derived_set - base_set
instance_dict = set(obj.__dict__)
# Merge the class and instance dictionaries
merged_set = derived_only_set | instance_dict
value = {}
# Iterate over attributes, properties, class attributes, and methods
for key in sorted(merged_set):
if key.startswith("_"):
continue # Skip attributes that start with underscore
# Skip keys that start with "start_" or "stop_" and end with an async
# method name
if (key.startswith("start_") or key.startswith("stop_")) and key.split(
"_", 1
)[1] in {
name
for name, _ in inspect.getmembers(
obj, predicate=inspect.iscoroutinefunction
)
}:
continue
val = getattr(obj, key)
value[key] = Serializer.serialize_object(val)
# If there's a running task for this method
if key in obj._task_manager.tasks:
task_info = obj._task_manager.tasks[key]
value[key]["value"] = task_info["kwargs"]
# If the DataService attribute is a property
if isinstance(getattr(obj.__class__, key, None), property):
prop: property = getattr(obj.__class__, key)
value[key]["readonly"] = prop.fset is None
value[key]["doc"] = get_attribute_doc(prop) # overwrite the doc
return {
"type": obj_type,
"value": value,
"readonly": readonly,
"doc": doc,
}
def dump(obj: Any) -> dict[str, Any]:
return Serializer.serialize_object(obj)
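The fallback branch of `serialize_object` above wraps any plain value in a `{type, value, readonly, doc}` entry, and `_serialize_list` applies this recursively. A minimal standalone sketch of those two branches (no pydase imports; `serialize_plain` and `serialize_list` are illustrative names, and `doc` is hard-coded to `None` where the real code calls `get_attribute_doc`):

```python
from typing import Any

def serialize_plain(obj: Any) -> dict[str, Any]:
    # Mirrors the fallback branch of Serializer.serialize_object:
    # plain values are wrapped with their type name and metadata.
    return {
        "type": type(obj).__name__,  # e.g. "int", "float", "str"
        "value": obj,
        "readonly": False,
        "doc": None,  # the real code derives this via get_attribute_doc(obj)
    }

def serialize_list(obj: list[Any]) -> dict[str, Any]:
    # Mirrors _serialize_list: every element is serialized on its own.
    return {
        "type": "list",
        "value": [serialize_plain(o) for o in obj],
        "readonly": False,
        "doc": None,
    }
```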
def set_nested_value_by_path(
serialization_dict: dict[str, Any], path: str, value: Any
) -> None:
"""
Set a value in a nested dictionary structure, which conforms to the serialization
format used by `pydase.utils.serializer.Serializer`, using a dot-notation path.
Args:
serialization_dict:
The base dictionary representing data serialized with
`pydase.utils.serializer.Serializer`.
path:
The dot-notation path (e.g., 'attr1.attr2[0].attr3') indicating where to
set the value.
value:
The new value to set at the specified path.
Note:
- If the index equals the length of the list, the function will append the
serialized representation of the 'value' to the list.
"""
parent_path_parts, attr_name = path.split(".")[:-1], path.split(".")[-1]
current_dict: dict[str, Any] = serialization_dict
try:
for path_part in parent_path_parts:
current_dict = get_next_level_dict_by_key(
current_dict, path_part, allow_append=False
)
current_dict = current_dict["value"]
current_dict = get_next_level_dict_by_key(
current_dict, attr_name, allow_append=True
)
except (SerializationPathError, SerializationValueError, KeyError) as e:
logger.error(e)
return
# setting the new value
serialized_value = dump(value)
if "readonly" in current_dict:
current_dict["value"] = serialized_value["value"]
current_dict["type"] = serialized_value["type"]
else:
current_dict.update(serialized_value)
def get_nested_dict_by_path(
serialization_dict: dict[str, Any],
path: str,
) -> dict[str, Any]:
parent_path_parts, attr_name = path.split(".")[:-1], path.split(".")[-1]
current_dict: dict[str, Any] = serialization_dict
try:
for path_part in parent_path_parts:
current_dict = get_next_level_dict_by_key(
current_dict, path_part, allow_append=False
)
current_dict = current_dict["value"]
current_dict = get_next_level_dict_by_key(
current_dict, attr_name, allow_append=False
)
except (SerializationPathError, SerializationValueError, KeyError) as e:
logger.error(e)
return {}
return current_dict
def get_next_level_dict_by_key(
serialization_dict: dict[str, Any],
attr_name: str,
allow_append: bool = False,
) -> dict[str, Any]:
"""
Retrieve a nested dictionary entry or list item from a data structure serialized
with `pydase.utils.serializer.Serializer`.
Args:
serialization_dict: The base dictionary representing serialized data.
attr_name: The key name representing the attribute in the dictionary,
e.g. 'list_attr[0]' or 'attr'
allow_append: Flag to allow appending a new entry if `index` is out of range by
one.
Returns:
The dictionary or list item corresponding to the attribute and index.
Raises:
SerializationPathError: If the path composed of `attr_name` and `index` is
invalid or leads to an IndexError or KeyError.
SerializationValueError: If the expected nested structure is not a dictionary.
"""
# Check if the key contains an index part like 'attr_name[<index>]'
attr_name, index = parse_list_attr_and_index(attr_name)
try:
if index is not None:
serialization_dict = serialization_dict[attr_name]["value"][index]
else:
serialization_dict = serialization_dict[attr_name]
except IndexError as e:
if allow_append and index == len(serialization_dict[attr_name]["value"]):
# Appending to list
serialization_dict[attr_name]["value"].append({})
serialization_dict = serialization_dict[attr_name]["value"][index]
else:
raise SerializationPathError(
f"Error occurred trying to change '{attr_name}[{index}]': {e}"
)
except KeyError:
raise SerializationPathError(
f"Error occurred trying to access the key '{attr_name}': it is either "
"not present in the current dictionary or its value does not contain "
"a 'value' key."
)
if not isinstance(serialization_dict, dict):
raise SerializationValueError(
f"Expected a dictionary at '{attr_name}', but found type "
f"'{type(serialization_dict).__name__}' instead."
)
return serialization_dict
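`get_next_level_dict_by_key` above delegates key splitting to `parse_list_attr_and_index` from `pydase.utils.helpers`. A plausible regex-based sketch of that helper (an assumption about its behaviour, not the library's actual code):

```python
import re
from typing import Optional

def parse_list_attr_and_index(attr_string: str) -> tuple[str, Optional[int]]:
    # Split "attr_name[2]" into ("attr_name", 2); plain names get index None.
    match = re.match(r"^(\w+)\[(\d+)\]$", attr_string)
    if match:
        return match.group(1), int(match.group(2))
    return attr_string, None
```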
def generate_serialized_data_paths(
data: dict[str, Any], parent_path: str = ""
) -> list[str]:
"""
Generate a list of access paths for all attributes in a dictionary representing
data serialized with `pydase.utils.serializer.Serializer`, excluding those that are
methods.
Args:
data: The dictionary representing serialized data, typically produced by
`pydase.utils.serializer.Serializer`.
parent_path: The base path to prepend to the keys in the `data` dictionary to
form the access paths. Defaults to an empty string.
Returns:
A list of strings where each string is a dot-notation access path to an
attribute in the serialized data.
"""
paths = []
for key, value in data.items():
if value["type"] == "method":
# ignoring methods
continue
new_path = f"{parent_path}.{key}" if parent_path else key
if isinstance(value["value"], dict) and value["type"] != "Quantity":
paths.extend(generate_serialized_data_paths(value["value"], new_path)) # type: ignore
elif isinstance(value["value"], list):
for index, item in enumerate(value["value"]):
indexed_key_path = f"{new_path}[{index}]"
if isinstance(item["value"], dict):
paths.extend( # type: ignore
generate_serialized_data_paths(item["value"], indexed_key_path)
)
else:
paths.append(indexed_key_path) # type: ignore
else:
paths.append(new_path) # type: ignore
return paths
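`generate_serialized_data_paths` can be exercised on a hand-written serialized dict. A trimmed standalone re-implementation (`walk_paths` and `sample` are illustrative names; unlike the original, this sketch does not recurse into list items):

```python
from typing import Any

def walk_paths(data: dict[str, Any], parent: str = "") -> list[str]:
    # Simplified version of generate_serialized_data_paths:
    # skip methods, recurse into nested dicts, index into lists.
    paths: list[str] = []
    for key, entry in data.items():
        if entry["type"] == "method":
            continue  # ignoring methods
        path = f"{parent}.{key}" if parent else key
        if isinstance(entry["value"], dict) and entry["type"] != "Quantity":
            paths.extend(walk_paths(entry["value"], path))
        elif isinstance(entry["value"], list):
            for i, _item in enumerate(entry["value"]):
                paths.append(f"{path}[{i}]")
        else:
            paths.append(path)
    return paths

sample = {
    "name": {"type": "str", "value": "Service", "readonly": False, "doc": None},
    "sub": {
        "type": "DataService",
        "value": {
            "flag": {"type": "bool", "value": True, "readonly": False, "doc": None},
        },
        "readonly": False,
        "doc": None,
    },
}
```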

View File

@@ -1,4 +1,8 @@
-from loguru import logger
+import logging
+
+from pydase.utils.helpers import get_component_class_names
+
+logger = logging.getLogger(__name__)
def warn_if_instance_class_does_not_inherit_from_DataService(__value: object) -> None:
@@ -13,7 +17,8 @@ def warn_if_instance_class_does_not_inherit_from_DataService(__value: object) ->
"asyncio.unix_events",
"_abc",
]
-and base_class_name not in ["DataService", "list", "Enum"]
+and base_class_name
+not in ["DataService", "list", "Enum"] + get_component_class_names()
and type(__value).__name__ not in ["CallbackManager", "TaskManager", "Quantity"]
):
logger.warning(

View File

@@ -1,26 +0,0 @@
from collections.abc import Generator
from typing import Any
import pytest
from loguru import logger
from pytest import LogCaptureFixture
from pydase import DataService
from pydase.data_service.callback_manager import CallbackManager
@pytest.fixture
def caplog(caplog: LogCaptureFixture) -> Generator[LogCaptureFixture, Any, None]:
handler_id = logger.add(caplog.handler, format="{message}")
yield caplog
logger.remove(handler_id)
def emit(self: Any, parent_path: str, name: str, value: Any) -> None:
if isinstance(value, DataService):
value = value.serialize()
print(f"{parent_path}.{name} = {value}")
CallbackManager.emit_notification = emit # type: ignore

View File

@@ -0,0 +1,41 @@
from pytest import LogCaptureFixture
from pydase.components.coloured_enum import ColouredEnum
from pydase.data_service.data_service import DataService
def test_ColouredEnum(caplog: LogCaptureFixture) -> None:
class MyStatus(ColouredEnum):
RUNNING = "#00FF00"
FAILING = "#FF0000"
class ServiceClass(DataService):
_status = MyStatus.RUNNING
@property
def status(self) -> MyStatus:
return self._status
@status.setter
def status(self, value: MyStatus) -> None:
# do something ...
self._status = value
service = ServiceClass()
service.status = MyStatus.FAILING
assert "ServiceClass.status changed to MyStatus.FAILING" in caplog.text
def test_warning(caplog: LogCaptureFixture) -> None: # noqa
class MyStatus(ColouredEnum):
RUNNING = "#00FF00"
FAILING = "#FF0000"
class ServiceClass(DataService):
status = MyStatus.RUNNING
assert (
"Warning: Class MyStatus does not inherit from DataService." not in caplog.text
)

View File

@@ -3,10 +3,8 @@ from pytest import CaptureFixture, LogCaptureFixture
from pydase.components.number_slider import NumberSlider
from pydase.data_service.data_service import DataService
-from .. import caplog  # noqa
-def test_NumberSlider(capsys: CaptureFixture) -> None:
+def test_NumberSlider(caplog: LogCaptureFixture) -> None:
class ServiceClass(DataService):
number_slider = NumberSlider(1, 0, 10, 1)
int_number_slider = NumberSlider(1, 0, 10, 1, "int")
@@ -30,28 +28,13 @@ def test_NumberSlider(capsys: CaptureFixture) -> None:
service.number_slider.value = 10.0
service.int_number_slider.value = 10.1
-captured = capsys.readouterr()
-expected_output = sorted(
-    [
-        "ServiceClass.number_slider.value = 10.0",
-        "ServiceClass.int_number_slider.value = 10",
-    ]
-)
-actual_output = sorted(captured.out.strip().split("\n"))  # type: ignore
-assert actual_output == expected_output
+assert "ServiceClass.number_slider.value changed to 10.0" in caplog.text
+assert "ServiceClass.int_number_slider.value changed to 10" in caplog.text
+caplog.clear()
service.number_slider.min = 1.1
-captured = capsys.readouterr()
-expected_output = sorted(
-    [
-        "ServiceClass.number_slider.min = 1.1",
-    ]
-)
-actual_output = sorted(captured.out.strip().split("\n"))  # type: ignore
-assert actual_output == expected_output
+assert "ServiceClass.number_slider.min changed to 1.1" in caplog.text
def test_init_error(caplog: LogCaptureFixture) -> None:  # noqa

View File

@@ -0,0 +1,42 @@
import logging
from pytest import LogCaptureFixture
import pydase
logger = logging.getLogger()
def test_DataService_task_callback(caplog: LogCaptureFixture) -> None:
class MyService(pydase.DataService):
async def my_task(self) -> None:
logger.info("Triggered task.")
async def my_other_task(self) -> None:
logger.info("Triggered other task.")
service = MyService()
service.start_my_task() # type: ignore
service.start_my_other_task() # type: ignore
assert "MyService.my_task changed to {}" in caplog.text
assert "MyService.my_other_task changed to {}" in caplog.text
def test_DataServiceList_task_callback(caplog: LogCaptureFixture) -> None:
class MySubService(pydase.DataService):
async def my_task(self) -> None:
logger.info("Triggered task.")
async def my_other_task(self) -> None:
logger.info("Triggered other task.")
class MyService(pydase.DataService):
sub_services_list = [MySubService() for i in range(2)]
service = MyService()
service.sub_services_list[0].start_my_task() # type: ignore
service.sub_services_list[1].start_my_other_task() # type: ignore
assert "MyService.sub_services_list[0].my_task changed to {}" in caplog.text
assert "MyService.sub_services_list[1].my_other_task changed to {}" in caplog.text

View File

@@ -1,64 +0,0 @@
from enum import Enum
import pydase
def test_enum_serialize() -> None:
class EnumClass(Enum):
FOO = "foo"
BAR = "bar"
class EnumAttribute(pydase.DataService):
def __init__(self) -> None:
self.some_enum = EnumClass.FOO
super().__init__()
class EnumPropertyWithoutSetter(pydase.DataService):
def __init__(self) -> None:
self._some_enum = EnumClass.FOO
super().__init__()
@property
def some_enum(self) -> EnumClass:
return self._some_enum
class EnumPropertyWithSetter(pydase.DataService):
def __init__(self) -> None:
self._some_enum = EnumClass.FOO
super().__init__()
@property
def some_enum(self) -> EnumClass:
return self._some_enum
@some_enum.setter
def some_enum(self, value: EnumClass) -> None:
self._some_enum = value
assert EnumAttribute().serialize() == {
"some_enum": {
"type": "Enum",
"value": "FOO",
"enum": {"FOO": "foo", "BAR": "bar"},
"readonly": False,
"doc": None,
}
}
assert EnumPropertyWithoutSetter().serialize() == {
"some_enum": {
"type": "Enum",
"value": "FOO",
"enum": {"FOO": "foo", "BAR": "bar"},
"readonly": True,
"doc": None,
}
}
assert EnumPropertyWithSetter().serialize() == {
"some_enum": {
"type": "Enum",
"value": "FOO",
"enum": {"FOO": "foo", "BAR": "bar"},
"readonly": False,
"doc": None,
}
}
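The expected dictionaries above follow directly from `Serializer._serialize_enum`: the serialized `value` is the member *name*, while `enum` maps every name to its underlying value. A standalone sketch of that mapping (`serialize_enum` is an illustrative name, covering the plain-`Enum` case only and ignoring `ColouredEnum`):

```python
from enum import Enum
from typing import Any

class EnumClass(Enum):
    FOO = "foo"
    BAR = "bar"

def serialize_enum(member: Enum) -> dict[str, Any]:
    # Mirrors _serialize_enum: value is the member name, and "enum"
    # lists every name together with its underlying value.
    return {
        "type": "Enum",
        "value": member.name,
        "enum": {
            name: m.value for name, m in member.__class__.__members__.items()
        },
        "readonly": False,
        "doc": None,
    }
```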

View File

@@ -0,0 +1,25 @@
import logging
import pydase
from pydase.data_service.data_service_cache import DataServiceCache
from pydase.utils.serializer import get_nested_dict_by_path
logger = logging.getLogger()
def test_nested_attributes_cache_callback() -> None:
class SubClass(pydase.DataService):
name = "Hello"
class ServiceClass(pydase.DataService):
class_attr = SubClass()
name = "World"
test_service = ServiceClass()
cache = DataServiceCache(test_service)
test_service.name = "Peepz"
assert get_nested_dict_by_path(cache.cache, "name")["value"] == "Peepz"
test_service.class_attr.name = "Ciao"
assert get_nested_dict_by_path(cache.cache, "class_attr.name")["value"] == "Ciao"

View File

@@ -0,0 +1,129 @@
from typing import Any
from pytest import LogCaptureFixture
import pydase.units as u
from pydase import DataService
def test_class_list_attribute(caplog: LogCaptureFixture) -> None:
class ServiceClass(DataService):
attr = [0, 1]
service_instance = ServiceClass()
service_instance.attr[0] = 1337
assert "ServiceClass.attr[0] changed to 1337" in caplog.text
caplog.clear()
def test_instance_list_attribute(caplog: LogCaptureFixture) -> None:
class SubClass(DataService):
name = "SubClass"
class ServiceClass(DataService):
def __init__(self) -> None:
self.attr: list[Any] = [0, SubClass()]
super().__init__()
service_instance = ServiceClass()
service_instance.attr[0] = "Hello"
assert "ServiceClass.attr[0] changed to Hello" in caplog.text
caplog.clear()
service_instance.attr[1] = SubClass()
assert f"ServiceClass.attr[1] changed to {service_instance.attr[1]}" in caplog.text
caplog.clear()
def test_reused_instance_list_attribute(caplog: LogCaptureFixture) -> None:
some_list = [0, 1, 2]
class ServiceClass(DataService):
def __init__(self) -> None:
self.attr = some_list
self.attr_2 = some_list
self.attr_3 = [0, 1, 2]
super().__init__()
service_instance = ServiceClass()
service_instance.attr[0] = 20
assert service_instance.attr == service_instance.attr_2
assert service_instance.attr != service_instance.attr_3
assert "ServiceClass.attr[0] changed to 20" in caplog.text
assert "ServiceClass.attr_2[0] changed to 20" in caplog.text
def test_nested_reused_instance_list_attribute(caplog: LogCaptureFixture) -> None:
some_list = [0, 1, 2]
class SubClass(DataService):
attr_list = some_list
def __init__(self) -> None:
self.attr_list_2 = some_list
super().__init__()
class ServiceClass(DataService):
def __init__(self) -> None:
self.attr = some_list
self.subclass = SubClass()
super().__init__()
service_instance = ServiceClass()
service_instance.attr[0] = 20
assert service_instance.attr == service_instance.subclass.attr_list
assert "ServiceClass.attr[0] changed to 20" in caplog.text
assert "ServiceClass.subclass.attr_list[0] changed to 20" in caplog.text
assert "ServiceClass.subclass.attr_list_2[0] changed to 20" in caplog.text
def test_protected_list_attribute(caplog: LogCaptureFixture) -> None:
"""Changing protected lists should not emit notifications for the lists themselves,
but still for all properties depending on them.
"""
class ServiceClass(DataService):
_attr = [0, 1]
@property
def list_dependend_property(self) -> int:
return self._attr[0]
service_instance = ServiceClass()
service_instance._attr[0] = 1337
assert "ServiceClass.list_dependend_property changed to 1337" in caplog.text
def test_converting_int_to_float_entries(caplog: LogCaptureFixture) -> None:
class ServiceClass(DataService):
float_list = [0.0]
service_instance = ServiceClass()
service_instance.float_list[0] = 1
assert isinstance(service_instance.float_list[0], float)
assert "ServiceClass.float_list[0] changed to 1.0" in caplog.text
def test_converting_number_to_quantity_entries(caplog: LogCaptureFixture) -> None:
class ServiceClass(DataService):
quantity_list: list[u.Quantity] = [1 * u.units.A]
service_instance = ServiceClass()
service_instance.quantity_list[0] = 4 # type: ignore
assert isinstance(service_instance.quantity_list[0], u.Quantity)
assert "ServiceClass.quantity_list[0] changed to 4.0 A" in caplog.text
caplog.clear()
service_instance.quantity_list[0] = 3.1 * u.units.mA
assert isinstance(service_instance.quantity_list[0], u.Quantity)
assert "ServiceClass.quantity_list[0] changed to 3.1 mA" in caplog.text

View File

@@ -0,0 +1,278 @@
import json
from pathlib import Path
from typing import Any
from pytest import LogCaptureFixture
import pydase
import pydase.units as u
from pydase.components.coloured_enum import ColouredEnum
from pydase.data_service.state_manager import (
StateManager,
has_load_state_decorator,
load_state,
)
class SubService(pydase.DataService):
name = "SubService"
class State(ColouredEnum):
RUNNING = "#0000FF80"
COMPLETED = "hsl(120, 100%, 50%)"
FAILED = "hsla(0, 100%, 50%, 0.7)"
class Service(pydase.DataService):
def __init__(self, **kwargs: Any) -> None:
self.subservice = SubService()
self.some_unit: u.Quantity = 1.2 * u.units.A
self.some_float = 1.0
self.list_attr = [1.0, 2.0]
self._property_attr = 1337.0
self._name = "Service"
self.state = State.RUNNING
super().__init__(**kwargs)
@property
def name(self) -> str:
return self._name
@property
def property_attr(self) -> float:
return self._property_attr
@property_attr.setter
def property_attr(self, value: float) -> None:
self._property_attr = value
CURRENT_STATE = Service().serialize()
LOAD_STATE = {
"list_attr": {
"type": "list",
"value": [
{"type": "float", "value": 1.4, "readonly": False, "doc": None},
{"type": "float", "value": 2.0, "readonly": False, "doc": None},
],
"readonly": False,
"doc": None,
},
"name": {
"type": "str",
"value": "Another name",
"readonly": True,
"doc": None,
},
"some_float": {
"type": "int",
"value": 10,
"readonly": False,
"doc": None,
},
"property_attr": {
"type": "float",
"value": 1337.1,
"readonly": False,
"doc": None,
},
"some_unit": {
"type": "Quantity",
"value": {"magnitude": 12.0, "unit": "A"},
"readonly": False,
"doc": None,
},
"state": {
"type": "ColouredEnum",
"value": "FAILED",
"readonly": True,
"doc": None,
"enum": {
"RUNNING": "#0000FF80",
"COMPLETED": "hsl(120, 100%, 50%)",
"FAILED": "hsla(0, 100%, 50%, 0.7)",
},
},
"subservice": {
"type": "DataService",
"value": {
"name": {
"type": "str",
"value": "SubService",
"readonly": False,
"doc": None,
}
},
"readonly": False,
"doc": None,
},
"removed_attr": {
"type": "str",
"value": "removed",
"readonly": False,
"doc": None,
},
}
def test_save_state(tmp_path: Path):
# Create a StateManager instance with a temporary file
file = tmp_path / "test_state.json"
manager = StateManager(service=Service(), filename=str(file))
# Trigger the saving action
manager.save_state()
# Now check that the file was written correctly
assert file.read_text() == json.dumps(CURRENT_STATE, indent=4)
def test_load_state(tmp_path: Path, caplog: LogCaptureFixture):
# Create a StateManager instance with a temporary file
file = tmp_path / "test_state.json"
# Write a temporary JSON file to read back
with open(file, "w") as f:
json.dump(LOAD_STATE, f, indent=4)
service = Service()
manager = StateManager(service=service, filename=str(file))
manager.load_state()
assert service.some_unit == u.Quantity(12, "A") # has changed
assert service.list_attr[0] == 1.4 # has changed
assert service.list_attr[1] == 2.0 # has not changed
assert (
service.property_attr == 1337
) # has not changed as property has no @load_state decorator
assert service.state == State.FAILED # has changed
assert service.name == "Service" # has not changed as readonly
assert service.some_float == 1.0 # has not changed due to different type
assert service.subservice.name == "SubService" # didn't change
assert "Service.some_unit changed to 12.0 A!" in caplog.text
assert (
"Property 'name' has no '@load_state' decorator. "
"Ignoring value from JSON file..." in caplog.text
)
assert (
"Attribute type of 'some_float' changed from 'int' to 'float'. "
"Ignoring value from JSON file..."
) in caplog.text
assert (
"Attribute type of 'removed_attr' changed from 'str' to None. "
"Ignoring value from JSON file..." in caplog.text
)
assert "Value of attribute 'subservice.name' has not changed..." in caplog.text
def test_filename_warning(tmp_path: Path, caplog: LogCaptureFixture):
file = tmp_path / "test_state.json"
service = Service(filename=str(file))
StateManager(service=service, filename=str(file))
assert f"Overwriting filename {str(file)!r} with {str(file)!r}." in caplog.text
def test_filename_error(caplog: LogCaptureFixture):
service = Service()
manager = StateManager(service=service)
manager.save_state()
assert (
"State manager was not initialised with a filename. Skipping 'save_state'..."
in caplog.text
)
def test_readonly_attribute(tmp_path: Path, caplog: LogCaptureFixture):
# Create a StateManager instance with a temporary file
file = tmp_path / "test_state.json"
# Write a temporary JSON file to read back
with open(file, "w") as f:
json.dump(LOAD_STATE, f, indent=4)
service = Service()
manager = StateManager(service=service, filename=str(file))
manager.load_state()
assert service.name == "Service"
assert (
"Property 'name' has no '@load_state' decorator. "
"Ignoring value from JSON file..." in caplog.text
)
def test_changed_type(tmp_path: Path, caplog: LogCaptureFixture):
# Create a StateManager instance with a temporary file
file = tmp_path / "test_state.json"
# Write a temporary JSON file to read back
with open(file, "w") as f:
json.dump(LOAD_STATE, f, indent=4)
service = Service()
manager = StateManager(service=service, filename=str(file))
manager.load_state()
assert (
"Attribute type of 'some_float' changed from 'int' to "
"'float'. Ignoring value from JSON file..."
) in caplog.text
def test_property_load_state(tmp_path: Path):
# Create a StateManager instance with a temporary file
file = tmp_path / "test_state.json"
LOAD_STATE = {
"name": {
"type": "str",
"value": "Some other name",
"readonly": False,
"doc": None,
},
"not_loadable_attr": {
"type": "str",
"value": "But I AM loadable!?",
"readonly": False,
"doc": None,
},
}
# Write a temporary JSON file to read back
with open(file, "w") as f:
json.dump(LOAD_STATE, f, indent=4)
class Service(pydase.DataService):
_name = "Service"
_not_loadable_attr = "Not loadable"
@property
def name(self) -> str:
return self._name
@name.setter
@load_state
def name(self, value: str) -> None:
self._name = value
@property
def not_loadable_attr(self) -> str:
return self._not_loadable_attr
@not_loadable_attr.setter
def not_loadable_attr(self, value: str) -> None:
self._not_loadable_attr = value
@property
def property_without_setter(self) -> None:
return
service_instance = Service()
StateManager(service_instance, filename=file).load_state()
assert service_instance.name == "Some other name"
assert service_instance.not_loadable_attr == "Not loadable"
assert not has_load_state_decorator(type(service_instance).property_without_setter)
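The tests above check for the `@load_state` marker on property setters via `has_load_state_decorator`. One plausible way to implement such a marker is an attribute flag on the setter function; a hedged sketch, not necessarily pydase's exact implementation:

```python
from typing import Any, Callable

def load_state(func: Callable[..., Any]) -> Callable[..., Any]:
    # Tag the setter so a state manager can detect the marker later.
    func._load_state = True  # type: ignore[attr-defined]
    return func

def has_load_state_decorator(prop: property) -> bool:
    # A property is loadable only if it has a setter carrying the marker;
    # getattr(None, ...) handles properties without a setter.
    return getattr(prop.fset, "_load_state", False)

class Service:
    _name = "Service"

    @property
    def name(self) -> str:
        return self._name

    @name.setter
    @load_state
    def name(self, value: str) -> None:
        self._name = value
```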

View File

@@ -0,0 +1,85 @@
import logging
from pytest import LogCaptureFixture
import pydase
logger = logging.getLogger()
def test_autostart_task_callback(caplog: LogCaptureFixture) -> None:
class MyService(pydase.DataService):
def __init__(self) -> None:
self._autostart_tasks = { # type: ignore
"my_task": (),
"my_other_task": (),
}
super().__init__()
async def my_task(self) -> None:
logger.info("Triggered task.")
async def my_other_task(self) -> None:
logger.info("Triggered other task.")
service = MyService()
service._task_manager.start_autostart_tasks()
assert "MyService.my_task changed to {}" in caplog.text
assert "MyService.my_other_task changed to {}" in caplog.text
def test_DataService_subclass_autostart_task_callback(
caplog: LogCaptureFixture,
) -> None:
class MySubService(pydase.DataService):
def __init__(self) -> None:
self._autostart_tasks = { # type: ignore
"my_task": (),
"my_other_task": (),
}
super().__init__()
async def my_task(self) -> None:
logger.info("Triggered task.")
async def my_other_task(self) -> None:
logger.info("Triggered other task.")
class MyService(pydase.DataService):
sub_service = MySubService()
service = MyService()
service._task_manager.start_autostart_tasks()
assert "MyService.sub_service.my_task changed to {}" in caplog.text
assert "MyService.sub_service.my_other_task changed to {}" in caplog.text
def test_DataServiceList_subclass_autostart_task_callback(
caplog: LogCaptureFixture,
) -> None:
class MySubService(pydase.DataService):
def __init__(self) -> None:
self._autostart_tasks = { # type: ignore
"my_task": (),
"my_other_task": (),
}
super().__init__()
async def my_task(self) -> None:
logger.info("Triggered task.")
async def my_other_task(self) -> None:
logger.info("Triggered other task.")
class MyService(pydase.DataService):
sub_services_list = [MySubService() for i in range(2)]
service = MyService()
service._task_manager.start_autostart_tasks()
assert "MyService.sub_services_list[0].my_task changed to {}" in caplog.text
assert "MyService.sub_services_list[0].my_other_task changed to {}" in caplog.text
assert "MyService.sub_services_list[1].my_task changed to {}" in caplog.text
assert "MyService.sub_services_list[1].my_other_task changed to {}" in caplog.text

View File

@@ -0,0 +1,35 @@
import signal
from pytest_mock import MockerFixture
import pydase
def test_signal_handling(mocker: MockerFixture):
# Mock os._exit and signal.signal
mock_exit = mocker.patch("os._exit")
mock_signal = mocker.patch("signal.signal")
class MyService(pydase.DataService):
pass
# Instantiate your server object
server = pydase.Server(MyService())
# Call the method to install signal handlers
server.install_signal_handlers()
# Check if the signal handlers were registered correctly
assert mock_signal.call_args_list == [
mocker.call(signal.SIGINT, server.handle_exit),
mocker.call(signal.SIGTERM, server.handle_exit),
]
# Simulate receiving a SIGINT signal for the first time
server.handle_exit(signal.SIGINT, None)
assert server.should_exit # assuming should_exit is public
mock_exit.assert_not_called()
# Simulate receiving a SIGINT signal for the second time
server.handle_exit(signal.SIGINT, None)
mock_exit.assert_called_once_with(1)
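The test above encodes a common two-stage shutdown pattern: the first signal requests a graceful exit, the second forces termination. A minimal sketch of a server with that behaviour (illustrative, not pydase's actual `Server`):

```python
import os
import signal
from typing import Any

class Server:
    """Sketch of the exit behaviour exercised by the test above."""

    def __init__(self) -> None:
        self.should_exit = False

    def install_signal_handlers(self) -> None:
        signal.signal(signal.SIGINT, self.handle_exit)
        signal.signal(signal.SIGTERM, self.handle_exit)

    def handle_exit(self, sig: int = 0, frame: Any = None) -> None:
        if self.should_exit:
            # Second signal: give up on graceful shutdown and terminate.
            os._exit(1)
        # First signal: flag the main loop to wind down gracefully.
        self.should_exit = True
```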

View File

@@ -1,101 +0,0 @@
from pytest import CaptureFixture

from pydase import DataService


def test_class_list_attribute(capsys: CaptureFixture) -> None:
    class ServiceClass(DataService):
        attr = [0, 1]

    service_instance = ServiceClass()

    service_instance.attr[0] = 1337
    captured = capsys.readouterr()
    assert captured.out == "ServiceClass.attr[0] = 1337\n"


def test_instance_list_attribute(capsys: CaptureFixture) -> None:
    class SubClass(DataService):
        name = "SubClass"

    class ServiceClass(DataService):
        def __init__(self) -> None:
            self.attr = [0, SubClass()]
            super().__init__()

    service_instance = ServiceClass()
    _ = capsys.readouterr()

    service_instance.attr[0] = "Hello"
    captured = capsys.readouterr()
    assert captured.out == "ServiceClass.attr[0] = Hello\n"

    service_instance.attr[1] = SubClass()
    captured = capsys.readouterr()
    assert (
        captured.out.strip()
        == "ServiceClass.attr[1] = {'name': {'type': 'str', 'value': 'SubClass',"
        " 'readonly': False, 'doc': None}}"
    )


def test_reused_instance_list_attribute(capsys: CaptureFixture) -> None:
    some_list = [0, 1, 2]

    class ServiceClass(DataService):
        def __init__(self) -> None:
            self.attr = some_list
            self.attr_2 = some_list
            self.attr_3 = [0, 1, 2]
            super().__init__()

    service_instance = ServiceClass()

    service_instance.attr[0] = 20
    captured = capsys.readouterr()
    assert service_instance.attr == service_instance.attr_2
    assert service_instance.attr != service_instance.attr_3
    expected_output = sorted(
        [
            "ServiceClass.attr[0] = 20",
            "ServiceClass.attr_2[0] = 20",
        ]
    )
    actual_output = sorted(captured.out.strip().split("\n"))
    assert actual_output == expected_output


def test_nested_reused_instance_list_attribute(capsys: CaptureFixture) -> None:
    some_list = [0, 1, 2]

    class SubClass(DataService):
        attr_list = some_list

        def __init__(self) -> None:
            self.attr_list_2 = some_list
            super().__init__()

    class ServiceClass(DataService):
        def __init__(self) -> None:
            self.attr = some_list
            self.subclass = SubClass()
            super().__init__()

    service_instance = ServiceClass()
    _ = capsys.readouterr()

    service_instance.attr[0] = 20
    captured = capsys.readouterr()
    assert service_instance.attr == service_instance.subclass.attr_list
    expected_output = sorted(
        [
            "ServiceClass.subclass.attr_list_2[0] = 20",
            "ServiceClass.subclass.attr_list[0] = 20",
            "ServiceClass.attr[0] = 20",
        ]
    )
    actual_output = sorted(captured.out.strip().split("\n"))
    assert actual_output == expected_output

View File

@@ -1,9 +1,9 @@
-from pytest import CaptureFixture
+from pytest import LogCaptureFixture

 from pydase import DataService


-def test_class_attributes(capsys: CaptureFixture) -> None:
+def test_class_attributes(caplog: LogCaptureFixture) -> None:
     class SubClass(DataService):
         name = "Hello"
@@ -11,14 +11,12 @@ def test_class_attributes(capsys: CaptureFixture) -> None:
         attr_1 = SubClass()

     service_instance = ServiceClass()
-    _ = capsys.readouterr()

     service_instance.attr_1.name = "Hi"
-    captured = capsys.readouterr()
-    assert captured.out.strip() == "ServiceClass.attr_1.name = Hi"
+    assert "ServiceClass.attr_1.name changed to Hi" in caplog.text


-def test_instance_attributes(capsys: CaptureFixture) -> None:
+def test_instance_attributes(caplog: LogCaptureFixture) -> None:
     class SubClass(DataService):
         name = "Hello"
@@ -28,25 +26,22 @@ def test_instance_attributes(capsys: CaptureFixture) -> None:
             super().__init__()

     service_instance = ServiceClass()
-    _ = capsys.readouterr()

     service_instance.attr_1.name = "Hi"
-    captured = capsys.readouterr()
-    assert captured.out.strip() == "ServiceClass.attr_1.name = Hi"
+    assert "ServiceClass.attr_1.name changed to Hi" in caplog.text


-def test_class_attribute(capsys: CaptureFixture) -> None:
+def test_class_attribute(caplog: LogCaptureFixture) -> None:
     class ServiceClass(DataService):
         attr = 0

     service_instance = ServiceClass()

     service_instance.attr = 1
-    captured = capsys.readouterr()
-    assert captured.out == "ServiceClass.attr = 1\n"
+    assert "ServiceClass.attr changed to 1" in caplog.text


-def test_instance_attribute(capsys: CaptureFixture) -> None:
+def test_instance_attribute(caplog: LogCaptureFixture) -> None:
     class ServiceClass(DataService):
         def __init__(self) -> None:
             self.attr = "Hello World"
@@ -55,11 +50,10 @@ def test_instance_attribute(capsys: CaptureFixture) -> None:
     service_instance = ServiceClass()

     service_instance.attr = "Hello"
-    captured = capsys.readouterr()
-    assert captured.out == "ServiceClass.attr = Hello\n"
+    assert "ServiceClass.attr changed to Hello" in caplog.text


-def test_reused_instance_attributes(capsys: CaptureFixture) -> None:
+def test_reused_instance_attributes(caplog: LogCaptureFixture) -> None:
     class SubClass(DataService):
         name = "Hello"
@@ -72,22 +66,14 @@ def test_reused_instance_attributes(capsys: CaptureFixture) -> None:
             super().__init__()

     service_instance = ServiceClass()
-    _ = capsys.readouterr()

     service_instance.attr_1.name = "Hi"
-    captured = capsys.readouterr()

     assert service_instance.attr_1 == service_instance.attr_2
-    expected_output = sorted(
-        [
-            "ServiceClass.attr_1.name = Hi",
-            "ServiceClass.attr_2.name = Hi",
-        ]
-    )
-    actual_output = sorted(captured.out.strip().split("\n"))
-    assert actual_output == expected_output
+    assert "ServiceClass.attr_1.name changed to Hi" in caplog.text
+    assert "ServiceClass.attr_2.name changed to Hi" in caplog.text


-def test_reused_attributes_mixed(capsys: CaptureFixture) -> None:
+def test_reused_attributes_mixed(caplog: LogCaptureFixture) -> None:
     class SubClass(DataService):
         pass
@@ -101,22 +87,14 @@ def test_reused_attributes_mixed(capsys: CaptureFixture) -> None:
             super().__init__()

     service_instance = ServiceClass()
-    _ = capsys.readouterr()

     service_instance.attr_1.name = "Hi"
-    captured = capsys.readouterr()

     assert service_instance.attr_1 == service_instance.attr_2
-    expected_output = sorted(
-        [
-            "ServiceClass.attr_1.name = Hi",
-            "ServiceClass.attr_2.name = Hi",
-        ]
-    )
-    actual_output = sorted(captured.out.strip().split("\n"))
-    assert actual_output == expected_output
+    assert "ServiceClass.attr_1.name changed to Hi" in caplog.text
+    assert "ServiceClass.attr_2.name changed to Hi" in caplog.text


-def test_nested_class_attributes(capsys: CaptureFixture) -> None:
+def test_nested_class_attributes(caplog: LogCaptureFixture) -> None:
     class SubSubSubClass(DataService):
         name = "Hello"
@@ -133,26 +111,18 @@ def test_nested_class_attributes(capsys: CaptureFixture) -> None:
         attr = SubClass()

     service_instance = ServiceClass()
-    _ = capsys.readouterr()

     service_instance.attr.attr.attr.name = "Hi"
     service_instance.attr.attr.name = "Hou"
     service_instance.attr.name = "foo"
     service_instance.name = "bar"
-    captured = capsys.readouterr()
-    expected_output = sorted(
-        [
-            "ServiceClass.attr.attr.attr.name = Hi",
-            "ServiceClass.attr.attr.name = Hou",
-            "ServiceClass.attr.name = foo",
-            "ServiceClass.name = bar",
-        ]
-    )
-    actual_output = sorted(captured.out.strip().split("\n"))
-    assert actual_output == expected_output
+    assert "ServiceClass.attr.attr.attr.name changed to Hi" in caplog.text
+    assert "ServiceClass.attr.attr.name changed to Hou" in caplog.text
+    assert "ServiceClass.attr.name changed to foo" in caplog.text
+    assert "ServiceClass.name changed to bar" in caplog.text


-def test_nested_instance_attributes(capsys: CaptureFixture) -> None:
+def test_nested_instance_attributes(caplog: LogCaptureFixture) -> None:
     class SubSubSubClass(DataService):
         name = "Hello"
@@ -175,26 +145,18 @@ def test_nested_instance_attributes(capsys: CaptureFixture) -> None:
             super().__init__()

     service_instance = ServiceClass()
-    _ = capsys.readouterr()

     service_instance.attr.attr.attr.name = "Hi"
     service_instance.attr.attr.name = "Hou"
     service_instance.attr.name = "foo"
     service_instance.name = "bar"
-    captured = capsys.readouterr()
-    expected_output = sorted(
-        [
-            "ServiceClass.attr.attr.attr.name = Hi",
-            "ServiceClass.attr.attr.name = Hou",
-            "ServiceClass.attr.name = foo",
-            "ServiceClass.name = bar",
-        ]
-    )
-    actual_output = sorted(captured.out.strip().split("\n"))
-    assert actual_output == expected_output
+    assert "ServiceClass.attr.attr.attr.name changed to Hi" in caplog.text
+    assert "ServiceClass.attr.attr.name changed to Hou" in caplog.text
+    assert "ServiceClass.attr.name changed to foo" in caplog.text
+    assert "ServiceClass.name changed to bar" in caplog.text


-def test_advanced_nested_class_attributes(capsys: CaptureFixture) -> None:
+def test_advanced_nested_class_attributes(caplog: LogCaptureFixture) -> None:
     class SubSubSubClass(DataService):
         name = "Hello"
@@ -209,32 +171,17 @@ def test_advanced_nested_class_attributes(capsys: CaptureFixture) -> None:
         subattr = SubSubClass()

     service_instance = ServiceClass()
-    _ = capsys.readouterr()

     service_instance.attr.attr.attr.name = "Hi"
-    captured = capsys.readouterr()
-    expected_output = sorted(
-        [
-            "ServiceClass.attr.attr.attr.name = Hi",
-            "ServiceClass.subattr.attr.name = Hi",
-        ]
-    )
-    actual_output = sorted(captured.out.strip().split("\n"))
-    assert actual_output == expected_output
+    assert "ServiceClass.attr.attr.attr.name changed to Hi" in caplog.text
+    assert "ServiceClass.subattr.attr.name changed to Hi" in caplog.text

     service_instance.subattr.attr.name = "Ho"
-    captured = capsys.readouterr()
-    expected_output = sorted(
-        [
-            "ServiceClass.attr.attr.attr.name = Ho",
-            "ServiceClass.subattr.attr.name = Ho",
-        ]
-    )
-    actual_output = sorted(captured.out.strip().split("\n"))
-    assert actual_output == expected_output
+    assert "ServiceClass.attr.attr.attr.name changed to Ho" in caplog.text
+    assert "ServiceClass.subattr.attr.name changed to Ho" in caplog.text


-def test_advanced_nested_instance_attributes(capsys: CaptureFixture) -> None:
+def test_advanced_nested_instance_attributes(caplog: LogCaptureFixture) -> None:
     class SubSubSubClass(DataService):
         name = "Hello"
@@ -257,32 +204,19 @@ def test_advanced_nested_instance_attributes(capsys: CaptureFixture) -> None:
             super().__init__()

     service_instance = ServiceClass()
-    _ = capsys.readouterr()

     service_instance.attr.attr.attr.name = "Hi"
-    captured = capsys.readouterr()
-    expected_output = sorted(
-        [
-            "ServiceClass.attr.attr.attr.name = Hi",
-            "ServiceClass.subattr.attr.name = Hi",
-        ]
-    )
-    actual_output = sorted(captured.out.strip().split("\n"))
-    assert actual_output == expected_output
+    assert "ServiceClass.attr.attr.attr.name changed to Hi" in caplog.text
+    assert "ServiceClass.subattr.attr.name changed to Hi" in caplog.text
+    caplog.clear()

     service_instance.subattr.attr.name = "Ho"
-    captured = capsys.readouterr()
-    expected_output = sorted(
-        [
-            "ServiceClass.attr.attr.attr.name = Ho",
-            "ServiceClass.subattr.attr.name = Ho",
-        ]
-    )
-    actual_output = sorted(captured.out.strip().split("\n"))
-    assert actual_output == expected_output
+    assert "ServiceClass.attr.attr.attr.name changed to Ho" in caplog.text
+    assert "ServiceClass.subattr.attr.name changed to Ho" in caplog.text
+    caplog.clear()


-def test_advanced_nested_attributes_mixed(capsys: CaptureFixture) -> None:
+def test_advanced_nested_attributes_mixed(caplog: LogCaptureFixture) -> None:
     class SubSubClass(DataService):
         name = "Hello"
@@ -310,44 +244,28 @@ def test_advanced_nested_attributes_mixed(capsys: CaptureFixture) -> None:
     # instances of SubSubClass are unequal
     assert service_instance.attr.attr_1 != service_instance.class_attr.class_attr
-    _ = capsys.readouterr()

     service_instance.class_attr.class_attr.name = "Ho"
-    captured = capsys.readouterr()
-    expected_output = sorted(
-        [
-            "ServiceClass.class_attr.class_attr.name = Ho",
-            "ServiceClass.attr.class_attr.name = Ho",
-        ]
-    )
-    actual_output = sorted(captured.out.strip().split("\n"))
-    assert actual_output == expected_output
+    assert "ServiceClass.class_attr.class_attr.name changed to Ho" in caplog.text
+    assert "ServiceClass.attr.class_attr.name changed to Ho" in caplog.text
+    caplog.clear()

     service_instance.class_attr.attr_1.name = "Ho"
-    captured = capsys.readouterr()
-    expected_output = sorted(["ServiceClass.class_attr.attr_1.name = Ho"])
-    actual_output = sorted(captured.out.strip().split("\n"))
-    assert actual_output == expected_output
+    assert "ServiceClass.class_attr.attr_1.name changed to Ho" in caplog.text
+    assert "ServiceClass.attr.attr_1.name changed to Ho" not in caplog.text
+    caplog.clear()

     service_instance.attr.class_attr.name = "Ho"
-    captured = capsys.readouterr()
-    expected_output = sorted(
-        [
-            "ServiceClass.attr.class_attr.name = Ho",
-            "ServiceClass.class_attr.class_attr.name = Ho",
-        ]
-    )
-    actual_output = sorted(captured.out.strip().split("\n"))
-    assert actual_output == expected_output
+    assert "ServiceClass.class_attr.class_attr.name changed to Ho" in caplog.text
+    assert "ServiceClass.attr.class_attr.name changed to Ho" in caplog.text
+    caplog.clear()

     service_instance.attr.attr_1.name = "Ho"
-    captured = capsys.readouterr()
-    expected_output = sorted(["ServiceClass.attr.attr_1.name = Ho"])
-    actual_output = sorted(captured.out.strip().split("\n"))
-    assert actual_output == expected_output
+    assert "ServiceClass.attr.attr_1.name changed to Ho" in caplog.text
+    assert "ServiceClass.class_attr.attr_1.name changed to Ho" not in caplog.text
+    caplog.clear()


-def test_class_list_attributes(capsys: CaptureFixture) -> None:
+def test_class_list_attributes(caplog: LogCaptureFixture) -> None:
     class SubClass(DataService):
         name = "Hello"
@@ -359,59 +277,36 @@ def test_class_list_attributes(capsys: CaptureFixture) -> None:
         attr = subclass_instance

     service_instance = ServiceClass()
-    _ = capsys.readouterr()

     assert service_instance.attr_list[0] != service_instance.attr_list[1]

     service_instance.attr_list[0].name = "Ho"
-    captured = capsys.readouterr()
-    expected_output = sorted(
-        [
-            "ServiceClass.attr_list[0].name = Ho",
-        ]
-    )
-    actual_output = sorted(captured.out.strip().split("\n"))
-    assert actual_output == expected_output
+    assert "ServiceClass.attr_list[0].name changed to Ho" in caplog.text
+    assert "ServiceClass.attr_list[1].name changed to Ho" not in caplog.text
+    caplog.clear()

     service_instance.attr_list[1].name = "Ho"
-    captured = capsys.readouterr()
-    expected_output = sorted(
-        [
-            "ServiceClass.attr_list[1].name = Ho",
-        ]
-    )
-    actual_output = sorted(captured.out.strip().split("\n"))
-    assert actual_output == expected_output
+    assert "ServiceClass.attr_list[0].name changed to Ho" not in caplog.text
+    assert "ServiceClass.attr_list[1].name changed to Ho" in caplog.text
+    caplog.clear()

     assert service_instance.attr_list_2[0] == service_instance.attr
     assert service_instance.attr_list_2[0] == service_instance.attr_list_2[1]

     service_instance.attr_list_2[0].name = "Ho"
-    captured = capsys.readouterr()
-    expected_output = sorted(
-        [
-            "ServiceClass.attr_list_2[0].name = Ho",
-            "ServiceClass.attr_list_2[1].name = Ho",
-            "ServiceClass.attr.name = Ho",
-        ]
-    )
-    actual_output = sorted(captured.out.strip().split("\n"))
-    assert actual_output == expected_output
+    assert "ServiceClass.attr_list_2[0].name changed to Ho" in caplog.text
+    assert "ServiceClass.attr_list_2[1].name changed to Ho" in caplog.text
+    assert "ServiceClass.attr.name changed to Ho" in caplog.text
+    caplog.clear()

     service_instance.attr_list_2[1].name = "Ho"
-    captured = capsys.readouterr()
-    expected_output = sorted(
-        [
-            "ServiceClass.attr_list_2[0].name = Ho",
-            "ServiceClass.attr_list_2[1].name = Ho",
-            "ServiceClass.attr.name = Ho",
-        ]
-    )
-    actual_output = sorted(captured.out.strip().split("\n"))
-    assert actual_output == expected_output
+    assert "ServiceClass.attr_list_2[0].name changed to Ho" in caplog.text
+    assert "ServiceClass.attr_list_2[1].name changed to Ho" in caplog.text
+    assert "ServiceClass.attr.name changed to Ho" in caplog.text
+    caplog.clear()


-def test_nested_class_list_attributes(capsys: CaptureFixture) -> None:
+def test_nested_class_list_attributes(caplog: LogCaptureFixture) -> None:
     class SubSubClass(DataService):
         name = "Hello"
@@ -425,34 +320,21 @@ def test_nested_class_list_attributes(capsys: CaptureFixture) -> None:
         subattr = subsubclass_instance

     service_instance = ServiceClass()
-    _ = capsys.readouterr()

     assert service_instance.attr[0].attr_list[0] == service_instance.subattr

     service_instance.attr[0].attr_list[0].name = "Ho"
-    captured = capsys.readouterr()
-    expected_output = sorted(
-        [
-            "ServiceClass.attr[0].attr_list[0].name = Ho",
-            "ServiceClass.subattr.name = Ho",
-        ]
-    )
-    actual_output = sorted(captured.out.strip().split("\n"))
-    assert actual_output == expected_output
+    assert "ServiceClass.attr[0].attr_list[0].name changed to Ho" in caplog.text
+    assert "ServiceClass.subattr.name changed to Ho" in caplog.text
+    caplog.clear()

     service_instance.subattr.name = "Ho"
-    captured = capsys.readouterr()
-    expected_output = sorted(
-        [
-            "ServiceClass.attr[0].attr_list[0].name = Ho",
-            "ServiceClass.subattr.name = Ho",
-        ]
-    )
-    actual_output = sorted(captured.out.strip().split("\n"))
-    assert actual_output == expected_output
+    assert "ServiceClass.attr[0].attr_list[0].name changed to Ho" in caplog.text
+    assert "ServiceClass.subattr.name changed to Ho" in caplog.text
+    caplog.clear()


-def test_instance_list_attributes(capsys: CaptureFixture) -> None:
+def test_instance_list_attributes(caplog: LogCaptureFixture) -> None:
     class SubClass(DataService):
         name = "Hello"
@@ -466,63 +348,42 @@ def test_instance_list_attributes(capsys: CaptureFixture) -> None:
             super().__init__()

     service_instance = ServiceClass()
-    _ = capsys.readouterr()

     assert service_instance.attr_list[0] != service_instance.attr_list[1]

     service_instance.attr_list[0].name = "Ho"
-    captured = capsys.readouterr()
-    expected_output = sorted(["ServiceClass.attr_list[0].name = Ho"])
-    actual_output = sorted(captured.out.strip().split("\n"))
-    assert actual_output == expected_output
+    assert "ServiceClass.attr_list[0].name changed to Ho" in caplog.text
+    assert "ServiceClass.attr_list[1].name changed to Ho" not in caplog.text
+    caplog.clear()

     service_instance.attr_list[1].name = "Ho"
-    captured = capsys.readouterr()
-    expected_output = sorted(["ServiceClass.attr_list[1].name = Ho"])
-    actual_output = sorted(captured.out.strip().split("\n"))
-    assert actual_output == expected_output
+    assert "ServiceClass.attr_list[0].name changed to Ho" not in caplog.text
+    assert "ServiceClass.attr_list[1].name changed to Ho" in caplog.text
+    caplog.clear()

     assert service_instance.attr_list_2[0] == service_instance.attr
     assert service_instance.attr_list_2[0] == service_instance.attr_list_2[1]

     service_instance.attr_list_2[0].name = "Ho"
-    captured = capsys.readouterr()
-    expected_output = sorted(
-        [
-            "ServiceClass.attr.name = Ho",
-            "ServiceClass.attr_list_2[0].name = Ho",
-            "ServiceClass.attr_list_2[1].name = Ho",
-        ]
-    )
-    actual_output = sorted(captured.out.strip().split("\n"))
-    assert actual_output == expected_output
+    assert "ServiceClass.attr.name changed to Ho" in caplog.text
+    assert "ServiceClass.attr_list_2[0].name changed to Ho" in caplog.text
+    assert "ServiceClass.attr_list_2[1].name changed to Ho" in caplog.text
+    caplog.clear()

     service_instance.attr_list_2[1].name = "Ho"
-    captured = capsys.readouterr()
-    expected_output = sorted(
-        [
-            "ServiceClass.attr.name = Ho",
-            "ServiceClass.attr_list_2[0].name = Ho",
-            "ServiceClass.attr_list_2[1].name = Ho",
-        ]
-    )
-    actual_output = sorted(captured.out.strip().split("\n"))
-    assert actual_output == expected_output
+    assert "ServiceClass.attr.name changed to Ho" in caplog.text
+    assert "ServiceClass.attr_list_2[0].name changed to Ho" in caplog.text
+    assert "ServiceClass.attr_list_2[1].name changed to Ho" in caplog.text
+    caplog.clear()

     service_instance.attr.name = "Ho"
-    captured = capsys.readouterr()
-    expected_output = sorted(
-        [
-            "ServiceClass.attr.name = Ho",
-            "ServiceClass.attr_list_2[0].name = Ho",
-            "ServiceClass.attr_list_2[1].name = Ho",
-        ]
-    )
-    actual_output = sorted(captured.out.strip().split("\n"))
-    assert actual_output == expected_output
+    assert "ServiceClass.attr.name changed to Ho" in caplog.text
+    assert "ServiceClass.attr_list_2[0].name changed to Ho" in caplog.text
+    assert "ServiceClass.attr_list_2[1].name changed to Ho" in caplog.text
+    caplog.clear()


-def test_nested_instance_list_attributes(capsys: CaptureFixture) -> None:
+def test_nested_instance_list_attributes(caplog: LogCaptureFixture) -> None:
     class SubSubClass(DataService):
         name = "Hello"
@@ -541,28 +402,15 @@ def test_nested_instance_list_attributes(capsys: CaptureFixture) -> None:
             super().__init__()

     service_instance = ServiceClass()
-    _ = capsys.readouterr()

     assert service_instance.attr[0].attr_list[0] == service_instance.class_attr

     service_instance.attr[0].attr_list[0].name = "Ho"
-    captured = capsys.readouterr()
-    expected_output = sorted(
-        [
-            "ServiceClass.attr[0].attr_list[0].name = Ho",
-            "ServiceClass.class_attr.name = Ho",
-        ]
-    )
-    actual_output = sorted(captured.out.strip().split("\n"))
-    assert actual_output == expected_output
+    assert "ServiceClass.attr[0].attr_list[0].name changed to Ho" in caplog.text
+    assert "ServiceClass.class_attr.name changed to Ho" in caplog.text
+    caplog.clear()

     service_instance.class_attr.name = "Ho"
-    captured = capsys.readouterr()
-    expected_output = sorted(
-        [
-            "ServiceClass.attr[0].attr_list[0].name = Ho",
-            "ServiceClass.class_attr.name = Ho",
-        ]
-    )
-    actual_output = sorted(captured.out.strip().split("\n"))
-    assert actual_output == expected_output
+    assert "ServiceClass.attr[0].attr_list[0].name changed to Ho" in caplog.text
+    assert "ServiceClass.class_attr.name changed to Ho" in caplog.text
+    caplog.clear()
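The diff above migrates the tests from stdout capture (`capsys`) to log capture (`caplog`), because the notifications are now emitted through `logging` rather than `print`. A minimal standalone sketch of what `caplog` does under the hood (the logger name and message format here are assumptions for illustration, not pydase's actual internals):

```python
import logging

# Stand-in for a DataService that logs change notifications instead of
# printing them. Logger name is an illustrative assumption.
logger = logging.getLogger("pydase_demo")
logger.setLevel(logging.DEBUG)

# pytest's caplog fixture works roughly like this: a handler that records
# every emitted message so tests can assert on caplog.text afterwards.
records: list[str] = []


class ListHandler(logging.Handler):
    def emit(self, record: logging.LogRecord) -> None:
        records.append(record.getMessage())


logger.addHandler(ListHandler())


def notify(attr_path: str, value: object) -> None:
    # Equivalent of the "<path> changed to <value>" messages asserted above
    logger.debug("%s changed to %s", attr_path, value)


notify("ServiceClass.attr", 1)
text = "\n".join(records)
assert "ServiceClass.attr changed to 1" in text
```

Unlike `capsys.readouterr()`, which drains the buffer on every call, `caplog.text` accumulates, which is why the migrated tests add explicit `caplog.clear()` calls between steps.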

View File

@@ -1,9 +1,9 @@
from pytest import CaptureFixture from pytest import LogCaptureFixture
from pydase import DataService from pydase import DataService
def test_properties(capsys: CaptureFixture) -> None: def test_properties(caplog: LogCaptureFixture) -> None:
class ServiceClass(DataService): class ServiceClass(DataService):
_voltage = 10.0 _voltage = 10.0
_current = 1.0 _current = 1.0
@@ -31,30 +31,17 @@ def test_properties(capsys: CaptureFixture) -> None:
test_service = ServiceClass() test_service = ServiceClass()
test_service.voltage = 1 test_service.voltage = 1
captured = capsys.readouterr() assert "ServiceClass.power changed to 1.0" in caplog.text
expected_output = sorted( assert "ServiceClass.voltage changed to 1.0" in caplog.text
[ caplog.clear()
"ServiceClass.power = 1.0",
"ServiceClass.voltage = 1.0",
]
)
actual_output = sorted(captured.out.strip().split("\n"))
assert actual_output == expected_output
test_service.current = 12.0 test_service.current = 12.0
captured = capsys.readouterr() assert "ServiceClass.power changed to 12.0" in caplog.text
expected_output = sorted( assert "ServiceClass.current changed to 12.0" in caplog.text
[
"ServiceClass.power = 12.0",
"ServiceClass.current = 12.0",
]
)
actual_output = sorted(captured.out.strip().split("\n"))
assert actual_output == expected_output
def test_nested_properties(capsys: CaptureFixture) -> None: def test_nested_properties(caplog: LogCaptureFixture) -> None:
class SubSubClass(DataService): class SubSubClass(DataService):
name = "Hello" name = "Hello"
@@ -77,45 +64,31 @@ def test_nested_properties(capsys: CaptureFixture) -> None:
test_service = ServiceClass() test_service = ServiceClass()
test_service.name = "Peepz" test_service.name = "Peepz"
captured = capsys.readouterr() assert "ServiceClass.name changed to Peepz" in caplog.text
expected_output = sorted( assert "ServiceClass.sub_name changed to Hello Peepz" in caplog.text
[ assert "ServiceClass.subsub_name changed to Hello Peepz" in caplog.text
"ServiceClass.name = Peepz", caplog.clear()
"ServiceClass.sub_name = Hello Peepz",
"ServiceClass.subsub_name = Hello Peepz",
]
)
actual_output = sorted(captured.out.strip().split("\n"))
assert actual_output == expected_output
test_service.class_attr.name = "Hi" test_service.class_attr.name = "Hi"
captured = capsys.readouterr() assert "ServiceClass.sub_name changed to Hi Peepz" in caplog.text
expected_output = sorted( assert (
[ "ServiceClass.subsub_name changed to Hello Peepz" in caplog.text
"ServiceClass.sub_name = Hi Peepz", ) # registers subclass changes
"ServiceClass.subsub_name = Hello Peepz", # registers subclass changes assert "ServiceClass.class_attr.name changed to Hi" in caplog.text
"ServiceClass.class_attr.name = Hi", caplog.clear()
]
)
actual_output = sorted(captured.out.strip().split("\n"))
assert actual_output == expected_output
test_service.class_attr.class_attr.name = "Ciao" test_service.class_attr.class_attr.name = "Ciao"
captured = capsys.readouterr() assert (
expected_output = sorted( "ServiceClass.sub_name changed to Hi Peepz" in caplog.text
[ ) # registers subclass changes
"ServiceClass.sub_name = Hi Peepz", # registers subclass changes assert "ServiceClass.subsub_name changed to Ciao Peepz" in caplog.text
"ServiceClass.subsub_name = Ciao Peepz", assert "ServiceClass.class_attr.class_attr.name changed to Ciao" in caplog.text
"ServiceClass.class_attr.class_attr.name = Ciao", caplog.clear()
]
)
actual_output = sorted(captured.out.strip().split("\n"))
assert actual_output == expected_output
def test_simple_list_properties(capsys: CaptureFixture) -> None: def test_simple_list_properties(caplog: LogCaptureFixture) -> None:
class ServiceClass(DataService): class ServiceClass(DataService):
list = ["Hello", "Ciao"] list = ["Hello", "Ciao"]
name = "World" name = "World"
@@ -127,30 +100,17 @@ def test_simple_list_properties(capsys: CaptureFixture) -> None:
test_service = ServiceClass() test_service = ServiceClass()
test_service.name = "Peepz" test_service.name = "Peepz"
captured = capsys.readouterr() assert "ServiceClass.name changed to Peepz" in caplog.text
expected_output = sorted( assert "ServiceClass.total_name changed to Hello Peepz" in caplog.text
[ caplog.clear()
"ServiceClass.name = Peepz",
"ServiceClass.total_name = Hello Peepz",
]
)
actual_output = sorted(captured.out.strip().split("\n"))
assert actual_output == expected_output
test_service.list[0] = "Hi" test_service.list[0] = "Hi"
captured = capsys.readouterr() assert "ServiceClass.total_name changed to Hi Peepz" in caplog.text
expected_output = sorted( assert "ServiceClass.list[0] changed to Hi" in caplog.text
[
"ServiceClass.total_name = Hi Peepz",
"ServiceClass.list[0] = Hi",
]
)
actual_output = sorted(captured.out.strip().split("\n"))
assert actual_output == expected_output
def test_class_list_properties(capsys: CaptureFixture) -> None: def test_class_list_properties(caplog: LogCaptureFixture) -> None:
class SubClass(DataService): class SubClass(DataService):
name = "Hello" name = "Hello"
@@ -165,73 +125,17 @@ def test_class_list_properties(capsys: CaptureFixture) -> None:
test_service = ServiceClass() test_service = ServiceClass()
test_service.name = "Peepz" test_service.name = "Peepz"
captured = capsys.readouterr() assert "ServiceClass.name changed to Peepz" in caplog.text
expected_output = sorted( assert "ServiceClass.total_name changed to Hello Peepz" in caplog.text
[ caplog.clear()
"ServiceClass.name = Peepz",
"ServiceClass.total_name = Hello Peepz",
]
)
actual_output = sorted(captured.out.strip().split("\n"))
assert actual_output == expected_output
test_service.list[0].name = "Hi" test_service.list[0].name = "Hi"
captured = capsys.readouterr() assert "ServiceClass.total_name changed to Hi Peepz" in caplog.text
expected_output = sorted( assert "ServiceClass.list[0].name changed to Hi" in caplog.text
[
"ServiceClass.total_name = Hi Peepz",
"ServiceClass.list[0].name = Hi",
]
)
actual_output = sorted(captured.out.strip().split("\n"))
assert actual_output == expected_output
def test_subclass_properties(capsys: CaptureFixture) -> None: def test_subclass_properties(caplog: LogCaptureFixture) -> None:
class SubClass(DataService):
name = "Hello"
_voltage = 10.0
_current = 1.0
@property
def power(self) -> float:
return self._voltage * self.current
@property
def voltage(self) -> float:
return self._voltage
@voltage.setter
def voltage(self, value: float) -> None:
self._voltage = value
@property
def current(self) -> float:
return self._current
@current.setter
def current(self, value: float) -> None:
self._current = value
class ServiceClass(DataService):
class_attr = SubClass()
test_service = ServiceClass()
test_service.class_attr.voltage = 10.0
captured = capsys.readouterr()
expected_output = sorted(
[
"ServiceClass.class_attr.voltage = 10.0",
"ServiceClass.class_attr.power = 10.0",
]
)
actual_output = sorted(captured.out.strip().split("\n"))
assert actual_output == expected_output
def test_subclass_properties(capsys: CaptureFixture) -> None:
class SubClass(DataService): class SubClass(DataService):
name = "Hello" name = "Hello"
_voltage = 10.0 _voltage = 10.0
@@ -267,21 +171,15 @@ def test_subclass_properties(capsys: CaptureFixture) -> None:
test_service = ServiceClass() test_service = ServiceClass()
test_service.class_attr.voltage = 10.0 test_service.class_attr.voltage = 10.0
captured = capsys.readouterr()
expected_output = sorted(
{
"ServiceClass.class_attr.voltage = 10.0",
"ServiceClass.class_attr.power = 10.0",
"ServiceClass.voltage = 10.0",
}
)
# using a set here as "ServiceClass.voltage = 10.0" is emitted twice. Once for # using a set here as "ServiceClass.voltage = 10.0" is emitted twice. Once for
# changing voltage, and once for changing power. # changing voltage, and once for changing power.
actual_output = sorted(set(captured.out.strip().split("\n"))) assert "ServiceClass.class_attr.voltage changed to 10.0" in caplog.text
assert actual_output == expected_output assert "ServiceClass.class_attr.power changed to 10.0" in caplog.text
assert "ServiceClass.voltage changed to 10.0" in caplog.text
caplog.clear()
def test_subclass_properties_2(capsys: CaptureFixture) -> None: def test_subclass_properties_2(caplog: LogCaptureFixture) -> None:
class SubClass(DataService): class SubClass(DataService):
name = "Hello" name = "Hello"
_voltage = 10.0 _voltage = 10.0
@@ -317,24 +215,17 @@ def test_subclass_properties_2(capsys: CaptureFixture) -> None:
test_service = ServiceClass() test_service = ServiceClass()
test_service.class_attr[1].current = 10.0 test_service.class_attr[1].current = 10.0
captured = capsys.readouterr()
expected_output = sorted(
{
"ServiceClass.class_attr[1].current = 10.0",
"ServiceClass.class_attr[1].power = 100.0",
"ServiceClass.voltage = 10.0",
}
)
# using a set here as "ServiceClass.voltage = 10.0" is emitted twice. Once for
# changing current, and once for changing power. Note that the voltage property is
# only dependent on class_attr[0] but still emits an update notification. This is
# because every time any item in the list `test_service.class_attr` is changed,
# a notification will be emitted.
actual_output = sorted(set(captured.out.strip().split("\n")))
assert actual_output == expected_output
assert "ServiceClass.class_attr[1].current changed to 10.0" in caplog.text
assert "ServiceClass.class_attr[1].power changed to 100.0" in caplog.text
assert "ServiceClass.voltage changed to 10.0" in caplog.text
def test_subsubclass_properties(capsys: CaptureFixture) -> None:
def test_subsubclass_properties(caplog: LogCaptureFixture) -> None:
class SubSubClass(DataService):
_voltage = 10.0
@@ -364,21 +255,18 @@ def test_subsubclass_properties(capsys: CaptureFixture) -> None:
test_service = ServiceClass()
test_service.class_attr[1].class_attr.voltage = 100.0
captured = capsys.readouterr()
expected_output = sorted(
{
"ServiceClass.class_attr[0].class_attr.voltage = 100.0",
"ServiceClass.class_attr[1].class_attr.voltage = 100.0",
"ServiceClass.class_attr[0].power = 50.0",
"ServiceClass.class_attr[1].power = 50.0",
"ServiceClass.power = 50.0",
}
)
actual_output = sorted(set(captured.out.strip().split("\n")))
assert actual_output == expected_output
assert (
"ServiceClass.class_attr[0].class_attr.voltage changed to 100.0" in caplog.text
)
assert (
"ServiceClass.class_attr[1].class_attr.voltage changed to 100.0" in caplog.text
)
assert "ServiceClass.class_attr[0].power changed to 50.0" in caplog.text
assert "ServiceClass.class_attr[1].power changed to 50.0" in caplog.text
assert "ServiceClass.power changed to 50.0" in caplog.text
def test_subsubclass_instance_properties(capsys: CaptureFixture) -> None:
def test_subsubclass_instance_properties(caplog: LogCaptureFixture) -> None:
class SubSubClass(DataService):
def __init__(self) -> None:
self._voltage = 10.0
@@ -412,16 +300,9 @@ def test_subsubclass_instance_properties(capsys: CaptureFixture) -> None:
test_service = ServiceClass()
test_service.class_attr[1].attr[0].voltage = 100.0
captured = capsys.readouterr()
# again, changing an item in a list will trigger the callbacks. This is why a
# notification for `ServiceClass.power` is emitted although it did not change its
# value
expected_output = sorted(
{
"ServiceClass.class_attr[1].attr[0].voltage = 100.0",
"ServiceClass.class_attr[1].power = 50.0",
"ServiceClass.power = 5.0",
}
)
actual_output = sorted(set(captured.out.strip().split("\n")))
assert actual_output == expected_output
assert "ServiceClass.class_attr[1].attr[0].voltage changed to 100.0" in caplog.text
assert "ServiceClass.class_attr[1].power changed to 50.0" in caplog.text
assert "ServiceClass.power changed to 5.0" in caplog.text
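These tests assert that setting one attribute emits "changed to" notifications both for the attribute itself and for properties that depend on it. The pattern can be sketched standalone (the `Sub` class and the `messages` list here are hypothetical stand-ins that mimic the behaviour the tests check, not pydase's implementation):

```python
messages: list[str] = []


class Sub:
    def __init__(self) -> None:
        self._voltage = 10.0
        self.current = 1.0

    @property
    def voltage(self) -> float:
        return self._voltage

    @voltage.setter
    def voltage(self, value: float) -> None:
        # Changing voltage also notifies the dependent `power` property.
        self._voltage = value
        messages.append(f"Sub.voltage changed to {value}")
        messages.append(f"Sub.power changed to {self.power}")

    @property
    def power(self) -> float:
        return self._voltage * self.current


sub = Sub()
sub.voltage = 10.0
assert "Sub.voltage changed to 10.0" in messages
assert "Sub.power changed to 10.0" in messages
```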

View File

@@ -1,12 +1,12 @@
from typing import Any
from pytest import CaptureFixture
from pytest import LogCaptureFixture
import pydase.units as u
from pydase.data_service.data_service import DataService
def test_DataService_setattr(capsys: CaptureFixture) -> None:
def test_DataService_setattr(caplog: LogCaptureFixture) -> None:
class ServiceClass(DataService):
voltage = 1.0 * u.units.V
_current: u.Quantity = 1.0 * u.units.mA
@@ -28,31 +28,17 @@ def test_DataService_setattr(capsys: CaptureFixture) -> None:
assert service.voltage == 10.0 * u.units.V # type: ignore
assert service.current == 1.5 * u.units.mA
captured = capsys.readouterr()
expected_output = sorted(
[
"ServiceClass.voltage = 10.0 V",
"ServiceClass.current = 1.5 mA",
]
)
actual_output = sorted(captured.out.strip().split("\n")) # type: ignore
assert actual_output == expected_output
assert "ServiceClass.voltage changed to 10.0 V" in caplog.text
assert "ServiceClass.current changed to 1.5 mA" in caplog.text
service.voltage = 12.0 * u.units.V # type: ignore
service.current = 1.51 * u.units.A
assert service.voltage == 12.0 * u.units.V # type: ignore
assert service.current == 1.51 * u.units.A
captured = capsys.readouterr()
expected_output = sorted(
[
"ServiceClass.voltage = 12.0 V",
"ServiceClass.current = 1.51 A",
]
)
actual_output = sorted(captured.out.strip().split("\n")) # type: ignore
assert actual_output == expected_output
assert "ServiceClass.voltage changed to 12.0 V" in caplog.text
assert "ServiceClass.current changed to 1.51 A" in caplog.text
def test_convert_to_quantity() -> None:
@@ -62,7 +48,7 @@ def test_convert_to_quantity() -> None:
assert u.convert_to_quantity(1.0 * u.units.mV) == 1.0 * u.units.mV
def test_update_DataService_attribute(capsys: CaptureFixture) -> None:
def test_update_DataService_attribute(caplog: LogCaptureFixture) -> None:
class ServiceClass(DataService):
voltage = 1.0 * u.units.V
_current: u.Quantity = 1.0 * u.units.mA
@@ -80,36 +66,59 @@ def test_update_DataService_attribute(capsys: CaptureFixture) -> None:
service.update_DataService_attribute(
path_list=[], attr_name="voltage", value=1.0 * u.units.mV
)
captured = capsys.readouterr()
expected_output = sorted(
[
"ServiceClass.voltage = 1.0 mV",
]
)
actual_output = sorted(captured.out.strip().split("\n")) # type: ignore
assert actual_output == expected_output
assert "ServiceClass.voltage changed to 1.0 mV" in caplog.text
service.update_DataService_attribute(path_list=[], attr_name="voltage", value=2)
captured = capsys.readouterr()
expected_output = sorted(
[
"ServiceClass.voltage = 2.0 mV",
]
)
actual_output = sorted(captured.out.strip().split("\n")) # type: ignore
assert actual_output == expected_output
assert "ServiceClass.voltage changed to 2.0 mV" in caplog.text
service.update_DataService_attribute(
path_list=[], attr_name="voltage", value={"magnitude": 123, "unit": "kV"}
)
captured = capsys.readouterr()
expected_output = sorted(
[
"ServiceClass.voltage = 123.0 kV",
]
)
actual_output = sorted(captured.out.strip().split("\n")) # type: ignore
assert actual_output == expected_output
assert "ServiceClass.voltage changed to 123.0 kV" in caplog.text
def test_autoconvert_offset_to_baseunit() -> None:
import pint
assert u.units.autoconvert_offset_to_baseunit is True
try:
quantity = 10 * u.units.degC
except pint.errors.OffsetUnitCalculusError as exc:
assert False, f"Offset unit raises exception {exc}"
def test_loading_from_json(caplog: LogCaptureFixture) -> None:
"""This function tests if the quantity read from the json description is actually
passed as a quantity to the property setter."""
JSON_DICT = {
"some_unit": {
"type": "Quantity",
"value": {"magnitude": 10.0, "unit": "A"},
"readonly": False,
"doc": None,
}
}
class ServiceClass(DataService):
def __init__(self):
self._unit: u.Quantity = 1 * u.units.A
super().__init__()
@property
def some_unit(self) -> u.Quantity:
return self._unit
@some_unit.setter
def some_unit(self, value: u.Quantity) -> None:
assert isinstance(value, u.Quantity)
self._unit = value
service = ServiceClass()
service.load_DataService_from_JSON(JSON_DICT)
assert "ServiceClass.some_unit changed to 10.0 A" in caplog.text
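The JSON description above serializes a quantity as a dict with `magnitude` and `unit` keys. A toy reader for that shape (the `quantity_from_dict` helper is hypothetical, not pydase's API) shows how the logged "changed to 10.0 A" message relates to it:

```python
from typing import Any


def quantity_from_dict(value: dict[str, Any]) -> tuple[float, str]:
    # Unpack the serialized quantity shape: {"magnitude": ..., "unit": ...}.
    return float(value["magnitude"]), str(value["unit"])


magnitude, unit = quantity_from_dict({"magnitude": 10.0, "unit": "A"})
assert (magnitude, unit) == (10.0, "A")
# The formatted form matches the text asserted on in the test above.
assert f"{magnitude} {unit}" == "10.0 A"
```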

View File

@@ -1,70 +1,6 @@
import pytest
from pydase.utils.helpers import (
extract_dict_or_list_entry,
get_nested_value_from_DataService_by_path_and_key,
is_property_attribute,
)
from pydase.utils.helpers import is_property_attribute
# Sample data for the tests
data_sample = {
"attr1": {"type": "bool", "value": False, "readonly": False, "doc": None},
"class_attr": {
"type": "MyClass",
"value": {"sub_attr": {"type": "float", "value": 20.5}},
},
"list_attr": {
"type": "list",
"value": [
{"type": "int", "value": 0, "readonly": False, "doc": None},
{"type": "float", "value": 1.0, "readonly": False, "doc": None},
],
"readonly": False,
},
}
# Tests for extract_dict_or_list_entry
def test_extract_dict_with_valid_list_index() -> None:
result = extract_dict_or_list_entry(data_sample, "list_attr[1]")
assert result == {"type": "float", "value": 1.0, "readonly": False, "doc": None}
def test_extract_dict_without_list_index() -> None:
result = extract_dict_or_list_entry(data_sample, "attr1")
assert result == {"type": "bool", "value": False, "readonly": False, "doc": None}
def test_extract_dict_with_invalid_key() -> None:
result = extract_dict_or_list_entry(data_sample, "attr_not_exist")
assert result is None
def test_extract_dict_with_invalid_list_index() -> None:
result = extract_dict_or_list_entry(data_sample, "list_attr[5]")
assert result is None
# Tests for get_nested_value_from_DataService_by_path_and_key
def test_get_nested_value_with_default_key() -> None:
result = get_nested_value_from_DataService_by_path_and_key(
data_sample, "list_attr[0]"
)
assert result == 0
def test_get_nested_value_with_custom_key() -> None:
result = get_nested_value_from_DataService_by_path_and_key(
data_sample, "class_attr.sub_attr", "type"
)
assert result == "float"
def test_get_nested_value_with_invalid_path() -> None:
result = get_nested_value_from_DataService_by_path_and_key(
data_sample, "class_attr.nonexistent_attr"
)
assert result is None
@pytest.mark.parametrize(

View File

@@ -0,0 +1,71 @@
import logging
from pytest import LogCaptureFixture
from pydase.utils.logging import setup_logging
def test_log_error(caplog: LogCaptureFixture):
setup_logging("ERROR")
logger = logging.getLogger()
logger.debug("This is a debug message")
logger.info("This is an info message")
logger.warning("This is a warning message")
logger.error("This is an error message")
# Check the log records as well as the level.
assert "This is a debug message" not in caplog.text
assert "This is an info message" not in caplog.text
assert "This is a warning message" not in caplog.text
assert "This is an error message" in caplog.text
assert any(record.levelname == "ERROR" for record in caplog.records)
def test_log_warning(caplog: LogCaptureFixture):
setup_logging("WARNING")
logger = logging.getLogger()
logger.debug("This is a debug message")
logger.info("This is an info message")
logger.warning("This is a warning message")
logger.error("This is an error message")
# Check the log records as well as the level.
assert "This is a debug message" not in caplog.text
assert "This is an info message" not in caplog.text
assert "This is a warning message" in caplog.text
assert "This is an error message" in caplog.text
assert any(record.levelname == "ERROR" for record in caplog.records)
def test_log_debug(caplog: LogCaptureFixture):
setup_logging("DEBUG")
logger = (
logging.getLogger()
) # Get the root logger or replace with the appropriate logger.
logger.debug("This is a debug message")
logger.info("This is an info message")
logger.warning("This is a warning message")
logger.error("This is an error message")
# Now, check that the message is in the log records.
assert "This is a debug message" in caplog.text
assert "This is an info message" in caplog.text
assert "This is a warning message" in caplog.text
assert "This is an error message" in caplog.text
def test_log_info(caplog: LogCaptureFixture):
setup_logging("INFO")
logger = (
logging.getLogger()
) # Get the root logger or replace with the appropriate logger.
logger.debug("This is a debug message")
logger.info("This is an info message")
logger.warning("This is a warning message")
logger.error("This is an error message")
# Now, check that the message is in the log records.
assert "This is a debug message" not in caplog.text
assert "This is an info message" in caplog.text
assert "This is a warning message" in caplog.text
assert "This is an error message" in caplog.text
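The level filtering these tests exercise can be reproduced with plain stdlib logging, without `setup_logging` or the caplog fixture (the collector handler here is purely illustrative): records below the logger's configured level never reach the handlers.

```python
import logging

records: list[logging.LogRecord] = []


class Collector(logging.Handler):
    """Appends every record that passes the level filter to `records`."""

    def emit(self, record: logging.LogRecord) -> None:
        records.append(record)


logger = logging.getLogger("levels-demo")
logger.setLevel(logging.WARNING)
logger.addHandler(Collector())

logger.debug("This is a debug message")
logger.info("This is an info message")
logger.warning("This is a warning message")
logger.error("This is an error message")

# DEBUG and INFO are filtered out at the WARNING level.
captured = [record.levelname for record in records]
assert captured == ["WARNING", "ERROR"]
```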

View File

@@ -0,0 +1,417 @@
import asyncio
from enum import Enum
import pytest
import pydase
import pydase.units as u
from pydase.components.coloured_enum import ColouredEnum
from pydase.utils.serializer import (
SerializationPathError,
dump,
get_nested_dict_by_path,
get_next_level_dict_by_key,
set_nested_value_by_path,
)
@pytest.mark.parametrize(
"test_input, expected",
[
(1, {"type": "int", "value": 1, "readonly": False, "doc": None}),
(1.0, {"type": "float", "value": 1.0, "readonly": False, "doc": None}),
(True, {"type": "bool", "value": True, "readonly": False, "doc": None}),
(
u.Quantity(10, "m"),
{
"type": "Quantity",
"value": {"magnitude": 10, "unit": "meter"},
"readonly": False,
"doc": None,
},
),
],
)
def test_dump(test_input, expected):
assert dump(test_input) == expected
def test_enum_serialize() -> None:
class EnumClass(Enum):
FOO = "foo"
BAR = "bar"
class EnumAttribute(pydase.DataService):
def __init__(self) -> None:
self.some_enum = EnumClass.FOO
super().__init__()
class EnumPropertyWithoutSetter(pydase.DataService):
def __init__(self) -> None:
self._some_enum = EnumClass.FOO
super().__init__()
@property
def some_enum(self) -> EnumClass:
return self._some_enum
class EnumPropertyWithSetter(pydase.DataService):
def __init__(self) -> None:
self._some_enum = EnumClass.FOO
super().__init__()
@property
def some_enum(self) -> EnumClass:
return self._some_enum
@some_enum.setter
def some_enum(self, value: EnumClass) -> None:
self._some_enum = value
assert dump(EnumAttribute())["value"] == {
"some_enum": {
"type": "Enum",
"value": "FOO",
"enum": {"FOO": "foo", "BAR": "bar"},
"readonly": False,
"doc": None,
}
}
assert dump(EnumPropertyWithoutSetter())["value"] == {
"some_enum": {
"type": "Enum",
"value": "FOO",
"enum": {"FOO": "foo", "BAR": "bar"},
"readonly": True,
"doc": None,
}
}
assert dump(EnumPropertyWithSetter())["value"] == {
"some_enum": {
"type": "Enum",
"value": "FOO",
"enum": {"FOO": "foo", "BAR": "bar"},
"readonly": False,
"doc": None,
}
}
def test_ColouredEnum_serialize() -> None:
class Status(ColouredEnum):
PENDING = "#FFA500"
RUNNING = "#0000FF80"
PAUSED = "rgb(169, 169, 169)"
RETRYING = "rgba(255, 255, 0, 0.3)"
COMPLETED = "hsl(120, 100%, 50%)"
FAILED = "hsla(0, 100%, 50%, 0.7)"
CANCELLED = "SlateGray"
assert dump(Status.FAILED) == {
"type": "ColouredEnum",
"value": "FAILED",
"enum": {
"CANCELLED": "SlateGray",
"COMPLETED": "hsl(120, 100%, 50%)",
"FAILED": "hsla(0, 100%, 50%, 0.7)",
"PAUSED": "rgb(169, 169, 169)",
"PENDING": "#FFA500",
"RETRYING": "rgba(255, 255, 0, 0.3)",
"RUNNING": "#0000FF80",
},
"readonly": False,
"doc": None,
}
def test_method_serialization() -> None:
class ClassWithMethod(pydase.DataService):
def some_method(self) -> str:
return "some method"
async def some_task(self, sleep_time: int) -> None:
while True:
await asyncio.sleep(sleep_time)
instance = ClassWithMethod()
instance.start_some_task(10) # type: ignore
assert dump(instance)["value"] == {
"some_method": {
"async": False,
"doc": None,
"parameters": {},
"readonly": True,
"type": "method",
"value": None,
},
"some_task": {
"async": True,
"doc": None,
"parameters": {"sleep_time": "int"},
"readonly": True,
"type": "method",
"value": {"sleep_time": 10},
},
}
def test_methods_with_type_hints() -> None:
def method_without_type_hint(arg_without_type_hint) -> None:
pass
def method_with_type_hint(some_argument: int) -> None:
pass
def method_with_union_type_hint(some_argument: int | float) -> None:
pass
assert dump(method_without_type_hint) == {
"async": False,
"doc": None,
"parameters": {"arg_without_type_hint": None},
"readonly": True,
"type": "method",
"value": None,
}
assert dump(method_with_type_hint) == {
"async": False,
"doc": None,
"parameters": {"some_argument": "int"},
"readonly": True,
"type": "method",
"value": None,
}
assert dump(method_with_union_type_hint) == {
"async": False,
"doc": None,
"parameters": {"some_argument": "int | float"},
"readonly": True,
"type": "method",
"value": None,
}
def test_list_serialization() -> None:
class MySubclass(pydase.DataService):
_name = "hi"
bool_attr = True
int_attr = 1
@property
def name(self) -> str:
return self._name
class ClassWithListAttribute(pydase.DataService):
list_attr = [1, MySubclass()]
instance = ClassWithListAttribute()
assert dump(instance)["value"] == {
"list_attr": {
"doc": None,
"readonly": False,
"type": "list",
"value": [
{"doc": None, "readonly": False, "type": "int", "value": 1},
{
"doc": None,
"readonly": False,
"type": "DataService",
"value": {
"bool_attr": {
"doc": None,
"readonly": False,
"type": "bool",
"value": True,
},
"int_attr": {
"doc": None,
"readonly": False,
"type": "int",
"value": 1,
},
"name": {
"doc": None,
"readonly": True,
"type": "str",
"value": "hi",
},
},
},
],
}
}
def test_dict_serialization() -> None:
class MyClass(pydase.DataService):
name = "my class"
test_dict = {
"int_key": 1,
"float_key": 1.0,
"bool_key": True,
"Quantity_key": 1.0 * u.units.s,
"DataService_key": MyClass(),
}
assert dump(test_dict) == {
"doc": None,
"readonly": False,
"type": "dict",
"value": {
"DataService_key": {
"doc": None,
"readonly": False,
"type": "DataService",
"value": {
"name": {
"doc": None,
"readonly": False,
"type": "str",
"value": "my class",
}
},
},
"Quantity_key": {
"doc": None,
"readonly": False,
"type": "Quantity",
"value": {"magnitude": 1.0, "unit": "s"},
},
"bool_key": {"doc": None, "readonly": False, "type": "bool", "value": True},
"float_key": {
"doc": None,
"readonly": False,
"type": "float",
"value": 1.0,
},
"int_key": {"doc": None, "readonly": False, "type": "int", "value": 1},
},
}
@pytest.fixture
def setup_dict():
class MySubclass(pydase.DataService):
attr3 = 1.0
list_attr = [1.0, 1]
class ServiceClass(pydase.DataService):
attr1 = 1.0
attr2 = MySubclass()
attr_list = [0, 1, MySubclass()]
return ServiceClass().serialize()
def test_update_attribute(setup_dict):
set_nested_value_by_path(setup_dict, "attr1", 15)
assert setup_dict["attr1"]["value"] == 15
def test_update_nested_attribute(setup_dict):
set_nested_value_by_path(setup_dict, "attr2.attr3", 25.0)
assert setup_dict["attr2"]["value"]["attr3"]["value"] == 25.0
def test_update_list_entry(setup_dict):
set_nested_value_by_path(setup_dict, "attr_list[1]", 20)
assert setup_dict["attr_list"]["value"][1]["value"] == 20
def test_update_list_append(setup_dict):
set_nested_value_by_path(setup_dict, "attr_list[3]", 20)
assert setup_dict["attr_list"]["value"][3]["value"] == 20
def test_update_invalid_list_index(setup_dict, caplog: pytest.LogCaptureFixture):
set_nested_value_by_path(setup_dict, "attr_list[10]", 30)
assert (
"Error occured trying to change 'attr_list[10]': list index "
"out of range" in caplog.text
)
def test_update_invalid_path(setup_dict, caplog: pytest.LogCaptureFixture):
set_nested_value_by_path(setup_dict, "invalid_path", 30)
assert (
"Error occured trying to access the key 'invalid_path': it is either "
"not present in the current dictionary or its value does not contain "
"a 'value' key." in caplog.text
)
def test_update_list_inside_class(setup_dict):
set_nested_value_by_path(setup_dict, "attr2.list_attr[1]", 40)
assert setup_dict["attr2"]["value"]["list_attr"]["value"][1]["value"] == 40
def test_update_class_attribute_inside_list(setup_dict):
set_nested_value_by_path(setup_dict, "attr_list[2].attr3", 50)
assert setup_dict["attr_list"]["value"][2]["value"]["attr3"]["value"] == 50
def test_get_next_level_attribute_nested_dict(setup_dict):
nested_dict = get_next_level_dict_by_key(setup_dict, "attr1")
assert nested_dict == setup_dict["attr1"]
def test_get_next_level_list_entry_nested_dict(setup_dict):
nested_dict = get_next_level_dict_by_key(setup_dict, "attr_list[0]")
assert nested_dict == setup_dict["attr_list"]["value"][0]
def test_get_next_level_invalid_path_nested_dict(setup_dict):
with pytest.raises(SerializationPathError):
get_next_level_dict_by_key(setup_dict, "invalid_path")
def test_get_next_level_invalid_list_index(setup_dict):
with pytest.raises(SerializationPathError):
get_next_level_dict_by_key(setup_dict, "attr_list[10]")
def test_get_attribute(setup_dict):
nested_dict = get_nested_dict_by_path(setup_dict, "attr1")
assert nested_dict["value"] == 1.0
def test_get_nested_attribute(setup_dict):
nested_dict = get_nested_dict_by_path(setup_dict, "attr2.attr3")
assert nested_dict["value"] == 1.0
def test_get_list_entry(setup_dict):
nested_dict = get_nested_dict_by_path(setup_dict, "attr_list[1]")
assert nested_dict["value"] == 1
def test_get_list_inside_class(setup_dict):
nested_dict = get_nested_dict_by_path(setup_dict, "attr2.list_attr[1]")
assert nested_dict["value"] == 1.0
def test_get_class_attribute_inside_list(setup_dict):
nested_dict = get_nested_dict_by_path(setup_dict, "attr_list[2].attr3")
assert nested_dict["value"] == 1.0
def test_get_invalid_list_index(setup_dict, caplog: pytest.LogCaptureFixture):
get_nested_dict_by_path(setup_dict, "attr_list[10]")
assert (
"Error occured trying to change 'attr_list[10]': list index "
"out of range" in caplog.text
)
def test_get_invalid_path(setup_dict, caplog: pytest.LogCaptureFixture):
get_nested_dict_by_path(setup_dict, "invalid_path")
assert (
"Error occured trying to access the key 'invalid_path': it is either "
"not present in the current dictionary or its value does not contain "
"a 'value' key." in caplog.text
)
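Paths such as `attr2.list_attr[1]` address nodes in the serialized tree, where every node wraps its payload in a `value` key. A simplified sketch of that traversal (the `get_by_path` function is a hypothetical re-implementation for illustration, not pydase's `get_nested_dict_by_path`):

```python
import re
from typing import Any


def get_by_path(data: dict[str, Any], path: str) -> dict[str, Any]:
    """Walk dot-separated segments with optional [index] suffixes."""
    node: Any = None
    for part in path.split("."):
        match = re.fullmatch(r"(\w+)(?:\[(\d+)\])?", part)
        assert match is not None, f"bad path segment: {part}"
        key, index = match.group(1), match.group(2)
        # The first segment indexes the top-level dict; later segments
        # descend into the current node's "value" payload.
        container = data if node is None else node["value"]
        node = container[key]
        if index is not None:
            node = node["value"][int(index)]
    return node


sample = {
    "attr2": {
        "value": {
            "list_attr": {"value": [{"value": 1.0}, {"value": 2.0}]}
        }
    }
}
assert get_by_path(sample, "attr2.list_attr[1]")["value"] == 2.0
```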

View File

@@ -2,8 +2,6 @@ from pytest import LogCaptureFixture
from pydase import DataService
from . import caplog # noqa
def test_setattr_warnings(caplog: LogCaptureFixture) -> None: # noqa
# def test_setattr_warnings(capsys: CaptureFixture) -> None:
@@ -32,3 +30,19 @@ def test_private_attribute_warning(caplog: LogCaptureFixture) -> None: # noqa
" Warning: You should not set private but rather protected attributes! Use "
"_something instead of __something." in caplog.text
)
def test_protected_attribute_warning(caplog: LogCaptureFixture) -> None: # noqa
class SubClass:
name = "Hello"
class ServiceClass(DataService):
def __init__(self) -> None:
self._subclass = SubClass
super().__init__()
ServiceClass()
assert (
"Warning: Class SubClass does not inherit from DataService." not in caplog.text
)