164 Commits

Author SHA1 Message Date
Mose Müller
46868743c7 Merge pull request #123 from tiqi-group/36-feat-add-support-for-dictionaries
feat: adds support for dictionaries
2024-04-30 15:48:16 +02:00
Mose Müller
8203e3a498 updates version to v0.8.2 2024-04-30 15:47:50 +02:00
Mose Müller
82b9c14af3 ignores mypy overrides error 2024-04-30 15:46:50 +02:00
Mose Müller
b209ad75bb fixes serializer types and test
pydase dicts can only have string keys. This is now reflected in the serializer as well.
2024-04-30 15:46:39 +02:00
Mose Müller
88a630518b updates ProxyDict types, ignores mypy error 2024-04-30 15:42:48 +02:00
Mose Müller
ed80c92b1f adds dict test for pydase.Client
The pop
2024-04-30 15:33:56 +02:00
Mose Müller
36e30970c5 adds dict.pop to pydase.Client
The pop method removes the element from the dictionary on the server, but it does not
return anything. This is because pop deletes the element on the server, and a returned
proxy class would no longer be meaningful.
2024-04-30 15:24:15 +02:00
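A minimal sketch of the behaviour described above, assuming a running service with a dict attribute named `my_dict` (the attribute name is illustrative, not from this commit):

```python
import pydase

# Hedged sketch: `my_dict` is a hypothetical dict attribute on the remote service.
client_proxy = pydase.Client(hostname="localhost", port=8001).proxy

client_proxy.my_dict.pop("some_key")  # removes the entry on the server
# Unlike dict.pop, nothing is returned: a proxy to the just-deleted element
# would no longer refer to anything meaningful on the server.
```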
Mose Müller
3384d1bebf adds dict.pop method to ObservableDict 2024-04-30 13:15:42 +02:00
Mose Müller
e2f94c8a28 updates Readme (mentions dict support under standard data types) 2024-04-30 13:02:58 +02:00
Mose Müller
4d442cfadc adds ProxyDict to pydase client 2024-04-30 11:50:15 +02:00
Mose Müller
2701a995e1 updates test_helpers (replacing float key with dotted string key) 2024-04-30 11:48:05 +02:00
Mose Müller
47a73ad55f moves _ObservableDict tests into separate file 2024-04-30 11:47:22 +02:00
Mose Müller
ad4f926472 replaces ObservableDict key type warning with exception 2024-04-30 11:44:07 +02:00
Mose Müller
208dee2b92 dictionaries can only take strings now
The object serializations are passed through json.dumps before they are emitted to the
clients. JSON only supports keys of type string, which is why I have to limit the
dictionary key types to strings as well.
2024-04-30 11:12:45 +02:00
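The limitation follows directly from how json.dumps treats non-string keys:

```python
import json

print(json.dumps({1: "a"}))               # '{"1": "a"}' -- the int key is silently coerced to a string
print(json.loads(json.dumps({1: "a"})))   # {'1': 'a'}   -- after a round trip the key no longer matches the original
```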
Mose Müller
02b2d4fb10 observer: ignoring __annotations__ class attribute 2024-04-30 11:09:54 +02:00
Mose Müller
b2f59dd447 updates serializer type (dictionary object) 2024-04-30 10:51:52 +02:00
Mose Müller
33aa8708fd frontend: fixes displayName for dotted dictionary keys 2024-04-30 10:02:46 +02:00
Mose Müller
37d698a1b2 fixes web settings (displayName for dotted dictionary keys) 2024-04-30 10:02:13 +02:00
Mose Müller
8fa91e8121 adds tests for generate_serialized_data_paths and get_data_paths_from_serialized_object 2024-04-30 10:02:13 +02:00
Mose Müller
b9131c9df2 updates generate_serialized_data_paths to handle dictionaries well 2024-04-30 10:02:13 +02:00
Mose Müller
1c1584c2cf fixes dictionary serialization for keys that are not strings 2024-04-30 10:02:13 +02:00
Mose Müller
bb3d6fcce1 updates _ObservableDict
- allows for strings and numbers now
- keys will have double quotes (") instead of single quotes (') when the key is a string
- fixed a few things
- added/updated tests
2024-04-30 10:02:13 +02:00
Mose Müller
e9a7e785dd npm run build 2024-04-30 10:02:13 +02:00
Mose Müller
a214d6d85a using id as form name for number and string component
This removes errors saying that quotes within an element name are not allowed.
2024-04-30 10:02:13 +02:00
Mose Müller
6eaf1a03d1 adds onChange prop to number component form field to remove console errors 2024-04-29 15:20:11 +02:00
Mose Müller
31f1c9a8ce adds "None" type to AttributeType type 2024-04-29 15:08:12 +02:00
Mose Müller
02f1dba0f3 frontend: updates stateUtils 2024-04-29 15:05:48 +02:00
Mose Müller
dc40fc299f adds test for failing get_object_by_path_parts 2024-04-26 09:49:27 +02:00
Mose Müller
348f8aac9b removes tests for get_object_attr_from_path (uses get_object_by_path_parts internally) 2024-04-26 09:49:01 +02:00
Mose Müller
b314ae7dec updates helper tests 2024-04-26 09:42:57 +02:00
Mose Müller
25e578fbba parse_full_access_path can match floats inside brackets now 2024-04-26 09:42:26 +02:00
Mose Müller
1ee6a299b2 updates tests for is_property_attribute 2024-04-26 09:25:11 +02:00
Mose Müller
f315cd62d6 moves render_in_frontend function to decorators module, removes unused update_value_if_change function 2024-04-26 09:10:05 +02:00
Mose Müller
87d172b94b simplifies, documents and tests parse_serialized_key helper function 2024-04-26 08:15:21 +02:00
Mose Müller
a2c60a9c40 chore: replacing old method, adding TODO 2024-04-25 17:44:16 +02:00
Mose Müller
66376e2e6c removes parse_keyed_attribute 2024-04-25 17:40:33 +02:00
Mose Müller
d1c00a2612 using parse_full_access_path instead of path.split(".") in state manager 2024-04-25 17:34:48 +02:00
Mose Müller
6dd878a062 uses helper method in serializer 2024-04-25 17:33:42 +02:00
Mose Müller
2898b62b9c adds parse_serialized_key and get_object_by_path_parts helper methods 2024-04-25 17:33:32 +02:00
Mose Müller
b29c86ac2c updates Serializer functions
- using parse_full_access_path instead of parse_keyed_attribute
- renames get_next_level_dict_by_key to get_container_item_by_key
- replaces ensure_exists and get_nested_value by get_or_create_item_in_container

This allows us to handle access paths like "dict_attr['key'][0].some_attr".
2024-04-25 16:52:40 +02:00
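A hypothetical sketch (not pydase's actual implementation) of splitting such an access path into its atomic parts:

```python
import re

def split_full_access_path(path: str) -> list[str]:
    # Match either a bracketed key/index ("['key']", "[0]") or a plain attribute name.
    return re.findall(r"\[[^\]]*\]|[^.\[\]]+", path)

print(split_full_access_path("dict_attr['key'][0].some_attr"))
# ['dict_attr', "['key']", '[0]', 'some_attr']
```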
Mose Müller
c75b203c3d creates functions to split full access paths and combine the atomic parts back together 2024-04-25 16:30:20 +02:00
Mose Müller
036e80b920 removes warning when using dict as attribute 2024-04-25 14:23:59 +02:00
Mose Müller
de7badd007 fixes exception message 2024-04-25 14:22:49 +02:00
Mose Müller
7e06944018 updates tests for get_next_level_dict_by_key 2024-04-25 10:52:53 +02:00
Mose Müller
4e9e1384df updates get_next_level_dict_by_key to handle lists and dictionaries 2024-04-25 10:52:36 +02:00
Mose Müller
5f7cc7f671 fixes type 2024-04-25 10:51:53 +02:00
Mose Müller
768be76cc8 replaces parseListAttrAndIndex with parseKeyedAttribute in stateUtils 2024-04-23 14:35:22 +02:00
Mose Müller
8fd83fbd7d updates get_object_attr_from_path to support dictionaries 2024-04-23 14:21:39 +02:00
Mose Müller
564eeeb433 adds dictionary support to state_manager (__update_attribute_by_path) 2024-04-22 19:32:09 +02:00
Mose Müller
216368571a fixes parse_keyed_attributes 2024-04-22 19:31:29 +02:00
Mose Müller
2df1a673ac adds DictComponent to GenericComponent 2024-04-22 19:11:23 +02:00
Mose Müller
d40d9c5e47 adds first version of DictComponent 2024-04-22 19:11:13 +02:00
Mose Müller
6cae76bde1 adds tests for parse_keyed_attribute 2024-04-22 19:11:02 +02:00
Mose Müller
32e2a8a4d1 replaces parse_list_attr_and_index with parse_keyed_attribute to support dictionaries 2024-04-22 19:11:02 +02:00
Mose Müller
0ac4049282 Merge pull request #122 from tiqi-group/fix/ListComponent_item_key
Fix: list component item key
2024-04-22 18:39:51 +02:00
Mose Müller
d24c66e522 npm run build 2024-04-22 18:38:27 +02:00
Mose Müller
9ae6895858 replaces undefined name by full_access_path in ListComponent item 2024-04-22 18:38:03 +02:00
Mose Müller
2b8e25f5f1 Merge pull request #121 from tiqi-group/feat/add_async_method_status_spinner
Feat: adds async method status spinner
2024-04-22 17:49:58 +02:00
Mose Müller
9cfcb1ba0c npm run build 2024-04-22 17:47:14 +02:00
Mose Müller
a73e721b73 adds spinner to async task when waiting for backend status update 2024-04-22 17:46:58 +02:00
Mose Müller
503240aeae Merge pull request #120 from tiqi-group/documentation/coloured_enum
adds note that coloured enum values must be unique
2024-04-17 11:49:45 +02:00
Mose Müller
ba24deecb7 adds note that coloured enum values must be unique 2024-04-17 11:47:43 +02:00
Mose Müller
5333acd583 updates version number 2024-04-17 09:34:30 +02:00
Mose Müller
81c05d2e14 Merge pull request #119 from tiqi-group/fix/component_display
Fix/component display
2024-04-17 09:28:32 +02:00
Mose Müller
8832c879a1 npm run build 2024-04-17 09:21:51 +02:00
Mose Müller
ec1f68ae4a using fullAccessPath as Form name for NumberComponent fixing cursor jumps 2024-04-17 09:21:02 +02:00
Mose Müller
f5e108bbe5 fixes readonly coloured enum 2024-04-17 09:11:33 +02:00
Mose Müller
dfe543067f fixes frontend button 2024-04-17 09:07:24 +02:00
Mose Müller
a77dcfdfae Merge pull request #118 from tiqi-group/feat/sio_server_client
Feat/sio server client
2024-04-16 11:48:27 +02:00
Mose Müller
fe01ada733 adds tab completion test for client 2024-04-16 11:29:44 +02:00
Mose Müller
16c1f966ab adds test for dynamically added attribute 2024-04-16 11:15:42 +02:00
Mose Müller
003ee95272 replaces test_image test image url 2024-04-16 11:01:52 +02:00
Mose Müller
dfbf1c61af updates Readme (replaces rpyc with pydase.Client) 2024-04-16 10:55:54 +02:00
Mose Müller
7233e5933b updates client documentation 2024-04-16 10:41:18 +02:00
Mose Müller
09e66400c3 removes rpyc dependency 2024-04-16 10:21:25 +02:00
Mose Müller
6977b795e5 updates pydase.Server docstring 2024-04-16 10:21:25 +02:00
Mose Müller
8911b860d7 updates client list testing 2024-04-16 10:08:42 +02:00
Mose Müller
245b1844c9 adds update_value method to reduce code duplication 2024-04-16 10:08:42 +02:00
Mose Müller
d48ae9f5ad adds trigger_method function to reduce code duplication 2024-04-16 09:59:39 +02:00
Mose Müller
cf637d19ae adds docstring to ProxyClass 2024-04-16 09:49:00 +02:00
Mose Müller
edfb7d0341 Using super() in proxy class constructor 2024-04-16 09:36:34 +02:00
Mose Müller
7b06786307 updates pydase.Client documentation and constructor arguments 2024-04-16 09:34:29 +02:00
Mose Müller
5eeaefdd63 refactors client sio event setup 2024-04-15 08:16:46 +02:00
Mose Müller
f65a0e31c3 updates client tests 2024-04-09 13:52:41 +02:00
Mose Müller
fbada6d818 adds ProxyList methods 2024-04-09 13:43:43 +02:00
Mose Müller
507f286963 adds blocking kwarg to client
If blocking is true, the client constructor will wait until it has connected to the server.
2024-04-09 13:27:55 +02:00
Mose Müller
c148eba5dd updates client tests 2024-04-09 09:25:47 +02:00
Mose Müller
61c7dc8333 client retries connecting if the server is not available. The connection process is no longer blocking 2024-04-09 09:25:01 +02:00
Mose Müller
a879b09e0b updates version to 0.8.0 2024-04-09 09:25:01 +02:00
Mose Müller
bba21e3241 adds Client tests 2024-04-09 09:25:01 +02:00
Mose Müller
16bd17f75c adds deserializer tests 2024-04-09 09:25:01 +02:00
Mose Müller
ad2800aaf6 improves exception deserialization
Tries to use builtin exceptions if possible.
2024-04-08 11:13:14 +02:00
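A hedged sketch of that idea (illustrative only, not the actual deserializer code):

```python
import builtins

def deserialize_exception(type_name: str, message: str) -> Exception:
    # Use the matching builtin exception type if one exists ...
    exc_type = getattr(builtins, type_name, Exception)
    if not (isinstance(exc_type, type) and issubclass(exc_type, Exception)):
        exc_type = Exception  # ... otherwise fall back to a generic Exception
    return exc_type(message)

print(repr(deserialize_exception("ValueError", "invalid voltage")))  # ValueError('invalid voltage')
```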
Mose Müller
d792601663 removes outdated tests 2024-04-08 10:23:24 +02:00
Mose Müller
166fc57877 adds property observer test 2024-04-08 10:23:24 +02:00
Mose Müller
5b762db535 fixes detection of property dependencies for classes inheriting from other observables
When inheriting from a class that itself defines properties, you cannot get those properties by calling
vars(type(obj)). Instead, you have to go through all of the class's members and check whether they are
properties. You can either do this using dir(type(obj)) and getattr, or simply use inspect.getmembers.
2024-04-08 10:23:24 +02:00
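The difference in a nutshell:

```python
import inspect

class Base:
    @property
    def voltage(self) -> float:
        return 1.0

class Derived(Base):
    pass

# vars(type(obj)) only sees what Derived itself defines:
print("voltage" in vars(Derived))  # False

# inspect.getmembers also walks the base classes and finds inherited properties:
print([name for name, m in inspect.getmembers(Derived) if isinstance(m, property)])  # ['voltage']
```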
Mose Müller
73b2355d35 fixes Client specific errors when setting proxy attributes / methods
Also ignores mypy errors
2024-04-08 10:23:24 +02:00
Mose Müller
6335ea21ad fixes warnings in ProxyClassMixin 2024-04-04 16:30:26 +02:00
Mose Müller
690ecd7317 adds aiohttp to python deps (used for socketio.AsyncClient) 2024-04-04 16:26:28 +02:00
Mose Müller
9cb667581a removes exposed dump method (it apparently caused a circular import)
fixes test
2024-04-04 16:26:28 +02:00
Mose Müller
5936e7091e updates sio events
- adds disconnect event which marks the DeviceConnection as disconnected
- updates connect event to notify the observer about the new state and set connected to True
2024-04-04 16:20:31 +02:00
Mose Müller
ad0fd8e833 updates proxy class usage
- the ProxyClass now inherits from DeviceConnection and is only used for the topmost proxy
- the classes of nested proxy objects are created dynamically to keep their component types
- adds an _initialise method to ProxyClassMixin, as sio_client and loop cannot be passed to each
component_class (initialising a class with multiple base classes forwards the constructor
arguments to every base class initialiser)
2024-04-04 16:19:29 +02:00
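The multiple-inheritance issue mentioned above, illustrated with generic classes (not pydase code):

```python
class Mixin:
    def __init__(self) -> None:
        pass  # takes no extra arguments

class Component:
    def __init__(self, value: int) -> None:
        self.value = value

class Proxy(Mixin, Component):
    def __init__(self, value: int) -> None:
        # Cooperative super().__init__ forwards the argument along the MRO,
        # where Mixin.__init__ cannot accept it -- hence a separate
        # _initialise(...) method on the mixin instead of relying on __init__.
        super().__init__(value)

Proxy(5)  # TypeError: Mixin.__init__() takes 1 positional argument but 2 were given
```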
Mose Müller
473c6660e6 fixes warnings 2024-04-04 11:41:12 +02:00
Mose Müller
5511ebc808 updates client proxy
- will now be changed in place (instead of being overwritten on reconnect, which was the only way
of adding or removing property getters / setters)
- replaces getters/setters and methods of proxy with __setattr__ and __getattribute__ functionality
- replaces ProxyClassFactory with ProxyClass and ProxyLoader. The latter updates the former on
reconnect
- client does not need to be a DataService anymore. It only establishes the connection and holds
the reference to the proxy class.
2024-04-04 11:31:14 +02:00
Mose Müller
439665177d removes unused attribute of ProxyConnection 2024-04-03 10:54:03 +02:00
Mose Müller
c0b25c0581 adds Client to default exports of pydase 2024-04-03 10:47:46 +02:00
Mose Müller
60a7dda60a restructures client to have separate thread for its asyncio loop 2024-04-03 10:28:06 +02:00
Mose Müller
381d98b078 updates is_property_attribute to accept the full_access_path instead of the attr_name only 2024-03-29 08:47:24 +01:00
Mose Müller
658fb13d9d improves Client._notify_changed by not emitting sio events when properties change 2024-03-29 08:44:48 +01:00
Mose Müller
a582dc23ac makes proxyclass reconnection wait time a float to avoid a warning 2024-03-29 08:44:20 +01:00
Mose Müller
19b24f3060 avoids notifying server when updates are pushed from the server itself 2024-03-28 18:41:01 +01:00
Mose Müller
d100bb5fea updates Client and ProxyClassFactory
- Client:
  - inherits from DataService now
  - acts as an observer of the proxy class and sends updates to the sio server
- ProxyClassFactory
  - ProxyConnection is now a DeviceConnection -> users will see if the client is connected
2024-03-28 18:41:01 +01:00
Mose Müller
36a70badce fixes observable _construct_extended_attr_path
Passing an empty string resulted in an extended path ending with a "."
2024-03-28 18:13:44 +01:00
Mose Müller
9916d6df60 adds support for dynamically adding attributes to DataService instances 2024-03-28 14:30:09 +01:00
Mose Müller
b4c84da57e npm run build 2024-03-28 11:31:12 +01:00
Mose Müller
ecf0e99318 fixes units test 2024-03-28 11:30:18 +01:00
Mose Müller
10ac007a0c ignores complexity errors 2024-03-28 11:30:07 +01:00
Mose Müller
900017791a fixes loading of removed attributes. Prints debug log instead of raising exception 2024-03-28 11:27:16 +01:00
Mose Müller
edb06b1612 restructures StateManager
- Updates logic of loading the state
- set_service_attribute_value_by_path now expects a serialized object instead of the value
- uses loads instead of __convert_value_if_needed now
2024-03-28 10:21:28 +01:00
Mose Müller
bb5205b2e4 get_object_attr_from_path expects string instead of list now 2024-03-28 10:18:43 +01:00
Mose Müller
c02c75aab5 prevents users from overriding nested ProxyClasses in the sio client proxy 2024-03-28 10:09:08 +01:00
Mose Müller
cc3fdfbb27 makes sio client private on ProxyClass 2024-03-28 09:48:25 +01:00
Mose Müller
7d399df158 proxy class re-raises the exception raised on the server when setting a value 2024-03-28 09:29:37 +01:00
Mose Müller
92e2c0e8ef fixes deserialization of floats 2024-03-28 08:57:59 +01:00
Mose Müller
65f63e08ae fixes changing Quantity from frontend 2024-03-28 08:55:58 +01:00
Mose Müller
4eddf4b980 todos 2024-03-27 17:50:51 +01:00
Mose Müller
9d7099f116 updates socket.ts (passing access_path to backend) 2024-03-27 17:49:57 +01:00
Mose Müller
3f096bda96 fixes loading enum from json file
Loading from the JSON file happens by name. The sio client will send the whole
enumeration, so we have to handle both strings and enumerations.
2024-03-27 17:30:37 +01:00
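A hypothetical sketch of the described handling (the enum and function names are illustrative):

```python
import enum

class Status(enum.Enum):
    RUNNING = "running"

def coerce_to_enum(value: "str | Status") -> Status:
    # The JSON file stores only the member name ("RUNNING"); the sio client
    # sends the enumeration member itself, so both cases have to be handled.
    if isinstance(value, Status):
        return value
    return Status[value]  # look up by name

print(coerce_to_enum("RUNNING"))       # Status.RUNNING
print(coerce_to_enum(Status.RUNNING))  # Status.RUNNING
```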
Mose Müller
e56a6e0653 fixing tests 2024-03-27 17:13:40 +01:00
Mose Müller
e71186dce4 updates types 2024-03-27 17:13:37 +01:00
Mose Müller
d1007fad14 removes unused code 2024-03-27 16:52:48 +01:00
Mose Müller
6f2c1f8951 exports dump function from pydase.utils.serialization 2024-03-27 16:37:06 +01:00
Mose Müller
f18880abd5 moves serializer tests into separate module 2024-03-27 16:31:08 +01:00
Mose Müller
9851ccfcdf moves serializer file into serialization module 2024-03-27 16:30:15 +01:00
Mose Müller
f312ec1e51 moving deserializer into serialization module 2024-03-27 16:26:46 +01:00
Mose Müller
7405d2cafc adds serialization module, moves types into separate file 2024-03-27 16:26:24 +01:00
Mose Müller
e6251975b8 adds try...except blocks around update_value and get_value sio events 2024-03-27 16:18:59 +01:00
Mose Müller
780a2466d3 fixes updating a value through sio client 2024-03-27 16:18:04 +01:00
Mose Müller
8979a1885e fixes method execution from frontend, adds simple serialization methods 2024-03-27 16:00:54 +01:00
Mose Müller
fbc4af28ae removes debugging statements 2024-03-27 16:00:24 +01:00
Mose Müller
454b0fb7d1 adds start and stop methods for tasks in socketio client 2024-03-27 15:32:51 +01:00
Mose Müller
9d3264de1f fixes cache update of task status change 2024-03-27 15:32:51 +01:00
Mose Müller
2d6c681690 improves SerializedObject type hint 2024-03-27 15:32:51 +01:00
Mose Müller
612e62d06b updates ProxyClassFactory (go through handled types before components) 2024-03-27 15:20:50 +01:00
Mose Müller
31f280c9cb frontend components pass actual readOnly and docString values to backend 2024-03-27 15:20:50 +01:00
Mose Müller
e4f5374783 fixes docstring when setting nested value by path 2024-03-27 15:20:50 +01:00
Mose Müller
6397307690 restructuring EnumComponent (now for both Enum and ColouredEnum) 2024-03-27 15:20:50 +01:00
Mose Müller
2ce4c9ce9b using new runMethod function 2024-03-27 15:20:50 +01:00
Mose Müller
15cf0bd414 adapting components to new callback function 2024-03-27 15:20:23 +01:00
Mose Müller
ff3a509132 passing fullAccessPath instead of parentPath and name 2024-03-27 15:20:23 +01:00
Mose Müller
1a01222cb3 updates changeCallback and SerializedObject in GenericComponent.tsx 2024-03-27 12:08:10 +01:00
Mose Müller
2eb996b382 updates frontend socket to use new sio events 2024-03-27 12:08:10 +01:00
Mose Müller
8addcd26aa fixes state manager enum handling 2024-03-27 12:08:10 +01:00
Mose Müller
4db15f2fe8 updates sio events in web server 2024-03-27 12:08:10 +01:00
Mose Müller
27f22d472d updates Deserializer (handle components at last) 2024-03-27 12:06:14 +01:00
Mose Müller
c1aa678384 clients will now receive updates from socketio server and notify the observer 2024-03-27 12:06:14 +01:00
Mose Müller
11670addc4 replaces ClientDeserializer with ProxyClassFactory 2024-03-27 12:06:14 +01:00
Mose Müller
1c663e9a2e updates Deserializer (type hints, adding keyword to argument) 2024-03-27 12:06:14 +01:00
Mose Müller
ada9dcce4a adds websocket-client package 2024-03-27 12:06:14 +01:00
Mose Müller
bd5c162148 adds socketio client code 2024-03-27 12:06:14 +01:00
Mose Müller
4e1ec90dee adds Deserializer, converting SerializedObject objects back to actual objects 2024-03-27 12:06:14 +01:00
Mose Müller
4406acf4dd adds support for serializing exceptions 2024-03-27 12:06:14 +01:00
Mose Müller
1ad917a423 removes rpyc 2024-03-27 12:06:14 +01:00
Mose Müller
57e7deb552 Serializer adds full_access_path to serialized object representation 2024-03-26 10:52:06 +01:00
Mose Müller
d9ea33abb6 adds enum name to serialized object representation 2024-03-26 10:50:16 +01:00
66 changed files with 4389 additions and 2068 deletions

View File

@@ -11,7 +11,9 @@
- [Defining a DataService](#defining-a-dataservice)
- [Running the Server](#running-the-server)
- [Accessing the Web Interface](#accessing-the-web-interface)
- [Connecting to the Service using rpyc](#connecting-to-the-service-using-rpyc)
- [Connecting to the Service via Python Client](#connecting-to-the-service-via-python-client)
- [Tab Completion Support](#tab-completion-support)
- [Integration within Another Service](#integration-within-another-service)
- [Understanding the Component System](#understanding-the-component-system)
- [Built-in Type and Enum Components](#built-in-type-and-enum-components)
- [Method Components](#method-components)
@@ -44,7 +46,7 @@
<!-- no toc -->
- [Simple data service definition through class-based interface](#defining-a-dataService)
- [Integrated web interface for interactive access and control of your data service](#accessing-the-web-interface)
- [Support for `rpyc` connections, allowing for programmatic control and interaction with your service](#connecting-to-the-service-using-rpyc)
- [Support for programmatic control and interaction with your service](#connecting-to-the-service-via-python-client)
- [Component system bridging Python backend with frontend visual representation](#understanding-the-component-system)
- [Customizable styling for the web interface through user-defined CSS](#customizing-web-interface-style)
- [Saving and restoring the service state for service persistence](#understanding-service-persistence)
@@ -74,11 +76,11 @@ pip install pydase
<!--usage-start-->
Using `pydase` involves three main steps: defining a `DataService` subclass, running the server, and then connecting to the service either programmatically using `rpyc` or through the web interface.
Using `pydase` involves three main steps: defining a `DataService` subclass, running the server, and then connecting to the service either programmatically using `pydase.Client` or through the web interface.
### Defining a DataService
To use pydase, you'll first need to create a class that inherits from `DataService`. This class represents your custom data service, which will be exposed via RPC (using rpyc) and a web server. Your class can implement class / instance attributes and synchronous and asynchronous tasks.
To use pydase, you'll first need to create a class that inherits from `DataService`. This class represents your custom data service, which will be exposed via a web server. Your class can implement class / instance attributes and synchronous and asynchronous tasks.
Here's an example:
@@ -159,23 +161,51 @@ Once the server is running, you can access the web interface in a browser:
In this interface, you can interact with the properties of your `Device` service.
### Connecting to the Service using rpyc
### Connecting to the Service via Python Client
You can also connect to the service using `rpyc`. Here's an example on how to establish a connection and interact with the service:
You can connect to the service using the `pydase.Client`. Below is an example of how to establish a connection to a service and interact with it:
```python
import rpyc
import pydase
# Connect to the service
conn = rpyc.connect("<ip_addr>", 18871)
client = conn.root
# Replace the hostname and port with the IP address and the port of the machine where
# the service is running, respectively
client_proxy = pydase.Client(hostname="<ip_addr>", port=8001).proxy
# Interact with the service
client.voltage = 5.0
print(client.voltage) # prints 5.0
# After the connection, interact with the service attributes as if they were local
client_proxy.voltage = 5.0
print(client_proxy.voltage) # Expected output: 5.0
```
In this example, replace `<ip_addr>` with the IP address of the machine where the service is running. After establishing a connection, you can interact with the service attributes as if they were local attributes.
This example demonstrates setting and retrieving the `voltage` attribute through the client proxy.
The proxy acts as a local representative of the remote service, enabling straightforward interaction.
The proxy class dynamically synchronizes with the server's exposed attributes. This synchronization allows the proxy to be automatically updated with any attributes or methods that the server exposes, essentially mirroring the server's API. This dynamic updating enables users to interact with the remote service as if they were working with a local object.
#### Tab Completion Support
In interactive environments such as Python interpreters and Jupyter notebooks, the proxy class supports tab completion, which allows users to explore available methods and attributes.
#### Integration within Another Service
You can also integrate a client proxy within another service. Here's how you can set it up:
```python
import pydase
class MyService(pydase.DataService):
# Initialize the client without blocking the constructor
proxy = pydase.Client(hostname="<ip_addr>", port=8001, block_until_connected=False).proxy
if __name__ == "__main__":
service = MyService()
# Create a server that exposes this service; adjust the web_port as needed
server = pydase.Server(service, web_port=8002).run()
```
In this setup, the `MyService` class has a `proxy` attribute that connects to a `pydase` service located at `<ip_addr>:8001`.
The `block_until_connected=False` argument allows the service to start up even if the initial connection attempt fails.
This configuration is particularly useful in distributed systems where services may start in any order.
<!--usage-end-->
@@ -193,6 +223,7 @@ In `pydase`, components are fundamental building blocks that bridge the Python b
- `int` and `float`: Manifested as the `NumberComponent`.
- `bool`: Rendered as a `ButtonComponent`.
- `list`: Each item displayed individually, named after the list attribute and its index.
- `dict`: Each key-value pair displayed individually, named after the dictionary attribute and its key. **Note** that the dictionary keys must be strings.
- `enum.Enum`: Presented as an `EnumComponent`, facilitating dropdown selection.
### Method Components
@@ -608,6 +639,9 @@ my_service.status = MyStatus.FAILED
![ColouredEnum Component](docs/images/ColouredEnum_component.png)
**Note** that each enumeration name and value must be unique.
This means that you should use different colour formats when you want to use a colour multiple times.
#### Extending with New Components
Users can also extend the library by creating custom components. This involves defining the behavior on the Python backend and the visual representation on the frontend. For those looking to introduce new components, the [guide on adding components](https://pydase.readthedocs.io/en/latest/dev-guide/Adding_Components/) provides detailed steps on achieving this.

View File

@@ -188,8 +188,6 @@ const App = () => {
<div className="App navbarOffset">
<WebSettingsContext.Provider value={webSettings}>
<GenericComponent
name=""
parentPath=""
attribute={state as SerializedValue}
isInstantUpdate={isInstantUpdate}
addNotification={addNotification}

View File

@@ -1,12 +1,11 @@
import React, { useEffect, useRef } from 'react';
import React, { useEffect, useRef, useState } from 'react';
import { runMethod } from '../socket';
import { Form, Button, InputGroup } from 'react-bootstrap';
import { Form, Button, InputGroup, Spinner } from 'react-bootstrap';
import { DocStringComponent } from './DocStringComponent';
import { LevelName } from './NotificationsComponent';
type AsyncMethodProps = {
name: string;
parentPath: string;
fullAccessPath: string;
value: 'RUNNING' | null;
docString?: string;
hideOutput?: boolean;
@@ -18,8 +17,7 @@ type AsyncMethodProps = {
export const AsyncMethodComponent = React.memo((props: AsyncMethodProps) => {
const {
name,
parentPath,
fullAccessPath,
docString,
value: runningTask,
addNotification,
@@ -34,7 +32,9 @@ export const AsyncMethodComponent = React.memo((props: AsyncMethodProps) => {
const renderCount = useRef(0);
const formRef = useRef(null);
const fullAccessPath = [parentPath, name].filter((element) => element).join('.');
const [spinning, setSpinning] = useState(false);
const name = fullAccessPath.split('.').at(-1);
const parentPath = fullAccessPath.slice(0, -(name.length + 1));
useEffect(() => {
renderCount.current++;
@@ -46,6 +46,7 @@ export const AsyncMethodComponent = React.memo((props: AsyncMethodProps) => {
message = `${fullAccessPath} was started.`;
}
addNotification(message);
setSpinning(false);
}, [props.value]);
const execute = async (event: React.FormEvent) => {
@@ -58,7 +59,9 @@ export const AsyncMethodComponent = React.memo((props: AsyncMethodProps) => {
method_name = `start_${name}`;
}
runMethod(method_name, parentPath, {});
const accessPath = [parentPath, method_name].filter((element) => element).join('.');
setSpinning(true);
runMethod(accessPath);
};
return (
@@ -73,7 +76,13 @@ export const AsyncMethodComponent = React.memo((props: AsyncMethodProps) => {
<DocStringComponent docString={docString} />
</InputGroup.Text>
<Button id={`button-${id}`} type="submit">
{runningTask === 'RUNNING' ? 'Stop ' : 'Start '}
{spinning ? (
<Spinner size="sm" role="status" aria-hidden="true" />
) : runningTask === 'RUNNING' ? (
'Stop '
) : (
'Start '
)}
</Button>
</InputGroup>
</Form>

View File

@@ -1,22 +1,17 @@
import React, { useEffect, useRef } from 'react';
import { ToggleButton } from 'react-bootstrap';
import { DocStringComponent } from './DocStringComponent';
import { SerializedValue } from './GenericComponent';
import { LevelName } from './NotificationsComponent';
type ButtonComponentProps = {
name: string;
parentPath?: string;
fullAccessPath: string;
value: boolean;
readOnly: boolean;
docString: string;
mapping?: [string, string]; // Enforce a tuple of two strings
addNotification: (message: string, levelname?: LevelName) => void;
changeCallback?: (
value: unknown,
attributeName?: string,
prefix?: string,
callback?: (ack: unknown) => void
) => void;
changeCallback?: (value: SerializedValue, callback?: (ack: unknown) => void) => void;
displayName: string;
id: string;
};
@@ -24,6 +19,7 @@ type ButtonComponentProps = {
export const ButtonComponent = React.memo((props: ButtonComponentProps) => {
const {
value,
fullAccessPath,
readOnly,
docString,
addNotification,
@@ -32,9 +28,6 @@ export const ButtonComponent = React.memo((props: ButtonComponentProps) => {
id
} = props;
// const buttonName = props.mapping ? (value ? props.mapping[0] : props.mapping[1]) : name;
const fullAccessPath = [props.parentPath, props.name]
.filter((element) => element)
.join('.');
const renderCount = useRef(0);
@@ -47,7 +40,13 @@ export const ButtonComponent = React.memo((props: ButtonComponentProps) => {
}, [props.value]);
const setChecked = (checked: boolean) => {
changeCallback(checked);
changeCallback({
type: 'bool',
value: checked,
full_access_path: fullAccessPath,
readonly: readOnly,
doc: docString
});
};
return (

View File

@@ -1,100 +0,0 @@
import React, { useEffect, useRef, useState } from 'react';
import { InputGroup, Form, Row, Col } from 'react-bootstrap';
import { DocStringComponent } from './DocStringComponent';
import { LevelName } from './NotificationsComponent';
type ColouredEnumComponentProps = {
name: string;
parentPath: string;
value: string;
docString?: string;
readOnly: boolean;
enumDict: Record<string, string>;
addNotification: (message: string, levelname?: LevelName) => void;
changeCallback?: (
value: unknown,
attributeName?: string,
prefix?: string,
callback?: (ack: unknown) => void
) => void;
displayName: string;
id: string;
};
export const ColouredEnumComponent = React.memo((props: ColouredEnumComponentProps) => {
const {
name,
value,
docString,
enumDict,
readOnly,
addNotification,
displayName,
id
} = props;
let { changeCallback } = props;
if (changeCallback === undefined) {
changeCallback = (value: string) => {
setEnumValue(() => {
return value;
});
};
}
const renderCount = useRef(0);
const [enumValue, setEnumValue] = useState(value);
const fullAccessPath = [props.parentPath, props.name]
.filter((element) => element)
.join('.');
useEffect(() => {
renderCount.current++;
});
useEffect(() => {
setEnumValue(() => {
return props.value;
});
addNotification(`${fullAccessPath} changed to ${value}.`);
}, [props.value]);
return (
<div className={'component enumComponent'} id={id}>
{process.env.NODE_ENV === 'development' && (
<div>Render count: {renderCount.current}</div>
)}
<Row>
<Col className="d-flex align-items-center">
<InputGroup.Text>
{displayName}
<DocStringComponent docString={docString} />
</InputGroup.Text>
{readOnly ? (
// Display the Form.Control when readOnly is true
<Form.Control
value={enumValue}
name={name}
disabled={true}
style={{ backgroundColor: enumDict[enumValue] }}
/>
) : (
// Display the Form.Select when readOnly is false
<Form.Select
aria-label="coloured-enum-select"
value={enumValue}
name={name}
style={{ backgroundColor: enumDict[enumValue] }}
onChange={(event) => changeCallback(event.target.value)}>
{Object.entries(enumDict).map(([key]) => (
<option key={key} value={key}>
{key}
</option>
))}
</Form.Select>
)}
</Col>
</Row>
</div>
);
});

View File

@@ -6,9 +6,7 @@ import { SerializedValue, GenericComponent } from './GenericComponent';
import { LevelName } from './NotificationsComponent';
type DataServiceProps = {
name: string;
props: DataServiceJSON;
parentPath?: string;
isInstantUpdate: boolean;
addNotification: (message: string, levelname?: LevelName) => void;
displayName: string;
@@ -18,17 +16,8 @@ type DataServiceProps = {
export type DataServiceJSON = Record<string, SerializedValue>;
export const DataServiceComponent = React.memo(
({
name,
props,
parentPath = undefined,
isInstantUpdate,
addNotification,
displayName,
id
}: DataServiceProps) => {
({ props, isInstantUpdate, addNotification, displayName, id }: DataServiceProps) => {
const [open, setOpen] = useState(true);
const fullAccessPath = [parentPath, name].filter((element) => element).join('.');
if (displayName !== '') {
return (
@@ -43,8 +32,6 @@ export const DataServiceComponent = React.memo(
<GenericComponent
key={key}
attribute={value}
name={key}
parentPath={fullAccessPath}
isInstantUpdate={isInstantUpdate}
addNotification={addNotification}
/>
@@ -61,8 +48,6 @@ export const DataServiceComponent = React.memo(
<GenericComponent
key={key}
attribute={value}
name={key}
parentPath={fullAccessPath}
isInstantUpdate={isInstantUpdate}
addNotification={addNotification}
/>

View File

@@ -4,9 +4,8 @@ import { DataServiceComponent, DataServiceJSON } from './DataServiceComponent';
import { MethodComponent } from './MethodComponent';
type DeviceConnectionProps = {
name: string;
fullAccessPath: string;
props: DataServiceJSON;
parentPath: string;
isInstantUpdate: boolean;
addNotification: (message: string, levelname?: LevelName) => void;
displayName: string;
@@ -15,9 +14,8 @@ type DeviceConnectionProps = {
export const DeviceConnectionComponent = React.memo(
({
name,
fullAccessPath,
props,
parentPath,
isInstantUpdate,
addNotification,
displayName,
@@ -26,8 +24,6 @@ export const DeviceConnectionComponent = React.memo(
const { connected, connect, ...updatedProps } = props;
const connectedVal = connected.value;
const fullAccessPath = [parentPath, name].filter((element) => element).join('.');
return (
<div className="deviceConnectionComponent" id={id}>
{!connectedVal && (
@@ -36,8 +32,7 @@ export const DeviceConnectionComponent = React.memo(
{displayName != '' ? displayName : 'Device'} is currently not available!
</div>
<MethodComponent
name="connect"
parentPath={fullAccessPath}
fullAccessPath={`${fullAccessPath}.connect`}
docString={connect.doc}
addNotification={addNotification}
displayName={'reconnect'}
@@ -47,9 +42,7 @@ export const DeviceConnectionComponent = React.memo(
</div>
)}
<DataServiceComponent
name={name}
props={updatedProps}
parentPath={parentPath}
isInstantUpdate={isInstantUpdate}
addNotification={addNotification}
displayName={displayName}

View File

@@ -0,0 +1,42 @@
import React, { useEffect, useRef } from 'react';
import { DocStringComponent } from './DocStringComponent';
import { SerializedValue, GenericComponent } from './GenericComponent';
import { LevelName } from './NotificationsComponent';
type DictComponentProps = {
value: Record<string, SerializedValue>;
docString: string;
isInstantUpdate: boolean;
addNotification: (message: string, levelname?: LevelName) => void;
id: string;
};
export const DictComponent = React.memo((props: DictComponentProps) => {
const { value, docString, isInstantUpdate, addNotification, id } = props;
const renderCount = useRef(0);
const valueArray = Object.values(value);
useEffect(() => {
renderCount.current++;
}, [props]);
return (
<div className={'listComponent'} id={id}>
{process.env.NODE_ENV === 'development' && (
<div>Render count: {renderCount.current}</div>
)}
<DocStringComponent docString={docString} />
{valueArray.map((item) => {
return (
<GenericComponent
key={item.full_access_path}
attribute={item}
isInstantUpdate={isInstantUpdate}
addNotification={addNotification}
/>
);
})}
</div>
);
});

View File

@@ -1,63 +1,58 @@
import React, { useEffect, useRef, useState } from 'react';
import { InputGroup, Form, Row, Col } from 'react-bootstrap';
import { DocStringComponent } from './DocStringComponent';
import { SerializedValue } from './GenericComponent';
import { LevelName } from './NotificationsComponent';
type EnumComponentProps = {
export type EnumSerialization = {
type: 'Enum' | 'ColouredEnum';
full_access_path: string;
name: string;
parentPath: string;
value: string;
docString?: string;
readOnly: boolean;
enumDict: Record<string, string>;
readonly: boolean;
doc?: string | null;
enum: Record<string, string>;
};
type EnumComponentProps = {
attribute: EnumSerialization;
addNotification: (message: string, levelname?: LevelName) => void;
changeCallback?: (
value: unknown,
attributeName?: string,
prefix?: string,
callback?: (ack: unknown) => void
) => void;
displayName: string;
id: string;
changeCallback?: (value: SerializedValue, callback?: (ack: unknown) => void) => void;
};
export const EnumComponent = React.memo((props: EnumComponentProps) => {
const { attribute, addNotification, displayName, id } = props;
const {
name,
full_access_path: fullAccessPath,
value,
docString,
enumDict,
addNotification,
displayName,
id,
readOnly
} = props;
doc: docString,
enum: enumDict,
readonly: readOnly
} = attribute;
let { changeCallback } = props;
if (changeCallback === undefined) {
changeCallback = (value: string) => {
changeCallback = (value: SerializedValue) => {
setEnumValue(() => {
return value;
return String(value.value);
});
};
}
const renderCount = useRef(0);
const [enumValue, setEnumValue] = useState(value);
const fullAccessPath = [props.parentPath, props.name]
.filter((element) => element)
.join('.');
useEffect(() => {
renderCount.current++;
});
useEffect(() => {
setEnumValue(() => {
return props.value;
return value;
});
addNotification(`${fullAccessPath} changed to ${value}.`);
}, [props.value]);
}, [value]);
return (
<div className={'component enumComponent'} id={id}>
@@ -73,17 +68,41 @@ export const EnumComponent = React.memo((props: EnumComponentProps) => {
{readOnly ? (
// Display the Form.Control when readOnly is true
<Form.Control value={enumDict[enumValue]} name={name} disabled={true} />
<Form.Control
style={
attribute.type == 'ColouredEnum'
? { backgroundColor: enumDict[enumValue] }
: {}
}
value={attribute.type == 'ColouredEnum' ? enumValue : enumDict[enumValue]}
name={fullAccessPath}
disabled={true}
/>
) : (
// Display the Form.Select when readOnly is false
<Form.Select
aria-label="example-select"
value={enumValue}
name={name}
onChange={(event) => changeCallback(event.target.value)}>
name={fullAccessPath}
style={
attribute.type == 'ColouredEnum'
? { backgroundColor: enumDict[enumValue] }
: {}
}
onChange={(event) =>
changeCallback({
type: attribute.type,
name: attribute.name,
enum: enumDict,
value: event.target.value,
full_access_path: fullAccessPath,
readonly: attribute.readonly,
doc: attribute.doc
})
}>
{Object.entries(enumDict).map(([key, val]) => (
<option key={key} value={key}>
{val}
{attribute.type == 'ColouredEnum' ? key : val}
</option>
))}
</Form.Select>

View File

@@ -2,7 +2,7 @@ import React, { useContext } from 'react';
import { ButtonComponent } from './ButtonComponent';
import { NumberComponent } from './NumberComponent';
import { SliderComponent } from './SliderComponent';
import { EnumComponent } from './EnumComponent';
import { EnumComponent, EnumSerialization } from './EnumComponent';
import { MethodComponent } from './MethodComponent';
import { AsyncMethodComponent } from './AsyncMethodComponent';
import { StringComponent } from './StringComponent';
@@ -10,11 +10,12 @@ import { ListComponent } from './ListComponent';
import { DataServiceComponent, DataServiceJSON } from './DataServiceComponent';
import { DeviceConnectionComponent } from './DeviceConnection';
import { ImageComponent } from './ImageComponent';
import { ColouredEnumComponent } from './ColouredEnumComponent';
import { LevelName } from './NotificationsComponent';
import { getIdFromFullAccessPath } from '../utils/stringUtils';
import { WebSettingsContext } from '../WebSettings';
import { setAttribute } from '../socket';
import { updateValue } from '../socket';
import { DictComponent } from './DictComponent';
import { parseFullAccessPath } from '../utils/stateUtils';
type AttributeType =
| 'str'
@@ -22,7 +23,9 @@ type AttributeType =
| 'float'
| 'int'
| 'Quantity'
| 'None'
| 'list'
| 'dict'
| 'method'
| 'DataService'
| 'DeviceConnection'
@@ -34,6 +37,8 @@ type AttributeType =
type ValueType = boolean | string | number | Record<string, unknown>;
export type SerializedValue = {
type: AttributeType;
full_access_path: string;
name?: string;
value?: ValueType | ValueType[];
readonly: boolean;
doc?: string | null;
@@ -43,24 +48,41 @@ export type SerializedValue = {
};
type GenericComponentProps = {
attribute: SerializedValue;
name: string;
parentPath: string;
isInstantUpdate: boolean;
addNotification: (message: string, levelname?: LevelName) => void;
};
const getPathFromPathParts = (pathParts: string[]): string => {
let path = '';
for (const pathPart of pathParts) {
if (!pathPart.startsWith('[') && path !== '') {
path += '.';
}
path += pathPart;
}
return path;
};
const createDisplayNameFromAccessPath = (fullAccessPath: string): string => {
const displayNameParts = [];
const parsedFullAccessPath = parseFullAccessPath(fullAccessPath);
for (let i = parsedFullAccessPath.length - 1; i >= 0; i--) {
const item = parsedFullAccessPath[i];
displayNameParts.unshift(item);
if (!item.startsWith('[')) {
break;
}
}
return getPathFromPathParts(displayNameParts);
};
export const GenericComponent = React.memo(
({
attribute,
name,
parentPath,
isInstantUpdate,
addNotification
}: GenericComponentProps) => {
const fullAccessPath = [parentPath, name].filter((element) => element).join('.');
({ attribute, isInstantUpdate, addNotification }: GenericComponentProps) => {
const { full_access_path: fullAccessPath } = attribute;
const id = getIdFromFullAccessPath(fullAccessPath);
const webSettings = useContext(WebSettingsContext);
let displayName = name;
let displayName = createDisplayNameFromAccessPath(fullAccessPath);
if (webSettings[fullAccessPath]) {
if (webSettings[fullAccessPath].display === false) {
@@ -72,19 +94,16 @@ export const GenericComponent = React.memo(
}
function changeCallback(
value: unknown,
attributeName: string = name,
prefix: string = parentPath,
value: SerializedValue,
callback: (ack: unknown) => void = undefined
) {
setAttribute(attributeName, prefix, value, callback);
updateValue(value, callback);
}
if (attribute.type === 'bool') {
return (
<ButtonComponent
name={name}
parentPath={parentPath}
fullAccessPath={fullAccessPath}
docString={attribute.doc}
readOnly={attribute.readonly}
value={Boolean(attribute.value)}
@@ -97,9 +116,8 @@ export const GenericComponent = React.memo(
} else if (attribute.type === 'float' || attribute.type === 'int') {
return (
<NumberComponent
name={name}
type={attribute.type}
parentPath={parentPath}
fullAccessPath={fullAccessPath}
docString={attribute.doc}
readOnly={attribute.readonly}
value={Number(attribute.value)}
@@ -113,9 +131,8 @@ export const GenericComponent = React.memo(
} else if (attribute.type === 'Quantity') {
return (
<NumberComponent
name={name}
type="float"
parentPath={parentPath}
type="Quantity"
fullAccessPath={fullAccessPath}
docString={attribute.doc}
readOnly={attribute.readonly}
value={Number(attribute.value['magnitude'])}
@@ -130,8 +147,7 @@ export const GenericComponent = React.memo(
} else if (attribute.type === 'NumberSlider') {
return (
<SliderComponent
name={name}
parentPath={parentPath}
fullAccessPath={fullAccessPath}
docString={attribute.value['value'].doc}
readOnly={attribute.readonly}
value={attribute.value['value']}
@@ -145,15 +161,10 @@ export const GenericComponent = React.memo(
id={id}
/>
);
} else if (attribute.type === 'Enum') {
} else if (attribute.type === 'Enum' || attribute.type === 'ColouredEnum') {
return (
<EnumComponent
name={name}
parentPath={parentPath}
docString={attribute.doc}
value={String(attribute.value)}
readOnly={attribute.readonly}
enumDict={attribute.enum}
attribute={attribute as EnumSerialization}
addNotification={addNotification}
changeCallback={changeCallback}
displayName={displayName}
@@ -164,8 +175,7 @@ export const GenericComponent = React.memo(
if (!attribute.async) {
return (
<MethodComponent
name={name}
parentPath={parentPath}
fullAccessPath={fullAccessPath}
docString={attribute.doc}
addNotification={addNotification}
displayName={displayName}
@@ -176,10 +186,9 @@ export const GenericComponent = React.memo(
} else {
return (
<AsyncMethodComponent
name={name}
parentPath={parentPath}
fullAccessPath={fullAccessPath}
docString={attribute.doc}
value={attribute.value as Record<string, string>}
value={attribute.value as 'RUNNING' | null}
addNotification={addNotification}
displayName={displayName}
id={id}
@@ -190,11 +199,10 @@ export const GenericComponent = React.memo(
} else if (attribute.type === 'str') {
return (
<StringComponent
name={name}
fullAccessPath={fullAccessPath}
value={attribute.value as string}
readOnly={attribute.readonly}
docString={attribute.doc}
parentPath={parentPath}
isInstantUpdate={isInstantUpdate}
addNotification={addNotification}
changeCallback={changeCallback}
@@ -205,9 +213,7 @@ export const GenericComponent = React.memo(
} else if (attribute.type === 'DataService') {
return (
<DataServiceComponent
name={name}
props={attribute.value as DataServiceJSON}
parentPath={parentPath}
isInstantUpdate={isInstantUpdate}
addNotification={addNotification}
displayName={displayName}
@@ -217,9 +223,8 @@ export const GenericComponent = React.memo(
} else if (attribute.type === 'DeviceConnection') {
return (
<DeviceConnectionComponent
name={name}
fullAccessPath={fullAccessPath}
props={attribute.value as DataServiceJSON}
parentPath={parentPath}
isInstantUpdate={isInstantUpdate}
addNotification={addNotification}
displayName={displayName}
@@ -229,10 +234,18 @@ export const GenericComponent = React.memo(
} else if (attribute.type === 'list') {
return (
<ListComponent
name={name}
value={attribute.value as SerializedValue[]}
docString={attribute.doc}
parentPath={parentPath}
isInstantUpdate={isInstantUpdate}
addNotification={addNotification}
id={id}
/>
);
} else if (attribute.type === 'dict') {
return (
<DictComponent
value={attribute.value as Record<string, SerializedValue>}
docString={attribute.doc}
isInstantUpdate={isInstantUpdate}
addNotification={addNotification}
id={id}
@@ -241,8 +254,7 @@ export const GenericComponent = React.memo(
} else if (attribute.type === 'Image') {
return (
<ImageComponent
name={name}
parentPath={parentPath}
fullAccessPath={fullAccessPath}
docString={attribute.value['value'].doc}
displayName={displayName}
id={id}
@@ -252,23 +264,8 @@ export const GenericComponent = React.memo(
format={attribute.value['format']['value'] as string}
/>
);
} else if (attribute.type === 'ColouredEnum') {
return (
<ColouredEnumComponent
name={name}
parentPath={parentPath}
docString={attribute.doc}
value={String(attribute.value)}
readOnly={attribute.readonly}
enumDict={attribute.enum}
addNotification={addNotification}
changeCallback={changeCallback}
displayName={displayName}
id={id}
/>
);
} else {
return <div key={name}>{name}</div>;
return <div key={fullAccessPath}>{fullAccessPath}</div>;
}
}
);

View File

@@ -5,8 +5,7 @@ import { ChevronDown, ChevronRight } from 'react-bootstrap-icons';
import { LevelName } from './NotificationsComponent';
type ImageComponentProps = {
name: string;
parentPath: string;
fullAccessPath: string;
value: string;
docString: string;
format: string;
@@ -16,13 +15,11 @@ type ImageComponentProps = {
};
export const ImageComponent = React.memo((props: ImageComponentProps) => {
const { value, docString, format, addNotification, displayName, id } = props;
const { fullAccessPath, value, docString, format, addNotification, displayName, id } =
props;
const renderCount = useRef(0);
const [open, setOpen] = useState(true);
const fullAccessPath = [props.parentPath, props.name]
.filter((element) => element)
.join('.');
useEffect(() => {
renderCount.current++;

View File

@@ -4,8 +4,6 @@ import { SerializedValue, GenericComponent } from './GenericComponent';
import { LevelName } from './NotificationsComponent';
type ListComponentProps = {
name: string;
parentPath?: string;
value: SerializedValue[];
docString: string;
isInstantUpdate: boolean;
@@ -14,8 +12,7 @@ type ListComponentProps = {
};
export const ListComponent = React.memo((props: ListComponentProps) => {
const { name, parentPath, value, docString, isInstantUpdate, addNotification, id } =
props;
const { value, docString, isInstantUpdate, addNotification, id } = props;
const renderCount = useRef(0);
@@ -29,13 +26,11 @@ export const ListComponent = React.memo((props: ListComponentProps) => {
<div>Render count: {renderCount.current}</div>
)}
<DocStringComponent docString={docString} />
{value.map((item, index) => {
{value.map((item) => {
return (
<GenericComponent
key={`${name}[${index}]`}
key={item.full_access_path}
attribute={item}
name={`${name}[${index}]`}
parentPath={parentPath}
isInstantUpdate={isInstantUpdate}
addNotification={addNotification}
/>

View File

@@ -5,8 +5,7 @@ import { DocStringComponent } from './DocStringComponent';
import { LevelName } from './NotificationsComponent';
type MethodProps = {
name: string;
parentPath: string;
fullAccessPath: string;
docString?: string;
addNotification: (message: string, levelname?: LevelName) => void;
displayName: string;
@@ -15,7 +14,7 @@ type MethodProps = {
};
export const MethodComponent = React.memo((props: MethodProps) => {
const { name, parentPath, docString, addNotification, displayName, id } = props;
const { fullAccessPath, docString, addNotification, displayName, id } = props;
// Conditional rendering based on the 'render' prop.
if (!props.render) {
@@ -24,7 +23,6 @@ export const MethodComponent = React.memo((props: MethodProps) => {
const renderCount = useRef(0);
const formRef = useRef(null);
const fullAccessPath = [parentPath, name].filter((element) => element).join('.');
const triggerNotification = () => {
const message = `Method ${fullAccessPath} was triggered.`;
@@ -34,7 +32,7 @@ export const MethodComponent = React.memo((props: MethodProps) => {
const execute = async (event: React.FormEvent) => {
event.preventDefault();
runMethod(name, parentPath, {});
runMethod(fullAccessPath);
triggerNotification();
};

View File

@@ -3,6 +3,7 @@ import { Form, InputGroup } from 'react-bootstrap';
import { DocStringComponent } from './DocStringComponent';
import '../App.css';
import { LevelName } from './NotificationsComponent';
import { SerializedValue } from './GenericComponent';
// TODO: add button functionality
@@ -30,21 +31,15 @@ export type FloatObject = {
export type NumberObject = IntObject | FloatObject | QuantityObject;
type NumberComponentProps = {
name: string;
type: 'float' | 'int';
parentPath?: string;
type: 'float' | 'int' | 'Quantity';
fullAccessPath: string;
value: number;
readOnly: boolean;
docString: string;
isInstantUpdate: boolean;
unit?: string;
addNotification: (message: string, levelname?: LevelName) => void;
changeCallback?: (
value: unknown,
attributeName?: string,
prefix?: string,
callback?: (ack: unknown) => void
) => void;
changeCallback?: (value: SerializedValue, callback?: (ack: unknown) => void) => void;
displayName?: string;
id: string;
};
@@ -161,7 +156,7 @@ const handleNumericKey = (
export const NumberComponent = React.memo((props: NumberComponentProps) => {
const {
name,
fullAccessPath,
value,
readOnly,
type,
@@ -179,9 +174,6 @@ export const NumberComponent = React.memo((props: NumberComponentProps) => {
// Create a state for the input string
const [inputString, setInputString] = useState(value.toString());
const renderCount = useRef(0);
const fullAccessPath = [props.parentPath, props.name]
.filter((element) => element)
.join('.');
const handleKeyDown = (event) => {
const { key, target } = event;
@@ -225,7 +217,7 @@ export const NumberComponent = React.memo((props: NumberComponentProps) => {
selectionStart,
selectionEnd
));
} else if (key === '.' && type === 'float') {
} else if (key === '.' && (type === 'float' || type === 'Quantity')) {
({ value: newValue, selectionStart } = handleNumericKey(
key,
value,
@@ -252,7 +244,20 @@ export const NumberComponent = React.memo((props: NumberComponentProps) => {
selectionEnd
));
} else if (key === 'Enter' && !isInstantUpdate) {
changeCallback(Number(newValue));
let updatedValue: number | Record<string, unknown> = Number(newValue);
if (type === 'Quantity') {
updatedValue = {
magnitude: Number(newValue),
unit: unit
};
}
changeCallback({
type: type,
value: updatedValue,
full_access_path: fullAccessPath,
readonly: readOnly,
doc: docString
});
return;
} else {
console.debug(key);
@@ -261,7 +266,20 @@ export const NumberComponent = React.memo((props: NumberComponentProps) => {
// Update the input value and maintain the cursor position
if (isInstantUpdate) {
changeCallback(Number(newValue));
let updatedValue: number | Record<string, unknown> = Number(newValue);
if (type === 'Quantity') {
updatedValue = {
magnitude: Number(newValue),
unit: unit
};
}
changeCallback({
type: type,
value: updatedValue,
full_access_path: fullAccessPath,
readonly: readOnly,
doc: docString
});
}
setInputString(newValue);
@@ -273,7 +291,20 @@ export const NumberComponent = React.memo((props: NumberComponentProps) => {
const handleBlur = () => {
if (!isInstantUpdate) {
// If not in "instant update" mode, emit an update when the input field loses focus
changeCallback(Number(inputString));
let updatedValue: number | Record<string, unknown> = Number(inputString);
if (type === 'Quantity') {
updatedValue = {
magnitude: Number(inputString),
unit: unit
};
}
changeCallback({
type: type,
value: updatedValue,
full_access_path: fullAccessPath,
readonly: readOnly,
doc: docString
});
}
};
useEffect(() => {
@@ -297,7 +328,7 @@ export const NumberComponent = React.memo((props: NumberComponentProps) => {
useEffect(() => {
// Set the cursor position after the component re-renders
const inputElement = document.getElementsByName(name)[0] as HTMLInputElement;
const inputElement = document.getElementsByName(id)[0] as HTMLInputElement;
if (inputElement && cursorPosition !== null) {
inputElement.setSelectionRange(cursorPosition, cursorPosition);
}
@@ -319,7 +350,8 @@ export const NumberComponent = React.memo((props: NumberComponentProps) => {
type="text"
value={inputString}
disabled={readOnly}
name={name}
onChange={() => {}}
name={id}
onKeyDown={handleKeyDown}
onBlur={handleBlur}
className={isInstantUpdate && !readOnly ? 'instantUpdate' : ''}

View File

@@ -4,24 +4,19 @@ import { DocStringComponent } from './DocStringComponent';
import { Slider } from '@mui/material';
import { NumberComponent, NumberObject } from './NumberComponent';
import { LevelName } from './NotificationsComponent';
import { SerializedValue } from './GenericComponent';
type SliderComponentProps = {
name: string;
fullAccessPath: string;
min: NumberObject;
max: NumberObject;
parentPath?: string;
value: NumberObject;
readOnly: boolean;
docString: string;
stepSize: NumberObject;
isInstantUpdate: boolean;
addNotification: (message: string, levelname?: LevelName) => void;
changeCallback?: (
value: unknown,
attributeName?: string,
prefix?: string,
callback?: (ack: unknown) => void
) => void;
changeCallback?: (value: SerializedValue, callback?: (ack: unknown) => void) => void;
displayName: string;
id: string;
};
@@ -30,8 +25,7 @@ export const SliderComponent = React.memo((props: SliderComponentProps) => {
const renderCount = useRef(0);
const [open, setOpen] = useState(false);
const {
name,
parentPath,
fullAccessPath,
value,
min,
max,
@@ -43,7 +37,6 @@ export const SliderComponent = React.memo((props: SliderComponentProps) => {
displayName,
id
} = props;
const fullAccessPath = [parentPath, name].filter((element) => element).join('.');
useEffect(() => {
renderCount.current++;
@@ -71,11 +64,26 @@ export const SliderComponent = React.memo((props: SliderComponentProps) => {
if (Array.isArray(newNumber)) {
newNumber = newNumber[0];
}
changeCallback(newNumber, `${name}.value`);
changeCallback({
type: value.type,
value: newNumber,
full_access_path: `${fullAccessPath}.value`,
readonly: value.readonly,
doc: docString
});
};
const handleValueChange = (newValue: number, valueType: string) => {
changeCallback(newValue, `${name}.${valueType}`);
const handleValueChange = (
newValue: number,
name: string,
valueObject: NumberObject
) => {
changeCallback({
type: valueObject.type,
value: newValue,
full_access_path: `${fullAccessPath}.${name}`,
readonly: valueObject.readonly
});
};
const deconstructNumberDict = (
@@ -133,15 +141,14 @@ export const SliderComponent = React.memo((props: SliderComponentProps) => {
<Col xs="3" xl>
<NumberComponent
isInstantUpdate={isInstantUpdate}
parentPath={parentPath}
name={`${name}.value`}
docString=""
fullAccessPath={`${fullAccessPath}.value`}
docString={docString}
readOnly={valueReadOnly}
type="float"
value={valueMagnitude}
unit={valueUnit}
addNotification={() => {}}
changeCallback={(value) => changeCallback(value, name + '.value')}
changeCallback={changeCallback}
id={id + '-value'}
/>
</Col>
@@ -179,7 +186,7 @@ export const SliderComponent = React.memo((props: SliderComponentProps) => {
type="number"
value={minMagnitude}
disabled={minReadOnly}
onChange={(e) => handleValueChange(Number(e.target.value), 'min')}
onChange={(e) => handleValueChange(Number(e.target.value), 'min', min)}
/>
</Col>
@@ -189,7 +196,7 @@ export const SliderComponent = React.memo((props: SliderComponentProps) => {
type="number"
value={maxMagnitude}
disabled={maxReadOnly}
onChange={(e) => handleValueChange(Number(e.target.value), 'max')}
onChange={(e) => handleValueChange(Number(e.target.value), 'max', max)}
/>
</Col>
@@ -199,7 +206,9 @@ export const SliderComponent = React.memo((props: SliderComponentProps) => {
type="number"
value={stepSizeMagnitude}
disabled={stepSizeReadOnly}
onChange={(e) => handleValueChange(Number(e.target.value), 'step_size')}
onChange={(e) =>
handleValueChange(Number(e.target.value), 'step_size', stepSize)
}
/>
</Col>
</Row>


@@ -3,30 +3,25 @@ import { Form, InputGroup } from 'react-bootstrap';
import { DocStringComponent } from './DocStringComponent';
import '../App.css';
import { LevelName } from './NotificationsComponent';
import { SerializedValue } from './GenericComponent';
// TODO: add button functionality
type StringComponentProps = {
name: string;
parentPath?: string;
fullAccessPath: string;
value: string;
readOnly: boolean;
docString: string;
isInstantUpdate: boolean;
addNotification: (message: string, levelname?: LevelName) => void;
changeCallback?: (
value: unknown,
attributeName?: string,
prefix?: string,
callback?: (ack: unknown) => void
) => void;
changeCallback?: (value: SerializedValue, callback?: (ack: unknown) => void) => void;
displayName: string;
id: string;
};
export const StringComponent = React.memo((props: StringComponentProps) => {
const {
name,
fullAccessPath,
readOnly,
docString,
isInstantUpdate,
@@ -38,9 +33,6 @@ export const StringComponent = React.memo((props: StringComponentProps) => {
const renderCount = useRef(0);
const [inputString, setInputString] = useState(props.value);
const fullAccessPath = [props.parentPath, props.name]
.filter((element) => element)
.join('.');
useEffect(() => {
renderCount.current++;
@@ -63,14 +55,26 @@ export const StringComponent = React.memo((props: StringComponentProps) => {
const handleKeyDown = (event) => {
if (event.key === 'Enter' && !isInstantUpdate) {
changeCallback(inputString);
changeCallback({
type: 'str',
value: inputString,
full_access_path: fullAccessPath,
readonly: readOnly,
doc: docString
});
event.preventDefault();
}
};
const handleBlur = () => {
if (!isInstantUpdate) {
changeCallback(inputString);
changeCallback({
type: 'str',
value: inputString,
full_access_path: fullAccessPath,
readonly: readOnly,
doc: docString
});
}
};
@@ -86,7 +90,7 @@ export const StringComponent = React.memo((props: StringComponentProps) => {
</InputGroup.Text>
<Form.Control
type="text"
name={name}
name={id}
value={inputString}
disabled={readOnly}
onChange={handleChange}


@@ -1,4 +1,6 @@
import { io } from 'socket.io-client';
import { SerializedValue } from './components/GenericComponent';
import { serializeDict, serializeList } from './utils/serializationUtils';
export const hostname =
process.env.NODE_ENV === 'development' ? `localhost` : window.location.hostname;
@@ -9,28 +11,44 @@ console.debug('Websocket: ', URL);
export const socket = io(URL, { path: '/ws/socket.io', transports: ['websocket'] });
export const setAttribute = (
name: string,
parentPath: string,
value: unknown,
export const updateValue = (
serializedObject: SerializedValue,
callback?: (ack: unknown) => void
) => {
if (callback) {
socket.emit('set_attribute', { name, parent_path: parentPath, value }, callback);
socket.emit(
'update_value',
{ access_path: serializedObject['full_access_path'], value: serializedObject },
callback
);
} else {
socket.emit('set_attribute', { name, parent_path: parentPath, value });
socket.emit('update_value', {
access_path: serializedObject['full_access_path'],
value: serializedObject
});
}
};
export const runMethod = (
name: string,
parentPath: string,
kwargs: Record<string, unknown>,
accessPath: string,
args: unknown[] = [],
kwargs: Record<string, unknown> = {},
callback?: (ack: unknown) => void
) => {
const serializedArgs = serializeList(args);
const serializedKwargs = serializeDict(kwargs);
if (callback) {
socket.emit('run_method', { name, parent_path: parentPath, kwargs }, callback);
socket.emit(
'trigger_method',
{ access_path: accessPath, args: serializedArgs, kwargs: serializedKwargs },
callback
);
} else {
socket.emit('run_method', { name, parent_path: parentPath, kwargs });
socket.emit('trigger_method', {
access_path: accessPath,
args: serializedArgs,
kwargs: serializedKwargs
});
}
};
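The rewritten socket helpers replace the `set_attribute` and `run_method` events with `update_value` and `trigger_method`, both of which carry serialized objects. A minimal sketch of what a Python client could emit for these events, assuming a running service with an attribute `my_number` and a method `my_method` (both names hypothetical):

```python
import socketio  # python-socketio

sio = socketio.Client()
sio.connect(
    "ws://localhost:8001", socketio_path="/ws/socket.io", transports=["websocket"]
)

# Set a value: the payload wraps the new value in the serialization format.
sio.emit(
    "update_value",
    {
        "access_path": "my_number",
        "value": {
            "full_access_path": "my_number",
            "type": "float",
            "value": 3.14,
            "readonly": False,
            "doc": None,
        },
    },
)

# Trigger a method: args and kwargs are sent as serialized list/dict objects.
empty_args = {"full_access_path": "", "type": "list", "value": [], "readonly": False, "doc": None}
empty_kwargs = {"full_access_path": "", "type": "dict", "value": {}, "readonly": False, "doc": None}
sio.call("trigger_method", {"access_path": "my_method", "args": empty_args, "kwargs": empty_kwargs})
```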


@@ -0,0 +1,101 @@
const serializePrimitive = (
obj: number | boolean | string | null,
accessPath: string
) => {
let type: string;
if (typeof obj === 'number') {
type = Number.isInteger(obj) ? 'int' : 'float';
return {
full_access_path: accessPath,
doc: null,
readonly: false,
type,
value: obj
};
} else if (typeof obj === 'boolean') {
type = 'bool';
return {
full_access_path: accessPath,
doc: null,
readonly: false,
type,
value: obj
};
} else if (typeof obj === 'string') {
type = 'str';
return {
full_access_path: accessPath,
doc: null,
readonly: false,
type,
value: obj
};
} else if (obj === null) {
type = 'NoneType';
return {
full_access_path: accessPath,
doc: null,
readonly: false,
type,
value: null
};
} else {
throw new Error('Unsupported type for serialization');
}
};
export const serializeList = (obj: unknown[], accessPath: string = '') => {
const doc = null;
const value = obj.map((item, index) => {
if (
typeof item === 'number' ||
typeof item === 'boolean' ||
typeof item === 'string' ||
item === null
) {
// Return the serialized item so the mapped array is actually populated
return serializePrimitive(
item as number | boolean | string | null,
`${accessPath}[${index}]`
);
}
});
return {
full_access_path: accessPath,
type: 'list',
value,
readonly: false,
doc
};
};
export const serializeDict = (
obj: Record<string, unknown>,
accessPath: string = ''
) => {
const doc = null;
const value = Object.entries(obj).reduce((acc, [key, val]) => {
// Construct the new access path for nested properties
const newPath = `${accessPath}["${key}"]`;
// Serialize each value in the dictionary and assign to the accumulator
if (
typeof val === 'number' ||
typeof val === 'boolean' ||
typeof val === 'string' ||
val === null
) {
acc[key] = serializePrimitive(val as number | boolean | string | null, newPath);
}
return acc;
}, {});
return {
full_access_path: accessPath,
type: 'dict',
value,
readonly: false,
doc
};
};
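The new serialization helpers mirror the format produced by `pydase.utils.serialization.serializer.dump` on the Python side. As a rough sketch (not part of the diff), serializing a small dictionary is expected to yield nested objects of the same shape, with bracketed keys appended to the access path:

```python
from pydase.utils.serialization.serializer import dump

serialized = dump({"gain": 1.2, "label": "amp"})
# Expected shape (illustrative; exact paths depend on where the dict lives):
# {
#     "full_access_path": "",
#     "type": "dict",
#     "readonly": False,
#     "doc": None,
#     "value": {
#         "gain": {"full_access_path": '["gain"]', "type": "float", "value": 1.2, ...},
#         "label": {"full_access_path": '["label"]', "type": "str", "value": "amp", ...},
#     },
# }
```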


@@ -7,19 +7,129 @@ export type State = {
doc: string | null;
};
/**
* Splits a full access path into its atomic parts, separating attribute names, numeric
* indices (including floating points), and string keys within indices.
*
* @param path The full access path string to be split into components.
* @returns An array of components that make up the path, including attribute names,
* numeric indices, and string keys as separate elements.
*/
export function parseFullAccessPath(path: string): string[] {
// The pattern matches:
// \w+ - Words
// \[\d+\.\d+\] - Floating point numbers inside brackets
// \[\d+\] - Integers inside brackets
// \["[^"]*"\] - Double-quoted strings inside brackets
// \['[^']*'\] - Single-quoted strings inside brackets
const pattern = /\w+|\[\d+\.\d+\]|\[\d+\]|\["[^"]*"\]|\['[^']*'\]/g;
const matches = path.match(pattern);
return matches ?? []; // Return an empty array if no matches found
}
/**
* Parse a serialized key and convert it to an appropriate type (number or string).
*
* @param serializedKey The serialized key, which might be enclosed in brackets and quotes.
* @returns The processed key as a number or an unquoted string.
*
* Examples:
* console.log(parseSerializedKey("attr_name")); // Outputs: attr_name (string)
* console.log(parseSerializedKey("[123]")); // Outputs: 123 (number)
* console.log(parseSerializedKey("[12.3]")); // Outputs: 12.3 (number)
* console.log(parseSerializedKey("['hello']")); // Outputs: hello (string)
* console.log(parseSerializedKey('["12.34"]')); // Outputs: "12.34" (string)
* console.log(parseSerializedKey('["complex"]'));// Outputs: "complex" (string)
*/
function parseSerializedKey(serializedKey: string): string | number {
// Strip outer brackets if present
if (serializedKey.startsWith('[') && serializedKey.endsWith(']')) {
serializedKey = serializedKey.slice(1, -1);
}
// Strip quotes if the resulting string is quoted
if (
(serializedKey.startsWith("'") && serializedKey.endsWith("'")) ||
(serializedKey.startsWith('"') && serializedKey.endsWith('"'))
) {
return serializedKey.slice(1, -1);
}
// Try converting to a number if the string is not quoted
const parsedNumber = parseFloat(serializedKey);
if (!isNaN(parsedNumber)) {
return parsedNumber;
}
// Return the original string if it's not a valid number
return serializedKey;
}
function getOrCreateItemInContainer(
container: Record<string | number, SerializedValue> | SerializedValue[],
key: string | number,
allowAddKey: boolean
): SerializedValue {
// Check if the key exists and return the item if it does
if (key in container) {
return container[key];
}
// Handling the case where the key does not exist
if (Array.isArray(container)) {
// Handling arrays
if (allowAddKey && key === container.length) {
container.push(createEmptySerializedObject());
return container[key];
}
throw new Error(`Index out of bounds: ${key}`);
} else {
// Handling objects
if (allowAddKey) {
container[key] = createEmptySerializedObject();
return container[key];
}
throw new Error(`Key not found: ${key}`);
}
}
/**
* Retrieve an item from a container specified by the passed key. Add an item to the
* container if allowAppend is set to True.
*
* @param container Either a dictionary or list of serialized objects.
* @param key The key name or index (as a string) representing the attribute in the container.
* @param allowAppend Whether to allow appending a new entry if the specified index is out of range by exactly one position.
* @returns The serialized object corresponding to the specified key.
* @throws SerializationPathError If the key is invalid or leads to an access error without append permissions.
* @throws SerializationValueError If the expected structure is incorrect.
*/
function getContainerItemByKey(
container: Record<string, SerializedValue> | SerializedValue[],
key: string,
allowAppend: boolean = false
): SerializedValue {
const processedKey = parseSerializedKey(key);
try {
return getOrCreateItemInContainer(container, processedKey, allowAppend);
} catch (error) {
if (error instanceof RangeError) {
throw new Error(`Index '${processedKey}': ${error.message}`);
} else if (error instanceof Error) {
throw new Error(`Key '${processedKey}': ${error.message}`);
}
throw error; // Re-throw if it's not a known error type
}
}
export function setNestedValueByPath(
serializationDict: Record<string, SerializedValue>,
path: string,
serializedValue: SerializedValue
): Record<string, SerializedValue> {
const parentPathParts = path.split('.').slice(0, -1);
const attrName = path.split('.').pop();
if (!attrName) {
throw new Error('Invalid path');
}
let currentSerializedValue: SerializedValue;
const pathParts = parseFullAccessPath(path);
const newSerializationDict: Record<string, SerializedValue> = JSON.parse(
JSON.stringify(serializationDict)
);
@@ -27,81 +137,36 @@ export function setNestedValueByPath(
let currentDict = newSerializationDict;
try {
for (const pathPart of parentPathParts) {
currentSerializedValue = getNextLevelDictByKey(currentDict, pathPart, false);
// @ts-expect-error The value will be of type SerializedValue as we are still
// looping through the parent parts
currentDict = currentSerializedValue['value'];
for (let i = 0; i < pathParts.length - 1; i++) {
const pathPart = pathParts[i];
const nextLevelSerializedObject = getContainerItemByKey(
currentDict,
pathPart,
false
);
currentDict = nextLevelSerializedObject['value'] as Record<
string,
SerializedValue
>;
}
currentSerializedValue = getNextLevelDictByKey(currentDict, attrName, true);
const finalPart = pathParts[pathParts.length - 1];
const finalObject = getContainerItemByKey(currentDict, finalPart, true);
Object.assign(finalObject, serializedValue);
Object.assign(currentSerializedValue, serializedValue);
return newSerializationDict;
} catch (error) {
console.error(error);
return currentDict;
console.error(`Error occurred trying to change ${path}: ${error}`);
}
}
function getNextLevelDictByKey(
serializationDict: Record<string, SerializedValue>,
attrName: string,
allowAppend: boolean = false
): SerializedValue {
const [key, index] = parseListAttrAndIndex(attrName);
let currentDict: SerializedValue;
try {
if (index !== null) {
if (!serializationDict[key] || !Array.isArray(serializationDict[key]['value'])) {
throw new Error(`Expected an array at '${key}', but found something else.`);
}
if (index < serializationDict[key]['value'].length) {
currentDict = serializationDict[key]['value'][index];
} else if (allowAppend && index === serializationDict[key]['value'].length) {
// Appending to list
// @ts-expect-error When the index is not null, I expect an array
serializationDict[key]['value'].push({});
currentDict = serializationDict[key]['value'][index];
} else {
throw new Error(`Index out of range for '${key}[${index}]'.`);
}
} else {
if (!serializationDict[key]) {
throw new Error(`Key '${key}' not found.`);
}
currentDict = serializationDict[key];
}
} catch (error) {
throw new Error(`Error occurred trying to access '${attrName}': ${error}`);
}
if (typeof currentDict !== 'object' || currentDict === null) {
throw new Error(
`Expected a dictionary at '${attrName}', but found type '${typeof currentDict}' instead.`
);
}
return currentDict;
}
function parseListAttrAndIndex(attrString: string): [string, number | null] {
let index: number | null = null;
let attrName = attrString;
if (attrString.includes('[') && attrString.endsWith(']')) {
const parts = attrString.split('[');
attrName = parts[0];
const indexPart = parts[1].slice(0, -1); // Removes the closing ']'
if (!isNaN(parseInt(indexPart))) {
index = parseInt(indexPart);
} else {
console.error(`Invalid index format in key: ${attrString}`);
}
}
return [attrName, index];
function createEmptySerializedObject(): SerializedValue {
return {
full_access_path: '',
value: undefined,
type: 'None',
doc: null,
readonly: false
};
}
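These frontend helpers mirror `parse_full_access_path` and `parse_serialized_key` from `pydase.utils.helpers` (imported by the state manager further down). A small sketch of how a full access path is expected to be split; the attribute names are invented:

```python
from pydase.utils.helpers import parse_full_access_path, parse_serialized_key

path = 'devices["laser"].channels[0].power'
parts = parse_full_access_path(path)
print(parts)
# expected: ['devices', '["laser"]', 'channels', '[0]', 'power']

print([parse_serialized_key(part) for part in parts])
# expected: ['devices', 'laser', 'channels', 0, 'power']
```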

poetry.lock (generated file, 1294 lines changed; diff suppressed because it is too large)


@@ -1,6 +1,6 @@
[tool.poetry]
name = "pydase"
version = "0.7.4"
version = "0.8.2"
description = "A flexible and robust Python library for creating, managing, and interacting with data services, with built-in support for web and RPC servers, and customizable features for diverse use cases."
authors = ["Mose Mueller <mosmuell@ethz.ch>"]
readme = "README.md"
@@ -9,7 +9,6 @@ packages = [{ include = "pydase", from = "src" }]
[tool.poetry.dependencies]
python = "^3.10"
rpyc = "^5.3.1"
fastapi = "^0.108.0"
uvicorn = "^0.27.0"
toml = "^0.10.2"
@@ -17,6 +16,8 @@ python-socketio = "^5.8.0"
confz = "^2.0.0"
pint = "^0.22"
pillow = "^10.0.0"
websocket-client = "^1.7.0"
aiohttp = "^3.9.3"
[tool.poetry.group.dev]
optional = true


@@ -1,3 +1,4 @@
from pydase.client.client import Client
from pydase.data_service import DataService
from pydase.server import Server
from pydase.utils.logging import setup_logging
@@ -7,4 +8,5 @@ setup_logging()
__all__ = [
"DataService",
"Server",
"Client",
]


@@ -0,0 +1,3 @@
from pydase.client.client import Client
__all__ = ["Client"]

src/pydase/client/client.py (new file, 151 lines)

@@ -0,0 +1,151 @@
import asyncio
import logging
import threading
from typing import TypedDict, cast
import socketio # type: ignore
import pydase.components
from pydase.client.proxy_loader import ProxyClassMixin, ProxyLoader
from pydase.utils.serialization.deserializer import loads
from pydase.utils.serialization.types import SerializedDataService, SerializedObject
logger = logging.getLogger(__name__)
class NotifyDataDict(TypedDict):
full_access_path: str
value: SerializedObject
class NotifyDict(TypedDict):
data: NotifyDataDict
def asyncio_loop_thread(loop: asyncio.AbstractEventLoop) -> None:
asyncio.set_event_loop(loop)
loop.run_forever()
class ProxyClass(ProxyClassMixin, pydase.components.DeviceConnection):
"""
A proxy class that serves as the interface for interacting with device connections
via a socket.io client in an asyncio environment.
Args:
sio_client (socketio.AsyncClient):
The socket.io client instance used for asynchronous communication with the
pydase service server.
loop (asyncio.AbstractEventLoop):
The event loop in which the client operations are managed and executed.
This class is used to create a proxy object that behaves like a local representation
of a remote pydase service, facilitating direct interaction as if it were local
while actually communicating over network protocols.
It can also be used as an attribute of a pydase service itself, e.g.
```python
import pydase
class MyService(pydase.DataService):
proxy = pydase.Client(
hostname="...", port=8001, block_until_connected=False
).proxy
if __name__ == "__main__":
service = MyService()
server = pydase.Server(service, web_port=8002).run()
```
"""
def __init__(
self, sio_client: socketio.AsyncClient, loop: asyncio.AbstractEventLoop
) -> None:
super().__init__()
self._initialise(sio_client=sio_client, loop=loop)
class Client:
"""
A client for connecting to a remote pydase service using socket.io. This client
handles asynchronous communication with a service, manages events such as
connection, disconnection, and updates, and ensures that the proxy object is
up-to-date with the server state.
Attributes:
proxy (ProxyClass):
A proxy object representing the remote service, facilitating interaction as
if it were local.
Args:
hostname (str):
Hostname of the exposed service this client attempts to connect to.
Default is "localhost".
port (int):
Port of the exposed service this client attempts to connect on.
Default is 8001.
block_until_connected (bool):
If set to True, the constructor will block until the connection to the
service has been established. This is useful for ensuring the client is
ready to use immediately after instantiation. Default is True.
"""
def __init__(
self,
hostname: str,
port: int,
block_until_connected: bool = True,
):
self._hostname = hostname
self._port = port
self._sio = socketio.AsyncClient()
self._loop = asyncio.new_event_loop()
self.proxy = ProxyClass(sio_client=self._sio, loop=self._loop)
self._thread = threading.Thread(
target=asyncio_loop_thread, args=(self._loop,), daemon=True
)
self._thread.start()
connection_future = asyncio.run_coroutine_threadsafe(
self._connect(), self._loop
)
if block_until_connected:
connection_future.result()
async def _connect(self) -> None:
logger.debug("Connecting to server '%s:%s' ...", self._hostname, self._port)
await self._setup_events()
await self._sio.connect(
f"ws://{self._hostname}:{self._port}",
socketio_path="/ws/socket.io",
transports=["websocket"],
retry=True,
)
async def _setup_events(self) -> None:
self._sio.on("connect", self._handle_connect)
self._sio.on("disconnect", self._handle_disconnect)
self._sio.on("notify", self._handle_update)
async def _handle_connect(self) -> None:
logger.debug("Connected to '%s:%s' ...", self._hostname, self._port)
serialized_object = cast(
SerializedDataService, await self._sio.call("service_serialization")
)
ProxyLoader.update_data_service_proxy(
self.proxy, serialized_object=serialized_object
)
serialized_object["type"] = "DeviceConnection"
self.proxy._notify_changed("", loads(serialized_object))
self.proxy._connected = True
async def _handle_disconnect(self) -> None:
logger.debug("Disconnected from '%s:%s' ...", self._hostname, self._port)
self.proxy._connected = False
async def _handle_update(self, data: NotifyDict) -> None:
self.proxy._notify_changed(
data["data"]["full_access_path"],
loads(data["data"]["value"]),
)
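A minimal usage sketch for the new client, assuming a pydase service is already running on localhost:8001 and exposes an attribute `voltage` and a method `reset` (both hypothetical names):

```python
import pydase

client = pydase.Client(hostname="localhost", port=8001)

# The proxy behaves like a local instance of the remote service.
print(client.proxy.voltage)  # fetches the current value via "get_value"
client.proxy.voltage = 1.5   # emits an "update_value" event
client.proxy.reset()         # emits a "trigger_method" event
```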


@@ -0,0 +1,409 @@
import asyncio
import logging
from collections.abc import Iterable
from copy import copy
from typing import TYPE_CHECKING, Any, cast
import socketio # type: ignore
from typing_extensions import SupportsIndex
from pydase.utils.serialization.deserializer import Deserializer, loads
from pydase.utils.serialization.serializer import dump
from pydase.utils.serialization.types import SerializedObject
if TYPE_CHECKING:
from collections.abc import Callable
logger = logging.getLogger(__name__)
class ProxyAttributeError(Exception): ...
def trigger_method(
sio_client: socketio.AsyncClient,
loop: asyncio.AbstractEventLoop,
access_path: str,
args: list[Any],
kwargs: dict[str, Any],
) -> Any:
async def async_trigger_method() -> Any:
return await sio_client.call(
"trigger_method",
{
"access_path": access_path,
"args": dump(args),
"kwargs": dump(kwargs),
},
)
result: SerializedObject | None = asyncio.run_coroutine_threadsafe(
async_trigger_method(),
loop=loop,
).result()
if result is not None:
return ProxyLoader.loads_proxy(
serialized_object=result, sio_client=sio_client, loop=loop
)
return None
def update_value(
sio_client: socketio.AsyncClient,
loop: asyncio.AbstractEventLoop,
access_path: str,
value: Any,
) -> Any:
async def set_result() -> Any:
return await sio_client.call(
"update_value",
{
"access_path": access_path,
"value": dump(value),
},
)
result: SerializedObject | None = asyncio.run_coroutine_threadsafe(
set_result(),
loop=loop,
).result()
if result is not None:
ProxyLoader.loads_proxy(
serialized_object=result, sio_client=sio_client, loop=loop
)
class ProxyDict(dict[str, Any]):
def __init__(
self,
original_dict: dict[str, Any],
parent_path: str,
sio_client: socketio.AsyncClient,
loop: asyncio.AbstractEventLoop,
) -> None:
super().__init__(original_dict)
self._parent_path = parent_path
self._loop = loop
self._sio = sio_client
def __setitem__(self, key: str, value: Any) -> None:
observer_key = key
if isinstance(key, str):
observer_key = f'"{key}"'
full_access_path = f"{self._parent_path}[{observer_key}]"
update_value(self._sio, self._loop, full_access_path, value)
def pop(self, key: str) -> Any: # type: ignore
"""Removes the element from the dictionary on the server. It does not return
any proxy as the corresponding object on the server does not live anymore."""
full_access_path = f"{self._parent_path}.pop"
trigger_method(self._sio, self._loop, full_access_path, [key], {})
class ProxyList(list[Any]):
def __init__(
self,
original_list: list[Any],
parent_path: str,
sio_client: socketio.AsyncClient,
loop: asyncio.AbstractEventLoop,
) -> None:
super().__init__(original_list)
self._parent_path = parent_path
self._loop = loop
self._sio = sio_client
def __setitem__(self, key: int, value: Any) -> None: # type: ignore[override]
full_access_path = f"{self._parent_path}[{key}]"
update_value(self._sio, self._loop, full_access_path, value)
def append(self, __object: Any) -> None:
full_access_path = f"{self._parent_path}.append"
trigger_method(self._sio, self._loop, full_access_path, [__object], {})
def clear(self) -> None:
full_access_path = f"{self._parent_path}.clear"
trigger_method(self._sio, self._loop, full_access_path, [], {})
def extend(self, __iterable: Iterable[Any]) -> None:
full_access_path = f"{self._parent_path}.extend"
trigger_method(self._sio, self._loop, full_access_path, [__iterable], {})
def insert(self, __index: SupportsIndex, __object: Any) -> None:
full_access_path = f"{self._parent_path}.insert"
trigger_method(self._sio, self._loop, full_access_path, [__index, __object], {})
def pop(self, __index: SupportsIndex = -1) -> Any:
full_access_path = f"{self._parent_path}.pop"
return trigger_method(self._sio, self._loop, full_access_path, [__index], {})
def remove(self, __value: Any) -> None:
full_access_path = f"{self._parent_path}.remove"
trigger_method(self._sio, self._loop, full_access_path, [__value], {})
class ProxyClassMixin:
def __init__(self) -> None:
# declare before DataService init to avoid warning messages
self._observers: dict[str, Any] = {}
self._proxy_getters: dict[str, Callable[..., Any]] = {}
self._proxy_setters: dict[str, Callable[..., Any]] = {}
self._proxy_methods: dict[str, Callable[..., Any]] = {}
def _initialise(
self,
sio_client: socketio.AsyncClient,
loop: asyncio.AbstractEventLoop,
) -> None:
self._loop = loop
self._sio = sio_client
def __dir__(self) -> list[str]:
"""Used to provide tab completion on CLI / notebook"""
static_dir = super().__dir__()
return sorted({*static_dir, *self._proxy_getters, *self._proxy_methods.keys()})
def __getattribute__(self, name: str) -> Any:
try:
if name in super().__getattribute__("_proxy_getters"):
return super().__getattribute__("_proxy_getters")[name]()
if name in super().__getattribute__("_proxy_methods"):
return super().__getattribute__("_proxy_methods")[name]
except AttributeError:
pass
return super().__getattribute__(name)
def __setattr__(self, name: str, value: Any) -> None:
try:
if name in super().__getattribute__("_proxy_setters"):
return super().__getattribute__("_proxy_setters")[name](value)
if name in super().__getattribute__("_proxy_getters"):
raise ProxyAttributeError(
f"Proxy attribute {name!r} of {type(self).__name__!r} is readonly!"
)
except AttributeError:
pass
return super().__setattr__(name, value)
def _handle_serialized_method(
self, attr_name: str, serialized_object: SerializedObject
) -> None:
def add_prefix_to_last_path_element(s: str, prefix: str) -> str:
parts = s.split(".")
parts[-1] = f"{prefix}_{parts[-1]}"
return ".".join(parts)
if serialized_object["type"] == "method":
if serialized_object["async"] is True:
start_method = copy(serialized_object)
start_method["full_access_path"] = add_prefix_to_last_path_element(
start_method["full_access_path"], "start"
)
stop_method = copy(serialized_object)
stop_method["full_access_path"] = add_prefix_to_last_path_element(
stop_method["full_access_path"], "stop"
)
self._add_method_proxy(f"start_{attr_name}", start_method)
self._add_method_proxy(f"stop_{attr_name}", stop_method)
else:
self._add_method_proxy(attr_name, serialized_object)
def _add_method_proxy(
self, attr_name: str, serialized_object: SerializedObject
) -> None:
def method_proxy(*args: Any, **kwargs: Any) -> Any:
return trigger_method(
self._sio,
self._loop,
serialized_object["full_access_path"],
list(args),
kwargs,
)
dict.__setitem__(self._proxy_methods, attr_name, method_proxy)
def _add_attr_proxy(
self, attr_name: str, serialized_object: SerializedObject
) -> None:
self._add_getattr_proxy(attr_name, serialized_object=serialized_object)
if not serialized_object["readonly"]:
self._add_setattr_proxy(attr_name, serialized_object=serialized_object)
def _add_setattr_proxy(
self, attr_name: str, serialized_object: SerializedObject
) -> None:
self._add_getattr_proxy(attr_name, serialized_object=serialized_object)
if not serialized_object["readonly"]:
def setter_proxy(value: Any) -> None:
update_value(
self._sio, self._loop, serialized_object["full_access_path"], value
)
dict.__setitem__(self._proxy_setters, attr_name, setter_proxy) # type: ignore
def _add_getattr_proxy(
self, attr_name: str, serialized_object: SerializedObject
) -> None:
def getter_proxy() -> Any:
async def get_result() -> Any:
return await self._sio.call(
"get_value", serialized_object["full_access_path"]
)
result = asyncio.run_coroutine_threadsafe(
get_result(),
loop=self._loop,
).result()
return ProxyLoader.loads_proxy(result, self._sio, self._loop)
dict.__setitem__(self._proxy_getters, attr_name, getter_proxy) # type: ignore
class ProxyLoader:
@staticmethod
def load_list_proxy(
serialized_object: SerializedObject,
sio_client: socketio.AsyncClient,
loop: asyncio.AbstractEventLoop,
) -> Any:
return ProxyList(
[
ProxyLoader.loads_proxy(item, sio_client, loop)
for item in cast(list[SerializedObject], serialized_object["value"])
],
parent_path=serialized_object["full_access_path"],
sio_client=sio_client,
loop=loop,
)
@staticmethod
def load_dict_proxy(
serialized_object: SerializedObject,
sio_client: socketio.AsyncClient,
loop: asyncio.AbstractEventLoop,
) -> Any:
return ProxyDict(
{
key: ProxyLoader.loads_proxy(value, sio_client, loop)
for key, value in cast(
dict[str, SerializedObject], serialized_object["value"]
).items()
},
parent_path=serialized_object["full_access_path"],
sio_client=sio_client,
loop=loop,
)
@staticmethod
def update_data_service_proxy(
proxy_class: ProxyClassMixin,
serialized_object: SerializedObject,
) -> Any:
proxy_class._proxy_getters.clear()
proxy_class._proxy_setters.clear()
proxy_class._proxy_methods.clear()
for key, value in cast(
dict[str, SerializedObject], serialized_object["value"]
).items():
type_handler: dict[str | None, None | Callable[..., Any]] = {
None: None,
"int": proxy_class._add_attr_proxy,
"float": proxy_class._add_attr_proxy,
"bool": proxy_class._add_attr_proxy,
"str": proxy_class._add_attr_proxy,
"NoneType": proxy_class._add_attr_proxy,
"Quantity": proxy_class._add_attr_proxy,
"Enum": proxy_class._add_attr_proxy,
"ColouredEnum": proxy_class._add_attr_proxy,
"method": proxy_class._handle_serialized_method,
"list": proxy_class._add_getattr_proxy,
"dict": proxy_class._add_getattr_proxy,
}
# First go through handled types (as ColouredEnum is also within the
# components)
handler = type_handler.get(value["type"])
if handler:
handler(key, value)
else:
proxy_class._add_getattr_proxy(key, value)
@staticmethod
def load_data_service_proxy(
serialized_object: SerializedObject,
sio_client: socketio.AsyncClient,
loop: asyncio.AbstractEventLoop,
) -> Any:
# Custom types like Components or DataService classes
component_class = cast(
type, Deserializer.get_component_class(serialized_object["type"])
)
class_bases = (
ProxyClassMixin,
component_class,
)
proxy_base_class: type[ProxyClassMixin] = type(
serialized_object["name"], # type: ignore
class_bases,
{},
)
proxy_class_instance = proxy_base_class()
proxy_class_instance._initialise(sio_client=sio_client, loop=loop)
ProxyLoader.update_data_service_proxy(
proxy_class=proxy_class_instance, serialized_object=serialized_object
)
return proxy_class_instance
@staticmethod
def load_default(
serialized_object: SerializedObject,
sio_client: socketio.AsyncClient,
loop: asyncio.AbstractEventLoop,
) -> Any:
return loads(serialized_object)
@staticmethod
def loads_proxy(
serialized_object: SerializedObject,
sio_client: socketio.AsyncClient,
loop: asyncio.AbstractEventLoop,
) -> Any:
type_handler: dict[str | None, None | Callable[..., Any]] = {
"int": ProxyLoader.load_default,
"float": ProxyLoader.load_default,
"bool": ProxyLoader.load_default,
"str": ProxyLoader.load_default,
"NoneType": ProxyLoader.load_default,
"Quantity": ProxyLoader.load_default,
"Enum": ProxyLoader.load_default,
"ColouredEnum": ProxyLoader.load_default,
"Exception": ProxyLoader.load_default,
"list": ProxyLoader.load_list_proxy,
"dict": ProxyLoader.load_dict_proxy,
}
# First go through handled types (as ColouredEnum is also within the components)
handler = type_handler.get(serialized_object["type"])
if handler:
return handler(
serialized_object=serialized_object, sio_client=sio_client, loop=loop
)
return ProxyLoader.load_data_service_proxy(
serialized_object=serialized_object, sio_client=sio_client, loop=loop
)
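Proxied containers forward their mutations to the server instead of changing local state. A short sketch, assuming the remote service exposes a dictionary `dict_attr` and a list `list_attr` (hypothetical names):

```python
import pydase

client = pydase.Client(hostname="localhost", port=8001)

client.proxy.dict_attr["gain"] = 2.0  # update_value on 'dict_attr["gain"]'
client.proxy.dict_attr.pop("gain")    # trigger_method on 'dict_attr.pop'; returns None

client.proxy.list_attr.append(42)     # trigger_method on 'list_attr.append'
client.proxy.list_attr[0] = 43        # update_value on 'list_attr[0]'
value = client.proxy.list_attr.pop()  # trigger_method on 'list_attr.pop'; returns the popped value
```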


@@ -56,4 +56,9 @@ class ColouredEnum(Enum):
my_service = StatusExample()
my_service.status = MyStatus.FAILED
```
Note
----
Each enumeration name and value must be unique. This means that you should use
different colour formats when you want to use a colour multiple times.
"""


@@ -1,9 +1,9 @@
import asyncio
import pydase
import pydase.data_service
class DeviceConnection(pydase.DataService):
class DeviceConnection(pydase.data_service.DataService):
"""
Base class for device connection management within the pydase framework.


@@ -13,7 +13,6 @@ class OperationMode(BaseConfig): # type: ignore[misc]
class ServiceConfig(BaseConfig): # type: ignore[misc]
config_dir: Path = Path("config")
web_port: int = 8001
rpc_port: int = 18871
CONFIG_SOURCES = EnvSource(allow_all=True, prefix="SERVICE_", file=".env")
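With the RPC server removed, only the web port remains configurable here. Given the `SERVICE_` prefix of the environment source above, the port should be overridable through an environment variable or the `.env` file; a hedged sketch, assuming the module path `pydase.config`:

```python
import os

os.environ["SERVICE_WEB_PORT"] = "8080"  # assumed mapping: SERVICE_ prefix + field name

from pydase.config import ServiceConfig

print(ServiceConfig().web_port)  # expected: 8080
```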


@@ -1,9 +1,7 @@
import inspect
import logging
from enum import Enum
from typing import Any, get_type_hints
import rpyc # type: ignore[import-untyped]
from typing import Any
import pydase.units as u
from pydase.data_service.abstract_data_service import AbstractDataService
@@ -12,11 +10,10 @@ from pydase.observer_pattern.observable.observable import (
Observable,
)
from pydase.utils.helpers import (
convert_arguments_to_hinted_types,
get_class_and_instance_attributes,
is_property_attribute,
)
from pydase.utils.serializer import (
from pydase.utils.serialization.serializer import (
SerializedObject,
Serializer,
)
@@ -24,19 +21,8 @@ from pydase.utils.serializer import (
logger = logging.getLogger(__name__)
def process_callable_attribute(attr: Any, args: dict[str, Any]) -> Any:
converted_args_or_error_msg = convert_arguments_to_hinted_types(
args, get_type_hints(attr)
)
return (
attr(**converted_args_or_error_msg)
if not isinstance(converted_args_or_error_msg, str)
else converted_args_or_error_msg
)
class DataService(rpyc.Service, AbstractDataService):
def __init__(self, **kwargs: Any) -> None:
class DataService(AbstractDataService):
def __init__(self) -> None:
super().__init__()
self._task_manager = TaskManager(self)
@@ -87,7 +73,7 @@ class DataService(rpyc.Service, AbstractDataService):
if not issubclass(
value_class,
(int | float | bool | str | list | Enum | u.Quantity | Observable),
(int | float | bool | str | list | dict | Enum | u.Quantity | Observable),
):
logger.warning(
"Class '%s' does not inherit from DataService. This may lead to"
@@ -106,26 +92,6 @@ class DataService(rpyc.Service, AbstractDataService):
):
self.__warn_if_not_observable(attr_value)
def _rpyc_getattr(self, name: str) -> Any:
if name.startswith("_"):
# disallow special and private attributes
raise AttributeError("cannot access private/special names")
# allow all other attributes
return getattr(self, name)
def _rpyc_setattr(self, name: str, value: Any) -> None:
if name.startswith("_"):
# disallow special and private attributes
raise AttributeError("cannot access private/special names")
# check if the attribute has a setter method
attr = getattr(self, name, None)
if isinstance(attr, property) and attr.fset is None:
raise AttributeError(f"{name} attribute does not have a setter method")
# allow all other attributes
setattr(self, name, value)
def serialize(self) -> SerializedObject:
"""
Serializes the instance into a dictionary, preserving the structure of the


@@ -1,7 +1,7 @@
import logging
from typing import TYPE_CHECKING, Any, cast
from pydase.utils.serializer import (
from pydase.utils.serialization.serializer import (
SerializationPathError,
SerializationValueError,
SerializedObject,
@@ -45,8 +45,9 @@ class DataServiceCache:
)
except (SerializationPathError, SerializationValueError, KeyError):
return {
"full_access_path": full_access_path,
"value": None,
"type": None,
"type": "None",
"doc": None,
"readonly": False,
}


@@ -8,8 +8,8 @@ from pydase.observer_pattern.observable.observable_object import ObservableObjec
from pydase.observer_pattern.observer.property_observer import (
PropertyObserver,
)
from pydase.utils.helpers import get_object_attr_from_path_list
from pydase.utils.serializer import SerializedObject, dump
from pydase.utils.helpers import get_object_attr_from_path
from pydase.utils.serialization.serializer import SerializedObject, dump
logger = logging.getLogger(__name__)
@@ -65,23 +65,23 @@ class DataServiceObserver(PropertyObserver):
cached_value_dict: SerializedObject | dict[str, Any],
) -> None:
value_dict = dump(value)
if cached_value_dict != {}:
if (
cached_value_dict["type"] != "method"
and cached_value_dict["type"] != value_dict["type"]
):
logger.warning(
"Type of '%s' changed from '%s' to '%s'. This could have unwanted "
"side effects! Consider setting it to '%s' directly.",
full_access_path,
cached_value_dict["type"],
value_dict["type"],
cached_value_dict["type"],
)
self.state_manager._data_service_cache.update_cache(
if (
cached_value_dict != {}
and cached_value_dict["type"] != "method"
and cached_value_dict["type"] != value_dict["type"]
):
logger.warning(
"Type of '%s' changed from '%s' to '%s'. This could have unwanted "
"side effects! Consider setting it to '%s' directly.",
full_access_path,
value,
cached_value_dict["type"],
value_dict["type"],
cached_value_dict["type"],
)
self.state_manager._data_service_cache.update_cache(
full_access_path,
value,
)
def _notify_dependent_property_changes(self, changed_attr_path: str) -> None:
changed_props = self.property_deps_dict.get(changed_attr_path, [])
@@ -92,7 +92,7 @@ class DataServiceObserver(PropertyObserver):
if prop not in self.changing_attributes:
self._notify_changed(
prop,
get_object_attr_from_path_list(self.observable, prop.split(".")),
get_object_attr_from_path(self.observable, prop),
)
def add_notification_callback(


@@ -5,16 +5,17 @@ from collections.abc import Callable
from pathlib import Path
from typing import TYPE_CHECKING, Any, cast
import pydase.units as u
from pydase.data_service.data_service_cache import DataServiceCache
from pydase.utils.helpers import (
get_object_attr_from_path_list,
get_object_by_path_parts,
is_property_attribute,
parse_list_attr_and_index,
parse_full_access_path,
parse_serialized_key,
)
from pydase.utils.serializer import (
from pydase.utils.serialization.deserializer import loads
from pydase.utils.serialization.serializer import (
SerializationPathError,
SerializedObject,
dump,
generate_serialized_data_paths,
get_nested_dict_by_path,
serialized_dict_is_nested_object,
@@ -154,24 +155,26 @@ class StateManager:
return
for path in generate_serialized_data_paths(json_dict):
nested_json_dict = get_nested_dict_by_path(json_dict, path)
nested_class_dict = self._data_service_cache.get_value_dict_from_cache(path)
value, value_type = nested_json_dict["value"], nested_json_dict["type"]
class_attr_value_type = nested_class_dict.get("type", None)
if class_attr_value_type == value_type:
if self.__is_loadable_state_attribute(path):
self.set_service_attribute_value_by_path(path, value)
else:
logger.info(
"Attribute type of '%s' changed from '%s' to "
"'%s'. Ignoring value from JSON file...",
path,
value_type,
class_attr_value_type,
if self.__is_loadable_state_attribute(path):
nested_json_dict = get_nested_dict_by_path(json_dict, path)
nested_class_dict = self._data_service_cache.get_value_dict_from_cache(
path
)
value_type = nested_json_dict["type"]
class_attr_value_type = nested_class_dict.get("type", None)
if class_attr_value_type == value_type:
self.set_service_attribute_value_by_path(path, nested_json_dict)
else:
logger.info(
"Attribute type of '%s' changed from '%s' to "
"'%s'. Ignoring value from JSON file...",
path,
value_type,
class_attr_value_type,
)
def _get_state_dict_from_json_file(self) -> dict[str, Any]:
if self.filename is not None and os.path.exists(self.filename):
with open(self.filename) as f:
@@ -183,7 +186,7 @@ class StateManager:
def set_service_attribute_value_by_path(
self,
path: str,
value: Any,
serialized_value: SerializedObject,
) -> None:
"""
Sets the value of an attribute in the service managed by the `StateManager`
@@ -206,57 +209,60 @@ class StateManager:
logger.debug("Attribute '%s' is read-only. Ignoring new value...", path)
return
converted_value = self.__convert_value_if_needed(value, current_value_dict)
if "full_access_path" not in serialized_value:
# Backwards compatibility for JSON files not containing the
# full_access_path
logger.warning(
"The format of your JSON file is out-of-date. This might lead "
"to unexpected errors. Please consider updating it."
)
serialized_value["full_access_path"] = current_value_dict[
"full_access_path"
]
# only set value when it has changed
if self.__attr_value_has_changed(converted_value, current_value_dict["value"]):
self.__update_attribute_by_path(path, converted_value)
if self.__attr_value_has_changed(serialized_value, current_value_dict):
self.__update_attribute_by_path(path, serialized_value)
else:
logger.debug("Value of attribute '%s' has not changed...", path)
def __attr_value_has_changed(self, value_object: Any, current_value: Any) -> bool:
"""Check if the serialized value of `value_object` differs from `current_value`.
def __attr_value_has_changed(
self, serialized_new_value: Any, serialized_current_value: Any
) -> bool:
return not (
serialized_new_value["type"] == serialized_current_value["type"]
and serialized_new_value["value"] == serialized_current_value["value"]
)
The method serializes `value_object` to compare it, which is mainly
necessary for handling Quantity objects.
"""
return dump(value_object)["value"] != current_value
def __convert_value_if_needed(
self, value: Any, current_value_dict: SerializedObject
) -> Any:
if current_value_dict["type"] == "Quantity":
return u.convert_to_quantity(
value, cast(dict[str, Any], current_value_dict["value"])["unit"]
)
if current_value_dict["type"] == "float" and not isinstance(value, float):
return float(value)
return value
def __update_attribute_by_path(self, path: str, value: Any) -> None:
parent_path_list, attr_name = path.split(".")[:-1], path.split(".")[-1]
# If attr_name corresponds to a list entry, extract the attr_name and the
# index
attr_name, index = parse_list_attr_and_index(attr_name)
# Update path to reflect the attribute without list indices
path = ".".join([*parent_path_list, attr_name])
def __update_attribute_by_path(
self, path: str, serialized_value: SerializedObject
) -> None:
path_parts = parse_full_access_path(path)
target_obj = get_object_by_path_parts(self.service, path_parts[:-1])
attr_cache_type = get_nested_dict_by_path(self.cache_value, path)["type"]
# Traverse the object according to the path parts
target_obj = get_object_attr_from_path_list(self.service, parent_path_list)
# De-serialize the value
if attr_cache_type in ("ColouredEnum", "Enum"):
enum_attr = get_object_attr_from_path_list(target_obj, [attr_name])
setattr(target_obj, attr_name, enum_attr.__class__[value])
elif attr_cache_type == "list":
list_obj = get_object_attr_from_path_list(target_obj, [attr_name])
list_obj[index] = value
enum_attr = get_object_by_path_parts(target_obj, [path_parts[-1]])
# take the value of the existing enum class
if serialized_value["type"] in ("ColouredEnum", "Enum"):
try:
value = enum_attr.__class__[serialized_value["value"]]
except KeyError:
# This error will arise when setting an enum from another enum class
# In this case, we resort to loading the enum and setting it
# directly
value = loads(serialized_value)
else:
setattr(target_obj, attr_name, value)
value = loads(serialized_value)
# set the value
if isinstance(target_obj, list | dict):
processed_key = parse_serialized_key(path_parts[-1])
target_obj[processed_key] = value # type: ignore
else:
setattr(target_obj, path_parts[-1], value)
def __is_loadable_state_attribute(self, full_access_path: str) -> bool:
"""Checks if an attribute defined by a dot-separated path should be loaded from
@@ -266,28 +272,34 @@ class StateManager:
attributes default to being loadable.
"""
parent_object = get_object_attr_from_path_list(
self.service, full_access_path.split(".")[:-1]
)
attr_name = full_access_path.split(".")[-1]
path_parts = parse_full_access_path(full_access_path)
parent_object = get_object_by_path_parts(self.service, path_parts[:-1])
if is_property_attribute(parent_object, attr_name):
prop = getattr(type(parent_object), attr_name)
if is_property_attribute(parent_object, path_parts[-1]):
prop = getattr(type(parent_object), path_parts[-1])
has_decorator = has_load_state_decorator(prop)
if not has_decorator:
logger.debug(
"Property '%s' has no '@load_state' decorator. "
"Ignoring value from JSON file...",
attr_name,
path_parts[-1],
)
return has_decorator
cached_serialization_dict = get_nested_dict_by_path(
self.cache_value, full_access_path
)
try:
cached_serialization_dict = get_nested_dict_by_path(
self.cache_value, full_access_path
)
if cached_serialization_dict["value"] == "method":
if cached_serialization_dict["value"] == "method":
return False
# nested objects cannot be loaded
return not serialized_dict_is_nested_object(cached_serialization_dict)
except SerializationPathError:
logger.debug(
"Path %a could not be loaded. It does not correspond to an attribute of"
" the class. Ignoring value from JSON file...",
path_parts[-1],
)
return False
# nested objects cannot be loaded
return not serialized_dict_is_nested_object(cached_serialization_dict)
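The state manager now expects every entry of the persisted JSON file to be a full serialized object carrying its own `full_access_path`; files without that key only produce a warning for backwards compatibility. An illustrative sketch of one such entry, written as a Python literal with a made-up attribute name:

```python
# One top-level entry of the persisted state file (illustrative only):
state_entry = {
    "voltage": {
        "full_access_path": "voltage",
        "type": "float",
        "value": 1.5,
        "readonly": False,
        "doc": None,
    }
}
```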


@@ -1,13 +1,13 @@
{
"files": {
"main.css": "/static/css/main.7ef670d5.css",
"main.js": "/static/js/main.97ef73ea.js",
"main.js": "/static/js/main.57f8ec4c.js",
"index.html": "/index.html",
"main.7ef670d5.css.map": "/static/css/main.7ef670d5.css.map",
"main.97ef73ea.js.map": "/static/js/main.97ef73ea.js.map"
"main.57f8ec4c.js.map": "/static/js/main.57f8ec4c.js.map"
},
"entrypoints": [
"static/css/main.7ef670d5.css",
"static/js/main.97ef73ea.js"
"static/js/main.57f8ec4c.js"
]
}


@@ -1 +1 @@
<!doctype html><html lang="en"><head><meta charset="utf-8"/><link rel="icon" href="/favicon.ico"/><meta name="viewport" content="width=device-width,initial-scale=1"/><meta name="theme-color" content="#000000"/><meta name="description" content="Web site displaying a pydase UI."/><link rel="apple-touch-icon" href="/logo192.png"/><link rel="manifest" href="/manifest.json"/><title>pydase App</title><script defer="defer" src="/static/js/main.97ef73ea.js"></script><link href="/static/css/main.7ef670d5.css" rel="stylesheet"></head><body><noscript>You need to enable JavaScript to run this app.</noscript><div id="root"></div></body></html>
<!doctype html><html lang="en"><head><meta charset="utf-8"/><link rel="icon" href="/favicon.ico"/><meta name="viewport" content="width=device-width,initial-scale=1"/><meta name="theme-color" content="#000000"/><meta name="description" content="Web site displaying a pydase UI."/><link rel="apple-touch-icon" href="/logo192.png"/><link rel="manifest" href="/manifest.json"/><title>pydase App</title><script defer="defer" src="/static/js/main.57f8ec4c.js"></script><link href="/static/css/main.7ef670d5.css" rel="stylesheet"></head><body><noscript>You need to enable JavaScript to run this app.</noscript><div id="root"></div></body></html>

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long


@@ -15,6 +15,7 @@ class Observable(ObservableObject):
for k in set(type(self).__dict__)
- set(Observable.__dict__)
- set(self.__dict__)
- {"__annotations__"}
}
for name, value in class_attrs.items():
if isinstance(value, property) or callable(value):
@@ -67,5 +68,9 @@ class Observable(ObservableObject):
self, observer_attr_name: str, instance_attr_name: str
) -> str:
if observer_attr_name != "":
return f"{observer_attr_name}.{instance_attr_name}"
return (
f"{observer_attr_name}.{instance_attr_name}"
if instance_attr_name != ""
else observer_attr_name
)
return instance_attr_name


@@ -3,6 +3,8 @@ from abc import ABC, abstractmethod
from collections.abc import Iterable
from typing import TYPE_CHECKING, Any, ClassVar, SupportsIndex
from pydase.utils.helpers import parse_serialized_key
if TYPE_CHECKING:
from pydase.observer_pattern.observer.observer import Observer
@@ -81,7 +83,7 @@ class ObservableObject(ABC):
)
observer._notify_change_start(extended_attr_path)
def _initialise_new_objects(self, attr_name_or_key: Any, value: Any) -> Any:
def _initialise_new_objects(self, attr_name_or_key: str, value: Any) -> Any:
new_value = value
if isinstance(value, list):
if id(value) in self._list_mapping:
@@ -93,14 +95,14 @@ class ObservableObject(ABC):
self._list_mapping[id(value)] = new_value
elif isinstance(value, dict):
if id(value) in self._dict_mapping:
# If the list `value` was already referenced somewhere else
# If the dict `value` was already referenced somewhere else
new_value = self._dict_mapping[id(value)]
else:
# convert the builtin list into a ObservableList
new_value = _ObservableDict(original_dict=value)
self._dict_mapping[id(value)] = new_value
if isinstance(new_value, ObservableObject):
new_value.add_observer(self, str(attr_name_or_key))
new_value.add_observer(self, attr_name_or_key)
return new_value
@abstractmethod
@@ -224,7 +226,7 @@ class _ObservableList(ObservableObject, list[Any]):
return instance_attr_name
class _ObservableDict(dict[str, Any], ObservableObject):
class _ObservableDict(ObservableObject, dict[str, Any]):
def __init__(
self,
original_dict: dict[str, Any],
@@ -233,24 +235,26 @@ class _ObservableDict(dict[str, Any], ObservableObject):
ObservableObject.__init__(self)
dict.__init__(self)
for key, value in self._original_dict.items():
super().__setitem__(key, self._initialise_new_objects(f"['{key}']", value))
self.__setitem__(key, self._initialise_new_objects(f'["{key}"]', value))
def __setitem__(self, key: str, value: Any) -> None:
if not isinstance(key, str):
logger.warning("Converting non-string dictionary key %s to string.", key)
key = str(key)
raise ValueError(
f"Invalid key type: {key} ({type(key).__name__}). In pydase services, "
"dictionary keys must be strings."
)
if hasattr(self, "_observers"):
self._remove_observer_if_observable(f"['{key}']")
value = self._initialise_new_objects(key, value)
self._notify_change_start(f"['{key}']")
self._remove_observer_if_observable(f'["{key}"]')
value = self._initialise_new_objects(f'["{key}"]', value)
self._notify_change_start(f'["{key}"]')
super().__setitem__(key, value)
self._notify_changed(f"['{key}']", value)
self._notify_changed(f'["{key}"]', value)
def _remove_observer_if_observable(self, name: str) -> None:
key = name[2:-2]
key = str(parse_serialized_key(name))
current_value = self.get(key, None)
if isinstance(current_value, ObservableObject):
@@ -262,3 +266,11 @@ class _ObservableDict(dict[str, Any], ObservableObject):
if observer_attr_name != "":
return f"{observer_attr_name}{instance_attr_name}"
return instance_attr_name
def pop(self, key: str) -> Any: # type: ignore[override]
self._remove_observer_if_observable(f'["{key}"]')
popped_item = super().pop(key)
self._notify_changed("", self)
return popped_item
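With these changes, pydase dictionaries accept string keys only and notify observers with double-quoted key paths. A small sketch of the resulting behaviour; the service and attribute names are invented:

```python
import pydase

class MyService(pydase.DataService):
    def __init__(self) -> None:
        super().__init__()
        self.my_dict = {"temperature": 22.5}  # becomes an _ObservableDict

service = MyService()
service.my_dict["pressure"] = 1.0  # fine: change is observed as 'my_dict["pressure"]'
service.my_dict[1] = 0.0           # raises ValueError: dictionary keys must be strings
```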


@@ -49,7 +49,7 @@ class PropertyObserver(Observer):
def _process_observable_properties(
self, obj: Observable, deps: dict[str, Any], prefix: str
) -> None:
for k, value in vars(type(obj)).items():
for k, value in inspect.getmembers(type(obj)):
prefix = (
f"{prefix}." if prefix != "" and not prefix.endswith(".") else prefix
)


@@ -3,12 +3,10 @@ import logging
import os
import signal
import threading
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path
from types import FrameType
from typing import Any, Protocol, TypedDict
from rpyc import ThreadedServer # type: ignore[import-untyped]
from uvicorn.server import HANDLED_SIGNALS
from pydase import DataService
@@ -51,8 +49,7 @@ class AdditionalServerProtocol(Protocol):
host: str,
port: int,
**kwargs: Any,
) -> None:
...
) -> None: ...
async def serve(self) -> Any:
"""Starts the server. This method should be implemented as an asynchronous
@@ -81,71 +78,67 @@ class Server:
Args:
service: DataService
The DataService instance that this server will manage.
The DataService instance that this server will manage.
host: str
The host address for the server. Default is '0.0.0.0', which means all
available network interfaces.
rpc_port: int
The port number for the RPC server. Default is
`pydase.config.ServiceConfig().rpc_port`.
The host address for the server. Default is '0.0.0.0', which means all
available network interfaces.
web_port: int
The port number for the web server. Default is
`pydase.config.ServiceConfig().web_port`.
enable_rpc: bool
Whether to enable the RPC server. Default is True.
The port number for the web server. Default is
`pydase.config.ServiceConfig().web_port`.
enable_web: bool
Whether to enable the web server. Default is True.
Whether to enable the web server. Default is True.
filename: str | Path | None
Filename of the file managing the service state persistence. Defaults to None.
use_forking_server: bool
Whether to use ForkingServer for multiprocessing. Default is False.
Filename of the file managing the service state persistence.
Defaults to None.
additional_servers : list[AdditionalServer]
A list of additional servers to run alongside the main server. Each entry in
the list should be a dictionary with the following structure:
- server: A class that adheres to the AdditionalServerProtocol. This class
should have an `__init__` method that accepts the DataService instance,
port, host, and optional keyword arguments, and a `serve` method that is
a coroutine responsible for starting the server.
- port: The port on which the additional server will be running.
- kwargs: A dictionary containing additional keyword arguments that will be
passed to the server's `__init__` method.
A list of additional servers to run alongside the main server. Each entry in
the list should be a dictionary with the following structure:
- server: A class that adheres to the AdditionalServerProtocol. This
class should have an `__init__` method that accepts the DataService
instance, port, host, and optional keyword arguments, and a `serve`
method that is a coroutine responsible for starting the server.
- port: The port on which the additional server will be running.
- kwargs: A dictionary containing additional keyword arguments that will
be passed to the server's `__init__` method.
Here's an example of how you might define an additional server:
Here's an example of how you might define an additional server:
```python
class MyCustomServer:
def __init__(
self,
data_service_observer: DataServiceObserver,
host: str,
port: int,
**kwargs: Any,
) -> None:
self.observer = data_service_observer
self.state_manager = self.observer.state_manager
self.service = self.state_manager.service
self.port = port
self.host = host
# handle any additional arguments...
>>> class MyCustomServer:
... def __init__(
... self,
... data_service_observer: DataServiceObserver,
... host: str,
... port: int,
... **kwargs: Any,
... ) -> None:
... self.observer = data_service_observer
... self.state_manager = self.observer.state_manager
... self.service = self.state_manager.service
... self.port = port
... self.host = host
... # handle any additional arguments...
...
... async def serve(self):
... # code to start the server...
async def serve(self):
# code to start the server...
```
And here's how you might add it to the `additional_servers` list when creating
a `Server` instance:
>>> server = Server(
... service=my_data_service,
... additional_servers=[
... {
... "server": MyCustomServer,
... "port": 12345,
... "kwargs": {"some_arg": "some_value"}
... }
... ],
... )
... server.run()
And here's how you might add it to the `additional_servers` list when
creating a `Server` instance:
```python
server = Server(
service=my_data_service,
additional_servers=[
{
"server": MyCustomServer,
"port": 12345,
"kwargs": {"some_arg": "some_value"}
}
],
)
server.run()
```
**kwargs: Any
Additional keyword arguments.
"""
@@ -154,9 +147,7 @@ class Server:
self,
service: DataService,
host: str = "0.0.0.0",
rpc_port: int = ServiceConfig().rpc_port,
web_port: int = ServiceConfig().web_port,
enable_rpc: bool = True,
enable_web: bool = True,
filename: str | Path | None = None,
additional_servers: list[AdditionalServer] | None = None,
@@ -166,16 +157,13 @@ class Server:
additional_servers = []
self._service = service
self._host = host
self._rpc_port = rpc_port
self._web_port = web_port
self._enable_rpc = enable_rpc
self._enable_web = enable_web
self._kwargs = kwargs
self._loop: asyncio.AbstractEventLoop
self._additional_servers = additional_servers
self.should_exit = False
self.servers: dict[str, asyncio.Future[Any]] = {}
self.executor: ThreadPoolExecutor | None = None
self._state_manager = StateManager(self._service, filename)
self._observer = DataServiceObserver(self._state_manager)
self._state_manager.load_state()
@@ -207,20 +195,6 @@ class Server:
self.install_signal_handlers()
self._service._task_manager.start_autostart_tasks()
if self._enable_rpc:
self.executor = ThreadPoolExecutor()
self._rpc_server = ThreadedServer(
self._service,
port=self._rpc_port,
protocol_config={
"allow_all_attrs": True,
"allow_setattr": True,
},
)
future_or_task = self._loop.run_in_executor(
executor=self.executor, func=self._rpc_server.start
)
self.servers["rpyc"] = future_or_task
for server in self._additional_servers:
addin_server = server["server"](
data_service_observer=self._observer,
@@ -258,10 +232,6 @@ class Server:
await self.__cancel_servers()
await self.__cancel_tasks()
if hasattr(self, "_rpc_server") and self._enable_rpc:
logger.debug("Closing rpyc server.")
self._rpc_server.close()
async def __cancel_servers(self) -> None:
for server_name, task in self.servers.items():
task.cancel()

View File

@@ -2,14 +2,15 @@ import asyncio
import logging
from typing import Any, TypedDict
import click
import socketio # type: ignore[import-untyped]
from pydase.data_service.data_service import process_callable_attribute
from pydase.data_service.data_service_observer import DataServiceObserver
from pydase.data_service.state_manager import StateManager
from pydase.utils.helpers import get_object_attr_from_path_list
from pydase.utils.helpers import get_object_attr_from_path
from pydase.utils.logging import SocketIOHandler
from pydase.utils.serializer import SerializedObject
from pydase.utils.serialization.deserializer import Deserializer
from pydase.utils.serialization.serializer import SerializedObject, dump
logger = logging.getLogger(__name__)
@@ -21,26 +22,20 @@ class UpdateDict(TypedDict):
Attributes:
----------
name : str
The name of the attribute to be updated in the DataService instance.
If the attribute is part of a nested structure, this would be the name of the
attribute in the last nested object. For example, for an attribute access path
'attr1.list_attr[0].attr2', 'attr2' would be the name.
parent_path : str
The access path for the parent object of the attribute to be updated. This is
used to construct the full access path for the attribute. For example, for an
attribute access path 'attr1.list_attr[0].attr2', 'attr1.list_attr[0]' would be
the parent_path.
value : Any
The new value to be assigned to the attribute. The type of this value should
match the type of the attribute to be updated.
access_path : string
The full access path of the attribute to be updated.
value : SerializedObject
The serialized new value to be assigned to the attribute.
"""
name: str
parent_path: str
value: Any
access_path: str
value: SerializedObject
class TriggerMethodDict(TypedDict):
access_path: str
args: SerializedObject
kwargs: SerializedObject
class RunMethodDict(TypedDict):
@@ -119,24 +114,57 @@ def setup_sio_server(
return sio
def setup_sio_events(sio: socketio.AsyncServer, state_manager: StateManager) -> None:
@sio.event
def set_attribute(sid: str, data: UpdateDict) -> Any:
logger.debug("Received frontend update: %s", data)
parent_path = data["parent_path"].split(".")
path_list = [element for element in parent_path if element] + [data["name"]]
path = ".".join(path_list)
return state_manager.set_service_attribute_value_by_path(
path=path, value=data["value"]
def setup_sio_events(sio: socketio.AsyncServer, state_manager: StateManager) -> None: # noqa: C901
@sio.event # type: ignore
async def connect(sid: str, environ: Any) -> None:
logging.debug("Client [%s] connected", click.style(str(sid), fg="cyan"))
@sio.event # type: ignore
async def disconnect(sid: str) -> None:
logging.debug("Client [%s] disconnected", click.style(str(sid), fg="cyan"))
@sio.event # type: ignore
async def service_serialization(sid: str) -> SerializedObject:
logging.debug(
"Client [%s] requested service serialization",
click.style(str(sid), fg="cyan"),
)
return state_manager.cache
@sio.event
def run_method(sid: str, data: RunMethodDict) -> Any:
logger.debug("Running method: %s", data)
parent_path = data["parent_path"].split(".")
path_list = [element for element in parent_path if element] + [data["name"]]
method = get_object_attr_from_path_list(state_manager.service, path_list)
return process_callable_attribute(method, data["kwargs"])
async def update_value(sid: str, data: UpdateDict) -> SerializedObject | None: # type: ignore
path = data["access_path"]
try:
state_manager.set_service_attribute_value_by_path(
path=path, serialized_value=data["value"]
)
except Exception as e:
logger.exception(e)
return dump(e)
@sio.event
async def get_value(sid: str, access_path: str) -> SerializedObject:
try:
return state_manager._data_service_cache.get_value_dict_from_cache(
access_path
)
except Exception as e:
logger.exception(e)
return dump(e)
@sio.event
async def trigger_method(sid: str, data: TriggerMethodDict) -> Any:
try:
method = get_object_attr_from_path(
state_manager.service, data["access_path"]
)
args = Deserializer.deserialize(data["args"])
kwargs: dict[str, Any] = Deserializer.deserialize(data["kwargs"])
return dump(method(*args, **kwargs))
except Exception as e:
logger.error(e)
return dump(e)
def setup_logging_handler(sio: socketio.AsyncServer) -> None:
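For orientation, this is roughly how a socket.io client would talk to the new event handlers above. The connection URL and `socketio_path` are assumptions for illustration, not taken from this changeset; the event names and the `UpdateDict` payload shape come from the diff.

```python
import socketio  # python-socketio client

sio = socketio.Client()
# URL and socketio_path are assumptions; adjust them to the actual deployment.
sio.connect("http://localhost:8001", socketio_path="/ws/socket.io")

# Fetch the full serialized service state.
state = sio.call("service_serialization")

# Update an attribute by its full access path, passing a serialized value.
maybe_error = sio.call(
    "update_value",
    {
        "access_path": 'dict_attr["foo"].name',
        "value": {
            "full_access_path": 'dict_attr["foo"].name',
            "type": "str",
            "value": "foo",
            "doc": None,
            "readonly": False,
        },
    },
)

sio.disconnect()
```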

View File

@@ -16,7 +16,8 @@ from pydase.data_service.data_service_observer import DataServiceObserver
from pydase.server.web_server.sio_setup import (
setup_sio_server,
)
from pydase.utils.serializer import generate_serialized_data_paths
from pydase.utils.helpers import get_path_from_path_parts, parse_full_access_path
from pydase.utils.serialization.serializer import generate_serialized_data_paths
from pydase.version import __version__
logger = logging.getLogger(__name__)
@@ -131,8 +132,18 @@ class WebServer:
if path in current_web_settings:
continue
# Creating the display name by looping in reverse through the path parts
# until an item does not start with a square bracket, then putting the parts
# back together again. This allows for display names like
# >>> 'dict_attr["some.dotted.key"]'
display_name_parts: list[str] = []
for item in parse_full_access_path(path)[::-1]:
display_name_parts.insert(0, item)
if not item.startswith("["):
break
current_web_settings[path] = {
"displayName": path.split(".")[-1],
"displayName": get_path_from_path_parts(display_name_parts),
"display": True,
}
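To make the display-name construction above concrete, here is a minimal sketch of what the loop computes for a dotted dictionary key. It reuses the two path helpers introduced later in this changeset; the example path is made up.

```python
from pydase.utils.helpers import get_path_from_path_parts, parse_full_access_path

path = 'deeply.nested.dict_attr["some.dotted.key"]'

# Walk the path parts in reverse until one does not start with "[" ...
display_name_parts: list[str] = []
for item in parse_full_access_path(path)[::-1]:
    display_name_parts.insert(0, item)
    if not item.startswith("["):
        break

# ... and join them back together for the web settings entry.
print(get_path_from_path_parts(display_name_parts))  # dict_attr["some.dotted.key"]
```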

View File

@@ -1,3 +1,4 @@
import inspect
from collections.abc import Callable
from typing import Any
@@ -25,3 +26,17 @@ def frontend(func: Callable[..., Any]) -> Callable[..., Any]:
# Mark the function for frontend display.
func._display_in_frontend = True # type: ignore
return func
def render_in_frontend(func: Callable[..., Any]) -> bool:
"""Determines if the method should be rendered in the frontend.
It checks if the "@frontend" decorator was used or the method is a coroutine."""
if inspect.iscoroutinefunction(func):
return True
try:
return func._display_in_frontend # type: ignore
except AttributeError:
return False
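A small sketch of the new helper in use; the three functions below are made up for illustration, while `frontend` and `render_in_frontend` are the names from the diff.

```python
from pydase.utils.decorators import frontend, render_in_frontend


@frontend
def exposed_method() -> str:
    return "rendered in the frontend"


async def some_task() -> None:
    ...


def plain_method() -> None:
    ...


assert render_in_frontend(exposed_method)    # decorated -> True
assert render_in_frontend(some_task)         # coroutine -> True
assert not render_in_frontend(plain_method)  # neither -> False
```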

View File

@@ -1,5 +1,6 @@
import inspect
import logging
import re
from collections.abc import Callable
from itertools import chain
from typing import Any
@@ -7,6 +8,92 @@ from typing import Any
logger = logging.getLogger(__name__)
def parse_serialized_key(serialized_key: str) -> str | int | float:
"""
Parse a serialized key and convert it to an appropriate type (int, float, or str).
Args:
serialized_key: str
The serialized key, which might be enclosed in brackets and quotes.
Returns:
int | float | str:
The processed key as an integer, float, or unquoted string.
Examples:
```python
print(parse_serialized_key("attr_name")) # Outputs: attr_name (str)
print(parse_serialized_key("[123]")) # Outputs: 123 (int)
print(parse_serialized_key("[12.3]")) # Outputs: 12.3 (float)
print(parse_serialized_key("['hello']")) # Outputs: hello (str)
print(parse_serialized_key('["12.34"]')) # Outputs: 12.34 (str)
print(parse_serialized_key('["complex"]')) # Outputs: complex (str)
```
"""
# Strip outer brackets if present
if serialized_key.startswith("[") and serialized_key.endswith("]"):
serialized_key = serialized_key[1:-1]
# Strip quotes if the resulting string is quoted
if serialized_key.startswith(("'", '"')) and serialized_key.endswith(("'", '"')):
return serialized_key[1:-1]
# Try converting to float or int if the string is not quoted
try:
return float(serialized_key) if "." in serialized_key else int(serialized_key)
except ValueError:
# Return the original string if it's not a valid number
return serialized_key
def parse_full_access_path(path: str) -> list[str]:
"""
Splits a full access path into its atomic parts, separating attribute names, numeric
indices (including floating points), and string keys within indices.
Args:
path: str
The full access path string to be split into components.
Returns:
list[str]
A list of components that make up the path, including attribute names,
numeric indices, and string keys as separate elements.
"""
# Matches:
# \w+ - Words
# \[\d+\.\d+\] - Floating point numbers inside brackets
# \[\d+\] - Integers inside brackets
# \["[^"]*"\] - Double-quoted strings inside brackets
# \['[^']*'\] - Single-quoted strings inside brackets
pattern = r'\w+|\[\d+\.\d+\]|\[\d+\]|\["[^"]*"\]|\[\'[^\']*\']'
return re.findall(pattern, path)
def get_path_from_path_parts(path_parts: list[str]) -> str:
"""Creates the full access path from its atomic parts.
The reverse function is given by `parse_full_access_path`.
Args:
path_parts: list[str]
A list of components that make up the path, including attribute names,
numeric indices and string keys enclosed in square brackets as separate
elements.
Returns:
str
The full access path corresponding to the path_parts.
"""
path = ""
for path_part in path_parts:
if not path_part.startswith("[") and path != "":
path += "."
path += path_part
return path
def get_attribute_doc(attr: Any) -> str | None:
"""This function takes an input attribute attr and returns its documentation
string if it's different from the documentation of its type, otherwise,
@@ -30,13 +117,27 @@ def get_class_and_instance_attributes(obj: object) -> dict[str, Any]:
return dict(chain(type(obj).__dict__.items(), obj.__dict__.items()))
def get_object_attr_from_path_list(target_obj: Any, path: list[str]) -> Any:
def get_object_by_path_parts(target_obj: Any, path_parts: list[str]) -> Any:
for part in path_parts:
if part.startswith("["):
deserialized_part = parse_serialized_key(part)
target_obj = target_obj[deserialized_part]
else:
try:
target_obj = getattr(target_obj, part)
except AttributeError:
logger.debug("Attribute %a does not exist in the object.", part)
return None
return target_obj
def get_object_attr_from_path(target_obj: Any, path: str) -> Any:
"""
Traverse the object tree according to the given path.
Args:
target_obj: The root object to start the traversal from.
path: A list of attribute names representing the path to traverse.
path: Access path of the object.
Returns:
The attribute at the end of the path. If the path includes a list index,
@@ -46,136 +147,8 @@ def get_object_attr_from_path_list(target_obj: Any, path: list[str]) -> Any:
Raises:
ValueError: If a list index in the path is not a valid integer.
"""
for part in path:
try:
# Try to split the part into attribute and index
attr, index_str = part.split("[", maxsplit=1)
index_str = index_str.replace("]", "")
index = int(index_str)
target_obj = getattr(target_obj, attr)[index]
except ValueError:
# No index, so just get the attribute
target_obj = getattr(target_obj, part)
except AttributeError:
# The attribute doesn't exist
logger.debug("Attribute % does not exist in the object.", part)
return None
return target_obj
def convert_arguments_to_hinted_types(
args: dict[str, Any], type_hints: dict[str, Any]
) -> dict[str, Any] | str:
"""
Convert the given arguments to their types hinted in the type_hints dictionary.
This function attempts to convert each argument in the args dictionary to the type
specified for the argument in the type_hints dictionary. If the conversion is
successful, the function replaces the original argument in the args dictionary with
the converted argument.
If a ValueError is raised during the conversion of an argument, the function logs
an error message and returns the error message as a string.
Args:
args: A dictionary of arguments to be converted. The keys are argument names
and the values are the arguments themselves.
type_hints: A dictionary of type hints for the arguments. The keys are
argument names and the values are the hinted types.
Returns:
A dictionary of the converted arguments if all conversions are successful,
or an error message string if a ValueError is raised during a conversion.
"""
# Convert arguments to their hinted types
for arg_name, arg_value in args.items():
if arg_name in type_hints:
arg_type = type_hints[arg_name]
if isinstance(arg_type, type):
# Attempt to convert the argument to its hinted type
try:
args[arg_name] = arg_type(arg_value)
except ValueError:
msg = (
f"Failed to convert argument '{arg_name}' to type "
f"{arg_type.__name__}"
)
logger.error(msg)
return msg
return args
def update_value_if_changed(
target: Any, attr_name_or_index: str | int, new_value: Any
) -> None:
"""
Updates the value of an attribute or a list element on a target object if the new
value differs from the current one.
This function supports updating both attributes of an object and elements of a list.
- For objects, the function first checks the current value of the attribute. If the
current value differs from the new value, the function updates the attribute.
- For lists, the function checks the current value at the specified index. If the
current value differs from the new value, the function updates the list element
at the given index.
Args:
target (Any):
The target object that has the attribute or the list.
attr_name_or_index (str | int):
The name of the attribute or the index of the list element.
new_value (Any):
The new value for the attribute or the list element.
"""
if isinstance(target, list) and isinstance(attr_name_or_index, int):
if target[attr_name_or_index] != new_value:
target[attr_name_or_index] = new_value
elif isinstance(attr_name_or_index, str):
# If the type matches and the current value is different from the new value,
# update the attribute.
if getattr(target, attr_name_or_index) != new_value:
setattr(target, attr_name_or_index, new_value)
else:
logger.error("Incompatible arguments: %s, %s.", target, attr_name_or_index)
def parse_list_attr_and_index(attr_string: str) -> tuple[str, int | None]:
"""
Parses an attribute string and extracts a potential list attribute name and its
index.
Logs an error if the index is not a valid digit.
Args:
attr_string (str):
The attribute string to parse. Can be a regular attribute name (e.g.,
'attr_name') or a list attribute with an index (e.g., 'list_attr[2]').
Returns:
tuple[str, Optional[int]]:
A tuple containing the attribute name as a string and the index as an
integer if present, otherwise None.
Examples:
>>> parse_attribute_and_index('list_attr[2]')
('list_attr', 2)
>>> parse_attribute_and_index('attr_name')
('attr_name', None)
"""
index = None
attr_name = attr_string
if "[" in attr_string and attr_string.endswith("]"):
attr_name, index_part = attr_string.split("[", 1)
index_part = index_part.rstrip("]")
if index_part.isdigit():
index = int(index_part)
else:
logger.error("Invalid index format in key: %s", attr_name)
return attr_name, index
path_parts = parse_full_access_path(path)
return get_object_by_path_parts(target_obj, path_parts)
def get_component_classes() -> list[type]:
@@ -195,8 +168,13 @@ def get_data_service_class_reference() -> Any:
return getattr(pydase.data_service.data_service, "DataService")
def is_property_attribute(target_obj: Any, attr_name: str) -> bool:
return isinstance(getattr(type(target_obj), attr_name, None), property)
def is_property_attribute(target_obj: Any, access_path: str) -> bool:
path_parts = parse_full_access_path(access_path)
target_obj = get_object_by_path_parts(target_obj, path_parts[:-1])
# don't have to check if target_obj is dict or list as their content cannot be
# properties -> always return False then
return isinstance(getattr(type(target_obj), path_parts[-1], None), property)
def function_has_arguments(func: Callable[..., Any]) -> bool:
@@ -209,17 +187,3 @@ def function_has_arguments(func: Callable[..., Any]) -> bool:
if len(parameters) > 0:
return True
return False
def render_in_frontend(func: Callable[..., Any]) -> bool:
"""Determines if the method should be rendered in the frontend.
It checks if the "@frontend" decorator was used or the method is a coroutine."""
if inspect.iscoroutinefunction(func):
return True
try:
return func._display_in_frontend # type: ignore
except AttributeError:
return False
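The following round-trip sketch shows how the new path helpers compose. The toy classes are invented; the helper names and semantics are the ones added above.

```python
from pydase.utils.helpers import (
    get_object_attr_from_path,
    get_path_from_path_parts,
    parse_full_access_path,
    parse_serialized_key,
)

path = 'sub_service.dict_attr["dotted.key"]'

parts = parse_full_access_path(path)
print(parts)  # ['sub_service', 'dict_attr', '["dotted.key"]']
assert get_path_from_path_parts(parts) == path

print(parse_serialized_key("[0]"))             # 0 (int)
print(parse_serialized_key('["dotted.key"]'))  # dotted.key (str)


class SubService:
    dict_attr = {"dotted.key": 42}


class Service:
    sub_service = SubService()


# Attribute parts use getattr, bracketed parts use item access.
print(get_object_attr_from_path(Service(), path))  # 42
```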

View File

@@ -0,0 +1,151 @@
import enum
import logging
from typing import TYPE_CHECKING, Any, NoReturn, cast
import pydase
import pydase.components
import pydase.units as u
from pydase.utils.helpers import get_component_classes
from pydase.utils.serialization.types import SerializedObject
if TYPE_CHECKING:
from collections.abc import Callable
logger = logging.getLogger(__name__)
class Deserializer:
@classmethod
def deserialize(cls, serialized_object: SerializedObject) -> Any:
type_handler: dict[str | None, None | Callable[..., Any]] = {
None: None,
"int": cls.deserialize_primitive,
"float": cls.deserialize_primitive,
"bool": cls.deserialize_primitive,
"str": cls.deserialize_primitive,
"NoneType": cls.deserialize_primitive,
"Quantity": cls.deserialize_quantity,
"Enum": cls.deserialize_enum,
"ColouredEnum": lambda serialized_object: cls.deserialize_enum(
serialized_object, enum_class=pydase.components.ColouredEnum
),
"list": cls.deserialize_list,
"dict": cls.deserialize_dict,
"method": cls.deserialize_method,
"Exception": cls.deserialize_exception,
}
# First go through handled types (as ColouredEnum is also within the components)
handler = type_handler.get(serialized_object["type"])
if handler:
return handler(serialized_object)
# Custom types like Components or DataService classes
component_class = cls.get_component_class(serialized_object["type"])
if component_class:
return cls.deserialize_component_type(serialized_object, component_class)
return None
@classmethod
def deserialize_primitive(cls, serialized_object: SerializedObject) -> Any:
if serialized_object["type"] == "float":
return float(serialized_object["value"])
return serialized_object["value"]
@classmethod
def deserialize_quantity(cls, serialized_object: SerializedObject) -> Any:
return u.convert_to_quantity(serialized_object["value"]) # type: ignore
@classmethod
def deserialize_enum(
cls,
serialized_object: SerializedObject,
enum_class: type[enum.Enum] = enum.Enum,
) -> Any:
return enum_class(serialized_object["name"], serialized_object["enum"])[ # type: ignore
serialized_object["value"]
]
@classmethod
def deserialize_list(cls, serialized_object: SerializedObject) -> Any:
return [
cls.deserialize(item)
for item in cast(list[SerializedObject], serialized_object["value"])
]
@classmethod
def deserialize_dict(cls, serialized_object: SerializedObject) -> Any:
return {
key: cls.deserialize(value)
for key, value in cast(
dict[str, SerializedObject], serialized_object["value"]
).items()
}
@classmethod
def deserialize_method(cls, serialized_object: SerializedObject) -> Any:
return
@classmethod
def deserialize_exception(cls, serialized_object: SerializedObject) -> NoReturn:
import builtins
try:
exception = getattr(builtins, serialized_object["name"]) # type: ignore
except AttributeError:
exception = type(serialized_object["name"], (Exception,), {}) # type: ignore
raise exception(serialized_object["value"])
@staticmethod
def get_component_class(type_name: str | None) -> type | None:
for component_class in get_component_classes():
if type_name == component_class.__name__:
return component_class
if type_name == "DataService":
import pydase
return pydase.DataService
return None
@classmethod
def create_attr_property(cls, serialized_attr: SerializedObject) -> property:
attr_name = serialized_attr["full_access_path"].split(".")[-1]
def get(self) -> Any: # type: ignore
return getattr(self, f"_{attr_name}")
get.__doc__ = serialized_attr["doc"]
def set(self, value: Any) -> None: # type: ignore
return setattr(self, f"_{attr_name}", value)
if serialized_attr["readonly"]:
return property(get)
return property(get, set)
@classmethod
def deserialize_component_type(
cls, serialized_object: SerializedObject, base_class: type
) -> Any:
def create_proxy_class(serialized_object: SerializedObject) -> type:
class_bases = (base_class,)
class_attrs = {}
# Process and add properties based on the serialized object
for key, value in cast(
dict[str, SerializedObject], serialized_object["value"]
).items():
if value["type"] != "method":
class_attrs[key] = cls.create_attr_property(value)
# Initialize a placeholder for the attribute to avoid AttributeError
class_attrs[f"_{key}"] = cls.deserialize(value)
# Create the dynamic class with the given name and attributes
return type(serialized_object["name"], class_bases, class_attrs) # type: ignore
return create_proxy_class(serialized_object)()
def loads(serialized_object: SerializedObject) -> Any:
return Deserializer.deserialize(serialized_object)
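A hedged round-trip sketch of the new deserializer against the serializer from this changeset; the example values are made up.

```python
from pydase.utils.serialization.deserializer import loads
from pydase.utils.serialization.serializer import dump

serialized = dump({"temperature": 1.2, "channels": [1, 2, 3]})
assert loads(serialized) == {"temperature": 1.2, "channels": [1, 2, 3]}

# Serialized exceptions are re-raised on deserialization.
try:
    loads(dump(ValueError("out of range")))
except ValueError as e:
    print(e)  # out of range
```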

View File

@@ -0,0 +1,548 @@
from __future__ import annotations
import inspect
import logging
import sys
from enum import Enum
from typing import TYPE_CHECKING, Any, Literal, cast
import pydase.units as u
from pydase.data_service.abstract_data_service import AbstractDataService
from pydase.data_service.task_manager import TaskStatus
from pydase.utils.decorators import render_in_frontend
from pydase.utils.helpers import (
get_attribute_doc,
get_component_classes,
get_data_service_class_reference,
parse_full_access_path,
parse_serialized_key,
)
from pydase.utils.serialization.types import (
DataServiceTypes,
SerializedBool,
SerializedDataService,
SerializedDict,
SerializedEnum,
SerializedException,
SerializedFloat,
SerializedInteger,
SerializedList,
SerializedMethod,
SerializedNoneType,
SerializedObject,
SerializedQuantity,
SerializedString,
SignatureDict,
)
if TYPE_CHECKING:
from collections.abc import Callable
logger = logging.getLogger(__name__)
class SerializationError(Exception):
pass
class SerializationPathError(Exception):
pass
class SerializationValueError(Exception):
pass
class Serializer:
@staticmethod
def serialize_object(obj: Any, access_path: str = "") -> SerializedObject: # noqa: C901
result: SerializedObject
if isinstance(obj, Exception):
result = Serializer._serialize_exception(obj)
elif isinstance(obj, AbstractDataService):
result = Serializer._serialize_data_service(obj, access_path=access_path)
elif isinstance(obj, list):
result = Serializer._serialize_list(obj, access_path=access_path)
elif isinstance(obj, dict):
result = Serializer._serialize_dict(obj, access_path=access_path)
# Special handling for u.Quantity
elif isinstance(obj, u.Quantity):
result = Serializer._serialize_quantity(obj, access_path=access_path)
# Handling for Enums
elif isinstance(obj, Enum):
result = Serializer._serialize_enum(obj, access_path=access_path)
# Methods and coroutines
elif inspect.isfunction(obj) or inspect.ismethod(obj):
result = Serializer._serialize_method(obj, access_path=access_path)
elif isinstance(obj, int | float | bool | str | None):
result = Serializer._serialize_primitive(obj, access_path=access_path)
try:
return result
except UnboundLocalError:
raise SerializationError(
f"Could not serialize object of type {type(obj)}."
)
@staticmethod
def _serialize_primitive(
obj: float | bool | str | None,
access_path: str,
) -> (
SerializedInteger
| SerializedFloat
| SerializedBool
| SerializedString
| SerializedNoneType
):
doc = get_attribute_doc(obj)
return { # type: ignore
"full_access_path": access_path,
"doc": doc,
"readonly": False,
"type": type(obj).__name__,
"value": obj,
}
@staticmethod
def _serialize_exception(obj: Exception) -> SerializedException:
return {
"full_access_path": "",
"doc": None,
"readonly": True,
"type": "Exception",
"value": obj.args[0],
"name": obj.__class__.__name__,
}
@staticmethod
def _serialize_enum(obj: Enum, access_path: str = "") -> SerializedEnum:
import pydase.components.coloured_enum
value = obj.name
doc = obj.__doc__
class_name = type(obj).__name__
if sys.version_info < (3, 11) and doc == "An enumeration.":
doc = None
if isinstance(obj, pydase.components.coloured_enum.ColouredEnum):
obj_type: Literal["ColouredEnum", "Enum"] = "ColouredEnum"
else:
obj_type = "Enum"
return {
"full_access_path": access_path,
"name": class_name,
"type": obj_type,
"value": value,
"readonly": False,
"doc": doc,
"enum": {
name: member.value for name, member in obj.__class__.__members__.items()
},
}
@staticmethod
def _serialize_quantity(
obj: u.Quantity, access_path: str = ""
) -> SerializedQuantity:
doc = get_attribute_doc(obj)
value: u.QuantityDict = {"magnitude": obj.m, "unit": str(obj.u)}
return {
"full_access_path": access_path,
"type": "Quantity",
"value": value,
"readonly": False,
"doc": doc,
}
@staticmethod
def _serialize_dict(obj: dict[str, Any], access_path: str = "") -> SerializedDict:
readonly = False
doc = get_attribute_doc(obj)
value = {}
for key, val in obj.items():
value[key] = Serializer.serialize_object(
val, access_path=f'{access_path}["{key}"]'
)
return {
"full_access_path": access_path,
"type": "dict",
"value": value,
"readonly": readonly,
"doc": doc,
}
@staticmethod
def _serialize_list(obj: list[Any], access_path: str = "") -> SerializedList:
readonly = False
doc = get_attribute_doc(obj)
value = [
Serializer.serialize_object(o, access_path=f"{access_path}[{i}]")
for i, o in enumerate(obj)
]
return {
"full_access_path": access_path,
"type": "list",
"value": value,
"readonly": readonly,
"doc": doc,
}
@staticmethod
def _serialize_method(
obj: Callable[..., Any], access_path: str = ""
) -> SerializedMethod:
readonly = True
doc = get_attribute_doc(obj)
frontend_render = render_in_frontend(obj)
# Store parameters and their annotations in a dictionary
sig = inspect.signature(obj)
sig.return_annotation
signature: SignatureDict = {"parameters": {}, "return_annotation": {}}
for k, v in sig.parameters.items():
default_value = cast(
dict[str, Any], {} if v.default == inspect._empty else dump(v.default)
)
default_value.pop("full_access_path", None)
signature["parameters"][k] = {
"annotation": str(v.annotation),
"default": default_value,
}
return {
"full_access_path": access_path,
"type": "method",
"value": None,
"readonly": readonly,
"doc": doc,
"async": inspect.iscoroutinefunction(obj),
"signature": signature,
"frontend_render": frontend_render,
}
@staticmethod
def _serialize_data_service(
obj: AbstractDataService, access_path: str = ""
) -> SerializedDataService:
readonly = False
doc = get_attribute_doc(obj)
obj_type: DataServiceTypes = "DataService"
obj_name = obj.__class__.__name__
# Get component base class if any
component_base_cls = next(
(cls for cls in get_component_classes() if isinstance(obj, cls)), None
)
if component_base_cls:
obj_type = component_base_cls.__name__ # type: ignore
# Get the set of DataService class attributes
data_service_attr_set = set(dir(get_data_service_class_reference()))
# Get the set of the object attributes
obj_attr_set = set(dir(obj))
# Get the difference between the two sets
derived_only_attr_set = obj_attr_set - data_service_attr_set
value: dict[str, SerializedObject] = {}
# Iterate over attributes, properties, class attributes, and methods
for key in sorted(derived_only_attr_set):
if key.startswith("_"):
continue # Skip attributes that start with underscore
# Skip keys that start with "start_" or "stop_" and end with an async
# method name
if key.startswith(("start_", "stop_")) and key.split("_", 1)[1] in {
name
for name, _ in inspect.getmembers(
obj, predicate=inspect.iscoroutinefunction
)
}:
continue
val = getattr(obj, key)
path = f"{access_path}.{key}" if access_path else key
serialized_object = Serializer.serialize_object(val, access_path=path)
# If there's a running task for this method
if serialized_object["type"] == "method" and key in obj._task_manager.tasks:
serialized_object["value"] = TaskStatus.RUNNING.name
value[key] = serialized_object
# If the DataService attribute is a property
if isinstance(getattr(obj.__class__, key, None), property):
prop: property = getattr(obj.__class__, key)
value[key]["readonly"] = prop.fset is None
value[key]["doc"] = get_attribute_doc(prop) # overwrite the doc
return {
"full_access_path": access_path,
"name": obj_name,
"type": obj_type,
"value": value,
"readonly": readonly,
"doc": doc,
}
def dump(obj: Any) -> SerializedObject:
return Serializer.serialize_object(obj)
def set_nested_value_by_path(
serialization_dict: dict[Any, SerializedObject], path: str, value: Any
) -> None:
"""
Set a value in a nested dictionary structure, which conforms to the serialization
format used by `pydase.utils.serializer.Serializer`, using a dot-notation path.
Args:
serialization_dict:
The base dictionary representing data serialized with
`pydase.utils.serializer.Serializer`.
path:
The dot-notation path (e.g., 'attr1.attr2[0].attr3') indicating where to
set the value.
value:
The new value to set at the specified path.
Note:
- If the index equals the length of the list, the function will append the
serialized representation of the 'value' to the list.
"""
path_parts = parse_full_access_path(path)
current_dict: dict[Any, SerializedObject] = serialization_dict
try:
for path_part in path_parts[:-1]:
next_level_serialized_object = get_container_item_by_key(
current_dict, path_part, allow_append=False
)
current_dict = cast(
dict[Any, SerializedObject],
next_level_serialized_object["value"],
)
next_level_serialized_object = get_container_item_by_key(
current_dict, path_parts[-1], allow_append=True
)
except (SerializationPathError, SerializationValueError, KeyError) as e:
logger.error("Error occurred trying to change %a: %s", path, e)
return
if next_level_serialized_object["type"] == "method": # state change of task
next_level_serialized_object["value"] = (
"RUNNING" if isinstance(value, TaskStatus) else None
)
else:
serialized_value = Serializer.serialize_object(value, access_path=path)
serialized_value["readonly"] = next_level_serialized_object["readonly"]
keys_to_keep = set(serialized_value.keys())
next_level_serialized_object.update(serialized_value) # type: ignore
# removes keys that are not present in the serialized new value
for key in list(next_level_serialized_object.keys()):
if key not in keys_to_keep:
next_level_serialized_object.pop(key, None) # type: ignore
def get_nested_dict_by_path(
serialization_dict: dict[Any, SerializedObject],
path: str,
) -> SerializedObject:
path_parts = parse_full_access_path(path)
current_dict: dict[Any, SerializedObject] = serialization_dict
for path_part in path_parts[:-1]:
next_level_serialized_object = get_container_item_by_key(
current_dict, path_part, allow_append=False
)
current_dict = cast(
dict[Any, SerializedObject],
next_level_serialized_object["value"],
)
return get_container_item_by_key(current_dict, path_parts[-1], allow_append=False)
def create_empty_serialized_object() -> SerializedObject:
"""Create a new empty serialized object."""
return {
"full_access_path": "",
"value": None,
"type": "None",
"doc": None,
"readonly": False,
}
def get_or_create_item_in_container(
container: dict[Any, SerializedObject] | list[SerializedObject],
key: Any,
*,
allow_add_key: bool,
) -> SerializedObject:
"""Ensure the key exists in the dictionary, append if necessary and allowed."""
try:
return container[key]
except IndexError:
if allow_add_key and key == len(container):
cast(list[SerializedObject], container).append(
create_empty_serialized_object()
)
return container[key]
raise
except KeyError:
if allow_add_key:
container[key] = create_empty_serialized_object()
return container[key]
raise
def get_container_item_by_key(
container: dict[Any, SerializedObject] | list[SerializedObject],
key: str,
*,
allow_append: bool = False,
) -> SerializedObject:
"""
Retrieve an item from a container specified by the passed key. Add an item to the
container if allow_append is set to True.
If specified keys or indexes do not exist, the function can append new elements to
dictionaries and to lists if `allow_append` is True and the missing element is
exactly the next sequential index (for lists).
Args:
container: dict[str, SerializedObject] | list[SerializedObject]
The container representing serialized data.
key: str
The key name representing the attribute in the dictionary, which may include
direct keys or indexes (e.g., 'attr_name', '["key"]' or '[0]').
allow_append: bool
Flag to allow appending a new entry if the specified index is out of range
by exactly one position.
Returns:
SerializedObject
The dictionary or list item corresponding to the specified attribute and
index.
Raises:
SerializationPathError:
If the path composed of `key` and any specified index is invalid, or
leads to an IndexError or KeyError. This error is also raised if an attempt
to access a nonexistent key or index occurs without permission to append.
SerializationValueError:
If the retrieval results in an object that is expected to be a dictionary
but is not, indicating a mismatch between expected and actual serialized
data structure.
"""
processed_key = parse_serialized_key(key)
try:
return get_or_create_item_in_container(
container, processed_key, allow_add_key=allow_append
)
except IndexError as e:
raise SerializationPathError(f"Index '{processed_key}': {e}")
except KeyError as e:
raise SerializationPathError(f"Key '{processed_key}': {e}")
def get_data_paths_from_serialized_object( # noqa: C901
serialized_obj: SerializedObject,
parent_path: str = "",
) -> list[str]:
"""
Recursively extracts full access paths from a serialized object.
Args:
serialized_obj (SerializedObject):
The dictionary representing the serialization of an object. Produced by
`pydase.utils.serializer.Serializer`.
Returns:
list[str]:
A list of strings, each representing a full access path in the serialized
object.
"""
paths: list[str] = []
if isinstance(serialized_obj["value"], list):
for index, value in enumerate(serialized_obj["value"]):
new_path = f"{parent_path}[{index}]"
paths.append(new_path)
if serialized_dict_is_nested_object(value):
paths.extend(get_data_paths_from_serialized_object(value, new_path))
elif serialized_dict_is_nested_object(serialized_obj):
for key, value in cast(
dict[str, SerializedObject], serialized_obj["value"]
).items():
# Serialized dictionaries need to have a different new_path than nested
# classes
if serialized_obj["type"] == "dict":
processed_key = key
if isinstance(key, str):
processed_key = f'"{key}"'
new_path = f"{parent_path}[{processed_key}]"
else:
new_path = f"{parent_path}.{key}" if parent_path != "" else key
paths.append(new_path)
if serialized_dict_is_nested_object(value):
paths.extend(get_data_paths_from_serialized_object(value, new_path))
return paths
def generate_serialized_data_paths(
data: dict[str, SerializedObject],
) -> list[str]:
"""
Recursively extracts full access paths from a serialized DataService class instance.
Args:
data (dict[str, SerializedObject]):
The value of the "value" key of a serialized DataService class instance.
Returns:
list[str]:
A list of strings, each representing a full access path in the serialized
object.
"""
paths: list[str] = []
for key, value in data.items():
paths.append(key)
if serialized_dict_is_nested_object(value):
paths.extend(get_data_paths_from_serialized_object(value, key))
return paths
def serialized_dict_is_nested_object(serialized_dict: SerializedObject) -> bool:
value = serialized_dict["value"]
# We are excluding Quantity here as the value corresponding to the "value" key is
# a dictionary of the form {"magnitude": ..., "unit": ...}
return serialized_dict["type"] != "Quantity" and (isinstance(value, dict | list))
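As a quick orientation, the sketch below shows the new path-aware serializer together with the cache helpers defined above. The example data is invented; the function names are the ones from this file.

```python
from pydase.utils.serialization.serializer import (
    dump,
    get_nested_dict_by_path,
    set_nested_value_by_path,
)

# Serialize a plain dictionary; every entry carries its own "full_access_path".
serialized = dump({"outer": {"dotted.key": 1.0}})
cache = serialized["value"]  # dict[str, SerializedObject]

# Read a nested entry via its access path (dict keys are bracketed and quoted).
entry = get_nested_dict_by_path(cache, 'outer["dotted.key"]')
print(entry["type"], entry["value"])  # float 1.0

# Update it in place with a newly serialized value.
set_nested_value_by_path(cache, 'outer["dotted.key"]', 2.5)
print(get_nested_dict_by_path(cache, 'outer["dotted.key"]')["value"])  # 2.5
```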

View File

@@ -0,0 +1,119 @@
from __future__ import annotations
import logging
from typing import TYPE_CHECKING, Any, Literal, TypedDict
if TYPE_CHECKING:
import pydase.units as u
logger = logging.getLogger(__name__)
class SignatureDict(TypedDict):
parameters: dict[str, dict[str, Any]]
return_annotation: dict[str, Any]
class SerializedObjectBase(TypedDict):
full_access_path: str
doc: str | None
readonly: bool
class SerializedInteger(SerializedObjectBase):
value: int
type: Literal["int"]
class SerializedFloat(SerializedObjectBase):
value: float
type: Literal["float"]
class SerializedQuantity(SerializedObjectBase):
value: u.QuantityDict
type: Literal["Quantity"]
class SerializedBool(SerializedObjectBase):
value: bool
type: Literal["bool"]
class SerializedString(SerializedObjectBase):
value: str
type: Literal["str"]
class SerializedEnum(SerializedObjectBase):
name: str
value: str
type: Literal["Enum", "ColouredEnum"]
enum: dict[str, Any]
class SerializedList(SerializedObjectBase):
value: list[SerializedObject]
type: Literal["list"]
class SerializedDict(SerializedObjectBase):
value: dict[str, SerializedObject]
type: Literal["dict"]
class SerializedNoneType(SerializedObjectBase):
value: None
type: Literal["NoneType"]
class SerializedNoValue(SerializedObjectBase):
value: None
type: Literal["None"]
SerializedMethod = TypedDict(
"SerializedMethod",
{
"full_access_path": str,
"value": Literal["RUNNING"] | None,
"type": Literal["method"],
"doc": str | None,
"readonly": bool,
"async": bool,
"signature": SignatureDict,
"frontend_render": bool,
},
)
class SerializedException(SerializedObjectBase):
name: str
value: str
type: Literal["Exception"]
DataServiceTypes = Literal["DataService", "Image", "NumberSlider", "DeviceConnection"]
class SerializedDataService(SerializedObjectBase):
name: str
value: dict[str, SerializedObject]
type: DataServiceTypes
SerializedObject = (
SerializedBool
| SerializedFloat
| SerializedInteger
| SerializedString
| SerializedList
| SerializedDict
| SerializedNoneType
| SerializedMethod
| SerializedException
| SerializedDataService
| SerializedEnum
| SerializedQuantity
| SerializedNoValue
)
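A minimal sketch of how the TypedDicts above are meant to be used; the payload contents are made up.

```python
from pydase.utils.serialization.types import SerializedFloat, SerializedObject

# A hand-written payload that satisfies the SerializedFloat TypedDict.
payload: SerializedFloat = {
    "full_access_path": "device.voltage",
    "doc": "Output voltage in volts.",
    "readonly": False,
    "type": "float",
    "value": 3.3,
}


def describe(obj: SerializedObject) -> str:
    # Discriminate on the "type" literal, as the (de)serializers do.
    return f'{obj["full_access_path"] or "<root>"}: {obj["type"]}'


print(describe(payload))  # device.voltage: float
```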

View File

@@ -1,454 +0,0 @@
from __future__ import annotations
import inspect
import logging
import sys
from enum import Enum
from typing import TYPE_CHECKING, Any, TypedDict, cast
if sys.version_info < (3, 11):
from typing_extensions import NotRequired
else:
from typing import NotRequired
import pydase.units as u
from pydase.data_service.abstract_data_service import AbstractDataService
from pydase.data_service.task_manager import TaskStatus
from pydase.utils.helpers import (
get_attribute_doc,
get_component_classes,
get_data_service_class_reference,
parse_list_attr_and_index,
render_in_frontend,
)
if TYPE_CHECKING:
from collections.abc import Callable
logger = logging.getLogger(__name__)
class SerializationPathError(Exception):
pass
class SerializationValueError(Exception):
pass
class SignatureDict(TypedDict):
parameters: dict[str, dict[str, Any]]
return_annotation: dict[str, Any]
SerializedObject = TypedDict(
"SerializedObject",
{
"name": NotRequired[str],
"value": "list[SerializedObject] | float | int | str | bool | dict[str, Any] | None", # noqa: E501
"type": str | None,
"doc": str | None,
"readonly": bool,
"enum": NotRequired[dict[str, Any]],
"async": NotRequired[bool],
"signature": NotRequired[SignatureDict],
"frontend_render": NotRequired[bool],
},
)
class Serializer:
@staticmethod
def serialize_object(obj: Any) -> SerializedObject:
result: SerializedObject
if isinstance(obj, AbstractDataService):
result = Serializer._serialize_data_service(obj)
elif isinstance(obj, list):
result = Serializer._serialize_list(obj)
elif isinstance(obj, dict):
result = Serializer._serialize_dict(obj)
# Special handling for u.Quantity
elif isinstance(obj, u.Quantity):
result = Serializer._serialize_quantity(obj)
# Handling for Enums
elif isinstance(obj, Enum):
result = Serializer._serialize_enum(obj)
# Methods and coroutines
elif inspect.isfunction(obj) or inspect.ismethod(obj):
result = Serializer._serialize_method(obj)
else:
obj_type = type(obj).__name__
value = obj
readonly = False
doc = get_attribute_doc(obj)
result = {
"type": obj_type,
"value": value,
"readonly": readonly,
"doc": doc,
}
return result
@staticmethod
def _serialize_enum(obj: Enum) -> SerializedObject:
import pydase.components.coloured_enum
value = obj.name
readonly = False
doc = obj.__doc__
if sys.version_info < (3, 11) and doc == "An enumeration.":
doc = None
if isinstance(obj, pydase.components.coloured_enum.ColouredEnum):
obj_type = "ColouredEnum"
else:
obj_type = "Enum"
return {
"type": obj_type,
"value": value,
"readonly": readonly,
"doc": doc,
"enum": {
name: member.value for name, member in obj.__class__.__members__.items()
},
}
@staticmethod
def _serialize_quantity(obj: u.Quantity) -> SerializedObject:
obj_type = "Quantity"
readonly = False
doc = get_attribute_doc(obj)
value = {"magnitude": obj.m, "unit": str(obj.u)}
return {
"type": obj_type,
"value": value,
"readonly": readonly,
"doc": doc,
}
@staticmethod
def _serialize_dict(obj: dict[str, Any]) -> SerializedObject:
obj_type = "dict"
readonly = False
doc = get_attribute_doc(obj)
value = {key: Serializer.serialize_object(val) for key, val in obj.items()}
return {
"type": obj_type,
"value": value,
"readonly": readonly,
"doc": doc,
}
@staticmethod
def _serialize_list(obj: list[Any]) -> SerializedObject:
obj_type = "list"
readonly = False
doc = get_attribute_doc(obj)
value = [Serializer.serialize_object(o) for o in obj]
return {
"type": obj_type,
"value": value,
"readonly": readonly,
"doc": doc,
}
@staticmethod
def _serialize_method(obj: Callable[..., Any]) -> SerializedObject:
obj_type = "method"
value = None
readonly = True
doc = get_attribute_doc(obj)
frontend_render = render_in_frontend(obj)
# Store parameters and their anotations in a dictionary
sig = inspect.signature(obj)
sig.return_annotation
signature: SignatureDict = {"parameters": {}, "return_annotation": {}}
for k, v in sig.parameters.items():
signature["parameters"][k] = {
"annotation": str(v.annotation),
"default": {} if v.default == inspect._empty else dump(v.default),
}
return {
"type": obj_type,
"value": value,
"readonly": readonly,
"doc": doc,
"async": inspect.iscoroutinefunction(obj),
"signature": signature,
"frontend_render": frontend_render,
}
@staticmethod
def _serialize_data_service(obj: AbstractDataService) -> SerializedObject:
readonly = False
doc = get_attribute_doc(obj)
obj_type = "DataService"
obj_name = obj.__class__.__name__
# Get component base class if any
component_base_cls = next(
(cls for cls in get_component_classes() if isinstance(obj, cls)), None
)
if component_base_cls:
obj_type = component_base_cls.__name__
# Get the set of DataService class attributes
data_service_attr_set = set(dir(get_data_service_class_reference()))
# Get the set of the object attributes
obj_attr_set = set(dir(obj))
# Get the difference between the two sets
derived_only_attr_set = obj_attr_set - data_service_attr_set
value: dict[str, SerializedObject] = {}
# Iterate over attributes, properties, class attributes, and methods
for key in sorted(derived_only_attr_set):
if key.startswith("_"):
continue # Skip attributes that start with underscore
# Skip keys that start with "start_" or "stop_" and end with an async
# method name
if key.startswith(("start_", "stop_")) and key.split("_", 1)[1] in {
name
for name, _ in inspect.getmembers(
obj, predicate=inspect.iscoroutinefunction
)
}:
continue
val = getattr(obj, key)
value[key] = Serializer.serialize_object(val)
# If there's a running task for this method
if key in obj._task_manager.tasks:
value[key]["value"] = TaskStatus.RUNNING.name
# If the DataService attribute is a property
if isinstance(getattr(obj.__class__, key, None), property):
prop: property = getattr(obj.__class__, key)
value[key]["readonly"] = prop.fset is None
value[key]["doc"] = get_attribute_doc(prop) # overwrite the doc
return {
"name": obj_name,
"type": obj_type,
"value": value,
"readonly": readonly,
"doc": doc,
}
def dump(obj: Any) -> SerializedObject:
return Serializer.serialize_object(obj)
def set_nested_value_by_path(
serialization_dict: dict[str, SerializedObject], path: str, value: Any
) -> None:
"""
Set a value in a nested dictionary structure, which conforms to the serialization
format used by `pydase.utils.serializer.Serializer`, using a dot-notation path.
Args:
serialization_dict:
The base dictionary representing data serialized with
`pydase.utils.serializer.Serializer`.
path:
The dot-notation path (e.g., 'attr1.attr2[0].attr3') indicating where to
set the value.
value:
The new value to set at the specified path.
Note:
- If the index equals the length of the list, the function will append the
serialized representation of the 'value' to the list.
"""
parent_path_parts, attr_name = path.split(".")[:-1], path.split(".")[-1]
current_dict: dict[str, SerializedObject] = serialization_dict
try:
for path_part in parent_path_parts:
next_level_serialized_object = get_next_level_dict_by_key(
current_dict, path_part, allow_append=False
)
current_dict = cast(
dict[str, SerializedObject], next_level_serialized_object["value"]
)
next_level_serialized_object = get_next_level_dict_by_key(
current_dict, attr_name, allow_append=True
)
except (SerializationPathError, SerializationValueError, KeyError) as e:
logger.error(e)
return
if next_level_serialized_object["type"] == "method": # state change of task
next_level_serialized_object["value"] = (
value.name if isinstance(value, Enum) else None
)
else:
serialized_value = dump(value)
keys_to_keep = set(serialized_value.keys())
# TODO: you might also want to pop "doc" from serialized_value if
# it is overwriting the value of the current dict
serialized_value.pop("readonly") # type: ignore
next_level_serialized_object.update(serialized_value)
# removes keys that are not present in the serialized new value
for key in list(next_level_serialized_object.keys()):
if key not in keys_to_keep:
next_level_serialized_object.pop(key, None) # type: ignore
def get_nested_dict_by_path(
serialization_dict: dict[str, SerializedObject],
path: str,
) -> SerializedObject:
parent_path_parts, attr_name = path.split(".")[:-1], path.split(".")[-1]
current_dict: dict[str, SerializedObject] = serialization_dict
for path_part in parent_path_parts:
next_level_serialized_object = get_next_level_dict_by_key(
current_dict, path_part, allow_append=False
)
current_dict = cast(
dict[str, SerializedObject], next_level_serialized_object["value"]
)
return get_next_level_dict_by_key(current_dict, attr_name, allow_append=False)
def get_next_level_dict_by_key(
serialization_dict: dict[str, SerializedObject],
attr_name: str,
*,
allow_append: bool = False,
) -> SerializedObject:
"""
Retrieve a nested dictionary entry or list item from a data structure serialized
with `pydase.utils.serializer.Serializer`.
Args:
serialization_dict: The base dictionary representing serialized data.
attr_name: The key name representing the attribute in the dictionary,
e.g. 'list_attr[0]' or 'attr'
allow_append: Flag to allow appending a new entry if `index` is out of range by
one.
Returns:
The dictionary or list item corresponding to the attribute and index.
Raises:
SerializationPathError: If the path composed of `attr_name` and `index` is
invalid or leads to an IndexError or KeyError.
SerializationValueError: If the expected nested structure is not a dictionary.
"""
# Check if the key contains an index part like 'attr_name[<index>]'
attr_name, index = parse_list_attr_and_index(attr_name)
try:
if index is not None:
next_level_serialized_object = cast(
list[SerializedObject], serialization_dict[attr_name]["value"]
)[index]
else:
next_level_serialized_object = serialization_dict[attr_name]
except IndexError as e:
if (
index is not None
and allow_append
and index
== len(cast(list[SerializedObject], serialization_dict[attr_name]["value"]))
):
# Appending to list
cast(list[SerializedObject], serialization_dict[attr_name]["value"]).append(
{
"value": None,
"type": None,
"doc": None,
"readonly": False,
}
)
next_level_serialized_object = cast(
list[SerializedObject], serialization_dict[attr_name]["value"]
)[index]
else:
raise SerializationPathError(
f"Error occured trying to change '{attr_name}[{index}]': {e}"
)
except KeyError:
raise SerializationPathError(
f"Error occured trying to access the key '{attr_name}': it is either "
"not present in the current dictionary or its value does not contain "
"a 'value' key."
)
if not isinstance(next_level_serialized_object, dict):
raise SerializationValueError(
f"Expected a dictionary at '{attr_name}', but found type "
f"'{type(next_level_serialized_object).__name__}' instead."
)
return next_level_serialized_object
def generate_serialized_data_paths(
data: dict[str, Any], parent_path: str = ""
) -> list[str]:
"""
Generate a list of access paths for all attributes in a dictionary representing
data serialized with `pydase.utils.serializer.Serializer`, excluding those that are
methods. This function handles nested structures, including lists, by generating
paths for each element in the nested lists.
Args:
data (dict[str, Any]): The dictionary representing serialized data, typically
produced by `pydase.utils.serializer.Serializer`.
parent_path (str, optional): The base path to prepend to the keys in the `data`
dictionary to form the access paths. Defaults to an empty string.
Returns:
list[str]: A list of strings where each string is a dot-notation access path
to an attribute in the serialized data. For list elements, the path includes
the index in square brackets.
"""
paths: list[str] = []
for key, value in data.items():
new_path = f"{parent_path}.{key}" if parent_path else key
paths.append(new_path)
if serialized_dict_is_nested_object(value):
if isinstance(value["value"], list):
for index, item in enumerate(value["value"]):
indexed_key_path = f"{new_path}[{index}]"
paths.append(indexed_key_path)
if serialized_dict_is_nested_object(item):
paths.extend(
generate_serialized_data_paths(
item["value"], indexed_key_path
)
)
continue
paths.extend(generate_serialized_data_paths(value["value"], new_path))
return paths
def serialized_dict_is_nested_object(serialized_dict: SerializedObject) -> bool:
return (
serialized_dict["type"] != "Quantity"
and isinstance(serialized_dict["value"], dict)
) or isinstance(serialized_dict["value"], list)

136
tests/client/test_client.py Normal file
View File

@@ -0,0 +1,136 @@
import threading
from collections.abc import Generator
from typing import Any
import pydase
import pytest
from pydase.client.proxy_loader import ProxyAttributeError
@pytest.fixture(scope="session")
def pydase_client() -> Generator[pydase.Client, None, Any]:
class SubService(pydase.DataService):
name = "SubService"
subservice_instance = SubService()
class MyService(pydase.DataService):
def __init__(self) -> None:
super().__init__()
self._name = "MyService"
self._my_property = 12.1
self.sub_service = SubService()
self.list_attr = [1, 2]
self.dict_attr = {
"foo": subservice_instance,
"dotted.key": subservice_instance,
}
@property
def my_property(self) -> float:
return self._my_property
@my_property.setter
def my_property(self, value: float) -> None:
self._my_property = value
@property
def name(self) -> str:
return self._name
def my_method(self, input_str: str) -> str:
return input_str
server = pydase.Server(MyService(), web_port=9999)
thread = threading.Thread(target=server.run, daemon=True)
thread.start()
client = pydase.Client(hostname="localhost", port=9999)
yield client
server.handle_exit()
thread.join()
def test_property(pydase_client: pydase.Client) -> None:
assert pydase_client.proxy.my_property == 12.1
pydase_client.proxy.my_property = 2.1
assert pydase_client.proxy.my_property == 2.1
def test_readonly_property(pydase_client: pydase.Client) -> None:
assert pydase_client.proxy.name == "MyService"
with pytest.raises(ProxyAttributeError):
pydase_client.proxy.name = "Hello"
def test_method_execution(pydase_client: pydase.Client) -> None:
assert pydase_client.proxy.my_method("My return string") == "My return string"
assert (
pydase_client.proxy.my_method(input_str="My return string")
== "My return string"
)
with pytest.raises(TypeError):
pydase_client.proxy.my_method("Something", 2)
with pytest.raises(TypeError):
pydase_client.proxy.my_method(kwarg="hello")
def test_nested_service(pydase_client: pydase.Client) -> None:
assert pydase_client.proxy.sub_service.name == "SubService"
pydase_client.proxy.sub_service.name = "New name"
assert pydase_client.proxy.sub_service.name == "New name"
def test_list(pydase_client: pydase.Client) -> None:
assert pydase_client.proxy.list_attr == [1, 2]
pydase_client.proxy.list_attr.append(1)
assert pydase_client.proxy.list_attr == [1, 2, 1]
pydase_client.proxy.list_attr.extend([123, 2.1])
assert pydase_client.proxy.list_attr == [1, 2, 1, 123, 2.1]
pydase_client.proxy.list_attr.insert(1, 1.2)
assert pydase_client.proxy.list_attr == [1, 1.2, 2, 1, 123, 2.1]
assert pydase_client.proxy.list_attr.pop() == 2.1
assert pydase_client.proxy.list_attr == [1, 1.2, 2, 1, 123]
pydase_client.proxy.list_attr.remove(1.2)
assert pydase_client.proxy.list_attr == [1, 2, 1, 123]
pydase_client.proxy.list_attr[1] = 1337
assert pydase_client.proxy.list_attr == [1, 1337, 1, 123]
pydase_client.proxy.list_attr.clear()
assert pydase_client.proxy.list_attr == []
def test_dict(pydase_client: pydase.Client) -> None:
pydase_client.proxy.dict_attr["foo"].name = "foo"
assert pydase_client.proxy.dict_attr["foo"].name == "foo"
assert pydase_client.proxy.dict_attr["dotted.key"].name == "foo"
# pop will not return anything as the server object was deleted
assert pydase_client.proxy.dict_attr.pop("dotted.key") is None
# pop will remove the dictionary entry on the server
assert list(pydase_client.proxy.dict_attr.keys()) == ["foo"]
def test_tab_completion(pydase_client: pydase.Client) -> None:
# Tab completion gets its suggestions from the __dir__ class method
assert all(
x in pydase_client.proxy.__dir__()
for x in [
"list_attr",
"my_method",
"my_property",
"name",
"sub_service",
]
)

View File

@@ -3,10 +3,9 @@ import logging
import pydase
import pydase.components.device_connection
import pytest
from pytest import LogCaptureFixture
from tests.utils.test_serializer import pytest
logger = logging.getLogger(__name__)

View File

@@ -4,7 +4,7 @@ import pydase
import pydase.components
from pydase.data_service.data_service_observer import DataServiceObserver
from pydase.data_service.state_manager import StateManager
from pydase.utils.serializer import dump
from pydase.utils.serialization.serializer import dump
from pytest import LogCaptureFixture
logger = logging.getLogger(__name__)
@@ -20,7 +20,7 @@ def test_image_functions(caplog: LogCaptureFixture) -> None:
state_manager = StateManager(service_instance)
DataServiceObserver(state_manager)
service_instance.my_image.load_from_url("https://cataas.com/cat")
service_instance.my_image.load_from_url("https://picsum.photos/200")
caplog.clear()
@@ -32,20 +32,24 @@ def test_image_serialization() -> None:
self.my_image = pydase.components.Image()
assert dump(MyService()) == {
"full_access_path": "",
"name": "MyService",
"type": "DataService",
"value": {
"my_image": {
"full_access_path": "my_image",
"name": "Image",
"type": "Image",
"value": {
"format": {
"full_access_path": "my_image.format",
"type": "str",
"value": "",
"readonly": True,
"doc": None,
},
"load_from_base64": {
"full_access_path": "my_image.load_from_base64",
"type": "method",
"value": None,
"readonly": True,
@@ -72,6 +76,7 @@ def test_image_serialization() -> None:
"frontend_render": False,
},
"load_from_matplotlib_figure": {
"full_access_path": "my_image.load_from_matplotlib_figure",
"type": "method",
"value": None,
"readonly": True,
@@ -95,6 +100,7 @@ def test_image_serialization() -> None:
"frontend_render": False,
},
"load_from_path": {
"full_access_path": "my_image.load_from_path",
"type": "method",
"value": None,
"readonly": True,
@@ -112,6 +118,7 @@ def test_image_serialization() -> None:
"frontend_render": False,
},
"load_from_url": {
"full_access_path": "my_image.load_from_url",
"type": "method",
"value": None,
"readonly": True,
@@ -126,6 +133,7 @@ def test_image_serialization() -> None:
"frontend_render": False,
},
"value": {
"full_access_path": "my_image.value",
"type": "str",
"value": "",
"readonly": True,

View File

@@ -1,14 +1,13 @@
import logging
from collections.abc import Callable
import pytest
from pydase.components.number_slider import NumberSlider
from pydase.data_service.data_service import DataService
from pydase.data_service.data_service_observer import DataServiceObserver
from pydase.data_service.state_manager import StateManager
from pytest import LogCaptureFixture
from tests.utils.test_serializer import pytest
logger = logging.getLogger(__name__)

View File

@@ -37,8 +37,7 @@ def test_unexpected_type_change_warning(caplog: LogCaptureFixture) -> None:
def test_basic_inheritance_warning(caplog: LogCaptureFixture) -> None:
class SubService(DataService):
...
class SubService(DataService): ...
class SomeEnum(Enum):
HI = 0
@@ -58,11 +57,9 @@ def test_basic_inheritance_warning(caplog: LogCaptureFixture) -> None:
def name(self) -> str:
return self._name
def some_method(self) -> None:
...
def some_method(self) -> None: ...
async def some_task(self) -> None:
...
async def some_task(self) -> None: ...
ServiceClass()
@@ -135,3 +132,15 @@ def test_exposing_methods() -> None:
@frontend
def some_method(self, *args: Any) -> str:
return "some method"
def test_dynamically_added_attribute(caplog: LogCaptureFixture) -> None:
class MyService(DataService):
pass
service_instance = MyService()
pydase.Server(service_instance)
service_instance.dynamically_added_attr = 1.0
assert ("'dynamically_added_attr' changed to '1.0'") in caplog.text

View File

@@ -240,8 +240,8 @@ def test_load_state(tmp_path: Path, caplog: LogCaptureFixture) -> None:
"Ignoring value from JSON file..."
) in caplog.text
assert (
"Attribute type of 'removed_attr' changed from 'str' to 'None'. "
"Ignoring value from JSON file..." in caplog.text
"Path 'removed_attr' could not be loaded. It does not correspond to an "
"attribute of the class. Ignoring value from JSON file..." in caplog.text
)
assert "Value of attribute 'subservice.name' has not changed..." in caplog.text
assert "'my_slider.value' changed to '1.0'" in caplog.text

View File

@@ -0,0 +1,216 @@
import logging
from typing import Any
import pytest
from pydase.observer_pattern.observable import Observable
from pydase.observer_pattern.observer import Observer
logger = logging.getLogger(__name__)
class MyObserver(Observer):
def on_change(self, full_access_path: str, value: Any) -> None:
logger.info("'%s' changed to '%s'", full_access_path, value)
def test_simple_class_dict_attribute(caplog: pytest.LogCaptureFixture) -> None:
class MyObservable(Observable):
dict_attr = {"first": "Hello"}
instance = MyObservable()
MyObserver(instance)
instance.dict_attr["first"] = "Ciao"
instance.dict_attr["second"] = "World"
assert "'dict_attr[\"first\"]' changed to 'Ciao'" in caplog.text
assert "'dict_attr[\"second\"]' changed to 'World'" in caplog.text
def test_instance_dict_attribute(caplog: pytest.LogCaptureFixture) -> None:
class NestedObservable(Observable):
def __init__(self) -> None:
super().__init__()
self.name = "Hello"
class MyObservable(Observable):
def __init__(self) -> None:
super().__init__()
self.dict_attr = {"first": NestedObservable()}
instance = MyObservable()
MyObserver(instance)
instance.dict_attr["first"].name = "Ciao"
assert "'dict_attr[\"first\"].name' changed to 'Ciao'" in caplog.text
def test_class_dict_attribute(caplog: pytest.LogCaptureFixture) -> None:
class NestedObservable(Observable):
name = "Hello"
class MyObservable(Observable):
dict_attr = {"first": NestedObservable()}
instance = MyObservable()
MyObserver(instance)
instance.dict_attr["first"].name = "Ciao"
assert "'dict_attr[\"first\"].name' changed to 'Ciao'" in caplog.text
def test_nested_dict_instances(caplog: pytest.LogCaptureFixture) -> None:
dict_instance = {"first": "Hello", "second": "World"}
class MyObservable(Observable):
def __init__(self) -> None:
super().__init__()
self.nested_dict_attr = {"nested": dict_instance}
instance = MyObservable()
MyObserver(instance)
instance.nested_dict_attr["nested"]["first"] = "Ciao"
assert "'nested_dict_attr[\"nested\"][\"first\"]' changed to 'Ciao'" in caplog.text
def test_dict_in_list_instance(caplog: pytest.LogCaptureFixture) -> None:
dict_instance = {"first": "Hello", "second": "World"}
class MyObservable(Observable):
def __init__(self) -> None:
super().__init__()
self.dict_in_list = [dict_instance]
instance = MyObservable()
MyObserver(instance)
instance.dict_in_list[0]["first"] = "Ciao"
assert "'dict_in_list[0][\"first\"]' changed to 'Ciao'" in caplog.text
def test_list_in_dict_instance(caplog: pytest.LogCaptureFixture) -> None:
list_instance: list[Any] = [1, 2, 3]
class MyObservable(Observable):
def __init__(self) -> None:
super().__init__()
self.list_in_dict = {"some_list": list_instance}
instance = MyObservable()
MyObserver(instance)
instance.list_in_dict["some_list"][0] = "Ciao"
assert "'list_in_dict[\"some_list\"][0]' changed to 'Ciao'" in caplog.text
def test_key_type_error(caplog: pytest.LogCaptureFixture) -> None:
class MyObservable(Observable):
def __init__(self) -> None:
super().__init__()
self.dict_attr = {1.0: 1.0}
with pytest.raises(ValueError) as exc_info:
MyObservable()
assert (
"Invalid key type: 1.0 (float). In pydase services, dictionary keys must be "
"strings." in str(exc_info)
)
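# Background for the string-key requirement tested above: JSON object keys are
# always strings, so non-string Python keys do not survive a JSON round trip.
# A standard-library demonstration (no pydase involved):
import json
assert json.dumps({1.0: 1.0}) == '{"1.0": 1.0}'
assert json.loads(json.dumps({1.0: 1.0})) == {"1.0": 1.0}  # the float key is gone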
def test_removed_observer_on_class_dict_attr(caplog: pytest.LogCaptureFixture) -> None:
class NestedObservable(Observable):
def __init__(self) -> None:
super().__init__()
self.name = "Hello"
nested_instance = NestedObservable()
class MyObservable(Observable):
nested_attr = nested_instance
changed_dict_attr = {"nested": nested_instance}
instance = MyObservable()
MyObserver(instance)
instance.changed_dict_attr["nested"] = "Ciao"
assert "'changed_dict_attr[\"nested\"]' changed to 'Ciao'" in caplog.text
caplog.clear()
assert nested_instance._observers == {
'["nested"]': [],
"nested_attr": [instance],
}
instance.nested_attr.name = "Hi"
assert "'nested_attr.name' changed to 'Hi'" in caplog.text
assert "'changed_dict_attr[\"nested\"].name' changed to 'Hi'" not in caplog.text
def test_removed_observer_on_instance_dict_attr(
caplog: pytest.LogCaptureFixture,
) -> None:
class NestedObservable(Observable):
def __init__(self) -> None:
super().__init__()
self.name = "Hello"
nested_instance = NestedObservable()
class MyObservable(Observable):
def __init__(self) -> None:
super().__init__()
self.nested_attr = nested_instance
self.changed_dict_attr = {"nested": nested_instance}
instance = MyObservable()
MyObserver(instance)
instance.changed_dict_attr["nested"] = "Ciao"
assert "'changed_dict_attr[\"nested\"]' changed to 'Ciao'" in caplog.text
caplog.clear()
assert nested_instance._observers == {
'["nested"]': [],
"nested_attr": [instance],
}
instance.nested_attr.name = "Hi"
assert "'nested_attr.name' changed to 'Hi'" in caplog.text
assert "'changed_dict_attr[\"nested\"].name' changed to 'Hi'" not in caplog.text
def test_dotted_dict_key(caplog: pytest.LogCaptureFixture) -> None:
class MyObservable(Observable):
def __init__(self) -> None:
super().__init__()
self.dict_attr = {"dotted.key": 1.0}
instance = MyObservable()
MyObserver(instance)
instance.dict_attr["dotted.key"] = "Ciao"
assert "'dict_attr[\"dotted.key\"]' changed to 'Ciao'" in caplog.text
def test_pop(caplog: pytest.LogCaptureFixture) -> None:
class NestedObservable(Observable):
def __init__(self) -> None:
super().__init__()
self.name = "Hello"
nested_instance = NestedObservable()
class MyObservable(Observable):
def __init__(self) -> None:
super().__init__()
self.dict_attr = {"nested": nested_instance}
instance = MyObservable()
MyObserver(instance)
assert instance.dict_attr.pop("nested") == nested_instance
assert nested_instance._observers == {'["nested"]': []}
assert f"'dict_attr' changed to '{instance.dict_attr}'" in caplog.text

View File

@@ -69,66 +69,6 @@ def test_class_object_list_attribute(caplog: pytest.LogCaptureFixture) -> None:
assert "'list_attr[0].name' changed to 'Ciao'" in caplog.text
def test_simple_instance_dict_attribute(caplog: pytest.LogCaptureFixture) -> None:
class MyObservable(Observable):
def __init__(self) -> None:
super().__init__()
self.dict_attr = {"first": "Hello"}
instance = MyObservable()
MyObserver(instance)
instance.dict_attr["first"] = "Ciao"
instance.dict_attr["second"] = "World"
assert "'dict_attr['first']' changed to 'Ciao'" in caplog.text
assert "'dict_attr['second']' changed to 'World'" in caplog.text
def test_simple_class_dict_attribute(caplog: pytest.LogCaptureFixture) -> None:
class MyObservable(Observable):
dict_attr = {"first": "Hello"}
instance = MyObservable()
MyObserver(instance)
instance.dict_attr["first"] = "Ciao"
instance.dict_attr["second"] = "World"
assert "'dict_attr['first']' changed to 'Ciao'" in caplog.text
assert "'dict_attr['second']' changed to 'World'" in caplog.text
def test_instance_dict_attribute(caplog: pytest.LogCaptureFixture) -> None:
class NestedObservable(Observable):
def __init__(self) -> None:
super().__init__()
self.name = "Hello"
class MyObservable(Observable):
def __init__(self) -> None:
super().__init__()
self.dict_attr = {"first": NestedObservable()}
instance = MyObservable()
MyObserver(instance)
instance.dict_attr["first"].name = "Ciao"
assert "'dict_attr['first'].name' changed to 'Ciao'" in caplog.text
def test_class_dict_attribute(caplog: pytest.LogCaptureFixture) -> None:
class NestedObservable(Observable):
name = "Hello"
class MyObservable(Observable):
dict_attr = {"first": NestedObservable()}
instance = MyObservable()
MyObserver(instance)
instance.dict_attr["first"].name = "Ciao"
assert "'dict_attr['first'].name' changed to 'Ciao'" in caplog.text
def test_removed_observer_on_class_list_attr(caplog: pytest.LogCaptureFixture) -> None:
class NestedObservable(Observable):
name = "Hello"
@@ -152,35 +92,6 @@ def test_removed_observer_on_class_list_attr(caplog: pytest.LogCaptureFixture) -
assert "'changed_list_attr[0].name' changed to 'Hi'" not in caplog.text
def test_removed_observer_on_instance_dict_attr(
caplog: pytest.LogCaptureFixture,
) -> None:
class NestedObservable(Observable):
def __init__(self) -> None:
super().__init__()
self.name = "Hello"
nested_instance = NestedObservable()
class MyObservable(Observable):
def __init__(self) -> None:
super().__init__()
self.nested_attr = nested_instance
self.changed_dict_attr = {"nested": nested_instance}
instance = MyObservable()
MyObserver(instance)
instance.changed_dict_attr["nested"] = "Ciao"
assert "'changed_dict_attr['nested']' changed to 'Ciao'" in caplog.text
caplog.clear()
instance.nested_attr.name = "Hi"
assert "'nested_attr.name' changed to 'Hi'" in caplog.text
assert "'changed_dict_attr['nested'].name' changed to 'Hi'" not in caplog.text
def test_removed_observer_on_instance_list_attr(
caplog: pytest.LogCaptureFixture,
) -> None:
@@ -210,78 +121,6 @@ def test_removed_observer_on_instance_list_attr(
assert "'changed_list_attr[0].name' changed to 'Hi'" not in caplog.text
def test_removed_observer_on_class_dict_attr(caplog: pytest.LogCaptureFixture) -> None:
class NestedObservable(Observable):
def __init__(self) -> None:
super().__init__()
self.name = "Hello"
nested_instance = NestedObservable()
class MyObservable(Observable):
def __init__(self) -> None:
super().__init__()
self.nested_attr = nested_instance
self.changed_dict_attr = {"nested": nested_instance}
instance = MyObservable()
MyObserver(instance)
instance.changed_dict_attr["nested"] = "Ciao"
assert "'changed_dict_attr['nested']' changed to 'Ciao'" in caplog.text
caplog.clear()
instance.nested_attr.name = "Hi"
assert "'nested_attr.name' changed to 'Hi'" in caplog.text
assert "'changed_dict_attr['nested'].name' changed to 'Hi'" not in caplog.text
def test_nested_dict_instances(caplog: pytest.LogCaptureFixture) -> None:
dict_instance = {"first": "Hello", "second": "World"}
class MyObservable(Observable):
def __init__(self) -> None:
super().__init__()
self.nested_dict_attr = {"nested": dict_instance}
instance = MyObservable()
MyObserver(instance)
instance.nested_dict_attr["nested"]["first"] = "Ciao"
assert "'nested_dict_attr['nested']['first']' changed to 'Ciao'" in caplog.text
def test_dict_in_list_instance(caplog: pytest.LogCaptureFixture) -> None:
dict_instance = {"first": "Hello", "second": "World"}
class MyObservable(Observable):
def __init__(self) -> None:
super().__init__()
self.dict_in_list = [dict_instance]
instance = MyObservable()
MyObserver(instance)
instance.dict_in_list[0]["first"] = "Ciao"
assert "'dict_in_list[0]['first']' changed to 'Ciao'" in caplog.text
def test_list_in_dict_instance(caplog: pytest.LogCaptureFixture) -> None:
list_instance: list[Any] = [1, 2, 3]
class MyObservable(Observable):
def __init__(self) -> None:
super().__init__()
self.list_in_dict = {"some_list": list_instance}
instance = MyObservable()
MyObserver(instance)
instance.list_in_dict["some_list"][0] = "Ciao"
assert "'list_in_dict['some_list'][0]' changed to 'Ciao'" in caplog.text
def test_list_append(caplog: pytest.LogCaptureFixture) -> None:
class OtherObservable(Observable):
def __init__(self) -> None:

View File

@@ -0,0 +1,21 @@
from typing import Any
from pydase.observer_pattern.observable.observable import Observable
from pydase.observer_pattern.observer.property_observer import PropertyObserver
def test_inherited_property_dependency_resolution() -> None:
class BaseObservable(Observable):
_name = "BaseObservable"
@property
def name(self) -> str:
return self._name
class DerivedObservable(BaseObservable):
_name = "DerivedObservable"
class MyObserver(PropertyObserver):
def on_change(self, full_access_path: str, value: Any) -> None: ...
assert MyObserver(DerivedObservable()).property_deps_dict == {"_name": ["name"]}

View File

@@ -1,68 +0,0 @@
import asyncio
import logging
import pydase
import pytest
from pydase.data_service.data_service_observer import DataServiceObserver
from pydase.data_service.state_manager import StateManager
from pydase.server.web_server.sio_setup import (
RunMethodDict,
UpdateDict,
setup_sio_server,
)
logger = logging.getLogger(__name__)
@pytest.mark.asyncio
async def test_set_attribute_event() -> None:
class SubClass(pydase.DataService):
name = "SubClass"
class ServiceClass(pydase.DataService):
def __init__(self) -> None:
super().__init__()
self.sub_class = SubClass()
def some_method(self) -> None:
logger.info("Triggered 'test_method'.")
service_instance = ServiceClass()
state_manager = StateManager(service_instance)
observer = DataServiceObserver(state_manager)
server = setup_sio_server(observer, False, asyncio.get_running_loop())
test_sid = 1234
test_data: UpdateDict = {
"parent_path": "sub_class",
"name": "name",
"value": "new name",
}
server.handlers["/"]["set_attribute"](test_sid, test_data)
assert service_instance.sub_class.name == "new name"
@pytest.mark.asyncio
async def test_run_method_event(caplog: pytest.LogCaptureFixture):
class ServiceClass(pydase.DataService):
def test_method(self) -> None:
logger.info("Triggered 'test_method'.")
state_manager = StateManager(ServiceClass())
observer = DataServiceObserver(state_manager)
server = setup_sio_server(observer, False, asyncio.get_running_loop())
test_sid = 1234
test_data: RunMethodDict = {
"parent_path": "",
"name": "test_method",
"kwargs": {},
}
server.handlers["/"]["run_method"](test_sid, test_data)
assert "Triggered 'test_method'." in caplog.text

View File

@@ -5,6 +5,7 @@ import pydase.units as u
from pydase.data_service.data_service import DataService
from pydase.data_service.data_service_observer import DataServiceObserver
from pydase.data_service.state_manager import StateManager, load_state
from pydase.utils.serialization.serializer import dump
from pytest import LogCaptureFixture
@@ -70,21 +71,11 @@ def test_set_service_attribute_value_by_path(caplog: LogCaptureFixture) -> None:
DataServiceObserver(state_manager)
state_manager.set_service_attribute_value_by_path(
path="voltage", value=1.0 * u.units.mV
path="voltage", serialized_value=dump(1.0 * u.units.mV)
)
assert "'voltage' changed to '1.0 mV'" in caplog.text
caplog.clear()
state_manager.set_service_attribute_value_by_path(path="voltage", value=2)
assert "'voltage' changed to '2.0 mV'" in caplog.text
caplog.clear()
state_manager.set_service_attribute_value_by_path(
path="voltage", value={"magnitude": 123, "unit": "kV"}
)
assert "'voltage' changed to '123.0 kV'" in caplog.text
def test_autoconvert_offset_to_baseunit() -> None:
import pint

View File

@@ -0,0 +1,143 @@
import enum
from typing import Any
import pydase.components
import pydase.units as u
import pytest
from pydase.utils.serialization.deserializer import loads
from pydase.utils.serialization.serializer import dump
from pydase.utils.serialization.types import SerializedObject
class MyEnum(enum.Enum):
FINISHED = "finished"
RUNNING = "running"
class MyService(pydase.DataService):
name = "MyService"
@pytest.mark.parametrize(
"obj, obj_serialization",
[
(
1,
{
"full_access_path": "",
"type": "int",
"value": 1,
"readonly": False,
"doc": None,
},
),
(
1.0,
{
"full_access_path": "",
"type": "float",
"value": 1.0,
"readonly": False,
"doc": None,
},
),
(
True,
{
"full_access_path": "",
"type": "bool",
"value": True,
"readonly": False,
"doc": None,
},
),
(
u.Quantity(10, "m"),
{
"full_access_path": "",
"type": "Quantity",
"value": {"magnitude": 10, "unit": "meter"},
"readonly": False,
"doc": None,
},
),
(
[1.0],
{
"full_access_path": "",
"value": [
{
"full_access_path": "[0]",
"doc": None,
"readonly": False,
"type": "float",
"value": 1.0,
}
],
"type": "list",
"doc": None,
"readonly": False,
},
),
(
{"key": 1.0},
{
"full_access_path": "",
"value": {
"key": {
"full_access_path": '["key"]',
"doc": None,
"readonly": False,
"type": "float",
"value": 1.0,
}
},
"type": "dict",
"doc": None,
"readonly": False,
},
),
],
)
def test_loads_primitive_types(obj: Any, obj_serialization: SerializedObject) -> None:
assert loads(obj_serialization) == obj
@pytest.mark.parametrize(
"obj, obj_serialization",
[
(
MyEnum.RUNNING,
{
"full_access_path": "",
"value": "RUNNING",
"type": "Enum",
"doc": "MyEnum description",
"readonly": False,
"name": "MyEnum",
"enum": {"RUNNING": "running", "FINISHED": "finished"},
},
),
(
MyService(),
{
"full_access_path": "",
"value": {
"name": {
"full_access_path": "name",
"doc": None,
"readonly": False,
"type": "str",
"value": "MyService",
}
},
"type": "DataService",
"doc": None,
"readonly": False,
"name": "MyService",
},
),
],
)
def test_loads_advanced_types(obj: Any, obj_serialization: SerializedObject) -> None:
assert dump(loads(obj_serialization)) == dump(obj)
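# Round-trip reading of the two tests above: loads() rebuilds a Python object
# from its serialized representation and dump() serializes it again, so for the
# supported types dump(loads(dump(obj))) should equal dump(obj). A short sketch
# using the imports at the top of this file:
assert loads(dump({"key": 1.0})) == {"key": 1.0}
assert dump(loads(dump(MyService()))) == dump(MyService())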

View File

@@ -1,7 +1,7 @@
import asyncio
import enum
from enum import Enum
from typing import Any
from typing import Any, ClassVar
import pydase
import pydase.units as u
@@ -9,12 +9,14 @@ import pytest
from pydase.components.coloured_enum import ColouredEnum
from pydase.data_service.task_manager import TaskStatus
from pydase.utils.decorators import frontend
from pydase.utils.serializer import (
from pydase.utils.serialization.serializer import (
SerializationPathError,
SerializedObject,
dump,
generate_serialized_data_paths,
get_container_item_by_key,
get_data_paths_from_serialized_object,
get_nested_dict_by_path,
get_next_level_dict_by_key,
serialized_dict_is_nested_object,
set_nested_value_by_path,
)
@@ -27,15 +29,63 @@ class MyEnum(enum.Enum):
FINISHED = "finished"
class MySubclass(pydase.DataService):
attr3 = 1.0
list_attr: ClassVar[list[Any]] = [1.0, 1]
some_quantity: u.Quantity = 1.0 * u.units.A
class ServiceClass(pydase.DataService):
attr1 = 1.0
attr2 = MySubclass()
enum_attr = MyEnum.RUNNING
attr_list: ClassVar[list[Any]] = [0, 1, MySubclass()]
dict_attr: ClassVar[dict[Any, Any]] = {"foo": 1.0, "bar": {"foo": "bar"}}
def my_task(self) -> None:
pass
service_instance = ServiceClass()
@pytest.mark.parametrize(
"test_input, expected",
[
(1, {"type": "int", "value": 1, "readonly": False, "doc": None}),
(1.0, {"type": "float", "value": 1.0, "readonly": False, "doc": None}),
(True, {"type": "bool", "value": True, "readonly": False, "doc": None}),
(
1,
{
"full_access_path": "",
"type": "int",
"value": 1,
"readonly": False,
"doc": None,
},
),
(
1.0,
{
"full_access_path": "",
"type": "float",
"value": 1.0,
"readonly": False,
"doc": None,
},
),
(
True,
{
"full_access_path": "",
"type": "bool",
"value": True,
"readonly": False,
"doc": None,
},
),
(
u.Quantity(10, "m"),
{
"full_access_path": "",
"type": "Quantity",
"value": {"magnitude": 10, "unit": "meter"},
"readonly": False,
@@ -82,7 +132,9 @@ def test_enum_serialize() -> None:
assert dump(EnumAttribute())["value"] == {
"some_enum": {
"full_access_path": "some_enum",
"type": "Enum",
"name": "EnumClass",
"value": "FOO",
"enum": {"FOO": "foo", "BAR": "bar"},
"readonly": False,
@@ -91,7 +143,9 @@ def test_enum_serialize() -> None:
}
assert dump(EnumPropertyWithoutSetter())["value"] == {
"some_enum": {
"full_access_path": "some_enum",
"type": "Enum",
"name": "EnumClass",
"value": "FOO",
"enum": {"FOO": "foo", "BAR": "bar"},
"readonly": True,
@@ -100,7 +154,9 @@ def test_enum_serialize() -> None:
}
assert dump(EnumPropertyWithSetter())["value"] == {
"some_enum": {
"full_access_path": "some_enum",
"type": "Enum",
"name": "EnumClass",
"value": "FOO",
"enum": {"FOO": "foo", "BAR": "bar"},
"readonly": False,
@@ -122,7 +178,9 @@ def test_ColouredEnum_serialize() -> None:
CANCELLED = "SlateGray"
assert dump(Status.FAILED) == {
"full_access_path": "",
"type": "ColouredEnum",
"name": "Status",
"value": "FAILED",
"enum": {
"CANCELLED": "SlateGray",
@@ -153,6 +211,7 @@ async def test_method_serialization() -> None:
assert dump(instance)["value"] == {
"some_method": {
"full_access_path": "some_method",
"type": "method",
"value": None,
"readonly": True,
@@ -162,6 +221,7 @@ async def test_method_serialization() -> None:
"frontend_render": False,
},
"some_task": {
"full_access_path": "some_task",
"type": "method",
"value": TaskStatus.RUNNING.name,
"readonly": True,
@@ -187,6 +247,7 @@ def test_methods_with_type_hints() -> None:
pass
assert dump(method_without_type_hint) == {
"full_access_path": "",
"async": False,
"doc": None,
"signature": {
@@ -205,6 +266,7 @@ def test_methods_with_type_hints() -> None:
}
assert dump(method_with_type_hint) == {
"full_access_path": "",
"type": "method",
"value": None,
"readonly": True,
@@ -219,6 +281,7 @@ def test_methods_with_type_hints() -> None:
"frontend_render": False,
}
assert dump(method_with_union_type_hint) == {
"full_access_path": "",
"type": "method",
"value": None,
"readonly": True,
@@ -245,6 +308,7 @@ def test_exposed_function_serialization() -> None:
pass
assert dump(MyService().some_method) == {
"full_access_path": "",
"type": "method",
"value": None,
"readonly": True,
@@ -255,6 +319,7 @@ def test_exposed_function_serialization() -> None:
}
assert dump(some_function) == {
"full_access_path": "",
"type": "method",
"value": None,
"readonly": True,
@@ -282,30 +347,41 @@ def test_list_serialization() -> None:
assert dump(instance)["value"] == {
"list_attr": {
"full_access_path": "list_attr",
"doc": None,
"readonly": False,
"type": "list",
"value": [
{"doc": None, "readonly": False, "type": "int", "value": 1},
{
"full_access_path": "list_attr[0]",
"doc": None,
"readonly": False,
"type": "int",
"value": 1,
},
{
"full_access_path": "list_attr[1]",
"doc": None,
"readonly": False,
"type": "DataService",
"name": "MySubclass",
"value": {
"bool_attr": {
"full_access_path": "list_attr[1].bool_attr",
"doc": None,
"readonly": False,
"type": "bool",
"value": True,
},
"int_attr": {
"full_access_path": "list_attr[1].int_attr",
"doc": None,
"readonly": False,
"type": "int",
"value": 1,
},
"name": {
"full_access_path": "list_attr[1].name",
"doc": None,
"readonly": True,
"type": "str",
@@ -324,24 +400,27 @@ def test_dict_serialization() -> None:
test_dict = {
"int_key": 1,
"float_key": 1.0,
"1.0": 1.0,
"bool_key": True,
"Quantity_key": 1.0 * u.units.s,
"DataService_key": MyClass(),
}
assert dump(test_dict) == {
"full_access_path": "",
"doc": None,
"readonly": False,
"type": "dict",
"value": {
"DataService_key": {
"full_access_path": '["DataService_key"]',
"name": "MyClass",
"doc": None,
"readonly": False,
"type": "DataService",
"value": {
"name": {
"full_access_path": '["DataService_key"].name',
"doc": None,
"readonly": False,
"type": "str",
@@ -350,19 +429,33 @@ def test_dict_serialization() -> None:
},
},
"Quantity_key": {
"full_access_path": '["Quantity_key"]',
"doc": None,
"readonly": False,
"type": "Quantity",
"value": {"magnitude": 1.0, "unit": "s"},
},
"bool_key": {"doc": None, "readonly": False, "type": "bool", "value": True},
"float_key": {
"bool_key": {
"full_access_path": '["bool_key"]',
"doc": None,
"readonly": False,
"type": "bool",
"value": True,
},
"1.0": {
"full_access_path": '["1.0"]',
"doc": None,
"readonly": False,
"type": "float",
"value": 1.0,
},
"int_key": {"doc": None, "readonly": False, "type": "int", "value": 1},
"int_key": {
"full_access_path": '["int_key"]',
"doc": None,
"readonly": False,
"type": "int",
"value": 1,
},
},
}
@@ -383,8 +476,7 @@ def test_derived_data_service_serialization() -> None:
def name(self, value: str) -> None:
self._name = value
class DerivedService(BaseService):
...
class DerivedService(BaseService): ...
base_service_serialization = dump(BaseService())
derived_service_serialization = dump(DerivedService())
@@ -398,20 +490,123 @@ def test_derived_data_service_serialization() -> None:
@pytest.fixture
def setup_dict() -> dict[str, Any]:
class MySubclass(pydase.DataService):
attr3 = 1.0
list_attr = [1.0, 1]
return ServiceClass().serialize()["value"] # type: ignore
class ServiceClass(pydase.DataService):
attr1 = 1.0
attr2 = MySubclass()
enum_attr = MyEnum.RUNNING
attr_list = [0, 1, MySubclass()]
def my_task(self) -> None:
pass
return ServiceClass().serialize()["value"]
@pytest.mark.parametrize(
"serialized_object, attr_name, allow_append, expected",
[
(
dump(service_instance)["value"],
"attr1",
False,
{
"doc": None,
"full_access_path": "attr1",
"readonly": False,
"type": "float",
"value": 1.0,
},
),
(
dump(service_instance.attr_list)["value"],
"[0]",
False,
{
"doc": None,
"full_access_path": "[0]",
"readonly": False,
"type": "int",
"value": 0,
},
),
(
dump(service_instance.attr_list)["value"],
"[3]",
True,
{
# we do not know the full_access_path of this entry within the
# serialized object
"full_access_path": "",
"value": None,
"type": "None",
"doc": None,
"readonly": False,
},
),
(
dump(service_instance.attr_list)["value"],
"[3]",
False,
SerializationPathError,
),
(
dump(service_instance.dict_attr)["value"],
"['foo']",
False,
{
"full_access_path": '["foo"]',
"value": 1.0,
"type": "float",
"doc": None,
"readonly": False,
},
),
(
dump(service_instance.dict_attr)["value"],
"['unset_key']",
True,
{
# we do not know the full_access_path of this entry within the
# serialized object
"full_access_path": "",
"value": None,
"type": "None",
"doc": None,
"readonly": False,
},
),
(
dump(service_instance.dict_attr)["value"],
"['unset_key']",
False,
SerializationPathError,
),
(
dump(service_instance)["value"],
"invalid_path",
True,
{
# we do not know the full_access_path of this entry within the
# serialized object
"full_access_path": "",
"value": None,
"type": "None",
"doc": None,
"readonly": False,
},
),
(
dump(service_instance)["value"],
"invalid_path",
False,
SerializationPathError,
),
],
)
def test_get_container_item_by_key(
serialized_object: dict[str, Any], attr_name: str, allow_append: bool, expected: Any
) -> None:
if isinstance(expected, type) and issubclass(expected, Exception):
with pytest.raises(expected):
get_container_item_by_key(
serialized_object, attr_name, allow_append=allow_append
)
else:
nested_dict = get_container_item_by_key(
serialized_object, attr_name, allow_append=allow_append
)
assert nested_dict == expected
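# Reading of the parametrized cases above in one place: with allow_append=True a
# missing key or index yields an empty "None" placeholder (its full_access_path
# is not yet known), while allow_append=False raises SerializationPathError.
with pytest.raises(SerializationPathError):
    get_container_item_by_key(
        dump(service_instance)["value"], "invalid_path", allow_append=False
    )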
def test_update_attribute(setup_dict: dict[str, Any]) -> None:
@@ -427,6 +622,8 @@ def test_update_nested_attribute(setup_dict: dict[str, Any]) -> None:
def test_update_float_attribute_to_enum(setup_dict: dict[str, Any]) -> None:
set_nested_value_by_path(setup_dict, "attr2.attr3", MyEnum.RUNNING)
assert setup_dict["attr2"]["value"]["attr3"] == {
"full_access_path": "attr2.attr3",
"name": "MyEnum",
"doc": "MyEnum description",
"enum": {"FINISHED": "finished", "RUNNING": "running"},
"readonly": False,
@@ -438,6 +635,7 @@ def test_update_float_attribute_to_enum(setup_dict: dict[str, Any]) -> None:
def test_update_enum_attribute_to_float(setup_dict: dict[str, Any]) -> None:
set_nested_value_by_path(setup_dict, "enum_attr", 1.01)
assert setup_dict["enum_attr"] == {
"full_access_path": "enum_attr",
"doc": None,
"readonly": False,
"type": "float",
@@ -447,6 +645,7 @@ def test_update_enum_attribute_to_float(setup_dict: dict[str, Any]) -> None:
def test_update_task_state(setup_dict: dict[str, Any]) -> None:
assert setup_dict["my_task"] == {
"full_access_path": "my_task",
"async": False,
"doc": None,
"frontend_render": False,
@@ -457,6 +656,7 @@ def test_update_task_state(setup_dict: dict[str, Any]) -> None:
}
set_nested_value_by_path(setup_dict, "my_task", TaskStatus.RUNNING)
assert setup_dict["my_task"] == {
"full_access_path": "my_task",
"async": False,
"doc": None,
"frontend_render": False,
@@ -469,13 +669,15 @@ def test_update_task_state(setup_dict: dict[str, Any]) -> None:
def test_update_list_entry(setup_dict: dict[str, SerializedObject]) -> None:
set_nested_value_by_path(setup_dict, "attr_list[1]", 20)
assert setup_dict["attr_list"]["value"][1]["value"] == 20
assert setup_dict["attr_list"]["value"][1]["value"] == 20 # type: ignore # noqa
def test_update_list_append(setup_dict: dict[str, SerializedObject]) -> None:
set_nested_value_by_path(setup_dict, "attr_list[3]", MyEnum.RUNNING)
assert setup_dict["attr_list"]["value"][3] == {
assert setup_dict["attr_list"]["value"][3] == { # type: ignore
"full_access_path": "attr_list[3]",
"doc": "MyEnum description",
"name": "MyEnum",
"enum": {"FINISHED": "finished", "RUNNING": "running"},
"readonly": False,
"type": "Enum",
@@ -488,50 +690,19 @@ def test_update_invalid_list_index(
) -> None:
set_nested_value_by_path(setup_dict, "attr_list[10]", 30)
assert (
"Error occured trying to change 'attr_list[10]': list index "
"out of range" in caplog.text
)
def test_update_invalid_path(
setup_dict: dict[str, Any], caplog: pytest.LogCaptureFixture
) -> None:
set_nested_value_by_path(setup_dict, "invalid_path", 30)
assert (
"Error occured trying to access the key 'invalid_path': it is either "
"not present in the current dictionary or its value does not contain "
"a 'value' key." in caplog.text
"Error occured trying to change 'attr_list[10]': Index '10': list index out of "
"range" in caplog.text
)
def test_update_list_inside_class(setup_dict: dict[str, Any]) -> None:
set_nested_value_by_path(setup_dict, "attr2.list_attr[1]", 40)
assert setup_dict["attr2"]["value"]["list_attr"]["value"][1]["value"] == 40
assert setup_dict["attr2"]["value"]["list_attr"]["value"][1]["value"] == 40 # noqa
def test_update_class_attribute_inside_list(setup_dict: dict[str, Any]) -> None:
set_nested_value_by_path(setup_dict, "attr_list[2].attr3", 50)
assert setup_dict["attr_list"]["value"][2]["value"]["attr3"]["value"] == 50
def test_get_next_level_attribute_nested_dict(setup_dict: dict[str, Any]) -> None:
nested_dict = get_next_level_dict_by_key(setup_dict, "attr1")
assert nested_dict == setup_dict["attr1"]
def test_get_next_level_list_entry_nested_dict(setup_dict: dict[str, Any]) -> None:
nested_dict = get_next_level_dict_by_key(setup_dict, "attr_list[0]")
assert nested_dict == setup_dict["attr_list"]["value"][0]
def test_get_next_level_invalid_path_nested_dict(setup_dict: dict[str, Any]) -> None:
with pytest.raises(SerializationPathError):
get_next_level_dict_by_key(setup_dict, "invalid_path")
def test_get_next_level_invalid_list_index(setup_dict: dict[str, Any]) -> None:
with pytest.raises(SerializationPathError):
get_next_level_dict_by_key(setup_dict, "attr_list[10]")
assert setup_dict["attr_list"]["value"][2]["value"]["attr3"]["value"] == 50 # noqa
def test_get_attribute(setup_dict: dict[str, Any]) -> None:
@@ -666,3 +837,228 @@ def test_serialized_dict_is_nested_object() -> None:
assert not serialized_dict_is_nested_object(serialized_dict["unit"])
assert not serialized_dict_is_nested_object(serialized_dict["float"])
assert not serialized_dict_is_nested_object(serialized_dict["state"])
class MyService(pydase.DataService):
name = "MyService"
@pytest.mark.parametrize(
"test_input, expected",
[
(
1,
{
"new_attr": {
"full_access_path": "new_attr",
"type": "int",
"value": 1,
"readonly": False,
"doc": None,
}
},
),
(
1.0,
{
"new_attr": {
"full_access_path": "new_attr",
"type": "float",
"value": 1.0,
"readonly": False,
"doc": None,
},
},
),
(
True,
{
"new_attr": {
"full_access_path": "new_attr",
"type": "bool",
"value": True,
"readonly": False,
"doc": None,
},
},
),
(
u.Quantity(10, "m"),
{
"new_attr": {
"full_access_path": "new_attr",
"type": "Quantity",
"value": {"magnitude": 10, "unit": "meter"},
"readonly": False,
"doc": None,
},
},
),
(
MyEnum.RUNNING,
{
"new_attr": {
"full_access_path": "new_attr",
"value": "RUNNING",
"type": "Enum",
"doc": "MyEnum description",
"readonly": False,
"name": "MyEnum",
"enum": {"RUNNING": "running", "FINISHED": "finished"},
}
},
),
(
[1.0],
{
"new_attr": {
"full_access_path": "new_attr",
"value": [
{
"full_access_path": "new_attr[0]",
"doc": None,
"readonly": False,
"type": "float",
"value": 1.0,
}
],
"type": "list",
"doc": None,
"readonly": False,
}
},
),
(
{"key": 1.0},
{
"new_attr": {
"full_access_path": "new_attr",
"value": {
"key": {
"full_access_path": 'new_attr["key"]',
"doc": None,
"readonly": False,
"type": "float",
"value": 1.0,
}
},
"type": "dict",
"doc": None,
"readonly": False,
}
},
),
(
MyService(),
{
"new_attr": {
"full_access_path": "new_attr",
"value": {
"name": {
"full_access_path": "new_attr.name",
"doc": None,
"readonly": False,
"type": "str",
"value": "MyService",
}
},
"type": "DataService",
"doc": None,
"readonly": False,
"name": "MyService",
}
},
),
],
)
def test_dynamically_add_attributes(test_input: Any, expected: dict[str, Any]) -> None:
serialized_object: dict[str, SerializedObject] = {}
set_nested_value_by_path(serialized_object, "new_attr", test_input)
assert serialized_object == expected
@pytest.mark.parametrize(
"obj, expected",
[
(
service_instance.attr2,
[
"attr3",
"list_attr",
"list_attr[0]",
"list_attr[1]",
"some_quantity",
],
),
(
service_instance.dict_attr,
[
'["foo"]',
'["bar"]',
'["bar"]["foo"]',
],
),
(
service_instance.attr_list,
[
"[0]",
"[1]",
"[2]",
"[2].attr3",
"[2].list_attr",
"[2].list_attr[0]",
"[2].list_attr[1]",
"[2].some_quantity",
],
),
],
)
def test_get_data_paths_from_serialized_object(obj: Any, expected: list[str]) -> None:
assert get_data_paths_from_serialized_object(dump(obj=obj)) == expected
@pytest.mark.parametrize(
"obj, expected",
[
(
service_instance,
[
"attr1",
"attr2",
"attr2.attr3",
"attr2.list_attr",
"attr2.list_attr[0]",
"attr2.list_attr[1]",
"attr2.some_quantity",
"attr_list",
"attr_list[0]",
"attr_list[1]",
"attr_list[2]",
"attr_list[2].attr3",
"attr_list[2].list_attr",
"attr_list[2].list_attr[0]",
"attr_list[2].list_attr[1]",
"attr_list[2].some_quantity",
"dict_attr",
'dict_attr["foo"]',
'dict_attr["bar"]',
'dict_attr["bar"]["foo"]',
"enum_attr",
"my_task",
],
),
(
service_instance.attr2,
[
"attr3",
"list_attr",
"list_attr[0]",
"list_attr[1]",
"some_quantity",
],
),
],
)
def test_generate_serialized_data_paths(obj: Any, expected: list[str]) -> None:
assert generate_serialized_data_paths(dump(obj=obj)["value"]) == expected
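# Path grammar visible in the expected lists above: attribute access joins with
# dots, list items append [index], and dict entries append ["key"].
paths = generate_serialized_data_paths(dump(obj=service_instance)["value"])
assert "attr_list[2].attr3" in paths
assert 'dict_attr["bar"]["foo"]' in paths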

View File

@@ -1,10 +1,113 @@
from typing import Any
import pydase
import pytest
from pydase.utils.helpers import (
get_object_by_path_parts,
get_path_from_path_parts,
is_property_attribute,
parse_full_access_path,
parse_serialized_key,
)
@pytest.mark.parametrize(
"serialized_key, expected",
[
("attr_name", "attr_name"),
("[0]", 0),
("[0.0]", 0.0),
('["some_key"]', "some_key"),
('["12.34"]', "12.34"),
],
)
def test_parse_serialized_key(serialized_key: str, expected: str) -> None:
assert parse_serialized_key(serialized_key) == expected
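# Hypothetical sketch of the parsing rule exercised above (not the library's
# implementation): bracketed tokens become int, float, or quoted-string values,
# anything else is treated as a plain attribute name.
def parse_serialized_key_sketch(token: str) -> str | int | float:
    if not (token.startswith("[") and token.endswith("]")):
        return token  # plain attribute name
    inner = token[1:-1]
    if inner.startswith('"') and inner.endswith('"'):
        return inner[1:-1]  # quoted dict key stays a string, even "12.34"
    return float(inner) if "." in inner else int(inner)  # list index / numeric key
assert parse_serialized_key_sketch('["12.34"]') == "12.34"
assert parse_serialized_key_sketch("[0.0]") == 0.0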
@pytest.mark.parametrize(
"full_access_path, expected",
[
("attr_name", ["attr_name"]),
("parent.attr_name", ["parent", "attr_name"]),
("nested.parent.attr_name", ["nested", "parent", "attr_name"]),
("nested.parent.attr_name", ["nested", "parent", "attr_name"]),
("attr_name[0]", ["attr_name", "[0]"]),
("parent.attr_name[0]", ["parent", "attr_name", "[0]"]),
("attr_name[0][1]", ["attr_name", "[0]", "[1]"]),
('attr_name[0]["some_key"]', ["attr_name", "[0]", '["some_key"]']),
(
'dict_attr["some_key"].attr_name["other_key"]',
["dict_attr", '["some_key"]', "attr_name", '["other_key"]'],
),
("dict_attr[2.1]", ["dict_attr", "[2.1]"]),
],
)
def test_parse_full_access_path(full_access_path: str, expected: list[str]) -> None:
assert parse_full_access_path(full_access_path) == expected
@pytest.mark.parametrize(
"path_parts, expected",
[
(["attr_name"], "attr_name"),
(["parent", "attr_name"], "parent.attr_name"),
(["nested", "parent", "attr_name"], "nested.parent.attr_name"),
(["nested", "parent", "attr_name"], "nested.parent.attr_name"),
(["attr_name", "[0]"], "attr_name[0]"),
(["parent", "attr_name", "[0]"], "parent.attr_name[0]"),
(["attr_name", "[0]", "[1]"], "attr_name[0][1]"),
(["attr_name", "[0]", '["some_key"]'], 'attr_name[0]["some_key"]'),
(
["dict_attr", '["some_key"]', "attr_name", '["other_key"]'],
'dict_attr["some_key"].attr_name["other_key"]',
),
(["dict_attr", "[2.1]"], "dict_attr[2.1]"),
],
)
def test_get_path_from_path_parts(path_parts: list[str], expected: str) -> None:
assert get_path_from_path_parts(path_parts) == expected
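# For every row in the two tables above the helpers are inverses of each other:
# joining the parsed parts reproduces the original access path.
path = 'dict_attr["some_key"].attr_name["other_key"]'
assert get_path_from_path_parts(parse_full_access_path(path)) == path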
class SubService(pydase.DataService):
name = "SubService"
some_int = 1
some_float = 1.0
class MyService(pydase.DataService):
def __init__(self) -> None:
super().__init__()
self.some_float = 1.0
self.subservice = SubService()
self.list_attr = [1.0, SubService()]
self.dict_attr = {"foo": SubService(), "dotted.key": "float_as_key"}
service_instance = MyService()
@pytest.mark.parametrize(
"path_parts, expected",
[
(["some_float"], service_instance.some_float),
(["subservice"], service_instance.subservice),
(["list_attr", "[0]"], service_instance.list_attr[0]),
(["list_attr", "[1]"], service_instance.list_attr[1]),
(["dict_attr", '["foo"]'], service_instance.dict_attr["foo"]),
(["dict_attr", '["foo"]', "name"], service_instance.dict_attr["foo"].name), # type: ignore
(["dict_attr", '["dotted.key"]'], service_instance.dict_attr["dotted.key"]),
],
)
def test_get_object_by_path_parts(path_parts: list[str], expected: Any) -> None:
assert get_object_by_path_parts(service_instance, path_parts) == expected
def test_get_object_by_path_parts_error(caplog: pytest.LogCaptureFixture) -> None:
assert get_object_by_path_parts(service_instance, ["non_existent_attr"]) is None
assert "Attribute 'non_existent_attr' does not exist in the object." in caplog.text
@pytest.mark.parametrize(
"attr_name, expected",
[
@@ -12,13 +115,29 @@ from pydase.utils.helpers import (
("my_property", True),
("my_method", False),
("non_existent_attr", False),
("nested_class_instance", False),
("nested_class_instance.my_property", True),
("list_attr", False),
("list_attr[0]", False),
("list_attr[0].my_property", True),
("dict_attr", False),
("dict_attr['foo']", False),
("dict_attr['foo'].my_property", True),
],
)
def test_is_property_attribute(attr_name: str, expected: bool) -> None:
class NestedClass:
@property
def my_property(self) -> str:
return "I'm a nested property"
# Test Suite
class DummyClass:
def __init__(self) -> None:
self.regular_attribute = "I'm just an attribute"
self.nested_class_instance = NestedClass()
self.list_attr = [NestedClass()]
self.dict_attr = {"foo": NestedClass()}
@property
def my_property(self) -> str: