67 Commits
tmp ... flowsas

Author SHA1 Message Date
a7fd90cd6d flowsas project as of 2025-04-14 2025-04-14 11:40:12 +02:00
adfb561308 add cetoni pump for flowsas project 2024-04-03 11:31:38 +02:00
70a31b5cae attocube: remove separate pyanc350 python module
- implement positioner as an io class
- proper shut down behaviour

Change-Id: If04176f779809fd5b08f586556cac668cf188479
2024-03-28 10:12:18 +01:00
8ee97ade63 adjust cfg for attocube to current example 2024-03-27 17:13:58 +01:00
1715f95dd4 frappy_psi.attocube: add lock protection to hw access
in order to avoid sporadic timeout problems

Change-Id: I36f67ae72b65e9c1f3179cae942b0a7d94584e55
2024-03-27 17:08:24 +01:00
db29776dd5 reworked attocube
- step_mode: soft closed loop, stepwise, reading encoder after a delay
- calib_steps command to determine step size

Change-Id: I27bdffb4d564ac9c55a6473704ac2de6ad92bac8
2024-03-25 16:47:13 +01:00
a2905d9fbc improve attocube driver
- driving in an extra thread, hoping not to miss end-of-travel
  status bits (does not always work)
- maxtry parameter for trying several times

TODO: move by step (in another thread)
Change-Id: I89b51d1f6926f2fd26ec22d43aede377b5231583
2024-03-22 14:38:23 +01:00
16b826394f fixes for attocube
Change-Id: Id5eeb749ba010fec59d1c2f8a3258ee34a47e246
2024-03-20 16:59:04 +01:00
ea8570d422 wip: fix attocube 2024-03-20 16:12:03 +01:00
1169e0cd09 improve sea interface
Change-Id: I58fb4b10ef9466f90e4cd58b6c67bcfb11c493e3
2024-03-08 15:59:16 +01:00
7d02498b3d improve async behaviour of parmod.Driv
Change-Id: I3889614a0deaba4ef13b86c6600b6f96bc502a39
2024-03-08 15:58:17 +01:00
694b121c01 fix more stop doc strings
Change-Id: Id7ea0a6d0c959e980beee8fbea73932c701977d7
2024-03-08 15:38:34 +01:00
0f50de9a7f fix command doc string handling and change default stop doc string
- fix inheritance of command description
- when no stop method is given, the description should indicate
  that stop is a no-op -> add missing doc strings to stop methods
- add test to make sure stop command doc strings are given
  when implemented

Change-Id: If891359350e8dcdec39a706841d61d4f8ec8926f
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/33266
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
Reviewed-by: Alexander Zaft <a.zaft@fz-juelich.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2024-03-08 15:34:08 +01:00
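
As a minimal sketch of the convention this commit introduces (module name, class and hardware call are illustrative assumptions; only the rule that an implemented stop method must carry a doc string comes from the commit):

from frappy.core import Drivable

class Motor(Drivable):
    """hypothetical motor module, for illustration only"""

    def stop(self):
        """halt the movement at the current position"""
        # hypothetical hardware call; the commit only mandates the doc string above
        self._hw_halt()
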
b454f47a12 fix docstring in frappy.error.OutOfRangeError
Change-Id: I006c061a5d88ac7c97808efd56faece927916e78
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/33183
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2024-03-08 15:34:02 +01:00
6e7be6b4c7 simplify callbacks
on Module, use a single callback list 'paramsCallback' instead of
'valueCallbacks' and 'errorCallbacks'. Redesign the mechanism to
avoid most of the closures.

Change-Id: Ie7f68f6bf97ab3f3cd961faa20b0e77730e5b37d
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/33118
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
2024-03-08 15:33:52 +01:00
af28511403 fixes for proxy modules
- for the case when the remote module name does not match,
  'read', 'change' and 'do' do not work
- a proxy to an IO class has enablePoll == False, but it needs
  a triggerPoll for modules relying on it to work
- a proxy on a communicator module has a status even when the
  remote does not - this needs 2 fixes

Change-Id: Icd44da4c2984f27ce7147dec633739f9176012ec
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/33168
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2024-03-08 15:33:45 +01:00
9d9d31693b bugfix in automatic creation of attached io
srv.modules no longer exists

Change-Id: Ibc52fe35f27ad110e60947702d97ee40f359b7c5
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/33167
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2024-03-08 15:33:39 +01:00
3a7fff713d move StructParam to frappy/extparams.py
+ typos and fixes in doc strings

Change-Id: Ib3e9add84ce2a6fb5c33770cae7f2da3f5655506
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/33033
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2024-02-26 13:43:10 +01:00
2acab33faa add FloatEnumParam
for the use case where a parameter is a selection of discrete
float values

declaring a parameter as FloatEnumParam will effectively create
two parameters, one with datatype FloatRange and another with
datatype Enum, influencing each other automatically.

in a later change StructParam should be moved from
frappy/structparam.py to frappy/extparams.py

Change-Id: Ica3fd8dcaf6e9439e8178390f220dec15e52cc86
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/32975
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2024-02-26 13:43:10 +01:00
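
A minimal usage sketch following the doc string added with this change (the surrounding module and the hardware access methods are illustrative assumptions; FloatEnumParam and the import path frappy/extparams.py come from this branch):

from frappy.core import Readable
from frappy.extparams import FloatEnumParam

class Sensor(Readable):
    # creates 'vrange' (FloatRange 5e-4 .. 1, unit 'V') plus 'vrange_idx'
    # (enum 500uV/20mV/1V); write_vrange is generated automatically
    vrange = FloatEnumParam('sensor range', ['500uV', '20mV', '1V'], 'V')

    def read_vrange_idx(self):
        return self._hw_get_range()      # hypothetical hardware query returning 0, 1 or 2

    def write_vrange_idx(self, value):
        self._hw_set_range(int(value))   # hypothetical hardware command
        return value
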
8c589cc138 simulation: extra_params might be a list
- still accept comma separated string
- remove legacy naming '.extra_params'

Change-Id: I497cf7722d0b39dd31c516383449a4cc4e7dcb7d
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/32968
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Alexander Zaft <a.zaft@fz-juelich.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2024-02-26 13:43:10 +01:00
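
A sketch of what this allows in a cfg file (the module name and class path are assumptions; only the handling of extra_params as a list or comma separated string comes from the commit):

Mod('t_sim',
    'frappy.simulation.SimDrivable',
    'simulated temperature',
    extra_params = ['ramp', 'setpoint'],   # now a real list ...
    # extra_params = 'ramp, setpoint',     # ... a comma separated string is still accepted
)
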
2b42e3fa0a flamesample: improve comments to parmod.SwitchDriv 2024-02-23 10:13:31 +01:00
5b0da3ba98 fixes on 2023-11-27
- ls372 autorange: wait one more second for switching
- keep only one channel, even after target is reached
- intermediate target only when T is raised, but not when lowered
2024-02-23 10:13:05 +01:00
c80b4ac5fb fixes for flamesample
- fixes in SwitchDrive
- increase range when reading is zero in autorange
- add debugging log msgs
2024-02-23 10:10:38 +01:00
8cb9154bb5 flamesample: use odd fraction for limits
workaround for problems when driving exactly to the limit
2024-02-23 10:10:29 +01:00
813d1b76ef remove 1K plate heater configuration
this heater does not exist
2024-02-19 12:47:52 +01:00
183709b7ce frappy_mlz seop: add count to ampl and phase cmds
Change-Id: Id7faca31269bb481ec4010f1e0aec2591f0299d6
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/32030
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Alexander Zaft <a.zaft@fz-juelich.de>
2024-02-02 14:25:27 +01:00
2cdf1fc58e fix import order in entangle.py
import order

Change-Id: I54450bb64cb5cbea3d29072b6095b8b9e3962aa6
2024-02-02 14:21:44 +01:00
ffaa9c83bd introduce FrozenParam
For the case when the readback of a parameter does not reflect the
change immediately.

May also be used on Writable.target or Drivable.target with a short
BUSY period.

+ bug fix in an error message in frappy.datatypes.IntRange

Change-Id: I5e1c871629f9e3940ae80f35cb6307f404202b4a
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/31981
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2024-02-02 14:11:25 +01:00
f9a0fdf7e4 add frappy_psi/calcurve.py and calibtest.py
Change-Id: I5b9aae0ac3bcd76d846c08717201e6c32df4b675
2024-01-31 17:09:17 +01:00
7dfb2ff4e3 logdif.py: only one commit needs to be new enough
+ fix indentation

Change-Id: I4b0c393767925532e1f105e80a215839a02214af
2024-01-29 15:48:43 +01:00
84c0017c03 synced most of wip to mlz
Change-Id: Ifc5eb0d8ccf693535ab474553759f5622b3a3c8f
2024-01-29 14:31:13 +01:00
2126956160 adopt missing changes while cherry-picking from mlz
Change-Id: Icda4d581e8f0ebd22fc22f2661965bf98a821a34
2024-01-29 14:29:23 +01:00
4cdd3b0709 remove more coding cookies
mainly from frappy_psi

Change-Id: I192811459aebe97f3076888cd31a308a51e6aa49
2024-01-29 14:14:09 +01:00
15d38d7cc1 all: remove coding cookies
Change-Id: I53a4d79c3ebc50b8aed43a5ef1fa6538f8059a47
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/32251
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
Reviewed-by: Alexander Zaft <a.zaft@fz-juelich.de>
2024-01-29 14:06:06 +01:00
9904d31f0b [deb] Release v0.18.1 2024-01-29 13:51:25 +01:00
b07d2ae8a3 mlz: entangle fix limit check
Change-Id: Ib430262057026054ac71053d25dfda340b48227a
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/32921
Reviewed-by: Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
Tested-by: Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
2024-01-29 13:51:20 +01:00
7d7cb02f17 mlz: Zapf fix unit handling and small errors
Change-Id: Iaa5ed175582d8399cc0c69ba72c3ab8e6e51ecf6
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/32920
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Tested-by: Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
Reviewed-by: Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
2024-01-29 13:51:15 +01:00
1017925ca0 [deb] Release v0.18.0 2024-01-29 13:51:10 +01:00
bb14d02884 bug fix in frappy.io.BytesIO.checkHWIdent
missing f for f string

Change-Id: Ie67384e5b7e514728041a72bd08c850abb31639e
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/32786
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Alexander Zaft <a.zaft@fz-juelich.de>
2024-01-29 13:51:04 +01:00
4c499cf048 remove py35 compatibility code
as f-strings are heavily used now, compatibility to py35
can be removed

Change-Id: I1ae4912ad4cbde8419b74845217943bd061053f3
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/32754
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2024-01-29 13:50:50 +01:00
e403396941 fix playground after change 32249
as modules are now stored on secnode instead of dispatcher

Change-Id: Iccda3d97269693a893c06a4e094a9c1dbcf7df0b
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/32746
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2024-01-29 13:50:00 +01:00
5b42df4a5e frappy.secnode: fix strange error message
when get_module_instance is called a second time after
it failed, the 'cls' element in opts is missing:

move opts dict copy from create_modules to get_module_instance

Change-Id: Ie046f133a8fdbbb1c39643ca16dc5447a9d2d065
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/32745
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Alexander Zaft <a.zaft@fz-juelich.de>
Reviewed-by: Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2024-01-29 13:49:55 +01:00
841ef224f6 modify arguments of Dispatcher.announce_update
- 'pname' argument is not needed
- change 'modulename' argument to 'moduleobj'
  (needed for further change)

Change-Id: Ib21f8ad06d9b2be4005ff3513088a85e29785c94
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/32744
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2024-01-29 13:49:39 +01:00
8142ba746d fix missing import in change message
Transported values in a change must be converted first.
As this is only relevant for the exotic "scaled" and "blob"
datatypes, this was not detected yet.

- add tests
- suppress warning PytestUnhandledThreadExceptionWarning in tests
+ change import_value methods to raise no other exceptions than
  WrongTypeError and RangeError
+ simplify Command.do: as import_value already raises the
  appropriate error, no more try/except is needed

Change-Id: I299e511468dc0fcecff4c20cf8a917da38b70786
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/32743
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
Reviewed-by: Alexander Zaft <a.zaft@fz-juelich.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2024-01-29 13:49:30 +01:00
5358412b7a core: better error on export of internal type
more descriptive error when trying to export OrType, NoneOr, ValueType
and DataTypeType

Change-Id: If13815e9d2b177042b24a1bb62b1ad1d1d88b502
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/32737
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Alexander Zaft <a.zaft@fz-juelich.de>
2024-01-29 13:49:21 +01:00
010f0747e1 frappy_psi.sea: workaround for bug in sea
hdb path should not contain a double slash. Replace double slashes
with a single slash.

Change-Id: Ia2ce3be9a75d68fcc7efe3eb3dbd19a7907a73ff
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/32705
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2024-01-29 13:49:08 +01:00
047c52b5a5 core: better command handling
* check argument of do
* automatically set optional struct members from function signature

Change-Id: I95684f1826c1318ea92fad2bd4c9681d85ea72f5
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/32501
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Alexander Zaft <a.zaft@fz-juelich.de>
2024-01-29 13:48:37 +01:00
f846c5cb31 datatypes: fix optional struct export
Change-Id: Ia2758dfba75f36a91bf1676e8ead555cec3ead53
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/32500
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
Reviewed-by: Alexander Zaft <a.zaft@fz-juelich.de>
2024-01-29 13:47:18 +01:00
0e4a427bc3 mlz: handle unconfigured abslimits
- if there are no abslimits configured, get them from the hardware.
- check if the ranges are compatible

Change-Id: If72e31a56c299cb628ed8ff66d4340a87d4bd1d4
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/32625
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Alexander Zaft <a.zaft@fz-juelich.de>
2024-01-29 13:47:09 +01:00
2d8b609a3c core: formatting and update server docstring
Change-Id: Ic0dd4c5239f27679c89f6b3742b9c5f8b71f33f6
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/32514
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Alexander Zaft <a.zaft@fz-juelich.de>
2024-01-29 13:47:00 +01:00
6e3865b345 core: allow multiple interfaces
Change-Id: Ib8c0baef85a6dd69cddafe1c4973e42136d1588b
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/32489
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
Reviewed-by: Alexander Zaft <a.zaft@fz-juelich.de>
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
2024-01-29 13:46:47 +01:00
0004dc7620 implement pfeiffer TPG vacuum reading
this is an example where StringIO.communicate has to be extended

Change-Id: Iff6bb426ee7960904993574531de84793152e21d
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/32385
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2024-01-29 13:36:18 +01:00
158477792f add StringIO.writeline, improve StringIO.multicomm
- StringIO.writeline sends a command and does not expect a reply
- StringIO.multicomm and BytesIO.multicomm are improved in order
  to allow individual delays between lines and individual
  noreply flags

+ fix a bug in tutorial_t_control
+ improve readability of frappy.lib.classdoc.indent_description

Change-Id: I9dea113e19147684ec41aca5267a79816bbf202c
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/32267
Reviewed-by: Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2024-01-29 13:34:45 +01:00
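
A usage sketch based on the signatures added in this change (the io object and the concrete command strings are made-up; the tuple layout (command, reply expected, delay in seconds) is the one declared for multicomm):

# send a command without expecting a reply
io.writeline('*RST')

# several requests in one go, with an individual delay and noreply flag per line
replies = io.multicomm([
    ('CHAN 3', False, 0.5),   # select a channel, no reply expected, wait 0.5 s
    ('READ?', True, 0),       # then read the selected channel
])
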
fd0e762d18 doc: drop latex support, add pdf support
latexpdf fails with the error message "Too deeply nested".
We want to avoid reducing the nesting level of doc strings
in frappy.lib.classdoc (less nice output) or removing a level of
nesting in method doc strings.

- latex removed from Jenkinsfile
- added support for rst2pdf

Change-Id: Ieb3355ef506e636e7e43a726c68327e3b1154469
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/32406
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2024-01-29 13:34:39 +01:00
a16ec6cc91 mlz/demo: move old examples to Attached
change very early version of module attachments in GarfieldMagnet and
MagnetigField to use Attached

Change-Id: I616ad17bc72cd93d86e1b3e3609543cfe90edcd8
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/32250
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
Reviewed-by: Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
2024-01-29 13:34:29 +01:00
777a2cb6a9 core: move module handling out of dispatcher
Split module handling code from the dispatcher.
The new class for managing Modules is called SecNode.

* change logging to no longer need a reference to modobj
* modules get a reference to the secnode obj instead of the
  dispatcher
* intermediate usage fixes for frappy_psi/sea

Change-Id: Ifee4bb47aa7a4508bb4a47c9a5873b7e2d5faf67
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/32249
Reviewed-by: Alexander Zaft <a.zaft@fz-juelich.de>
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
2024-01-29 13:33:47 +01:00
cb3e98f86d added logdif.py
a tool to compare commits in branches

Change-Id: I503941b76bb567ea4c3d33b986406a910154fda6
2024-01-29 12:58:45 +01:00
a8bafde64e add test cfg for lockin 7270
Change-Id: Ic6d66b625feb44e8130266901f1296adcb11f532
2024-01-29 10:58:21 +01:00
36c512d50b frappy_psi/SR.py: got changes from develop branch
+ move soft auto range from read_value to doPoll

Change-Id: I0bf8ac15f8515e55cd9131be33615908ffc99c12
2024-01-29 10:29:33 +01:00
17b7a01ce1 Driver for ThermoHaake Phoenix P1 Circulator
Change-Id: I0573eeac2e40b4715072661c819701186733bf94
2024-01-29 10:29:33 +01:00
be66faa591 frappy_psi/thermofisher: version through gerrit
Change-Id: I6999e84d1c5efd0625c6df89e97dad46e5a8cd59
2024-01-29 10:29:33 +01:00
e27b4f72b5 newest version of Oksana's drivers
Change-Id: Ia6d8b727e48e96a14b75feeef5d3e6c002cb82a0
2024-01-29 10:29:33 +01:00
bc7922f5c8 iono pi max demo (drums)
+ fix spacing in ionopimax.py
2024-01-25 09:40:10 +01:00
99a58933ec ionopimax: Add LogVoltageInput 2024-01-18 08:40:33 +01:00
9e000528d2 add vacuum furnace cfg file 2024-01-11 16:33:06 +01:00
4a2ce62dd8 added drivers for small furnace 2024-01-11 16:28:06 +01:00
9e6699dd1e more robust calculation for heater resistivity
and check it is in the allowed range 10 .. 100 Ohm

Change-Id: If485480c0974d953165c37f7354dc2818f68b30b
2023-12-15 16:00:26 +01:00
183 changed files with 6342 additions and 2708 deletions

View File

@ -1,6 +1,5 @@
#!/usr/bin/env python
# pylint: disable=invalid-name
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,5 +1,4 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
# *****************************************************************************
# Copyright (c) 2015-2016 by the authors, see LICENSE
#

View File

@ -1,6 +1,5 @@
#!/usr/bin/env python3
# pylint: disable=invalid-name
# -*- coding: utf-8 -*-
# *****************************************************************************
# Copyright (c) 2015-2016 by the authors, see LICENSE
#

View File

@ -1,5 +1,4 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
# *****************************************************************************
# Copyright (c) 2015-2016 by the authors, see LICENSE
#

View File

@ -1,5 +1,4 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,6 +1,5 @@
#!/usr/bin/env python3
# pylint: disable=invalid-name
# -*- coding: utf-8 -*-
# *****************************************************************************
# This program is free software; you can redistribute it and/or modify it under
# the terms of the GNU General Public License as published by the Free Software

22
calibtest.py Normal file
View File

@ -0,0 +1,22 @@
import sys
import os
from glob import glob
from frappy_psi.calcurve import CalCurve

os.chdir('/Users/zolliker/gitpsi/calcurves')

if len(sys.argv) > 1:
    calib = sys.argv[1]
    c = CalCurve(calib)
else:
    for file in sorted(glob('*.*')):
        if file.endswith('.md') or file.endswith('.std'):
            continue
        try:
            c = CalCurve(file)
            xy = c.export()
            print('%9.4g %12.7g %9.4g %9.4g %s' % (tuple(c.extx) + tuple(c.exty) + (file,)))
        except Exception as e:
            print(file, e)
            calib = file

View File

@ -24,6 +24,7 @@ Mod('ts_low',
minrange=13,
range=22,
tolerance = 0.1,
vexc = 3,
htrrng=4,
)
@ -32,7 +33,8 @@ Mod('ts_high',
'sample Cernox',
channel = 1,
switcher = 'lsc_channel',
minrange=9,
minrange=11,
vexc = 5,
range=22,
tolerance = 0.1,
htrrng=5,
@ -45,6 +47,8 @@ Mod('ts',
value=Param(unit='K'),
low='ts_low',
high='ts_high',
#min_high=0.6035,
#max_low=1.6965,
min_high=0.6,
max_low=1.7,
tolerance=0.1,

View File

@ -74,10 +74,10 @@ Mod('currentsource',
Mod('mf',
'frappy_mlz.amagnet.GarfieldMagnet',
'magnetic field module, handling polarity switching and stuff',
subdev_currentsource = 'currentsource',
subdev_enable = 'enable',
subdev_polswitch = 'polarity',
subdev_symmetry = 'symmetry',
currentsource = 'currentsource',
enable = 'enable',
polswitch = 'polarity',
symmetry = 'symmetry',
target = Param(unit='T'),
value = Param(unit='T'),
userlimits = (-0.35, 0.35),

19
cfg/attocube_cfg.py Normal file
View File

@ -0,0 +1,19 @@
Node('attocube_test.psi.ch',
'a single attocube axis',
interface='tcp://5000',
)
Mod('r',
'frappy_psi.attocube.Axis',
'ANRv220-F3-02882',
axis = 1,
value = Param(unit='deg'),
tolerance = 0.1,
target_min = 0,
target_max = 360,
steps_fwd = 45,
steps_bwd = 85,
step_mode = True,
# gear = 1.2,
)

View File

@ -41,6 +41,6 @@ Mod('label',
'frappy_demo.modules.Label',
'some label indicating the state of the magnet `mf`.',
system = 'Cryomagnet MX15',
subdev_mf = 'mf',
subdev_ts = 'ts',
mf = 'mf',
ts = 'ts',
)

27
cfg/drums_cfg.py Normal file
View File

@ -0,0 +1,27 @@
Node('relais.psi.ch',
'relais test',
'tcp://5000',
)
Mod('rl',
'frappy_psi.ionopimax.DigitalOutput',
'left relais',
addr = 'o1',
value = 0, # start with relais off
)
Mod('rr',
'frappy_psi.ionopimax.DigitalOutput',
'right relais',
addr = 'o2',
value = 0, # start with relais off
)
Mod('drummer',
'frappy_psi.drums.Drums',
'drummer',
target = 150,
pattern='l2L2rl1R1L2',
left='rl',
right='rr',
)

60
cfg/flowsas_cfg.py Normal file
View File

@ -0,0 +1,60 @@
Node('flowsas.psi.ch',
'flowsas test motors',
'tcp://3000',
)
#Mod('mot_io',
# 'frappy_psi.phytron.PhytronIO',
# 'io for motor control',
# uri = 'serial:///dev/ttyUSB0',
# )
#Mod('hmot',
# 'frappy_psi.phytron.Motor',
# 'horizontal axis',
# axis = 'X',
# io = 'mot_io',
# encoder_mode = 'NO',
# )
#Mod('vmot',
# 'frappy_psi.phytron.Motor',
# 'vertical axis',
# axis = 'Y',
# io = 'mot_io',
# encoder_mode= 'NO',
# )
Mod('syr_io',
'frappy_psi.cetoni_pump.LabCannBus',
'Module for bus',
deviceconfig = "/home/l_samenv/frappy/cetoniSDK/CETONI_SDK_Raspi_64bit_v20220627/config/conti_flow",
)
Mod('syr1',
'frappy_psi.cetoni_pump.SyringePump',
'First syringe pump',
io='syr_io',
pump_name = "Nemesys_S_1_Pump",
valve_name = "Nemesys_S_1_Valve",
inner_diameter_set = 14.5673,
piston_stroke_set = 60,
)
Mod('syr2',
'frappy_psi.cetoni_pump.SyringePump',
'Second syringe pump',
io='syr_io',
pump_name = "Nemesys_S_2_Pump",
valve_name = "Nemesys_S_2_Valve",
inner_diameter_set = 14.5673,
piston_stroke_set = 60,
)
Mod('contiflow',
'frappy_psi.cetoni_pump.ContiFlowPump',
'Continuous flow pump',
io='syr_io',
inner_diameter_set = 14.5673,
piston_stroke_set = 60,
)

16
cfg/lockin70_cfg.py Normal file
View File

@ -0,0 +1,16 @@
Node('lockin70test.psi.ch',
'lockin70 test',
'tcp://5000',
)
Mod('io',
'frappy_psi.SR.SR_IO',
'lockin communication',
uri='10105266.psi.ch:50000',
)
Mod('XY',
'frappy_psi.SR.XY70',
'XY channels',
io='io',
)

View File

@ -0,0 +1,12 @@
Node('flowsas.psi.ch',
'peristaltic pump',
'tcp://3000',
)
Mod('peripump',
'frappy_psi.gilsonpump.PeristalticPump',
'Peristaltic pump',
addr_AO = 'ao1',
addr_dir_relay = 'o1',
addr_run_relay = 'o2',
)

16
cfg/phoenix_cfg.py Normal file
View File

@ -0,0 +1,16 @@
Node('phoenixtest.psi.ch',
'phoenix test',
'tcp://5000',
)
Mod('io',
'frappy_psi.haake.HaakeIO',
'connection for Thermo Haake',
uri='tcp://ldmprep7-ts:3005',
)
Mod('T',
'frappy_psi.haake.TemperatureLoop',
'holder temperature',
io='io',
)

13
cfg/pressureTest_cfg.py Normal file
View File

@ -0,0 +1,13 @@
Node('vf.psi.ch',
'small vacuum furnace',
'tcp://5000',
)
Mod('p',
'frappy_psi.ionopimax.VoltageInput',
'Vacuum pressure',
addr = 'av2',
rawrange = (0, 10),
valuerange = (0, 10),
value = Param(unit='V'),
)

11
cfg/rheotrigger_cfg.py Normal file
View File

@ -0,0 +1,11 @@
Node('flowsas.psi.ch',
'rheometer triggering',
'tcp://3000',
)
Mod('rheo',
'frappy_psi.rheo_trigger.RheoTrigger',
'Trigger for the rheometer',
addr='dt1',
doBeep = False,
)

View File

@ -138,13 +138,6 @@ Mod('T_one_K',
io='itc',
)
Mod('htr_one_K',
'frappy_psi.mercury.HeaterOutput',
'1 K plate warmup heater',
slot='DB3.H1',
io='itc',
)
Mod('T_mix_wup',
'frappy_psi.mercury.TemperatureLoop',
'mix. chamber warmup temperature',

103
cfg/vf_cfg.py Normal file
View File

@ -0,0 +1,103 @@
Node('vf.psi.ch',
'small vacuum furnace',
'tcp://5000',
)
Mod('htr_io',
'frappy_psi.bkpower.IO',
'powersupply communicator',
uri = 'serial:///dev/ttyUSBupper',
)
Mod('htr',
'frappy_psi.bkpower.Power',
'heater power',
io= 'htr_io',
)
Mod('out',
'frappy_psi.bkpower.Output',
'heater output',
io = 'htr_io',
maxvolt = 50,
maxcurrent = 2,
)
Mod('relais',
'frappy_psi.ionopimax.DigitalOutput',
'relais for power output',
addr = 'o2',
)
Mod('T_main',
'frappy_psi.ionopimax.CurrentInput',
'sample temperature',
addr = 'ai4',
valuerange = (0, 1372),
value = Param(unit='degC'),
)
Mod('T_extra',
'frappy_psi.ionopimax.CurrentInput',
'extra temperature',
addr = 'ai3',
valuerange = (0, 1372),
value = Param(unit='degC'),
)
Mod('T_htr',
'frappy_psi.ionopimax.CurrentInput',
'heater temperature',
addr = 'ai2',
valuerange = (0, 1372),
value = Param(unit='degC'),
)
Mod('T_wall',
'frappy_psi.ionopimax.VoltageInput',
'furnace wall temperature',
addr = 'av2',
rawrange = (0, 1.5),
valuerange = (0, 150),
value = Param(unit='degC'),
)
Mod('T',
'frappy_psi.picontrol.PI',
'controlled Temperature',
input = 'T_htr',
output = 'out',
relais = 'relais',
p = 2,
i = 0.01,
)
Mod('interlocks',
'frappy_psi.furnace.Interlocks',
'interlock parameters',
input = 'T_htr',
wall_T = 'T_wall',
vacuum = 'p',
relais = 'relais',
control = 'T',
wall_limit = 50,
vacuum_limit = 0.1,
)
Mod('p_io',
'frappy_psi.pfeiffer.IO',
'pressure io',
uri='serial:///dev/ttyUSBlower',
)
Mod('p',
'frappy_psi.pfeiffer.Pressure',
'pressure reading',
io = 'p_io',
)

11
ci/Jenkinsfile vendored
View File

@ -141,12 +141,23 @@ def run_docs() {
'''
}
/* does not work with too many quote levels
* alternatively use pdf (based on rst2pdf)
* or singlehtml converted to pdf manually from a browser (may produce nicer output)
stage('build latexpdf') {
sh '''
. /home/jenkins/secopvenv/bin/activate
make -C doc latexpdf
'''
}
*/
stage('build pdf') {
sh '''
. /home/jenkins/secopvenv/bin/activate
make -C doc pdf
'''
}
stage('build man') {
sh '''

118
debian/changelog vendored
View File

@ -1,3 +1,121 @@
frappy-core (0.18.1) focal; urgency=medium
* mlz: Zapf fix unit handling and small errors
* mlz: entangle fix limit check
-- Alexander Zaft <jenkins@frm2.tum.de> Wed, 24 Jan 2024 14:59:21 +0100
frappy-core (0.18.0) focal; urgency=medium
[ Alexander Zaft ]
* Add shutdownModule function
[ Markus Zolliker ]
* frappy_psi.convergence: bug fixes and improvements
[ Alexander Zaft ]
* server: Add signal handling
* add test cases for server and config
[ Markus Zolliker ]
* fix frappy.lib.merge_status
* frappy_psi.sea: try to reconnect on failure
* pylint: disable use-dict-literal
[ Alexander Zaft ]
* server: add option to dynamically create devices
[ Markus Zolliker ]
* add StructParam
* add frappy_psi.thermofisher
* add frappy_psi.thermofisher to the doc
* frappy.io: make error reporting consistent
* frappy_psi.sea: avoid multiple connections
* frappy_psi.sea: further bug fixes
* frappy.client.interactive: bug fixes
[ Alexander Zaft ]
* mlz: Add Zebra Barcode Reader
* frappy_mlz: Zebra fixes after basic test
* dispatcher: change logging calls to debug
* core: do not call register_module on error
* add zapf to requirements-dev.txt
* frappy_mlz: Add Zapf PLC
* Revert "add zapf to requirements-dev.txt"
* add zapf to requirements-dev
* frappy_mlz: fix off-by-one error in barcode reader
[ Markus Zolliker ]
* improve error message on client when host/port is bad
* frappy/protocol/interface/tcp.py: use SECoP_DEFAULT_PORT
* frappy_psi.phytron: stop motor before restart
* interactive client: improve keyboard interrupt
* fix frappy/playground.py after change 31470
[ Alexander Zaft ]
* frappy_mlz seop: add count to ampl and phase cmds
[ Markus Zolliker ]
* frappy_psi.phytron: further improvements
* further fixes after change 31470
* fix missing .poll attribute in simulation
* psi: improve sea interface
* fix frappy_demo.lakeshore
* change FloatRange arguments minval/maxval to min/max
* improve client shutdown time
* introduce FrozenParam
* phytron.py: improve status
* frappy_psi.sea: small fixes
* bug in Attached (fix after change 31470)
[ Alexander Zaft ]
* core: split module code
* core: factor out accessibles from init
[ Markus Zolliker ]
* proxy: fix command wrapper
[ Alexander Zaft ]
* server: handle signals during startup
* all: remove coding cookies
* psi: fix Done import in sea
[ Markus Zolliker ]
* frappy.io: change default to retry_first_idn=True
[ Alexander Zaft ]
* core: move module handling out of dispatcher
* mlz/demo: move old examples to Attached
[ Markus Zolliker ]
* frappy.client: fix the case when timestamp is missing
* doc: drop latex support, add pdf support
* add StringIO.writeline, improve StringIO.multicomm
* implement pfeiffer TPG vacuum reading
[ Alexander Zaft ]
* core: allow multiple interfaces
* core: formatting and update server docstring
* mlz: handle unconfigured abslimits
* datatypes: fix optional struct export
* core: better command handling
[ Markus Zolliker ]
* frappy_psi.sea: workaround for bug in sea
[ Alexander Zaft ]
* core: better error on export of internal type
[ Markus Zolliker ]
* fix missing import in change message
* modify arguments of Dispatcher.announce_update
* frappy.secnode: fix strange error message
* fix playground after change 32249
* remove py35 compatibility code
* bug fix in frappy.io.BytesIO.checkHWIdent
-- Alexander Zaft <jenkins@frm2.tum.de> Wed, 17 Jan 2024 12:35:00 +0100
frappy-core (0.17.13) focal; urgency=medium
[ Alexander Zaft ]

View File

@ -1,5 +1,4 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
#
# Frappy documentation build configuration file, created by
# sphinx-quickstart on Mon Sep 11 10:58:28 2017.
@ -43,7 +42,9 @@ extensions = ['sphinx.ext.autodoc',
'sphinx.ext.intersphinx',
'sphinx.ext.todo',
'sphinx.ext.mathjax',
'sphinx.ext.viewcode']
'sphinx.ext.viewcode',
'rst2pdf.pdfbuilder',
]
# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']
@ -220,3 +221,80 @@ from frappy.lib.classdoc import class_doc_handler
def setup(app):
app.connect('autodoc-process-docstring', class_doc_handler)
# -- Options for PDF output --------------------------------------------------
# Grouping the document tree into PDF files. List of tuples
# (source start file, target name, title, author, options).
#
# If there is more than one author, separate them with \\.
# For example: r'Guido van Rossum\\Fred L. Drake, Jr., editor'
#
# The options element is a dictionary that lets you override
# this config per-document. For example:
#
# ('index', 'MyProject', 'My Project', 'Author Name', {'pdf_compressed': True})
#
# would mean that specific document would be compressed
# regardless of the global 'pdf_compressed' setting.
pdf_documents = [
('index', project, project, author),
]
# A comma-separated list of custom stylesheets. Example:
pdf_stylesheets = ['sphinx', 'a4']
# A list of folders to search for stylesheets. Example:
pdf_style_path = ['.', '_styles']
# Create a compressed PDF
# Use True/False or 1/0
# Example: compressed=True
# pdf_compressed = False
# A colon-separated list of folders to search for fonts. Example:
# pdf_font_path = ['/usr/share/fonts', '/usr/share/texmf-dist/fonts/']
# Language to be used for hyphenation support
# pdf_language = "en_US"
# Mode for literal blocks wider than the frame. Can be
# overflow, shrink or truncate
# pdf_fit_mode = "shrink"
# Section level that forces a break page.
# For example: 1 means top-level sections start in a new page
# 0 means disabled
# pdf_break_level = 0
# When a section starts in a new page, force it to be 'even', 'odd',
# or just use 'any'
# pdf_breakside = 'any'
# Insert footnotes where they are defined instead of
# at the end.
# pdf_inline_footnotes = True
# verbosity level. 0 1 or 2
# pdf_verbosity = 0
# If false, no index is generated.
# pdf_use_index = True
# If false, no modindex is generated.
# pdf_use_modindex = True
# If false, no coverpage is generated.
# pdf_use_coverpage = True
# Name of the cover page template to use
# pdf_cover_template = 'sphinxcover.tmpl'
# Documents to append as an appendix to all manuals.
# pdf_appendices = []
# Enable experimental feature to split table cells. Use it
# if you get "DelayedTable too big" errors
# pdf_splittables = False
# Set the default DPI for images
# pdf_default_dpi = 72
# Enable rst2pdf extension modules
# pdf_extensions = []
# Page template name for "regular" pages
# pdf_page_template = 'cutePage'
# Show Table Of Contents at the beginning?
# pdf_use_toc = True
# How many levels deep should the table of contents be?
pdf_toc_depth = 9999
# Add section number to section references
pdf_use_numbered_links = False
# Background images fitting mode
pdf_fit_background_mode = 'scale'
# Repeat table header on tables that cross a page boundary?
pdf_repeat_table_rows = True
# Enable smart quotes (1, 2 or 3) or disable by setting to 0
pdf_smartquotes = 0

View File

@ -405,7 +405,7 @@ Appendix 2: Extract from the LakeShore Manual
Reply <range> *term*
**Operation Complete Query**
----------------------------------------------
Command *OPC?
Command \*OPC?
Reply 1
Description in Frappy, we append this command to request in order
to generate a reply

View File

@ -1,5 +1,4 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
# *****************************************************************************
# Copyright (c) 2015-2019 by the authors, see LICENSE
#

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# Copyright (c) 2015-2016 by the authors, see LICENSE
#

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# Copyright (c) 2015-2016 by the authors, see LICENSE
#
@ -29,8 +28,9 @@
from frappy.datatypes import ArrayOf, BLOBType, BoolType, EnumType, \
FloatRange, IntRange, ScaledInteger, StringType, StructOf, TupleOf, StatusType
from frappy.lib.enum import Enum
from frappy.modulebase import Done, Module, Feature
from frappy.modules import Attached, Communicator, \
Done, Drivable, Feature, Module, Readable, Writable, HasAccessibles
Drivable, Readable, Writable
from frappy.params import Command, Parameter, Limit
from frappy.properties import Property
from frappy.proxy import Proxy, SecNode, proxy_class

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under
@ -90,7 +89,11 @@ class DataType(HasProperties):
def export_datatype(self):
"""return a python object which after jsonifying identifies this datatype"""
raise NotImplementedError
raise ProgrammingError(
f"{type(self).__name__} is not able to be exported to SECoP. "
f"It is intended for internal use only."
)
def export_value(self, value):
"""if needed, reformat value for transport"""
@ -102,7 +105,7 @@ class DataType(HasProperties):
note: for importing from gui/configfile/commandline use :meth:`from_string`
instead.
"""
return value
return self(value)
def format_value(self, value, unit=None):
"""format a value of this type into a str string
@ -256,10 +259,6 @@ class FloatRange(HasUnit, DataType):
"""returns a python object fit for serialisation"""
return float(value)
def import_value(self, value):
"""returns a python object from serialisation"""
return float(value)
def from_string(self, text):
value = float(text)
return self(value)
@ -315,7 +314,7 @@ class IntRange(DataType):
except Exception:
raise WrongTypeError(f'can not convert {shortrepr(value)} to an int') from None
if round(fvalue) != fvalue:
raise WrongTypeError('%r should be an int')
raise WrongTypeError(f'{value} should be an int')
return value
def validate(self, value, previous=None):
@ -338,10 +337,6 @@ class IntRange(DataType):
"""returns a python object fit for serialisation"""
return int(value)
def import_value(self, value):
"""returns a python object from serialisation"""
return int(value)
def from_string(self, text):
value = int(text)
return self(value)
@ -458,7 +453,10 @@ class ScaledInteger(HasUnit, DataType):
def import_value(self, value):
"""returns a python object from serialisation"""
return self.scale * int(value)
try:
return self.scale * int(value)
except Exception:
raise WrongTypeError(f'can not import {shortrepr(value)} to scaled') from None
def from_string(self, text):
value = float(text)
@ -510,10 +508,6 @@ class EnumType(DataType):
"""returns a python object fit for serialisation"""
return int(self(value))
def import_value(self, value):
"""returns a python object from serialisation"""
return self(value)
def __call__(self, value):
"""accepts integers and strings, converts to EnumMember (may be used like an int)"""
try:
@ -585,7 +579,10 @@ class BLOBType(DataType):
def import_value(self, value):
"""returns a python object from serialisation"""
return b64decode(value)
try:
return b64decode(value)
except Exception:
raise WrongTypeError(f'can not b64decode {shortrepr(value)}') from None
def from_string(self, text):
value = text
@ -656,10 +653,6 @@ class StringType(DataType):
"""returns a python object fit for serialisation"""
return f'{value}'
def import_value(self, value):
"""returns a python object from serialisation"""
return str(value)
def from_string(self, text):
value = str(text)
return self(value)
@ -720,10 +713,6 @@ class BoolType(DataType):
"""returns a python object fit for serialisation"""
return self(value)
def import_value(self, value):
"""returns a python object from serialisation"""
return self(value)
def from_string(self, text):
value = text
return self(value)
@ -993,7 +982,7 @@ class StructOf(DataType):
return res
def __repr__(self):
opt = f', optional={self.optional!r}' if set(self.optional) == set(self.members) else ''
opt = f', optional={self.optional!r}' if set(self.optional) != set(self.members) else ''
return 'StructOf(%s%s)' % (', '.join(
['%s=%s' % (n, repr(st)) for n, st in list(self.members.items())]), opt)
@ -1232,6 +1221,7 @@ class OrType(DataType):
self.types = types
self.default = self.types[0].default
def __call__(self, value):
"""accepts any of the given types, takes the first valid"""
for t in self.types:

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under
@ -233,7 +232,7 @@ class ReadFailedError(SECoPError):
class OutOfRangeError(SECoPError):
"""The requested parameter can not be read just now"""
"""The value read from the hardware is out of sensor or calibration range"""
name = 'OutOfRange'

304
frappy/extparams.py Normal file
View File

@ -0,0 +1,304 @@
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under
# the terms of the GNU General Public License as published by the Free Software
# Foundation; either version 2 of the License, or (at your option) any later
# version.
#
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more
# details.
#
# You should have received a copy of the GNU General Public License along with
# this program; if not, write to the Free Software Foundation, Inc.,
# 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
#
# Module authors:
# Markus Zolliker <markus.zolliker@psi.ch>
#
# *****************************************************************************
"""extended parameters
special parameter classes with some automatic functionality
"""
import re
from frappy.core import Parameter, Property
from frappy.datatypes import BoolType, DataType, DataTypeType, EnumType, \
FloatRange, StringType, StructOf, ValueType
from frappy.errors import ProgrammingError
class StructParam(Parameter):
"""convenience class to create a struct Parameter together with individual params
Usage:
class Controller(Drivable):
...
ctrlpars = StructParam('ctrlpars struct', [
('pid_p', 'p', Parameter('control parameter p', FloatRange())),
('pid_i', 'i', Parameter('control parameter i', FloatRange())),
('pid_d', 'd', Parameter('control parameter d', FloatRange())),
], readonly=False)
...
then implement either read_ctrlpars and write_ctrlpars or
read_pid_p, read_pid_i, read_pid_d, write_pid_p, write_pid_i and write_pid_d
the methods not implemented will be created automatically
"""
# use properties, as simple attributes are not considered on copy()
paramdict = Property('dict <parametername> of Parameter(...)', ValueType())
hasStructRW = Property('has a read_<struct param> or write_<struct param> method',
BoolType(), default=False)
insideRW = 0 # counter for avoiding multiple superfluous updates
def __init__(self, description=None, paramdict=None, prefix_or_map='', *, datatype=None, readonly=False, **kwds):
"""create a struct parameter together with individual parameters
in addition to normal Parameter arguments:
:param paramdict: dict <member name> of Parameter(...)
:param prefix_or_map: either a prefix for the parameter name to add to the member name
or a dict <member name> of <parameter name>
"""
if isinstance(paramdict, DataType):
raise ProgrammingError('second argument must be a dict of Param')
if datatype is None and paramdict is not None: # omit the following on Parameter.copy()
if isinstance(prefix_or_map, str):
prefix_or_map = {m: prefix_or_map + m for m in paramdict}
for membername, param in paramdict.items():
param.name = prefix_or_map[membername]
datatype = StructOf(**{m: p.datatype for m, p in paramdict.items()})
kwds['influences'] = [p.name for p in paramdict.values()]
self.updateEnable = {}
if paramdict:
kwds['paramdict'] = paramdict
super().__init__(description, datatype, readonly=readonly, **kwds)
def __set_name__(self, owner, name):
# names of access methods of the struct param (e.g. ctrlpars)
struct_read_name = f'read_{name}' # e.g. 'read_ctrlpars'
struct_write_name = f'write_{name}' # e.g. 'write_ctrlpars'
self.hasStructRW = hasattr(owner, struct_read_name) or hasattr(owner, struct_write_name)
for membername, param in self.paramdict.items():
pname = param.name
changes = {
'readonly': self.readonly,
'influences': set(param.influences) | {name},
}
param.ownProperties.update(changes)
param.init(changes)
setattr(owner, pname, param)
param.__set_name__(owner, param.name)
if self.hasStructRW:
rname = f'read_{pname}'
if not hasattr(owner, rname):
def rfunc(self, membername=membername, struct_read_name=struct_read_name):
return getattr(self, struct_read_name)()[membername]
rfunc.poll = False # read_<struct param> is polled only
setattr(owner, rname, rfunc)
if not self.readonly:
wname = f'write_{pname}'
if not hasattr(owner, wname):
def wfunc(self, value, membername=membername,
name=name, rname=rname, struct_write_name=struct_write_name):
valuedict = dict(getattr(self, name))
valuedict[membername] = value
getattr(self, struct_write_name)(valuedict)
return getattr(self, rname)()
setattr(owner, wname, wfunc)
if not self.hasStructRW:
if not hasattr(owner, struct_read_name):
def struct_read_func(self, name=name, flist=tuple(
(m, f'read_{p.name}') for m, p in self.paramdict.items())):
pobj = self.parameters[name]
# disable updates generated from the callbacks of individual params
pobj.insideRW += 1 # guarded by self.accessLock
try:
return {m: getattr(self, f)() for m, f in flist}
finally:
pobj.insideRW -= 1
setattr(owner, struct_read_name, struct_read_func)
if not (self.readonly or hasattr(owner, struct_write_name)):
def struct_write_func(self, value, name=name, funclist=tuple(
(m, f'write_{p.name}') for m, p in self.paramdict.items())):
pobj = self.parameters[name]
pobj.insideRW += 1 # guarded by self.accessLock
try:
return {m: getattr(self, f)(value[m]) for m, f in funclist}
finally:
pobj.insideRW -= 1
setattr(owner, struct_write_name, struct_write_func)
super().__set_name__(owner, name)
def finish(self, modobj=None):
"""register callbacks for consistency"""
super().finish(modobj)
if modobj:
if self.hasStructRW:
def cb(value, modobj=modobj, structparam=self):
for membername, param in structparam.paramdict.items():
setattr(modobj, param.name, value[membername])
modobj.addCallback(self.name, cb)
else:
for membername, param in self.paramdict.items():
def cb(value, modobj=modobj, structparam=self, membername=membername):
if not structparam.insideRW:
prev = dict(getattr(modobj, structparam.name))
prev[membername] = value
setattr(modobj, structparam.name, prev)
modobj.addCallback(param.name, cb)
class FloatEnumParam(Parameter):
"""combine enum and float parameter
Example Usage:
vrange = FloatEnumParam('sensor range', ['500uV', '20mV', '1V'], 'V')
The following will be created automatically:
- the parameter vrange will get a datatype FloatRange(5e-4, 1, unit='V')
- an additional parameter `vrange_idx` will be created with an enum type
{'500uV': 0, '20mV': 1, '1V': 2}
- the method `write_vrange` will be created automatically
However, the methods `write_vrange_idx` and `read_vrange_idx`, if needed,
have to be implemented by the programmer.
Writing to the float parameter involves 'rounding' to the closest allowed value.
Customization:
The individual labels might be customized by defining them as a tuple
(<index>, <label>, <float value>) where either the index or the float value
may be omitted.
When the index is omitted, the element will be the previous index + 1 or
0 when it is the first element.
Omitted values will be determined from the label, assuming that they use
one of the predefined unit prefixes together with the given unit.
The name of the index parameter is by default '<name>_idx' but might be
changed with the idx_name argument.
"""
# use properties, as simple attributes are not considered on copy()
idx_name = Property('name of attached index parameter', StringType(), default='')
valuedict = Property('dict <index> of <value>', ValueType(dict))
enumtype = Property('dict <label> of <index>', DataTypeType())
# TODO: factor out unit handling, at the latest when needed elsewhere
PREFIXES = {'q': -30, 'r': -27, 'y': -24, 'z': -21, 'a': -18, 'f': -15,
'p': -12, 'n': -9, 'u': -6, 'µ': -6, 'm': -3,
'': 0, 'k': 3, 'M': 6, 'G': 9, 'T': 12,
'P': 15, 'E': 18, 'Z': 21, 'Y': 24, 'R': 27, 'Q': 30}
def __init__(self, description=None, labels=None, unit='',
*, datatype=None, readonly=False, **kwds):
if labels is None:
# called on Parameter.copy()
super().__init__(description, datatype, readonly=readonly, **kwds)
return
if isinstance(labels, DataType):
raise ProgrammingError('second argument must be a list of labels, not a datatype')
nextidx = 0
try:
edict = {}
vdict = {}
for elem in labels:
if isinstance(elem, str):
idx, label = [nextidx, elem]
else:
if isinstance(elem[0], str):
elem = [nextidx] + list(elem)
idx, label, *tail = elem
if tail:
vdict[idx], = tail
edict[label] = idx
nextidx = idx + 1
except (ValueError, TypeError) as e:
raise ProgrammingError('labels must be a list of labels or tuples '
'([index], label, [value])') from e
pat = re.compile(rf'([+-]?\d*\.?\d*) *({"|".join(self.PREFIXES)}){unit}$')
try:
# determine missing values from labels
for label, idx in edict.items():
if idx not in vdict:
value, prefix = pat.match(label).groups()
vdict[idx] = float(f'{value}e{self.PREFIXES[prefix]}')
except (AttributeError, ValueError) as e:
raise ProgrammingError(f"{label!r} has not the form '<float><prefix>{unit}'") from e
try:
enumtype = EnumType(**edict)
except TypeError as e:
raise ProgrammingError(str(e)) from e
datatype = FloatRange(min(vdict.values()), max(vdict.values()), unit=unit)
super().__init__(description, datatype, enumtype=enumtype, valuedict=vdict,
readonly=readonly, **kwds)
def __set_name__(self, owner, name):
super().__set_name__(owner, name)
if not self.idx_name:
self.idx_name = name + '_idx'
iname = self.idx_name
idx_param = Parameter(f'index of {name}', self.enumtype,
readonly=self.readonly, influences={name})
idx_param.init({})
setattr(owner, iname, idx_param)
idx_param.__set_name__(owner, iname)
self.setProperty('influences', {iname})
if not hasattr(owner, f'write_{name}'):
# customization (like rounding up or down) might be
# achieved by adding write_<name>. if not, the default
# is rounding to the closest value
def wfunc(mobj, value, vdict=self.valuedict, fname=name, wfunc_iname=f'write_{iname}'):
getattr(mobj, wfunc_iname)(
min(vdict, key=lambda i: abs(vdict[i] - value)))
return getattr(mobj, fname)
setattr(owner, f'write_{name}', wfunc)
def __get__(self, instance, owner):
"""getter for value"""
if instance is None:
return self
return self.valuedict[instance.parameters[self.idx_name].value]
def trigger_setter(self, modobj, _):
# trigger update of float parameter on change of enum parameter
modobj.announceUpdate(self.name, getattr(modobj, self.name))
def finish(self, modobj=None):
"""register callbacks for consistency"""
super().finish(modobj)
if modobj:
modobj.addCallback(self.idx_name, self.trigger_setter, modobj)

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# Copyright (c) 2015-2023 by the authors, see LICENSE
#

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# Copyright (c) 2015-2016 by the authors, see LICENSE
#

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# Copyright (c) 2015-2023 by the authors, see LICENSE
#

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# Copyright (c) 2015-2023 by the authors, see LICENSE
#

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# Copyright (c) 2015-2023 by the authors, see LICENSE
#

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# Copyright (c) 2015-2023 by the authors, see LICENSE
#

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# Copyright (c) 2015-2023 by the authors, see LICENSE
#

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# Copyright (c) 2015-2023 by the authors, see LICENSE
#

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# Copyright (c) 2015-2016 by the authors, see LICENSE
#

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# Copyright (c) 2015-2017 by the authors, see LICENSE
#

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# Copyright (c) 2015-2023 by the authors, see LICENSE
#

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# Resource object code
#

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# NICOS, the Networked Instrument Control System of the MLZ
# Copyright (c) 2009-2023 by the NICOS contributors (see AUTHORS)

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# Copyright (c) 2015-2016 by the authors, see LICENSE
#

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# Copyright (c) 2015-2016 by the authors, see LICENSE
#

View File

@ -1,5 +1,4 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# *****************************************************************************
# This program is free software; you can redistribute it and/or modify it under
# the terms of the GNU General Public License as published by the Free Software
@ -25,17 +24,17 @@ other future extensions of AsynConn
"""
import re
import time
import threading
import time
from frappy.lib.asynconn import AsynConn, ConnectionClosed
from frappy.datatypes import ArrayOf, BLOBType, BoolType, FloatRange, IntRange, \
StringType, TupleOf, ValueType
from frappy.errors import CommunicationFailedError, ConfigError, ProgrammingError, \
SilentCommunicationFailedError as SilentError
from frappy.modules import Attached, Command, \
Communicator, Module, Parameter, Property
from frappy.datatypes import ArrayOf, BLOBType, BoolType, FloatRange, \
IntRange, StringType, StructOf, TupleOf, ValueType
from frappy.errors import CommunicationFailedError, ConfigError, \
ProgrammingError, SilentCommunicationFailedError as SilentError
from frappy.lib import generalConfig
from frappy.lib.asynconn import AsynConn, ConnectionClosed
from frappy.modules import Attached, Command, Communicator, Module, \
Parameter, Property
generalConfig.set_default('legacy_hasiodev', False)
@ -62,8 +61,7 @@ class HasIO(Module):
ioname = opts.get('io') or f'{name}_io'
io = self.ioClass(ioname, srv.log.getChild(ioname), opts, srv) # pylint: disable=not-callable
io.callingModule = []
srv.modules[ioname] = io
srv.dispatcher.register_module(io, ioname)
srv.secnode.add_module(io, ioname)
self.ioDict[self.uri] = ioname
self.io = ioname
@ -76,8 +74,11 @@ class HasIO(Module):
def communicate(self, *args):
return self.io.communicate(*args)
def multicomm(self, *args):
return self.io.multicomm(*args)
def writeline(self, *args):
return self.io.writeline(*args)
def multicomm(self, *args, **kwds):
return self.io.multicomm(*args, **kwds)
class HasIodev(HasIO):
@ -287,7 +288,7 @@ class StringIO(IOBase):
f' does not match {regexp!r}')
@Command(StringType(), result=StringType())
def communicate(self, command):
def communicate(self, command, noreply=False):
"""send a command and receive a reply
using end_of_line, encoding and self._lock
@ -314,6 +315,8 @@ class StringIO(IOBase):
self.comLog('garbage: %r', garbage)
self._conn.send(cmd + self._eol_write)
self.comLog('> %s', cmd.decode(self.encoding))
if noreply:
return None
reply = self._conn.readline(self.timeout)
except ConnectionClosed:
self.closeConnection()
@ -329,13 +332,69 @@ class StringIO(IOBase):
self.log.error(self._last_error)
raise SilentError(repr(e)) from e
@Command(ArrayOf(StringType()), result=ArrayOf(StringType()))
def multicomm(self, commands):
"""communicate multiple request/replies in one row"""
@Command(StringType())
def writeline(self, command):
"""send a command without needing a reply
To keep a request-reply scheme, it is recommended to override
this method to append a query on the same line, for example:
.. code::
def writeline(self, command):
self.communicate(command + ';*OPC?')
or to add an additional query which always returns a reply, e.g.:
.. code::
def writeline(self, command):
with self._lock: # important!
self.communicate(command, noreply=True)
self.communicate('*OPC?')
The first version is preferred when the hardware allows joining several
commands with a separator.
"""
self.communicate(command, noreply=True)
@Command(ArrayOf(TupleOf(StringType(), BoolType(), FloatRange(0, unit='s'))),
result=ArrayOf(StringType()))
def multicomm(self, requests):
"""communicate multiple request/replies in one go
:param requests: a sequence of tuples (command, reply_expected, delay)
if called internally, a sequence of strings (command) is also accepted
:return: list of replies
This method is rarely needed. It is intended for cases where the hardware
requires that several commands are not interleaved with requests from another
client or from the poller, for example selecting a channel before reading it,
or when wait times other than 'wait_before' have to be specified.
Such cases may also be handled by adding an extra method to the IO class,
by a custom SECoP command, or, when all useful commands of this IO class
need it, by overriding :meth:`communicate`.
Use this method when:
1) you want a generic communicator covering the above use cases over SECoP, or
2) you do not want to subclass the IO class.
"""
replies = []
with self._lock:
for cmd in commands:
replies.append(self.communicate(cmd))
for request in requests:
if isinstance(request, str):
cmd, expect_reply, delay = request, True, 0
else:
cmd, expect_reply, delay = request
if expect_reply:
replies.append(self.communicate(cmd))
else:
self.writeline(cmd)
if delay:
time.sleep(delay)
return replies
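A minimal usage sketch (assuming io is an already configured StringIO instance; the command strings are purely illustrative): select a channel without expecting a reply, wait briefly, then query it, all under one lock so neither the poller nor another client can interleave:

    replies = io.multicomm([
        ('CHAN 3', False, 0.1),   # select channel, no reply expected, wait 100 ms
        ('READ?', True, 0),       # query the selected channel
    ])
    value = replies[0]            # only requests with reply_expected=True contribute a reply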
@ -395,7 +454,7 @@ class BytesIO(IOBase):
if not replypat.match(reply):
self.closeConnection()
raise CommunicationFailedError(f'bad response: {reply!r}'
' does not match {expected!r}')
f' does not match {expected!r}')
@Command((BLOBType(), IntRange(0)), result=BLOBType())
def communicate(self, request, replylen): # pylint: disable=arguments-differ
@ -426,13 +485,20 @@ class BytesIO(IOBase):
self.log.error(self._last_error)
raise SilentError(repr(e)) from e
@Command((ArrayOf(TupleOf(BLOBType(), IntRange(0)))), result=ArrayOf(BLOBType()))
@Command(StructOf(requests=ArrayOf(TupleOf(BLOBType(), IntRange(0), FloatRange(0, unit='s')))),
result=ArrayOf(BLOBType()))
def multicomm(self, requests):
"""communicate multiple request/replies in one row"""
"""communicate multiple request/replies in one go
:param requests: sequence of tuple (<command>, <expected reply length>, <delay>)
:return: list of replies
"""
replies = []
with self._lock:
for request in requests:
replies.append(self.communicate(*request))
for cmd, replylen, delay in requests:
replies.append(self.communicate(cmd, replylen))
if delay:
time.sleep(delay)
return replies
def readBytes(self, nbytes):

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under
@ -22,12 +21,14 @@
from textwrap import indent
from frappy.modules import Command, HasProperties, Module, Parameter, Property
from frappy.modules import Command, Parameter, Property
from frappy.modulebase import HasProperties, Module
def indent_description(p):
"""indent lines except first one"""
return indent(p.description, ' ').replace(' ', '', 1)
space = ' ' * 6
return indent(p.description, space).replace(space, '', 1)
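A small sketch of what the new helper does (the description text is illustrative): the first line stays flush left while continuation lines are indented by six spaces:

    from textwrap import indent

    space = ' ' * 6
    text = 'first line\nsecond line'
    print(indent(text, space).replace(space, '', 1))
    # first line
    #       second line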
def fmt_param(name, param):

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# Copyright (c) 2015-2016 by the authors, see LICENSE
#

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under
@ -33,7 +32,7 @@ else:
class PEP487Metaclass(type):
# support for __set_name__ and __init_subclass__ for older python versions
# slightly modified from PEP487 doc
def __new__(cls, *args, **kwargs):
def __new__(cls, *args, **kwargs): # pylint: disable=bad-mcs-classmethod-argument
if len(args) != 3:
return super().__new__(cls, *args)
name, bases, ns = args

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under
@ -142,6 +141,7 @@ class SequencerMixin:
return self.Status.IDLE, ''
def stop(self):
"""stop sequence"""
if self.seq_is_alive():
self._seq_stopflag = True

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,5 +1,4 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under
@ -55,6 +54,8 @@ class RemoteLogHandler(mlzlog.Handler):
def __init__(self):
super().__init__()
self.subscriptions = {} # dict[modname] of tuple(modobj, dict [conn] of level)
# None will be replaced by a callback when one is first installed
self.send_log = None
def emit(self, record):
"""unused"""
@ -62,18 +63,18 @@ class RemoteLogHandler(mlzlog.Handler):
def handle(self, record):
modname = record.name.split('.')[-1]
try:
modobj, subscriptions = self.subscriptions[modname]
subscriptions = self.subscriptions[modname]
except KeyError:
return
for conn, lev in subscriptions.items():
if record.levelno >= lev:
modobj.DISPATCHER.send_log_msg(
conn, modobj.name, LEVEL_NAMES[record.levelno],
self.send_log( # pylint: disable=not-callable
conn, modname, LEVEL_NAMES[record.levelno],
record.getMessage())
def set_conn_level(self, modobj, conn, level):
def set_conn_level(self, modname, conn, level):
level = check_level(level)
modobj, subscriptions = self.subscriptions.setdefault(modobj.name, (modobj, {}))
subscriptions = self.subscriptions.setdefault(modname, {})
if level == OFF:
subscriptions.pop(conn, None)
else:
@ -127,7 +128,7 @@ class HasComlog:
if self.comlog and generalConfig.initialized and generalConfig.comlog:
self._comLog = mlzlog.Logger(f'COMLOG.{self.name}')
self._comLog.handlers[:] = []
directory = join(logger.logdir, logger.rootname, 'comlog', self.DISPATCHER.name)
directory = join(logger.logdir, logger.rootname, 'comlog', self.secNode.name)
self._comLog.addHandler(ComLogfileHandler(
directory, self.name, max_days=generalConfig.getint('comlog_days', 7)))
return
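With this rework the handler no longer needs a reference to the module or the dispatcher; whoever enables remote logging supplies a send_log callback taking (conn, modname, levelname, message). A sketch of such a callback (mod and conn stand for a Module instance and a client connection; the print body is only a placeholder, a real server would forward the message to the client):

    def send_log(conn, modname, levelname, message):
        # signature expected by RemoteLogHandler.handle()
        print(f'[{levelname}] {modname}: {message}')

    mod.setRemoteLogging(conn, 'debug', send_log)   # installs the callback on the handler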

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under
@ -71,11 +70,8 @@ class HasOutputModule:
def initModule(self):
super().initModule()
try:
if self.output_module:
self.output_module.register_input(self.name, self.deactivate_control)
except Exception:
self.log.info(f'{self.name} has no output module')
if self.output_module:
self.output_module.register_input(self.name, self.deactivate_control)
def set_control_active(self, active):
"""to be overridden for switching hw control"""

frappy/modulebase.py (new file, 835 lines)
View File

@ -0,0 +1,835 @@
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under
# the terms of the GNU General Public License as published by the Free Software
# Foundation; either version 2 of the License, or (at your option) any later
# version.
#
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more
# details.
#
# You should have received a copy of the GNU General Public License along with
# this program; if not, write to the Free Software Foundation, Inc.,
# 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
#
# Module authors:
# Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
# Markus Zolliker <markus.zolliker@psi.ch>
# Alexander Zaft <a.zaft@fz-juelich.de>
#
# *****************************************************************************
"""Defines the base Module class"""
import time
import threading
from collections import OrderedDict
from frappy.datatypes import ArrayOf, BoolType, EnumType, FloatRange, \
IntRange, StringType, TextType, TupleOf, \
NoneOr
from frappy.errors import BadValueError, CommunicationFailedError, ConfigError, \
ProgrammingError, SECoPError, secop_error, RangeError
from frappy.lib import formatException, mkthread, UniqueObject
from frappy.params import Accessible, Command, Parameter, Limit
from frappy.properties import HasProperties, Property
from frappy.logging import RemoteLogHandler
# TODO: resolve circular import
# from .interfaces import SECoP_BASE_CLASSES
# WORKAROUND:
SECoP_BASE_CLASSES = ['Readable', 'Writable', 'Drivable', 'Communicator']
Done = UniqueObject('Done')
"""a special return value for a read_<param>/write_<param> method
indicating that the setter is triggered already"""
wrapperClasses = {}
class HasAccessibles(HasProperties):
"""base class of Module
joining the class's properties, parameters and commands dicts with
those of base classes.
wrap read_*/write_* methods
(so the dispatcher will get notified of changed values)
"""
isWrapped = False
checkedMethods = set()
@classmethod
def __init_subclass__(cls): # pylint: disable=too-many-branches
super().__init_subclass__()
if cls.isWrapped:
return
# merge accessibles from all sub-classes, treat overrides
# for now, allow to use also the old syntax (parameters/commands dict)
accessibles = OrderedDict() # dict of accessibles
merged_properties = {} # dict of dict of merged properties
new_names = [] # list of names of new accessibles
override_values = {} # bare values overriding a parameter and methods overriding a command
for base in reversed(cls.__mro__):
for key, value in base.__dict__.items():
if isinstance(value, Accessible):
value.updateProperties(merged_properties.setdefault(key, {}))
if base == cls and key not in accessibles:
new_names.append(key)
accessibles[key] = value
override_values.pop(key, None)
elif key in accessibles:
override_values[key] = value
# remark: merged_properties contain already the properties of accessibles of cls
for aname, aobj in list(accessibles.items()):
if aname in override_values:
value = override_values[aname]
if value is None:
accessibles.pop(aname)
continue
aobj = aobj.create_from_value(merged_properties[aname], value)
# replace the bare value by the created accessible
setattr(cls, aname, aobj)
else:
aobj.merge(merged_properties[aname])
accessibles[aname] = aobj
# rebuild order: (1) inherited items, (2) items from paramOrder, (3) new accessibles
# move (2) to the end
paramOrder = cls.__dict__.get('paramOrder', ())
for aname in paramOrder:
if aname in accessibles:
accessibles.move_to_end(aname)
# ignore unknown names
# move (3) to the end
for aname in new_names:
if aname not in paramOrder:
accessibles.move_to_end(aname)
cls.accessibles = accessibles
cls.wrappedAttributes = {'isWrapped': True}
# create wrappers for access methods
wrapped_name = '_' + cls.__name__
for pname, pobj in accessibles.items():
# wrap of reading/writing funcs
if not isinstance(pobj, Parameter):
# nothing to do for Commands
continue
rname = 'read_' + pname
rfunc = getattr(cls, rname, None)
# create wrapper
if rfunc:
def new_rfunc(self, pname=pname, rfunc=rfunc):
with self.accessLock:
try:
value = rfunc(self)
self.log.debug("read_%s returned %r", pname, value)
if value is Done: # TODO: to be removed when all code using Done is updated
return getattr(self, pname)
pobj = self.accessibles[pname]
value = pobj.datatype(value)
except Exception as e:
self.log.debug("read_%s failed with %r", pname, e)
if isinstance(e, SECoPError):
e.raising_methods.append(f'{self.name}.read_{pname}')
self.announceUpdate(pname, err=e)
raise
self.announceUpdate(pname, value, validate=False)
return value
new_rfunc.poll = getattr(rfunc, 'poll', True)
else:
def new_rfunc(self, pname=pname):
return getattr(self, pname)
new_rfunc.poll = False
new_rfunc.__name__ = rname
new_rfunc.__qualname__ = wrapped_name + '.' + rname
new_rfunc.__module__ = cls.__module__
cls.wrappedAttributes[rname] = new_rfunc
cname = 'check_' + pname
for postfix in ('_limits', '_min', '_max'):
limname = pname + postfix
if limname in accessibles:
# find the base class, where the parameter <limname> is defined first.
# we have to check all bases, as they may not be treated yet when
# not inheriting from HasAccessibles
base = next(b for b in reversed(cls.__mro__) if limname in b.__dict__)
if cname not in base.__dict__:
# there is no check method yet at this class
# add check function to the class where the limit was defined
setattr(base, cname, lambda self, value, pname=pname: self.checkLimits(value, pname))
cfuncs = tuple(filter(None, (b.__dict__.get(cname) for b in cls.__mro__)))
wname = 'write_' + pname
wfunc = getattr(cls, wname, None)
if wfunc or not pobj.readonly:
# allow write method even when parameter is readonly, but internally writable
def new_wfunc(self, value, pname=pname, wfunc=wfunc, check_funcs=cfuncs):
with self.accessLock:
self.log.debug('validate %r to datatype of %r', value, pname)
validate = self.parameters[pname].datatype.validate
try:
new_value = validate(value)
for c in check_funcs:
if c(self, value):
break
if wfunc:
new_value = wfunc(self, new_value)
self.log.debug('write_%s(%r) returned %r', pname, value, new_value)
if new_value is Done: # TODO: to be removed when all code using Done is updated
return getattr(self, pname)
new_value = value if new_value is None else validate(new_value)
except Exception as e:
if isinstance(e, SECoPError):
e.raising_methods.append(f'{self.name}.write_{pname}')
self.announceUpdate(pname, err=e)
raise
self.announceUpdate(pname, new_value, validate=False)
return new_value
new_wfunc.__name__ = wname
new_wfunc.__qualname__ = wrapped_name + '.' + wname
new_wfunc.__module__ = cls.__module__
cls.wrappedAttributes[wname] = new_wfunc
cls.checkedMethods.update(cls.wrappedAttributes)
# check for programming errors
for attrname in dir(cls):
prefix, _, pname = attrname.partition('_')
if not pname:
continue
if prefix == 'do':
raise ProgrammingError(f'{cls.__name__!r}: old style command {attrname!r} not supported anymore')
if prefix in ('read', 'write') and attrname not in cls.checkedMethods:
raise ProgrammingError(f'{cls.__name__}.{attrname} defined, but {pname!r} is no parameter')
try:
# update Status type
cls.Status = cls.status.datatype.members[0]._enum
except AttributeError:
pass
res = {}
# collect info about properties
for pn, pv in cls.propertyDict.items():
if pv.settable:
res[pn] = pv
# collect info about parameters and their properties
for param, pobj in cls.accessibles.items():
res[param] = {}
for pn, pv in pobj.getProperties().items():
if pv.settable:
res[param][pn] = pv
cls.configurables = res
def __new__(cls, *args, **kwds):
wrapper_class = wrapperClasses.get(cls)
if wrapper_class is None:
wrapper_class = type('_' + cls.__name__, (cls,), cls.wrappedAttributes)
wrapperClasses[cls] = wrapper_class
return super().__new__(wrapper_class)
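In practice this wrapping means a module implementation only returns (or accepts) the raw value; datatype validation and announceUpdate happen in the generated wrapper. A minimal sketch (the Thermometer class and its _read_hw helper are illustrative, not part of frappy):

    from frappy.datatypes import FloatRange
    from frappy.modules import Readable
    from frappy.params import Parameter

    class Thermometer(Readable):
        value = Parameter('sensor reading', FloatRange(unit='K'))

        def read_value(self):
            # just return the raw reading; the wrapper created by
            # __init_subclass__ validates it and announces the update
            return self._read_hw()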
class Feature(HasAccessibles):
"""all things belonging to a small, predefined functionality influencing the working of a module
a mixin with Feature as a direct base class is recognized as a SECoP feature
and reported in the module property 'features'
"""
class PollInfo:
def __init__(self, pollinterval, trigger_event):
self.interval = pollinterval
self.last_main = 0
self.last_slow = 0
self.pending_errors = set()
self.polled_parameters = []
self.fast_flag = False
self.trigger_event = trigger_event
def trigger(self, immediate=False):
"""trigger a recalculation of poll due times
:param immediate: when True, doPoll should be called as soon as possible
"""
if immediate:
self.last_main = 0
self.trigger_event.set()
def update_interval(self, pollinterval):
if not self.fast_flag:
self.interval = pollinterval
self.trigger()
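For example, a module that has just changed something on the hardware can request an immediate re-poll instead of waiting for the next due time (a sketch, to be used inside a module method):

    if self.pollInfo:   # pollInfo exists once the poll thread has been set up
        self.pollInfo.trigger(immediate=True)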
class Module(HasAccessibles):
"""basic module
all SECoP modules derive from this.
:param name: the modules name
:param logger: a logger instance
:param cfgdict: the dict from this modules section in the config file
:param srv: the server instance
Notes:
- the programmer normally should not need to reimplement :meth:`__init__`
- within modules, parameters should only be addressed as ``self.<pname>``,
i.e. ``self.value``, ``self.target`` etc...
- these are accessing the cached version.
- they can also be written to, generating an async update
- if you want to 'update from the hardware', call ``self.read_<pname>()`` instead
- the return value of this method will be used as the new cached value and
an async update will be sent automatically.
- if you want to 'update the hardware' call ``self.write_<pname>(<new value>)``.
- The return value of this method will also update the cache.
"""
# static properties, definitions in derived classes should overwrite earlier ones.
# note: properties don't change after startup and are usually filled
# with data from a cfg file...
# note: only the properties predefined here are allowed to be set in the cfg file
export = Property('flag if this module is to be exported', BoolType(), default=True, export=False)
group = Property('optional group the module belongs to', StringType(), default='', extname='group')
description = Property('description of the module', TextType(), extname='description', mandatory=True)
meaning = Property('optional meaning indicator', TupleOf(StringType(), IntRange(0, 50)),
default=('', 0), extname='meaning')
visibility = Property('optional visibility hint', EnumType('visibility', user=1, advanced=2, expert=3),
default='user', extname='visibility')
implementation = Property('internal name of the implementation class of the module', StringType(),
extname='implementation')
interface_classes = Property('official highest interface-class of the module', ArrayOf(StringType()),
extname='interface_classes')
features = Property('list of features', ArrayOf(StringType()), extname='features')
pollinterval = Property('poll interval for parameters handled by doPoll', FloatRange(0.1, 120), default=5)
slowinterval = Property('poll interval for other parameters', FloatRange(0.1, 120), default=15)
omit_unchanged_within = Property('default for minimum time between updates of unchanged values',
NoneOr(FloatRange(0)), export=False, default=None)
enablePoll = True
pollInfo = None
triggerPoll = None # trigger event for polls. used on io modules and modules without io
def __init__(self, name, logger, cfgdict, srv):
# remember the secnode for interacting with other modules and the
# server
self.secNode = srv.secnode
self.log = logger
self.name = name
self.paramCallbacks = {}
self.earlyInitDone = False
self.initModuleDone = False
self.startModuleDone = False
self.remoteLogHandler = None
self.accessLock = threading.RLock() # for read_* / write_* methods
self.updateLock = threading.RLock() # for announceUpdate
self.polledModules = [] # modules polled by thread started in self.startModules
self.attachedModules = {}
self.errors = []
self._isinitialized = False
self.updateCallback = srv.dispatcher.announce_update
# handle module properties
# 1) make local copies of properties
super().__init__()
# conversion from exported names to internal attribute names
self.accessiblename2attr = {}
self.writeDict = {} # values of parameters to be written
# properties, parameters and commands are auto-merged upon subclassing
self.parameters = {}
self.commands = {}
# 2) check and apply properties specified in cfgdict as
# '<propertyname> = <propertyvalue>'
# pylint: disable=consider-using-dict-items
for key in self.propertyDict:
value = cfgdict.pop(key, None)
if value is not None:
try:
if isinstance(value, dict):
self.setProperty(key, value['value'])
else:
self.setProperty(key, value)
except BadValueError:
self.errors.append(f'{key}: value {value!r} does not match {self.propertyDict[key].datatype!r}!')
# 3) set automatic properties
mycls, = self.__class__.__bases__ # skip the wrapper class
myclassname = f'{mycls.__module__}.{mycls.__name__}'
self.implementation = myclassname
# list of only the 'highest' secop module class
self.interface_classes = [
b.__name__ for b in mycls.__mro__ if b.__name__ in SECoP_BASE_CLASSES][:1]
# handle Features
self.features = [b.__name__ for b in mycls.__mro__ if Feature in b.__bases__]
# handle accessibles
# 1) make local copies of parameter objects
# they need to be individual per instance since we use them also
# to cache the current value + qualifiers...
# do not re-use self.accessibles as this is the same for all instances
accessibles = self.accessibles
self.accessibles = {}
for aname, aobj in accessibles.items():
# make a copy of the Parameter/Command object
aobj = aobj.copy()
acfg = cfgdict.pop(aname, None)
self._add_accessible(aname, aobj, cfg=acfg)
# 3) complain about names not found as accessible or property names
if cfgdict:
self.errors.append(
f"{', '.join(cfgdict.keys())} does not exist (use one of"
f" {', '.join(list(self.accessibles) + list(self.propertyDict))})")
# 5) ensure consistency of all accessibles added here
for aobj in self.accessibles.values():
aobj.finish(self)
# Modify units AFTER applying the cfgdict
mainvalue = self.parameters.get('value')
if mainvalue:
mainunit = mainvalue.datatype.unit
if mainunit:
self.applyMainUnit(mainunit)
# 6) check complete configuration of * properties
if not self.errors:
try:
self.checkProperties()
except ConfigError as e:
self.errors.append(str(e))
for aname, aobj in self.accessibles.items():
try:
aobj.checkProperties()
except (ConfigError, ProgrammingError) as e:
self.errors.append(f'{aname}: {e}')
if self.errors:
raise ConfigError(self.errors)
# helper cfg-editor
def __iter__(self):
return self.accessibles.__iter__()
def __getitem__(self, item):
return self.accessibles.__getitem__(item)
def applyMainUnit(self, mainunit):
"""replace $ in units of parameters by mainunit"""
for pobj in self.parameters.values():
pobj.datatype.set_main_unit(mainunit)
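A sketch of the effect (module and parameter names are illustrative): a '$' placeholder in a parameter's unit is replaced by the unit of the main value, so derived units follow the per-instance main unit:

    from frappy.datatypes import FloatRange
    from frappy.modules import Drivable
    from frappy.params import Parameter

    class Magnet(Drivable):
        value = Parameter('field', FloatRange(unit='T'))
        ramp = Parameter('ramp rate', FloatRange(unit='$/min'), readonly=False)
        # after applyMainUnit('T') is called in __init__, ramp reports the unit 'T/min'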
def _add_accessible(self, name, accessible, cfg=None):
if self.startModuleDone:
raise ProgrammingError('Accessibles can only be added before startModule()!')
if not self.export: # do not export parameters of a module not exported
accessible.export = False
self.accessibles[name] = accessible
if accessible.export:
self.accessiblename2attr[accessible.export] = name
if isinstance(accessible, Parameter):
self.parameters[name] = accessible
if isinstance(accessible, Command):
self.commands[name] = accessible
if cfg:
try:
for propname, propvalue in cfg.items():
accessible.setProperty(propname, propvalue)
except KeyError:
self.errors.append(f"'{name}' has no property '{propname}'")
except BadValueError as e:
self.errors.append(f'{name}.{propname}: {str(e)}')
if isinstance(accessible, Parameter):
self._handle_writes(name, accessible)
def _handle_writes(self, pname, pobj):
""" register value for writing, if given
apply default when no value is given (in cfg or as Parameter argument)
or complain, when cfg is needed
"""
self.paramCallbacks[pname] = []
if isinstance(pobj, Limit):
basepname = pname.rpartition('_')[0]
baseparam = self.parameters.get(basepname)
if not baseparam:
self.errors.append(f'limit {pname!r} is given, but not {basepname!r}')
return
if baseparam.datatype is None:
return # an error will be reported on baseparam
pobj.set_datatype(baseparam.datatype)
if not pobj.hasDatatype():
self.errors.append(f'{pname} needs a datatype')
return
if pobj.value is None:
if pobj.needscfg:
self.errors.append(f'{pname!r} has no default value and was not given in config!')
if pobj.default is None:
# we do not want to call the setter for this parameter for now,
# this should happen on the first read
pobj.readerror = ConfigError(f'parameter {pname!r} not initialized')
# above error will be triggered on activate after startup,
# when not all hardware parameters are read because of startup timeout
pobj.default = pobj.datatype.default
pobj.value = pobj.default
else:
# value given explicitly, either by cfg or as Parameter argument
pobj.given = True # for PersistentMixin
if hasattr(self, 'write_' + pname):
self.writeDict[pname] = pobj.value
if pobj.default is None:
pobj.default = pobj.value
# this checks again for datatype and sets the timestamp
setattr(self, pname, pobj.value)
def announceUpdate(self, pname, value=None, err=None, timestamp=None, validate=True):
"""announce a changed value or readerror
:param pname: parameter name
:param value: new value or None in case of error
:param err: None or an exception
:param timestamp: a timestamp or None for taking current time
:param validate: True: convert to datatype, in case of error store in readerror
:return:
when err=None and validate=False, the value must already be converted to the datatype
"""
with self.updateLock:
pobj = self.parameters[pname]
timestamp = timestamp or time.time()
if not err:
try:
if validate:
value = pobj.datatype(value)
except Exception as e:
err = e
else:
changed = pobj.value != value
# store the value even in case of error
pobj.value = value
if err:
if secop_error(err) == pobj.readerror:
err.report_error = False
return # no updates for repeated errors
err = secop_error(err)
value_err = value, err
else:
if not changed and timestamp < (pobj.timestamp or 0) + pobj.omit_unchanged_within:
# no change within short time -> omit
return
value_err = (value,)
pobj.timestamp = timestamp or time.time()
pobj.readerror = err
for cbfunc, cbargs in self.paramCallbacks[pname]:
try:
cbfunc(*cbargs, *value_err)
except Exception:
pass
if pobj.export:
self.updateCallback(self, pobj)
def addCallback(self, pname, callback_function, *args):
self.paramCallbacks[pname].append((callback_function, args))
def registerCallbacks(self, modobj, autoupdate=()):
"""register callbacks to another module <modobj>
whenever a self.<param> changes or changes its error state:
<modobj>.update_<param>(<value> [, <exc>]) is called,
where <value> is the new value and <exc> is given only in case of error.
if the method does not exist, and <param> is in autoupdate
<modobj>.announceUpdate(<pname>, <value>, <exc>) is called
with <exc> being None in case of no error.
Remark: when <modobj>.update_<param> does not accept the <exc> argument,
nothing happens (the exception is caught by the surrounding try / except).
Any exceptions raised by the callback function are silently ignored.
"""
autoupdate = set(autoupdate)
for pname in self.parameters:
cbfunc = getattr(modobj, 'update_' + pname, None)
if cbfunc:
self.addCallback(pname, cbfunc)
elif pname in autoupdate:
self.addCallback(pname, modobj.announceUpdate, pname)
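A consumer module therefore only has to define matching update_<param> methods and register itself; a sketch (names illustrative, sensor assumed to be an attached module):

    def initModule(self):
        super().initModule()
        self.sensor.registerCallbacks(self, autoupdate=['status'])

    def update_value(self, value, err=None):
        # called on every change of the sensor's value or of its error state
        if err is None:
            self.log.debug('sensor value changed to %r', value)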
def isBusy(self, status=None):
"""helper function for treating substates of BUSY correctly"""
# defined even for non drivable (used for dynamic polling)
return False
def earlyInit(self):
"""initialise module with stuff to be done before all modules are created"""
self.earlyInitDone = True
def initModule(self):
"""initialise module with stuff to be done after all modules are created"""
self.initModuleDone = True
if self.enablePoll or self.writeDict:
# enablePoll == False: we still need the poll thread for writing values from writeDict
if hasattr(self, 'io'):
self.io.polledModules.append(self)
if not self.io.triggerPoll:
# when self.io.enablePoll is False, triggerPoll is not
# created for self.io in the else clause below
self.triggerPoll = threading.Event()
else:
self.triggerPoll = threading.Event()
self.polledModules.append(self)
def startModule(self, start_events):
"""runs after init of all modules
when a thread is started, a trigger function may signal that it
has finished its initial work
start_events.get_trigger(<timeout>) creates such a trigger and
registers it in the server for waiting
<timeout> defaults to 30 seconds
"""
# we do not need self.errors any longer. should we delete it?
# del self.errors
if self.polledModules:
mkthread(self.__pollThread, self.polledModules, start_events.get_trigger())
self.startModuleDone = True
def initialReads(self):
"""initial reads to be done
override to read initial values from HW, when it is not desired
to poll them afterwards
called from the poll thread, after writeInitParams but before
all parameters are polled once
"""
def shutdownModule(self):
"""called when the sever shuts down
any cleanup-work should be performed here, like closing threads and
saving data.
"""
def doPoll(self):
"""polls important parameters like value and status
all other parameters are polled automatically
"""
def setFastPoll(self, flag, fast_interval=0.25):
"""change poll interval
:param flag: enable/disable fast poll mode
:param fast_interval: fast poll interval
"""
if self.pollInfo:
self.pollInfo.fast_flag = flag
self.pollInfo.interval = fast_interval if flag else self.pollinterval
self.pollInfo.trigger()
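A common pattern is to switch to the fast interval while a movement is in progress and back when it finishes (a sketch; the _hw_* helpers are illustrative):

    def write_target(self, target):
        self._hw_start_move(target)
        self.setFastPoll(True)          # poll quickly while moving
        return target

    def read_status(self):
        code, text = self._hw_status()
        if code != self.Status.BUSY:
            self.setFastPoll(False)     # back to the configured pollinterval
        return code, text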
def callPollFunc(self, rfunc, raise_com_failed=False):
"""call read method with proper error handling"""
try:
rfunc()
if rfunc.__name__ in self.pollInfo.pending_errors:
self.log.info('%s: o.k.', rfunc.__name__)
self.pollInfo.pending_errors.discard(rfunc.__name__)
except Exception as e:
if getattr(e, 'report_error', True):
name = rfunc.__name__
self.pollInfo.pending_errors.add(name) # trigger o.k. message after error is resolved
if isinstance(e, SECoPError):
e.raising_methods.append(name)
if e.silent:
self.log.debug('%s', e.format(False))
else:
self.log.error('%s', e.format(False))
if raise_com_failed and isinstance(e, CommunicationFailedError):
raise
else:
# not a SECoPError: this is probably a programming error
# we want to log the traceback
self.log.error('%s', formatException())
def __pollThread(self, modules, started_callback):
"""poll thread body
:param modules: list of modules to be handled by this thread
:param started_callback: to be called after all polls are done once
before polling, parameters which need hardware initialisation are written
"""
polled_modules = [m for m in modules if m.enablePoll]
if hasattr(self, 'registerReconnectCallback'):
# self is a communicator supporting reconnections
def trigger_all(trg=self.triggerPoll, polled_modules=polled_modules):
for m in polled_modules:
m.pollInfo.last_main = 0
m.pollInfo.last_slow = 0
trg.set()
self.registerReconnectCallback('trigger_polls', trigger_all)
# collect all read functions
for mobj in polled_modules:
pinfo = mobj.pollInfo = PollInfo(mobj.pollinterval, self.triggerPoll)
# trigger a poll interval change when self.pollinterval changes.
if 'pollinterval' in mobj.paramCallbacks:
mobj.addCallback('pollinterval', pinfo.update_interval)
for pname, pobj in mobj.parameters.items():
rfunc = getattr(mobj, 'read_' + pname)
if rfunc.poll:
pinfo.polled_parameters.append((mobj, rfunc, pobj))
while True:
try:
for mobj in modules:
# TODO when needed: here we might add a call to a method :meth:`beforeWriteInit`
mobj.writeInitParams()
mobj.initialReads()
# call all read functions a first time
for m in polled_modules:
for mobj, rfunc, _ in m.pollInfo.polled_parameters:
mobj.callPollFunc(rfunc, raise_com_failed=True)
# TODO when needed: here we might add calls to a method :meth:`afterInitPolls`
break
except CommunicationFailedError as e:
# when communication failed, probably all parameters and maybe more modules are affected.
# as retrying everything would take a lot of time (summed-up timeouts), we do not continue
# trying and let the server accept connections; further polls might succeed later
if started_callback:
self.log.error('communication failure on startup: %s', e)
started_callback()
started_callback = None
self.triggerPoll.wait(0.1) # wait for reconnection or max 10 sec.
break
if started_callback:
started_callback()
if not polled_modules: # no polls needed - exit thread
return
to_poll = ()
while True:
now = time.time()
wait_time = 999
for mobj in modules:
pinfo = mobj.pollInfo
wait_time = min(pinfo.last_main + pinfo.interval - now, wait_time,
pinfo.last_slow + mobj.slowinterval - now)
if wait_time > 0 and not to_poll:
# nothing to do
self.triggerPoll.wait(wait_time)
self.triggerPoll.clear()
continue
# call doPoll of all modules where due
for mobj in modules:
pinfo = mobj.pollInfo
if now > pinfo.last_main + pinfo.interval:
try:
pinfo.last_main = (now // pinfo.interval) * pinfo.interval
except ZeroDivisionError:
pinfo.last_main = now
mobj.callPollFunc(mobj.doPoll)
now = time.time()
# find ONE due slow poll and call it
loop = True
while loop: # loops max. 2 times, when to_poll is at end
for mobj, rfunc, pobj in to_poll:
if now > pobj.timestamp + mobj.slowinterval * 0.5:
mobj.callPollFunc(rfunc)
loop = False # one poll done
break
else:
to_poll = []
# collect due slow polls
for mobj in modules:
pinfo = mobj.pollInfo
if now > pinfo.last_slow + mobj.slowinterval:
to_poll.extend(pinfo.polled_parameters)
pinfo.last_slow = (now // mobj.slowinterval) * mobj.slowinterval
if to_poll:
to_poll = iter(to_poll)
else:
loop = False # no slow polls ready
def writeInitParams(self):
"""write values for parameters with configured values
- does proper error handling
called at the beginning of the poller thread and for writing persistent values
"""
for pname in list(self.writeDict):
value = self.writeDict.pop(pname, Done)
# in the meantime, a poller or handler might already have done it
if value is not Done:
wfunc = getattr(self, 'write_' + pname, None)
if wfunc is None:
setattr(self, pname, value)
else:
try:
self.log.debug('initialize parameter %s', pname)
wfunc(value)
except SECoPError as e:
if e.silent:
self.log.debug('%s: %s', pname, str(e))
else:
self.log.error('%s: %s', pname, str(e))
except Exception:
self.log.error(formatException())
def setRemoteLogging(self, conn, level, send_log):
if self.remoteLogHandler is None:
for handler in self.log.handlers:
if isinstance(handler, RemoteLogHandler):
handler.send_log = send_log
self.remoteLogHandler = handler
break
else:
raise ValueError('remote handler not found')
self.remoteLogHandler.set_conn_level(self.name, conn, level)
def checkLimits(self, value, pname='target'):
"""check for limits
:param value: the value to be checked for <pname>_min <= value <= <pname>_max
:param pname: parameter name, default is 'target'
raises RangeError in case the value is not valid
This method is called automatically and therefore rarely needs to be
called by the programmer. It might be used in a check_<param> method
when no automatic super call is desired.
"""
try:
min_, max_ = getattr(self, pname + '_limits')
if not min_ <= value <= max_:
raise RangeError(f'{pname} outside {pname}_limits')
return
except AttributeError:
pass
min_ = getattr(self, pname + '_min', float('-inf'))
max_ = getattr(self, pname + '_max', float('inf'))
if min_ > max_:
raise RangeError(f'invalid limits: {pname}_min > {pname}_max')
if value < min_:
raise RangeError(f'{pname} below {pname}_min')
if value > max_:
raise RangeError(f'{pname} above {pname}_max')
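A sketch of the explicit use in a check method (the Heater class, its output_enabled flag and the no-argument Limit declaration are illustrative): the generic limit check is called by hand, followed by an additional device-specific restriction:

    from frappy.datatypes import FloatRange
    from frappy.errors import RangeError
    from frappy.modules import Drivable
    from frappy.params import Limit, Parameter

    class Heater(Drivable):
        target = Parameter('setpoint', FloatRange(unit='W'), readonly=False)
        target_max = Limit()   # datatype is inherited from target (see _handle_writes above)

        def check_target(self, value):
            self.checkLimits(value)                 # target_min/target_max check by hand
            if value and not self.output_enabled:   # illustrative extra restriction
                raise RangeError('output is disabled')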

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under
@ -18,842 +17,26 @@
# Module authors:
# Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
# Markus Zolliker <markus.zolliker@psi.ch>
# Alexander Zaft <a.zaft@fz-juelich.de>
#
# *****************************************************************************
"""Define base classes for real Modules implemented in the server"""
import time
import threading
from collections import OrderedDict
from frappy.datatypes import ArrayOf, BoolType, EnumType, FloatRange, \
IntRange, StatusType, StringType, TextType, TupleOf, \
NoneOr
from frappy.errors import BadValueError, CommunicationFailedError, ConfigError, \
ProgrammingError, SECoPError, secop_error, RangeError
from frappy.lib import formatException, mkthread, UniqueObject
from frappy.datatypes import FloatRange, \
StatusType, StringType
from frappy.errors import ConfigError, ProgrammingError
from frappy.lib.enum import Enum
from frappy.params import Accessible, Command, Parameter, Limit
from frappy.properties import HasProperties, Property
from frappy.logging import RemoteLogHandler, HasComlog
from frappy.params import Command, Parameter
from frappy.properties import Property
from frappy.logging import HasComlog
Done = UniqueObject('Done')
"""a special return value for a read_<param>/write_<param> method
indicating that the setter is triggered already"""
wrapperClasses = {}
class HasAccessibles(HasProperties):
"""base class of Module
joining the class's properties, parameters and commands dicts with
those of base classes.
wrap read_*/write_* methods
(so the dispatcher will get notified of changed values)
"""
isWrapped = False
checkedMethods = set()
@classmethod
def __init_subclass__(cls): # pylint: disable=too-many-branches
super().__init_subclass__()
if cls.isWrapped:
return
# merge accessibles from all sub-classes, treat overrides
# for now, allow to use also the old syntax (parameters/commands dict)
accessibles = OrderedDict() # dict of accessibles
merged_properties = {} # dict of dict of merged properties
new_names = [] # list of names of new accessibles
override_values = {} # bare values overriding a parameter and methods overriding a command
for base in reversed(cls.__mro__):
for key, value in base.__dict__.items():
if isinstance(value, Accessible):
value.updateProperties(merged_properties.setdefault(key, {}))
if base == cls and key not in accessibles:
new_names.append(key)
accessibles[key] = value
override_values.pop(key, None)
elif key in accessibles:
override_values[key] = value
for aname, aobj in list(accessibles.items()):
if aname in override_values:
aobj = aobj.copy()
value = override_values[aname]
if value is None:
accessibles.pop(aname)
continue
aobj.merge(merged_properties[aname])
aobj.override(value)
# replace the bare value by the created accessible
setattr(cls, aname, aobj)
else:
aobj.merge(merged_properties[aname])
accessibles[aname] = aobj
# rebuild order: (1) inherited items, (2) items from paramOrder, (3) new accessibles
# move (2) to the end
paramOrder = cls.__dict__.get('paramOrder', ())
for aname in paramOrder:
if aname in accessibles:
accessibles.move_to_end(aname)
# ignore unknown names
# move (3) to the end
for aname in new_names:
if aname not in paramOrder:
accessibles.move_to_end(aname)
# note: for python < 3.6 the order of inherited items is not ensured between
# declarations within the same class
cls.accessibles = accessibles
cls.wrappedAttributes = {'isWrapped': True}
# create wrappers for access methods
wrapped_name = '_' + cls.__name__
for pname, pobj in accessibles.items():
# wrap of reading/writing funcs
if not isinstance(pobj, Parameter):
# nothing to do for Commands
continue
rname = 'read_' + pname
rfunc = getattr(cls, rname, None)
# create wrapper
if rfunc:
def new_rfunc(self, pname=pname, rfunc=rfunc):
with self.accessLock:
try:
value = rfunc(self)
self.log.debug("read_%s returned %r", pname, value)
if value is Done: # TODO: to be removed when all code using Done is updated
return getattr(self, pname)
pobj = self.accessibles[pname]
value = pobj.datatype(value)
except Exception as e:
self.log.debug("read_%s failed with %r", pname, e)
if isinstance(e, SECoPError):
e.raising_methods.append(f'{self.name}.read_{pname}')
self.announceUpdate(pname, err=e)
raise
self.announceUpdate(pname, value, validate=False)
return value
new_rfunc.poll = getattr(rfunc, 'poll', True)
else:
def new_rfunc(self, pname=pname):
return getattr(self, pname)
new_rfunc.poll = False
new_rfunc.__name__ = rname
new_rfunc.__qualname__ = wrapped_name + '.' + rname
new_rfunc.__module__ = cls.__module__
cls.wrappedAttributes[rname] = new_rfunc
cname = 'check_' + pname
for postfix in ('_limits', '_min', '_max'):
limname = pname + postfix
if limname in accessibles:
# find the base class, where the parameter <limname> is defined first.
# we have to check all bases, as they may not be treated yet when
# not inheriting from HasAccessibles
base = next(b for b in reversed(cls.__mro__) if limname in b.__dict__)
if cname not in base.__dict__:
# there is no check method yet at this class
# add check function to the class where the limit was defined
setattr(base, cname, lambda self, value, pname=pname: self.checkLimits(value, pname))
cfuncs = tuple(filter(None, (b.__dict__.get(cname) for b in cls.__mro__)))
wname = 'write_' + pname
wfunc = getattr(cls, wname, None)
if wfunc or not pobj.readonly:
# allow write method even when parameter is readonly, but internally writable
def new_wfunc(self, value, pname=pname, wfunc=wfunc, check_funcs=cfuncs):
with self.accessLock:
self.log.debug('validate %r to datatype of %r', value, pname)
validate = self.parameters[pname].datatype.validate
try:
new_value = validate(value)
for c in check_funcs:
if c(self, value):
break
if wfunc:
new_value = wfunc(self, new_value)
self.log.debug('write_%s(%r) returned %r', pname, value, new_value)
if new_value is Done: # TODO: to be removed when all code using Done is updated
return getattr(self, pname)
new_value = value if new_value is None else validate(new_value)
except Exception as e:
if isinstance(e, SECoPError):
e.raising_methods.append(f'{self.name}.write_{pname}')
self.announceUpdate(pname, err=e)
raise
self.announceUpdate(pname, new_value, validate=False)
return new_value
new_wfunc.__name__ = wname
new_wfunc.__qualname__ = wrapped_name + '.' + wname
new_wfunc.__module__ = cls.__module__
cls.wrappedAttributes[wname] = new_wfunc
cls.checkedMethods.update(cls.wrappedAttributes)
# check for programming errors
for attrname in dir(cls):
prefix, _, pname = attrname.partition('_')
if not pname:
continue
if prefix == 'do':
raise ProgrammingError(f'{cls.__name__!r}: old style command {attrname!r} not supported anymore')
if prefix in ('read', 'write') and attrname not in cls.checkedMethods:
raise ProgrammingError(f'{cls.__name__}.{attrname} defined, but {pname!r} is no parameter')
try:
# update Status type
cls.Status = cls.status.datatype.members[0]._enum
except AttributeError:
pass
res = {}
# collect info about properties
for pn, pv in cls.propertyDict.items():
if pv.settable:
res[pn] = pv
# collect info about parameters and their properties
for param, pobj in cls.accessibles.items():
res[param] = {}
for pn, pv in pobj.getProperties().items():
if pv.settable:
res[param][pn] = pv
cls.configurables = res
def __new__(cls, *args, **kwds):
wrapper_class = wrapperClasses.get(cls)
if wrapper_class is None:
wrapper_class = type('_' + cls.__name__, (cls,), cls.wrappedAttributes)
wrapperClasses[cls] = wrapper_class
return super().__new__(wrapper_class)
class Feature(HasAccessibles):
"""all things belonging to a small, predefined functionality influencing the working of a module
a mixin with Feature as a direct base class is recognized as a SECoP feature
and reported in the module property 'features'
"""
class PollInfo:
def __init__(self, pollinterval, trigger_event):
self.interval = pollinterval
self.last_main = 0
self.last_slow = 0
self.pending_errors = set()
self.polled_parameters = []
self.fast_flag = False
self.trigger_event = trigger_event
def trigger(self, immediate=False):
"""trigger a recalculation of poll due times
:param immediate: when True, doPoll should be called as soon as possible
"""
if immediate:
self.last_main = 0
self.trigger_event.set()
def update_interval(self, pollinterval):
if not self.fast_flag:
self.interval = pollinterval
self.trigger()
class Module(HasAccessibles):
"""basic module
all SECoP modules derive from this.
:param name: the modules name
:param logger: a logger instance
:param cfgdict: the dict from this modules section in the config file
:param srv: the server instance
Notes:
- the programmer normally should not need to reimplement :meth:`__init__`
- within modules, parameters should only be addressed as ``self.<pname>``,
i.e. ``self.value``, ``self.target`` etc...
- these are accessing the cached version.
- they can also be written to, generating an async update
- if you want to 'update from the hardware', call ``self.read_<pname>()`` instead
- the return value of this method will be used as the new cached value and
be an async update sent automatically.
- if you want to 'update the hardware' call ``self.write_<pname>(<new value>)``.
- The return value of this method will also update the cache.
"""
# static properties, definitions in derived classes should overwrite earlier ones.
# note: properties don't change after startup and are usually filled
# with data from a cfg file...
# note: only the properties predefined here are allowed to be set in the cfg file
export = Property('flag if this module is to be exported', BoolType(), default=True, export=False)
group = Property('optional group the module belongs to', StringType(), default='', extname='group')
description = Property('description of the module', TextType(), extname='description', mandatory=True)
meaning = Property('optional meaning indicator', TupleOf(StringType(), IntRange(0, 50)),
default=('', 0), extname='meaning')
visibility = Property('optional visibility hint', EnumType('visibility', user=1, advanced=2, expert=3),
default='user', extname='visibility')
implementation = Property('internal name of the implementation class of the module', StringType(),
extname='implementation')
interface_classes = Property('official highest interface-class of the module', ArrayOf(StringType()),
extname='interface_classes')
features = Property('list of features', ArrayOf(StringType()), extname='features')
pollinterval = Property('poll interval for parameters handled by doPoll', FloatRange(0.1, 120), default=5)
slowinterval = Property('poll interval for other parameters', FloatRange(0.1, 120), default=15)
omit_unchanged_within = Property('default for minimum time between updates of unchanged values',
NoneOr(FloatRange(0)), export=False, default=None)
enablePoll = True
# properties, parameters and commands are auto-merged upon subclassing
parameters = {}
commands = {}
# reference to the dispatcher (used for sending async updates)
DISPATCHER = None
pollInfo = None
triggerPoll = None # trigger event for polls. used on io modules and modules without io
def __init__(self, name, logger, cfgdict, srv):
# remember the dispatcher object (for the async callbacks)
self.DISPATCHER = srv.dispatcher
self.log = logger
self.name = name
self.valueCallbacks = {}
self.errorCallbacks = {}
self.earlyInitDone = False
self.initModuleDone = False
self.startModuleDone = False
self.remoteLogHandler = None
self.accessLock = threading.RLock() # for read_* / write_* methods
self.updateLock = threading.RLock() # for announceUpdate
self.polledModules = [] # modules polled by thread started in self.startModules
self.attachedModules = {}
errors = []
self._isinitialized = False
# handle module properties
# 1) make local copies of properties
super().__init__()
# 2) check and apply properties specified in cfgdict as
# '<propertyname> = <propertyvalue>'
# pylint: disable=consider-using-dict-items
for key in self.propertyDict:
value = cfgdict.pop(key, None)
if value is not None:
try:
if isinstance(value, dict):
self.setProperty(key, value['value'])
else:
self.setProperty(key, value)
except BadValueError:
errors.append(f'{key}: value {value!r} does not match {self.propertyDict[key].datatype!r}!')
# 3) set automatic properties
mycls, = self.__class__.__bases__ # skip the wrapper class
myclassname = f'{mycls.__module__}.{mycls.__name__}'
self.implementation = myclassname
# list of all 'secop' modules
# self.interface_classes = [
# b.__name__ for b in mycls.__mro__ if b.__module__.startswith('frappy.modules')]
# list of only the 'highest' secop module class
self.interface_classes = [
b.__name__ for b in mycls.__mro__ if b in SECoP_BASE_CLASSES][:1]
# handle Features
self.features = [b.__name__ for b in mycls.__mro__ if Feature in b.__bases__]
# handle accessibles
# 1) make local copies of parameter objects
# they need to be individual per instance since we use them also
# to cache the current value + qualifiers...
accessibles = {}
# conversion from exported names to internal attribute names
accessiblename2attr = {}
for aname, aobj in self.accessibles.items():
# make a copy of the Parameter/Command object
aobj = aobj.copy()
if not self.export: # do not export parameters of a module not exported
aobj.export = False
if aobj.export:
accessiblename2attr[aobj.export] = aname
accessibles[aname] = aobj
# do not re-use self.accessibles as this is the same for all instances
self.accessibles = accessibles
self.accessiblename2attr = accessiblename2attr
# provide properties to 'filter' out the parameters/commands
self.parameters = {k: v for k, v in accessibles.items() if isinstance(v, Parameter)}
self.commands = {k: v for k, v in accessibles.items() if isinstance(v, Command)}
# 2) check and apply parameter_properties
bad = []
for aname, cfg in cfgdict.items():
aobj = self.accessibles.get(aname, None)
if aobj:
try:
for propname, propvalue in cfg.items():
aobj.setProperty(propname, propvalue)
except KeyError:
errors.append(f"'{aname}' has no property '{propname}'")
except BadValueError as e:
errors.append(f'{aname}.{propname}: {str(e)}')
else:
bad.append(aname)
# 3) complain about names not found as accessible or property names
if bad:
errors.append(
f"{', '.join(bad)} does not exist (use one of {', '.join(list(self.accessibles) + list(self.propertyDict))})")
# 4) register value for writing, if given
# apply default when no value is given (in cfg or as Parameter argument)
# or complain, when cfg is needed
self.writeDict = {} # values of parameters to be written
for pname, pobj in self.parameters.items():
self.valueCallbacks[pname] = []
self.errorCallbacks[pname] = []
if isinstance(pobj, Limit):
basepname = pname.rpartition('_')[0]
baseparam = self.parameters.get(basepname)
if not baseparam:
errors.append(f'limit {pname!r} is given, but not {basepname!r}')
continue
if baseparam.datatype is None:
continue # an error will be reported on baseparam
pobj.set_datatype(baseparam.datatype)
if not pobj.hasDatatype():
errors.append(f'{pname} needs a datatype')
continue
if pobj.value is None:
if pobj.needscfg:
errors.append(f'{pname!r} has no default value and was not given in config!')
if pobj.default is None:
# we do not want to call the setter for this parameter for now,
# this should happen on the first read
pobj.readerror = ConfigError(f'parameter {pname!r} not initialized')
# above error will be triggered on activate after startup,
# when not all hardware parameters are read because of startup timeout
pobj.default = pobj.datatype.default
pobj.value = pobj.default
else:
# value given explicitly, either by cfg or as Parameter argument
pobj.given = True # for PersistentMixin
if hasattr(self, 'write_' + pname):
self.writeDict[pname] = pobj.value
if pobj.default is None:
pobj.default = pobj.value
# this checks again for datatype and sets the timestamp
setattr(self, pname, pobj.value)
# 5) ensure consistency
for aobj in self.accessibles.values():
aobj.finish(self)
# Modify units AFTER applying the cfgdict
mainvalue = self.parameters.get('value')
if mainvalue:
mainunit = mainvalue.datatype.unit
if mainunit:
self.applyMainUnit(mainunit)
# 6) check complete configuration of * properties
if not errors:
try:
self.checkProperties()
except ConfigError as e:
errors.append(str(e))
for aname, aobj in self.accessibles.items():
try:
aobj.checkProperties()
except (ConfigError, ProgrammingError) as e:
errors.append(f'{aname}: {e}')
if errors:
raise ConfigError(errors)
# helper cfg-editor
def __iter__(self):
return self.accessibles.__iter__()
def __getitem__(self, item):
return self.accessibles.__getitem__(item)
def applyMainUnit(self, mainunit):
"""replace $ in units of parameters by mainunit"""
for pobj in self.parameters.values():
pobj.datatype.set_main_unit(mainunit)
def announceUpdate(self, pname, value=None, err=None, timestamp=None, validate=True):
"""announce a changed value or readerror
:param pname: parameter name
:param value: new value or None in case of error
:param err: None or an exception
:param timestamp: a timestamp or None for taking current time
:param validate: True: convert to datatype, in case of error store in readerror
:return:
when err=None and validate=False, the value must already be converted to the datatype
"""
with self.updateLock:
pobj = self.parameters[pname]
timestamp = timestamp or time.time()
if not err:
try:
if validate:
value = pobj.datatype(value)
except Exception as e:
err = e
else:
changed = pobj.value != value
# store the value even in case of error
pobj.value = value
if err:
if secop_error(err) == pobj.readerror:
err.report_error = False
return # no updates for repeated errors
err = secop_error(err)
elif not changed and timestamp < (pobj.timestamp or 0) + pobj.omit_unchanged_within:
# no change within short time -> omit
return
pobj.timestamp = timestamp or time.time()
if err:
callbacks = self.errorCallbacks
pobj.readerror = arg = err
else:
callbacks = self.valueCallbacks
arg = value
pobj.readerror = None
if pobj.export:
self.DISPATCHER.announce_update(self.name, pname, pobj)
cblist = callbacks[pname]
for cb in cblist:
try:
cb(arg)
except Exception:
# print(formatExtendedTraceback())
pass
def registerCallbacks(self, modobj, autoupdate=()):
"""register callbacks to another module <modobj>
- whenever a self.<param> changes:
<modobj>.update_<param> is called with the new value as argument.
If this method raises an exception, <modobj>.<param> gets into an error state.
If the method does not exist and <param> is in autoupdate,
<modobj>.<param> is updated to self.<param>
- whenever <self>.<param> gets into an error state:
<modobj>.error_update_<param> is called with the exception as argument.
If this method raises an exception, <modobj>.<param> gets into an error state.
If this method does not exist, and <param> is in autoupdate,
<modobj>.<param> gets into the same error state as self.<param>
"""
for pname in self.parameters:
errfunc = getattr(modobj, 'error_update_' + pname, None)
if errfunc:
def errcb(err, p=pname, efunc=errfunc):
try:
efunc(err)
except Exception as e:
modobj.announceUpdate(p, err=e)
self.errorCallbacks[pname].append(errcb)
else:
def errcb(err, p=pname):
modobj.announceUpdate(p, err=err)
if pname in autoupdate:
self.errorCallbacks[pname].append(errcb)
updfunc = getattr(modobj, 'update_' + pname, None)
if updfunc:
def cb(value, ufunc=updfunc, efunc=errcb):
try:
ufunc(value)
except Exception as e:
efunc(e)
self.valueCallbacks[pname].append(cb)
elif pname in autoupdate:
def cb(value, p=pname):
modobj.announceUpdate(p, value)
self.valueCallbacks[pname].append(cb)
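# A minimal usage sketch (assumed example, class and attribute names are hypothetical):
# a module mirroring another module's parameters via registerCallbacks.
#
#     class SensorMonitor(Readable):
#         sensor = Attached()
#
#         def initModule(self):
#             super().initModule()
#             # 'status' has no update_status method, so it is mirrored via autoupdate
#             self.sensor.registerCallbacks(self, autoupdate=['status'])
#
#         def update_value(self, value):
#             # called whenever sensor.value changes; raising here would put
#             # self.value into an error state
#             self.value = value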
def isBusy(self, status=None):
"""helper function for treating substates of BUSY correctly"""
# defined even for non-drivable modules (used for dynamic polling)
return False
def earlyInit(self):
"""initialise module with stuff to be done before all modules are created"""
self.earlyInitDone = True
def initModule(self):
"""initialise module with stuff to be done after all modules are created"""
self.initModuleDone = True
if self.enablePoll or self.writeDict:
# enablePoll == False: we still need the poll thread for writing values from writeDict
if hasattr(self, 'io'):
self.io.polledModules.append(self)
else:
self.triggerPoll = threading.Event()
self.polledModules.append(self)
def startModule(self, start_events):
"""runs after init of all modules
when a thread is started, a trigger function may signal that it
has finished its initial work
start_events.get_trigger(<timeout>) creates such a trigger and
registers it in the server for waiting
<timeout> defaults to 30 seconds
"""
if self.polledModules:
mkthread(self.__pollThread, self.polledModules, start_events.get_trigger())
self.startModuleDone = True
def initialReads(self):
"""initial reads to be done
override to read initial values from HW, when it is not desired
to poll them afterwards
called from the poll thread, after writeInitParams but before
all parameters are polled once
"""
def shutdownModule(self):
"""called when the sever shuts down
any cleanup-work should be performed here, like closing threads and
saving data.
"""
def doPoll(self):
"""polls important parameters like value and status
all other parameters are polled automatically
"""
def setFastPoll(self, flag, fast_interval=0.25):
"""change poll interval
:param flag: enable/disable fast poll mode
:param fast_interval: fast poll interval
"""
if self.pollInfo:
self.pollInfo.fast_flag = flag
self.pollInfo.interval = fast_interval if flag else self.pollinterval
self.pollInfo.trigger()
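# A minimal usage sketch (assumed example, the hardware calls are hypothetical):
# enable fast polling while a movement is in progress and restore the normal
# pollinterval once the module is idle again.
#
#     def write_target(self, target):
#         self._hw.move_to(target)     # hypothetical hardware call
#         self.setFastPoll(True)       # poll quickly while busy
#         return target
#
#     def read_status(self):
#         if self._hw.is_moving():     # hypothetical hardware call
#             return self.Status.BUSY, 'moving'
#         self.setFastPoll(False)      # back to the normal pollinterval
#         return self.Status.IDLE, ''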
def callPollFunc(self, rfunc, raise_com_failed=False):
"""call read method with proper error handling"""
try:
rfunc()
if rfunc.__name__ in self.pollInfo.pending_errors:
self.log.info('%s: o.k.', rfunc.__name__)
self.pollInfo.pending_errors.discard(rfunc.__name__)
except Exception as e:
if getattr(e, 'report_error', True):
name = rfunc.__name__
self.pollInfo.pending_errors.add(name) # trigger o.k. message after error is resolved
if isinstance(e, SECoPError):
e.raising_methods.append(name)
if e.silent:
self.log.debug('%s', e.format(False))
else:
self.log.error('%s', e.format(False))
if raise_com_failed and isinstance(e, CommunicationFailedError):
raise
else:
# not a SECoPError: this is probably a programming error
# we want to log the traceback
self.log.error('%s', formatException())
def __pollThread(self, modules, started_callback):
"""poll thread body
:param modules: list of modules to be handled by this thread
:param started_callback: to be called after all polls are done once
before polling, parameters which need hardware initialisation are written
"""
polled_modules = [m for m in modules if m.enablePoll]
if hasattr(self, 'registerReconnectCallback'):
# self is a communicator supporting reconnections
def trigger_all(trg=self.triggerPoll, polled_modules=polled_modules):
for m in polled_modules:
m.pollInfo.last_main = 0
m.pollInfo.last_slow = 0
trg.set()
self.registerReconnectCallback('trigger_polls', trigger_all)
# collect all read functions
for mobj in polled_modules:
pinfo = mobj.pollInfo = PollInfo(mobj.pollinterval, self.triggerPoll)
# trigger a poll interval change when self.pollinterval changes.
if 'pollinterval' in mobj.valueCallbacks:
mobj.valueCallbacks['pollinterval'].append(pinfo.update_interval)
for pname, pobj in mobj.parameters.items():
rfunc = getattr(mobj, 'read_' + pname)
if rfunc.poll:
pinfo.polled_parameters.append((mobj, rfunc, pobj))
while True:
try:
for mobj in modules:
# TODO when needed: here we might add a call to a method :meth:`beforeWriteInit`
mobj.writeInitParams()
mobj.initialReads()
# call all read functions a first time
for m in polled_modules:
for mobj, rfunc, _ in m.pollInfo.polled_parameters:
mobj.callPollFunc(rfunc, raise_com_failed=True)
# TODO when needed: here we might add calls to a method :meth:`afterInitPolls`
break
except CommunicationFailedError as e:
# when communication fails, probably all parameters and maybe more modules are affected.
# as retrying would take a lot of time (summed-up timeouts), we do not continue
# here and let the server accept connections; further polls might succeed later
if started_callback:
self.log.error('communication failure on startup: %s', e)
started_callback()
started_callback = None
self.triggerPoll.wait(0.1) # wait briefly (0.1 s) for a reconnection before giving up on the initial polls
break
if started_callback:
started_callback()
if not polled_modules: # no polls needed - exit thread
return
to_poll = ()
while True:
now = time.time()
wait_time = 999
for mobj in modules:
pinfo = mobj.pollInfo
wait_time = min(pinfo.last_main + pinfo.interval - now, wait_time,
pinfo.last_slow + mobj.slowinterval - now)
if wait_time > 0 and not to_poll:
# nothing to do
self.triggerPoll.wait(wait_time)
self.triggerPoll.clear()
continue
# call doPoll of all modules where due
for mobj in modules:
pinfo = mobj.pollInfo
if now > pinfo.last_main + pinfo.interval:
try:
pinfo.last_main = (now // pinfo.interval) * pinfo.interval
except ZeroDivisionError:
pinfo.last_main = now
mobj.callPollFunc(mobj.doPoll)
now = time.time()
# find ONE due slow poll and call it
loop = True
while loop: # loops at most twice, when to_poll is exhausted
for mobj, rfunc, pobj in to_poll:
if now > pobj.timestamp + mobj.slowinterval * 0.5:
mobj.callPollFunc(rfunc)
loop = False # one poll done
break
else:
to_poll = []
# collect due slow polls
for mobj in modules:
pinfo = mobj.pollInfo
if now > pinfo.last_slow + mobj.slowinterval:
to_poll.extend(pinfo.polled_parameters)
pinfo.last_slow = (now // mobj.slowinterval) * mobj.slowinterval
if to_poll:
to_poll = iter(to_poll)
else:
loop = False # no slow polls ready
def writeInitParams(self):
"""write values for parameters with configured values
- does proper error handling
called at the beginning of the poller thread and for writing persistent values
"""
for pname in list(self.writeDict):
value = self.writeDict.pop(pname, Done)
# in the meantime, a poller or handler might already have done it
if value is not Done:
wfunc = getattr(self, 'write_' + pname, None)
if wfunc is None:
setattr(self, pname, value)
else:
try:
self.log.debug('initialize parameter %s', pname)
wfunc(value)
except SECoPError as e:
if e.silent:
self.log.debug('%s: %s', pname, str(e))
else:
self.log.error('%s: %s', pname, str(e))
except Exception:
self.log.error(formatException())
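# A minimal configuration sketch (assumed example, module class and parameter are
# hypothetical): a value given in the cfg file for a parameter with a write_<param>
# method is put into self.writeDict during __init__ and is written to the hardware
# here, from the poll thread:
#
#     Mod('temp', 'frappy_demo.Temp', 'demo temperature module', ramp=5.0)
#
# With a write_ramp method present, writeDict == {'ramp': 5.0} and write_ramp(5.0)
# is called before the first polls.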
def setRemoteLogging(self, conn, level):
if self.remoteLogHandler is None:
for handler in self.log.handlers:
if isinstance(handler, RemoteLogHandler):
self.remoteLogHandler = handler
break
else:
raise ValueError('remote handler not found')
self.remoteLogHandler.set_conn_level(self, conn, level)
def checkLimits(self, value, pname='target'):
"""check for limits
:param value: the value to be checked for <pname>_min <= value <= <pname>_max
:param pname: parameter name, default is 'target'
raises RangeError in case the value is not valid
This method is called automatically and therefore rarely needs to be
called by the programmer. It might be used in a check_<param> method
when no automatic super call is desired.
"""
try:
min_, max_ = getattr(self, pname + '_limits')
if not min_ <= value <= max_:
raise RangeError(f'{pname} outside {pname}_limits')
return
except AttributeError:
pass
min_ = getattr(self, pname + '_min', float('-inf'))
max_ = getattr(self, pname + '_max', float('inf'))
if min_ > max_:
raise RangeError(f'invalid limits: {pname}_min > {pname}_max')
if value < min_:
raise RangeError(f'{pname} below {pname}_min')
if value > max_:
raise RangeError(f'{pname} above {pname}_max')
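# A minimal usage sketch (assumed example): calling checkLimits explicitly from a
# check_<param> method when the automatic limit check should be combined with an
# additional condition (max_step is a hypothetical extra parameter):
#
#     def check_target(self, value):
#         self.checkLimits(value, 'target')
#         if abs(value - self.value) > self.max_step:
#             raise RangeError('target too far from the current value')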
from .modulebase import Module
class Readable(Module):
"""basic readable module"""
# pylint: disable=invalid-name
Status = Enum('Status',
IDLE=StatusType.IDLE,
WARN=StatusType.WARN,
@ -910,7 +93,7 @@ class Drivable(Writable):
@Command(None, result=None)
def stop(self):
"""cease driving, go to IDLE state"""
"""not implemented - this is a no-op"""
class Communicator(HasComlog, Module):
@ -925,13 +108,18 @@ class Communicator(HasComlog, Module):
"""
raise NotImplementedError()
SECoP_BASE_CLASSES = {Readable, Writable, Drivable, Communicator}
class Attached(Property):
"""a special property, defining an attached module
assign a module name to this property in the cfg file,
and the server will create an attribute with this module
When mandatory is set to False, and there is no value or an empty string
given in the config file, the value of the attribute will be None.
"""
def __init__(self, basecls=Module, description='attached module', mandatory=True):
self.basecls = basecls
@ -940,13 +128,20 @@ class Attached(Property):
def __get__(self, obj, owner):
if obj is None:
return self
if self.name not in obj.attachedModules:
modobj = obj.DISPATCHER.get_module(super().__get__(obj, owner))
modobj = obj.attachedModules.get(self.name)
if not modobj:
modulename = super().__get__(obj, owner)
if not modulename:
return None # happens when mandatory=False and modulename is not given
modobj = obj.secNode.get_module(modulename)
if not modobj:
raise ConfigError(f'attached module {self.name}={modulename!r} '
f'does not exist')
if not isinstance(modobj, self.basecls):
raise ConfigError(f'attached module {self.name}={modobj.name!r} '\
f'must inherit from {self.basecls.__qualname__!r}')
raise ConfigError(f'attached module {self.name}={modobj.name!r} '
f'must inherit from {self.basecls.__qualname__!r}')
obj.attachedModules[self.name] = modobj
return obj.attachedModules.get(self.name) # return None if not given
return modobj
def copy(self):
return Attached(self.basecls, self.description, self.mandatory)
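# A minimal usage sketch (assumed example, names are hypothetical): declaring an
# attached module in a class and assigning it in the cfg file.
#
#     class Controller(Drivable):
#         sensor = Attached(Readable, 'sensor used for readback')
#
#     # in the cfg file:
#     Mod('ts', 'frappy_demo.Sensor', 'temperature sensor')
#     Mod('ctrl', 'frappy_demo.Controller', 'controller', sensor='ts')
#
# At runtime self.sensor resolves to the module registered as 'ts'; with
# mandatory=False and no value given, it resolves to None.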

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under
@ -25,12 +24,12 @@
import inspect
from frappy.datatypes import BoolType, CommandType, DataType, \
DataTypeType, EnumType, NoneOr, OrType, FloatRange, \
StringType, StructOf, TextType, TupleOf, ValueType, ArrayOf
from frappy.errors import BadValueError, WrongTypeError, ProgrammingError
from frappy.properties import HasProperties, Property
from frappy.datatypes import ArrayOf, BoolType, CommandType, DataType, \
DataTypeType, EnumType, FloatRange, NoneOr, OrType, StringType, StructOf, \
TextType, TupleOf, ValueType
from frappy.errors import BadValueError, ProgrammingError, WrongTypeError
from frappy.lib import generalConfig
from frappy.properties import HasProperties, Property
generalConfig.set_default('tolerate_poll_property', False)
generalConfig.set_default('omit_unchanged_within', 0.1)
@ -58,13 +57,17 @@ class Accessible(HasProperties):
def as_dict(self):
return self.propertyValues
def override(self, value):
"""override with a bare value"""
def create_from_value(self, properties, value):
"""return a clone with given value and inherited properties"""
raise NotImplementedError
def clone(self, properties, **kwds):
"""return a clone of ourselfs with inherited properties"""
raise NotImplementedError
def copy(self):
"""return a (deep) copy of ourselfs"""
raise NotImplementedError
return self.clone(self.propertyValues)
def updateProperties(self, merged_properties):
"""update merged_properties with our own properties"""
@ -235,13 +238,15 @@ class Parameter(Accessible):
# avoid export=True overrides export=<name>
self.ownProperties['export'] = self.export
def copy(self):
"""return a (deep) copy of ourselfs"""
res = type(self)()
def clone(self, properties, **kwds):
"""return a clone of ourselfs with inherited properties"""
res = type(self)(**kwds)
res.name = self.name
res.init(self.propertyValues)
res.init(properties)
res.init(res.ownProperties)
if 'datatype' in self.propertyValues:
res.datatype = res.datatype.copy()
res.finish()
return res
def updateProperties(self, merged_properties):
@ -254,9 +259,9 @@ class Parameter(Accessible):
merged_properties.pop(key)
merged_properties.update(self.ownProperties)
def override(self, value):
"""override default"""
self.value = self.datatype(value)
def create_from_value(self, properties, value):
"""return a clone with given value and inherited properties"""
return self.clone(properties, value=self.datatype(value))
def merge(self, merged_properties):
"""merge with inherited properties
@ -391,7 +396,7 @@ class Command(Accessible):
else:
# goodie: allow @Command instead of @Command()
self.func = argument # this is the wrapped method!
if argument.__doc__:
if argument.__doc__ is not None:
self.description = inspect.cleandoc(argument.__doc__)
self.name = self.func.__name__ # this is probably not needed
self._inherit = inherit # save for __set_name__
@ -428,38 +433,49 @@ class Command(Accessible):
def __call__(self, func):
"""called when used as decorator"""
if 'description' not in self.propertyValues and func.__doc__:
if isinstance(self.argument, StructOf):
# automatically set optional struct members
sig = inspect.signature(func)
params = set(sig.parameters.keys())
params.discard('self')
members = set(self.argument.members)
if params != members:
raise ProgrammingError(f'Command {func.__name__}: Function'
f' argument names do not match struct'
f' members!: {params} != {members}')
self.argument.optional = [p for p,v in sig.parameters.items()
if v.default is not inspect.Parameter.empty]
if 'description' not in self.ownProperties and func.__doc__ is not None:
self.description = inspect.cleandoc(func.__doc__)
self.ownProperties['description'] = self.description
self.func = func
return self
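# A minimal usage sketch (assumed example): with a StructOf argument, the decorated
# method's parameter names must match the struct members, and parameters with a
# default value are detected as optional struct members.
#
#     @Command(StructOf(speed=FloatRange(), accel=FloatRange()), result=None)
#     def configure_motion(self, speed, accel=1.0):
#         """set motion parameters ('accel' becomes an optional member)"""
#         self._speed, self._accel = speed, accel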
def copy(self):
"""return a (deep) copy of ourselfs"""
res = type(self)()
def clone(self, properties, **kwds):
"""return a clone of ourselfs with inherited properties"""
res = type(self)(**kwds)
res.name = self.name
res.func = self.func
res.init(self.propertyValues)
res.init(properties)
res.init(res.ownProperties)
if res.argument:
res.argument = res.argument.copy()
if res.result:
res.result = res.result.copy()
self.finish()
res.finish()
return res
def updateProperties(self, merged_properties):
"""update merged_properties with our own properties"""
merged_properties.update(self.ownProperties)
def override(self, value):
"""override method
def create_from_value(self, properties, value):
"""return a clone with given value and inherited properties
this is needed when the @Command is missing on a method overriding a command"""
if not callable(value):
raise ProgrammingError(f'{self.name} = {value!r} is overriding a Command')
self.func = value
if value.__doc__:
self.description = inspect.cleandoc(value.__doc__)
return self.clone(properties)(value)
def merge(self, merged_properties):
"""merge with inherited properties
@ -488,7 +504,7 @@ class Command(Accessible):
"""perform function call
:param module_obj: the module on which the command is to be executed
:param argument: the argument from the do command
:param argument: the argument from the do command (transported value!)
:returns: the return value converted to the result type
- when the argument type is TupleOf, the function is called with multiple arguments
@ -498,6 +514,15 @@ class Command(Accessible):
# pylint: disable=unnecessary-dunder-call
func = self.__get__(module_obj)
if self.argument:
if argument is None:
raise WrongTypeError(
f'{module_obj.__class__.__name__}.{self.name} needs an'
f' argument of type {self.argument}!'
)
# convert transported value to internal value
argument = self.argument.import_value(argument)
# verify range
self.argument.validate(argument)
if isinstance(self.argument, TupleOf):
res = func(*argument)
elif isinstance(self.argument, StructOf):

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under
@ -78,17 +77,14 @@ class PersistentMixin(Module):
super().__init__(name, logger, cfgdict, srv)
persistentdir = os.path.join(generalConfig.logdir, 'persistent')
os.makedirs(persistentdir, exist_ok=True)
self.persistentFile = os.path.join(persistentdir, f'{self.DISPATCHER.equipment_id}.{self.name}.json')
self.persistentFile = os.path.join(persistentdir, f'{self.secNode.equipment_id}.{self.name}.json')
self.initData = {} # "factory" settings
loaded = self.loadPersistentData()
for pname in self.parameters:
pobj = self.parameters[pname]
for pname, pobj in self.parameters.items():
flag = getattr(pobj, 'persistent', False)
if flag:
if flag == 'auto':
def cb(value, m=self):
m.saveParameters()
self.valueCallbacks[pname].append(cb)
self.addCallback(pname, self.saveParameters)
self.initData[pname] = pobj.value
if not pobj.given:
if pname in loaded:
@ -131,16 +127,18 @@ class PersistentMixin(Module):
self.writeInitParams()
return loaded
def saveParameters(self):
def saveParameters(self, _=None):
"""save persistent parameters
- to be called explicitly by the module on a regular basis
- the caller has to make sure that this is not called after
a power down of the connected hardware before loadParameters
the dummy argument avoids the need for a closure when used as a callback
"""
if self.writeDict:
# do not save before all values are written to the hw, as potentially
# factory default values were read in the mean time
# factory default values were read in the meantime
return
self.__save_params()

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under
@ -70,8 +69,8 @@ class MainLogger:
self.log = None
self.console_handler = None
mlzlog.setLoggerClass(mlzlog.MLZLogger)
assert self.log is None
self.log = mlzlog.log = mlzlog.MLZLogger('')
mlzlog.log = mlzlog.MLZLogger('')
self.log = mlzlog.log.getChild('')
self.log.setLevel(mlzlog.DEBUG)
self.log.addHandler(mlzlog.ColoredConsoleHandler())
self.log.handlers[0].setLevel(LOG_LEVELS['comlog'])
@ -82,20 +81,12 @@ class Dispatcher(dispatcher.Dispatcher):
super().__init__(name, log, options, srv)
self.log = srv.log # overwrite child logger
def announce_update(self, modulename, pname, pobj):
def announce_update(self, moduleobj, pobj):
if pobj.readerror:
value = repr(pobj.readerror)
else:
value = pobj.value
logobj = self._modules.get(modulename, self)
# self.log.info('%s:%s %r', modulename, pname, value)
logobj.log.info('%s %r', pname, value)
def register_module(self, moduleobj, modulename, export=True):
self.log.info('registering %s', modulename)
super().register_module(moduleobj, modulename, export)
setattr(main, modulename, moduleobj)
self.get_module(modulename)
moduleobj.log.info('%s %r', pobj.name, value)
logger = MainLogger()
@ -119,6 +110,10 @@ class Playground(Server):
merged_cfg.pop('node', None)
self.module_cfg = merged_cfg
self._processCfg()
for modulename, moduleobj in self.secnode.modules.items():
cls = type(moduleobj).__bases__[0]
moduleobj.log.info('created as %s.%s', cls.__module__, cls.__name__)
setattr(main, modulename, moduleobj)
play = Playground()

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under
@ -27,12 +26,11 @@ import inspect
from frappy.errors import BadValueError, ConfigError, ProgrammingError
from frappy.lib import UniqueObject
from frappy.lib.py35compat import Object
UNSET = UniqueObject('undefined value') #: an unset value, not even None
class HasDescriptors(Object):
class HasDescriptors:
@classmethod
def __init_subclass__(cls):
# when migrating old style declarations, sometimes the trailing comma is not removed

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under
@ -18,6 +17,7 @@
# Module authors:
# Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
# Markus Zolliker <markus.zolliker@psi.ch>
# Alexander Zaft <a.zaft@fz-juelich.de>
#
# *****************************************************************************
"""Dispatcher for SECoP Messages
@ -29,28 +29,18 @@ Interface to the service offering part:
on the connectionobj or on activated connections
- 'add_connection(connectionobj)' registers new connection
- 'remove_connection(connectionobj)' removes a no longer functional connection
Interface to the modules:
- add_module(modulename, moduleobj, export=True) registers a new module under the
given name, may also register it for exporting (making accessible)
- get_module(modulename) returns the requested module or None
- remove_module(modulename_or_obj): removes the module (during shutdown)
"""
import threading
import traceback
from collections import OrderedDict
from time import time as currenttime
from frappy.errors import NoSuchCommandError, NoSuchModuleError, \
NoSuchParameterError, ProtocolError, ReadOnlyError, ConfigError
NoSuchParameterError, ProtocolError, ReadOnlyError
from frappy.params import Parameter
from frappy.protocol.messages import COMMANDREPLY, DESCRIPTIONREPLY, \
DISABLEEVENTSREPLY, ENABLEEVENTSREPLY, ERRORPREFIX, EVENTREPLY, \
HEARTBEATREPLY, IDENTREPLY, IDENTREQUEST, READREPLY, WRITEREPLY, \
LOGGING_REPLY, LOG_EVENT
from frappy.lib import get_class
HEARTBEATREPLY, IDENTREPLY, IDENTREQUEST, LOG_EVENT, LOGGING_REPLY, \
READREPLY, WRITEREPLY
def make_update(modulename, pobj):
@ -71,10 +61,7 @@ class Dispatcher:
self.nodeprops[k] = options.pop(k)
self.log = logger
# map ALL modulename -> moduleobj
self._modules = {}
# list of EXPORTED modules
self._export = []
self.secnode = srv.secnode
# list all connections
self._connections = []
# active (i.e. broadcast-receiving) connections
@ -88,11 +75,6 @@ class Dispatcher:
self.shutdown = srv.shutdown
# handle to server
self.srv = srv
# set of modules that failed creation
self.failed_modules = set()
# list of errors that occurred during initialization
self.errors = []
self.traceback_counter = 0
def broadcast_event(self, msg, reallyall=False):
"""broadcasts a msg to all active connections
@ -111,10 +93,10 @@ class Dispatcher:
for conn in listeners:
conn.send_reply(msg)
def announce_update(self, modulename, pname, pobj):
def announce_update(self, moduleobj, pobj):
"""called by modules param setters to notify subscribers of new values
"""
self.broadcast_event(make_update(modulename, pobj))
self.broadcast_event(make_update(moduleobj.name, pobj))
def subscribe(self, conn, eventname):
self._subscriptions.setdefault(eventname, set()).add(conn)
@ -148,163 +130,10 @@ class Dispatcher:
self._connections.remove(conn)
self.reset_connection(conn)
def register_module(self, moduleobj, modulename, export=True):
self.log.debug('registering module %r as %s (export=%r)',
moduleobj, modulename, export)
self._modules[modulename] = moduleobj
if export:
self._export.append(modulename)
def get_module(self, modulename):
""" Returns a fully initialized module. Or None, if something went
wrong during instantiating/initializing the module."""
modobj = self.get_module_instance(modulename)
if modobj is None:
return None
if modobj._isinitialized:
return modobj
# also call earlyInit on the modules
self.log.debug('initializing module %r', modulename)
try:
modobj.earlyInit()
if not modobj.earlyInitDone:
self.errors.append(f'{modobj.earlyInit.__qualname__} was not called, probably missing super call')
modobj.initModule()
if not modobj.initModuleDone:
self.errors.append(f'{modobj.initModule.__qualname__} was not called, probably missing super call')
except Exception as e:
if self.traceback_counter == 0:
self.log.exception(traceback.format_exc())
self.traceback_counter += 1
self.errors.append(f'error initializing {modulename}: {e!r}')
modobj._isinitialized = True
self.log.debug('initialized module %r', modulename)
return modobj
def get_module_instance(self, modulename):
""" Returns the module in its current initialization state or creates a
new uninitialized module to return.
When creating a new module, srv.module_cfg is accessed to get the
module's configuration.
"""
if modulename in self._modules:
return self._modules[modulename]
if modulename in list(self._modules.values()):
# it's actually already the module object
return modulename
# create module from srv.module_cfg, store and return
self.log.debug('attempting to create module %r', modulename)
opts = self.srv.module_cfg.get(modulename, None)
if opts is None:
raise NoSuchModuleError(f'Module {modulename!r} does not exist on this SEC-Node!')
pymodule = None
try: # pylint: disable=no-else-return
classname = opts.pop('cls')
if isinstance(classname, str):
pymodule = classname.rpartition('.')[0]
if pymodule in self.failed_modules:
# creation has failed already once, do not try again
return None
cls = get_class(classname)
else:
pymodule = classname.__module__
if pymodule in self.failed_modules:
# creation has failed already once, do not try again
return None
cls = classname
except Exception as e:
if str(e) == 'no such class':
self.errors.append(f'{classname} not found')
else:
self.failed_modules.add(pymodule)
if self.traceback_counter == 0:
self.log.exception(traceback.format_exc())
self.traceback_counter += 1
self.errors.append(f'error importing {classname}')
return None
else:
try:
modobj = cls(modulename, self.log.getChild(modulename), opts, self.srv)
except ConfigError as e:
self.errors.append(f'error creating module {modulename}:')
for errtxt in e.args[0] if isinstance(e.args[0], list) else [e.args[0]]:
self.errors.append(' ' + errtxt)
modobj = None
except Exception as e:
if self.traceback_counter == 0:
self.log.exception(traceback.format_exc())
self.traceback_counter += 1
self.errors.append(f'error creating {modulename}')
modobj = None
if modobj:
self.register_module(modobj, modulename, modobj.export)
self.srv.modules[modulename] = modobj # IS HERE THE CORRECT PLACE?
return modobj
def remove_module(self, modulename_or_obj):
moduleobj = self.get_module(modulename_or_obj)
modulename = moduleobj.name
if modulename in self._export:
self._export.remove(modulename)
self._modules.pop(modulename)
self._subscriptions.pop(modulename, None)
for k in [kk for kk in self._subscriptions if kk.startswith(f'{modulename}:')]:
self._subscriptions.pop(k, None)
def list_module_names(self):
# return a copy of our list
return self._export[:]
def export_accessibles(self, modulename):
self.log.debug('export_accessibles(%r)', modulename)
if modulename in self._export:
# omit export=False params!
res = OrderedDict()
for aobj in self.get_module(modulename).accessibles.values():
if aobj.export:
res[aobj.export] = aobj.for_export()
self.log.debug('list accessibles for module %s -> %r',
modulename, res)
return res
self.log.debug('-> module is not to be exported!')
return OrderedDict()
def get_descriptive_data(self, specifier):
"""returns a python object which upon serialisation results in the descriptive data"""
specifier = specifier or ''
modules = {}
result = {'modules': modules}
for modulename in self._export:
module = self.get_module(modulename)
if not module.export:
continue
# some of these need rework !
mod_desc = {'accessibles': self.export_accessibles(modulename)}
mod_desc.update(module.exportProperties())
mod_desc.pop('export', False)
modules[modulename] = mod_desc
modname, _, pname = specifier.partition(':')
if modname in modules: # extension to SECoP standard: description of a single module
result = modules[modname]
if pname in result['accessibles']: # extension to SECoP standard: description of a single accessible
# command is also accepted
result = result['accessibles'][pname]
elif pname:
raise NoSuchParameterError(f'Module {modname!r} has no parameter {pname!r}')
elif not modname or modname == '.':
result['equipment_id'] = self.equipment_id
result['firmware'] = 'FRAPPY - The Python Framework for SECoP'
result['version'] = '2021.02'
result.update(self.nodeprops)
else:
raise NoSuchModuleError(f'Module {modname!r} does not exist')
return result
def _execute_command(self, modulename, exportedname, argument=None):
moduleobj = self.get_module(modulename)
""" Execute a command. Importing the value is done in 'do' for nicer
error messages."""
moduleobj = self.secnode.get_module(modulename)
if moduleobj is None:
raise NoSuchModuleError(f'Module {modulename!r} does not exist')
@ -312,9 +141,6 @@ class Dispatcher:
cobj = moduleobj.commands.get(cname)
if cobj is None:
raise NoSuchCommandError(f'Module {modulename!r} has no command {cname or exportedname!r}')
if cobj.argument:
argument = cobj.argument.import_value(argument)
# now call func
# note: exceptions are handled in handle_request, not here!
result = cobj.do(moduleobj, argument)
@ -323,7 +149,7 @@ class Dispatcher:
return result, {'t': currenttime()}
def _setParameterValue(self, modulename, exportedname, value):
moduleobj = self.get_module(modulename)
moduleobj = self.secnode.get_module(modulename)
if moduleobj is None:
raise NoSuchModuleError(f'Module {modulename!r} does not exist')
@ -336,7 +162,9 @@ class Dispatcher:
if pobj.readonly:
raise ReadOnlyError(f"Parameter {modulename}:{pname} can not be changed remotely")
# validate!
# convert transported value to internal value
value = pobj.datatype.import_value(value)
# verify range
value = pobj.datatype.validate(value, previous=pobj.value)
# note: exceptions are handled in handle_request, not here!
getattr(moduleobj, 'write_' + pname)(value)
@ -344,7 +172,7 @@ class Dispatcher:
return pobj.export_value(), {'t': pobj.timestamp} if pobj.timestamp else {}
def _getParameterValue(self, modulename, exportedname):
moduleobj = self.get_module(modulename)
moduleobj = self.secnode.get_module(modulename)
if moduleobj is None:
raise NoSuchModuleError(f'Module {modulename!r} does not exist')
@ -401,7 +229,7 @@ class Dispatcher:
return (IDENTREPLY, None, None)
def handle_describe(self, conn, specifier, data):
return (DESCRIPTIONREPLY, specifier or '.', self.get_descriptive_data(specifier))
return (DESCRIPTIONREPLY, specifier or '.', self.secnode.get_descriptive_data(specifier))
def handle_read(self, conn, specifier, data):
if data:
@ -440,9 +268,9 @@ class Dispatcher:
modulename, exportedname = specifier, None
if ':' in specifier:
modulename, exportedname = specifier.split(':', 1)
if modulename not in self._export:
if modulename not in self.secnode.export:
raise NoSuchModuleError(f'Module {modulename!r} does not exist')
moduleobj = self.get_module(modulename)
moduleobj = self.secnode.get_module(modulename)
if exportedname is not None:
pname = moduleobj.accessiblename2attr.get(exportedname, True)
if pname and pname not in moduleobj.accessibles:
@ -456,12 +284,12 @@ class Dispatcher:
else:
# activate all modules
self._active_connections.add(conn)
modules = [(m, None) for m in self._export]
modules = [(m, None) for m in self.secnode.export]
# send updates for all subscribed values.
# note: The initial poll already happened before the server is active
for modulename, pname in modules:
moduleobj = self._modules.get(modulename, None)
moduleobj = self.secnode.modules.get(modulename, None)
if pname:
conn.send_reply(make_update(modulename, moduleobj.parameters[pname]))
continue
@ -485,16 +313,13 @@ class Dispatcher:
conn.send_reply((LOG_EVENT, f'{modname}:{level}', msg))
def set_all_log_levels(self, conn, level):
for modobj in self._modules.values():
modobj.setRemoteLogging(conn, level)
for modobj in self.secnode.modules.values():
modobj.setRemoteLogging(conn, level, self.send_log_msg)
def handle_logging(self, conn, specifier, level):
if specifier == '#':
self.log.handlers[1].setLevel(int(level))
return LOGGING_REPLY, specifier, level
if specifier and specifier != '.':
modobj = self._modules[specifier]
modobj.setRemoteLogging(conn, level)
modobj = self.secnode.modules[specifier]
modobj.setRemoteLogging(conn, level, self.send_log_msg)
else:
self.set_all_log_levels(conn, level)
return LOGGING_REPLY, specifier, level

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -143,7 +143,6 @@ class TCPRequestHandler(socketserver.BaseRequestHandler):
if not data:
self.log.error('should not reply empty data!')
return
self.log.debug('send %r', data)
outdata = encode_msg_frame(*data)
with self.send_lock:
if self.running:
@ -232,13 +231,6 @@ class TCPServer(DualStackTCPServer):
self.log.warning('tried again %d times after "Address already in use"', ntry)
self.log.info("TCPServer initiated")
# py35 compatibility
if not hasattr(socketserver.ThreadingTCPServer, '__exit__'):
def __enter__(self):
return self
def __exit__(self, *args):
self.server_close()
def format_address(addr):
if len(addr) == 2:

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under
@ -72,7 +71,7 @@ class ProxyModule(HasIO, Module):
pname, pobj = params.popitem()
props = remoteparams.get(pname, None)
if props is None:
if pobj.export:
if pobj.export and pname != 'status':
self.log.warning('remote parameter %s:%s does not exist', self.module, pname)
continue
dt = props['datatype']
@ -109,17 +108,19 @@ class ProxyModule(HasIO, Module):
# for now, the error message must be enough
def nodeStateChange(self, online, state):
disconnected = Readable.Status.ERROR, 'disconnected'
if online:
if not self._consistency_check_done:
self._check_descriptive_data()
self._consistency_check_done = True
if self.status == disconnected:
self.status = Readable.Status.IDLE, 'connected'
else:
newstatus = Readable.Status.ERROR, 'disconnected'
readerror = CommunicationFailedError('disconnected')
if self.status != newstatus:
if self.status != disconnected:
for pname in set(self.parameters) - set(('module', 'status')):
self.announceUpdate(pname, None, readerror)
self.announceUpdate('status', newstatus)
self.status = disconnected
def checkProperties(self):
pass # skip
@ -194,7 +195,7 @@ def proxy_class(remote_class, name=None):
attrs[aname] = pobj
def rfunc(self, pname=aname):
value, _, readerror = self._secnode.getParameter(self.name, pname, True)
value, _, readerror = self._secnode.getParameter(self.module, pname, True)
if readerror:
raise readerror
return value
@ -204,7 +205,7 @@ def proxy_class(remote_class, name=None):
if not pobj.readonly:
def wfunc(self, value, pname=aname):
value, _, readerror = self._secnode.setParameter(self.name, pname, value)
value, _, readerror = self._secnode.setParameter(self.module, pname, value)
if readerror:
raise readerror
return value
@ -215,7 +216,7 @@ def proxy_class(remote_class, name=None):
cobj = aobj.copy()
def cfunc(self, arg=None, cname=aname):
return self._secnode.execCommand(self.name, cname, arg)[0]
return self._secnode.execCommand(self.module, cname, arg)[0]
attrs[aname] = cobj(cfunc)

View File

@ -1,5 +1,4 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# *****************************************************************************
# This program is free software; you can redistribute it and/or modify it under
# the terms of the GNU General Public License as published by the Free Software

283
frappy/secnode.py Normal file
View File

@ -0,0 +1,283 @@
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under
# the terms of the GNU General Public License as published by the Free Software
# Foundation; either version 2 of the License, or (at your option) any later
# version.
#
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more
# details.
#
# You should have received a copy of the GNU General Public License along with
# this program; if not, write to the Free Software Foundation, Inc.,
# 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
#
# Module authors:
# Alexander Zaft <a.zaft@fz-juelich.de>
#
# *****************************************************************************
import traceback
from collections import OrderedDict
from frappy.dynamic import Pinata
from frappy.errors import ConfigError, NoSuchModuleError, NoSuchParameterError
from frappy.lib import get_class
class SecNode:
"""Managing the modules.
Interface to the modules:
- add_module(module, modulename)
- get_module(modulename) returns the requested module or None if there is
no suitable configuration on the server
"""
def __init__(self, name, logger, options, srv):
self.equipment_id = options.pop('equipment_id', name)
self.nodeprops = {}
for k in list(options):
self.nodeprops[k] = options.pop(k)
# map ALL modulename -> moduleobj
self.modules = {}
# list of EXPORTED modules
self.export = []
self.log = logger
self.srv = srv
# set of modules that failed creation
self.failed_modules = set()
# list of errors that occurred during initialization
self.errors = []
self.traceback_counter = 0
self.name = name
def get_module(self, modulename):
""" Returns a fully initialized module. Or None, if something went
wrong during instantiating/initializing the module."""
modobj = self.get_module_instance(modulename)
if modobj is None:
return None
if modobj._isinitialized:
return modobj
# also call earlyInit on the modules
self.log.debug('initializing module %r', modulename)
try:
modobj.earlyInit()
if not modobj.earlyInitDone:
self.errors.append(f'{modobj.earlyInit.__qualname__} was not '
f'called, probably missing super call')
modobj.initModule()
if not modobj.initModuleDone:
self.errors.append(f'{modobj.initModule.__qualname__} was not '
f'called, probably missing super call')
except Exception as e:
if self.traceback_counter == 0:
self.log.exception(traceback.format_exc())
self.traceback_counter += 1
self.errors.append(f'error initializing {modulename}: {e!r}')
modobj._isinitialized = True
self.log.debug('initialized module %r', modulename)
return modobj
def get_module_instance(self, modulename):
""" Returns the module in its current initialization state or creates a
new uninitialized module to return.
When creating a new module, srv.module_cfg is accessed to get the
module's configuration.
"""
if modulename in self.modules:
return self.modules[modulename]
if modulename in list(self.modules.values()):
# it's actually already the module object
return modulename
# create module from srv.module_cfg, store and return
self.log.debug('attempting to create module %r', modulename)
opts = self.srv.module_cfg.get(modulename, None)
if opts is None:
raise NoSuchModuleError(f'Module {modulename!r} does not exist on '
f'this SEC-Node!')
opts = dict(opts)
pymodule = None
try: # pylint: disable=no-else-return
classname = opts.pop('cls')
if isinstance(classname, str):
pymodule = classname.rpartition('.')[0]
if pymodule in self.failed_modules:
# creation has failed already once, do not try again
return None
cls = get_class(classname)
else:
pymodule = classname.__module__
if pymodule in self.failed_modules:
# creation has failed already once, do not try again
return None
cls = classname
except Exception as e:
if str(e) == 'no such class':
self.errors.append(f'{classname} not found')
else:
self.failed_modules.add(pymodule)
if self.traceback_counter == 0:
self.log.exception(traceback.format_exc())
self.traceback_counter += 1
self.errors.append(f'error importing {classname}')
return None
else:
try:
modobj = cls(modulename, self.log.parent.getChild(modulename),
opts, self.srv)
except ConfigError as e:
self.errors.append(f'error creating module {modulename}:')
for errtxt in e.args[0] if isinstance(e.args[0], list) else [e.args[0]]:
self.errors.append(' ' + errtxt)
modobj = None
except Exception as e:
if self.traceback_counter == 0:
self.log.exception(traceback.format_exc())
self.traceback_counter += 1
self.errors.append(f'error creating {modulename}')
modobj = None
if modobj:
self.add_module(modobj, modulename)
return modobj
def create_modules(self):
self.modules = OrderedDict()
# create and initialize modules
todos = list(self.srv.module_cfg.items())
while todos:
modname, options = todos.pop(0)
if modname in self.modules:
# already created via Attached
continue
# For Pinata modules: we need to access this in self.get_module
self.srv.module_cfg[modname] = options
modobj = self.get_module_instance(modname) # lazy
if modobj is None:
self.log.debug('Module %s returned None', modname)
continue
self.modules[modname] = modobj
if isinstance(modobj, Pinata):
# scan for dynamic devices
pinata = self.get_module(modname)
pinata_modules = list(pinata.scanModules())
for name, _cfg in pinata_modules:
if name in self.srv.module_cfg:
self.log.error('Module %s, from pinata %s, already '
'exists in config file!', name, modname)
self.log.info('Pinata %s found %d modules',
modname, len(pinata_modules))
todos.extend(pinata_modules)
def export_accessibles(self, modulename):
self.log.debug('export_accessibles(%r)', modulename)
if modulename in self.export:
# omit export=False params!
res = OrderedDict()
for aobj in self.get_module(modulename).accessibles.values():
if aobj.export:
res[aobj.export] = aobj.for_export()
self.log.debug('list accessibles for module %s -> %r',
modulename, res)
return res
self.log.debug('-> module is not to be exported!')
return OrderedDict()
def get_descriptive_data(self, specifier):
"""returns a python object which upon serialisation results in the
descriptive data"""
specifier = specifier or ''
modules = {}
result = {'modules': modules}
for modulename in self.export:
module = self.get_module(modulename)
if not module.export:
continue
# some of these need rework !
mod_desc = {'accessibles': self.export_accessibles(modulename)}
mod_desc.update(module.exportProperties())
mod_desc.pop('export', False)
modules[modulename] = mod_desc
modname, _, pname = specifier.partition(':')
if modname in modules: # extension to SECoP standard: description of a single module
result = modules[modname]
if pname in result['accessibles']: # extension to SECoP standard: description of a single accessible
# command is also accepted
result = result['accessibles'][pname]
elif pname:
raise NoSuchParameterError(f'Module {modname!r} '
f'has no parameter {pname!r}')
elif not modname or modname == '.':
result['equipment_id'] = self.equipment_id
result['firmware'] = 'FRAPPY - The Python Framework for SECoP'
result['version'] = '2021.02'
result.update(self.nodeprops)
else:
raise NoSuchModuleError(f'Module {modname!r} does not exist')
return result
def add_module(self, module, modulename):
"""Adds a named module object to this SecNode."""
self.modules[modulename] = module
if module.export:
self.export.append(modulename)
# def remove_module(self, modulename_or_obj):
# moduleobj = self.get_module(modulename_or_obj)
# modulename = moduleobj.name
# if modulename in self.export:
# self.export.remove(modulename)
# self.modules.pop(modulename)
# self._subscriptions.pop(modulename, None)
# for k in [kk for kk in self._subscriptions if kk.startswith(f'{modulename}:')]:
# self._subscriptions.pop(k, None)
def shutdown_modules(self):
"""Call 'shutdownModule' for all modules."""
for name in self._getSortedModules():
self.modules[name].shutdownModule()
def _getSortedModules(self):
"""Sort modules topologically by inverse dependency.
Example: if there is an IO device A and module B depends on it, then
the result will be [B, A].
Right now, if the dependency graph is not a DAG, we give up and return
the unvisited nodes to be dismantled at the end.
Taken from Introduction to Algorithms [CLRS].
"""
def go(name):
if name in done: # visiting a node
return True
if name in visited:
visited.add(name)
return False # cycle in dependencies -> fail
visited.add(name)
if name in unmarked:
unmarked.remove(name)
for module in self.modules[name].attachedModules.values():
res = go(module.name)
if not res:
return False
visited.remove(name)
done.add(name)
l.append(name)
return True
unmarked = set(self.modules.keys()) # unvisited nodes
visited = set() # visited in DFS, but not completed
done = set()
l = [] # list of sorted modules
while unmarked:
if not go(unmarked.pop()):
self.log.error('cyclical dependency between modules!')
return l[::-1] + list(visited) + list(unmarked)
return l[::-1]

View File

@ -25,14 +25,14 @@
import os
import signal
import sys
from collections import OrderedDict
import threading
from frappy.config import load_config
from frappy.errors import ConfigError
from frappy.dynamic import Pinata
from frappy.lib import formatException, generalConfig, get_class, mkthread
from frappy.lib.multievent import MultiEvent
from frappy.params import PREDEFINED_ACCESSIBLES
from frappy.secnode import SecNode
try:
from daemon import DaemonContext
@ -71,19 +71,18 @@ class Server:
multiple cfg files, the interface is taken from the first cfg file
- testonly: test mode. tries to build all modules, but the server is not started
Format of cfg file (for now, both forms are accepted):
old form: new form:
[node <equipment id>] [NODE]
description=<descr> id=<equipment id>
description=<descr>
[interface tcp] [INTERFACE]
bindport=10769 uri=tcp://10769
bindto=0.0.0.0
[module temp] [temp]
ramp=12 ramp=12
Config file:
Format: Example:
Node('<equipment_id>', Node('ex.frappy.demo',
<description>, 'short description\n\nlong descr.',
<main interface>, 'tcp://10769',
secondary=[ secondary=['ws://10770'], # optional
<interfaces>
],
) )
Mod('<module name>', Mod('temp',
<param config> value = Param(unit='K'),
) )
...
"""
self._testonly = testonly
@ -108,9 +107,13 @@ class Server:
signal.signal(signal.SIGINT, self.signal_handler)
signal.signal(signal.SIGTERM, self.signal_handler)
def signal_handler(self, _num, _frame):
if hasattr(self, 'interface') and self.interface:
def signal_handler(self, num, frame):
if hasattr(self, 'interfaces') and self.interfaces:
self.shutdown()
else:
# TODO: we should probably clean up the already initialized modules
# when getting an interrupt while the server is starting
signal.default_int_handler(num, frame)
def start(self):
if not DaemonContext:
@ -152,33 +155,44 @@ class Server:
print(formatException(verbose=True))
raise
opts = {'uri': self.node_cfg['interface']}
scheme, _, _ = opts['uri'].rpartition('://')
scheme = scheme or 'tcp'
cls = get_class(self.INTERFACES[scheme])
with cls(scheme, self.log.getChild(scheme), opts, self) as self.interface:
if opts:
raise ConfigError(self.unknown_options(cls, opts))
self.log.info('startup done, handling transport messages')
if systemd:
systemd.daemon.notify("READY=1\nSTATUS=accepting requests")
t = mkthread(self.interface.serve_forever)
# we wait here on the thread finishing, which means we got a
# signal to shut down or an exception was raised
# TODO: get the exception (and re-raise?)
self.interfaces = []
iface_threads = []
interfaces_started = MultiEvent(default_timeout=1)#default_timeout=15)
lock = threading.Lock()
# TODO: check if only one interface of each type is open?
for interface in [self.node_cfg['interface']] + self.node_cfg.get(
'secondary', []
):
opts = {'uri': interface}
t = mkthread(
self._interfaceThread,
opts,
lock,
self.interfaces.append,
interfaces_started.get_trigger(),
)
iface_threads.append(t)
interfaces_started.wait()
self.log.info('startup done, handling transport messages')
if systemd:
systemd.daemon.notify("READY=1\nSTATUS=accepting requests")
self.log.info('Started %d interfaces' % len(self.interfaces))
# we wait here on the thread finishing, which means we got a
# signal to shut down or an exception was raised
# TODO: get the exception (and re-raise?)
for t in iface_threads:
t.join()
self.interface = None # fine due to the semantics of 'with'
# server_close() called by 'with'
self.log.info(f'stopped listening, cleaning up'
f' {len(self.modules)} modules')
f' {len(self.secnode.modules)} modules')
# if systemd:
# if self._restart:
# systemd.daemon.notify('RELOADING=1')
# else:
# systemd.daemon.notify('STOPPING=1')
for name in self._getSortedModules():
self.modules[name].shutdownModule()
self.secnode.shutdown_modules()
if self._restart:
self.restart_hook()
self.log.info('restarting')
@ -187,11 +201,28 @@ class Server:
def restart(self):
if not self._restart:
self._restart = True
self.interface.shutdown()
for iface in self.interfaces:
iface.shutdown()
def shutdown(self):
self._restart = False
self.interface.shutdown()
for iface in self.interfaces:
iface.shutdown()
def _interfaceThread(self, opts, lock, if_cb, start_cb):
scheme, _, _ = opts['uri'].rpartition('://')
iface = opts['uri']
scheme = scheme or 'tcp'
cls = get_class(self.INTERFACES[scheme])
with cls(scheme, self.log.getChild(scheme), opts, self) as interface:
if opts:
raise ConfigError(self.unknown_options(cls, opts))
with lock:
if_cb(interface)
start_cb()
interface.serve_forever()
# server_close() called by 'with'
self.log.info(f'stopped {iface}')
def _processCfg(self):
"""Processes the module configuration.
@ -205,50 +236,27 @@ class Server:
errors = []
opts = dict(self.node_cfg)
cls = get_class(opts.pop('cls'))
self.dispatcher = cls(opts.pop('name', self._cfgfiles),
self.log.getChild('dispatcher'), opts, self)
name = opts.pop('name', self._cfgfiles)
# TODO: opts not in both
self.secnode = SecNode(name, self.log.getChild('secnode'), opts, self)
self.dispatcher = cls(name, self.log.getChild('dispatcher'), opts, self)
if opts:
self.dispatcher.errors.append(self.unknown_options(cls, opts))
self.modules = OrderedDict()
# create and initialize modules
todos = list(self.module_cfg.items())
while todos:
modname, options = todos.pop(0)
if modname in self.modules:
# already created by Dispatcher (via Attached)
continue
# For Pinata modules: we need to access this in Dispatcher.get_module
self.module_cfg[modname] = dict(options)
modobj = self.dispatcher.get_module_instance(modname) # lazy
if modobj is None:
self.log.debug('Module %s returned None', modname)
continue
self.modules[modname] = modobj
if isinstance(modobj, Pinata):
# scan for dynamic devices
pinata = self.dispatcher.get_module(modname)
pinata_modules = list(pinata.scanModules())
for name, _cfg in pinata_modules:
if name in self.module_cfg:
self.log.error('Module %s, from pinata %s, already'
' exists in config file!', name, modname)
self.log.info('Pinata %s found %d modules', modname, len(pinata_modules))
todos.extend(pinata_modules)
self.secnode.errors.append(self.unknown_options(cls, opts))
self.secnode.create_modules()
# initialize all modules by getting them with Dispatcher.get_module,
# which is done in the get_descriptive data
# TODO: caching, to not make this extra work
self.dispatcher.get_descriptive_data('')
self.secnode.get_descriptive_data('')
# =========== All modules are initialized ===========
# all errors from initialization process
errors = self.dispatcher.errors
errors = self.secnode.errors
if not self._testonly:
start_events = MultiEvent(default_timeout=30)
for modname, modobj in self.modules.items():
for modname, modobj in self.secnode.modules.items():
# startModule must return either a timeout value or None (default 30 sec)
start_events.name = f'module {modname}'
modobj.startModule(start_events)
@ -275,7 +283,8 @@ class Server:
self.log.info('all modules started')
history_path = os.environ.get('FRAPPY_HISTORY')
if history_path:
from frappy_psi.historywriter import FrappyHistoryWriter # pylint: disable=import-outside-toplevel
from frappy_psi.historywriter import \
FrappyHistoryWriter # pylint: disable=import-outside-toplevel
writer = FrappyHistoryWriter(history_path, PREDEFINED_ACCESSIBLES.keys(), self.dispatcher)
# treat writer as a connection
self.dispatcher.add_connection(writer)
@ -288,41 +297,3 @@ class Server:
# history_path = os.environ.get('ALTERNATIVE_HISTORY')
# if history_path:
# from frappy_<xx>.historywriter import ... etc.
def _getSortedModules(self):
"""Sort modules topologically by inverse dependency.
Example: if there is an IO device A and module B depends on it, then
the result will be [B, A].
Right now, if the dependency graph is not a DAG, we give up and return
the unvisited nodes to be dismantled at the end.
Taken from Introduction to Algorithms [CLRS].
"""
def go(name):
if name in done: # visiting a node
return True
if name in visited:
visited.add(name)
return False # cycle in dependencies -> fail
visited.add(name)
if name in unmarked:
unmarked.remove(name)
for module in self.modules[name].attachedModules.values():
res = go(module.name)
if not res:
return False
visited.remove(name)
done.add(name)
l.append(name)
return True
unmarked = set(self.modules.keys()) # unvisited nodes
visited = set() # visited in DFS, but not completed
done = set()
l = [] # list of sorted modules
while unmarked:
if not go(unmarked.pop()):
self.log.error('cyclical dependency between modules!')
return l[::-1] + list(visited) + list(unmarked)
return l[::-1]

View File

@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,5 +1,4 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under
@ -240,6 +239,7 @@ class HasStates:
@Command
def stop(self):
"""stop state machine"""
self.stop_machine()
def final_status(self, code=IDLE, text=''):

View File

@ -1,164 +0,0 @@
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under
# the terms of the GNU General Public License as published by the Free Software
# Foundation; either version 2 of the License, or (at your option) any later
# version.
#
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more
# details.
#
# You should have received a copy of the GNU General Public License along with
# this program; if not, write to the Free Software Foundation, Inc.,
# 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
#
# Module authors:
# Markus Zolliker <markus.zolliker@psi.ch>
#
# *****************************************************************************
"""convenience class to create a struct Parameter together with indivdual params
Usage:
class Controller(Drivable):
...
ctrlpars = StructParam('ctrlpars struct', {
'p': Parameter('control parameter p', FloatRange()),
'i': Parameter('control parameter i', FloatRange()),
'd': Parameter('control parameter d', FloatRange()),
}, 'pid_', readonly=False)
...
then implement either read_ctrlpars and write_ctrlpars or
read_pid_p, read_pid_i, read_pid_d, write_pid_p, write_pid_i and write_pid_d
the methods not implemented will be created automatically
"""
from frappy.core import Parameter, Property
from frappy.datatypes import BoolType, DataType, StructOf, ValueType
from frappy.errors import ProgrammingError
class StructParam(Parameter):
"""create a struct parameter together with individual parameters
in addition to normal Parameter arguments:
:param paramdict: dict <member name> of Parameter(...)
:param prefix_or_map: either a prefix that is prepended to each member name to form the parameter name,
or a dict mapping <member name> to <parameter name>
"""
# use properties, as simple attributes are not considered on copy()
paramdict = Property('dict <member name> of Parameter(...)', ValueType())
hasStructRW = Property('has a read_<struct param> or write_<struct param> method',
BoolType(), default=False)
insideRW = 0 # counter for avoiding multiple superfluous updates
def __init__(self, description=None, paramdict=None, prefix_or_map='', *, datatype=None, readonly=False, **kwds):
if isinstance(paramdict, DataType):
raise ProgrammingError('second argument must be a dict of Param')
if datatype is None and paramdict is not None: # omit the following on Parameter.copy()
if isinstance(prefix_or_map, str):
prefix_or_map = {m: prefix_or_map + m for m in paramdict}
for membername, param in paramdict.items():
param.name = prefix_or_map[membername]
datatype = StructOf(**{m: p.datatype for m, p in paramdict.items()})
kwds['influences'] = [p.name for p in paramdict.values()]
self.updateEnable = {}
super().__init__(description, datatype, paramdict=paramdict, readonly=readonly, **kwds)
def __set_name__(self, owner, name):
# names of access methods of the struct param (e.g. ctrlpars)
struct_read_name = f'read_{name}' # e.g. 'read_ctrlpars'
struct_write_name = f'write_{name}' # e.g. 'write_ctrlpars'
self.hasStructRW = hasattr(owner, struct_read_name) or hasattr(owner, struct_write_name)
for membername, param in self.paramdict.items():
pname = param.name
changes = {
'readonly': self.readonly,
'influences': set(param.influences) | {name},
}
param.ownProperties.update(changes)
param.init(changes)
setattr(owner, pname, param)
param.__set_name__(owner, param.name)
if self.hasStructRW:
rname = f'read_{pname}'
if not hasattr(owner, rname):
def rfunc(self, membername=membername, struct_read_name=struct_read_name):
return getattr(self, struct_read_name)()[membername]
rfunc.poll = False # not polled individually; only read_<struct param> is polled
setattr(owner, rname, rfunc)
if not self.readonly:
wname = f'write_{pname}'
if not hasattr(owner, wname):
def wfunc(self, value, membername=membername,
name=name, rname=rname, struct_write_name=struct_write_name):
valuedict = dict(getattr(self, name))
valuedict[membername] = value
getattr(self, struct_write_name)(valuedict)
return getattr(self, rname)()
setattr(owner, wname, wfunc)
if not self.hasStructRW:
if not hasattr(owner, struct_read_name):
def struct_read_func(self, name=name, flist=tuple(
(m, f'read_{p.name}') for m, p in self.paramdict.items())):
pobj = self.parameters[name]
# disable updates generated from the callbacks of individual params
pobj.insideRW += 1 # guarded by self.accessLock
try:
return {m: getattr(self, f)() for m, f in flist}
finally:
pobj.insideRW -= 1
setattr(owner, struct_read_name, struct_read_func)
if not (self.readonly or hasattr(owner, struct_write_name)):
def struct_write_func(self, value, name=name, funclist=tuple(
(m, f'write_{p.name}') for m, p in self.paramdict.items())):
pobj = self.parameters[name]
pobj.insideRW += 1 # guarded by self.accessLock
try:
return {m: getattr(self, f)(value[m]) for m, f in funclist}
finally:
pobj.insideRW -= 1
setattr(owner, struct_write_name, struct_write_func)
super().__set_name__(owner, name)
def finish(self, modobj=None):
"""register callbacks for consistency"""
super().finish(modobj)
if modobj:
if self.hasStructRW:
def cb(value, modobj=modobj, structparam=self):
for membername, param in structparam.paramdict.items():
setattr(modobj, param.name, value[membername])
modobj.valueCallbacks[self.name].append(cb)
else:
for membername, param in self.paramdict.items():
def cb(value, modobj=modobj, structparam=self, membername=membername):
if not structparam.insideRW:
prev = dict(getattr(modobj, structparam.name))
prev[membername] = value
setattr(modobj, structparam.name, prev)
modobj.valueCallbacks[param.name].append(cb)
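
Stripped of the frappy machinery, the pattern implemented above is simple: when the struct accessors exist, per-member read/write methods are generated as thin wrappers that route through them (and callbacks keep both views consistent). A toy plain-Python sketch of that forward direction, with invented names:

class _StructDemo:
    """toy stand-in for a module with a struct parameter 'ctrlpars'"""
    def __init__(self):
        self._pid = {'p': 1.0, 'i': 0.1, 'd': 0.0}

    # analogous to read_ctrlpars / write_ctrlpars
    def read_ctrlpars(self):
        return dict(self._pid)

    def write_ctrlpars(self, value):
        self._pid.update(value)
        return self.read_ctrlpars()

# generate per-member accessors, analogous to read_pid_p / write_pid_p etc.
for member in ('p', 'i', 'd'):
    def rfunc(self, member=member):
        return self.read_ctrlpars()[member]

    def wfunc(self, value, member=member):
        full = self.read_ctrlpars()
        full[member] = value
        self.write_ctrlpars(full)
        return full[member]

    setattr(_StructDemo, f'read_pid_{member}', rfunc)
    setattr(_StructDemo, f'write_pid_{member}', wfunc)

demo = _StructDemo()
demo.write_pid_p(2.5)
print(demo.read_ctrlpars())  # {'p': 2.5, 'i': 0.1, 'd': 0.0}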


@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# This program is free software; you can redistribute it and/or modify it under
# the terms of the GNU General Public License as published by the Free Software


@ -1,5 +1,4 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# *****************************************************************************
# This program is free software; you can redistribute it and/or modify it under
# the terms of the GNU General Public License as published by the Free Software


@ -1,5 +1,4 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# *****************************************************************************
# This program is free software; you can redistribute it and/or modify it under
# the terms of the GNU General Public License as published by the Free Software


@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# This program is free software; you can redistribute it and/or modify it under
# the terms of the GNU General Public License as published by the Free Software
@ -16,6 +15,7 @@
#
# Module authors:
# Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
# Alexander Zaft <a.zaft@fz-juelich.de>
#
# *****************************************************************************
"""testing devices"""
@ -28,9 +28,8 @@ import time
from frappy.datatypes import ArrayOf, BoolType, EnumType, \
FloatRange, IntRange, StringType, StructOf, TupleOf
from frappy.lib.enum import Enum
from frappy.modules import Drivable
from frappy.modules import Drivable, Readable, Attached
from frappy.modules import Parameter as SECoP_Parameter
from frappy.modules import Readable
from frappy.properties import Property
@ -119,10 +118,9 @@ class MagneticField(Drivable):
default=1, datatype=EnumType(persistent=1, hold=0),
readonly=False,
)
heatswitch = Parameter('name of heat switch device',
datatype=StringType(), export=False,
)
heatswitch = Attached(Switch, description='name of heat switch device')
# pylint: disable=invalid-name
Status = Enum(Drivable.Status, PERSIST=PERSIST, PREPARE=301, RAMPING=302, FINISH=303)
status = Parameter(datatype=TupleOf(EnumType(Status), StringType()))
@ -130,7 +128,6 @@ class MagneticField(Drivable):
def initModule(self):
super().initModule()
self._state = Enum('state', idle=1, switch_on=2, switch_off=3, ramp=4).idle
self._heatswitch = self.DISPATCHER.get_module(self.heatswitch)
_thread = threading.Thread(target=self._thread)
_thread.daemon = True
_thread.start()
@ -165,10 +162,10 @@ class MagneticField(Drivable):
if self.target != self.value:
self.log.debug('got new target -> switching heater on')
self._state = self._state.enum.switch_on
self._heatswitch.write_target('on')
self.heatswitch.write_target('on')
if self._state == self._state.enum.switch_on:
# wait until switch is on
if self._heatswitch.read_value() == 'on':
if self.heatswitch.read_value() == 'on':
self.log.debug('heatswitch is on -> ramp to %.3f',
self.target)
self._state = self._state.enum.ramp
@ -178,7 +175,7 @@ class MagneticField(Drivable):
if self.mode:
self.log.debug('at field -> switching heater off')
self._state = self._state.enum.switch_off
self._heatswitch.write_target('off')
self.heatswitch.write_target('off')
else:
self.log.debug('at field -> hold')
self._state = self._state.enum.idle
@ -189,7 +186,7 @@ class MagneticField(Drivable):
self.value += step
if self._state == self._state.enum.switch_off:
# wait until switch is off
if self._heatswitch.read_value() == 'off':
if self.heatswitch.read_value() == 'off':
self.log.debug('heatswitch is off at %.3f', self.value)
self._state = self._state.enum.idle
self.read_status() # update async
@ -197,6 +194,7 @@ class MagneticField(Drivable):
self.log.error(self, 'main thread exited unexpectedly!')
def stop(self):
"""stop at current value"""
self.write_target(self.read_value())
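
The changes above replace the string-typed 'heatswitch' parameter and the explicit DISPATCHER.get_module() lookup with an Attached descriptor, so the attached module is resolved by the framework and used directly. A minimal sketch of the pattern (module bodies trimmed, names illustrative):

from frappy.modules import Attached, Drivable

class Switch(Drivable):
    """stand-in for the real heat switch module"""

class MagneticField(Drivable):
    # declared as an attachment instead of a subdev_* string parameter; frappy
    # resolves the module name given in the configuration, so self.heatswitch
    # is the module object itself
    heatswitch = Attached(Switch, description='heat switch device')

    def write_target(self, target):
        self.heatswitch.write_target('on')
        return target

In the configuration the attachment is still given by module name; only the manual lookup in initModule() becomes unnecessary.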
@ -269,12 +267,8 @@ class Label(Readable):
system = Parameter("Name of the magnet system",
datatype=StringType(), export=False,
)
subdev_mf = Parameter("name of subdevice for magnet status",
datatype=StringType(), export=False,
)
subdev_ts = Parameter("name of subdevice for sample temp",
datatype=StringType(), export=False,
)
mf = Attached(MagneticField, description="subdevice for magnet status")
ts = Attached(SampleTemp, description="subdevice for sample temp")
value = Parameter("final value of label string", default='',
datatype=StringType(),
)
@ -282,18 +276,16 @@ class Label(Readable):
def read_value(self):
strings = [self.system]
dev_ts = self.DISPATCHER.get_module(self.subdev_ts)
if dev_ts:
strings.append(f"at {dev_ts.read_value():.3f} {dev_ts.parameters['value'].datatype.unit}")
if self.ts:
strings.append(f"at {self.ts.read_value():.3f} {self.ts.parameters['value'].datatype.unit}")
else:
strings.append('No connection to sample temp!')
dev_mf = self.DISPATCHER.get_module(self.subdev_mf)
if dev_mf:
mf_stat = dev_mf.read_status()
mf_mode = dev_mf.mode
mf_val = dev_mf.value
mf_unit = dev_mf.parameters['value'].datatype.unit
if self.mf:
mf_stat = self.mf.read_status()
mf_mode = self.mf.mode
mf_val = self.mf.value
mf_unit = self.mf.parameters['value'].datatype.unit
if mf_stat[0] == self.Status.IDLE:
state = 'Persistent' if mf_mode else 'Non-persistent'
else:


@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# This program is free software; you can redistribute it and/or modify it under
# the terms of the GNU General Public License as published by the Free Software


@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# This program is free software; you can redistribute it and/or modify it under
# the terms of the GNU General Public License as published by the Free Software


@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under
@ -17,6 +16,7 @@
#
# Module authors:
# Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
# Alexander Zaft <a.zaft@fz-juelich.de>
#
# *****************************************************************************
@ -28,10 +28,10 @@
import math
from frappy.datatypes import ArrayOf, FloatRange, StringType, StructOf, TupleOf
from frappy.datatypes import ArrayOf, FloatRange, StructOf, TupleOf
from frappy.errors import ConfigError, DisabledError
from frappy.lib.sequence import SequencerMixin, Step
from frappy.modules import Drivable, Parameter
from frappy.modules import Drivable, Parameter, Attached
class GarfieldMagnet(SequencerMixin, Drivable):
@ -47,19 +47,12 @@ class GarfieldMagnet(SequencerMixin, Drivable):
the symmetry setting selects which.
"""
# attached submodules
currentsource = Attached(description='(bipolar) Powersupply')
enable = Attached(description='Switch to set for on/off')
polswitch = Attached(description='Switch to set for polarity')
symmetry = Attached(description='Switch to read for symmetry')
# parameters
subdev_currentsource = Parameter('(bipolar) Powersupply',
datatype=StringType(),
readonly=True, export=False)
subdev_enable = Parameter('Switch to set for on/off',
datatype=StringType(), readonly=True,
export=False)
subdev_polswitch = Parameter('Switch to set for polarity',
datatype=StringType(), readonly=True,
export=False)
subdev_symmetry = Parameter('Switch to read for symmetry',
datatype=StringType(), readonly=True,
export=False)
userlimits = Parameter('User defined limits of device value',
datatype=TupleOf(FloatRange(unit='$'),
FloatRange(unit='$')),
@ -111,7 +104,7 @@ class GarfieldMagnet(SequencerMixin, Drivable):
Note: This may be overridden in derived classes.
"""
# binary search/bisection
maxcurr = self._currentsource.abslimits[1]
maxcurr = self.currentsource.abslimits[1]
mincurr = -maxcurr
maxfield = self._current2field(maxcurr)
minfield = -maxfield
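
The hunk above belongs to a routine that, judging by the comment, inverts the monotonic _current2field calibration by bisection within these limits. A generic sketch of such an inversion (illustrative only, not the project's implementation):

def invert_by_bisection(func, target, lo, hi, tol=1e-6, maxiter=100):
    """find x in [lo, hi] with func(x) ~= target, func assumed monotonically increasing"""
    if not func(lo) <= target <= func(hi):
        raise ValueError('target outside the representable range')
    for _ in range(maxiter):
        mid = 0.5 * (lo + hi)
        if func(mid) < target:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

# e.g. current = invert_by_bisection(self._current2field, wanted_field, mincurr, maxcurr)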
@ -143,26 +136,21 @@ class GarfieldMagnet(SequencerMixin, Drivable):
def initModule(self):
super().initModule()
self._enable = self.DISPATCHER.get_module(self.subdev_enable)
self._symmetry = self.DISPATCHER.get_module(self.subdev_symmetry)
self._polswitch = self.DISPATCHER.get_module(self.subdev_polswitch)
self._currentsource = self.DISPATCHER.get_module(
self.subdev_currentsource)
self.init_sequencer(fault_on_error=False, fault_on_stop=False)
self._symmetry.read_value()
self.symmetry.read_value()
def read_calibration(self):
try:
try:
return self.calibrationtable[self._symmetry.value]
return self.calibrationtable[self.symmetry.value]
except KeyError:
return self.calibrationtable[self._symmetry.value.name]
return self.calibrationtable[self.symmetry.value.name]
except KeyError:
minslope = min(entry[0]
for entry in self.calibrationtable.values())
self.log.error(
'unconfigured calibration for symmetry %r',
self._symmetry.value)
self.symmetry.value)
return [minslope, 0, 0, 0, 0]
def _checkLimits(self, limits):
@ -182,22 +170,22 @@ class GarfieldMagnet(SequencerMixin, Drivable):
return limits
def read_abslimits(self):
maxfield = self._current2field(self._currentsource.abslimits[1])
maxfield = self._current2field(self.currentsource.abslimits[1])
# limit to configured value (if any)
maxfield = min(maxfield, max(self.accessibles['abslimits'].default))
return -maxfield, maxfield
def read_ramp(self):
# This is an approximation!
return self.calibration[0] * abs(self._currentsource.ramp)
return self.calibration[0] * abs(self.currentsource.ramp)
def write_ramp(self, newramp):
# This is an approximation!
self._currentsource.ramp = float(newramp) / self.calibration[0]
self.currentsource.ramp = float(newramp) / self.calibration[0]
def _get_field_polarity(self):
sign = int(self._polswitch.read_value())
if self._enable.read_value():
sign = int(self.polswitch.read_value())
if self.enable.read_value():
return sign
return 0
@ -210,35 +198,35 @@ class GarfieldMagnet(SequencerMixin, Drivable):
return
if current_pol == 0:
# safe to switch
self._polswitch.write_target(
self.polswitch.write_target(
'+1' if polarity > 0 else str(polarity))
return
if self._currentsource.value < 0.1:
self._polswitch.write_target('0')
if self.currentsource.value < 0.1:
self.polswitch.write_target('0')
return
# unsafe to switch, go to safe state first
self._currentsource.write_target(0)
self.currentsource.write_target(0)
def read_value(self):
return self._current2field(
self._currentsource.read_value() *
self.currentsource.read_value() *
self._get_field_polarity())
def readHwStatus(self):
# called from SequencerMixin.read_status if no sequence is running
if self._enable.value == 'Off':
if self.enable.value == 'Off':
return self.Status.WARN, 'Disabled'
if self._enable.read_status()[0] != self.Status.IDLE:
return self._enable.status
if self._polswitch.value in ['0', 0]:
return self.Status.IDLE, 'Shorted, ' + self._currentsource.status[1]
if self._symmetry.value in ['short', 0]:
return self._currentsource.status[
0], 'Shorted, ' + self._currentsource.status[1]
return self._currentsource.read_status()
if self.enable.read_status()[0] != self.Status.IDLE:
return self.enable.status
if self.polswitch.value in ['0', 0]:
return self.Status.IDLE, 'Shorted, ' + self.currentsource.status[1]
if self.symmetry.value in ['short', 0]:
return self.currentsource.status[
0], 'Shorted, ' + self.currentsource.status[1]
return self.currentsource.read_status()
def write_target(self, target):
if target != 0 and self._symmetry.read_value() in ['short', 0]:
if target != 0 and self.symmetry.read_value() in ['short', 0]:
raise DisabledError(
'Symmetry is shorted, please select another symmetry first!')
@ -251,7 +239,7 @@ class GarfieldMagnet(SequencerMixin, Drivable):
seq.append(Step('preparing', 0, self._prepare_ramp))
seq.append(Step('recover', 0, self._recover))
if current_polarity != wanted_polarity:
if self._currentsource.read_value() > 0.1:
if self.currentsource.read_value() > 0.1:
# switching only allowed if current is low enough -> ramp down
# first
seq.append(
@ -281,54 +269,54 @@ class GarfieldMagnet(SequencerMixin, Drivable):
# steps for the sequencing
def _prepare_ramp(self, store, *args):
store.old_window = self._currentsource.window
self._currentsource.window = 1
store.old_window = self.currentsource.window
self.currentsource.window = 1
def _finish_ramp(self, store, *args):
self._currentsource.window = max(store.old_window, 10)
self.currentsource.window = max(store.old_window, 10)
def _recover(self, store):
# check for interlock
if self._currentsource.read_status()[0] != self.Status.ERROR:
if self.currentsource.read_status()[0] != self.Status.ERROR:
return
# recover from interlock
ramp = self._currentsource.ramp
self._polswitch.write_target('0') # short is safe...
self._polswitch._hw_wait()
self._enable.write_target('On') # else setting ramp won't work
self._enable._hw_wait()
self._currentsource.ramp = 60000
self._currentsource.target = 0
self._currentsource.ramp = ramp
ramp = self.currentsource.ramp
self.polswitch.write_target('0') # short is safe...
self.polswitch._hw_wait()
self.enable.write_target('On') # else setting ramp won't work
self.enable._hw_wait()
self.currentsource.ramp = 60000
self.currentsource.target = 0
self.currentsource.ramp = ramp
# safe state... if any of the above fails, the temperatures may
# be too hot!
def _ramp_current(self, store, target):
if abs(self._currentsource.value - target) <= 0.05:
if abs(self.currentsource.value - target) <= 0.05:
# done with this step if no longer BUSY
return self._currentsource.read_status()[0] == 'BUSY'
if self._currentsource.status[0] != 'BUSY':
if self._enable.status[0] == 'ERROR':
self._enable.reset()
self._enable.read_status()
self._enable.write_target('On')
self._enable._hw_wait()
self._currentsource.write_target(target)
return self.currentsource.read_status()[0] == 'BUSY'
if self.currentsource.status[0] != 'BUSY':
if self.enable.status[0] == 'ERROR':
self.enable.reset()
self.enable.read_status()
self.enable.write_target('On')
self.enable._hw_wait()
self.currentsource.write_target(target)
return True # repeat
def _ramp_current_cleanup(self, store, step_was_busy, target):
# don't cleanup if step finished
if step_was_busy:
self._currentsource.write_target(self._currentsource.read_value())
self._currentsource.window = max(store.old_window, 10)
self.currentsource.write_target(self.currentsource.read_value())
self.currentsource.window = max(store.old_window, 10)
def _set_polarity(self, store, target):
if self._polswitch.read_status()[0] == self.Status.BUSY:
if self.polswitch.read_status()[0] == self.Status.BUSY:
return True
if int(self._polswitch.value) == int(target):
if int(self.polswitch.value) == int(target):
return False # done with this step
if self._polswitch.read_value() != 0:
self._polswitch.write_target(0)
if self.polswitch.read_value() != 0:
self.polswitch.write_target(0)
else:
self._polswitch.write_target(target)
self.polswitch.write_target(target)
return True # repeat


@ -36,12 +36,12 @@ from time import sleep, time as currenttime
import PyTango
from frappy.datatypes import ArrayOf, EnumType, FloatRange, IntRange, \
LimitsType, StringType, TupleOf, ValueType
LimitsType, StatusType, StringType, TupleOf, ValueType
from frappy.errors import CommunicationFailedError, ConfigError, \
HardwareError, ProgrammingError
HardwareError, ProgrammingError, WrongTypeError
from frappy.lib import lazy_property
from frappy.modules import Command, Drivable, Module, Parameter, Readable, \
StatusType, Writable, Property
from frappy.modules import Command, Drivable, Module, Parameter, Property, \
Readable, Writable
#####
@ -466,6 +466,31 @@ class AnalogOutput(PyTangoDevice, Drivable):
# replacement of '$' by main unit must be done later
self.__main_unit = mainunit
def _init_abslimits(self):
"""Get abslimits from tango if not configured. Otherwise, check if both
ranges are compatible."""
try:
tangoabslim = (
float(self._getProperty('absmin')),
float(self._getProperty('absmax'))
)
if self.parameters['abslimits'].readerror:
# no abslimits configured in frappy. read from entangle
self.parameters['abslimits'].readerror = None
self.abslimits = tangoabslim
except Exception as e:
self.log.error(e)
# check if compatible
try:
dt = FloatRange(*tangoabslim)
dt.validate(self.parameters['abslimits'].datatype.min)
dt.validate(self.parameters['abslimits'].datatype.max)
except WrongTypeError as e:
raise WrongTypeError(f'Absolute limits configured in frappy \''
f'{self.abslimits}\' extend beyond the limits '
f'defined in entangle \'{tangoabslim}\'!') from e
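
The check relies on FloatRange(*tangoabslim).validate() raising WrongTypeError when a configured limit lies outside the entangle range; conceptually it is just an interval containment test (toy numbers, plain Python):

def check_within(configured, hardware):
    """raise if the configured (min, max) pair extends beyond the hardware (min, max) pair"""
    cfg_min, cfg_max = configured
    hw_min, hw_max = hardware
    if cfg_min < hw_min or cfg_max > hw_max:
        raise ValueError(f'configured limits {configured} extend beyond {hardware}')

check_within((-5.0, 5.0), (-10.0, 10.0))    # ok
# check_within((-15.0, 5.0), (-10.0, 10.0)) # would raise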
def initModule(self):
super().initModule()
# init history
@ -486,6 +511,7 @@ class AnalogOutput(PyTangoDevice, Drivable):
self.log.error(e)
if self.__main_unit:
super().applyMainUnit(self.__main_unit)
self._init_abslimits()
def doPoll(self):
super().doPoll()
@ -615,6 +641,7 @@ class AnalogOutput(PyTangoDevice, Drivable):
sleep(0.3)
def stop(self):
"""cease driving, go to IDLE state"""
self._dev.Stop()


@ -99,6 +99,8 @@ class ZapfPinata(Pinata):
'min': max(devinfo.info['absmin'], -UNLIMITED),
'max': min(devinfo.info['absmax'], UNLIMITED),
}
if devinfo.info['unit'] and devinfo.info['basetype'] == 'float':
config['value']['unit'] = devinfo.info['unit']
if devinfo.info['access'] == 'rw':
config['target'] = {
'min': config['value']['min'],
@ -131,8 +133,6 @@ STATUS_MAP = {
class PLCBase:
status = Parameter(datatype=StatusType(Drivable, 'INITIALIZING',
'DISABLED', 'STARTING'))
status_code = Parameter('raw internal status code',
IntRange(0, 2**32-1))
plcio = Property('plc io device', ValueType())
plc_name = Property('plc io device', StringType(), export=True)
_pinata = Attached(ZapfPinata) # TODO: make this automatic?
@ -159,9 +159,15 @@ class PLCBase:
dataty = cls._map_datatype(info)
if dataty is None:
continue
param = Parameter(info['description'],
dataty,
readonly=readonly)
if info['basetype'] == 'float' and info['unit']: # TODO: better handling
param = Parameter(info['description'],
dataty,
unit=info['unit'],
readonly=readonly)
else:
param = Parameter(info['description'],
dataty,
readonly=readonly)
def read_param(self, parameter=parameter):
code, val = self.plcio.get_param_raw(parameter)
@ -223,7 +229,7 @@ class PLCBase:
if not add_members:
return cls
new_name = '_' + cls.__name__ + '_' \
+ internalize_name("blub")
+ internalize_name("extended")
return type(new_name, (cls,), add_members)
@classmethod
@ -254,10 +260,6 @@ class PLCBase:
msg.append(self.plcio.decode_errid(err_id))
return status, ', '.join(msg)
def read_status_code(self):
state, reason, aux, _ = self.plcio.read_status()
return state << 28 | reason << 24 | aux
@Command()
def stop(self):
"""Stop the operation of this module.


@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under
@ -195,19 +194,19 @@ class Nmr(Readable):
x = val['xval'][:len(val['yval'])]
return (x, val['yval'])
@Command(result=TupleOf(ArrayOf(string, maxlen=100),
@Command(IntRange(1), result=TupleOf(ArrayOf(string, maxlen=100),
ArrayOf(floating, maxlen=100)))
def get_amplitude(self):
"""Last 20 amplitude datapoints."""
rv = self.cell.cell.nmr_paramlog_get('amplitude', 20)
def get_amplitude(self, count):
"""Last <count> amplitude datapoints."""
rv = self.cell.cell.nmr_paramlog_get('amplitude', count)
x = [ str(timestamp) for timestamp in rv['xval']]
return (x,rv['yval'])
@Command(result=TupleOf(ArrayOf(string, maxlen=100),
@Command(IntRange(1), result=TupleOf(ArrayOf(string, maxlen=100),
ArrayOf(floating, maxlen=100)))
def get_phase(self):
"""Last 20 phase datapoints."""
val = self.cell.cell.nmr_paramlog_get('phase', 20)
def get_phase(self, count):
"""Last <count> phase datapoints."""
val = self.cell.cell.nmr_paramlog_get('phase', count)
return ([str(timestamp) for timestamp in val['xval']], val['yval'])
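
With this change the commands take an integer argument: @Command gets the argument datatype positionally and the result datatype as result=..., and the implementation receives the count. A condensed sketch of the same pattern in a self-contained module (names and dummy data invented):

from frappy.core import Command, Readable
from frappy.datatypes import ArrayOf, FloatRange, IntRange, StringType, TupleOf

class ParamLog(Readable):
    """toy module illustrating a command with an integer argument"""

    @Command(IntRange(1), result=TupleOf(ArrayOf(StringType(), maxlen=100),
                                         ArrayOf(FloatRange(), maxlen=100)))
    def get_amplitude(self, count):
        """last <count> amplitude datapoints (dummy data here)"""
        stamps = [str(1700000000 + i) for i in range(count)]
        values = [0.0] * count
        return stamps, values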


@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# MLZ library of Tango servers
# Copyright (c) 2015-2023 by the authors, see LICENSE


@ -1,5 +1,3 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# *****************************************************************************
# This program is free software; you can redistribute it and/or modify it under
# the terms of the GNU General Public License as published by the Free Software

Some files were not shown because too many files have changed in this diff.