133 Commits
develop ... tmp

Author SHA1 Message Date
a930dc8f6a temporary state at PPMS as of 2024-01-26 2024-01-29 08:29:25 +01:00
416cdd5a88 Autogain function for SR830 lock-in driver
Change-Id: If07ec9182e5153e1237b9818ce555162f54e0ae5
2023-12-11 08:31:43 +01:00
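The autogain commit above describes stepping the lock-in sensitivity automatically. A minimal sketch of such a loop, assuming hypothetical driver methods (read_r, get_sensitivity, range_up, range_down) rather than the actual SR830 driver API:

def autogain(lockin, low=0.2, high=0.9):
    # adjust the sensitivity range until R lies between low and high of
    # full scale (all method names here are assumptions, not the real driver)
    while True:
        reading = lockin.read_r()              # magnitude R
        fullscale = lockin.get_sensitivity()   # current range, same unit as R
        if reading > high * fullscale and lockin.range_up():
            continue                           # switched to a less sensitive range, re-check
        if reading < low * fullscale and lockin.range_down():
            continue                           # switched to a more sensitive range, re-check
        return reading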
1bd188e326 For the lockin830 get_par/set_par are implemented.
Change-Id: I5b6707a07d936d24528173a2edae49a148081ff9
2023-12-11 08:31:36 +01:00
f7b29ee959 SR830: moved dicts out of class
Change-Id: If056b1bf4e81c3b609ded087dff2b40c7119903f
2023-12-11 08:31:29 +01:00
f6a0ccb38b Changed write_range, write_tc methods
Change-Id: I335f97bd54deaccf0552b27deb3a7dfe73074e4c
2023-12-11 08:31:17 +01:00
b93a0cd87b New driver for lock-in amplifier SR830
Change-Id: I45c5a06460f4b84cade0eae53188b058510c4473
2023-12-11 08:31:11 +01:00
be6ba73c89 workaround for bug in sea
fix double slash in hdb path

Change-Id: I68ab79c5240abb9fcccbbe5f817f740df2bb5ea6
2023-12-05 14:10:03 +01:00
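A hedged illustration of the double-slash workaround mentioned above; the helper name is an assumption, not the actual frappy_psi.sea code:

import re

def fix_hdb_path(path):
    # collapse runs of slashes, e.g. '/nicos//se/ts' -> '/nicos/se/ts'
    return re.sub(r'/{2,}', '/', path)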
c075738584 ips_mercury: add NOT_FOUND action 2023-12-04 15:47:18 +01:00
0fa2e8332d do not complain when no output module is configured 2023-12-04 15:46:50 +01:00
afb49199a1 fixes in mb11/dil5 cfg files
- add flowpars
- increase om range to -360
- add sea config
2023-12-04 15:45:00 +01:00
416fe6ddc0 frappy.client: fix the case when timestamp is missing
the previous version failed when the timestamp was missing

Change-Id: I77e1fb81b19fb4ee2749d731bafacbac46132f8e
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/32404
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2023-11-30 09:17:11 +01:00
e3cb5d2e60 frappy.io: change default to retry_first_idn=True
Looked at this code again and wondered why the default is not True.
It is far more likely that a programmer simply forgets to set this
property to True than that enabling it by default would do any harm.

Change-Id: I439aedbdfc9c2b12737e3ce1694e90550ddf0e78
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/32270
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
Reviewed-by: Alexander Zaft <a.zaft@fz-juelich.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2023-11-30 09:16:57 +01:00
998367a727 Merge branch 'wip' of gitlab.psi.ch:samenv/frappy into wip 2023-11-23 15:57:17 +01:00
ab918a33ae cc: replace bool by enum in cc.hav and cc.nav 2023-11-23 15:56:23 +01:00
397ec2efbd add pdld laser driver
2 modules: a switch (on/off) and the power (set: target, readback:value)
2023-11-14 09:17:11 +01:00
67032ff59b ma6: set backlash 2023-10-27 15:11:19 +02:00
03c356590b frappy_psi.phytron: implement limit switches 2023-10-02 16:58:09 +02:00
06bec41ed3 ma6: make ts drivable 2023-10-02 16:56:23 +02:00
4cd6929d4b fix ma15 sea config 2023-10-02 16:55:42 +02:00
a89f7a3c44 configs for sample heat stick 2023-10-02 16:54:52 +02:00
a4330081b7 proxy: fix command wrapper
bugfix: return only the value of the execCommand result, not the qualifiers
Change-Id: Iff14779050daa9886e9f7d0396317c5a41695cd1
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/32235
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2023-10-02 13:49:08 +02:00
3b997d7d86 Merge branch 'wip' of gitlab.psi.ch-samenv:samenv/frappy into wip 2023-09-22 13:29:23 +02:00
612295d360 add ill2 2023-09-22 13:29:04 +02:00
9e39a43193 fix fs config 2023-09-22 13:27:54 +02:00
6adfafaa27 more consistent ori1 stick json file 2023-09-22 13:27:10 +02:00
f6c4090b96 fix simulation
+ some fixes in sim_uniax

Change-Id: Ia8703ed988aa904bb2694339f0d3175b28fcb33e
2023-09-19 16:05:52 +02:00
ecef2b8974 more cfg file fixes
Change-Id: I0ba86cd17bb07f480cac6f20994ee854c6e811ae
2023-09-19 15:04:02 +02:00
96a7e2109b cleanup cfg files 2023-09-19 14:43:48 +02:00
2f3c68a5c5 improvements for flame
- frappy_psi.channelswitcher: use isBusy instead of checking value and target
- frappy_psi.ls372: remove underflow mechanism
- frappy_psi.parmod.SwitchDriv: switch the controlled module also when not busy
2023-09-19 14:17:08 +02:00
e9a195d61e flamedil as of 2023-07-04 2023-09-19 14:17:08 +02:00
6ac3938b78 flamedil as of 2023-07-03 2023-09-19 14:17:08 +02:00
b4cfdcfc1a flame sample combined T 2023-09-19 14:16:21 +02:00
d32fb647a6 frappy_psi.ls372: add TemperatureSensor and TemperatureLoop 2023-09-19 14:14:12 +02:00
abf7859fd6 frappy_psi.cryoltd: fixes after frappy upgrade 2023-09-19 14:14:12 +02:00
55ea2b8cc4 frappy_psi.triton: try to fix channel selection before condense action 2023-09-19 14:14:12 +02:00
27600e3ddf fix bad cfg files
Change-Id: Iacba12a2679777dd4ea2892751d82a63221b1361
2023-09-19 14:07:20 +02:00
6b4244f071 Merge branch 'wip' of gitlab.psi.ch:samenv/frappy into wip 2023-09-19 11:01:06 +02:00
1d81fc6fcd frappy_psi.sea: small fixes
- changes in return value of frappy_config command in sea
- do not store sea manager

Change-Id: I5bc1d9a281ad2285b90d3649b4c702a3501d451d
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/32166
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2023-09-19 10:58:07 +02:00
dfce0bdfbc phytron.py: improve status
better analysis of hardware status code

Change-Id: I667b443649db43ff3e572e0a50685aabc9ba2ca2
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/32165
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2023-09-19 10:57:59 +02:00
c39aef10aa Merge branch 'wip' of gitlab.psi.ch-samenv:samenv/frappy into wip 2023-09-14 11:10:12 +02:00
45dd87060b improve client shutdown time
in SecopClient.disconnect, joining the reconnect thread may take
up to 10 s, because of the time.sleep(10) call in the reconnect
thread.

change the _shutdown attribute from bool to an Event, and
use Event.wait instead of time.sleep

Change-Id: Icea6a14ad73df0b3d26ef45806f4c05e6bf18492
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/32137
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2023-09-14 09:10:39 +02:00
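A minimal sketch of the Event-based shutdown described above (class and method names simplified, not the actual frappy.client code):

import threading

class ReconnectingClient:
    def __init__(self):
        self._shutdown = threading.Event()     # was a plain bool before
        self._thread = threading.Thread(target=self._reconnect_loop, daemon=True)
        self._thread.start()

    def _reconnect_loop(self):
        while not self._shutdown.is_set():
            # ... try to reconnect here ...
            # wait() returns as soon as the event is set, unlike
            # time.sleep(10), which always blocks the full interval
            self._shutdown.wait(10)

    def disconnect(self):
        self._shutdown.set()    # wakes the reconnect loop immediately
        self._thread.join()     # returns promptly instead of after up to 10 s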
8019b359c4 change FloatRange arguments minval/maxval to min/max
in the previous version FloatRange(max=100) was neither working
properly nor complaining, because the maxval=None default was
overriding the value for max.

possible fixes:
  - raise an error when min/max used as argument (confusing for
    the programmer, as it is a property)
  - allow both versions minval/maxval and min/max (more code)
  - use min/max and a pylint directive here (the only thing to
    take care of is not to use the min/max builtins in __init__)

this change uses the last option for the fix

Change-Id: Iff0e0c4d0d7b165003bdeffa67a93a1cd7f29eea
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/31982
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2023-09-14 09:09:25 +02:00
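A simplified sketch of the chosen option (not the real FloatRange implementation): accept min/max as keyword arguments, silence pylint's redefined-builtin warning, and avoid the min/max builtins inside __init__:

class FloatRange:
    def __init__(self, min=None, max=None, unit=''):  # pylint: disable=redefined-builtin
        # do not call the min()/max() builtins in here - the names are shadowed
        self.min = float('-inf') if min is None else min
        self.max = float('inf') if max is None else max
        self.unit = unit

    def validate(self, value):
        if not self.min <= value <= self.max:
            raise ValueError(f'{value} not in [{self.min}, {self.max}] {self.unit}')
        return value

# FloatRange(max=100) now restricts the upper bound as intended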
4c5109e5a3 fix frappy_demo.lakeshore
reading back the target does not work properly, because
  a) the readback value might be delayed
  b) there is no command to read back the target; SETP?1
     returns the working setpoint, which may differ from the
     target during a ramp

Change-Id: I0da2dbfc1a8ddbecbae6d0456ff64e008bc56336
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/31983
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2023-09-14 09:09:09 +02:00
bf4b3e5683 psi: improve sea interface
- get the return value from the frappy-config script in order
  to detect failures
- call config_check at most once per second

Change-Id: Ibe42e846521206463f2761d452aea7e558a36854
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/32139
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2023-09-14 09:09:00 +02:00
af34fef1e1 fix missing .poll attribute in simulation
using super() in SimBase.__new__ fixes the problem

Change-Id: I18d0ba6ac476c2edb0d973090bcb09508a983d6a
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/32136
Reviewed-by: Alexander Zaft <a.zaft@fz-juelich.de>
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2023-09-14 09:08:50 +02:00
5e1c22ba28 further fixes after change 31470
- get_module is to be called when io is autocreated
- register_module is missing in playground

Change-Id: I28884575b71320667107c494473b0fc5d4363a50
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/32123
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Alexander Zaft <a.zaft@fz-juelich.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2023-09-14 09:08:36 +02:00
0bc4a63aa7 add parmod.Par
the read-only class frappy_psi.parmod.Par represents a parameter
or a component of a tuple parameter

Change-Id: I47208c9d7a6fc377cd56b82cc6a9e8cdb433fe8e
2023-09-14 09:05:00 +02:00
cb2c10655c improve shutdown time
on shutdown, time.sleep(10) is blocking the reconnect thread.
change the _shutdown attribute from bool to an Event, and
use Event.wait instead of time.sleep

Change-Id: Icea6a14ad73df0b3d26ef45806f4c05e6bf18492
2023-09-13 17:22:58 +02:00
6c49abea74 fix frappy/playground.py after change 31470
assumptions about dispatcher in playground.py are no longer
valid.

- let Dispatcher class in playground inherit from real dispatcher
+ improve log messages

Change-Id: I2a9a9d532dabadc590543660c445c021dd2f2891
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/31967
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Alexander Zaft <a.zaft@fz-juelich.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2023-09-11 14:12:14 +02:00
dee8f8929e Merge branch 'wip' of gitlab.psi.ch-samenv:samenv/frappy into wip 2023-09-08 10:46:13 +02:00
2e143963df stickmotor addon: add backlash -1 2023-09-08 10:45:18 +02:00
4bc82c2896 Merge branch 'wip' of gitlab.psi.ch-samenv:samenv/frappy into wip 2023-09-06 08:39:22 +02:00
833a68db51 ma10: improve sea cfg 2023-09-06 08:38:49 +02:00
b9f046a665 hvolt_short stick: make hcp writable 2023-09-06 08:38:06 +02:00
9d9b5b2694 frappy_psi.phytron: further improvements
unfortunately, communication errors sometimes happen.
workaround: retry reading the status several times

Change-Id: I2788c6c9b4145246cdd51c31b246abffee60f93b
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/32032
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2023-08-28 14:17:26 +02:00
255adbf8d9 frappy_psi.phytron: further improvements
unfortunately, communication errors sometimes happen.
workaround: retry reading the status several times

Change-Id: I2788c6c9b4145246cdd51c31b246abffee60f93b
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/32032
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2023-08-25 08:21:16 +02:00
bc0133f55a add zapf to requirements-dev
Change-Id: I6dddd8d4c590253f1039b89edae561fa90b40811
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/31725
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Alexander Zaft <a.zaft@fz-juelich.de>
2023-08-18 16:32:16 +02:00
09e59b93d8 Revert "add zapf to requirements-dev.txt"
This reverts commit e67a46cd015c0a1a32d5a4f114b963dd17a7c266.

Reason for revert: required version available from pypi

Change-Id: Ib4f8b0cf62da58e84545511c7521ea93b7ff1342
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/31724
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Alexander Zaft <a.zaft@fz-juelich.de>
2023-08-18 16:32:16 +02:00
2474dc5e72 interactive client: improve keyboard interrupt
- when driving a module with <module>(<target>),
  keyboard interrupt should send stop()

- make sure keyboard interrupt does not only stop
  the current driving, but also skips other code
  on the same command line

Change-Id: Ib4d2c4111dc0f23bf07385065766fb9b4a611454
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/31926
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2023-08-18 16:32:16 +02:00
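A hedged sketch of the interrupt handling described above (helper and method names are assumptions, not the actual frappy.client.interactive code):

def drive(module, target):
    module.write_target(target)
    try:
        module.wait_until_idle()    # block until the movement has finished
    except KeyboardInterrupt:
        module.stop()               # stop the current movement
        raise                       # re-raise so the rest of the command line is skipped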
9dab41441f frappy_mlz: Add Zapf PLC
adds a zapf-based PLC connection scanner.

Change-Id: Icc0ded7e7a8cc5a83d7527d9b26b37c49e9b8674
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/31471
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
Reviewed-by: Alexander Zaft <a.zaft@fz-juelich.de>
2023-08-18 16:32:16 +02:00
4af46a0ea2 add zapf to requirements-dev.txt
Change-Id: Ia4de696051cee1e00676e777b7dd2c0a90a0c504
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/31719
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Alexander Zaft <a.zaft@fz-juelich.de>
2023-08-18 16:32:16 +02:00
b844b83352 core: do not call register_module on error
Dispatcher.get_module_instance returns None on failure.
If that is the case, the dispatcher should not try to register the
None value as a module.

Change-Id: Ie33b8debc2a829d480d56cafc1eb0ab610181d67
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/31713
Reviewed-by: Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
Reviewed-by: Alexander Zaft <a.zaft@fz-juelich.de>
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
2023-08-18 16:32:16 +02:00
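A minimal sketch of the guard described above, with simplified names:

def create_and_register(dispatcher, name, log):
    instance = dispatcher.get_module_instance(name)   # returns None on failure
    if instance is None:
        log.error('could not create module %r', name)
        return None
    dispatcher.register_module(instance, name)
    return instance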
3b63e32395 frappy_mlz: fix one-off error in barcode reader
cut off one byte too many in barcode decode

Change-Id: I5f1f8475f197b13af836d685dc6da5a9ee824dc2
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/31728
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Alexander Zaft <a.zaft@fz-juelich.de>
2023-08-18 16:32:16 +02:00
5168e0133d dispatcher: change logging calls to debug
Some logging calls should not have landed as log.info in the dynamic
modules patch. This fixes that.

Change-Id: I666fc7c9b5c65ddbed1c26ea456becce7870e744
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/31707
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Alexander Zaft <a.zaft@fz-juelich.de>
2023-08-18 16:32:16 +02:00
9ea6082ed8 frappy_mlz: Zebra fixes after basic test
Some fixes after the device was tested with socat ptys and NICOS.

Change-Id: I3e9dba2be2547d493c435d1da9844c932a2df4e6
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/31662
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Georg Brandl <g.brandl@fz-juelich.de>
Reviewed-by: Alexander Zaft <a.zaft@fz-juelich.de>
2023-08-18 16:32:16 +02:00
f205cf76aa mlz: Add Zebra Barcode Reader
Adds a Barcode reader device (for now, only for ANTARES). Not yet
tested with real hardware.

Change-Id: I25f097466be89d152f47b9d05ece8f562e4b34d6
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/31412
Reviewed-by: Georg Brandl <g.brandl@fz-juelich.de>
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Alexander Zaft <a.zaft@fz-juelich.de>
2023-08-18 16:32:16 +02:00
db9ce02028 Revert "revert commits done before MZ holidays"
This reverts commit d2885bdd72.
2023-08-18 16:32:16 +02:00
c4a39306e4 Merge branch 'wip' of gitlab.psi.ch-samenv:samenv/frappy into wip 2023-08-17 14:10:29 +02:00
024de0bd32 insert pressure reading into ccrpe_cfg 2023-08-17 14:07:44 +02:00
d2d63c47e1 frappy_psi.phytron: stop motor before restart
restarting the phytron motor without a prior stop leads
to funny behaviour.

- send stop before restart
- stop motor when moving but status not busy
- restart when motor drives the wrong way

+ better status text when stopping

Change-Id: I82cd59297b3c79a354a4eeb5ba03fc65bedf755f
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/31929
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2023-08-14 14:32:01 +02:00
565e8e6fd3 Merge branch 'wip' of gitlab.psi.ch-samenv:samenv/frappy into wip 2023-08-10 14:35:03 +02:00
89bc7f6dfe add ori7 2023-08-10 14:34:51 +02:00
c69fe1571a add hvolt_short 2023-08-09 17:39:19 +02:00
c40033a816 update old cfg files
- change secop_psi to frappy_psi
- remove interface and name in Node

Change-Id: I69242de250c9ecf52e001fce6396347dbf3fedcb
2023-08-09 17:36:02 +02:00
da37175cbb frappy/protocol/interface/tcp.py: use SECoP_DEFAULT_PORT
import SECoP_DEFAULT_PORT instead of defining DEF_PORT

Change-Id: I02ee420d200f90b61f8c79e1cb5ee3e0913955e9
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/31913
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2023-08-07 16:15:30 +02:00
2020928289 improve error message on client when host/port is bad
for a host name given without a port, None was used for the port,
leading to a confusing error message

- do not call parse_host_port with None as defaultport argument
- improve error message when connection fails

+ fix an error in the last line of parse_ipv6_host_and_port
+ fix some PEP 8 violations

Change-Id: I437360be96449c164f0080e3c60f1685825d4780
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/31911
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2023-08-07 16:15:30 +02:00
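A hedged sketch of the host/port handling described above (hypothetical helper, not the actual frappy.lib code, and without IPv6 handling): always pass an explicit default port so a bare host name never ends up with port=None:

SECOP_DEFAULT_PORT = 10767  # assumed SECoP default port

def parse_host_port(hostport, defaultport=SECOP_DEFAULT_PORT):
    host, _, port = hostport.partition(':')
    if not host:
        raise ValueError(f'invalid host/port specification {hostport!r}')
    return host, int(port) if port else defaultport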
9df6794678 fix haake_cfg 2023-08-07 16:10:55 +02:00
41f3b7526e fix ma7.config.json 2023-08-07 16:10:30 +02:00
f80624b48d MA7: add unit=T to mf 2023-07-11 15:36:14 +02:00
9e2e6074c8 Merge branch 'wip' of gitlab.psi.ch:samenv/frappy into wip 2023-07-11 13:27:57 +02:00
5a13888498 sMA6 encoder mode to CHECK
after Oksana found that it works
2023-07-11 11:27:20 +02:00
5a8a6b88ff frappy.client.interactive: bug fixes
- correct the behaviour for the following atypical message sequence:
  - send change target
  - receive status idle
  - receive status busy
  - receive changed target

- add 'exception' to Logger

Change-Id: I614b2a2c2e09ef1b43544838ccb2fa43357dd50d
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/31632
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2023-07-11 10:59:42 +02:00
b84b7964e3 frappy_psi.sea: further bug fixes
- in SEA, it is not guaranteed that the is_running state is set
  before the run command returns. As a consequence, we have to
  wait in SeaDrivable.write_target for is_running to be set
- syncio always has to be reconnected after asynio

Change-Id: Ia46cff11de86868ce0627faaf6f776282bd7a8f4
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/31631
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2023-07-11 10:59:42 +02:00
6c5dddc449 ma7: sea config: make ta/tb visible 2023-07-10 10:55:43 +02:00
78fa49ef74 Merge branch 'wip' of gitlab.psi.ch-samenv:samenv/frappy into wip 2023-07-07 15:46:05 +02:00
d92b154292 frappy_psi.sea: avoid multiple connections
the _connect was sometimes started in parallel from
startModules and from the first call to doPoll.
remove the first one, and protect the second one
with a lock

Change-Id: I079439e150efd5d005130cef475f6326f933ecbd
2023-07-07 15:40:20 +02:00
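A minimal sketch of the lock-protected connect described above (hypothetical class and method names, not the actual frappy_psi.sea code):

import threading

class SeaClient:
    def __init__(self):
        self._connect_lock = threading.Lock()
        self._connection = None

    def _connect(self):
        ...  # open the asynio/syncio connections here

    def doPoll(self):
        # only one thread at a time may trigger the (re)connect
        with self._connect_lock:
            if self._connection is None:
                self._connection = self._connect()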
073fe1a08b frappy.io: make error reporting consistent
- fix mechanism to avoid multiple error messages in log files

Change-Id: I688071f9b06da1a81eb12d63adb549042171c7c8
2023-07-07 15:40:20 +02:00
f80c793cd9 cosmetic changes to ma6_sample_heat_cfg.py 2023-07-06 14:49:06 +02:00
519e9e2ed7 MA6: set om.encoder_mode to 'NO' 2023-07-06 14:48:09 +02:00
14036160f7 add special configurations m6/ma7 sampleheat 2023-07-06 13:06:06 +02:00
131dc60807 MA7/MA11: make ts drivable 2023-07-06 13:04:36 +02:00
49722a858f update haake + eurotherm cfg 2023-07-06 13:04:11 +02:00
c61b674382 disable encoder for MA11 stick rotation 2023-07-06 12:57:07 +02:00
091543be56 add FW (old power rack, via SEA) 2023-07-06 12:53:19 +02:00
d2885bdd72 revert commits done before MZ holidays
they are not necessary for SINQ SE operation

Change-Id: Ic9adcccf685752ab90bb6b86005ac8e04b302855
2023-07-06 08:03:15 +02:00
975593dd6b update to gerrit version
Change-Id: Ifdaa28dd961a529cd9197c4c3639744f108b0a6a
2023-07-05 17:33:05 +02:00
4fe28363d3 server: add option to dynamically create devices
add a module which scans a connection and registers new devices
depending on the answer.
* change module initialization to demand-based
* move code from server to dispatcher
- remove intermediate step in Attached __get__

TODO:
  factor out dispatcher (regards to playground)
  discuss factoring out of module creation code from server AND
  dispatcher

Change-Id: I7af959b99a84c291c526aac067a4e2bf3cd741d4
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/31470
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
Reviewed-by: Georg Brandl <g.brandl@fz-juelich.de>
Reviewed-by: Alexander Zaft <a.zaft@fz-juelich.de>
2023-07-05 17:16:41 +02:00
28b19dbf57 frappy_psi.thermofisher: add version through gerrit
Change-Id: I4b89d6ec803ad64c41720bc62493d2e4027df50e
2023-07-05 17:14:07 +02:00
05189d094a add StructParam
adds a generic solution for creating parameters with struct datatype
with their members linked to individual parameters.

main use case: ctrlpars

read_*/write_* methods are either created for the main (structured)
parameter based on the corresponding methods of the individual
parameters, or the methods for the individual parameters are created
based on the methods of the main parameter

+ disable pylint use-dict-literal

Change-Id: I7f1d9fb3d3b2226b548c2999bbfebe2ba5ac285e
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/31405
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2023-07-05 17:12:12 +02:00
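A conceptual sketch of the delegation described above for the ctrlpars use case (plain Python, not frappy's actual StructParam API): reading the struct parameter collects the individual parameters, writing it fans the members out:

class LoopWithCtrlpars:
    p = i = d = 0.0   # hypothetical individual parameters

    def read_ctrlpars(self):
        return {'p': self.p, 'i': self.i, 'd': self.d}

    def write_ctrlpars(self, value):
        self.p, self.i, self.d = value['p'], value['i'], value['d']
        return self.read_ctrlpars()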
47da14eef9 pylint: disable use-dict-literal
sometimes it is nicer to use dict(...) instead of {}
any objections against removing this check from pylint?

Change-Id: Ib08d3016b7ec3512111021a82685253cdcd42916
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/31505
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2023-07-05 17:11:40 +02:00
7904f243cb server: add option to dynamically create devices
add a module which scans a connection and registers new devices
depending on the answer.
* change module initialization to demand-based
* move code from server to dispatcher
- remove intermediate step in Attached __get__

TODO:
  factor out dispatcher (regards to playground)
  discuss factoring out of module creation code from server AND
  dispatcher

Change-Id: I7af959b99a84c291c526aac067a4e2bf3cd741d4
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/31470
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
Reviewed-by: Georg Brandl <g.brandl@fz-juelich.de>
Reviewed-by: Alexander Zaft <a.zaft@fz-juelich.de>
2023-07-05 17:10:56 +02:00
a2fed8df03 pylint: disable use-dict-literal
sometimes it is nicer to use dict(...) instead of {}
any objections against removing this check from pylint?

Change-Id: Ib08d3016b7ec3512111021a82685253cdcd42916
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/31505
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2023-07-05 17:08:48 +02:00
19f965bced Merge branch 'wip' of gitlab.psi.ch-samenv:samenv/frappy into wip 2023-07-03 17:51:43 +02:00
3e4ea2515e add ma15 cfg 2023-07-03 17:51:28 +02:00
714c820115 fixes in ori3 and dil5 config 2023-07-03 17:49:06 +02:00
a8e1d0e1e8 frappy_psi.sea: try to reconnect on failure
both the .asynio and .syncio connections should be reopened on failure.
(fix from mlz gerrit)

Change-Id: I0da5bd9927865a1c55afb93a7a5b76c44fc8750e
2023-06-29 11:28:04 +02:00
d7a1604bd5 frappy_psi.sea: auto connect
on both .asynio and .syncio, try to reconnect after failure
2023-06-26 14:45:53 +02:00
b92095974b camea filter addon
Change-Id: I1d80aa3bfc4e441ad8a69930b81d6cc25cee9511
2023-06-20 11:05:15 +02:00
8dc9c57e9d entangle: fix tango guards for pytango 9.3
Change-Id: I666969f9c798971d5cd8a0c2f6564067ac3cde72
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/31327
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Georg Brandl <g.brandl@fz-juelich.de>
2023-06-20 09:00:42 +02:00
7c95f1f8ee config: fix merge_modules
Change-Id: I31d05afe300443e08fb08f9e6645401f52cfae39
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/31323
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Alexander Zaft <a.zaft@fz-juelich.de>
2023-06-20 08:05:55 +02:00
3786d2f209 frappy_psi.triton: fix HeaterOutput.limit
+ fix handling of control_active

Change-Id: Ic11933f6c1c4d9df07aa9d06ae4dca40b755e4ed
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/31377
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2023-06-20 08:03:37 +02:00
138b84e84c add a hook for reads to be done initially
initial reads from HW should be done in the thread started by
startModule, not in startModule itself.

- add a hook method 'initialReads' for this
+ add doc for init methods
+ fix some errors in doc

Change-Id: I914e3b7ee05050eea1ee8aff3461030adf08a461
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/31374
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2023-06-20 08:03:37 +02:00
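A simplified sketch of the hook described above, assuming the module lifecycle outlined in the commit message (not the actual frappy base class):

import threading
import time

class Module:
    def startModule(self, start_events):
        # start the poller thread; hardware access is deferred to that thread
        threading.Thread(target=self._poll_thread, daemon=True).start()

    def initialReads(self):
        """hook: override to perform one-time reads from the hardware"""

    def _poll_thread(self):
        self.initialReads()     # called once, inside the poller thread
        while True:
            self.doPoll()       # regular polling
            time.sleep(1)

    def doPoll(self):
        pass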
997e8e26e9 frappy_psi.mercury: proper handling of control_active
Change-Id: I31e846fa6fdf6d642184e3736a66ffd53033bccf
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/31376
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2023-06-20 08:03:37 +02:00
644d005dad frappy.mixins.HasOutputModule
add 'set_control_active' method for overriding by subclasses

Change-Id: Ib344319862a4a0bf29efef16a63db09d1f314a82
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/31375
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2023-06-20 08:03:37 +02:00
36dfe968e8 frappy_psi.phytron: rename reset_error to clear_errors
use the command 'clear_errors' to return from an error state

+ make sure target is valid after clear_errors

Change-Id: I3c180500a05836d52bbb9a8ecbdb397adea03d0d
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/31337
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2023-06-20 08:03:37 +02:00
0932228596 frappy_psi.mercury/triton: add control_off command
frappy_psi.triton.TemperatureLoop has no output module to
deactivate control -> add control_off also to loops in
frappy_psi.mercury

Change-Id: I4dc4333134da34a8d3ae0f3c037a1e5b108c95a1
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/31341
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2023-06-20 08:03:37 +02:00
dff0c819de io: followup fix for retry-first-ident
followup fix: no error was ever raised for the first identification
message.

Change-Id: I80f0f431add6dfd7b37d750b9fc661174aa8f217
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/31318
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Georg Brandl <g.brandl@fz-juelich.de>
Reviewed-by: Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
Reviewed-by: Alexander Zaft <a.zaft@fz-juelich.de>
2023-06-20 08:02:13 +02:00
fd917724d8 io: add option to retry first ident request
Change-Id: I524c15387eaf2461e3dfe690250a55f058467b0b
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/31291
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Bjoern Pedersen <bjoern.pedersen@frm2.tum.de>
Reviewed-by: Alexander Zaft <a.zaft@fz-juelich.de>
2023-06-20 08:02:02 +02:00
bf43858031 GUI bugfix: use isChecked instead of checkState in BoolInput
Change-Id: I4896df13c117c6eeaaaaba80ca3da4b1982c3d9b
Reviewed-on: https://forge.frm2.tum.de/review/c/secop/frappy/+/31346
Tested-by: Jenkins Automated Tests <pedersen+jenkins@frm2.tum.de>
Reviewed-by: Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
Reviewed-by: Markus Zolliker <markus.zolliker@psi.ch>
2023-06-16 11:09:27 +02:00
f354b19cf0 frappy_psi.sea: fix extra_module_set
Change-Id: If5669fdd60c8505a47414f17cfcd8534cdc2abee
2023-06-06 16:50:48 +02:00
f304ac019e fix cfg files (extra_modules/single_module)
Change-Id: I1821e1e0dc960d48a3e343c53195808798b7f969
2023-06-06 16:49:54 +02:00
9a9a22588f Merge branch 'wip' of gitlab.psi.ch-samenv:samenv/frappy into wip 2023-06-06 16:46:31 +02:00
3e26dd49d0 jtccr: fix cfg for extra_modules / single_module 2023-06-06 16:45:31 +02:00
5a456a82b0 fix enum when SEA type is text
Change-Id: I873045a2dac8771b844431ccda70ce1b8ff1aee5
2023-06-05 13:56:00 +02:00
f6868da3b9 fix systemd bug
Change-Id: I8a3f1eddba9525589757d4612a5060267ea0c5db
2023-06-05 13:56:00 +02:00
ee31f8fb45 fix merge_status in HasConvergence 2023-06-05 13:03:38 +02:00
a6a3f80e30 remove sign=-1 from cfg files 2023-06-05 13:02:40 +02:00
ad36ab1067 frappy_psi.thermofisher improvements
- merge Loop with Sensor
- make convergence work

Change-Id: Iba0cafc524ada6d490b7a5c30f4127e77fd163f3
2023-06-05 09:50:35 +02:00
f2d795cfba frappy_psi.convergence: improvements
- merge_status
- empty string instead of 'approaching'
- dif <= tol

Change-Id: I6f10875f7ef5d2109c13d7448ede114b8e30d86e
2023-06-05 09:47:08 +02:00
c04337c3a4 frappy.client: missing exception method in dummy logger
Change-Id: Ie3a574c3060f2ac6833ff44e8074a19db6ea2f0b
2023-06-05 09:45:04 +02:00
57d5298c92 remove secop-server.spec
Change-Id: I8097a2918bc4e786bd270aeb436efebe9a3bd88f
2023-05-31 14:32:04 +02:00
9a6421a54f up to date with develop/mlz
Change-Id: I5ea71bc99a2f0dffc3dbe37e1119eb188ef8a3f0
2023-05-31 14:27:36 +02:00
c5d429346d update 2023-05-30 from gitmlz
Change-Id: I0b1eb2941692fde5c9d98f107fc38315625dcfdb
2023-05-31 14:16:12 +02:00
174 changed files with 2090 additions and 4125 deletions

View File

@ -1,5 +1,6 @@
#!/usr/bin/env python
# pylint: disable=invalid-name
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,4 +1,5 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
# *****************************************************************************
# Copyright (c) 2015-2016 by the authors, see LICENSE
#

View File

@ -1,5 +1,6 @@
#!/usr/bin/env python3
# pylint: disable=invalid-name
# -*- coding: utf-8 -*-
# *****************************************************************************
# Copyright (c) 2015-2016 by the authors, see LICENSE
#

View File

@ -1,4 +1,5 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
# *****************************************************************************
# Copyright (c) 2015-2016 by the authors, see LICENSE
#

View File

@ -1,4 +1,5 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,5 +1,6 @@
#!/usr/bin/env python3
# pylint: disable=invalid-name
# -*- coding: utf-8 -*-
# *****************************************************************************
# This program is free software; you can redistribute it and/or modify it under
# the terms of the GNU General Public License as published by the Free Software

View File

@ -6,7 +6,7 @@ Node('QnwTC1test.psi.ch',
Mod('io',
'frappy_psi.qnw.QnwIO',
'connection for Quantum northwest',
uri='tcp://ldm-fi-ts:3001',
uri='tcp://ldmcc01-ts:3004',
)
Mod('T',

View File

@ -6,7 +6,7 @@ Node('TFA10.psi.ch',
Mod('io',
'frappy_psi.thermofisher.ThermFishIO',
'connection for ThermoFisher A10',
uri='tcp://ldm-fi-ts:3002',
uri='tcp://ldmse-d910-ts:3001',
)
Mod('T',

View File

@ -74,10 +74,10 @@ Mod('currentsource',
Mod('mf',
'frappy_mlz.amagnet.GarfieldMagnet',
'magnetic field module, handling polarity switching and stuff',
currentsource = 'currentsource',
enable = 'enable',
polswitch = 'polarity',
symmetry = 'symmetry',
subdev_currentsource = 'currentsource',
subdev_enable = 'enable',
subdev_polswitch = 'polarity',
subdev_symmetry = 'symmetry',
target = Param(unit='T'),
value = Param(unit='T'),
userlimits = (-0.35, 0.35),

View File

@ -41,6 +41,6 @@ Mod('label',
'frappy_demo.modules.Label',
'some label indicating the state of the magnet `mf`.',
system = 'Cryomagnet MX15',
mf = 'mf',
ts = 'ts',
subdev_mf = 'mf',
subdev_ts = 'ts',
)

View File

@ -1,27 +0,0 @@
Node('relais.psi.ch',
'relais test',
'tcp://5000',
)
Mod('rl',
'frappy_psi.ionopimax.DigitalOutput',
'left relais',
addr = 'o1',
value = 0, # start with relais off
)
Mod('rr',
'frappy_psi.ionopimax.DigitalOutput',
'right relais',
addr = 'o2',
value = 0, # start with relais off
)
Mod('drummer',
'frappy_psi.drums.Drums',
'drummer',
target = 150,
pattern='l2L2rl1R1L2',
left='rl',
right='rr',
)

View File

@ -1,16 +0,0 @@
Node('lockin70test.psi.ch',
'lockin70 test',
'tcp://5000',
)
Mod('io',
'frappy_psi.SR.SR_IO',
'lockin communication',
uri='10105266.psi.ch:50000',
)
Mod('XY',
'frappy_psi.SR.XY70',
'XY channels',
io='io',
)

View File

@ -1,16 +0,0 @@
Node('lockin830test.psi.ch',
'lockin830 test',
'tcp://5000',
)
Mod('io',
'frappy_psi.SR830.SR830_IO',
'lockin communication',
uri='tcp://linse-976d-ts:3002',
)
Mod('XY',
'frappy_psi.SR830.XY',
'XY channels',
io='io',
)

View File

@ -1,34 +0,0 @@
Node('multimetertest.psi.ch',
'multimeter test',
'tcp://5000',
)
Mod('io',
'frappy_psi.HP.HP_IO',
'multimeter communication',
uri='/dev/cu.usbserial-21410',
)
Mod('Voltage',
'frappy_psi.HP.Voltage',
'voltage',
io='io',
)
Mod('Current',
'frappy_psi.HP.Current',
'current',
io='io',
)
Mod('Resistance',
'frappy_psi.HP.Resistance',
'resistivity',
io='io',
)
Mod('Frequency',
'frappy_psi.HP.Frequency',
'resistivity',
io='io',
)

View File

@ -1,16 +0,0 @@
Node('phoenixtest.psi.ch',
'phoenix test',
'tcp://5000',
)
Mod('io',
'frappy_psi.haake.HaakeIO',
'connection for Thermo Haake',
uri='tcp://ldmprep7-ts:3005',
)
Mod('T',
'frappy_psi.haake.TemperatureLoop',
'holder temperature',
io='io',
)

View File

@ -1,67 +0,0 @@
Node('bridge.psi.ch',
'ac resistance bridge',
'tcp://5000',
)
Mod('io',
'frappy_psi.bridge.BridgeIO',
'communication to sim900',
uri='serial:///dev/cu.usbserial-14340',
)
Mod('res1',
'frappy_psi.bridge.Resistance',
'module communication',
io='io',
port=1,
)
Mod('res2',
'frappy_psi.bridge.Resistance',
'module communication',
io='io',
port=3,
)
Mod('res3',
'frappy_psi.bridge.Resistance',
'module communication',
io='io',
port=5,
)
Mod('phase1',
'frappy_psi.bridge.Phase',
'module communication',
resistance='res1',
)
Mod('phase2',
'frappy_psi.bridge.Phase',
'module communication',
resistance='res2',
)
Mod('phase3',
'frappy_psi.bridge.Phase',
'module communication',
resistance='res3',
)
Mod('dev1',
'frappy_psi.bridge.Deviation',
'module communication',
resistance='res1',
)
Mod('dev2',
'frappy_psi.bridge.Deviation',
'module communication',
resistance='res1',
)
Mod('dev3',
'frappy_psi.bridge.Deviation',
'module communication',
resistance='res3',
)

View File

@ -1,103 +0,0 @@
Node('vf.psi.ch',
'small vacuum furnace',
'tcp://5000',
)
Mod('htr_io',
'frappy_psi.bkpower.IO',
'powersupply communicator',
uri = 'serial:///dev/ttyUSBupper',
)
Mod('htr',
'frappy_psi.bkpower.Power',
'heater power',
io= 'htr_io',
)
Mod('out',
'frappy_psi.bkpower.Output',
'heater output',
io = 'htr_io',
maxvolt = 50,
maxcurrent = 2,
)
Mod('relais',
'frappy_psi.ionopimax.DigitalOutput',
'relais for power output',
addr = 'o2',
)
Mod('T_main',
'frappy_psi.ionopimax.CurrentInput',
'sample temperature',
addr = 'ai4',
valuerange = (0, 1372),
value = Param(unit='degC'),
)
Mod('T_extra',
'frappy_psi.ionopimax.CurrentInput',
'extra temperature',
addr = 'ai3',
valuerange = (0, 1372),
value = Param(unit='degC'),
)
Mod('T_htr',
'frappy_psi.ionopimax.CurrentInput',
'heater temperature',
addr = 'ai2',
valuerange = (0, 1372),
value = Param(unit='degC'),
)
Mod('T_wall',
'frappy_psi.ionopimax.VoltageInput',
'furnace wall temperature',
addr = 'av2',
rawrange = (0, 1.5),
valuerange = (0, 150),
value = Param(unit='degC'),
)
Mod('T',
'frappy_psi.picontrol.PI',
'controlled Temperature',
input = 'T_htr',
output = 'out',
relais = 'relais',
p = 2,
i = 0.01,
)
Mod('interlocks',
'frappy_psi.furnace.Interlocks',
'interlock parameters',
input = 'T_htr',
wall_T = 'T_wall',
vacuum = 'p',
relais = 'relais',
control = 'T',
wall_limit = 50,
vacuum_limit = 0.1,
)
Mod('p_io',
'frappy_psi.pfeiffer.IO',
'pressure io',
uri='serial:///dev/ttyUSBlower',
)
Mod('p',
'frappy_psi.pfeiffer.Pressure',
'pressure reading',
io = 'p_io',
)

ci/Jenkinsfile vendored
View File

@ -141,23 +141,12 @@ def run_docs() {
'''
}
/* does not work with too many quote levels
* alternatively use pdf (based on rst2pdf)
* or singlehtml converted to pdf manually from a browser (may produce nicer output)
stage('build latexpdf') {
sh '''
. /home/jenkins/secopvenv/bin/activate
make -C doc latexpdf
'''
}
*/
stage('build pdf') {
sh '''
. /home/jenkins/secopvenv/bin/activate
make -C doc pdf
'''
}
stage('build man') {
sh '''

debian/changelog vendored
View File

@ -1,121 +1,3 @@
frappy-core (0.18.1) focal; urgency=medium
* mlz: Zapf fix unit handling and small errors
* mlz: entangle fix limit check
-- Alexander Zaft <jenkins@frm2.tum.de> Wed, 24 Jan 2024 14:59:21 +0100
frappy-core (0.18.0) focal; urgency=medium
[ Alexander Zaft ]
* Add shutdownModule function
[ Markus Zolliker ]
* frappy_psi.convergence: bug fixes and improvements
[ Alexander Zaft ]
* server: Add signal handling
* add test cases for server and config
[ Markus Zolliker ]
* fix frappy.lib.merge_status
* frappy_psi.sea: try to reconnect on failure
* pylint: disable use-dict-literal
[ Alexander Zaft ]
* server: add option to dynamically create devices
[ Markus Zolliker ]
* add StructParam
* add frappy_psi.thermofisher
* add frappy_psi.thermofisher to the doc
* frappy.io: make error reporting consistent
* frappy_psi.sea: avoid multiple connections
* frappy_psi.sea: further bug fixes
* frappy.client.interactive: bug fixes
[ Alexander Zaft ]
* mlz: Add Zebra Barcode Reader
* frappy_mlz: Zebra fixes after basic test
* dispatcher: change logging calls to debug
* core: do not call register_module on error
* add zapf to requirements-dev.txt
* frappy_mlz: Add Zapf PLC
* Revert "add zapf to requirements-dev.txt"
* add zapf to requirements-dev
* frappy_mlz: fix one-off error in barcode reader
[ Markus Zolliker ]
* improve error message on client when host/port is bad
* frappy/protocol/interface/tcp.py: use SECoP_DEFAULT_PORT
* frappy_psi.phytron: stop motor before restart
* interactive client: improve keyboard interrupt
* fix frappy/playground.py after change 31470
[ Alexander Zaft ]
* frappy_mlz seop: add count to ampl and phase cmds
[ Markus Zolliker ]
* frappy_psi.phytron: further improvements
* further fixes after change 31470
* fix missing .poll attribute in simulation
* psi: improve sea interface
* fix frappy_demo.lakeshore
* change FloatRange arguments minval/maxval to min/max
* improve client shutdown time
* introduce FrozenParam
* phytron.py: improve status
* frappy_psi.sea: small fixes
* bug in Attached (fix after change 31470)
[ Alexander Zaft ]
* core: split module code
* core: factor out accessibles from init
[ Markus Zolliker ]
* proxy: fix command wrapper
[ Alexander Zaft ]
* server: handle signals during startup
* all: remove coding cookies
* psi: fix Done import in sea
[ Markus Zolliker ]
* frappy.io: change default to retry_first_idn=True
[ Alexander Zaft ]
* core: move module handling out of dispatcher
* mlz/demo: move old examples to Attached
[ Markus Zolliker ]
* frappy.client: fix the case then timestamp is missing
* doc: drop latex support, add pdf support
* add StringIO.writeline, improve StringIO.multicomm
* implement pfeiffer TPG vacuum reading
[ Alexander Zaft ]
* core: allow multiple interfaces
* core: formatting and update server docstring
* mlz: handle unconfigured abslimits
* datatypes: fix optional struct export
* core: better command handling
[ Markus Zolliker ]
* frappy_psi.sea: workaround for bug in sea
[ Alexander Zaft ]
* core: better error on export of internal type
[ Markus Zolliker ]
* fix missing import in change message
* modify arguments of Dispatcher.announce_update
* frappy.secnode: fix strange error message
* fix playground after change 32249
* remove py35 compatibility code
* bug fix in frappy.io.BytesIO.checkHWIdent
-- Alexander Zaft <jenkins@frm2.tum.de> Wed, 17 Jan 2024 12:35:00 +0100
frappy-core (0.17.13) focal; urgency=medium
[ Alexander Zaft ]

View File

@ -1,4 +1,5 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
#
# Frappy documentation build configuration file, created by
# sphinx-quickstart on Mon Sep 11 10:58:28 2017.
@ -42,9 +43,7 @@ extensions = ['sphinx.ext.autodoc',
'sphinx.ext.intersphinx',
'sphinx.ext.todo',
'sphinx.ext.mathjax',
'sphinx.ext.viewcode',
'rst2pdf.pdfbuilder',
]
'sphinx.ext.viewcode']
# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']
@ -221,80 +220,3 @@ from frappy.lib.classdoc import class_doc_handler
def setup(app):
app.connect('autodoc-process-docstring', class_doc_handler)
# -- Options for PDF output --------------------------------------------------
# Grouping the document tree into PDF files. List of tuples
# (source start file, target name, title, author, options).
#
# If there is more than one author, separate them with \\.
# For example: r'Guido van Rossum\\Fred L. Drake, Jr., editor'
#
# The options element is a dictionary that lets you override
# this config per-document. For example:
#
# ('index', 'MyProject', 'My Project', 'Author Name', {'pdf_compressed': True})
#
# would mean that specific document would be compressed
# regardless of the global 'pdf_compressed' setting.
pdf_documents = [
('index', project, project, author),
]
# A comma-separated list of custom stylesheets. Example:
pdf_stylesheets = ['sphinx', 'a4']
# A list of folders to search for stylesheets. Example:
pdf_style_path = ['.', '_styles']
# Create a compressed PDF
# Use True/False or 1/0
# Example: compressed=True
# pdf_compressed = False
# A colon-separated list of folders to search for fonts. Example:
# pdf_font_path = ['/usr/share/fonts', '/usr/share/texmf-dist/fonts/']
# Language to be used for hyphenation support
# pdf_language = "en_US"
# Mode for literal blocks wider than the frame. Can be
# overflow, shrink or truncate
# pdf_fit_mode = "shrink"
# Section level that forces a break page.
# For example: 1 means top-level sections start in a new page
# 0 means disabled
# pdf_break_level = 0
# When a section starts in a new page, force it to be 'even', 'odd',
# or just use 'any'
# pdf_breakside = 'any'
# Insert footnotes where they are defined instead of
# at the end.
# pdf_inline_footnotes = True
# verbosity level. 0 1 or 2
# pdf_verbosity = 0
# If false, no index is generated.
# pdf_use_index = True
# If false, no modindex is generated.
# pdf_use_modindex = True
# If false, no coverpage is generated.
# pdf_use_coverpage = True
# Name of the cover page template to use
# pdf_cover_template = 'sphinxcover.tmpl'
# Documents to append as an appendix to all manuals.
# pdf_appendices = []
# Enable experimental feature to split table cells. Use it
# if you get "DelayedTable too big" errors
# pdf_splittables = False
# Set the default DPI for images
# pdf_default_dpi = 72
# Enable rst2pdf extension modules
# pdf_extensions = []
# Page template name for "regular" pages
# pdf_page_template = 'cutePage'
# Show Table Of Contents at the beginning?
# pdf_use_toc = True
# How many levels deep should the table of contents be?
pdf_toc_depth = 9999
# Add section number to section references
pdf_use_numbered_links = False
# Background images fitting mode
pdf_fit_background_mode = 'scale'
# Repeat table header on tables that cross a page boundary?
pdf_repeat_table_rows = True
# Enable smart quotes (1, 2 or 3) or disable by setting to 0
pdf_smartquotes = 0

View File

@ -405,7 +405,7 @@ Appendix 2: Extract from the LakeShore Manual
Reply <range> *term*
**Operation Complete Query**
----------------------------------------------
Command \*OPC?
Command *OPC?
Reply 1
Description in Frappy, we append this command to request in order
to generate a reply

View File

@ -1,4 +1,5 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
# *****************************************************************************
# Copyright (c) 2015-2019 by the authors, see LICENSE
#

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# Copyright (c) 2015-2016 by the authors, see LICENSE
#

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# Copyright (c) 2015-2016 by the authors, see LICENSE
#
@ -28,9 +29,8 @@
from frappy.datatypes import ArrayOf, BLOBType, BoolType, EnumType, \
FloatRange, IntRange, ScaledInteger, StringType, StructOf, TupleOf, StatusType
from frappy.lib.enum import Enum
from frappy.modulebase import Done, Module, Feature
from frappy.modules import Attached, Communicator, \
Drivable, Readable, Writable
Done, Drivable, Feature, Module, Readable, Writable, HasAccessibles
from frappy.params import Command, Parameter, Limit
from frappy.properties import Property
from frappy.proxy import Proxy, SecNode, proxy_class

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under
@ -89,11 +90,7 @@ class DataType(HasProperties):
def export_datatype(self):
"""return a python object which after jsonifying identifies this datatype"""
raise ProgrammingError(
f"{type(self).__name__} is not able to be exported to SECoP. "
f"It is intended for internal use only."
)
raise NotImplementedError
def export_value(self, value):
"""if needed, reformat value for transport"""
@ -105,7 +102,7 @@ class DataType(HasProperties):
note: for importing from gui/configfile/commandline use :meth:`from_string`
instead.
"""
return self(value)
return value
def format_value(self, value, unit=None):
"""format a value of this type into a str string
@ -259,6 +256,10 @@ class FloatRange(HasUnit, DataType):
"""returns a python object fit for serialisation"""
return float(value)
def import_value(self, value):
"""returns a python object from serialisation"""
return float(value)
def from_string(self, text):
value = float(text)
return self(value)
@ -314,7 +315,7 @@ class IntRange(DataType):
except Exception:
raise WrongTypeError(f'can not convert {shortrepr(value)} to an int') from None
if round(fvalue) != fvalue:
raise WrongTypeError(f'{value} should be an int')
raise WrongTypeError('%r should be an int')
return value
def validate(self, value, previous=None):
@ -337,6 +338,10 @@ class IntRange(DataType):
"""returns a python object fit for serialisation"""
return int(value)
def import_value(self, value):
"""returns a python object from serialisation"""
return int(value)
def from_string(self, text):
value = int(text)
return self(value)
@ -453,10 +458,7 @@ class ScaledInteger(HasUnit, DataType):
def import_value(self, value):
"""returns a python object from serialisation"""
try:
return self.scale * int(value)
except Exception:
raise WrongTypeError(f'can not import {shortrepr(value)} to scaled') from None
return self.scale * int(value)
def from_string(self, text):
value = float(text)
@ -508,6 +510,10 @@ class EnumType(DataType):
"""returns a python object fit for serialisation"""
return int(self(value))
def import_value(self, value):
"""returns a python object from serialisation"""
return self(value)
def __call__(self, value):
"""accepts integers and strings, converts to EnumMember (may be used like an int)"""
try:
@ -579,10 +585,7 @@ class BLOBType(DataType):
def import_value(self, value):
"""returns a python object from serialisation"""
try:
return b64decode(value)
except Exception:
raise WrongTypeError(f'can not b64decode {shortrepr(value)}') from None
return b64decode(value)
def from_string(self, text):
value = text
@ -653,6 +656,10 @@ class StringType(DataType):
"""returns a python object fit for serialisation"""
return f'{value}'
def import_value(self, value):
"""returns a python object from serialisation"""
return str(value)
def from_string(self, text):
value = str(text)
return self(value)
@ -713,6 +720,10 @@ class BoolType(DataType):
"""returns a python object fit for serialisation"""
return self(value)
def import_value(self, value):
"""returns a python object from serialisation"""
return self(value)
def from_string(self, text):
value = text
return self(value)
@ -982,7 +993,7 @@ class StructOf(DataType):
return res
def __repr__(self):
opt = f', optional={self.optional!r}' if set(self.optional) != set(self.members) else ''
opt = f', optional={self.optional!r}' if set(self.optional) == set(self.members) else ''
return 'StructOf(%s%s)' % (', '.join(
['%s=%s' % (n, repr(st)) for n, st in list(self.members.items())]), opt)
@ -1221,7 +1232,6 @@ class OrType(DataType):
self.types = types
self.default = self.types[0].default
def __call__(self, value):
"""accepts any of the given types, takes the first valid"""
for t in self.types:

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# Copyright (c) 2015-2023 by the authors, see LICENSE
#

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# Copyright (c) 2015-2016 by the authors, see LICENSE
#

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# Copyright (c) 2015-2023 by the authors, see LICENSE
#

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# Copyright (c) 2015-2023 by the authors, see LICENSE
#

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# Copyright (c) 2015-2023 by the authors, see LICENSE
#

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# Copyright (c) 2015-2023 by the authors, see LICENSE
#

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# Copyright (c) 2015-2023 by the authors, see LICENSE
#

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# Copyright (c) 2015-2023 by the authors, see LICENSE
#

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# Copyright (c) 2015-2016 by the authors, see LICENSE
#

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# Copyright (c) 2015-2017 by the authors, see LICENSE
#

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# Copyright (c) 2015-2023 by the authors, see LICENSE
#

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# Resource object code
#

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# NICOS, the Networked Instrument Control System of the MLZ
# Copyright (c) 2009-2023 by the NICOS contributors (see AUTHORS)

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# Copyright (c) 2015-2016 by the authors, see LICENSE
#

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# Copyright (c) 2015-2016 by the authors, see LICENSE
#

View File

@ -1,4 +1,5 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# *****************************************************************************
# This program is free software; you can redistribute it and/or modify it under
# the terms of the GNU General Public License as published by the Free Software
@ -24,17 +25,17 @@ other future extensions of AsynConn
"""
import re
import threading
import time
import threading
from frappy.datatypes import ArrayOf, BLOBType, BoolType, FloatRange, \
IntRange, StringType, StructOf, TupleOf, ValueType
from frappy.errors import CommunicationFailedError, ConfigError, \
ProgrammingError, SilentCommunicationFailedError as SilentError
from frappy.lib import generalConfig
from frappy.lib.asynconn import AsynConn, ConnectionClosed
from frappy.modules import Attached, Command, Communicator, Module, \
Parameter, Property
from frappy.datatypes import ArrayOf, BLOBType, BoolType, FloatRange, IntRange, \
StringType, TupleOf, ValueType
from frappy.errors import CommunicationFailedError, ConfigError, ProgrammingError, \
SilentCommunicationFailedError as SilentError
from frappy.modules import Attached, Command, \
Communicator, Module, Parameter, Property
from frappy.lib import generalConfig
generalConfig.set_default('legacy_hasiodev', False)
@ -62,7 +63,7 @@ class HasIO(Module):
io = self.ioClass(ioname, srv.log.getChild(ioname), opts, srv) # pylint: disable=not-callable
io.callingModule = []
srv.modules[ioname] = io
srv.secnode.add_module(io, ioname)
srv.dispatcher.register_module(io, ioname)
self.ioDict[self.uri] = ioname
self.io = ioname
@ -75,11 +76,8 @@ class HasIO(Module):
def communicate(self, *args):
return self.io.communicate(*args)
def writeline(self, *args):
return self.io.writeline(*args)
def multicomm(self, *args, **kwds):
return self.io.multicomm(*args, **kwds)
def multicomm(self, *args):
return self.io.multicomm(*args)
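# The methods above simply forward to the attached io module. A minimal sketch
# (not part of this changeset; class names and the ':READ?' command are made up)
# of how a driver typically uses HasIO: declare a default ioClass and talk to the
# hardware through self.communicate().
from frappy.datatypes import FloatRange
from frappy.io import HasIO, StringIO
from frappy.modules import Parameter, Readable
class ExampleIO(StringIO):
    end_of_line = '\r\n'  # line termination of the (hypothetical) instrument
class ExampleSensor(HasIO, Readable):
    ioClass = ExampleIO  # an io module is created automatically from the 'uri' in the cfg
    value = Parameter('measured value', FloatRange(unit='K'))
    def read_value(self):
        return float(self.communicate(':READ?'))  # placeholder protocol command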
class HasIodev(HasIO):
@ -289,7 +287,7 @@ class StringIO(IOBase):
f' does not match {regexp!r}')
@Command(StringType(), result=StringType())
def communicate(self, command, noreply=False):
def communicate(self, command):
"""send a command and receive a reply
using end_of_line, encoding and self._lock
@ -316,8 +314,6 @@ class StringIO(IOBase):
self.comLog('garbage: %r', garbage)
self._conn.send(cmd + self._eol_write)
self.comLog('> %s', cmd.decode(self.encoding))
if noreply:
return None
reply = self._conn.readline(self.timeout)
except ConnectionClosed:
self.closeConnection()
@ -333,69 +329,13 @@ class StringIO(IOBase):
self.log.error(self._last_error)
raise SilentError(repr(e)) from e
@Command(StringType())
def writeline(self, command):
"""send a command without needing a reply
To keep a request-reply scheme, it is recommended to override
this method to append a query on the same line, for example:
.. code::
def writeline(self, command):
self.communicate(command + ';*OPC?')
or to add a separate query which always returns a reply, e.g.:
.. code::
def writeline(self, command):
with self._lock: # important!
self.communicate(command, noreply=True)
self.communicate('*OPC?')
The first version is preferred when the hardware allows joining several
commands with a separator.
"""
self.communicate(command, noreply=True)
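# A hypothetical subclass following the first (preferred) pattern from the docstring
# above; the ';*OPC?' join assumes SCPI-like hardware and is only an illustration:
#
#     class ScpiLikeIO(StringIO):
#         def writeline(self, command):
#             self.communicate(command + ';*OPC?')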
@Command(ArrayOf(TupleOf(StringType(), BoolType(), FloatRange(0, unit='s'))),
result=ArrayOf(StringType()))
def multicomm(self, requests):
"""communicate multiple request/replies in one go
:param requests: a sequence of tuples (command, reply_expected, delay);
if called internally, a sequence of plain command strings is also accepted
:return: list of replies
This method is rarely needed. It is intended for cases where the hardware requires
that several commands are not intercepted by another client or by the poller,
for example selecting a channel before reading it, or when wait times different
from 'wait_before' have to be specified.
Such cases may also be handled by adding an additional method to the IO class,
or by a custom SECoP command, or, when all useful commands of this IO class
need it, by overriding :meth:`communicate`.
This method should be used when:
1) you want a generic communicator covering the above use cases over SECoP, or
2) you do not want to subclass the IO class.
"""
@Command(ArrayOf(StringType()), result=ArrayOf(StringType()))
def multicomm(self, commands):
"""communicate multiple request/replies in one row"""
replies = []
with self._lock:
for request in requests:
if isinstance(request, str):
cmd, expect_reply, delay = request, True, 0
else:
cmd, expect_reply, delay = request
if expect_reply:
replies.append(self.communicate(cmd))
else:
self.writeline(cmd)
if delay:
time.sleep(delay)
for cmd in commands:
replies.append(self.communicate(cmd))
return replies
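# Usage sketch (not part of this changeset): group a channel selection with the
# subsequent read, so that neither another client nor the poller can interleave
# commands; 'CHAN 3' and 'READ?' are placeholder commands.
#
#     replies = io.multicomm([
#         ('CHAN 3', False, 0.1),  # no reply expected, wait 0.1 s afterwards
#         ('READ?', True, 0),      # reply expected
#     ])
#     value = float(replies[0])    # only entries with reply_expected=True yield a reply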
@ -455,7 +395,7 @@ class BytesIO(IOBase):
if not replypat.match(reply):
self.closeConnection()
raise CommunicationFailedError(f'bad response: {reply!r}'
f' does not match {expected!r}')
f' does not match {expected!r}')
@Command((BLOBType(), IntRange(0)), result=BLOBType())
def communicate(self, request, replylen): # pylint: disable=arguments-differ
@ -486,20 +426,13 @@ class BytesIO(IOBase):
self.log.error(self._last_error)
raise SilentError(repr(e)) from e
@Command(StructOf(requests=ArrayOf(TupleOf(BLOBType(), IntRange(0), FloatRange(0, unit='s')))),
result=ArrayOf(BLOBType()))
@Command((ArrayOf(TupleOf(BLOBType(), IntRange(0)))), result=ArrayOf(BLOBType()))
def multicomm(self, requests):
"""communicate multiple request/replies in one go
:param requests: a sequence of tuples (<command>, <expected reply length>, <delay>)
:return: list of replies
"""
"""communicate multiple request/replies in one row"""
replies = []
with self._lock:
for cmd, replylen, delay in requests:
replies.append(self.communicate(cmd, replylen))
if delay:
time.sleep(delay)
for request in requests:
replies.append(self.communicate(*request))
return replies
def readBytes(self, nbytes):

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under
@ -21,14 +22,12 @@
from textwrap import indent
from frappy.modules import Command, Parameter, Property
from frappy.modulebase import HasProperties, Module
from frappy.modules import Command, HasProperties, Module, Parameter, Property
def indent_description(p):
"""indent lines except first one"""
space = ' ' * 6
return indent(p.description, space).replace(space, '', 1)
return indent(p.description, ' ').replace(' ', '', 1)
def fmt_param(name, param):

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# Copyright (c) 2015-2016 by the authors, see LICENSE
#

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under
@ -32,7 +33,7 @@ else:
class PEP487Metaclass(type):
# support for __set_name__ and __init_subclass__ for older python versions
# slightly modified from PEP487 doc
def __new__(cls, *args, **kwargs): # pylint: disable=bad-mcs-classmethod-argument
def __new__(cls, *args, **kwargs):
if len(args) != 3:
return super().__new__(cls, *args)
name, bases, ns = args

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,4 +1,5 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under
@ -54,8 +55,6 @@ class RemoteLogHandler(mlzlog.Handler):
def __init__(self):
super().__init__()
self.subscriptions = {} # dict[modname] of tuple(modobj, dict [conn] of level)
# None will be replaced by a callback when one is first installed
self.send_log = None
def emit(self, record):
"""unused"""
@ -63,18 +62,18 @@ class RemoteLogHandler(mlzlog.Handler):
def handle(self, record):
modname = record.name.split('.')[-1]
try:
subscriptions = self.subscriptions[modname]
modobj, subscriptions = self.subscriptions[modname]
except KeyError:
return
for conn, lev in subscriptions.items():
if record.levelno >= lev:
self.send_log( # pylint: disable=not-callable
conn, modname, LEVEL_NAMES[record.levelno],
modobj.DISPATCHER.send_log_msg(
conn, modobj.name, LEVEL_NAMES[record.levelno],
record.getMessage())
def set_conn_level(self, modname, conn, level):
def set_conn_level(self, modobj, conn, level):
level = check_level(level)
subscriptions = self.subscriptions.setdefault(modname, {})
modobj, subscriptions = self.subscriptions.setdefault(modobj.name, (modobj, {}))
if level == OFF:
subscriptions.pop(conn, None)
else:
@ -128,7 +127,7 @@ class HasComlog:
if self.comlog and generalConfig.initialized and generalConfig.comlog:
self._comLog = mlzlog.Logger(f'COMLOG.{self.name}')
self._comLog.handlers[:] = []
directory = join(logger.logdir, logger.rootname, 'comlog', self.secNode.name)
directory = join(logger.logdir, logger.rootname, 'comlog', self.DISPATCHER.name)
self._comLog.addHandler(ComLogfileHandler(
directory, self.name, max_days=generalConfig.getint('comlog_days', 7)))
return

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under
@ -70,8 +71,11 @@ class HasOutputModule:
def initModule(self):
super().initModule()
if self.output_module:
self.output_module.register_input(self.name, self.deactivate_control)
try:
if self.output_module:
self.output_module.register_input(self.name, self.deactivate_control)
except Exception:
self.log.info(f'{self.name} has no output module')
def set_control_active(self, active):
"""to be overridden for switching hw control"""

View File

@ -1,856 +0,0 @@
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under
# the terms of the GNU General Public License as published by the Free Software
# Foundation; either version 2 of the License, or (at your option) any later
# version.
#
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more
# details.
#
# You should have received a copy of the GNU General Public License along with
# this program; if not, write to the Free Software Foundation, Inc.,
# 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
#
# Module authors:
# Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
# Markus Zolliker <markus.zolliker@psi.ch>
# Alexander Zaft <a.zaft@fz-juelich.de>
#
# *****************************************************************************
"""Defines the base Module class"""
import time
import threading
from collections import OrderedDict
from frappy.datatypes import ArrayOf, BoolType, EnumType, FloatRange, \
IntRange, StringType, TextType, TupleOf, \
NoneOr
from frappy.errors import BadValueError, CommunicationFailedError, ConfigError, \
ProgrammingError, SECoPError, secop_error, RangeError
from frappy.lib import formatException, mkthread, UniqueObject
from frappy.params import Accessible, Command, Parameter, Limit
from frappy.properties import HasProperties, Property
from frappy.logging import RemoteLogHandler
# TODO: resolve circular import
# from .interfaces import SECoP_BASE_CLASSES
# WORKAROUND:
SECoP_BASE_CLASSES = ['Readable', 'Writable', 'Drivable', 'Communicator']
Done = UniqueObject('Done')
"""a special return value for a read_<param>/write_<param> method
indicating that the setter is triggered already"""
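# A minimal sketch of the (legacy) pattern referred to above, e.g. in a read method:
# the value is stored via the attribute, which already announces the update, and Done
# tells the wrapper not to announce it again ('_hw_status' is a placeholder helper).
#
#     def read_status(self):
#         self.status = self._hw_status()
#         return Done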
wrapperClasses = {}
class HasAccessibles(HasProperties):
"""base class of Module
joining the class's properties, parameters and commands dicts with
those of base classes.
wrap read_*/write_* methods
(so the dispatcher will get notified of changed values)
"""
isWrapped = False
checkedMethods = set()
@classmethod
def __init_subclass__(cls): # pylint: disable=too-many-branches
super().__init_subclass__()
if cls.isWrapped:
return
# merge accessibles from all sub-classes, treat overrides
# for now, allow to use also the old syntax (parameters/commands dict)
accessibles = OrderedDict() # dict of accessibles
merged_properties = {} # dict of dict of merged properties
new_names = [] # list of names of new accessibles
override_values = {} # bare values overriding a parameter and methods overriding a command
for base in reversed(cls.__mro__):
for key, value in base.__dict__.items():
if isinstance(value, Accessible):
value.updateProperties(merged_properties.setdefault(key, {}))
if base == cls and key not in accessibles:
new_names.append(key)
accessibles[key] = value
override_values.pop(key, None)
elif key in accessibles:
override_values[key] = value
for aname, aobj in list(accessibles.items()):
if aname in override_values:
aobj = aobj.copy()
value = override_values[aname]
if value is None:
accessibles.pop(aname)
continue
aobj.merge(merged_properties[aname])
aobj.override(value)
# replace the bare value by the created accessible
setattr(cls, aname, aobj)
else:
aobj.merge(merged_properties[aname])
accessibles[aname] = aobj
# rebuild order: (1) inherited items, (2) items from paramOrder, (3) new accessibles
# move (2) to the end
paramOrder = cls.__dict__.get('paramOrder', ())
for aname in paramOrder:
if aname in accessibles:
accessibles.move_to_end(aname)
# ignore unknown names
# move (3) to the end
for aname in new_names:
if aname not in paramOrder:
accessibles.move_to_end(aname)
cls.accessibles = accessibles
cls.wrappedAttributes = {'isWrapped': True}
# create wrappers for access methods
wrapped_name = '_' + cls.__name__
for pname, pobj in accessibles.items():
# wrap of reading/writing funcs
if not isinstance(pobj, Parameter):
# nothing to do for Commands
continue
rname = 'read_' + pname
rfunc = getattr(cls, rname, None)
# create wrapper
if rfunc:
def new_rfunc(self, pname=pname, rfunc=rfunc):
with self.accessLock:
try:
value = rfunc(self)
self.log.debug("read_%s returned %r", pname, value)
if value is Done: # TODO: to be removed when all code using Done is updated
return getattr(self, pname)
pobj = self.accessibles[pname]
value = pobj.datatype(value)
except Exception as e:
self.log.debug("read_%s failed with %r", pname, e)
if isinstance(e, SECoPError):
e.raising_methods.append(f'{self.name}.read_{pname}')
self.announceUpdate(pname, err=e)
raise
self.announceUpdate(pname, value, validate=False)
return value
new_rfunc.poll = getattr(rfunc, 'poll', True)
else:
def new_rfunc(self, pname=pname):
return getattr(self, pname)
new_rfunc.poll = False
new_rfunc.__name__ = rname
new_rfunc.__qualname__ = wrapped_name + '.' + rname
new_rfunc.__module__ = cls.__module__
cls.wrappedAttributes[rname] = new_rfunc
cname = 'check_' + pname
for postfix in ('_limits', '_min', '_max'):
limname = pname + postfix
if limname in accessibles:
# find the base class, where the parameter <limname> is defined first.
# we have to check all bases, as they may not be treated yet when
# not inheriting from HasAccessibles
base = next(b for b in reversed(cls.__mro__) if limname in b.__dict__)
if cname not in base.__dict__:
# there is no check method yet at this class
# add check function to the class where the limit was defined
setattr(base, cname, lambda self, value, pname=pname: self.checkLimits(value, pname))
cfuncs = tuple(filter(None, (b.__dict__.get(cname) for b in cls.__mro__)))
wname = 'write_' + pname
wfunc = getattr(cls, wname, None)
if wfunc or not pobj.readonly:
# allow write method even when parameter is readonly, but internally writable
def new_wfunc(self, value, pname=pname, wfunc=wfunc, check_funcs=cfuncs):
with self.accessLock:
self.log.debug('validate %r to datatype of %r', value, pname)
validate = self.parameters[pname].datatype.validate
try:
new_value = validate(value)
for c in check_funcs:
if c(self, value):
break
if wfunc:
new_value = wfunc(self, new_value)
self.log.debug('write_%s(%r) returned %r', pname, value, new_value)
if new_value is Done: # TODO: to be removed when all code using Done is updated
return getattr(self, pname)
new_value = value if new_value is None else validate(new_value)
except Exception as e:
if isinstance(e, SECoPError):
e.raising_methods.append(f'{self.name}.write_{pname}')
self.announceUpdate(pname, err=e)
raise
self.announceUpdate(pname, new_value, validate=False)
return new_value
new_wfunc.__name__ = wname
new_wfunc.__qualname__ = wrapped_name + '.' + wname
new_wfunc.__module__ = cls.__module__
cls.wrappedAttributes[wname] = new_wfunc
cls.checkedMethods.update(cls.wrappedAttributes)
# check for programming errors
for attrname in dir(cls):
prefix, _, pname = attrname.partition('_')
if not pname:
continue
if prefix == 'do':
raise ProgrammingError(f'{cls.__name__!r}: old style command {attrname!r} not supported anymore')
if prefix in ('read', 'write') and attrname not in cls.checkedMethods:
raise ProgrammingError(f'{cls.__name__}.{attrname} defined, but {pname!r} is no parameter')
try:
# update Status type
cls.Status = cls.status.datatype.members[0]._enum
except AttributeError:
pass
res = {}
# collect info about properties
for pn, pv in cls.propertyDict.items():
if pv.settable:
res[pn] = pv
# collect info about parameters and their properties
for param, pobj in cls.accessibles.items():
res[param] = {}
for pn, pv in pobj.getProperties().items():
if pv.settable:
res[param][pn] = pv
cls.configurables = res
def __new__(cls, *args, **kwds):
wrapper_class = wrapperClasses.get(cls)
if wrapper_class is None:
wrapper_class = type('_' + cls.__name__, (cls,), cls.wrappedAttributes)
wrapperClasses[cls] = wrapper_class
return super().__new__(wrapper_class)
class Feature(HasAccessibles):
"""all things belonging to a small, predefined functionality influencing the working of a module
a mixin with Feature as a direct base class is recognized as a SECoP feature
and reported in the module property 'features'
"""
class PollInfo:
def __init__(self, pollinterval, trigger_event):
self.interval = pollinterval
self.last_main = 0
self.last_slow = 0
self.pending_errors = set()
self.polled_parameters = []
self.fast_flag = False
self.trigger_event = trigger_event
def trigger(self, immediate=False):
"""trigger a recalculation of poll due times
:param immediate: when True, doPoll should be called as soon as possible
"""
if immediate:
self.last_main = 0
self.trigger_event.set()
def update_interval(self, pollinterval):
if not self.fast_flag:
self.interval = pollinterval
self.trigger()
class Module(HasAccessibles):
"""basic module
all SECoP modules derive from this.
:param name: the modules name
:param logger: a logger instance
:param cfgdict: the dict from this modules section in the config file
:param srv: the server instance
Notes:
- the programmer normally should not need to reimplement :meth:`__init__`
- within modules, parameters should only be addressed as ``self.<pname>``,
i.e. ``self.value``, ``self.target`` etc...
- these are accessing the cached version.
- they can also be written to, generating an async update
- if you want to 'update from the hardware', call ``self.read_<pname>()`` instead
- the return value of this method will be used as the new cached value and
be an async update sent automatically.
- if you want to 'update the hardware' call ``self.write_<pname>(<new value>)``.
- The return value of this method will also update the cache.
"""
# static properties, definitions in derived classes should overwrite earlier ones.
# note: properties don't change after startup and are usually filled
# with data from a cfg file...
# note: only the properties predefined here are allowed to be set in the cfg file
export = Property('flag if this module is to be exported', BoolType(), default=True, export=False)
group = Property('optional group the module belongs to', StringType(), default='', extname='group')
description = Property('description of the module', TextType(), extname='description', mandatory=True)
meaning = Property('optional meaning indicator', TupleOf(StringType(), IntRange(0, 50)),
default=('', 0), extname='meaning')
visibility = Property('optional visibility hint', EnumType('visibility', user=1, advanced=2, expert=3),
default='user', extname='visibility')
implementation = Property('internal name of the implementation class of the module', StringType(),
extname='implementation')
interface_classes = Property('official highest interface-class of the module', ArrayOf(StringType()),
extname='interface_classes')
features = Property('list of features', ArrayOf(StringType()), extname='features')
pollinterval = Property('poll interval for parameters handled by doPoll', FloatRange(0.1, 120), default=5)
slowinterval = Property('poll interval for other parameters', FloatRange(0.1, 120), default=15)
omit_unchanged_within = Property('default for minimum time between updates of unchanged values',
NoneOr(FloatRange(0)), export=False, default=None)
enablePoll = True
pollInfo = None
triggerPoll = None # trigger event for polls. used on io modules and modules without io
def __init__(self, name, logger, cfgdict, srv):
# remember the secnode for interacting with other modules and the
# server
self.secNode = srv.secnode
self.log = logger
self.name = name
self.valueCallbacks = {}
self.errorCallbacks = {}
self.earlyInitDone = False
self.initModuleDone = False
self.startModuleDone = False
self.remoteLogHandler = None
self.accessLock = threading.RLock() # for read_* / write_* methods
self.updateLock = threading.RLock() # for announceUpdate
self.polledModules = [] # modules polled by thread started in self.startModules
self.attachedModules = {}
self.errors = []
self._isinitialized = False
self.updateCallback = srv.dispatcher.announce_update
# handle module properties
# 1) make local copies of properties
super().__init__()
# conversion from exported names to internal attribute names
self.accessiblename2attr = {}
self.writeDict = {} # values of parameters to be written
# properties, parameters and commands are auto-merged upon subclassing
self.parameters = {}
self.commands = {}
# 2) check and apply properties specified in cfgdict as
# '<propertyname> = <propertyvalue>'
# pylint: disable=consider-using-dict-items
for key in self.propertyDict:
value = cfgdict.pop(key, None)
if value is not None:
try:
if isinstance(value, dict):
self.setProperty(key, value['value'])
else:
self.setProperty(key, value)
except BadValueError:
self.errors.append(f'{key}: value {value!r} does not match {self.propertyDict[key].datatype!r}!')
# 3) set automatic properties
mycls, = self.__class__.__bases__ # skip the wrapper class
myclassname = f'{mycls.__module__}.{mycls.__name__}'
self.implementation = myclassname
# list of only the 'highest' secop module class
self.interface_classes = [
b.__name__ for b in mycls.__mro__ if b.__name__ in SECoP_BASE_CLASSES][:1]
# handle Features
self.features = [b.__name__ for b in mycls.__mro__ if Feature in b.__bases__]
# handle accessibles
# 1) make local copies of parameter objects
# they need to be individual per instance since we use them also
# to cache the current value + qualifiers...
# do not re-use self.accessibles as this is the same for all instances
accessibles = self.accessibles
self.accessibles = {}
for aname, aobj in accessibles.items():
# make a copy of the Parameter/Command object
aobj = aobj.copy()
acfg = cfgdict.pop(aname, None)
self._add_accessible(aname, aobj, cfg=acfg)
# 3) complain about names not found as accessible or property names
if cfgdict:
self.errors.append(
f"{', '.join(cfgdict.keys())} does not exist (use one of"
f" {', '.join(list(self.accessibles) + list(self.propertyDict))})")
# 5) ensure consistency of all accessibles added here
for aobj in self.accessibles.values():
aobj.finish(self)
# Modify units AFTER applying the cfgdict
mainvalue = self.parameters.get('value')
if mainvalue:
mainunit = mainvalue.datatype.unit
if mainunit:
self.applyMainUnit(mainunit)
# 6) check complete configuration of * properties
if not self.errors:
try:
self.checkProperties()
except ConfigError as e:
self.errors.append(str(e))
for aname, aobj in self.accessibles.items():
try:
aobj.checkProperties()
except (ConfigError, ProgrammingError) as e:
self.errors.append(f'{aname}: {e}')
if self.errors:
raise ConfigError(self.errors)
# helper cfg-editor
def __iter__(self):
return self.accessibles.__iter__()
def __getitem__(self, item):
return self.accessibles.__getitem__(item)
def applyMainUnit(self, mainunit):
"""replace $ in units of parameters by mainunit"""
for pobj in self.parameters.values():
pobj.datatype.set_main_unit(mainunit)
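# Illustration of the '$' substitution performed above (not part of this class):
# a parameter declared e.g. as
#
#     ramp = Parameter('ramp rate', FloatRange(unit='$/min'), readonly=False)
#
# gets its unit rewritten to 'K/min' once the main value's unit is 'K'.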
def _add_accessible(self, name, accessible, cfg=None):
if self.startModuleDone:
raise ProgrammingError('Accessibles can only be added before startModule()!')
if not self.export: # do not export parameters of a module not exported
accessible.export = False
self.accessibles[name] = accessible
if accessible.export:
self.accessiblename2attr[accessible.export] = name
if isinstance(accessible, Parameter):
self.parameters[name] = accessible
if isinstance(accessible, Command):
self.commands[name] = accessible
if cfg:
try:
for propname, propvalue in cfg.items():
accessible.setProperty(propname, propvalue)
except KeyError:
self.errors.append(f"'{name}' has no property '{propname}'")
except BadValueError as e:
self.errors.append(f'{name}.{propname}: {str(e)}')
if isinstance(accessible, Parameter):
self._handle_writes(name, accessible)
def _handle_writes(self, pname, pobj):
""" register value for writing, if given
apply default when no value is given (in cfg or as Parameter argument)
or complain, when cfg is needed
"""
self.valueCallbacks[pname] = []
self.errorCallbacks[pname] = []
if isinstance(pobj, Limit):
basepname = pname.rpartition('_')[0]
baseparam = self.parameters.get(basepname)
if not baseparam:
self.errors.append(f'limit {pname!r} is given, but not {basepname!r}')
return
if baseparam.datatype is None:
return # an error will be reported on baseparam
pobj.set_datatype(baseparam.datatype)
if not pobj.hasDatatype():
self.errors.append(f'{pname} needs a datatype')
return
if pobj.value is None:
if pobj.needscfg:
self.errors.append(f'{pname!r} has no default value and was not given in config!')
if pobj.default is None:
# we do not want to call the setter for this parameter for now,
# this should happen on the first read
pobj.readerror = ConfigError(f'parameter {pname!r} not initialized')
# above error will be triggered on activate after startup,
# when not all hardware parameters are read because of startup timeout
pobj.default = pobj.datatype.default
pobj.value = pobj.default
else:
# value given explicitly, either by cfg or as Parameter argument
pobj.given = True # for PersistentMixin
if hasattr(self, 'write_' + pname):
self.writeDict[pname] = pobj.value
if pobj.default is None:
pobj.default = pobj.value
# this checks again for datatype and sets the timestamp
setattr(self, pname, pobj.value)
def announceUpdate(self, pname, value=None, err=None, timestamp=None, validate=True):
"""announce a changed value or readerror
:param pname: parameter name
:param value: new value or None in case of error
:param err: None or an exception
:param timestamp: a timestamp or None for taking current time
:param validate: True: convert to datatype, in case of error store in readerror
:return:
when err=None and validate=False, the value must already be converted to the datatype
"""
with self.updateLock:
pobj = self.parameters[pname]
timestamp = timestamp or time.time()
if not err:
try:
if validate:
value = pobj.datatype(value)
except Exception as e:
err = e
else:
changed = pobj.value != value
# store the value even in case of error
pobj.value = value
if err:
if secop_error(err) == pobj.readerror:
err.report_error = False
return # no updates for repeated errors
err = secop_error(err)
elif not changed and timestamp < (pobj.timestamp or 0) + pobj.omit_unchanged_within:
# no change within short time -> omit
return
pobj.timestamp = timestamp or time.time()
if err:
callbacks = self.errorCallbacks
pobj.readerror = arg = err
else:
callbacks = self.valueCallbacks
arg = value
pobj.readerror = None
if pobj.export:
self.updateCallback(self, pobj)
cblist = callbacks[pname]
for cb in cblist:
try:
cb(arg)
except Exception:
# print(formatExtendedTraceback())
pass
def registerCallbacks(self, modobj, autoupdate=()):
"""register callbacks to another module <modobj>
- whenever a self.<param> changes:
<modobj>.update_<param> is called with the new value as argument.
If this method raises an exception, <modobj>.<param> gets into an error state.
If the method does not exist and <param> is in autoupdate,
<modobj>.<param> is updated to self.<param>
- whenever <self>.<param> gets into an error state:
<modobj>.error_update_<param> is called with the exception as argument.
If this method raises an error, <modobj>.<param> gets into an error state.
If this method does not exist, and <param> is in autoupdate,
<modobj>.<param> gets into the same error state as self.<param>
"""
for pname in self.parameters:
errfunc = getattr(modobj, 'error_update_' + pname, None)
if errfunc:
def errcb(err, p=pname, efunc=errfunc):
try:
efunc(err)
except Exception as e:
modobj.announceUpdate(p, err=e)
self.errorCallbacks[pname].append(errcb)
else:
def errcb(err, p=pname):
modobj.announceUpdate(p, err=err)
if pname in autoupdate:
self.errorCallbacks[pname].append(errcb)
updfunc = getattr(modobj, 'update_' + pname, None)
if updfunc:
def cb(value, ufunc=updfunc, efunc=errcb):
try:
ufunc(value)
except Exception as e:
efunc(e)
self.valueCallbacks[pname].append(cb)
elif pname in autoupdate:
def cb(value, p=pname):
modobj.announceUpdate(p, value)
self.valueCallbacks[pname].append(cb)
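# Usage sketch with illustrative names: the module passed as <modobj> just defines
# the hooks; registration is done on the module whose parameters are observed.
#
#     class Follower(Module):
#         def update_value(self, value):
#             self.log.debug('observed value changed to %r', value)
#         def error_update_value(self, err):
#             self.log.warning('observed value unavailable: %r', err)
#
#     # e.g. in initModule of the observed module:
#     #     self.registerCallbacks(follower_module)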
def isBusy(self, status=None):
"""helper function for treating substates of BUSY correctly"""
# defined even for non drivable (used for dynamic polling)
return False
def earlyInit(self):
"""initialise module with stuff to be done before all modules are created"""
self.earlyInitDone = True
def initModule(self):
"""initialise module with stuff to be done after all modules are created"""
self.initModuleDone = True
if self.enablePoll or self.writeDict:
# enablePoll == False: we still need the poll thread for writing values from writeDict
if hasattr(self, 'io'):
self.io.polledModules.append(self)
else:
self.triggerPoll = threading.Event()
self.polledModules.append(self)
def startModule(self, start_events):
"""runs after init of all modules
when a thread is started, a trigger function may signal that it
has finished its initial work
start_events.get_trigger(<timeout>) creates such a trigger and
registers it in the server for waiting
<timeout> defaults to 30 seconds
"""
# we do not need self.errors any longer. should we delete it?
# del self.errors
if self.polledModules:
mkthread(self.__pollThread, self.polledModules, start_events.get_trigger())
self.startModuleDone = True
def initialReads(self):
"""initial reads to be done
override to read initial values from HW, when it is not desired
to poll them afterwards
called from the poll thread, after writeInitParams but before
all parameters are polled once
"""
def shutdownModule(self):
"""called when the sever shuts down
any cleanup-work should be performed here, like closing threads and
saving data.
"""
def doPoll(self):
"""polls important parameters like value and status
all other parameters are polled automatically
"""
def setFastPoll(self, flag, fast_interval=0.25):
"""change poll interval
:param flag: enable/disable fast poll mode
:param fast_interval: fast poll interval
"""
if self.pollInfo:
self.pollInfo.fast_flag = flag
self.pollInfo.interval = fast_interval if flag else self.pollinterval
self.pollInfo.trigger()
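# Typical (illustrative) use in a drivable subclass: poll fast while a movement is
# in progress and fall back to the normal pollinterval when it is finished; the
# _hw_* helpers are placeholders.
#
#     def write_target(self, target):
#         self._hw_move_to(target)
#         self.setFastPoll(True)
#         return target
#
#     def read_status(self):
#         code, text = self._hw_status()
#         if not self.isBusy((code, text)):
#             self.setFastPoll(False)
#         return code, text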
def callPollFunc(self, rfunc, raise_com_failed=False):
"""call read method with proper error handling"""
try:
rfunc()
if rfunc.__name__ in self.pollInfo.pending_errors:
self.log.info('%s: o.k.', rfunc.__name__)
self.pollInfo.pending_errors.discard(rfunc.__name__)
except Exception as e:
if getattr(e, 'report_error', True):
name = rfunc.__name__
self.pollInfo.pending_errors.add(name) # trigger o.k. message after error is resolved
if isinstance(e, SECoPError):
e.raising_methods.append(name)
if e.silent:
self.log.debug('%s', e.format(False))
else:
self.log.error('%s', e.format(False))
if raise_com_failed and isinstance(e, CommunicationFailedError):
raise
else:
# not a SECoPError: this is probably a programming error
# we want to log the traceback
self.log.error('%s', formatException())
def __pollThread(self, modules, started_callback):
"""poll thread body
:param modules: list of modules to be handled by this thread
:param started_callback: to be called after all polls are done once
before polling, parameters which need hardware initialisation are written
"""
polled_modules = [m for m in modules if m.enablePoll]
if hasattr(self, 'registerReconnectCallback'):
# self is a communicator supporting reconnections
def trigger_all(trg=self.triggerPoll, polled_modules=polled_modules):
for m in polled_modules:
m.pollInfo.last_main = 0
m.pollInfo.last_slow = 0
trg.set()
self.registerReconnectCallback('trigger_polls', trigger_all)
# collect all read functions
for mobj in polled_modules:
pinfo = mobj.pollInfo = PollInfo(mobj.pollinterval, self.triggerPoll)
# trigger a poll interval change when self.pollinterval changes.
if 'pollinterval' in mobj.valueCallbacks:
mobj.valueCallbacks['pollinterval'].append(pinfo.update_interval)
for pname, pobj in mobj.parameters.items():
rfunc = getattr(mobj, 'read_' + pname)
if rfunc.poll:
pinfo.polled_parameters.append((mobj, rfunc, pobj))
while True:
try:
for mobj in modules:
# TODO when needed: here we might add a call to a method :meth:`beforeWriteInit`
mobj.writeInitParams()
mobj.initialReads()
# call all read functions a first time
for m in polled_modules:
for mobj, rfunc, _ in m.pollInfo.polled_parameters:
mobj.callPollFunc(rfunc, raise_com_failed=True)
# TODO when needed: here we might add calls to a method :meth:`afterInitPolls`
break
except CommunicationFailedError as e:
# when communication failed, probably all parameters and maybe more modules are affected.
# as this would take a lot of time (summed up timeouts), we do not continue
# trying and let the server accept connections; further polls might succeed later
if started_callback:
self.log.error('communication failure on startup: %s', e)
started_callback()
started_callback = None
self.triggerPoll.wait(0.1) # wait for reconnection or max 10 sec.
break
if started_callback:
started_callback()
if not polled_modules: # no polls needed - exit thread
return
to_poll = ()
while True:
now = time.time()
wait_time = 999
for mobj in modules:
pinfo = mobj.pollInfo
wait_time = min(pinfo.last_main + pinfo.interval - now, wait_time,
pinfo.last_slow + mobj.slowinterval - now)
if wait_time > 0 and not to_poll:
# nothing to do
self.triggerPoll.wait(wait_time)
self.triggerPoll.clear()
continue
# call doPoll of all modules where due
for mobj in modules:
pinfo = mobj.pollInfo
if now > pinfo.last_main + pinfo.interval:
try:
pinfo.last_main = (now // pinfo.interval) * pinfo.interval
except ZeroDivisionError:
pinfo.last_main = now
mobj.callPollFunc(mobj.doPoll)
now = time.time()
# find ONE due slow poll and call it
loop = True
while loop: # loops max. 2 times, when to_poll is at end
for mobj, rfunc, pobj in to_poll:
if now > pobj.timestamp + mobj.slowinterval * 0.5:
mobj.callPollFunc(rfunc)
loop = False # one poll done
break
else:
to_poll = []
# collect due slow polls
for mobj in modules:
pinfo = mobj.pollInfo
if now > pinfo.last_slow + mobj.slowinterval:
to_poll.extend(pinfo.polled_parameters)
pinfo.last_slow = (now // mobj.slowinterval) * mobj.slowinterval
if to_poll:
to_poll = iter(to_poll)
else:
loop = False # no slow polls ready
def writeInitParams(self):
"""write values for parameters with configured values
- does proper error handling
called at the beginning of the poller thread and for writing persistent values
"""
for pname in list(self.writeDict):
value = self.writeDict.pop(pname, Done)
# in the mean time, a poller or handler might already have done it
if value is not Done:
wfunc = getattr(self, 'write_' + pname, None)
if wfunc is None:
setattr(self, pname, value)
else:
try:
self.log.debug('initialize parameter %s', pname)
wfunc(value)
except SECoPError as e:
if e.silent:
self.log.debug('%s: %s', pname, str(e))
else:
self.log.error('%s: %s', pname, str(e))
except Exception:
self.log.error(formatException())
def setRemoteLogging(self, conn, level, send_log):
if self.remoteLogHandler is None:
for handler in self.log.handlers:
if isinstance(handler, RemoteLogHandler):
handler.send_log = send_log
self.remoteLogHandler = handler
break
else:
raise ValueError('remote handler not found')
self.remoteLogHandler.set_conn_level(self.name, conn, level)
def checkLimits(self, value, pname='target'):
"""check for limits
:param value: the value to be checked for <pname>_min <= value <= <pname>_max
:param pname: parameter name, default is 'target'
raises RangeError in case the value is not valid
This method is called automatically and therefore rarely needs to be
called by the programmer. It might be used in a check_<param> method
when no automatic super call is desired.
"""
try:
min_, max_ = getattr(self, pname + '_limits')
if not min_ <= value <= max_:
raise RangeError(f'{pname} outside {pname}_limits')
return
except AttributeError:
pass
min_ = getattr(self, pname + '_min', float('-inf'))
max_ = getattr(self, pname + '_max', float('inf'))
if min_ > max_:
raise RangeError(f'invalid limits: {pname}_min > {pname}_max')
if value < min_:
raise RangeError(f'{pname} below {pname}_min')
if value > max_:
raise RangeError(f'{pname} above {pname}_max')
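# Sketch of how the limit machinery is typically used (illustrative names): declaring
# target_min/target_max as Limit parameters makes the generated check_target call
# checkLimits(value, 'target') before write_target is executed.
#
#     class Oven(Drivable):
#         target = Parameter('setpoint', FloatRange(unit='K'), readonly=False)
#         target_min = Limit()
#         target_max = Limit()
#
#     # configured e.g. with target_min=3, target_max=300 in the cfg file;
#     # a target outside this window raises RangeError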

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under
@ -17,21 +18,838 @@
# Module authors:
# Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
# Markus Zolliker <markus.zolliker@psi.ch>
# Alexander Zaft <a.zaft@fz-juelich.de>
#
# *****************************************************************************
"""Define base classes for real Modules implemented in the server"""
from frappy.datatypes import FloatRange, \
StatusType, StringType
from frappy.errors import ConfigError, ProgrammingError
from frappy.lib.enum import Enum
from frappy.params import Command, Parameter
from frappy.properties import Property
from frappy.logging import HasComlog
import time
import threading
from collections import OrderedDict
from .modulebase import Module
from frappy.datatypes import ArrayOf, BoolType, EnumType, FloatRange, \
IntRange, StatusType, StringType, TextType, TupleOf, \
NoneOr
from frappy.errors import BadValueError, CommunicationFailedError, ConfigError, \
ProgrammingError, SECoPError, secop_error, RangeError
from frappy.lib import formatException, mkthread, UniqueObject
from frappy.lib.enum import Enum
from frappy.params import Accessible, Command, Parameter, Limit
from frappy.properties import HasProperties, Property
from frappy.logging import RemoteLogHandler, HasComlog
Done = UniqueObject('Done')
"""a special return value for a read_<param>/write_<param> method
indicating that the setter is triggered already"""
wrapperClasses = {}
class HasAccessibles(HasProperties):
"""base class of Module
joining the class's properties, parameters and commands dicts with
those of base classes.
wrap read_*/write_* methods
(so the dispatcher will get notified of changed values)
"""
isWrapped = False
checkedMethods = set()
@classmethod
def __init_subclass__(cls): # pylint: disable=too-many-branches
super().__init_subclass__()
if cls.isWrapped:
return
# merge accessibles from all sub-classes, treat overrides
# for now, allow to use also the old syntax (parameters/commands dict)
accessibles = OrderedDict() # dict of accessibles
merged_properties = {} # dict of dict of merged properties
new_names = [] # list of names of new accessibles
override_values = {} # bare values overriding a parameter and methods overriding a command
for base in reversed(cls.__mro__):
for key, value in base.__dict__.items():
if isinstance(value, Accessible):
value.updateProperties(merged_properties.setdefault(key, {}))
if base == cls and key not in accessibles:
new_names.append(key)
accessibles[key] = value
override_values.pop(key, None)
elif key in accessibles:
override_values[key] = value
for aname, aobj in list(accessibles.items()):
if aname in override_values:
aobj = aobj.copy()
value = override_values[aname]
if value is None:
accessibles.pop(aname)
continue
aobj.merge(merged_properties[aname])
aobj.override(value)
# replace the bare value by the created accessible
setattr(cls, aname, aobj)
else:
aobj.merge(merged_properties[aname])
accessibles[aname] = aobj
# rebuild order: (1) inherited items, (2) items from paramOrder, (3) new accessibles
# move (2) to the end
paramOrder = cls.__dict__.get('paramOrder', ())
for aname in paramOrder:
if aname in accessibles:
accessibles.move_to_end(aname)
# ignore unknown names
# move (3) to the end
for aname in new_names:
if aname not in paramOrder:
accessibles.move_to_end(aname)
# note: for python < 3.6 the order of inherited items is not ensured between
# declarations within the same class
cls.accessibles = accessibles
cls.wrappedAttributes = {'isWrapped': True}
# create wrappers for access methods
wrapped_name = '_' + cls.__name__
for pname, pobj in accessibles.items():
# wrap of reading/writing funcs
if not isinstance(pobj, Parameter):
# nothing to do for Commands
continue
rname = 'read_' + pname
rfunc = getattr(cls, rname, None)
# create wrapper
if rfunc:
def new_rfunc(self, pname=pname, rfunc=rfunc):
with self.accessLock:
try:
value = rfunc(self)
self.log.debug("read_%s returned %r", pname, value)
if value is Done: # TODO: to be removed when all code using Done is updated
return getattr(self, pname)
pobj = self.accessibles[pname]
value = pobj.datatype(value)
except Exception as e:
self.log.debug("read_%s failed with %r", pname, e)
if isinstance(e, SECoPError):
e.raising_methods.append(f'{self.name}.read_{pname}')
self.announceUpdate(pname, err=e)
raise
self.announceUpdate(pname, value, validate=False)
return value
new_rfunc.poll = getattr(rfunc, 'poll', True)
else:
def new_rfunc(self, pname=pname):
return getattr(self, pname)
new_rfunc.poll = False
new_rfunc.__name__ = rname
new_rfunc.__qualname__ = wrapped_name + '.' + rname
new_rfunc.__module__ = cls.__module__
cls.wrappedAttributes[rname] = new_rfunc
cname = 'check_' + pname
for postfix in ('_limits', '_min', '_max'):
limname = pname + postfix
if limname in accessibles:
# find the base class, where the parameter <limname> is defined first.
# we have to check all bases, as they may not be treated yet when
# not inheriting from HasAccessibles
base = next(b for b in reversed(cls.__mro__) if limname in b.__dict__)
if cname not in base.__dict__:
# there is no check method yet at this class
# add check function to the class where the limit was defined
setattr(base, cname, lambda self, value, pname=pname: self.checkLimits(value, pname))
cfuncs = tuple(filter(None, (b.__dict__.get(cname) for b in cls.__mro__)))
wname = 'write_' + pname
wfunc = getattr(cls, wname, None)
if wfunc or not pobj.readonly:
# allow write method even when parameter is readonly, but internally writable
def new_wfunc(self, value, pname=pname, wfunc=wfunc, check_funcs=cfuncs):
with self.accessLock:
self.log.debug('validate %r to datatype of %r', value, pname)
validate = self.parameters[pname].datatype.validate
try:
new_value = validate(value)
for c in check_funcs:
if c(self, value):
break
if wfunc:
new_value = wfunc(self, new_value)
self.log.debug('write_%s(%r) returned %r', pname, value, new_value)
if new_value is Done: # TODO: to be removed when all code using Done is updated
return getattr(self, pname)
new_value = value if new_value is None else validate(new_value)
except Exception as e:
if isinstance(e, SECoPError):
e.raising_methods.append(f'{self.name}.write_{pname}')
self.announceUpdate(pname, err=e)
raise
self.announceUpdate(pname, new_value, validate=False)
return new_value
new_wfunc.__name__ = wname
new_wfunc.__qualname__ = wrapped_name + '.' + wname
new_wfunc.__module__ = cls.__module__
cls.wrappedAttributes[wname] = new_wfunc
cls.checkedMethods.update(cls.wrappedAttributes)
# check for programming errors
for attrname in dir(cls):
prefix, _, pname = attrname.partition('_')
if not pname:
continue
if prefix == 'do':
raise ProgrammingError(f'{cls.__name__!r}: old style command {attrname!r} not supported anymore')
if prefix in ('read', 'write') and attrname not in cls.checkedMethods:
raise ProgrammingError(f'{cls.__name__}.{attrname} defined, but {pname!r} is no parameter')
try:
# update Status type
cls.Status = cls.status.datatype.members[0]._enum
except AttributeError:
pass
res = {}
# collect info about properties
for pn, pv in cls.propertyDict.items():
if pv.settable:
res[pn] = pv
# collect info about parameters and their properties
for param, pobj in cls.accessibles.items():
res[param] = {}
for pn, pv in pobj.getProperties().items():
if pv.settable:
res[param][pn] = pv
cls.configurables = res
def __new__(cls, *args, **kwds):
wrapper_class = wrapperClasses.get(cls)
if wrapper_class is None:
wrapper_class = type('_' + cls.__name__, (cls,), cls.wrappedAttributes)
wrapperClasses[cls] = wrapper_class
return super().__new__(wrapper_class)
class Feature(HasAccessibles):
"""all things belonging to a small, predefined functionality influencing the working of a module
a mixin with Feature as a direct base class is recognized as a SECoP feature
and reported in the module property 'features'
"""
class PollInfo:
def __init__(self, pollinterval, trigger_event):
self.interval = pollinterval
self.last_main = 0
self.last_slow = 0
self.pending_errors = set()
self.polled_parameters = []
self.fast_flag = False
self.trigger_event = trigger_event
def trigger(self, immediate=False):
"""trigger a recalculation of poll due times
:param immediate: when True, doPoll should be called as soon as possible
"""
if immediate:
self.last_main = 0
self.trigger_event.set()
def update_interval(self, pollinterval):
if not self.fast_flag:
self.interval = pollinterval
self.trigger()
class Module(HasAccessibles):
"""basic module
all SECoP modules derive from this.
:param name: the modules name
:param logger: a logger instance
:param cfgdict: the dict from this modules section in the config file
:param srv: the server instance
Notes:
- the programmer normally should not need to reimplement :meth:`__init__`
- within modules, parameters should only be addressed as ``self.<pname>``,
i.e. ``self.value``, ``self.target`` etc...
- these are accessing the cached version.
- they can also be written to, generating an async update
- if you want to 'update from the hardware', call ``self.read_<pname>()`` instead
- the return value of this method will be used as the new cached value and
be an async update sent automatically.
- if you want to 'update the hardware' call ``self.write_<pname>(<new value>)``.
- The return value of this method will also update the cache.
"""
# static properties, definitions in derived classes should overwrite earlier ones.
# note: properties don't change after startup and are usually filled
# with data from a cfg file...
# note: only the properties predefined here are allowed to be set in the cfg file
export = Property('flag if this module is to be exported', BoolType(), default=True, export=False)
group = Property('optional group the module belongs to', StringType(), default='', extname='group')
description = Property('description of the module', TextType(), extname='description', mandatory=True)
meaning = Property('optional meaning indicator', TupleOf(StringType(), IntRange(0, 50)),
default=('', 0), extname='meaning')
visibility = Property('optional visibility hint', EnumType('visibility', user=1, advanced=2, expert=3),
default='user', extname='visibility')
implementation = Property('internal name of the implementation class of the module', StringType(),
extname='implementation')
interface_classes = Property('official highest interface-class of the module', ArrayOf(StringType()),
extname='interface_classes')
features = Property('list of features', ArrayOf(StringType()), extname='features')
pollinterval = Property('poll interval for parameters handled by doPoll', FloatRange(0.1, 120), default=5)
slowinterval = Property('poll interval for other parameters', FloatRange(0.1, 120), default=15)
omit_unchanged_within = Property('default for minimum time between updates of unchanged values',
NoneOr(FloatRange(0)), export=False, default=None)
enablePoll = True
# properties, parameters and commands are auto-merged upon subclassing
parameters = {}
commands = {}
# reference to the dispatcher (used for sending async updates)
DISPATCHER = None
pollInfo = None
triggerPoll = None # trigger event for polls. used on io modules and modules without io
def __init__(self, name, logger, cfgdict, srv):
# remember the dispatcher object (for the async callbacks)
self.DISPATCHER = srv.dispatcher
self.log = logger
self.name = name
self.valueCallbacks = {}
self.errorCallbacks = {}
self.earlyInitDone = False
self.initModuleDone = False
self.startModuleDone = False
self.remoteLogHandler = None
self.accessLock = threading.RLock() # for read_* / write_* methods
self.updateLock = threading.RLock() # for announceUpdate
self.polledModules = [] # modules polled by thread started in self.startModules
self.attachedModules = {}
errors = []
self._isinitialized = False
# handle module properties
# 1) make local copies of properties
super().__init__()
# 2) check and apply properties specified in cfgdict as
# '<propertyname> = <propertyvalue>'
# pylint: disable=consider-using-dict-items
for key in self.propertyDict:
value = cfgdict.pop(key, None)
if value is not None:
try:
if isinstance(value, dict):
self.setProperty(key, value['value'])
else:
self.setProperty(key, value)
except BadValueError:
errors.append(f'{key}: value {value!r} does not match {self.propertyDict[key].datatype!r}!')
# 3) set automatic properties
mycls, = self.__class__.__bases__ # skip the wrapper class
myclassname = f'{mycls.__module__}.{mycls.__name__}'
self.implementation = myclassname
# list of all 'secop' modules
# self.interface_classes = [
# b.__name__ for b in mycls.__mro__ if b.__module__.startswith('frappy.modules')]
# list of only the 'highest' secop module class
self.interface_classes = [
b.__name__ for b in mycls.__mro__ if b in SECoP_BASE_CLASSES][:1]
# handle Features
self.features = [b.__name__ for b in mycls.__mro__ if Feature in b.__bases__]
# handle accessibles
# 1) make local copies of parameter objects
# they need to be individual per instance since we use them also
# to cache the current value + qualifiers...
accessibles = {}
# conversion from exported names to internal attribute names
accessiblename2attr = {}
for aname, aobj in self.accessibles.items():
# make a copy of the Parameter/Command object
aobj = aobj.copy()
if not self.export: # do not export parameters of a module not exported
aobj.export = False
if aobj.export:
accessiblename2attr[aobj.export] = aname
accessibles[aname] = aobj
# do not re-use self.accessibles as this is the same for all instances
self.accessibles = accessibles
self.accessiblename2attr = accessiblename2attr
# provide properties to 'filter' out the parameters/commands
self.parameters = {k: v for k, v in accessibles.items() if isinstance(v, Parameter)}
self.commands = {k: v for k, v in accessibles.items() if isinstance(v, Command)}
# 2) check and apply parameter_properties
bad = []
for aname, cfg in cfgdict.items():
aobj = self.accessibles.get(aname, None)
if aobj:
try:
for propname, propvalue in cfg.items():
aobj.setProperty(propname, propvalue)
except KeyError:
errors.append(f"'{aname}' has no property '{propname}'")
except BadValueError as e:
errors.append(f'{aname}.{propname}: {str(e)}')
else:
bad.append(aname)
# 3) complain about names not found as accessible or property names
if bad:
errors.append(
f"{', '.join(bad)} does not exist (use one of {', '.join(list(self.accessibles) + list(self.propertyDict))})")
# 4) register value for writing, if given
# apply default when no value is given (in cfg or as Parameter argument)
# or complain, when cfg is needed
self.writeDict = {} # values of parameters to be written
for pname, pobj in self.parameters.items():
self.valueCallbacks[pname] = []
self.errorCallbacks[pname] = []
if isinstance(pobj, Limit):
basepname = pname.rpartition('_')[0]
baseparam = self.parameters.get(basepname)
if not baseparam:
errors.append(f'limit {pname!r} is given, but not {basepname!r}')
continue
if baseparam.datatype is None:
continue # an error will be reported on baseparam
pobj.set_datatype(baseparam.datatype)
if not pobj.hasDatatype():
errors.append(f'{pname} needs a datatype')
continue
if pobj.value is None:
if pobj.needscfg:
errors.append(f'{pname!r} has no default value and was not given in config!')
if pobj.default is None:
# we do not want to call the setter for this parameter for now,
# this should happen on the first read
pobj.readerror = ConfigError(f'parameter {pname!r} not initialized')
# above error will be triggered on activate after startup,
# when not all hardware parameters are read because of startup timeout
pobj.default = pobj.datatype.default
pobj.value = pobj.default
else:
# value given explicitly, either by cfg or as Parameter argument
pobj.given = True # for PersistentMixin
if hasattr(self, 'write_' + pname):
self.writeDict[pname] = pobj.value
if pobj.default is None:
pobj.default = pobj.value
# this checks again for datatype and sets the timestamp
setattr(self, pname, pobj.value)
# 5) ensure consistency
for aobj in self.accessibles.values():
aobj.finish(self)
# Modify units AFTER applying the cfgdict
mainvalue = self.parameters.get('value')
if mainvalue:
mainunit = mainvalue.datatype.unit
if mainunit:
self.applyMainUnit(mainunit)
# 6) check complete configuration of * properties
if not errors:
try:
self.checkProperties()
except ConfigError as e:
errors.append(str(e))
for aname, aobj in self.accessibles.items():
try:
aobj.checkProperties()
except (ConfigError, ProgrammingError) as e:
errors.append(f'{aname}: {e}')
if errors:
raise ConfigError(errors)
# helper cfg-editor
def __iter__(self):
return self.accessibles.__iter__()
def __getitem__(self, item):
return self.accessibles.__getitem__(item)
def applyMainUnit(self, mainunit):
"""replace $ in units of parameters by mainunit"""
for pobj in self.parameters.values():
pobj.datatype.set_main_unit(mainunit)
def announceUpdate(self, pname, value=None, err=None, timestamp=None, validate=True):
"""announce a changed value or readerror
:param pname: parameter name
:param value: new value or None in case of error
:param err: None or an exception
:param timestamp: a timestamp or None for taking current time
:param validate: True: convert to datatype, in case of error store in readerror
:return:
when err=None and validate=False, the value must already be converted to the datatype
"""
with self.updateLock:
pobj = self.parameters[pname]
timestamp = timestamp or time.time()
if not err:
try:
if validate:
value = pobj.datatype(value)
except Exception as e:
err = e
else:
changed = pobj.value != value
# store the value even in case of error
pobj.value = value
if err:
if secop_error(err) == pobj.readerror:
err.report_error = False
return # no updates for repeated errors
err = secop_error(err)
elif not changed and timestamp < (pobj.timestamp or 0) + pobj.omit_unchanged_within:
# no change within short time -> omit
return
pobj.timestamp = timestamp or time.time()
if err:
callbacks = self.errorCallbacks
pobj.readerror = arg = err
else:
callbacks = self.valueCallbacks
arg = value
pobj.readerror = None
if pobj.export:
self.DISPATCHER.announce_update(self.name, pname, pobj)
cblist = callbacks[pname]
for cb in cblist:
try:
cb(arg)
except Exception:
# print(formatExtendedTraceback())
pass
def registerCallbacks(self, modobj, autoupdate=()):
"""register callbacks to another module <modobj>
- whenever a self.<param> changes:
<modobj>.update_<param> is called with the new value as argument.
If this method raises an exception, <modobj>.<param> gets into an error state.
If the method does not exist and <param> is in autoupdate,
<modobj>.<param> is updated to self.<param>
- whenever <self>.<param> gets into an error state:
<modobj>.error_update_<param> is called with the exception as argument.
If this method raises an error, <modobj>.<param> gets into an error state.
If this method does not exist, and <param> is in autoupdate,
<modobj>.<param> gets into the same error state as self.<param>
"""
for pname in self.parameters:
errfunc = getattr(modobj, 'error_update_' + pname, None)
if errfunc:
def errcb(err, p=pname, efunc=errfunc):
try:
efunc(err)
except Exception as e:
modobj.announceUpdate(p, err=e)
self.errorCallbacks[pname].append(errcb)
else:
def errcb(err, p=pname):
modobj.announceUpdate(p, err=err)
if pname in autoupdate:
self.errorCallbacks[pname].append(errcb)
updfunc = getattr(modobj, 'update_' + pname, None)
if updfunc:
def cb(value, ufunc=updfunc, efunc=errcb):
try:
ufunc(value)
except Exception as e:
efunc(e)
self.valueCallbacks[pname].append(cb)
elif pname in autoupdate:
def cb(value, p=pname):
modobj.announceUpdate(p, value)
self.valueCallbacks[pname].append(cb)
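# Minimal sketch of the receiving side (hypothetical class; assumes the source
# module called source.registerCallbacks(self, autoupdate=['value'])):
#
#     class Follower(Readable):
#         def update_value(self, value):
#             # called whenever source.value changes
#             self.value = 2 * value
#
#         def error_update_value(self, err):
#             # called whenever source.value enters an error state;
#             # re-raising puts self.value into an error state as well
#             raise err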
def isBusy(self, status=None):
"""helper function for treating substates of BUSY correctly"""
# defined even for non-drivable modules (used for dynamic polling)
return False
def earlyInit(self):
"""initialise module with stuff to be done before all modules are created"""
self.earlyInitDone = True
def initModule(self):
"""initialise module with stuff to be done after all modules are created"""
self.initModuleDone = True
if self.enablePoll or self.writeDict:
# enablePoll == False: we still need the poll thread for writing values from writeDict
if hasattr(self, 'io'):
self.io.polledModules.append(self)
else:
self.triggerPoll = threading.Event()
self.polledModules.append(self)
def startModule(self, start_events):
"""runs after init of all modules
When a thread is started, a trigger function may signal that it
has finished its initial work.
start_events.get_trigger(<timeout>) creates such a trigger and
registers it in the server for waiting;
<timeout> defaults to 30 seconds.
"""
if self.polledModules:
mkthread(self.__pollThread, self.polledModules, start_events.get_trigger())
self.startModuleDone = True
def initialReads(self):
"""initial reads to be done
Override to read initial values from the hardware when it is not desired
to poll them afterwards.
Called from the poll thread, after writeInitParams but before
all parameters are polled once.
"""
def shutdownModule(self):
"""called when the sever shuts down
any cleanup-work should be performed here, like closing threads and
saving data.
"""
def doPoll(self):
"""polls important parameters like value and status
all other parameters are polled automatically
"""
def setFastPoll(self, flag, fast_interval=0.25):
"""change poll interval
:param flag: enable/disable fast poll mode
:param fast_interval: fast poll interval
"""
if self.pollInfo:
self.pollInfo.fast_flag = flag
self.pollInfo.interval = fast_interval if flag else self.pollinterval
self.pollInfo.trigger()
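# Minimal usage sketch: poll faster while a movement is in progress
# (hypothetical Drivable methods; _hw_move/_hw_status are assumed helpers):
#
#     def write_target(self, target):
#         self.setFastPoll(True)            # poll every 0.25 s (default fast interval)
#         return self._hw_move(target)
#
#     def read_status(self):
#         status = self._hw_status()
#         if status[0] != self.Status.BUSY:
#             self.setFastPoll(False)       # back to the regular pollinterval
#         return status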
def callPollFunc(self, rfunc, raise_com_failed=False):
"""call read method with proper error handling"""
try:
rfunc()
if rfunc.__name__ in self.pollInfo.pending_errors:
self.log.info('%s: o.k.', rfunc.__name__)
self.pollInfo.pending_errors.discard(rfunc.__name__)
except Exception as e:
if getattr(e, 'report_error', True):
name = rfunc.__name__
self.pollInfo.pending_errors.add(name) # trigger o.k. message after error is resolved
if isinstance(e, SECoPError):
e.raising_methods.append(name)
if e.silent:
self.log.debug('%s', e.format(False))
else:
self.log.error('%s', e.format(False))
if raise_com_failed and isinstance(e, CommunicationFailedError):
raise
else:
# not a SECoPError: this is probably a programming error
# we want to log the traceback
self.log.error('%s', formatException())
def __pollThread(self, modules, started_callback):
"""poll thread body
:param modules: list of modules to be handled by this thread
:param started_callback: to be called after all polls are done once
Before polling, parameters which need hardware initialisation are written.
"""
polled_modules = [m for m in modules if m.enablePoll]
if hasattr(self, 'registerReconnectCallback'):
# self is a communicator supporting reconnections
def trigger_all(trg=self.triggerPoll, polled_modules=polled_modules):
for m in polled_modules:
m.pollInfo.last_main = 0
m.pollInfo.last_slow = 0
trg.set()
self.registerReconnectCallback('trigger_polls', trigger_all)
# collect all read functions
for mobj in polled_modules:
pinfo = mobj.pollInfo = PollInfo(mobj.pollinterval, self.triggerPoll)
# trigger a poll interval change when self.pollinterval changes.
if 'pollinterval' in mobj.valueCallbacks:
mobj.valueCallbacks['pollinterval'].append(pinfo.update_interval)
for pname, pobj in mobj.parameters.items():
rfunc = getattr(mobj, 'read_' + pname)
if rfunc.poll:
pinfo.polled_parameters.append((mobj, rfunc, pobj))
while True:
try:
for mobj in modules:
# TODO when needed: here we might add a call to a method :meth:`beforeWriteInit`
mobj.writeInitParams()
mobj.initialReads()
# call all read functions a first time
for m in polled_modules:
for mobj, rfunc, _ in m.pollInfo.polled_parameters:
mobj.callPollFunc(rfunc, raise_com_failed=True)
# TODO when needed: here we might add calls to a method :meth:`afterInitPolls`
break
except CommunicationFailedError as e:
# when communication failed, probably all parameters and maybe more modules are affected.
# as this would take a lot of time (summed up timeouts), we do not continue
# trying and let the server accept connections; further polls might succeed later
if started_callback:
self.log.error('communication failure on startup: %s', e)
started_callback()
started_callback = None
self.triggerPoll.wait(0.1)  # wait briefly; a reconnect callback will trigger the polls
break
if started_callback:
started_callback()
if not polled_modules: # no polls needed - exit thread
return
to_poll = ()
while True:
now = time.time()
wait_time = 999
for mobj in modules:
pinfo = mobj.pollInfo
wait_time = min(pinfo.last_main + pinfo.interval - now, wait_time,
pinfo.last_slow + mobj.slowinterval - now)
if wait_time > 0 and not to_poll:
# nothing to do
self.triggerPoll.wait(wait_time)
self.triggerPoll.clear()
continue
# call doPoll of all modules where due
for mobj in modules:
pinfo = mobj.pollInfo
if now > pinfo.last_main + pinfo.interval:
try:
pinfo.last_main = (now // pinfo.interval) * pinfo.interval
except ZeroDivisionError:
pinfo.last_main = now
mobj.callPollFunc(mobj.doPoll)
now = time.time()
# find ONE due slow poll and call it
loop = True
while loop: # loops max. 2 times, when to_poll is at end
for mobj, rfunc, pobj in to_poll:
if now > pobj.timestamp + mobj.slowinterval * 0.5:
mobj.callPollFunc(rfunc)
loop = False # one poll done
break
else:
to_poll = []
# collect due slow polls
for mobj in modules:
pinfo = mobj.pollInfo
if now > pinfo.last_slow + mobj.slowinterval:
to_poll.extend(pinfo.polled_parameters)
pinfo.last_slow = (now // mobj.slowinterval) * mobj.slowinterval
if to_poll:
to_poll = iter(to_poll)
else:
loop = False # no slow polls ready
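# Worked example of the scheduling above (illustrative numbers): with
# pollinterval = 5 and slowinterval = 15, doPoll() of each module runs about
# every 5 s (last_main is snapped to a multiple of the interval), while the
# read_<param> functions collected in polled_parameters are refreshed roughly
# once per 15 s, at most one slow poll per loop pass, so hardware access is
# spread out instead of bursting all reads at once.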
def writeInitParams(self):
"""write values for parameters with configured values
- does proper error handling
Called at the beginning of the poller thread and for writing persistent values.
"""
for pname in list(self.writeDict):
value = self.writeDict.pop(pname, Done)
# in the meantime, a poller or handler might already have done it
if value is not Done:
wfunc = getattr(self, 'write_' + pname, None)
if wfunc is None:
setattr(self, pname, value)
else:
try:
self.log.debug('initialize parameter %s', pname)
wfunc(value)
except SECoPError as e:
if e.silent:
self.log.debug('%s: %s', pname, str(e))
else:
self.log.error('%s: %s', pname, str(e))
except Exception:
self.log.error(formatException())
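# Illustration of how writeDict is filled and drained (hedged sketch, using
# 'ramp' as an example parameter name): if the cfg gives ramp=12 for a
# parameter that has a write_ramp method, __init__ above stores
# writeDict['ramp'] = 12; the poll thread then calls writeInitParams() once at
# startup, which calls self.write_ramp(12). Parameters without a
# write_<param> method are simply assigned via setattr.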
def setRemoteLogging(self, conn, level):
if self.remoteLogHandler is None:
for handler in self.log.handlers:
if isinstance(handler, RemoteLogHandler):
self.remoteLogHandler = handler
break
else:
raise ValueError('remote handler not found')
self.remoteLogHandler.set_conn_level(self, conn, level)
def checkLimits(self, value, pname='target'):
"""check for limits
:param value: the value to be checked for <pname>_min <= value <= <pname>_max
:param pname: parameter name, default is 'target'
:raises RangeError: in case the value is not valid
This method is called automatically and therefore rarely needs to be
called by the programmer. It might be used in a check_<param> method
when no automatic super call is desired.
"""
try:
min_, max_ = getattr(self, pname + '_limits')
if not min_ <= value <= max_:
raise RangeError(f'{pname} outside {pname}_limits')
return
except AttributeError:
pass
min_ = getattr(self, pname + '_min', float('-inf'))
max_ = getattr(self, pname + '_max', float('inf'))
if min_ > max_:
raise RangeError(f'invalid limits: {pname}_min > {pname}_max')
if value < min_:
raise RangeError(f'{pname} below {pname}_min')
if value > max_:
raise RangeError(f'{pname} above {pname}_max')
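# Minimal usage sketch (hypothetical Drivable subclass; parameter names follow
# the <pname>_min / <pname>_max convention checked above):
#
#     class Oven(Drivable):
#         target_min = Parameter('lower target limit', datatype=FloatRange(unit='K'),
#                                readonly=False, default=0)
#         target_max = Parameter('upper target limit', datatype=FloatRange(unit='K'),
#                                readonly=False, default=600)
#
#         def write_target(self, target):
#             self.checkLimits(target)   # raises RangeError outside 0 .. 600
#             return target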
class Readable(Module):
@ -107,18 +925,13 @@ class Communicator(HasComlog, Module):
"""
raise NotImplementedError()
SECoP_BASE_CLASSES = {Readable, Writable, Drivable, Communicator}
class Attached(Property):
"""a special property, defining an attached module
Assign a module name to this property in the cfg file,
and the server will create an attribute holding that module object.
When mandatory is set to False, and there is no value or an empty string
given in the config file, the value of the attribute will be None.
"""
def __init__(self, basecls=Module, description='attached module', mandatory=True):
self.basecls = basecls
@ -127,20 +940,13 @@ class Attached(Property):
def __get__(self, obj, owner):
if obj is None:
return self
modobj = obj.attachedModules.get(self.name)
if not modobj:
modulename = super().__get__(obj, owner)
if not modulename:
return None # happens when mandatory=False and modulename is not given
modobj = obj.secNode.get_module(modulename)
if not modobj:
raise ConfigError(f'attached module {self.name}={modulename!r} '
f'does not exist')
if self.name not in obj.attachedModules:
modobj = obj.DISPATCHER.get_module(super().__get__(obj, owner))
if not isinstance(modobj, self.basecls):
raise ConfigError(f'attached module {self.name}={modobj.name!r} '
f'must inherit from {self.basecls.__qualname__!r}')
raise ConfigError(f'attached module {self.name}={modobj.name!r} '\
f'must inherit from {self.basecls.__qualname__!r}')
obj.attachedModules[self.name] = modobj
return modobj
return obj.attachedModules.get(self.name) # return None if not given
def copy(self):
return Attached(self.basecls, self.description, self.mandatory)
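# Minimal usage sketch (hypothetical module class; the cfg entry for it would
# assign the name of another module, e.g. heatswitch='hsw'):
#
#     class Magnet(Drivable):
#         heatswitch = Attached(description='switch operated while ramping')
#
#         def write_target(self, target):
#             # self.heatswitch is the attached module object, not its name
#             self.heatswitch.write_target('on')
#             return target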

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under
@ -24,12 +25,12 @@
import inspect
from frappy.datatypes import ArrayOf, BoolType, CommandType, DataType, \
DataTypeType, EnumType, FloatRange, NoneOr, OrType, StringType, StructOf, \
TextType, TupleOf, ValueType
from frappy.errors import BadValueError, ProgrammingError, WrongTypeError
from frappy.lib import generalConfig
from frappy.datatypes import BoolType, CommandType, DataType, \
DataTypeType, EnumType, NoneOr, OrType, FloatRange, \
StringType, StructOf, TextType, TupleOf, ValueType, ArrayOf
from frappy.errors import BadValueError, WrongTypeError, ProgrammingError
from frappy.properties import HasProperties, Property
from frappy.lib import generalConfig
generalConfig.set_default('tolerate_poll_property', False)
generalConfig.set_default('omit_unchanged_within', 0.1)
@ -427,18 +428,6 @@ class Command(Accessible):
def __call__(self, func):
"""called when used as decorator"""
if isinstance(self.argument, StructOf):
# automatically set optional struct members
sig = inspect.signature(func)
params = set(sig.parameters.keys())
params.discard('self')
members = set(self.argument.members)
if params != members:
raise ProgrammingError(f'Command {func.__name__}: Function'
f' argument names do not match struct'
f' members!: {params} != {members}')
self.argument.optional = [p for p,v in sig.parameters.items()
if v.default is not inspect.Parameter.empty]
if 'description' not in self.propertyValues and func.__doc__:
self.description = inspect.cleandoc(func.__doc__)
self.ownProperties['description'] = self.description
@ -499,7 +488,7 @@ class Command(Accessible):
"""perform function call
:param module_obj: the module on which the command is to be executed
:param argument: the argument from the do command (transported value!)
:param argument: the argument from the do command
:returns: the return value converted to the result type
- when the argument type is TupleOf, the function is called with multiple arguments
@ -509,15 +498,6 @@ class Command(Accessible):
# pylint: disable=unnecessary-dunder-call
func = self.__get__(module_obj)
if self.argument:
if argument is None:
raise WrongTypeError(
f'{module_obj.__class__.__name__}.{self.name} needs an'
f' argument of type {self.argument}!'
)
# convert transported value to internal value
argument = self.argument.import_value(argument)
# verify range
self.argument.validate(argument)
if isinstance(self.argument, TupleOf):
res = func(*argument)
elif isinstance(self.argument, StructOf):

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under
@ -77,10 +78,11 @@ class PersistentMixin(Module):
super().__init__(name, logger, cfgdict, srv)
persistentdir = os.path.join(generalConfig.logdir, 'persistent')
os.makedirs(persistentdir, exist_ok=True)
self.persistentFile = os.path.join(persistentdir, f'{self.secNode.equipment_id}.{self.name}.json')
self.persistentFile = os.path.join(persistentdir, f'{self.DISPATCHER.equipment_id}.{self.name}.json')
self.initData = {} # "factory" settings
loaded = self.loadPersistentData()
for pname, pobj in self.parameters.items():
for pname in self.parameters:
pobj = self.parameters[pname]
flag = getattr(pobj, 'persistent', False)
if flag:
if flag == 'auto':

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under
@ -69,8 +70,8 @@ class MainLogger:
self.log = None
self.console_handler = None
mlzlog.setLoggerClass(mlzlog.MLZLogger)
mlzlog.log = mlzlog.MLZLogger('')
self.log = mlzlog.log.getChild('')
assert self.log is None
self.log = mlzlog.log = mlzlog.MLZLogger('')
self.log.setLevel(mlzlog.DEBUG)
self.log.addHandler(mlzlog.ColoredConsoleHandler())
self.log.handlers[0].setLevel(LOG_LEVELS['comlog'])
@ -81,12 +82,20 @@ class Dispatcher(dispatcher.Dispatcher):
super().__init__(name, log, options, srv)
self.log = srv.log # overwrite child logger
def announce_update(self, moduleobj, pobj):
def announce_update(self, modulename, pname, pobj):
if pobj.readerror:
value = repr(pobj.readerror)
else:
value = pobj.value
moduleobj.log.info('%s %r', pobj.name, value)
logobj = self._modules.get(modulename, self)
# self.log.info('%s:%s %r', modulename, pname, value)
logobj.log.info('%s %r', pname, value)
def register_module(self, moduleobj, modulename, export=True):
self.log.info('registering %s', modulename)
super().register_module(moduleobj, modulename, export)
setattr(main, modulename, moduleobj)
self.get_module(modulename)
logger = MainLogger()
@ -110,10 +119,6 @@ class Playground(Server):
merged_cfg.pop('node', None)
self.module_cfg = merged_cfg
self._processCfg()
for modulename, moduleobj in self.secnode.modules.items():
cls = type(moduleobj).__bases__[0]
moduleobj.log.info('created as %s.%s', cls.__module__, cls.__name__)
setattr(main, modulename, moduleobj)
play = Playground()

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under
@ -26,11 +27,12 @@ import inspect
from frappy.errors import BadValueError, ConfigError, ProgrammingError
from frappy.lib import UniqueObject
from frappy.lib.py35compat import Object
UNSET = UniqueObject('undefined value') #: an unset value, not even None
class HasDescriptors:
class HasDescriptors(Object):
@classmethod
def __init_subclass__(cls):
# when migrating old style declarations, sometimes the trailing comma is not removed

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under
@ -17,7 +18,6 @@
# Module authors:
# Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
# Markus Zolliker <markus.zolliker@psi.ch>
# Alexander Zaft <a.zaft@fz-juelich.de>
#
# *****************************************************************************
"""Dispatcher for SECoP Messages
@ -29,18 +29,28 @@ Interface to the service offering part:
on the connectionobj or on activated connections
- 'add_connection(connectionobj)' registers new connection
- 'remove_connection(connectionobj)' removes a no longer functional connection
Interface to the modules:
- add_module(modulename, moduleobj, export=True) registers a new module under the
given name, may also register it for exporting (making accessible)
- get_module(modulename) returns the requested module or None
- remove_module(modulename_or_obj): removes the module (during shutdown)
"""
import threading
import traceback
from collections import OrderedDict
from time import time as currenttime
from frappy.errors import NoSuchCommandError, NoSuchModuleError, \
NoSuchParameterError, ProtocolError, ReadOnlyError
NoSuchParameterError, ProtocolError, ReadOnlyError, ConfigError
from frappy.params import Parameter
from frappy.protocol.messages import COMMANDREPLY, DESCRIPTIONREPLY, \
DISABLEEVENTSREPLY, ENABLEEVENTSREPLY, ERRORPREFIX, EVENTREPLY, \
HEARTBEATREPLY, IDENTREPLY, IDENTREQUEST, LOG_EVENT, LOGGING_REPLY, \
READREPLY, WRITEREPLY
HEARTBEATREPLY, IDENTREPLY, IDENTREQUEST, READREPLY, WRITEREPLY, \
LOGGING_REPLY, LOG_EVENT
from frappy.lib import get_class
def make_update(modulename, pobj):
@ -61,7 +71,10 @@ class Dispatcher:
self.nodeprops[k] = options.pop(k)
self.log = logger
self.secnode = srv.secnode
# map ALL modulename -> moduleobj
self._modules = {}
# list of EXPORTED modules
self._export = []
# list all connections
self._connections = []
# active (i.e. broadcast-receiving) connections
@ -75,6 +88,11 @@ class Dispatcher:
self.shutdown = srv.shutdown
# handle to server
self.srv = srv
# set of modules that failed creation
self.failed_modules = set()
# list of errors that occurred during initialization
self.errors = []
self.traceback_counter = 0
def broadcast_event(self, msg, reallyall=False):
"""broadcasts a msg to all active connections
@ -93,10 +111,10 @@ class Dispatcher:
for conn in listeners:
conn.send_reply(msg)
def announce_update(self, moduleobj, pobj):
def announce_update(self, modulename, pname, pobj):
"""called by modules param setters to notify subscribers of new values
"""
self.broadcast_event(make_update(moduleobj.name, pobj))
self.broadcast_event(make_update(modulename, pobj))
def subscribe(self, conn, eventname):
self._subscriptions.setdefault(eventname, set()).add(conn)
@ -130,10 +148,163 @@ class Dispatcher:
self._connections.remove(conn)
self.reset_connection(conn)
def register_module(self, moduleobj, modulename, export=True):
self.log.debug('registering module %r as %s (export=%r)',
moduleobj, modulename, export)
self._modules[modulename] = moduleobj
if export:
self._export.append(modulename)
def get_module(self, modulename):
""" Returns a fully initialized module. Or None, if something went
wrong during instatiating/initializing the module."""
modobj = self.get_module_instance(modulename)
if modobj is None:
return None
if modobj._isinitialized:
return modobj
# also call earlyInit on the modules
self.log.debug('initializing module %r', modulename)
try:
modobj.earlyInit()
if not modobj.earlyInitDone:
self.errors.append(f'{modobj.earlyInit.__qualname__} was not called, probably missing super call')
modobj.initModule()
if not modobj.initModuleDone:
self.errors.append(f'{modobj.initModule.__qualname__} was not called, probably missing super call')
except Exception as e:
if self.traceback_counter == 0:
self.log.exception(traceback.format_exc())
self.traceback_counter += 1
self.errors.append(f'error initializing {modulename}: {e!r}')
modobj._isinitialized = True
self.log.debug('initialized module %r', modulename)
return modobj
def get_module_instance(self, modulename):
""" Returns the module in its current initialization state or creates a
new uninitialized module to return.
When creating a new module, srv.module_cfg is accessed to get the
module's configuration.
"""
if modulename in self._modules:
return self._modules[modulename]
if modulename in list(self._modules.values()):
# it's actually already the module object
return modulename
# create module from srv.module_cfg, store and return
self.log.debug('attempting to create module %r', modulename)
opts = self.srv.module_cfg.get(modulename, None)
if opts is None:
raise NoSuchModuleError(f'Module {modulename!r} does not exist on this SEC-Node!')
pymodule = None
try: # pylint: disable=no-else-return
classname = opts.pop('cls')
if isinstance(classname, str):
pymodule = classname.rpartition('.')[0]
if pymodule in self.failed_modules:
# creation has failed already once, do not try again
return None
cls = get_class(classname)
else:
pymodule = classname.__module__
if pymodule in self.failed_modules:
# creation has failed already once, do not try again
return None
cls = classname
except Exception as e:
if str(e) == 'no such class':
self.errors.append(f'{classname} not found')
else:
self.failed_modules.add(pymodule)
if self.traceback_counter == 0:
self.log.exception(traceback.format_exc())
self.traceback_counter += 1
self.errors.append(f'error importing {classname}')
return None
else:
try:
modobj = cls(modulename, self.log.getChild(modulename), opts, self.srv)
except ConfigError as e:
self.errors.append(f'error creating module {modulename}:')
for errtxt in e.args[0] if isinstance(e.args[0], list) else [e.args[0]]:
self.errors.append(' ' + errtxt)
modobj = None
except Exception as e:
if self.traceback_counter == 0:
self.log.exception(traceback.format_exc())
self.traceback_counter += 1
self.errors.append(f'error creating {modulename}')
modobj = None
if modobj:
self.register_module(modobj, modulename, modobj.export)
self.srv.modules[modulename] = modobj # IS HERE THE CORRECT PLACE?
return modobj
def remove_module(self, modulename_or_obj):
moduleobj = self.get_module(modulename_or_obj)
modulename = moduleobj.name
if modulename in self._export:
self._export.remove(modulename)
self._modules.pop(modulename)
self._subscriptions.pop(modulename, None)
for k in [kk for kk in self._subscriptions if kk.startswith(f'{modulename}:')]:
self._subscriptions.pop(k, None)
def list_module_names(self):
# return a copy of our list
return self._export[:]
def export_accessibles(self, modulename):
self.log.debug('export_accessibles(%r)', modulename)
if modulename in self._export:
# omit export=False params!
res = OrderedDict()
for aobj in self.get_module(modulename).accessibles.values():
if aobj.export:
res[aobj.export] = aobj.for_export()
self.log.debug('list accessibles for module %s -> %r',
modulename, res)
return res
self.log.debug('-> module is not to be exported!')
return OrderedDict()
def get_descriptive_data(self, specifier):
"""returns a python object which upon serialisation results in the descriptive data"""
specifier = specifier or ''
modules = {}
result = {'modules': modules}
for modulename in self._export:
module = self.get_module(modulename)
if not module.export:
continue
# some of these need rework !
mod_desc = {'accessibles': self.export_accessibles(modulename)}
mod_desc.update(module.exportProperties())
mod_desc.pop('export', False)
modules[modulename] = mod_desc
modname, _, pname = specifier.partition(':')
if modname in modules: # extension to SECoP standard: description of a single module
result = modules[modname]
if pname in result['accessibles']: # extension to SECoP standard: description of a single accessible
# command is also accepted
result = result['accessibles'][pname]
elif pname:
raise NoSuchParameterError(f'Module {modname!r} has no parameter {pname!r}')
elif not modname or modname == '.':
result['equipment_id'] = self.equipment_id
result['firmware'] = 'FRAPPY - The Python Framework for SECoP'
result['version'] = '2021.02'
result.update(self.nodeprops)
else:
raise NoSuchModuleError(f'Module {modname!r} does not exist')
return result
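# Sketch of the returned structure (illustrative, heavily abbreviated):
#
#     {'modules': {'temp': {'accessibles': {'value': {...}, 'status': {...}},
#                           'description': '...'}},
#      'equipment_id': 'ex.frappy.demo',
#      'firmware': 'FRAPPY - The Python Framework for SECoP',
#      'version': '2021.02'}
#
# With a specifier such as 'temp' or 'temp:value', only the corresponding
# sub-dictionary is returned (the single-module/single-accessible extension
# to the SECoP standard mentioned above).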
def _execute_command(self, modulename, exportedname, argument=None):
""" Execute a command. Importing the value is done in 'do' for nicer
error messages."""
moduleobj = self.secnode.get_module(modulename)
moduleobj = self.get_module(modulename)
if moduleobj is None:
raise NoSuchModuleError(f'Module {modulename!r} does not exist')
@ -141,6 +312,9 @@ class Dispatcher:
cobj = moduleobj.commands.get(cname)
if cobj is None:
raise NoSuchCommandError(f'Module {modulename!r} has no command {cname or exportedname!r}')
if cobj.argument:
argument = cobj.argument.import_value(argument)
# now call func
# note: exceptions are handled in handle_request, not here!
result = cobj.do(moduleobj, argument)
@ -149,7 +323,7 @@ class Dispatcher:
return result, {'t': currenttime()}
def _setParameterValue(self, modulename, exportedname, value):
moduleobj = self.secnode.get_module(modulename)
moduleobj = self.get_module(modulename)
if moduleobj is None:
raise NoSuchModuleError(f'Module {modulename!r} does not exist')
@ -162,9 +336,7 @@ class Dispatcher:
if pobj.readonly:
raise ReadOnlyError(f"Parameter {modulename}:{pname} can not be changed remotely")
# convert transported value to internal value
value = pobj.datatype.import_value(value)
# verify range
# validate!
value = pobj.datatype.validate(value, previous=pobj.value)
# note: exceptions are handled in handle_request, not here!
getattr(moduleobj, 'write_' + pname)(value)
@ -172,7 +344,7 @@ class Dispatcher:
return pobj.export_value(), {'t': pobj.timestamp} if pobj.timestamp else {}
def _getParameterValue(self, modulename, exportedname):
moduleobj = self.secnode.get_module(modulename)
moduleobj = self.get_module(modulename)
if moduleobj is None:
raise NoSuchModuleError(f'Module {modulename!r} does not exist')
@ -229,7 +401,7 @@ class Dispatcher:
return (IDENTREPLY, None, None)
def handle_describe(self, conn, specifier, data):
return (DESCRIPTIONREPLY, specifier or '.', self.secnode.get_descriptive_data(specifier))
return (DESCRIPTIONREPLY, specifier or '.', self.get_descriptive_data(specifier))
def handle_read(self, conn, specifier, data):
if data:
@ -268,9 +440,9 @@ class Dispatcher:
modulename, exportedname = specifier, None
if ':' in specifier:
modulename, exportedname = specifier.split(':', 1)
if modulename not in self.secnode.export:
if modulename not in self._export:
raise NoSuchModuleError(f'Module {modulename!r} does not exist')
moduleobj = self.secnode.get_module(modulename)
moduleobj = self.get_module(modulename)
if exportedname is not None:
pname = moduleobj.accessiblename2attr.get(exportedname, True)
if pname and pname not in moduleobj.accessibles:
@ -284,12 +456,12 @@ class Dispatcher:
else:
# activate all modules
self._active_connections.add(conn)
modules = [(m, None) for m in self.secnode.export]
modules = [(m, None) for m in self._export]
# send updates for all subscribed values.
# note: The initial poll already happend before the server is active
for modulename, pname in modules:
moduleobj = self.secnode.modules.get(modulename, None)
moduleobj = self._modules.get(modulename, None)
if pname:
conn.send_reply(make_update(modulename, moduleobj.parameters[pname]))
continue
@ -313,13 +485,16 @@ class Dispatcher:
conn.send_reply((LOG_EVENT, f'{modname}:{level}', msg))
def set_all_log_levels(self, conn, level):
for modobj in self.secnode.modules.values():
modobj.setRemoteLogging(conn, level, self.send_log_msg)
for modobj in self._modules.values():
modobj.setRemoteLogging(conn, level)
def handle_logging(self, conn, specifier, level):
if specifier == '#':
self.log.handlers[1].setLevel(int(level))
return LOGGING_REPLY, specifier, level
if specifier and specifier != '.':
modobj = self.secnode.modules[specifier]
modobj.setRemoteLogging(conn, level, self.send_log_msg)
modobj = self._modules[specifier]
modobj.setRemoteLogging(conn, level)
else:
self.set_all_log_levels(conn, level)
return LOGGING_REPLY, specifier, level

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -143,6 +143,7 @@ class TCPRequestHandler(socketserver.BaseRequestHandler):
if not data:
self.log.error('should not reply empty data!')
return
self.log.debug('send %r', data)
outdata = encode_msg_frame(*data)
with self.send_lock:
if self.running:
@ -231,6 +232,13 @@ class TCPServer(DualStackTCPServer):
self.log.warning('tried again %d times after "Address already in use"', ntry)
self.log.info("TCPServer initiated")
# py35 compatibility
if not hasattr(socketserver.ThreadingTCPServer, '__exit__'):
def __enter__(self):
return self
def __exit__(self, *args):
self.server_close()
def format_address(addr):
if len(addr) == 2:

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,4 +1,5 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# *****************************************************************************
# This program is free software; you can redistribute it and/or modify it under
# the terms of the GNU General Public License as published by the Free Software

View File

@ -1,283 +0,0 @@
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under
# the terms of the GNU General Public License as published by the Free Software
# Foundation; either version 2 of the License, or (at your option) any later
# version.
#
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more
# details.
#
# You should have received a copy of the GNU General Public License along with
# this program; if not, write to the Free Software Foundation, Inc.,
# 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
#
# Module authors:
# Alexander Zaft <a.zaft@fz-juelich.de>
#
# *****************************************************************************
import traceback
from collections import OrderedDict
from frappy.dynamic import Pinata
from frappy.errors import ConfigError, NoSuchModuleError, NoSuchParameterError
from frappy.lib import get_class
class SecNode:
"""Managing the modules.
Interface to the modules:
- add_module(module, modulename)
- get_module(modulename) returns the requested module or None if there is
no suitable configuration on the server
"""
def __init__(self, name, logger, options, srv):
self.equipment_id = options.pop('equipment_id', name)
self.nodeprops = {}
for k in list(options):
self.nodeprops[k] = options.pop(k)
# map ALL modulename -> moduleobj
self.modules = {}
# list of EXPORTED modules
self.export = []
self.log = logger
self.srv = srv
# set of modules that failed creation
self.failed_modules = set()
# list of errors that occurred during initialization
self.errors = []
self.traceback_counter = 0
self.name = name
def get_module(self, modulename):
""" Returns a fully initialized module. Or None, if something went
wrong during instatiating/initializing the module."""
modobj = self.get_module_instance(modulename)
if modobj is None:
return None
if modobj._isinitialized:
return modobj
# also call earlyInit on the modules
self.log.debug('initializing module %r', modulename)
try:
modobj.earlyInit()
if not modobj.earlyInitDone:
self.errors.append(f'{modobj.earlyInit.__qualname__} was not '
f'called, probably missing super call')
modobj.initModule()
if not modobj.initModuleDone:
self.errors.append(f'{modobj.initModule.__qualname__} was not '
f'called, probably missing super call')
except Exception as e:
if self.traceback_counter == 0:
self.log.exception(traceback.format_exc())
self.traceback_counter += 1
self.errors.append(f'error initializing {modulename}: {e!r}')
modobj._isinitialized = True
self.log.debug('initialized module %r', modulename)
return modobj
def get_module_instance(self, modulename):
""" Returns the module in its current initialization state or creates a
new uninitialized module to return.
When creating a new module, srv.module_cfg is accessed to get the
module's configuration.
"""
if modulename in self.modules:
return self.modules[modulename]
if modulename in list(self.modules.values()):
# it's actually already the module object
return modulename
# create module from srv.module_cfg, store and return
self.log.debug('attempting to create module %r', modulename)
opts = self.srv.module_cfg.get(modulename, None)
if opts is None:
raise NoSuchModuleError(f'Module {modulename!r} does not exist on '
f'this SEC-Node!')
opts = dict(opts)
pymodule = None
try: # pylint: disable=no-else-return
classname = opts.pop('cls')
if isinstance(classname, str):
pymodule = classname.rpartition('.')[0]
if pymodule in self.failed_modules:
# creation has failed already once, do not try again
return None
cls = get_class(classname)
else:
pymodule = classname.__module__
if pymodule in self.failed_modules:
# creation has failed already once, do not try again
return None
cls = classname
except Exception as e:
if str(e) == 'no such class':
self.errors.append(f'{classname} not found')
else:
self.failed_modules.add(pymodule)
if self.traceback_counter == 0:
self.log.exception(traceback.format_exc())
self.traceback_counter += 1
self.errors.append(f'error importing {classname}')
return None
else:
try:
modobj = cls(modulename, self.log.parent.getChild(modulename),
opts, self.srv)
except ConfigError as e:
self.errors.append(f'error creating module {modulename}:')
for errtxt in e.args[0] if isinstance(e.args[0], list) else [e.args[0]]:
self.errors.append(' ' + errtxt)
modobj = None
except Exception as e:
if self.traceback_counter == 0:
self.log.exception(traceback.format_exc())
self.traceback_counter += 1
self.errors.append(f'error creating {modulename}')
modobj = None
if modobj:
self.add_module(modobj, modulename)
return modobj
def create_modules(self):
self.modules = OrderedDict()
# create and initialize modules
todos = list(self.srv.module_cfg.items())
while todos:
modname, options = todos.pop(0)
if modname in self.modules:
# already created via Attached
continue
# For Pinata modules: we need to access this in self.get_module
self.srv.module_cfg[modname] = options
modobj = self.get_module_instance(modname) # lazy
if modobj is None:
self.log.debug('Module %s returned None', modname)
continue
self.modules[modname] = modobj
if isinstance(modobj, Pinata):
# scan for dynamic devices
pinata = self.get_module(modname)
pinata_modules = list(pinata.scanModules())
for name, _cfg in pinata_modules:
if name in self.srv.module_cfg:
self.log.error('Module %s, from pinata %s, already '
'exists in config file!', name, modname)
self.log.info('Pinata %s found %d modules',
modname, len(pinata_modules))
todos.extend(pinata_modules)
def export_accessibles(self, modulename):
self.log.debug('export_accessibles(%r)', modulename)
if modulename in self.export:
# omit export=False params!
res = OrderedDict()
for aobj in self.get_module(modulename).accessibles.values():
if aobj.export:
res[aobj.export] = aobj.for_export()
self.log.debug('list accessibles for module %s -> %r',
modulename, res)
return res
self.log.debug('-> module is not to be exported!')
return OrderedDict()
def get_descriptive_data(self, specifier):
"""returns a python object which upon serialisation results in the
descriptive data"""
specifier = specifier or ''
modules = {}
result = {'modules': modules}
for modulename in self.export:
module = self.get_module(modulename)
if not module.export:
continue
# some of these need rework !
mod_desc = {'accessibles': self.export_accessibles(modulename)}
mod_desc.update(module.exportProperties())
mod_desc.pop('export', False)
modules[modulename] = mod_desc
modname, _, pname = specifier.partition(':')
if modname in modules: # extension to SECoP standard: description of a single module
result = modules[modname]
if pname in result['accessibles']: # extension to SECoP standard: description of a single accessible
# command is also accepted
result = result['accessibles'][pname]
elif pname:
raise NoSuchParameterError(f'Module {modname!r} '
f'has no parameter {pname!r}')
elif not modname or modname == '.':
result['equipment_id'] = self.equipment_id
result['firmware'] = 'FRAPPY - The Python Framework for SECoP'
result['version'] = '2021.02'
result.update(self.nodeprops)
else:
raise NoSuchModuleError(f'Module {modname!r} does not exist')
return result
def add_module(self, module, modulename):
"""Adds a named module object to this SecNode."""
self.modules[modulename] = module
if module.export:
self.export.append(modulename)
# def remove_module(self, modulename_or_obj):
# moduleobj = self.get_module(modulename_or_obj)
# modulename = moduleobj.name
# if modulename in self.export:
# self.export.remove(modulename)
# self.modules.pop(modulename)
# self._subscriptions.pop(modulename, None)
# for k in [kk for kk in self._subscriptions if kk.startswith(f'{modulename}:')]:
# self._subscriptions.pop(k, None)
def shutdown_modules(self):
"""Call 'shutdownModule' for all modules."""
for name in self._getSortedModules():
self.modules[name].shutdownModule()
def _getSortedModules(self):
"""Sort modules topologically by inverse dependency.
Example: if there is an IO device A and module B depends on it, then
the result will be [B, A].
Right now, if the dependency graph is not a DAG, we give up and return
the unvisited nodes to be dismantled at the end.
Taken from Introduction to Algorithms [CLRS].
"""
def go(name):
if name in done: # already fully processed
return True
if name in visited:
visited.add(name)
return False # cycle in dependencies -> fail
visited.add(name)
if name in unmarked:
unmarked.remove(name)
for module in self.modules[name].attachedModules.values():
res = go(module.name)
if not res:
return False
visited.remove(name)
done.add(name)
l.append(name)
return True
unmarked = set(self.modules.keys()) # unvisited nodes
visited = set() # visited in DFS, but not completed
done = set()
l = [] # list of sorted modules
while unmarked:
if not go(unmarked.pop()):
self.log.error('cyclical dependency between modules!')
return l[::-1] + list(visited) + list(unmarked)
return l[::-1]
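# Worked example (hypothetical module names): with modules io, heater
# (attached to io) and temp (attached to heater), the DFS builds
# l = ['io', 'heater', 'temp'], so the returned l[::-1] is
# ['temp', 'heater', 'io'] and shutdown_modules dismantles dependents
# before the IO they rely on.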

View File

@ -25,14 +25,14 @@
import os
import signal
import sys
import threading
from collections import OrderedDict
from frappy.config import load_config
from frappy.errors import ConfigError
from frappy.dynamic import Pinata
from frappy.lib import formatException, generalConfig, get_class, mkthread
from frappy.lib.multievent import MultiEvent
from frappy.params import PREDEFINED_ACCESSIBLES
from frappy.secnode import SecNode
try:
from daemon import DaemonContext
@ -71,18 +71,19 @@ class Server:
multiple cfg files, the interface is taken from the first cfg file
- testonly: test mode. tries to build all modules, but the server is not started
Config file:
Format: Example:
Node('<equipment_id>', Node('ex.frappy.demo',
<description>, 'short description\n\nlong descr.',
<main interface>, 'tcp://10769',
secondary=[ secondary=['ws://10770'], # optional
<interfaces>
],
) )
Mod('<module name>', Mod('temp',
<param config> value = Param(unit='K'),
) )
Format of cfg file (for now, both forms are accepted):
old form: new form:
[node <equipment id>] [NODE]
description=<descr> id=<equipment id>
description=<descr>
[interface tcp] [INTERFACE]
bindport=10769 uri=tcp://10769
bindto=0.0.0.0
[module temp] [temp]
ramp=12 ramp=12
...
"""
self._testonly = testonly
@ -107,13 +108,9 @@ class Server:
signal.signal(signal.SIGINT, self.signal_handler)
signal.signal(signal.SIGTERM, self.signal_handler)
def signal_handler(self, num, frame):
if hasattr(self, 'interfaces') and self.interfaces:
def signal_handler(self, _num, _frame):
if hasattr(self, 'interface') and self.interface:
self.shutdown()
else:
# TODO: we should probably clean up the already initialized modules
# when getting an interrupt while the server is starting
signal.default_int_handler(num, frame)
def start(self):
if not DaemonContext:
@ -155,44 +152,33 @@ class Server:
print(formatException(verbose=True))
raise
self.interfaces = []
iface_threads = []
interfaces_started = MultiEvent(default_timeout=1)#default_timeout=15)
lock = threading.Lock()
# TODO: check if only one interface of each type is open?
for interface in [self.node_cfg['interface']] + self.node_cfg.get(
'secondary', []
):
opts = {'uri': interface}
t = mkthread(
self._interfaceThread,
opts,
lock,
self.interfaces.append,
interfaces_started.get_trigger(),
)
iface_threads.append(t)
interfaces_started.wait()
self.log.info('startup done, handling transport messages')
if systemd:
systemd.daemon.notify("READY=1\nSTATUS=accepting requests")
self.log.info('Started %d interfaces' % len(self.interfaces))
# we wait here on the thread finishing, which means we got a
# signal to shut down or an exception was raised
# TODO: get the exception (and re-raise?)
for t in iface_threads:
opts = {'uri': self.node_cfg['interface']}
scheme, _, _ = opts['uri'].rpartition('://')
scheme = scheme or 'tcp'
cls = get_class(self.INTERFACES[scheme])
with cls(scheme, self.log.getChild(scheme), opts, self) as self.interface:
if opts:
raise ConfigError(self.unknown_options(cls, opts))
self.log.info('startup done, handling transport messages')
if systemd:
systemd.daemon.notify("READY=1\nSTATUS=accepting requests")
t = mkthread(self.interface.serve_forever)
# we wait here on the thread finishing, which means we got a
# signal to shut down or an exception was raised
# TODO: get the exception (and re-raise?)
t.join()
self.interface = None # fine due to the semantics of 'with'
# server_close() called by 'with'
self.log.info(f'stopped listening, cleaning up'
f' {len(self.secnode.modules)} modules')
f' {len(self.modules)} modules')
# if systemd:
# if self._restart:
# systemd.daemon.notify('RELOADING=1')
# else:
# systemd.daemon.notify('STOPPING=1')
self.secnode.shutdown_modules()
for name in self._getSortedModules():
self.modules[name].shutdownModule()
if self._restart:
self.restart_hook()
self.log.info('restarting')
@ -201,28 +187,11 @@ class Server:
def restart(self):
if not self._restart:
self._restart = True
for iface in self.interfaces:
iface.shutdown()
self.interface.shutdown()
def shutdown(self):
self._restart = False
for iface in self.interfaces:
iface.shutdown()
def _interfaceThread(self, opts, lock, if_cb, start_cb):
scheme, _, _ = opts['uri'].rpartition('://')
iface = opts['uri']
scheme = scheme or 'tcp'
cls = get_class(self.INTERFACES[scheme])
with cls(scheme, self.log.getChild(scheme), opts, self) as interface:
if opts:
raise ConfigError(self.unknown_options(cls, opts))
with lock:
if_cb(interface)
start_cb()
interface.serve_forever()
# server_close() called by 'with'
self.log.info(f'stopped {iface}')
self.interface.shutdown()
def _processCfg(self):
"""Processes the module configuration.
@ -236,27 +205,50 @@ class Server:
errors = []
opts = dict(self.node_cfg)
cls = get_class(opts.pop('cls'))
name = opts.pop('name', self._cfgfiles)
# TODO: opts not in both
self.secnode = SecNode(name, self.log.getChild('secnode'), opts, self)
self.dispatcher = cls(name, self.log.getChild('dispatcher'), opts, self)
self.dispatcher = cls(opts.pop('name', self._cfgfiles),
self.log.getChild('dispatcher'), opts, self)
if opts:
self.secnode.errors.append(self.unknown_options(cls, opts))
self.dispatcher.errors.append(self.unknown_options(cls, opts))
self.modules = OrderedDict()
# create and initialize modules
todos = list(self.module_cfg.items())
while todos:
modname, options = todos.pop(0)
if modname in self.modules:
# already created by Dispatcher (via Attached)
continue
# For Pinata modules: we need to access this in Dispatcher.get_module
self.module_cfg[modname] = dict(options)
modobj = self.dispatcher.get_module_instance(modname) # lazy
if modobj is None:
self.log.debug('Module %s returned None', modname)
continue
self.modules[modname] = modobj
if isinstance(modobj, Pinata):
# scan for dynamic devices
pinata = self.dispatcher.get_module(modname)
pinata_modules = list(pinata.scanModules())
for name, _cfg in pinata_modules:
if name in self.module_cfg:
self.log.error('Module %s, from pinata %s, already'
' exists in config file!', name, modname)
self.log.info('Pinata %s found %d modules', modname, len(pinata_modules))
todos.extend(pinata_modules)
self.secnode.create_modules()
# initialize all modules by getting them with Dispatcher.get_module,
# which is done in the get_descriptive data
# TODO: caching, to not make this extra work
self.secnode.get_descriptive_data('')
self.dispatcher.get_descriptive_data('')
# =========== All modules are initialized ===========
# all errors from initialization process
errors = self.secnode.errors
errors = self.dispatcher.errors
if not self._testonly:
start_events = MultiEvent(default_timeout=30)
for modname, modobj in self.secnode.modules.items():
for modname, modobj in self.modules.items():
# startModule must return either a timeout value or None (default 30 sec)
start_events.name = f'module {modname}'
modobj.startModule(start_events)
@ -283,8 +275,7 @@ class Server:
self.log.info('all modules started')
history_path = os.environ.get('FRAPPY_HISTORY')
if history_path:
from frappy_psi.historywriter import \
FrappyHistoryWriter # pylint: disable=import-outside-toplevel
from frappy_psi.historywriter import FrappyHistoryWriter # pylint: disable=import-outside-toplevel
writer = FrappyHistoryWriter(history_path, PREDEFINED_ACCESSIBLES.keys(), self.dispatcher)
# treat writer as a connection
self.dispatcher.add_connection(writer)
@ -297,3 +288,41 @@ class Server:
# history_path = os.environ.get('ALTERNATIVE_HISTORY')
# if history_path:
# from frappy_<xx>.historywriter import ... etc.
def _getSortedModules(self):
"""Sort modules topologically by inverse dependency.
Example: if there is an IO device A and module B depends on it, then
the result will be [B, A].
Right now, if the dependency graph is not a DAG, we give up and return
the unvisited nodes to be dismantled at the end.
Taken from Introduction to Algorithms [CLRS].
"""
def go(name):
if name in done: # already fully processed
return True
if name in visited:
visited.add(name)
return False # cycle in dependencies -> fail
visited.add(name)
if name in unmarked:
unmarked.remove(name)
for module in self.modules[name].attachedModules.values():
res = go(module.name)
if not res:
return False
visited.remove(name)
done.add(name)
l.append(name)
return True
unmarked = set(self.modules.keys()) # unvisited nodes
visited = set() # visited in DFS, but not completed
done = set()
l = [] # list of sorted modules
while unmarked:
if not go(unmarked.pop()):
self.log.error('cyclical dependency between modules!')
return l[::-1] + list(visited) + list(unmarked)
return l[::-1]

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,4 +1,5 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# This program is free software; you can redistribute it and/or modify it under
# the terms of the GNU General Public License as published by the Free Software

View File

@ -1,4 +1,5 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# *****************************************************************************
# This program is free software; you can redistribute it and/or modify it under
# the terms of the GNU General Public License as published by the Free Software

View File

@ -1,4 +1,5 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# *****************************************************************************
# This program is free software; you can redistribute it and/or modify it under
# the terms of the GNU General Public License as published by the Free Software

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# This program is free software; you can redistribute it and/or modify it under
# the terms of the GNU General Public License as published by the Free Software
@ -15,7 +16,6 @@
#
# Module authors:
# Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
# Alexander Zaft <a.zaft@fz-juelich.de>
#
# *****************************************************************************
"""testing devices"""
@ -28,8 +28,9 @@ import time
from frappy.datatypes import ArrayOf, BoolType, EnumType, \
FloatRange, IntRange, StringType, StructOf, TupleOf
from frappy.lib.enum import Enum
from frappy.modules import Drivable, Readable, Attached
from frappy.modules import Drivable
from frappy.modules import Parameter as SECoP_Parameter
from frappy.modules import Readable
from frappy.properties import Property
@ -118,7 +119,9 @@ class MagneticField(Drivable):
default=1, datatype=EnumType(persistent=1, hold=0),
readonly=False,
)
heatswitch = Attached(Switch, description='name of heat switch device')
heatswitch = Parameter('name of heat switch device',
datatype=StringType(), export=False,
)
Status = Enum(Drivable.Status, PERSIST=PERSIST, PREPARE=301, RAMPING=302, FINISH=303)
@ -127,6 +130,7 @@ class MagneticField(Drivable):
def initModule(self):
super().initModule()
self._state = Enum('state', idle=1, switch_on=2, switch_off=3, ramp=4).idle
self._heatswitch = self.DISPATCHER.get_module(self.heatswitch)
_thread = threading.Thread(target=self._thread)
_thread.daemon = True
_thread.start()
@ -161,10 +165,10 @@ class MagneticField(Drivable):
if self.target != self.value:
self.log.debug('got new target -> switching heater on')
self._state = self._state.enum.switch_on
self.heatswitch.write_target('on')
self._heatswitch.write_target('on')
if self._state == self._state.enum.switch_on:
# wait until switch is on
if self.heatswitch.read_value() == 'on':
if self._heatswitch.read_value() == 'on':
self.log.debug('heatswitch is on -> ramp to %.3f',
self.target)
self._state = self._state.enum.ramp
@ -174,7 +178,7 @@ class MagneticField(Drivable):
if self.mode:
self.log.debug('at field -> switching heater off')
self._state = self._state.enum.switch_off
self.heatswitch.write_target('off')
self._heatswitch.write_target('off')
else:
self.log.debug('at field -> hold')
self._state = self._state.enum.idle
@ -185,7 +189,7 @@ class MagneticField(Drivable):
self.value += step
if self._state == self._state.enum.switch_off:
# wait until switch is off
if self.heatswitch.read_value() == 'off':
if self._heatswitch.read_value() == 'off':
self.log.debug('heatswitch is off at %.3f', self.value)
self._state = self._state.enum.idle
self.read_status() # update async
@ -265,8 +269,12 @@ class Label(Readable):
system = Parameter("Name of the magnet system",
datatype=StringType(), export=False,
)
mf = Attached(MagneticField, description="subdevice for magnet status")
ts = Attached(SampleTemp, description="subdevice for sample temp")
subdev_mf = Parameter("name of subdevice for magnet status",
datatype=StringType(), export=False,
)
subdev_ts = Parameter("name of subdevice for sample temp",
datatype=StringType(), export=False,
)
value = Parameter("final value of label string", default='',
datatype=StringType(),
)
@ -274,16 +282,18 @@ class Label(Readable):
def read_value(self):
strings = [self.system]
if self.ts:
strings.append(f"at {self.ts.read_value():.3f} {self.ts.parameters['value'].datatype.unit}")
dev_ts = self.DISPATCHER.get_module(self.subdev_ts)
if dev_ts:
strings.append(f"at {dev_ts.read_value():.3f} {dev_ts.parameters['value'].datatype.unit}")
else:
strings.append('No connection to sample temp!')
if self.mf:
mf_stat = self.mf.read_status()
mf_mode = self.mf.mode
mf_val = self.mf.value
mf_unit = self.mf.parameters['value'].datatype.unit
dev_mf = self.DISPATCHER.get_module(self.subdev_mf)
if dev_mf:
mf_stat = dev_mf.read_status()
mf_mode = dev_mf.mode
mf_val = dev_mf.value
mf_unit = dev_mf.parameters['value'].datatype.unit
if mf_stat[0] == self.Status.IDLE:
state = 'Persistent' if mf_mode else 'Non-persistent'
else:

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# This program is free software; you can redistribute it and/or modify it under
# the terms of the GNU General Public License as published by the Free Software

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# This program is free software; you can redistribute it and/or modify it under
# the terms of the GNU General Public License as published by the Free Software

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under
@ -16,7 +17,6 @@
#
# Module authors:
# Enrico Faulhaber <enrico.faulhaber@frm2.tum.de>
# Alexander Zaft <a.zaft@fz-juelich.de>
#
# *****************************************************************************
@ -28,10 +28,10 @@
import math
from frappy.datatypes import ArrayOf, FloatRange, StructOf, TupleOf
from frappy.datatypes import ArrayOf, FloatRange, StringType, StructOf, TupleOf
from frappy.errors import ConfigError, DisabledError
from frappy.lib.sequence import SequencerMixin, Step
from frappy.modules import Drivable, Parameter, Attached
from frappy.modules import Drivable, Parameter
class GarfieldMagnet(SequencerMixin, Drivable):
@ -47,12 +47,19 @@ class GarfieldMagnet(SequencerMixin, Drivable):
the symmetry setting selects which.
"""
# attached submodules
currentsource = Attached(description='(bipolar) Powersupply')
enable = Attached(description='Switch to set for on/off')
polswitch = Attached(description='Switch to set for polarity')
symmetry = Attached(description='Switch to read for symmetry')
# parameters
subdev_currentsource = Parameter('(bipolar) Powersupply',
datatype=StringType(),
readonly=True, export=False)
subdev_enable = Parameter('Switch to set for on/off',
datatype=StringType(), readonly=True,
export=False)
subdev_polswitch = Parameter('Switch to set for polarity',
datatype=StringType(), readonly=True,
export=False)
subdev_symmetry = Parameter('Switch to read for symmetry',
datatype=StringType(), readonly=True,
export=False)
userlimits = Parameter('User defined limits of device value',
datatype=TupleOf(FloatRange(unit='$'),
FloatRange(unit='$')),
@ -104,7 +111,7 @@ class GarfieldMagnet(SequencerMixin, Drivable):
Note: This may be overridden in derived classes.
"""
# binary search/bisection
maxcurr = self.currentsource.abslimits[1]
maxcurr = self._currentsource.abslimits[1]
mincurr = -maxcurr
maxfield = self._current2field(maxcurr)
minfield = -maxfield
@ -136,21 +143,26 @@ class GarfieldMagnet(SequencerMixin, Drivable):
def initModule(self):
super().initModule()
self._enable = self.DISPATCHER.get_module(self.subdev_enable)
self._symmetry = self.DISPATCHER.get_module(self.subdev_symmetry)
self._polswitch = self.DISPATCHER.get_module(self.subdev_polswitch)
self._currentsource = self.DISPATCHER.get_module(
self.subdev_currentsource)
self.init_sequencer(fault_on_error=False, fault_on_stop=False)
self.symmetry.read_value()
self._symmetry.read_value()
def read_calibration(self):
try:
try:
return self.calibrationtable[self.symmetry.value]
return self.calibrationtable[self._symmetry.value]
except KeyError:
return self.calibrationtable[self.symmetry.value.name]
return self.calibrationtable[self._symmetry.value.name]
except KeyError:
minslope = min(entry[0]
for entry in self.calibrationtable.values())
self.log.error(
'unconfigured calibration for symmetry %r',
self.symmetry.value)
self._symmetry.value)
return [minslope, 0, 0, 0, 0]
def _checkLimits(self, limits):
@ -170,22 +182,22 @@ class GarfieldMagnet(SequencerMixin, Drivable):
return limits
def read_abslimits(self):
maxfield = self._current2field(self.currentsource.abslimits[1])
maxfield = self._current2field(self._currentsource.abslimits[1])
# limit to configured value (if any)
maxfield = min(maxfield, max(self.accessibles['abslimits'].default))
return -maxfield, maxfield
def read_ramp(self):
# This is an approximation!
return self.calibration[0] * abs(self.currentsource.ramp)
return self.calibration[0] * abs(self._currentsource.ramp)
def write_ramp(self, newramp):
# This is an approximation!
self.currentsource.ramp = float(newramp) / self.calibration[0]
self._currentsource.ramp = float(newramp) / self.calibration[0]
def _get_field_polarity(self):
sign = int(self.polswitch.read_value())
if self.enable.read_value():
sign = int(self._polswitch.read_value())
if self._enable.read_value():
return sign
return 0
@ -198,35 +210,35 @@ class GarfieldMagnet(SequencerMixin, Drivable):
return
if current_pol == 0:
# safe to switch
self.polswitch.write_target(
self._polswitch.write_target(
'+1' if polarity > 0 else str(polarity))
return
if self.currentsource.value < 0.1:
self.polswitch.write_target('0')
if self._currentsource.value < 0.1:
self._polswitch.write_target('0')
return
# unsafe to switch, go to safe state first
self.currentsource.write_target(0)
self._currentsource.write_target(0)
def read_value(self):
return self._current2field(
self.currentsource.read_value() *
self._currentsource.read_value() *
self._get_field_polarity())
def readHwStatus(self):
# called from SequencerMixin.read_status if no sequence is running
if self.enable.value == 'Off':
if self._enable.value == 'Off':
return self.Status.WARN, 'Disabled'
if self.enable.read_status()[0] != self.Status.IDLE:
return self.enable.status
if self.polswitch.value in ['0', 0]:
return self.Status.IDLE, 'Shorted, ' + self.currentsource.status[1]
if self.symmetry.value in ['short', 0]:
return self.currentsource.status[
0], 'Shorted, ' + self.currentsource.status[1]
return self.currentsource.read_status()
if self._enable.read_status()[0] != self.Status.IDLE:
return self._enable.status
if self._polswitch.value in ['0', 0]:
return self.Status.IDLE, 'Shorted, ' + self._currentsource.status[1]
if self._symmetry.value in ['short', 0]:
return self._currentsource.status[
0], 'Shorted, ' + self._currentsource.status[1]
return self._currentsource.read_status()
def write_target(self, target):
if target != 0 and self.symmetry.read_value() in ['short', 0]:
if target != 0 and self._symmetry.read_value() in ['short', 0]:
raise DisabledError(
'Symmetry is shorted, please select another symmetry first!')
@ -239,7 +251,7 @@ class GarfieldMagnet(SequencerMixin, Drivable):
seq.append(Step('preparing', 0, self._prepare_ramp))
seq.append(Step('recover', 0, self._recover))
if current_polarity != wanted_polarity:
if self.currentsource.read_value() > 0.1:
if self._currentsource.read_value() > 0.1:
# switching only allowed if current is low enough -> ramp down
# first
seq.append(
@ -269,54 +281,54 @@ class GarfieldMagnet(SequencerMixin, Drivable):
# steps for the sequencing
def _prepare_ramp(self, store, *args):
store.old_window = self.currentsource.window
self.currentsource.window = 1
store.old_window = self._currentsource.window
self._currentsource.window = 1
def _finish_ramp(self, store, *args):
self.currentsource.window = max(store.old_window, 10)
self._currentsource.window = max(store.old_window, 10)
def _recover(self, store):
# check for interlock
if self.currentsource.read_status()[0] != self.Status.ERROR:
if self._currentsource.read_status()[0] != self.Status.ERROR:
return
# recover from interlock
ramp = self.currentsource.ramp
self.polswitch.write_target('0') # short is safe...
self.polswitch._hw_wait()
self.enable.write_target('On') # else setting ramp won't work
self.enable._hw_wait()
self.currentsource.ramp = 60000
self.currentsource.target = 0
self.currentsource.ramp = ramp
ramp = self._currentsource.ramp
self._polswitch.write_target('0') # short is safe...
self._polswitch._hw_wait()
self._enable.write_target('On') # else setting ramp won't work
self._enable._hw_wait()
self._currentsource.ramp = 60000
self._currentsource.target = 0
self._currentsource.ramp = ramp
# safe state.... if anything of the above fails, the temperatures may
# be too hot!
def _ramp_current(self, store, target):
if abs(self.currentsource.value - target) <= 0.05:
if abs(self._currentsource.value - target) <= 0.05:
# done with this step if no longer BUSY
return self.currentsource.read_status()[0] == 'BUSY'
if self.currentsource.status[0] != 'BUSY':
if self.enable.status[0] == 'ERROR':
self.enable.reset()
self.enable.read_status()
self.enable.write_target('On')
self.enable._hw_wait()
self.currentsource.write_target(target)
return self._currentsource.read_status()[0] == 'BUSY'
if self._currentsource.status[0] != 'BUSY':
if self._enable.status[0] == 'ERROR':
self._enable.reset()
self._enable.read_status()
self._enable.write_target('On')
self._enable._hw_wait()
self._currentsource.write_target(target)
return True # repeat
def _ramp_current_cleanup(self, store, step_was_busy, target):
# don't cleanup if step finished
if step_was_busy:
self.currentsource.write_target(self.currentsource.read_value())
self.currentsource.window = max(store.old_window, 10)
self._currentsource.write_target(self._currentsource.read_value())
self._currentsource.window = max(store.old_window, 10)
def _set_polarity(self, store, target):
if self.polswitch.read_status()[0] == self.Status.BUSY:
if self._polswitch.read_status()[0] == self.Status.BUSY:
return True
if int(self.polswitch.value) == int(target):
if int(self._polswitch.value) == int(target):
return False # done with this step
if self.polswitch.read_value() != 0:
self.polswitch.write_target(0)
if self._polswitch.read_value() != 0:
self._polswitch.write_target(0)
else:
self.polswitch.write_target(target)
self._polswitch.write_target(target)
return True # repeat

View File

@ -38,7 +38,7 @@ import PyTango
from frappy.datatypes import ArrayOf, EnumType, FloatRange, IntRange, \
LimitsType, StringType, TupleOf, ValueType
from frappy.errors import CommunicationFailedError, ConfigError, \
HardwareError, ProgrammingError, WrongTypeError
HardwareError, ProgrammingError
from frappy.lib import lazy_property
from frappy.modules import Command, Drivable, Module, Parameter, Readable, \
StatusType, Writable, Property
@ -466,31 +466,6 @@ class AnalogOutput(PyTangoDevice, Drivable):
# replacement of '$' by main unit must be done later
self.__main_unit = mainunit
def _init_abslimits(self):
"""Get abslimits from tango if not configured. Otherwise, check if both
ranges are compatible."""
try:
tangoabslim = (
float(self._getProperty('absmin')),
float(self._getProperty('absmax'))
)
if self.parameters['abslimits'].readerror:
# no abslimits configured in frappy. read from entangle
self.parameters['abslimits'].readerror = None
self.abslimits = tangoabslim
except Exception as e:
self.log.error(e)
# check if compatible
try:
dt = FloatRange(*tangoabslim)
dt.validate(self.parameters['abslimits'].datatype.min)
dt.validate(self.parameters['abslimits'].datatype.max)
except WrongTypeError as e:
raise WrongTypeError(f'Absolute limits configured in frappy \''
f'{self.abslimits}\' extend beyond the limits '
f'defined in entangle \'{tangoabslim}\'!') from e
def initModule(self):
super().initModule()
# init history
@ -511,7 +486,6 @@ class AnalogOutput(PyTangoDevice, Drivable):
self.log.error(e)
if self.__main_unit:
super().applyMainUnit(self.__main_unit)
self._init_abslimits()
def doPoll(self):
super().doPoll()

View File

@ -99,8 +99,6 @@ class ZapfPinata(Pinata):
'min': max(devinfo.info['absmin'], -UNLIMITED),
'max': min(devinfo.info['absmax'], UNLIMITED),
}
if devinfo.info['unit'] and devinfo.info['basetype'] == 'float':
config['value']['unit'] = devinfo.info['unit']
if devinfo.info['access'] == 'rw':
config['target'] = {
'min': config['value']['min'],
@ -133,6 +131,8 @@ STATUS_MAP = {
class PLCBase:
status = Parameter(datatype=StatusType(Drivable, 'INITIALIZING',
'DISABLED', 'STARTING'))
status_code = Parameter('raw internal status code',
IntRange(0, 2**32-1))
plcio = Property('plc io device', ValueType())
plc_name = Property('plc io device', StringType(), export=True)
_pinata = Attached(ZapfPinata) # TODO: make this automatic?
@ -159,15 +159,9 @@ class PLCBase:
dataty = cls._map_datatype(info)
if dataty is None:
continue
if info['basetype'] == 'float' and info['unit']: # TODO: better handling
param = Parameter(info['description'],
dataty,
unit=info['unit'],
readonly=readonly)
else:
param = Parameter(info['description'],
dataty,
readonly=readonly)
param = Parameter(info['description'],
dataty,
readonly=readonly)
def read_param(self, parameter=parameter):
code, val = self.plcio.get_param_raw(parameter)
@ -229,7 +223,7 @@ class PLCBase:
if not add_members:
return cls
new_name = '_' + cls.__name__ + '_' \
+ internalize_name("extended")
+ internalize_name("blub")
return type(new_name, (cls,), add_members)
@classmethod
@ -260,6 +254,10 @@ class PLCBase:
msg.append(self.plcio.decode_errid(err_id))
return status, ', '.join(msg)
def read_status_code(self):
state, reason, aux, _ = self.plcio.read_status()
return state << 28 | reason << 24 | aux
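For orientation only (not part of the change above): the packed 32-bit status_code can be split back into its parts; the field widths assumed here simply mirror the shifts used in read_status_code.
def unpack_status_code(code):
    # state in bits 28..31, reason in bits 24..27, aux in the low 24 bits
    state = (code >> 28) & 0xf
    reason = (code >> 24) & 0xf
    aux = code & 0xffffff
    return state, reason, aux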
@Command()
def stop(self):
"""Stop the operation of this module.

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
#
# This program is free software; you can redistribute it and/or modify it under

View File

@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# *****************************************************************************
# MLZ library of Tango servers
# Copyright (c) 2015-2023 by the authors, see LICENSE

View File

@ -1,3 +1,5 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# *****************************************************************************
# This program is free software; you can redistribute it and/or modify it under
# the terms of the GNU General Public License as published by the Free Software

View File

@ -1,229 +0,0 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# *****************************************************************************
# This program is free software; you can redistribute it and/or modify it under
# the terms of the GNU General Public License as published by the Free Software
# Foundation; either version 2 of the License, or (at your option) any later
# version.
#
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more
# details.
#
# You should have received a copy of the GNU General Public License along with
# this program; if not, write to the Free Software Foundation, Inc.,
# 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
#
# Module authors: Oksana Shliakhtun <oksana.shliakhtun@psi.ch>
# *****************************************************************************
"""Hewlett-Packard HP34401A Multimeter (not finished)"""
import re
from frappy.core import HasIO, Readable, Parameter, FloatRange, EnumType, StatusType, IDLE, ERROR, WARN
def string_to_value(value):
"""
Converting the value to float, removing the units, converting the prefix into the number.
:param value: value
:return: float value without units
"""
value_with_unit = re.compile(r'(\d+)([pnumkMG]?)')
value, pfx = value_with_unit.match(value).groups()
pfx_dict = {'p': 1e-12, 'n': 1e-9, 'u': 1e-6, 'm': 1e-3, 'k': 1e3, 'M': 1e6, 'G': 1e9}
if pfx in pfx_dict:
value = round(float(value) * pfx_dict[pfx], 12)
return float(value)
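A small usage illustration of the helper above (not part of the driver); the expected results are given as comments:
for raw in ('100mV', '10V', '2uV'):
    print(raw, '->', string_to_value(raw))
# 100mV -> 0.1
# 10V -> 10.0
# 2uV -> 2e-06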
class HP_IO(HasIO):
end_of_line = b'\n'
identification = [('*IDN?', r'HEWLETT-PACKARD,34401A,0,.*')]
class HP34401A(HP_IO):
status = Parameter(datatype=StatusType(Readable, 'BUSY'))
autorange = Parameter('autorange_on', EnumType('autorange', off=0, on=1), readonly=False, default=0)
def comm(self, cmd): # read until \n
string = f'{cmd}\n'
n_string = string.encode()
response = self.communicate(n_string)
if response:
return response
response = self.communicate(n_string)
return response if response else None
def read_range(self, function):
return self.comm(f'{function}:range?')
def write_range(self, function, range):
return self.comm(f'{function}:range {range}')
def write_autorange(self, function):
cmd = f'{function}:range:auto {"on" if self.autorange == 0 else "off"}'
self.comm(cmd)
return self.comm(f'{function}:range:auto?')
def read_resolution(self, function):
return self.comm(f'{function}:resolution?')
def write_resolution(self, function, resolution):
self.comm(f'{function}:resolution {resolution}')
return self.comm(f'{function}:resolution?')
def read_status(self):
stb = int(self.comm('*STB?'))
esr = int(self.comm('*ESR?'))
if esr & (1 << 3):
return ERROR, 'self-test/calibration/reading failed'
if esr & (1 << 4):
return ERROR, 'execution error'
if esr & (1 << 5):
return ERROR, 'syntax error'
if esr & (1 << 2):
return ERROR, 'query error'
if stb & (1 << 3):
return WARN, 'questionable data'
if stb & (1 << 5):
return WARN, 'standard event register is not empty'
if stb & (1 << 6):
return WARN, 'requested service'
if any(stb & (1 << i) for i in (0, 1, 2, 7)):
return IDLE, ''
if esr & (1 << 6):
return IDLE, ''
if esr & (1 << 7):
return IDLE, ''
if stb & (1 << 4):
return IDLE, 'message available'
if esr & (1 << 0):
return IDLE, 'operation complete'
if esr & (1 << 1):
return IDLE, 'not used'
class Voltage(HP34401A, Readable):
value = Parameter('voltage', datatype=FloatRange(0.1, 1000), unit='V')
range = Parameter('voltage sensitivity value', FloatRange(), unit='V', default=1, readonly=False)
resolution = Parameter('resolution')
mode = Parameter('measurement mode: ac/dc', readonly=False)
ioClass = HP_IO
MODE_NAMES = {0: 'dc', 1: 'ac'}
VOLT_RANGE = ['100mV', '1V', '10V', '100V', '1000V']
v_range = Parameter('voltage range', EnumType('voltage index range',
{name: idx for idx, name in enumerate(VOLT_RANGE)}), readonly=False)
acdc = None
def write_mode(self, mode):
"""
Set the mode - AC or DC
:param mode: AC/DC
:return:
"""
if mode == 1:
self.comm(f'configure:voltage:AC {self.range}, {self.resolution}')
else:
self.comm(f'configure:voltage:DC {self.range}, {self.resolution}')
self.acdc = self.MODE_NAMES[mode]
return self.comm(f'function?')
def read_value(self):
"""
Make an AC/DC voltage measurement.
:return: AC/DC value
"""
return self.comm(f'measure:voltage:{self.acdc}?')
def write_autorange_acdc(self, function):
full_function = f'{function}:{self.acdc}'
return self.write_autorange(full_function)
def read_range_voltage(self):
return self.read_range(f'voltage:{self.acdc}')
def write_range_voltage(self, range):
return self.write_range(f'voltage:{self.acdc}', range)
def write_autorange_voltage(self):
return self.write_autorange_acdc('voltage')
def read_resolution_voltage(self):
return self.read_resolution(f'voltage:{self.acdc}')
def write_resolution_voltage(self, resolution):
return self.write_resolution(f'voltage:{self.acdc}', resolution)
class Current(HP34401A, Readable, Voltage):
value = Parameter('current', FloatRange, unit='A')
range = Parameter('current range', FloatRange)
CURR_RANGE_AC = ['10mA', '100mA', '1A', '3A']
CURR_RANGE_DC = ['1A', '3A']
def read_range_current(self):
return self.read_range(f'current:{self.acdc}')
def write_autorange_current(self):
return self.write_autorange_acdc('current')
def write_range_current(self, range):
return self.write_range(f'current:{self.acdc}', range)
def read_resolution_current(self):
return self.read_resolution(f'current:{self.acdc}')
def write_resolution_current(self, resolution):
return self.write_resolution(f'current:{self.acdc}', resolution)
class Resistance(HP34401A, Readable):
value = Parameter('resistance')
mode = Parameter('measurement mode: 2-/4-wire ohms', EnumType(two_wire=2, four_wire=4), readonly=False)
resolution = Parameter('resistance measurement resolution')
range = Parameter('resistance measurement range')
RESIST_RANGE = ['100Ohm', '1kOhm', '10kOhm', '100kOhm', '1MOhm', '10MOhm', '100MOhm']
FUNCTION_MAP = {2: 'resistance', 4: 'fresistance'}
def write_range_resistance(self, range):
return self.write_range(f'{self.FUNCTION_MAP[self.mode]}', range)
def read_range_resistance(self):
return self.read_range(f'{self.FUNCTION_MAP[self.mode]}')
def write_mode(self, mode):
if mode == 2:
self.comm(f'configure:resistance {self.range},{self.resolution}')
elif mode == 4:
self.comm(f'configure:fresistance {self.range}, {self.resolution}')
return self.comm('configure?')
def write_autorange_resistance(self):
return self.write_autorange(self.FUNCTION_MAP[self.mode])
def read_resolution_resistance(self):
return self.read_resolution(f'{self.FUNCTION_MAP[self.mode]}')
def write_resolution_resistance(self, resolution):
return self.write_resolution(f'{self.FUNCTION_MAP[self.mode]}', resolution)
class Frequency(HP34401A, Readable):
value = Parameter('frequency', FloatRange(3, 300e3), unit='Hz')
def write_autorange_frequency(self):
return self.write_autorange('frequency')
def read_resolution_frequency(self):
return self.read_resolution('frequency')
def write_resolution_frequency(self, resolution):
return self.write_resolution('frequency', resolution)

View File

@ -25,7 +25,34 @@ from frappy.core import Readable, Parameter, FloatRange, TupleOf, \
from frappy.errors import RangeError
class SR_IO(StringIO):
class IO65(StringIO):
identification = [('ID', r'72.*')] # Identification; causes the lock-in amplifier to respond with the model number
# special protocol: we have to wait for each echo character
def communicate(self, cmd):
self.comLog('> %s', cmd)
cmd = cmd + '\r'
for ch in cmd.encode('latin-1'):
self._conn.send(bytes([ch]))
echo = self._conn.recv()
if not echo:
raise RuntimeError('timeout waiting for echo')
if echo[0] == 13: # CR
data = echo[1:]
break
else:
raise RuntimeError('missing CR')
while True:
chunk = self._conn.recv()
data += chunk
if not chunk or chunk[-1] == 13:
break
reply = data.decode('latin-1')
self.comLog('< %s', reply)
return reply+';0;0' # make it compatible with IO70
class IO70(StringIO):
end_of_line = b'\x00'
identification = [('ID', r'72.*')] # Identification; causes the lock-in amplifier to respond with the model number
@ -43,27 +70,16 @@ class XY(HasIO, Readable):
autorange = Parameter('autorange_on', EnumType('autorange', off=0, soft=1, hard=2),
readonly=False, default=0)
sen_range = {index + 1: name for index, name in enumerate(
sen_range = {name: value + 1 for value, name in enumerate(
['2nV', '5nV', '10nV', '20nV', '50nV', '100nV', '200nV', '500nV', '1uV',
'2uV', '5uV', '10uV', '20uV', '50uV', '100uV', '200uV', '500uV', '1mV',
'2mV', '5mV', '10mV', '20mV', '50mV', '100mV', '200mV', '500mV', '1V']
)}
irange = Parameter('sensitivity index', EnumType('sensitivity index range', sen_range))
nm = Parameter('noise mode on', BoolType(), readonly=False)
irange = Parameter('sensitivity index', EnumType('sensitivity index range', sen_range), readonly=False)
phase = Parameter('reference phase control', FloatRange(-360, 360), unit='deg', readonly=False)
vmode = Parameter('control mode', EnumType(both_grounded=0, A=1, B=2, A_B_diff=3), readonly=False)
ioClass = SR_IO
def doPoll(self):
super().doPoll()
if self.autorange == 1: # soft auto range
if max(abs(x), abs(y)) >= 0.9 * self.range and self.irange < 27:
self.write_irange(self.irange + 1)
elif max(abs(x), abs(y)) <= 0.3 * self.range and self.irange > 1:
self.write_irange(self.irange - 1)
def comm(self, cmd):
reply, status, overload = self.communicate(cmd).split(';') # try/except
reply = reply.rstrip('\n')
@ -82,7 +98,7 @@ class XY(HasIO, Readable):
"""
value_with_unit = re.compile(r'(\d+)([pnumkMG]?)')
value, pfx = value_with_unit.match(value).groups()
pfx_dict = {'p': 1e-12, 'n': 1e-9, 'u': 1e-6, 'm': 1e-3, 'k': 1e3, 'M': 1e6, 'G': 1e9}
pfx_dict = {'p': 1e-12, 'n': 1e-9, 'u': 1e-6, 'm': 1e-3, 'k': 1e3, 'M': 1e6, 'G':1e9}
if pfx in pfx_dict:
value = round(float(value) * pfx_dict[pfx], 12)
return float(value)
@ -91,6 +107,11 @@ class XY(HasIO, Readable):
reply = self.comm('XY.').split(',')
x = float(reply[0])
y = float(reply[1])
if self.autorange == 1: # soft
if max(abs(x), abs(y)) >= 0.9 * self.range and self.irange < 27:
self.write_irange(self.irange + 1)
elif max(abs(x), abs(y)) <= 0.3 * self.range and self.irange > 1:
self.write_irange(self.irange - 1)
return x, y
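The soft autorange added above applies a simple hysteresis to the sensitivity index; a stand-alone restatement of the rule (illustrative only, index limits as in the table above):
def adjust_irange(x, y, rng, irange, imin=1, imax=27):
    # step to the next larger range when the signal reaches 90% of the
    # current range, to the next smaller one when it drops below 30%
    level = max(abs(x), abs(y))
    if level >= 0.9 * rng and irange < imax:
        return irange + 1
    if level <= 0.3 * rng and irange > imin:
        return irange - 1
    return irange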
def read_freq(self):
@ -141,27 +162,15 @@ class XY(HasIO, Readable):
def write_range(self, target):
cl_idx = None
for idx, name in self.sen_range.items():
if idx < len(self.sen_range):
value_l = self.string_to_value(self.sen_range.get(idx))
value_r = self.string_to_value(self.sen_range.get(idx + 1))
if value_l < target <= value_r:
cl_idx = idx + 1
break
cl_idx = idx
for name, idx in self.sen_range.items():
value = self.string_to_value(name)
if value < target < self.sen_range.get(idx + 1, float('inf')):
cl_idx = idx + 1
elif value == target:
cl_idx = idx
self.write_irange(cl_idx)
return self.read_range()
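The loop above walks the whole sensitivity table; an equivalent, more direct selection (an illustrative sketch, not the committed code) picks the smallest range that is not below the target:
def closest_sensitivity_index(target, sen_range, string_to_value):
    # sen_range maps range names such as '100uV' to 1-based indices, as above
    candidates = [(string_to_value(name), idx) for name, idx in sen_range.items()]
    suitable = [c for c in candidates if c[0] >= target]
    # smallest suitable range, or the largest one if the target exceeds them all
    return min(suitable)[1] if suitable else max(candidates)[1]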
def read_nm(self):
reply = self.comm('NOISEMODE')
return reply
def write_nm(self, value):
self.comm('NOISEMODE %d' % int(value))
self.read_nm()
return value
# phase and autophase
def read_phase(self):
reply = self.comm('REFP.')
@ -194,10 +203,11 @@ class XY70(XY):
'100s', '200s', '500s', '1ks', '2ks', '5ks', '10ks', '20ks', '50ks', '100ks']
)}
nm = Parameter('noise mode on', BoolType(), readonly=False)
tc = Parameter('time const. value', FloatRange(0.00001, 100000), unit='s', readonly=False)
itc = Parameter('time const. index', EnumType('time const. index range', time_const), readonly=False)
ioClass = SR_IO
ioClass = IO70
def comm(self, cmd):
reply, status, overload = self.communicate(cmd).split(';')
@ -207,6 +217,15 @@ class XY70(XY):
self.status = (self.Status.IDLE, '')
return reply
def read_nm(self):
reply = self.comm('NOISEMODE')
return reply
def write_nm(self, value):
self.comm('NOISEMODE %d' % int(value))
self.read_nm()
return value
def read_tc(self):
reply = self.comm('TC.')
return float(reply)
@ -245,6 +264,8 @@ class XY65(XY):
'200s', '500s', '1ks', '2ks', '5ks', '10ks', '20ks', '50ks', '100ks']
)}
ioClass = IO65
def read_tc(self):
reply = self.comm('TC.')
return float(reply)

View File

@ -1,3 +1,5 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# *****************************************************************************
# This program is free software; you can redistribute it and/or modify it under
# the terms of the GNU General Public License as published by the Free Software
@ -15,7 +17,6 @@
#
# Module authors: Oksana Shliakhtun <oksana.shliakhtun@psi.ch>
# *****************************************************************************
"""Stanford Research Systems SR830 DS Lock-in Amplifier"""
import re
import time
@ -26,11 +27,6 @@ from frappy.errors import IsBusyError
def string_to_value(value):
"""
Converting the value to float, removing the units, converting the prefix into the number.
:param value: value
:return: float value without units
"""
value_with_unit = re.compile(r'(\d+)([pnumkMG]?)')
value, pfx = value_with_unit.match(value).groups()
pfx_dict = {'p': 1e-12, 'n': 1e-9, 'u': 1e-6, 'm': 1e-3, 'k': 1e3, 'M': 1e6, 'G': 1e9}
@ -46,13 +42,6 @@ class SR830_IO(StringIO):
class StanfRes(HasIO, Readable):
def set_par(self, cmd, *args):
"""
Set parameter.
Query commands are the same as setting commands, but they have a question mark.
:param cmd: command
:param args: value(s)
:return: reply
"""
head = ','.join([cmd] + [a if isinstance(a, str) else f'{a:g}' for a in args])
tail = cmd.replace(' ', '? ')
new_tail = re.sub(r'[0-9.]+', '', tail)
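The rest of set_par lies outside this hunk; the visible part derives the matching query from a set command by inserting the question mark after the mnemonic and dropping the numeric arguments. A standalone illustration under that assumption:
import re

def set_to_query(cmd):
    tail = cmd.replace(' ', '? ')        # 'SENS 7'  -> 'SENS? 7'
    return re.sub(r'[0-9.]+', '', tail)  # 'SENS? 7' -> 'SENS? '

print(repr(set_to_query('SENS 7')))   # 'SENS? '
print(repr(set_to_query('OFLT 10')))  # 'OFLT? '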
@ -162,10 +151,6 @@ class XY(StanfRes):
return IDLE, ''
def read_value(self):
"""
Read XY. The manual autorange implemented.
:return:
"""
if self.read_status()[0] == BUSY:
raise IsBusyError('changing gain')
reply = self.get_par('SNAP? 1, 2')
@ -183,13 +168,11 @@ class XY(StanfRes):
return int(self.get_par('SENS?'))
def read_range(self):
"""Sensitivity range value"""
idx = self.read_irange()
name = self.SEN_RANGE[idx]
return string_to_value(name)
def write_irange(self, irange):
"""Index of sensitivity from the range"""
value = int(irange)
self.set_par(f'SENS {value}')
self._autogain_started = time.time()
@ -197,12 +180,6 @@ class XY(StanfRes):
return value
def write_range(self, target):
"""
Setting the sensitivity range.
cl_idx/cl_value is the closest index/value from the range to the target
:param target:
:return: closest value of the sensitivity range
"""
target = float(target)
cl_idx = None
cl_value = float('inf')
@ -219,7 +196,6 @@ class XY(StanfRes):
return cl_value
def read_itc(self):
"""Time constant index from the range"""
return int(self.get_par(f'OFLT?'))
def write_itc(self, itc):
@ -227,18 +203,11 @@ class XY(StanfRes):
return self.set_par(f'OFLT {value}')
def read_tc(self):
"""Read time constant value from the range"""
idx = self.read_itc()
name = self.TIME_CONST[idx]
return string_to_value(name)
def write_tc(self, target):
"""
Setting the time constant from the range.
cl_idx/cl_value is the closest index/value from the range to the target
:param target: time constant
:return: closest time constant value
"""
target = float(target)
cl_idx = None
cl_value = float('inf')

View File

@ -1,3 +1,5 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# *****************************************************************************
# This program is free software; you can redistribute it and/or modify it under
# the terms of the GNU General Public License as published by the Free Software

View File

@ -1,293 +1,294 @@
"""
Created on Tue Nov 26 15:42:43 2019
@author: tartarotti_d-adm
"""
import numpy as np
import ctypes as ct
import time
from numpy import sqrt, arctan2, sin, cos
#from pylab import *
from scipy import signal
#ADQAPI = ct.cdll.LoadLibrary("ADQAPI.dll")
ADQAPI = ct.cdll.LoadLibrary("libadq.so.0")
#For different trigger modes
SW_TRIG = 1
EXT_TRIG_1 = 2 #This external trigger does not work if the level of the trigger is very close to 0.5V. Now we have it close to 3V, and it works
EXT_TRIG_2 = 7
EXT_TRIG_3 = 8
LVL_TRIG = 3
INT_TRIG = 4
LVL_FALLING = 0
LVL_RISING = 1
#samples_per_record=16384
ADQ_TRANSFER_MODE_NORMAL = 0x00
ADQ_CHANNELS_MASK = 0x3
#f_LO = 40
def butter_lowpass(cutoff, sr, order=5):
nyq = 0.5 * sr
normal_cutoff = cutoff / nyq
b, a = signal.butter(order, normal_cutoff, btype = 'low', analog = False)
return b, a
class Adq(object):
max_number_of_channels = 2
samp_freq = 2
#ndecimate = 50 # decimation ratio (2GHz / 40 MHz)
ndecimate = 50
def __init__(self, number_of_records, samples_per_record, bw_cutoff):
self.number_of_records = number_of_records
self.samples_per_record = samples_per_record
self.bw_cutoff = bw_cutoff
ADQAPI.ADQAPI_GetRevision()
# Manually set return type from some ADQAPI functions
ADQAPI.CreateADQControlUnit.restype = ct.c_void_p
ADQAPI.ADQ_GetRevision.restype = ct.c_void_p
ADQAPI.ADQ_GetPtrStream.restype = ct.POINTER(ct.c_int16)
ADQAPI.ADQControlUnit_FindDevices.argtypes = [ct.c_void_p]
# Create ADQControlUnit
self.adq_cu = ct.c_void_p(ADQAPI.CreateADQControlUnit())
ADQAPI.ADQControlUnit_EnableErrorTrace(self.adq_cu, 3, '.')
self.adq_num = 1
# Find ADQ devices
ADQAPI.ADQControlUnit_FindDevices(self.adq_cu)
n_of_ADQ = ADQAPI.ADQControlUnit_NofADQ(self.adq_cu)
if n_of_ADQ != 1:
raise ValueError('number of ADQs must be 1, not %d' % n_of_ADQ)
rev = ADQAPI.ADQ_GetRevision(self.adq_cu, self.adq_num)
revision = ct.cast(rev,ct.POINTER(ct.c_int))
print('\nConnected to ADQ #1')
# Print revision information
print('FPGA Revision: {}'.format(revision[0]))
if (revision[1]):
print('Local copy')
else :
print('SVN Managed')
if (revision[2]):
print('Mixed Revision')
else :
print('SVN Updated')
print('')
ADQ_CLOCK_INT_INTREF = 0 #internal clock source
ADQ_CLOCK_EXT_REF = 1 #internal clock source, external reference
ADQ_CLOCK_EXT_CLOCK = 2 #External clock source
ADQAPI.ADQ_SetClockSource(self.adq_cu, self.adq_num, ADQ_CLOCK_EXT_REF);
##########################
# Test pattern
#ADQAPI.ADQ_SetTestPatternMode(self.adq_cu, self.adq_num, 4)
##########################
# Sample skip
#ADQAPI.ADQ_SetSampleSkip(self.adq_cu, self.adq_num, 1)
##########################
# Set trig mode
self.trigger = EXT_TRIG_1
#trigger = LVL_TRIG
success = ADQAPI.ADQ_SetTriggerMode(self.adq_cu, self.adq_num, self.trigger)
if (success == 0):
print('ADQ_SetTriggerMode failed.')
if (self.trigger == LVL_TRIG):
success = ADQAPI.ADQ_SetLvlTrigLevel(self.adq_cu, self.adq_num, -100)
if (success == 0):
print('ADQ_SetLvlTrigLevel failed.')
success = ADQAPI.ADQ_SetTrigLevelResetValue(self.adq_cu, self.adq_num, 1000)
if (success == 0):
print('ADQ_SetTrigLevelResetValue failed.')
success = ADQAPI.ADQ_SetLvlTrigChannel(self.adq_cu, self.adq_num, 1)
if (success == 0):
print('ADQ_SetLvlTrigChannel failed.')
success = ADQAPI.ADQ_SetLvlTrigEdge(self.adq_cu, self.adq_num, LVL_RISING)
if (success == 0):
print('ADQ_SetLvlTrigEdge failed.')
elif (self.trigger == EXT_TRIG_1) :
success = ADQAPI.ADQ_SetExternTrigEdge(self.adq_cu, self.adq_num,2)
if (success == 0):
print('ADQ_SetLvlTrigEdge failed.')
# success = ADQAPI.ADQ_SetTriggerThresholdVoltage(self.adq_cu, self.adq_num, trigger, ct.c_double(0.2))
# if (success == 0):
# print('SetTriggerThresholdVoltage failed.')
print("CHANNEL:"+str(ct.c_int(ADQAPI.ADQ_GetLvlTrigChannel(self.adq_cu, self.adq_num))))
self.setup_target_buffers()
def setup_target_buffers(self):
# Setup target buffers for data
self.target_buffers=(ct.POINTER(ct.c_int16 * self.samples_per_record * self.number_of_records)
* self.max_number_of_channels)()
for bufp in self.target_buffers:
bufp.contents = (ct.c_int16 * self.samples_per_record * self.number_of_records)()
def deletecu(self):
# Only disarm trigger after data is collected
ADQAPI.ADQ_DisarmTrigger(self.adq_cu, self.adq_num)
ADQAPI.ADQ_MultiRecordClose(self.adq_cu, self.adq_num);
# Delete ADQControlunit
ADQAPI.DeleteADQControlUnit(self.adq_cu)
def start(self):
"""start datat acquisition"""
# samples_per_records = samples_per_record/number_of_records
# Change number of pulses to be acquired acording to how many records are taken
# Start acquisition
ADQAPI.ADQ_MultiRecordSetup(self.adq_cu, self.adq_num,
self.number_of_records,
self.samples_per_record)
ADQAPI.ADQ_DisarmTrigger(self.adq_cu, self.adq_num)
ADQAPI.ADQ_ArmTrigger(self.adq_cu, self.adq_num)
def getdata(self):
"""wait for aquisition to be finished and get data"""
#start = time.time()
while(ADQAPI.ADQ_GetAcquiredAll(self.adq_cu,self.adq_num) == 0):
time.sleep(0.001)
#if (self.trigger == SW_TRIG):
# ADQAPI.ADQ_SWTrig(self.adq_cu, self.adq_num)
#mid = time.time()
status = ADQAPI.ADQ_GetData(self.adq_cu, self.adq_num, self.target_buffers,
self.samples_per_record * self.number_of_records, 2,
0, self.number_of_records, ADQ_CHANNELS_MASK,
0, self.samples_per_record, ADQ_TRANSFER_MODE_NORMAL);
#print(time.time()-mid,mid-start)
if not status:
raise ValueError('no succesS from ADQ_GetDATA')
# Now this is an array with all records, but the time is artificial
data = []
for ch in range(2):
onedim = np.frombuffer(self.target_buffers[ch].contents, dtype=np.int16)
data.append(onedim.reshape(self.number_of_records, self.samples_per_record) / float(2**14)) # 14 bits ADC
return data
def acquire(self):
self.start()
return self.getdata()
'''
def average(self, data):
#Average over records
return [data[ch].sum(axis=0) / self.number_of_records for ch in range(2)]
def iq(self, channel, f_LO):
newx = np.linspace(0, self.samples_per_record /2, self.samples_per_record)
s0 = channel /((2**16)/2)*0.5*np.exp(1j*2*np.pi*f_LO/(1e3)*newx)
I0 = s0.real
Q0 = s0.imag
return I0, Q0
def fitting(self, data, f_LO, ti, tf):
# As long as data[0] is the pulse
si = 2*ti #Those are for fitting the pulse
sf = 2*tf
phase = np.zeros(self.number_of_records)
amplitude = np.zeros(self.number_of_records)
offset = np.zeros(self.number_of_records)
for i in range(self.number_of_records):
phase[i], amplitude[i] = sineW(data[0][i][si:sf],f_LO*1e-9,ti,tf)
offset[i] = np.average(data[0][i][si:sf])
return phase, amplitude, offset
def waveIQ(self, channel,ti,f_LO):
#channel is not the sample data
t = np.linspace(0, self.samples_per_record /2, self.samples_per_record + 1)[:-1]
si = 2*ti # Again that is where the wave pulse starts
cwi = np.zeros((self.number_of_records,self.samples_per_record))
cwq = np.zeros((self.number_of_records,self.samples_per_record))
iq = np.zeros((self.number_of_records,self.samples_per_record))
q = np.zeros((self.number_of_records,self.samples_per_record))
for i in range(self.number_of_records):
cwi[i] = np.zeros(self.samples_per_record)
cwq[i] = np.zeros(self.samples_per_record)
cwi[i] = amplitude[i]*sin(t[si:]*f_LO*1e-9*2*np.pi+phase[i]*np.pi/180)+bias[i]
cwq[i] = amplitude[i]*sin(t[si:]*f_LO*1e-9*(2*np.pi+(phase[i]+90)*np.pi/180))+bias[i]
iq[i] = channel[i]*cwi[i]
q[i] = channel[i]*cwq[i]
return iq,q
'''
def sinW(self,sig,freq,ti,tf):
# sig: signal array
# freq
# ti, tf: initial and end time
si = int(ti * self.samp_freq)
nperiods = freq * (tf - ti)
n = int(round(max(2, int(nperiods)) / nperiods * (tf-ti) * self.samp_freq))
self.nperiods = n
t = np.arange(si, len(sig)) / self.samp_freq
t = t[:n]
self.pulselen = n / self.samp_freq
sig = sig[si:si+n]
a = 2*np.sum(sig*np.cos(2*np.pi*freq*t))/len(sig)
b = 2*np.sum(sig*np.sin(2*np.pi*freq*t))/len(sig)
return a, b
def mix(self, sigin, sigout, freq, ti, tf):
# sigin, sigout: signal array, incomping, output
# freq
# ti, tf: initial and end time if sigin
a, b = self.sinW(sigin, freq, ti, tf)
phase = arctan2(a,b) * 180 / np.pi
amp = sqrt(a**2 + b**2)
a, b = a/amp, b/amp
#si = int(ti * self.samp_freq)
t = np.arange(len(sigout)) / self.samp_freq
wave1 = sigout * (a * cos(2*np.pi*freq*t) + b * sin(2*np.pi*freq*t))
wave2 = sigout * (a * sin(2*np.pi*freq*t) - b * cos(2*np.pi*freq*t))
return wave1, wave2
def averageiq(self, data, freq, ti, tf):
'''Average over records'''
iorq = np.array([self.mix(data[0][i], data[1][i], freq, ti, tf) for i in range(self.number_of_records)])
# iorq = np.array([self.mix(data[0][:], data[1][:], freq, ti, tf)])
return iorq.sum(axis=0) / self.number_of_records
def filtro(self, iorq, cutoff):
b, a = butter_lowpass(cutoff, self.samp_freq*1e9)
#ifi = np.array(signal.filtfilt(b,a,iorq[0]))
#qf = np.array(signal.filtfilt(b,a,iorq[1]))
iqf = [signal.filtfilt(b,a,iorq[i]) for i in np.arange(len(iorq))]
return iqf
def box(self, iorq, ti, tf):
si = int(self.samp_freq * ti)
sf = int(self.samp_freq * tf)
bxa = [sum(iorq[i][si:sf])/(sf-si) for i in np.arange(len(iorq))]
return bxa
def gates_and_curves(self, data, freq, pulse, roi):
"""return iq values of rois and prepare plottable curves for iq"""
times = []
times.append(('aviq', time.time()))
iq = self.averageiq(data,freq*1e-9,*pulse)
times.append(('filtro', time.time()))
iqf = self.filtro(iq,self.bw_cutoff)
m = len(iqf[0]) // self.ndecimate
times.append(('iqdec', time.time()))
iqd = np.average(np.resize(iqf, (2, m, self.ndecimate)), axis=2)
t_axis = np.arange(m) * self.ndecimate / self.samp_freq
pulsig = np.abs(data[0][0])
times.append(('pulsig', time.time()))
pulsig = np.average(np.resize(pulsig, (m, self.ndecimate)), axis=1)
self.curves = (t_axis, iqd[0], iqd[1], pulsig)
#print(times)
return [self.box(iqf,*r) for r in roi]
# -*- coding: utf-8 -*-
"""
Created on Tue Nov 26 15:42:43 2019
@author: tartarotti_d-adm
"""
import numpy as np
import ctypes as ct
import time
from numpy import sqrt, arctan2, sin, cos
#from pylab import *
from scipy import signal
#ADQAPI = ct.cdll.LoadLibrary("ADQAPI.dll")
ADQAPI = ct.cdll.LoadLibrary("libadq.so.0")
#For different trigger modes
SW_TRIG = 1
EXT_TRIG_1 = 2 #This external trigger does not work if the level of the trigger is very close to 0.5V. Now we have it close to 3V, and it works
EXT_TRIG_2 = 7
EXT_TRIG_3 = 8
LVL_TRIG = 3
INT_TRIG = 4
LVL_FALLING = 0
LVL_RISING = 1
#samples_per_record=16384
ADQ_TRANSFER_MODE_NORMAL = 0x00
ADQ_CHANNELS_MASK = 0x3
#f_LO = 40
def butter_lowpass(cutoff, sr, order=5):
nyq = 0.5 * sr
normal_cutoff = cutoff / nyq
b, a = signal.butter(order, normal_cutoff, btype = 'low', analog = False)
return b, a
class Adq(object):
max_number_of_channels = 2
samp_freq = 2
#ndecimate = 50 # decimation ratio (2GHz / 40 MHz)
ndecimate = 50
def __init__(self, number_of_records, samples_per_record, bw_cutoff):
self.number_of_records = number_of_records
self.samples_per_record = samples_per_record
self.bw_cutoff = bw_cutoff
ADQAPI.ADQAPI_GetRevision()
# Manually set return type from some ADQAPI functions
ADQAPI.CreateADQControlUnit.restype = ct.c_void_p
ADQAPI.ADQ_GetRevision.restype = ct.c_void_p
ADQAPI.ADQ_GetPtrStream.restype = ct.POINTER(ct.c_int16)
ADQAPI.ADQControlUnit_FindDevices.argtypes = [ct.c_void_p]
# Create ADQControlUnit
self.adq_cu = ct.c_void_p(ADQAPI.CreateADQControlUnit())
ADQAPI.ADQControlUnit_EnableErrorTrace(self.adq_cu, 3, '.')
self.adq_num = 1
# Find ADQ devices
ADQAPI.ADQControlUnit_FindDevices(self.adq_cu)
n_of_ADQ = ADQAPI.ADQControlUnit_NofADQ(self.adq_cu)
if n_of_ADQ != 1:
raise ValueError('number of ADQs must be 1, not %d' % n_of_ADQ)
rev = ADQAPI.ADQ_GetRevision(self.adq_cu, self.adq_num)
revision = ct.cast(rev,ct.POINTER(ct.c_int))
print('\nConnected to ADQ #1')
# Print revision information
print('FPGA Revision: {}'.format(revision[0]))
if (revision[1]):
print('Local copy')
else :
print('SVN Managed')
if (revision[2]):
print('Mixed Revision')
else :
print('SVN Updated')
print('')
ADQ_CLOCK_INT_INTREF = 0 #internal clock source
ADQ_CLOCK_EXT_REF = 1 #internal clock source, external reference
ADQ_CLOCK_EXT_CLOCK = 2 #External clock source
ADQAPI.ADQ_SetClockSource(self.adq_cu, self.adq_num, ADQ_CLOCK_EXT_REF);
##########################
# Test pattern
#ADQAPI.ADQ_SetTestPatternMode(self.adq_cu, self.adq_num, 4)
##########################
# Sample skip
#ADQAPI.ADQ_SetSampleSkip(self.adq_cu, self.adq_num, 1)
##########################
# Set trig mode
self.trigger = EXT_TRIG_1
#trigger = LVL_TRIG
success = ADQAPI.ADQ_SetTriggerMode(self.adq_cu, self.adq_num, self.trigger)
if (success == 0):
print('ADQ_SetTriggerMode failed.')
if (self.trigger == LVL_TRIG):
success = ADQAPI.ADQ_SetLvlTrigLevel(self.adq_cu, self.adq_num, -100)
if (success == 0):
print('ADQ_SetLvlTrigLevel failed.')
success = ADQAPI.ADQ_SetTrigLevelResetValue(self.adq_cu, self.adq_num, 1000)
if (success == 0):
print('ADQ_SetTrigLevelResetValue failed.')
success = ADQAPI.ADQ_SetLvlTrigChannel(self.adq_cu, self.adq_num, 1)
if (success == 0):
print('ADQ_SetLvlTrigChannel failed.')
success = ADQAPI.ADQ_SetLvlTrigEdge(self.adq_cu, self.adq_num, LVL_RISING)
if (success == 0):
print('ADQ_SetLvlTrigEdge failed.')
elif (self.trigger == EXT_TRIG_1) :
success = ADQAPI.ADQ_SetExternTrigEdge(self.adq_cu, self.adq_num,2)
if (success == 0):
print('ADQ_SetExternTrigEdge failed.')
# success = ADQAPI.ADQ_SetTriggerThresholdVoltage(self.adq_cu, self.adq_num, trigger, ct.c_double(0.2))
# if (success == 0):
# print('SetTriggerThresholdVoltage failed.')
print("CHANNEL:"+str(ct.c_int(ADQAPI.ADQ_GetLvlTrigChannel(self.adq_cu, self.adq_num))))
self.setup_target_buffers()
def setup_target_buffers(self):
# Setup target buffers for data
self.target_buffers=(ct.POINTER(ct.c_int16 * self.samples_per_record * self.number_of_records)
* self.max_number_of_channels)()
for bufp in self.target_buffers:
bufp.contents = (ct.c_int16 * self.samples_per_record * self.number_of_records)()
def deletecu(self):
# Only disarm trigger after data is collected
ADQAPI.ADQ_DisarmTrigger(self.adq_cu, self.adq_num)
ADQAPI.ADQ_MultiRecordClose(self.adq_cu, self.adq_num);
# Delete ADQControlunit
ADQAPI.DeleteADQControlUnit(self.adq_cu)
def start(self):
"""start datat acquisition"""
# samples_per_records = samples_per_record/number_of_records
# Change number of pulses to be acquired according to how many records are taken
# Start acquisition
ADQAPI.ADQ_MultiRecordSetup(self.adq_cu, self.adq_num,
self.number_of_records,
self.samples_per_record)
ADQAPI.ADQ_DisarmTrigger(self.adq_cu, self.adq_num)
ADQAPI.ADQ_ArmTrigger(self.adq_cu, self.adq_num)
def getdata(self):
"""wait for aquisition to be finished and get data"""
#start = time.time()
while(ADQAPI.ADQ_GetAcquiredAll(self.adq_cu,self.adq_num) == 0):
time.sleep(0.001)
#if (self.trigger == SW_TRIG):
# ADQAPI.ADQ_SWTrig(self.adq_cu, self.adq_num)
#mid = time.time()
status = ADQAPI.ADQ_GetData(self.adq_cu, self.adq_num, self.target_buffers,
self.samples_per_record * self.number_of_records, 2,
0, self.number_of_records, ADQ_CHANNELS_MASK,
0, self.samples_per_record, ADQ_TRANSFER_MODE_NORMAL);
#print(time.time()-mid,mid-start)
if not status:
raise ValueError('no success from ADQ_GetData')
# Now this is an array with all records, but the time is artificial
data = []
for ch in range(2):
onedim = np.frombuffer(self.target_buffers[ch].contents, dtype=np.int16)
data.append(onedim.reshape(self.number_of_records, self.samples_per_record) / float(2**14)) # 14 bits ADC
return data
def acquire(self):
self.start()
return self.getdata()
'''
def average(self, data):
#Average over records
return [data[ch].sum(axis=0) / self.number_of_records for ch in range(2)]
def iq(self, channel, f_LO):
newx = np.linspace(0, self.samples_per_record /2, self.samples_per_record)
s0 = channel /((2**16)/2)*0.5*np.exp(1j*2*np.pi*f_LO/(1e3)*newx)
I0 = s0.real
Q0 = s0.imag
return I0, Q0
def fitting(self, data, f_LO, ti, tf):
# As long as data[0] is the pulse
si = 2*ti #Those are for fitting the pulse
sf = 2*tf
phase = np.zeros(self.number_of_records)
amplitude = np.zeros(self.number_of_records)
offset = np.zeros(self.number_of_records)
for i in range(self.number_of_records):
phase[i], amplitude[i] = sineW(data[0][i][si:sf],f_LO*1e-9,ti,tf)
offset[i] = np.average(data[0][i][si:sf])
return phase, amplitude, offset
def waveIQ(self, channel,ti,f_LO):
#channel is not the sample data
t = np.linspace(0, self.samples_per_record /2, self.samples_per_record + 1)[:-1]
si = 2*ti # Again that is where the wave pulse starts
cwi = np.zeros((self.number_of_records,self.samples_per_record))
cwq = np.zeros((self.number_of_records,self.samples_per_record))
iq = np.zeros((self.number_of_records,self.samples_per_record))
q = np.zeros((self.number_of_records,self.samples_per_record))
for i in range(self.number_of_records):
cwi[i] = np.zeros(self.samples_per_record)
cwq[i] = np.zeros(self.samples_per_record)
cwi[i] = amplitude[i]*sin(t[si:]*f_LO*1e-9*2*np.pi+phase[i]*np.pi/180)+bias[i]
cwq[i] = amplitude[i]*sin(t[si:]*f_LO*1e-9*(2*np.pi+(phase[i]+90)*np.pi/180))+bias[i]
iq[i] = channel[i]*cwi[i]
q[i] = channel[i]*cwq[i]
return iq,q
'''
def sinW(self,sig,freq,ti,tf):
# sig: signal array
# freq
# ti, tf: initial and end time
si = int(ti * self.samp_freq)
nperiods = freq * (tf - ti)
n = int(round(max(2, int(nperiods)) / nperiods * (tf-ti) * self.samp_freq))
self.nperiods = n
t = np.arange(si, len(sig)) / self.samp_freq
t = t[:n]
self.pulselen = n / self.samp_freq
sig = sig[si:si+n]
a = 2*np.sum(sig*np.cos(2*np.pi*freq*t))/len(sig)
b = 2*np.sum(sig*np.sin(2*np.pi*freq*t))/len(sig)
return a, b
def mix(self, sigin, sigout, freq, ti, tf):
# sigin, sigout: signal arrays, incoming and output
# freq
# ti, tf: initial and end time of sigin
a, b = self.sinW(sigin, freq, ti, tf)
phase = arctan2(a,b) * 180 / np.pi
amp = sqrt(a**2 + b**2)
a, b = a/amp, b/amp
#si = int(ti * self.samp_freq)
t = np.arange(len(sigout)) / self.samp_freq
wave1 = sigout * (a * cos(2*np.pi*freq*t) + b * sin(2*np.pi*freq*t))
wave2 = sigout * (a * sin(2*np.pi*freq*t) - b * cos(2*np.pi*freq*t))
return wave1, wave2
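A standalone sanity check of the demodulation idea used by sinW()/mix() above (illustrative only; the sample rate and test frequency are assumed): correlating a test tone with cos and sin at the known frequency recovers its amplitude and phase.
import numpy as np

samp_freq = 2.0                              # GS/s, as in the class above
freq = 0.04                                  # GHz (40 MHz): 80 full periods in 4000 samples
t = np.arange(4000) / samp_freq
sig = 1.5 * np.sin(2 * np.pi * freq * t + 0.3)
a = 2 * np.sum(sig * np.cos(2 * np.pi * freq * t)) / len(sig)
b = 2 * np.sum(sig * np.sin(2 * np.pi * freq * t)) / len(sig)
print(np.hypot(a, b), np.arctan2(a, b))      # ~1.5 (amplitude) and ~0.3 rad (phase)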
def averageiq(self, data, freq, ti, tf):
'''Average over records'''
iorq = np.array([self.mix(data[0][i], data[1][i], freq, ti, tf) for i in range(self.number_of_records)])
# iorq = np.array([self.mix(data[0][:], data[1][:], freq, ti, tf)])
return iorq.sum(axis=0) / self.number_of_records
def filtro(self, iorq, cutoff):
b, a = butter_lowpass(cutoff, self.samp_freq*1e9)
#ifi = np.array(signal.filtfilt(b,a,iorq[0]))
#qf = np.array(signal.filtfilt(b,a,iorq[1]))
iqf = [signal.filtfilt(b,a,iorq[i]) for i in np.arange(len(iorq))]
return iqf
def box(self, iorq, ti, tf):
si = int(self.samp_freq * ti)
sf = int(self.samp_freq * tf)
bxa = [sum(iorq[i][si:sf])/(sf-si) for i in np.arange(len(iorq))]
return bxa
def gates_and_curves(self, data, freq, pulse, roi):
"""return iq values of rois and prepare plottable curves for iq"""
times = []
times.append(('aviq', time.time()))
iq = self.averageiq(data,freq*1e-9,*pulse)
times.append(('filtro', time.time()))
iqf = self.filtro(iq,self.bw_cutoff)
m = len(iqf[0]) // self.ndecimate
times.append(('iqdec', time.time()))
iqd = np.average(np.resize(iqf, (2, m, self.ndecimate)), axis=2)
t_axis = np.arange(m) * self.ndecimate / self.samp_freq
pulsig = np.abs(data[0][0])
times.append(('pulsig', time.time()))
pulsig = np.average(np.resize(pulsig, (m, self.ndecimate)), axis=1)
self.curves = (t_axis, iqd[0], iqd[1], pulsig)
#print(times)
return [self.box(iqf,*r) for r in roi]
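The decimation step in gates_and_curves() above reshapes the filtered trace into blocks of ndecimate samples and averages each block; a minimal standalone illustration:
import numpy as np

ndecimate = 50
sig = np.arange(1000, dtype=float)
m = len(sig) // ndecimate
decimated = np.average(np.resize(sig, (m, ndecimate)), axis=1)
print(decimated[:3])   # block means: [ 24.5  74.5 124.5]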

Some files were not shown because too many files have changed in this diff.