<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
"http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xml:lang="en" xmlns="http://www.w3.org/1999/xhtml">
<head>
<title>areaDetector: EPICS Area Detector Support</title>
<meta content="text/html; charset=ISO-8859-1" http-equiv="Content-Type" />
</head>
<body>
<div style="text-align: center">
<h1>
areaDetector: EPICS Area Detector Support</h1>
<h2>
July 30, 2009</h2>
<h2>
Mark Rivers</h2>
<h2>
University of Chicago</h2>
</div>
<h2>
Contents</h2>
<ul>
<li><a href="#Overview">Overview</a></li>
<li><a href="#Architecture">Architecture</a></li>
<li><a href="#Implementation_details">Implementation details</a>
<ul>
<li><a href="#asynPortDriver">asynPortDriver</a></li>
<li><a href="#NDArray">NDArray</a></li>
<li><a href="#NDArrayPool">NDArrayPool</a></li>
<li><a href="#asynNDArrayDriver">asynNDArrayDriver</a></li>
<li><a href="#ADDriver">ADDriver</a></li>
<li><a href="#MEDM_screens">MEDM screens</a></li>
</ul>
</li>
<li><a href="pluginDoc.html">Plugins</a>
<ul>
<li><a href="pluginDoc.html#NDPluginDriver">NDPluginDriver</a></li>
<li><a href="NDPluginStdArrays.html">NDPluginStdArrays</a></li>
<li><a href="NDPluginFile.html">NDPluginFile</a></li>
<li><a href="NDPluginROI.html">NDPluginROI</a></li>
<li><a href="NDPluginColorConvert.html">NDPluginColorConvert</a></li>
</ul>
</li>
<li><a href="#Detector_drivers">Detector drivers</a>
<ul>
<li><a href="simDetectorDoc.html">Simulation detector driver</a></li>
<li><a href="prosilicaDoc.html">Prosilica driver</a></li>
<li><a href="pilatusDoc.html">Pilatus driver</a></li>
<li><a href="adscDoc.html">ADSC driver</a></li>
<li><a href="RoperDoc.html">Roper driver</a></li>
<li><a href="MarCCDDoc.html">MarCCD driver</a></li>
<li><a href="Mar345Doc.html">mar345 driver</a></li>
<li><a href="FirewireWinDoc.html">Firewire Windows driver</a></li>
<li><a href="http://controls.diamond.ac.uk/downloads/support/firewireDCAM/index.html">
Firewire Linux driver</a></li>
<li><a href="PerkinElmerDoc.html">Perkin-Elmer flat panel driver</a></li>
</ul>
</li>
<li><a href="areaDetectorViewers.html">Viewers</a>
<ul>
<li><a href="areaDetectorViewers.html#ImageJViewer">ImageJ Viewer</a></li>
<li><a href="areaDetectorViewers.html#IDLViewer">IDL Viewer</a></li>
</ul>
</li>
</ul>
<p>
&nbsp;</p>
<h2 id="Overview">
Overview</h2>
<p>
The areaDetector module provides a general-purpose interface for area (2-D) detectors
in <a href="http://www.aps.anl.gov/epics">EPICS</a>. It is intended to be used with
a wide variety of detectors and cameras, ranging from high frame rate CCD and CMOS
cameras, through pixel-array detectors such as the Pilatus, to large format detectors
like the MAR-345 online imaging plate.</p>
<p>
The goals of this module are:
</p>
<ul>
<li>Minimize the amount of code that needs to be written to implement a new detector.</li>
<li>Provide a standard interface defining the functions and parameters that a detector
driver must support.</li>
<li>Provide a set of base EPICS records that will be present for every detector using
this module. This allows the use of generic EPICS clients for displaying images
and controlling cameras and detectors.</li>
<li>Allow easy extensibility to take advantage of detector-specific features beyond
the standard parameters.</li>
<li>Provide high performance. Applications can be written to get the detector image data
through EPICS, but an interface is also available to receive the detector data at
a lower level for very high performance.</li>
<li>Provide a mechanism for device-independent real-time data analysis such as regions-of-interest
and statistics.</li>
<li>Provide detector drivers for commonly used detectors in synchrotron applications.
These include Prosilica GigE video cameras, IEEE 1394 (Firewire) cameras, MAR-CCD
x-ray detectors, MAR-345 online imaging plate detectors, the Pilatus pixel-array
detector, Roper Scientific CCD cameras, and the Perkin-Elmer amorphous silicon detector.</li>
</ul>
<p>
&nbsp;</p>
<h2 id="Architecture">
Architecture</h2>
<p>
The architecture of the areaDetector module is shown below.</p>
<p style="text-align: center">
<img alt="areaDetectorArchitecture.png" src="areaDetectorArchitecture.png" /></p>
<p>
From the bottom to the top this architecture consists of the following:</p>
<ul>
<li>Layer 1. This is the layer that allows user-written code to communicate with the
hardware. It is usually provided by the detector vendor. It may consist of a library
or DLL, a socket protocol to a driver, a Microsoft COM interface, etc.</li>
<li>Layer 2. This is the driver that is written for the areaDetector application to
control a particular detector. It is written in C++ and inherits from the ADDriver
class. It uses the standard asyn interfaces for control and status information.
Each time it receives a new data array it can pass it as an NDArray object to all
Layer 3 clients that have registered for callbacks. This is the only code that needs
to be written to implement a new detector. Existing drivers range from 800 to 1800
lines of code.</li>
<li>Layer 3. Code running at this level is called a "plug-in". This code registers
with a driver for a callback whenever there is a new data array. The existing plugins
implement file saving (NDPluginFile), region-of-interest (ROI) calculations (NDPluginROI),
color mode conversion (NDPluginColorConvert), and conversion of detector data to
standard EPICS array types for use by Channel Access clients (NDPluginStdArrays).
Plugins are normally written in C++ and inherit from NDPluginDriver. Existing plugins
range from 300 to 800 lines of code.</li>
<li>Layer 4. This is standard asyn device support that comes with the EPICS asyn module.</li>
<li>Layer 5. These are standard EPICS records, and EPICS database (template) files
that define records to communicate with drivers at Layer 2 and plugins at Layer
3.</li>
<li>Layer 6. These are EPICS channel access clients, such as MEDM that communicate
with the records at Layer 5. areaDetector includes two client applications that
can display images using EPICS waveform and other records communicating with the
NDPluginStdArrays plugin at Layer 3. One of these clients is an ImageJ plugin, and
the other is a freely runnable IDL application.</li>
</ul>
<p>
The code in Layers 1-3 is essentially independent of EPICS. There are only two EPICS
dependencies in this code.
</p>
<ol>
<li><a href="http://www.aps.anl.gov/epics/base/R3-14/10-docs/AppDevGuide.pdf">libCom</a>.
libCom from EPICS base provides operating-system independent functions for threads,
mutexes, etc.</li>
<li><a href="http://www.aps.anl.gov/epics/modules/soft/asyn">asyn</a>. asyn is a module
that provides interthread messaging services, including queueing and callbacks.</li>
</ol>
<p>
In particular, it is possible to eliminate Layers 4-6 in the architecture shown
above, provided there is a program such as the high-performance GUI shown at
Layer 3. This means that it is not necessary to run an EPICS IOC or to use EPICS
Channel Access when using the drivers and plugins at Layers 2 and 3.
</p>
<p>
The plugin architecture is very powerful, because new plugins can be written for
application-specific purposes. For example, a plugin could be written to analyze
images and find the center of the beam, and such a plugin would then work with any
detector driver. Plugins are also powerful because they can be reconfigured at run-time.
For example, the NDPluginStdArrays plugin can switch from getting its array data from a
detector driver to an NDPluginROI plugin. It will then switch from displaying
the entire detector to whatever sub-region the ROI plugin has selected. Any Channel
Access clients connected to the NDPluginStdArrays plugin will automatically switch
to displaying this sub-region. Similarly, the NDPluginFile plugin can be switched
at run-time from saving the entire image to saving a selected ROI, just by changing
its input source. Plugins can be used to form an image processing pipeline, for
example with a detector providing data to a color convert plugin, which feeds an
ROI plugin, which feeds a file saving plugin. Each plugin can run in its own thread,
and hence in its own core on a modern multi-core CPU.
</p>
<p>
The use of plugins is optional; it is only plugins that require the driver to
make callbacks with image data. If no plugins are being used then EPICS can
be used simply to control the detector, without accessing the data itself. This
is most useful when the vendor provides an API that can save the data
to a file and an application to display the images.
</p>
<p>
What follows is a detailed description of the software, working from the bottom
up. Most of the code is object-oriented and written in C++.
</p>
<h2 id="Implementation_details">
Implementation details</h2>
<p>
The areaDetector module depends heavily on <a href="http://www.aps.anl.gov/epics/modules/soft/asyn">
asyn</a>. It is the software that is used for interthread communication, using
the standard asyn interfaces (e.g. asynInt32, asynOctet, etc.), and callbacks. In
order to minimize the amount of redundant code in drivers, areaDetector has been
implemented using C++ classes. The base classes, from which drivers and plugins
are derived, take care of many of the details of asyn and other common code.
</p>
<h3 id="asynPortDriver">
asynPortDriver</h3>
<p>
Detector drivers and plugins are asyn port drivers, meaning that they implement
one or more of the standard asyn interfaces. They register themselves as interrupt
sources, so that they do callbacks to registered asyn clients when values change.
They inherit from the <a href="http://www.aps.anl.gov/epics/modules/soft/asyn/asynPortDriver.html">
asynPortDriver base C++ class</a> that is provided in the asyn module. That base
class handles all of the details of registering the port driver, registering the
supported interfaces, and registering the required interrupt sources. The <a href="http://www.aps.anl.gov/epics/modules/soft/asyn/asynDoxygenHTML/class_asyn_port_driver.html">
asynPortDriver class documentation</a> describes this class in detail.
</p>
<h3 id="NDArray">
NDArray</h3>
<p>
The NDArray (N-Dimensional array) is the class that is used for passing detector
data from drivers to plugins. An NDArray is a general purpose class for handling
array data. An NDArray object is self-describing, meaning it contains enough information
to describe the data itself. It can optionally contain "attributes" (class NDAttribute)
which contain meta-data describing how the data was collected, etc.
</p>
<p>
An NDArray can have up to ND_ARRAY_MAX_DIMS dimensions, currently 10. A fixed maximum
number of dimensions significantly simplifies the code compared to supporting an unlimited
number of dimensions. Each dimension of the array is described by an <a href="areaDetectorDoxygenHTML/struct_n_d_dimension.html">
NDDimension structure</a>. The <a href="areaDetectorDoxygenHTML/class_n_d_array.html">
NDArray class documentation </a>describes this class in detail.
</p>
<h3 id="H3_2">
NDAttribute</h3>
<p>
The NDAttribute is a class for linking metadata to an NDArray. An NDAttribute has
a name, description, data type, value, source type and source information. There
are methods to set and get the information for an attribute, and NDArray provides
methods to add and delete attributes from an NDArray object. The <a href="areaDetectorDoxygenHTML/class_n_d_attribute.html">
NDAttribute class documentation</a> describes this class in detail.
</p>
<h3 id="NDArrayPool">
NDArrayPool</h3>
<p>
The NDArrayPool class manages a free list (pool) of NDArray objects. Drivers allocate
NDArray objects from the pool, and pass these objects to plugins. Plugins increase
the reference count on the object when they place the object on their queue, and
decrease the reference count when they are done processing the array. When the reference
count reaches 0 again the NDArray object is placed back on the free list. This mechanism
minimizes the copying of array data in plugins. The <a href="areaDetectorDoxygenHTML/class_n_d_array_pool.html">
NDArrayPool class documentation </a>describes this class in detail.
</p>
<h3 id="asynNDArrayDriver">
asynNDArrayDriver</h3>
<p>
asynNDArrayDriver inherits from asynPortDriver. It implements the asynGenericPointer
functions for NDArray objects. This is the class from which both plugins and area
detector drivers are indirectly derived. The <a href="areaDetectorDoxygenHTML/classasyn_n_d_array_driver.html">
asynNDArrayDriver class documentation </a>describes this class in detail.
</p>
<p>
The file <a href="areaDetectorDoxygenHTML/asyn_n_d_array_driver_8h.html">asynNDArrayDriver.h</a>
defines a number of enumerations, including NDStdDriverParams_t, which are the parameters
that all NDArray drivers and plugins should implement if possible. These parameters
are defined by enum values with an associated asyn interface, and access (read-only
or read-write). The EPICS database ADBase.template provides access to these standard
driver parameters. The following table lists the standard driver parameters. The
columns are defined as follows:
</p>
<ul>
<li><b>Enum name:</b> The name of the enum value for this parameter in asynNDArrayDriver.h.
There are several EPICS records in ADBase.template that do not have corresponding
enum fields, and these are indicated as Not Applicable (N/A).</li>
<li><b>asyn interface:</b> The asyn interface used to pass this parameter to the driver.</li>
<li><b>Access:</b> Read-write (r/w) or read-only (r/o).</li>
<li><b>drvUser string:</b> The string used to look up the parameter in the driver
through the drvUser interface. This string is used in the EPICS database file for
generic asyn device support to associate a record with a particular parameter.</li>
<li><b>EPICS record name:</b> The name of the record in ADBase.template. Each record
name begins with the two macro parameters $(P) and $(R). In the case of read/write
parameters there are normally two records, one for writing the value, and a second,
ending in _RBV, that contains the actual value (Read Back Value) of the parameter.</li>
<li><b>EPICS record type:</b> The record type of the record. Waveform records are
used to hold long strings, with length (NELM) = 256 bytes and EPICS data type (FTVL)
= UCHAR. This removes the 40-character restriction on string lengths that arises if
an EPICS "string" PV is used. MEDM allows one to edit and display such records correctly.
EPICS clients will typically need to convert such long strings from a string to
an integer or byte array before sending the path name to EPICS. This is easy to
do in clients like SPEC, Matlab, and IDL.</li>
</ul>
<p>
Note that for parameters whose values are defined by enum values (e.g NDDataType,
NDColorMode, etc.), drivers can use a different set of enum values for these parameters.
They can override the enum menu in ADBase.template with driver-specific choices
by loading a driver-specific template file that redefines that record field after
loading ADBase.template.
</p>
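<p>
For example, a driver-specific template loaded after ADBase.template could redefine
the choice fields of a record (the choice strings and values below are hypothetical;
later field definitions for the same record override the defaults):
</p>

```
# Hypothetical driver-specific override, loaded after ADBase.template
record(mbbo, "$(P)$(R)DataType")
{
    field(ZRST, "UInt16")
    field(ZRVL, "1")
    field(ONST, "UInt32")
    field(ONVL, "2")
}
```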
<table border="1" cellpadding="2" cellspacing="2" style="text-align: left">
<tbody>
<tr>
<td align="center" colspan="7">
<b>Parameter Definitions in asynNDArrayDriver.h and EPICS Record Definitions in ADBase.template
(file-related records are in NDFile.template)</b></td>
</tr>
<tr>
<th>
Enum name</th>
<th>
asyn interface</th>
<th>
Access</th>
<th>
Description</th>
<th>
drvUser string</th>
<th>
EPICS record name</th>
<th>
EPICS record type</th>
</tr>
<tr>
<td align="center" colspan="7">
<b>Information about the asyn port</b></td>
</tr>
<tr>
<td>
NDPortNameSelf</td>
<td>
asynOctet</td>
<td>
r/o</td>
<td>
asyn port name</td>
<td>
PORT_NAME_SELF</td>
<td>
$(P)$(R)PortName_RBV</td>
<td>
stringin</td>
</tr>
<tr>
<td align="center" colspan="7">
<b>Data type</b></td>
</tr>
<tr>
<td>
NDDataType</td>
<td>
asynInt32</td>
<td>
r/w</td>
<td>
Data type (NDDataType_t).</td>
<td>
DATA_TYPE</td>
<td>
$(P)$(R)DataType<br />
$(P)$(R)DataType_RBV</td>
<td>
mbbo<br />
mbbi</td>
</tr>
<tr>
<td align="center" colspan="7">
<b>Color mode</b></td>
</tr>
<tr>
<td>
NDColorMode</td>
<td>
asynInt32</td>
<td>
r/w</td>
<td>
Color mode (NDColorMode_t).</td>
<td>
COLOR_MODE</td>
<td>
$(P)$(R)ColorMode<br />
$(P)$(R)ColorMode_RBV</td>
<td>
mbbo<br />
mbbi</td>
</tr>
<tr>
<td align="center" colspan="7">
<b>Actual dimensions of array data</b></td>
</tr>
<tr>
<td>
NDArraySizeX</td>
<td>
asynInt32</td>
<td>
r/o</td>
<td>
Size of the array data in the X direction</td>
<td>
ARRAY_SIZE_X</td>
<td>
$(P)$(R)ArraySizeX_RBV</td>
<td>
longin</td>
</tr>
<tr>
<td>
NDArraySizeY</td>
<td>
asynInt32</td>
<td>
r/o</td>
<td>
Size of the array data in the Y direction</td>
<td>
ARRAY_SIZE_Y</td>
<td>
$(P)$(R)ArraySizeY_RBV</td>
<td>
longin</td>
</tr>
<tr>
<td>
NDArraySizeZ</td>
<td>
asynInt32</td>
<td>
r/o</td>
<td>
Size of the array data in the Z direction</td>
<td>
ARRAY_SIZE_Z</td>
<td>
$(P)$(R)ArraySizeZ_RBV</td>
<td>
longin</td>
</tr>
<tr>
<td align="center" colspan="7">
<b>File saving parameters (records are defined in NDFile.template)</b></td>
</tr>
<tr>
<td>
NDFilePath</td>
<td>
asynOctet</td>
<td>
r/w</td>
<td>
File path</td>
<td>
FILE_PATH</td>
<td>
$(P)$(R)FilePath<br />
$(P)$(R)FilePath_RBV</td>
<td>
waveform<br />
waveform</td>
</tr>
<tr>
<td>
NDFileName</td>
<td>
asynOctet</td>
<td>
r/w</td>
<td>
File name</td>
<td>
FILE_NAME</td>
<td>
$(P)$(R)FileName<br />
$(P)$(R)FileName_RBV</td>
<td>
waveform<br />
waveform</td>
</tr>
<tr>
<td>
NDFileNumber</td>
<td>
asynInt32</td>
<td>
r/w</td>
<td>
File number</td>
<td>
FILE_NUMBER</td>
<td>
$(P)$(R)FileNumber<br />
$(P)$(R)FileNumber_RBV</td>
<td>
longout<br />
longin</td>
</tr>
<tr>
<td>
NDFileTemplate</td>
<td>
asynOctet</td>
<td>
r/w</td>
<td>
Format string for constructing NDFullFileName from NDFilePath, NDFileName, and NDFileNumber.
The final file name (which is placed in NDFullFileName) is created with the following
code:
<pre>epicsSnprintf(FullFilename, sizeof(FullFilename),
              FileFormat, FilePath, Filename, FileNumber);
</pre>
FilePath, Filename, FileNumber are converted in that order with FileFormat. An example
file format is <code>"%s%s%4.4d.tif"</code>. The first %s converts the FilePath,
followed immediately by another %s for Filename. FileNumber is formatted with %4.4d,
which results in a fixed field width of 4 digits, with leading zeros as required.
Finally, the .tif extension is added to the file name. This mechanism for creating
file names is very flexible. Other characters, such as _ can be put in Filename
or FileFormat as desired. If one does not want to have FileNumber in the file name
at all, then just omit the %d format specifier from FileFormat. If the client wishes
to construct the complete file name itself, then it can just put that file name
into NDFileTemplate with no format specifiers at all, in which case NDFilePath, NDFileName,
and NDFileNumber will be ignored.</td>
<td>
FILE_TEMPLATE</td>
<td>
$(P)$(R)FileTemplate<br />
$(P)$(R)FileTemplate_RBV</td>
<td>
waveform<br />
waveform</td>
</tr>
<tr>
<td>
NDFullFileName</td>
<td>
asynOctet</td>
<td>
r/o</td>
<td>
Full file name constructed using the algorithm described in NDFileTemplate</td>
<td>
FULL_FILE_NAME</td>
<td>
$(P)$(R)FullFileName_RBV</td>
<td>
waveform</td>
</tr>
<tr>
<td>
NDAutoIncrement</td>
<td>
asynInt32</td>
<td>
r/w</td>
<td>
Auto-increment flag. Controls whether FileNumber is automatically incremented by
1 each time a file is saved (0=No, 1=Yes)</td>
<td>
AUTO_INCREMENT</td>
<td>
$(P)$(R)AutoIncrement<br />
$(P)$(R)AutoIncrement_RBV</td>
<td>
bo<br />
bi</td>
</tr>
<tr>
<td>
NDAutoSave</td>
<td>
asynInt32</td>
<td>
r/w</td>
<td>
Auto-save flag (0=No, 1=Yes) controlling whether a file is automatically saved each
time acquisition completes.</td>
<td>
AUTO_SAVE</td>
<td>
$(P)$(R)AutoSave<br />
$(P)$(R)AutoSave_RBV</td>
<td>
bo<br />
bi</td>
</tr>
<tr>
<td>
NDFileFormat</td>
<td>
asynInt32</td>
<td>
r/w</td>
<td>
File format. The format to write/read data in (e.g. TIFF, netCDF, etc.)</td>
<td>
FILE_FORMAT</td>
<td>
$(P)$(R)FileFormat<br />
$(P)$(R)FileFormat_RBV</td>
<td>
mbbo<br />
mbbi</td>
</tr>
<tr>
<td>
NDWriteFile</td>
<td>
asynInt32</td>
<td>
r/w</td>
<td>
Manually save the most recent image to a file when value=1</td>
<td>
WRITE_FILE</td>
<td>
$(P)$(R)WriteFile<br />
$(P)$(R)WriteFile_RBV</td>
<td>
busy<br />
bi</td>
</tr>
<tr>
<td>
NDReadFile</td>
<td>
asynInt32</td>
<td>
r/w</td>
<td>
Manually read a file when value=1</td>
<td>
READ_FILE</td>
<td>
$(P)$(R)ReadFile<br />
$(P)$(R)ReadFile_RBV</td>
<td>
busy<br />
bi</td>
</tr>
<tr>
<td>
NDFileWriteMode</td>
<td>
asynInt32</td>
<td>
r/w</td>
<td>
File saving mode (Single, Capture, Stream)(NDFileMode_t)</td>
<td>
WRITE_MODE</td>
<td>
$(P)$(R)FileWriteMode<br />
$(P)$(R)FileWriteMode_RBV</td>
<td>
mbbo<br />
mbbi</td>
</tr>
<tr>
<td>
NDFileCapture</td>
<td>
asynInt32</td>
<td>
r/w</td>
<td>
Start (1) or stop (0) file capture or streaming</td>
<td>
CAPTURE</td>
<td>
$(P)$(R)FileCapture<br />
$(P)$(R)FileCapture_RBV</td>
<td>
busy<br />
bi</td>
</tr>
<tr>
<td>
NDNumCapture</td>
<td>
asynInt32</td>
<td>
r/w</td>
<td>
Number of frames to acquire in capture or streaming mode</td>
<td>
NUM_CAPTURE</td>
<td>
$(P)$(R)FileNumCapture<br />
$(P)$(R)FileNumCapture_RBV</td>
<td>
longout<br />
longin</td>
</tr>
<tr>
<td>
NDNumCaptured</td>
<td>
asynInt32</td>
<td>
r/o</td>
<td>
Number of arrays acquired so far in capture or streaming mode</td>
<td>
NUM_CAPTURED</td>
<td>
$(P)$(R)FileNumCaptured_RBV</td>
<td>
longin</td>
</tr>
<tr>
<td>
NDArrayCounter</td>
<td>
asynInt32</td>
<td>
r/w</td>
<td>
Counter that increments by 1 each time an array is acquired. Can be reset by writing
a value to it.</td>
<td>
ARRAY_COUNTER</td>
<td>
$(P)$(R)ArrayCounter<br />
$(P)$(R)ArrayCounter_RBV</td>
<td>
longout<br />
longin</td>
</tr>
<tr>
<td align="center" colspan="7">
<b>Array data</b></td>
</tr>
<tr>
<td>
NDArrayCallbacks</td>
<td>
asynInt32</td>
<td>
r/w</td>
<td>
Controls whether the driver does callbacks with the array data to registered plugins.
0=No, 1=Yes. Setting this to 0 can reduce overhead in the case that the driver is
being used only to control the device, and not to make the data available to plugins
or to EPICS clients.</td>
<td>
ARRAY_CALLBACKS</td>
<td>
$(P)$(R)ArrayCallbacks<br />
$(P)$(R)ArrayCallbacks_RBV</td>
<td>
bo<br />
bi</td>
</tr>
<tr>
<td>
NDArrayData</td>
<td>
asynGenericPointer</td>
<td>
r/w</td>
<td>
The image data as an NDArray object</td>
<td>
NDARRAY_DATA</td>
<td>
N/A. EPICS access to image data is through NDStdArrays plugin.</td>
<td>
N/A</td>
</tr>
<tr>
<td align="center" colspan="7">
<b>Debugging control</b></td>
</tr>
<tr>
<td>
N/A</td>
<td>
N/A</td>
<td>
N/A</td>
<td>
asyn record to control debugging (asynTrace)</td>
<td>
N/A</td>
<td>
$(P)$(R)AsynIO</td>
<td>
asyn</td>
</tr>
</tbody>
</table>
<h3 id="ADDriver">
ADDriver</h3>
<p>
ADDriver inherits from asynNDArrayDriver. This is the class from which area detector
drivers are directly derived. The <a href="areaDetectorDoxygenHTML/class_a_d_driver.html">
ADDriver class documentation </a>describes this class in detail.
</p>
<p>
The file <a href="areaDetectorDoxygenHTML/_a_d_driver_8h.html">ADDriver.h</a> defines
a number of enumerations, including ADStdDriverParams_t, which are the parameters
that all areaDetector drivers should implement if possible.
</p>
<table border="1" cellpadding="2" cellspacing="2" style="text-align: left">
<tbody>
<tr>
<td align="center" colspan="7">
<b>Parameter Definitions in ADDriver.h and EPICS Record Definitions in ADBase.template</b></td>
</tr>
<tr>
<th>
Enum name</th>
<th>
asyn interface</th>
<th>
Access</th>
<th>
Description</th>
<th>
drvUser string</th>
<th>
EPICS record name</th>
<th>
EPICS record type</th>
</tr>
<tr>
<td align="center" colspan="7" style="height: 25px">
<b>Information about the detector</b></td>
</tr>
<tr>
<td>
ADManufacturer</td>
<td>
asynOctet</td>
<td>
r/o</td>
<td>
Detector manufacturer name</td>
<td>
MANUFACTURER</td>
<td>
$(P)$(R)Manufacturer_RBV</td>
<td>
stringin</td>
</tr>
<tr>
<td>
ADModel</td>
<td>
asynOctet</td>
<td>
r/o</td>
<td>
Detector model name</td>
<td>
MODEL</td>
<td>
$(P)$(R)Model_RBV</td>
<td>
stringin</td>
</tr>
<tr>
<td>
ADMaxSizeX</td>
<td>
asynInt32</td>
<td>
r/o</td>
<td>
Maximum (sensor) size in the X direction</td>
<td>
MAX_SIZE_X</td>
<td>
$(P)$(R)MaxSizeX_RBV</td>
<td>
longin</td>
</tr>
<tr>
<td>
ADMaxSizeY</td>
<td>
asynInt32</td>
<td>
r/o</td>
<td>
Maximum (sensor) size in the Y direction</td>
<td>
MAX_SIZE_Y</td>
<td>
$(P)$(R)MaxSizeY_RBV</td>
<td>
longin</td>
</tr>
<tr>
<td>
ADTemperature</td>
<td>
asynFloat64</td>
<td>
r/w</td>
<td>
Detector temperature</td>
<td>
TEMPERATURE</td>
<td>
$(P)$(R)Temperature<br />
$(P)$(R)Temperature_RBV<br />
</td>
<td>
ao<br />
ai</td>
</tr>
<tr>
<td align="center" colspan="7">
<b>Detector readout control including gain, binning, region start and size, reversal</b></td>
</tr>
<tr>
<td>
ADGain</td>
<td>
asynFloat64</td>
<td>
r/w</td>
<td>
Detector gain</td>
<td>
GAIN</td>
<td>
$(P)$(R)Gain<br />
$(P)$(R)Gain_RBV</td>
<td>
ao<br />
ai</td>
</tr>
<tr>
<td>
ADBinX</td>
<td>
asynInt32</td>
<td>
r/w</td>
<td>
Binning in the X direction</td>
<td>
BIN_X</td>
<td>
$(P)$(R)BinX<br />
$(P)$(R)BinX_RBV</td>
<td>
longout<br />
longin</td>
</tr>
<tr>
<td>
ADBinY</td>
<td>
asynInt32</td>
<td>
r/w</td>
<td>
Binning in the Y direction</td>
<td>
BIN_Y</td>
<td>
$(P)$(R)BinY<br />
$(P)$(R)BinY_RBV</td>
<td>
longout<br />
longin</td>
</tr>
<tr>
<td>
ADMinX</td>
<td>
asynInt32</td>
<td>
r/w</td>
<td>
First pixel to read in the X direction.
<br />
0 is the first pixel on the detector.</td>
<td>
MIN_X</td>
<td>
$(P)$(R)MinX<br />
$(P)$(R)MinX_RBV</td>
<td>
longout<br />
longin</td>
</tr>
<tr>
<td>
ADMinY</td>
<td>
asynInt32</td>
<td>
r/w</td>
<td>
First pixel to read in the Y direction.<br />
0 is the first pixel on the detector.</td>
<td>
MIN_Y</td>
<td>
$(P)$(R)MinY<br />
$(P)$(R)MinY_RBV</td>
<td>
longout<br />
longin</td>
</tr>
<tr>
<td>
ADSizeX</td>
<td>
asynInt32</td>
<td>
r/w</td>
<td>
Size of the region to read in the X direction</td>
<td>
SIZE_X</td>
<td>
$(P)$(R)SizeX<br />
$(P)$(R)SizeX_RBV</td>
<td>
longout<br />
longin</td>
</tr>
<tr>
<td>
ADSizeY</td>
<td>
asynInt32</td>
<td>
r/w</td>
<td>
Size of the region to read in the Y direction</td>
<td>
SIZE_Y</td>
<td>
$(P)$(R)SizeY<br />
$(P)$(R)SizeY_RBV</td>
<td>
longout<br />
longin</td>
</tr>
<tr>
<td>
ADReverseX</td>
<td>
asynInt32</td>
<td>
r/w</td>
<td>
Reverse image in the X direction<br />
(0=No, 1=Yes)</td>
<td>
REVERSE_X</td>
<td>
$(P)$(R)ReverseX<br />
$(P)$(R)ReverseX_RBV</td>
<td>
longout<br />
longin</td>
</tr>
<tr>
<td>
ADReverseY</td>
<td>
asynInt32</td>
<td>
r/w</td>
<td>
Reverse image in the Y direction<br />
(0=No, 1=Yes)</td>
<td>
REVERSE_Y</td>
<td>
$(P)$(R)ReverseY<br />
$(P)$(R)ReverseY_RBV</td>
<td>
longout<br />
longin</td>
</tr>
<tr>
<td align="center" colspan="7">
<b>Image and trigger modes</b></td>
</tr>
<tr>
<td>
ADImageMode</td>
<td>
asynInt32</td>
<td>
r/w</td>
<td>
Image mode (ADImageMode_t).</td>
<td>
IMAGE_MODE</td>
<td>
$(P)$(R)ImageMode<br />
$(P)$(R)ImageMode_RBV</td>
<td>
mbbo<br />
mbbi</td>
</tr>
<tr>
<td>
ADTriggerMode</td>
<td>
asynInt32</td>
<td>
r/w</td>
<td>
Trigger mode (ADTriggerMode_t).</td>
<td>
TRIGGER_MODE</td>
<td>
$(P)$(R)TriggerMode<br />
$(P)$(R)TriggerMode_RBV</td>
<td>
mbbo<br />
mbbi</td>
</tr>
<tr>
<td align="center" colspan="7">
<b>Frame type</b></td>
</tr>
<tr>
<td>
ADFrameType</td>
<td>
asynInt32</td>
<td>
r/w</td>
<td>
Frame type (ADFrameType_t).</td>
<td>
FRAME_TYPE</td>
<td>
$(P)$(R)FrameType<br />
$(P)$(R)FrameType_RBV</td>
<td>
mbbo<br />
mbbi</td>
</tr>
<tr>
<td align="center" colspan="7">
<b>Acquisition time and period</b></td>
</tr>
<tr>
<td>
ADAcquireTime</td>
<td>
asynFloat64</td>
<td>
r/w</td>
<td>
Acquisition time per image</td>
<td>
ACQ_TIME</td>
<td>
$(P)$(R)AcquireTime<br />
$(P)$(R)AcquireTime_RBV</td>
<td>
ao<br />
ai</td>
</tr>
<tr>
<td>
ADAcquirePeriod</td>
<td>
asynFloat64</td>
<td>
r/w</td>
<td>
Acquisition period between images</td>
<td>
ACQ_PERIOD</td>
<td>
$(P)$(R)AcquirePeriod<br />
$(P)$(R)AcquirePeriod_RBV</td>
<td>
ao<br />
ai</td>
</tr>
<tr>
<td align="center" colspan="7">
<b>Number of exposures and number of images</b></td>
</tr>
<tr>
<td>
ADNumExposures</td>
<td>
asynInt32</td>
<td>
r/w</td>
<td>
Number of exposures per image to acquire</td>
<td>
NEXPOSURES</td>
<td>
$(P)$(R)NumExposures<br />
$(P)$(R)NumExposures_RBV</td>
<td>
longout<br />
longin</td>
</tr>
<tr>
<td>
ADNumImages</td>
<td>
asynInt32</td>
<td>
r/w</td>
<td>
Number of images to acquire in one acquisition sequence</td>
<td>
NIMAGES</td>
<td>
$(P)$(R)NumImages<br />
$(P)$(R)NumImages_RBV</td>
<td>
longout<br />
longin</td>
</tr>
<tr>
<td align="center" colspan="7">
<b>Acquisition control</b></td>
</tr>
<tr>
<td>
ADAcquire</td>
<td>
asynInt32</td>
<td>
r/w</td>
<td>
Start (1) or stop (0) image acquisition. This record is linked to an EPICS busy
record that does not process its forward link until acquisition is complete. Clients
should write 1 to the Acquire record to start acquisition, and wait for Acquire
to go to 0 to know that acquisition is complete.</td>
<td>
ACQUIRE</td>
<td>
$(P)$(R)Acquire</td>
<td>
bo</td>
</tr>
<tr>
<td align="center" colspan="7">
<b>Status information</b></td>
</tr>
<tr>
<td>
ADStatus</td>
<td>
asynInt32</td>
<td>
r/o</td>
<td>
Acquisition status (ADStatus_t)</td>
<td>
STATUS</td>
<td>
$(P)$(R)DetectorState_RBV</td>
<td>
mbbi</td>
</tr>
<tr>
<td>
ADStatusMessage</td>
<td>
asynOctet</td>
<td>
r/o</td>
<td>
Status message string</td>
<td>
STATUS_MESSAGE</td>
<td>
$(P)$(R)StatusMessage_RBV</td>
<td>
waveform</td>
</tr>
<tr>
<td>
ADStringToServer</td>
<td>
asynOctet</td>
<td>
r/o</td>
<td>
String from driver to string-based vendor server</td>
<td>
STRING_TO_SERVER</td>
<td>
$(P)$(R)StringToServer_RBV</td>
<td>
waveform</td>
</tr>
<tr>
<td>
ADStringFromServer</td>
<td>
asynOctet</td>
<td>
r/o</td>
<td>
String from string-based vendor server to driver</td>
<td>
STRING_FROM_SERVER</td>
<td>
$(P)$(R)StringFromServer_RBV</td>
<td>
waveform</td>
</tr>
<tr>
<td>
ADNumExposuresCounter</td>
<td>
asynInt32</td>
<td>
r/o</td>
<td>
Counter that increments by 1 each time an exposure is acquired for the current image.
Driver resets to 0 when acquisition is started.</td>
<td>
NUM_EXPOSURES_COUNTER</td>
<td>
$(P)$(R)NumExposuresCounter_RBV</td>
<td>
longin</td>
</tr>
<tr>
<td>
ADNumImagesCounter</td>
<td>
asynInt32</td>
<td>
r/o</td>
<td>
Counter that increments by 1 each time an image is acquired in the current acquisition
sequence. Driver resets to 0 when acquisition is started. Drivers can use this as
the loop counter when ADImageMode=ADImageMultiple.</td>
<td>
NUM_IMAGES_COUNTER</td>
<td>
$(P)$(R)NumImagesCounter_RBV</td>
<td>
longin</td>
</tr>
<tr>
<td>
N/A</td>
<td>
N/A</td>
<td>
r/o</td>
<td>
Rate (Hz) at which ArrayCounter is incrementing. Computed in database.</td>
<td>
N/A</td>
<td>
$(P)$(R)ImageRate_RBV</td>
<td>
calc</td>
</tr>
<tr>
<td>
ADTimeRemaining</td>
<td>
asynFloat64</td>
<td>
r/o</td>
<td>
Time remaining for current image. Drivers should update this value if they are doing
the exposure timing internally, rather than in the detector hardware.</td>
<td>
TIME_REMAINING</td>
<td>
$(P)$(R)TimeRemaining_RBV</td>
<td>
ai</td>
</tr>
<tr>
<td>
ADReadStatus</td>
<td>
asynInt32</td>
<td>
r/w</td>
<td>
Write a 1 to this parameter to force a read of the detector status. Detector drivers
normally read the status as required, so this is usually not necessary, but there
may be some circumstances under which forcing a status read may be needed.</td>
<td>
READ_STATUS</td>
<td>
$(P)$(R)ReadStatus</td>
<td>
bo</td>
</tr>
<tr>
<td align="center" colspan="7">
<b>Shutter control</b></td>
</tr>
<tr>
<td>
ADShutterMode</td>
<td>
asynInt32</td>
<td>
r/w</td>
<td>
Shutter mode (None, detector-controlled or EPICS-controlled) (ADShutterMode_t)</td>
<td>
SHUTTER_MODE</td>
<td>
$(P)$(R)ShutterMode<br />
$(P)$(R)ShutterMode_RBV</td>
<td>
mbbo<br />
mbbi</td>
</tr>
<tr>
<td>
ADShutterControl</td>
<td>
asynInt32</td>
<td>
r/w</td>
<td>
Shutter control for the selected (detector or EPICS) shutter (ADShutterStatus_t)</td>
<td>
SHUTTER_CONTROL</td>
<td>
$(P)$(R)ShutterControl<br />
$(P)$(R)ShutterControl_RBV</td>
<td>
bo<br />
bi</td>
</tr>
<tr>
<td>
ADShutterControlEPICS</td>
<td>
asynInt32</td>
<td>
r/w</td>
<td>
This record processes when it receives a callback from the driver to open or close
the EPICS shutter. It triggers the records below to actually open or close the EPICS
shutter.</td>
<td>
SHUTTER_CONTROL_EPICS</td>
<td>
$(P)$(R)ShutterControlEPICS</td>
<td>
bi</td>
</tr>
<tr>
<td>
N/A</td>
<td>
N/A</td>
<td>
r/w</td>
<td>
This record writes its OVAL value to the PV in its OUT link when the EPICS shutter
is told to open. The OCAL (and hence OVAL) and OUT fields are user-configurable,
so any EPICS-controllable shutter can be used.</td>
<td>
N/A</td>
<td>
$(P)$(R)ShutterOpenEPICS</td>
<td>
calcout</td>
</tr>
<tr>
<td>
N/A</td>
<td>
N/A</td>
<td>
r/w</td>
<td>
This record writes its OVAL value to the PV in its OUT link when the EPICS shutter
is told to close. The OCAL (and hence OVAL) and OUT fields are user-configurable,
so any EPICS-controllable shutter can be used.</td>
<td>
N/A</td>
<td>
$(P)$(R)ShutterCloseEPICS</td>
<td>
calcout</td>
</tr>
<tr>
<td>
ADShutterStatus</td>
<td>
asynInt32</td>
<td>
r/o</td>
<td>
Status of the detector-controlled shutter (ADShutterStatus_t)</td>
<td>
SHUTTER_STATUS</td>
<td>
$(P)$(R)ShutterStatus_RBV</td>
<td>
bi</td>
</tr>
<tr>
<td>
N/A</td>
<td>
N/A</td>
<td>
r/o</td>
<td>
Status of the EPICS-controlled shutter. This record should have its input link (INP)
set to a record that contains the open/close status information for the shutter.
The link should have the "CP" attribute, so this record processes when the input
changes. The ZRVL field should be set to the value of the input link when the shutter
is closed, and the ONVL field should be set to the value of the input link when
the shutter is open.</td>
<td>
N/A</td>
<td>
$(P)$(R)ShutterStatusEPICS_RBV</td>
<td>
mbbi</td>
</tr>
<tr>
<td>
ADShutterOpenDelay</td>
<td>
asynFloat64</td>
<td>
r/w</td>
<td>
Time (in seconds) required for the shutter to actually open</td>
<td>
SHUTTER_OPEN_DELAY</td>
<td>
$(P)$(R)ShutterOpenDelay<br />
$(P)$(R)ShutterOpenDelay_RBV</td>
<td>
ao<br />
ai</td>
</tr>
<tr>
<td>
ADShutterCloseDelay</td>
<td>
asynFloat64</td>
<td>
r/w</td>
<td>
Time (in seconds) required for the shutter to actually close</td>
<td>
SHUTTER_CLOSE_DELAY</td>
<td>
$(P)$(R)ShutterCloseDelay<br />
$(P)$(R)ShutterCloseDelay_RBV</td>
<td>
ao<br />
ai</td>
</tr>
</tbody>
</table>
<h2 id="MEDM_screens">
MEDM screens</h2>
<p>
The following MEDM screen provides access to the parameters in asynNDArrayDriver.h
and ADDriver.h through the records in ADBase.template. It is a top-level MEDM screen
that will work with any areaDetector driver. Note, however, that many drivers do not
implement all of these parameters, and most have detector-specific parameters that
are not shown on this screen, so detector-specific MEDM screens should generally be
created that display the EPICS PVs for the features each detector actually implements.
</p>
<div style="text-align: center">
<p>
<b>ADBase.adl</b></p>
<img alt="ADBase.png" src="ADBase.png" /></div>
<p>
The following MEDM screen provides access to the file-related parameters in
asynNDArrayDriver.h through the records in NDFile.template. This screen is for use
with detector drivers that directly implement file I/O.
</p>
<div style="text-align: center">
<p>
<b>NDFile.adl</b></p>
<img alt="NDFile.png" src="NDFile.png" /></div>
<p>
The following MEDM screen provides access to the EPICS shutter parameters in
ADDriver.h through the records in ADBase.template. This screen allows one to define
the EPICS PVs that open the shutter, close the shutter, and report the shutter
status, as well as the values written to the open and close drive PVs and the
status values corresponding to open and closed. Note that in many cases the same
PV is used for both open and close drive, but in some cases (e.g. the APS safety
shutters) different PVs are used for open and close.
</p>
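<p>
The actual records live in ADBase.template; the fragment below is only an illustrative
sketch of how the calcout and mbbi fields described in the table above fit together.
The target PV names (<code>13XX:shutterDrive</code>, <code>13XX:shutterStatus</code>) and
the field values are invented examples that a user would replace for their own shutter.
</p>

```
# Illustrative only -- the real records are defined in ADBase.template,
# and the linked PV names here are hypothetical examples.
record(calcout, "$(P)$(R)ShutterOpenEPICS") {
    field(OCAL, "1")                       # value written when told to open
    field(OUT,  "13XX:shutterDrive PP")    # user-configurable drive PV
}
record(calcout, "$(P)$(R)ShutterCloseEPICS") {
    field(OCAL, "0")                       # value written when told to close
    field(OUT,  "13XX:shutterDrive PP")    # often the same PV as for open
}
record(mbbi, "$(P)$(R)ShutterStatusEPICS_RBV") {
    field(INP,  "13XX:shutterStatus CP")   # CP so it processes on change
    field(ZRVL, "0")                       # input value when shutter is closed
    field(ZRST, "Closed")
    field(ONVL, "1")                       # input value when shutter is open
    field(ONST, "Open")
}
```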
<div style="text-align: center">
<p>
<b>ADEpicsShutter.adl</b></p>
<img alt="ADEpicsShutter.png" src="ADEpicsShutter.png" /></div>
</body>
</html>