minor updates and bug fixes in angle scan processing and data explorer
doc/src/anglescan-processing.dox (new file, 368 lines)

@@ -0,0 +1,368 @@
/*! @page pag_anglescan_processing Angle-scan processing

@tableofcontents

\section sec_intro Introduction

This page describes the data processing steps of angle-scans using the PEARL Procedures.
The description relies on using the command line regardless of available GUIs.

\section sec_import Data reduction

The goal of this step is to import raw data and at the same time eliminate the energy dimension.
We want a two-dimensional wave
where the first dimension is the angle axis of the detector
and the second dimension is the sequence of measurements,
scanning one or multiple manipulator angles.
The second dimension requires additional one-dimensional waves
that describe the polar, tilt and azimuthal angle settings of the manipulator
for each dimension index.

The processing steps depend on the complexity of the measured spectrum.
The user may have to adopt one of the predefined procedures or a custom procedure accordingly.
Here, we describe two procedures that may cover many generic cases
or that can serve as a starting point for a refined, customized procedure.
However, any procedure that produces the datasets mentioned above is, of course, a valid approach.
For instance, you could load the complete three-dimensional ScientaImage dataset
and generate the two-dimensional dataset using your own procedures.

\subsection sec_import_basics Basic steps

The central import functions are @ref psh5_load_reduced and @ref psh5_load_dataset_reduced.
The first form is sufficient if the file contains just one scan and region.
Further regions/scans need to be loaded using the second form.
The first form is also exposed in the PEARL data explorer window.

The functions require a data reduction function and processing parameters as arguments.
Some particular reduction functions are described further below.
More can be found in the source code (or obtained from other users).
A list of functions that look like reduction functions can be obtained from @ref adh5_list_reduction_funcs.
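
For instance, to print the candidate function names to the history window (a minimal sketch, assuming @ref adh5_list_reduction_funcs takes no arguments and returns a text list):

@code{.ipf}
// list the functions that match the reduction function prototype
print adh5_list_reduction_funcs()
@endcode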

The basic call sequence looks as follows.
Substitute the arguments in angle brackets as necessary.
You may have to analyse a reference spectrum or the complete ScientaImage
to figure out the processing parameters beforehand.

First form:

@code{.ipf}
setdatafolder root:
string sparam
sparam = "<param1=1.5;param2=test;>"
psh5_load_reduced("<igor-datafolder>", "<igor-filepath>", "<filename>", <reduction_function>, sparam)
@endcode

Second form:

@code{.ipf}
// open the file
setdatafolder root: // or other parent folder
variable fid
string sparam
fid = psh5_open_file("<igor-datafolder>", "<igor-filepath>", "<filename>")

// load metadata for scaling
psh5_load_scan_meta(fid, "<scan 1>")
newdatafolder /s /o attr
psh5_load_scan_attrs(fid, "<scan 1>")
setdatafolder ::

// load and reduce dataset
sparam = "<param1=1.5;param2=test;>"
psh5_load_dataset_reduced(fid, "<scan 1/region1>", "<ScientaImage>", <reduction_function>, sparam)

// close the file
psh5_close_file(fid)
fid = 0
@endcode

\subsection sec_import_intlinbg Peak integration over linear background

The @ref int_linbg_reduction function converts a two-dimensional Scienta image I(angle, energy)
into a one-dimensional angle distribution I(angle).
For each angle slice, it calculates a linear background.
Then, it integrates the difference between the original data and the background over a specified interval.

The function requires the following fixed parameters:

Parameter | Description | Typical value
----------|-------------|--------------
Lcrop | size of the low-energy cropping region | 0.11 (fixed mode)
Lsize | size of the low-energy background region | 0.2
Hcrop | size of the high-energy cropping region | 0.11
Hsize | size of the high-energy background region | 0.2
Cpos | position of the peak center | 0.5
Csize | size of the center region | 0.3

All parameters are relative to the size of the image (length of the energy interval)
and must be in the range from 0 to 1.

The cropping regions are cut away from the image for the rest of the processing.
This is necessary to remove the dark corners in fixed mode
but can be neglected in swept mode (cropping size = 0).

The low and high background regions are adjacent to the cropping regions on either side.
The function calculates two fixed points of the linear background in the center of each background region.
The intensity value of each fixed point is the average intensity in the background region.

The peak region is integrated over the interval given by the Csize parameter centered at Cpos.

The background-subtracted peak integral is returned in ReducedData1.
ReducedData2 receives the error estimate of the peak integral (assuming Poisson statistics).
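
Putting this together, a call of the first form with this reduction function might look as follows (a sketch: the parameter values are the typical values from the table, and the file arguments are placeholders):

@code{.ipf}
setdatafolder root:
string sparam
// relative positions of the cropping, background and peak regions
sparam = "Lcrop=0.11;Lsize=0.2;Hcrop=0.11;Hsize=0.2;Cpos=0.5;Csize=0.3;"
psh5_load_reduced("<igor-datafolder>", "<igor-filepath>", "<filename>", int_linbg_reduction, sparam)
@endcode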

\subsection sec_import_peakfit Peak fitting

The @ref gauss4_reduction function converts a two-dimensional Scienta image I(angle, energy)
into a one-dimensional angle distribution I(angle).
For each angle slice, it performs a Gaussian curve fit with up to four components on a linear background.

To improve the stability of the fit, the peak positions and widths are kept fixed
while the amplitudes of the peaks and the background parameters are variable
but constrained to reasonable values (positive amplitude).
Furthermore, the function can optionally do a box averaging over three slices.

The function requires the following fixed parameters:

Parameter | Description
----------|------------
rngl | lower limit of the fit interval
rngh | upper limit of the fit interval
npeaks | number of components
pos1 | center energy of peak 1
wid1 | width of peak 1
pos2 | center energy of peak 2
wid2 | width of peak 2
pos3 | center energy of peak 3
wid3 | width of peak 3
pos4 | center energy of peak 4
wid4 | width of peak 4
ybox | box size of slice averaging (1 or 3)

The peak parameters should be determined beforehand from fitting a reference spectrum
or the angle-scan integrated over all angles.
Peak positions and widths have to be specified only up to the given number of peaks.

The data reduction procedure returns the peak integrals
(amplitude times width times square root of 2) in waves
named ReducedDataN where N is a numeric index from 1 to npeaks.
The waves starting with an index of npeaks+1
contain the corresponding error estimates of the peak integrals.
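
As an example, a two-component fit could be set up as follows (a sketch with hypothetical peak energies and widths; determine the actual values from a reference spectrum first):

@code{.ipf}
string sparam
// two Gaussian components on a linear background, no slice averaging
sparam = "rngl=121.5;rngh=125.5;npeaks=2;pos1=123.1;wid1=0.35;pos2=123.9;wid2=0.35;ybox=1;"
psh5_load_reduced("<igor-datafolder>", "<igor-filepath>", "<filename>", gauss4_reduction, sparam)
// results: ReducedData1 and ReducedData2 hold the peak integrals,
// ReducedData3 and ReducedData4 the corresponding error estimates
@endcode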

\subsection sec_import_custom Custom reduction functions

See the documentation and source code of @ref int_linbg_reduction, @ref gauss4_reduction and
@ref adh5_default_reduction for help on writing custom reduction functions.
To integrate your function with the PEARL data explorer,
you have to provide an additional function that prompts for the reduction parameters,
such as @ref prompt_int_linbg_reduction.
Since reduction functions cannot be called from the command line,
it is recommended to also write an adapter function for testing.

\section sec_norm Normalization

The goal of the data normalization is to get a (still two-dimensional) dataset
that ideally contains intensity variations due to diffraction features and statistical fluctuations only.
In particular, instrumental variations should be removed.
In some cases, it may be necessary to preserve the overall polar dependence of the intensity.
Note that this latter case is not properly treated with the methods described here.

Depending on the quality of the measured data,
only some of the following processing steps are necessary.
Use your own judgement.

\subsection sec_norm_prep Preparations

Start by creating a new copy of the data and inspecting it:
@code{.ipf}
duplicate ReducedData1, NormData1
ad_display_profiles(NormData1)
@endcode

To update the display after changes to NormData1:
@code{.ipf}
ad_update_profiles(NormData1)
@endcode

\subsection sec_norm_crop Detector angle range

Crop the detector angle axis to a useful range (usually about -25 to +25 degrees):
@code{.ipf}
crop_strip(NormData1, -25, 25)
@endcode

\subsection sec_norm_angle Normalize detector angle

Remove the inhomogeneity of the detector along the detector angle axis.
This component may also include a contribution from the sample.
If your raw data shows a flat distribution, this step is not necessary.

@code{.ipf}
normalize_strip_x(NormData1, smooth_method=4, smooth_factor=0.15, check=2)
@endcode

Note that the argument <code>check=2</code> causes the function to generate
two check waves but not to modify the original data.
To inspect the check waves:
@code{.ipf}
display check_dist, check_smoo
ModifyGraph rgb(check_dist)=(0,0,0)
@endcode

Vary the <code>smooth_factor</code> (between 0.1 and 1.0)
until the smoothed curve follows the instrumental curve
but does not pick up the diffraction features.
Then set <code>check=1</code> to apply the normalization to <code>NormData1</code>.
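
For instance, once a suitable smoothing factor is found, the same call with <code>check=1</code> applies the correction (the factor below is just the example value from above):

@code{.ipf}
// apply the normalization in place once the check waves look right
normalize_strip_x(NormData1, smooth_method=4, smooth_factor=0.15, check=1)
@endcode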

\subsection sec_norm_wobble Azimuthal variation (wobble)

Reduce the effect of azimuthal wobble (misaligned surface) on the intensity.
A misaligned surface may cause a sinusoidal variation of the intensity as a function of azimuthal angle with a 360° period.
A strong azimuthal variation may affect the polar normalization in the next step.
The azimuthal normalization can be based on a restricted range of polar angles (theta range).
You have to find out which value works best for your sample.

@code{.ipf}
normalize_strip_phi(NormData1, :attr:ManipulatorTheta, :attr:ManipulatorPhi, theta_offset=-8.8, theta_range=10, check=2)
@endcode

Note, however, that this function does not correct for angle shifts induced by the misalignment!

\subsection sec_norm_theta Polar dependence

Remove the polar angle dependence (matrix element and excitation/detection geometry).

@code{.ipf}
normalize_strip_theta(NormData1, :attr:ManipulatorTheta, theta_offset=-8.8, smooth_method=4, smooth_factor=0.5, check=2)
@endcode

Use the check waves and the <code>check</code> argument as described above.

\section sec_plot Binning and plotting

\subsection sec_plot_basics Basic steps

You can bin and plot the data in one step:

@code{.ipf}
pizza_service(NormData1, "Nickname1", -8.8, 0.5, 6)
@endcode

or in two steps:

@code{.ipf}
pizza_service(NormData1, "Nickname2", -8.8, 0.5, 6, nograph=1)
display_hemi_scan("Nickname2")
@endcode

The benefit of the latter is that you have more control over the graph through optional arguments.
In particular, you can select the projection or hide the ticks and grids.
See @ref display_hemi_scan for details.

The @ref pizza_service function requires the waves with manipulator positions
in a specific place, namely <code>:attr:ManipulatorTheta</code> (for the polar angle),
and the normal emission values as function arguments.
If you have moved the waves, or if you have subtracted the offsets yourself,
use the alternative @ref pizza_service_2 function.

Additional parameters of the @ref pizza_service function allow for rotational averaging,
larger angle steps (default 1 degree),
or the creation of metadata including a notebook for xpdPlot.

Note that there is currently a bug in the nickname argument of some of the following functions.
If the lines shown below do not work,
try switching to the data folder that contains the generated polar plot data
and calling the function with an empty nickname <code>""</code>.
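
A sketch of this workaround, assuming the polar plot data were generated under a data folder named <code>Nickname1</code>:

@code{.ipf}
// run the function from inside the data folder with an empty nickname
setdatafolder Nickname1
trim_hemi_scan("", 80)
setdatafolder ::
@endcode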

\subsection sec_plot_refine Refinements

To remove high polar angles above θ = 80° from the plot (and data):

@code{.ipf}
trim_hemi_scan("Nickname1", 80)
@endcode

Modify the pseudocolor scale by changing the <code>polarY0</code> trace:

@code{.ipf}
ModifyGraph zColor(polarY0)={mod_values, *, *, BlueGreenOrange, 0}
ModifyGraph zColor(polarY0)={mod_values, -0.2, 0.2, BlueGreenOrange, 0}
@endcode

To set the contrast so that it clips specified percentiles of the data points,
use the @ref set_contrast function:

@code{.ipf}
set_contrast(2, 2, graphname="graph_Nickname1", colortable="BlueGreenOrange")
@endcode

\subsection sec_plot_interp Interpolation

Polar plots can also be interpolated to a rectangular matrix,
which may in some cases produce nicer images:

@code{.ipf}
interpolate_hemi_scan("Nickname1")
display_hemi_scan("Nickname1", graphtype=3, graphname="intp")
matrix = sqrt(x^2 + y^2) <= calc_graph_radius(80) ? matrix : nan
ModifyImage matrix ctab= {*,*,BlueGreenOrange,0}
@endcode

The <code>matrix =</code> line optionally removes artefacts at high polar angles.
Replace the cut-off angle with your own.

\subsection sec_modulation Modulation function

To calculate the modulation function and substitute it in the graph:

@code{.ipf}
setdatafolder Nickname1
calc_modulation(values, factor1=pol, factor2=az)
ModifyGraph zColor(polarY0)={mod_values,-0.2,0.2,BlueGreenOrange,0}
@endcode

\section sec_export Data export

\subsection sec_export_plot Export picture

The following line is an example of how to export a graph window.
Click on the desired graph window, then issue the following command,
substituting the file path and file name as appropriate.

@code{.ipf}
SavePICT/P=home/E=-5/B=144/O as "some_filename.png"
@endcode

\subsection sec_export_data Export processed data

The following line saves the dataset to an Igor text file.
The file contains all data necessary to recreate a polar plot without further processing.

@code{.ipf}
save_hemi_scan("Nickname1", "home", "some_filename")
@endcode

For structural optimization using the PMSCO software,
it is necessary to generate an ETPI file.
There is currently no special function for this.
Instead, you have to create and set an energy wave,

@code{.ipf}
duplicate pol, en
en = 123.4 // kinetic energy of the photoelectron
@endcode

and write the four waves <code>en, pol, az, values</code> to a general text file,
as in the sketch below.
Be careful about the ordering of the waves!
You will also have to rename the file to the <code>.etpi</code> extension
because Igor always saves with a <code>.txt</code> extension.
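
A minimal sketch of the four-column export (the file name is a placeholder; rename the result from <code>.txt</code> to <code>.etpi</code> afterwards):

@code{.ipf}
// four columns: energy, theta (polar), phi (azimuth), intensity
Save /G /M="\n" /O /P=home en, pol, az, values as "Nickname1.etpi.txt"
@endcode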

If you have a wave with statistical errors, add a fifth column and use the <code>.etpis</code> extension:

@code{.ipf}
Save /G /M="\n" /O /P=home en, pol, az, values, sig as "Nickname1.etpis.txt"
@endcode

*/

@@ -1,4 +1,7 @@
/*! @mainpage Introduction

@tableofcontents

\section sec_intro Introduction

PEARL Procedures is a suite of Igor Pro procedures developed for data acquisition and data processing at the PEARL beamline at the Swiss Light Source.