Mirror of https://github.com/slsdetectorgroup/aare.git (synced 2026-05-02 11:14:13 +02:00, commit 2736d975c526dfcf94f1bba399cddb76e939f1ca)
- Allowing users more flexibility to experiment with custom eta functions without touching the C++ code
- Passing a vector of eta values to ``transform_eta_values``
```
from aare import Interpolator, ClusterVector, Etai, Cluster
import numpy as np

def custom_eta(cluster_pixel_coordinate_x, cluster_pixel_coordinate_y, cluster_data):
    # dummy custom eta function that just returns the sum of the cluster data
    eta = Etai()
    eta.x = 0.1  # dummy x value
    eta.y = 0.1  # dummy y value
    eta.sum = np.sum(cluster_data)  # sum of the cluster data as the "energy"
    return eta

# Create a dummy eta distribution and bins
eta_distribution = np.zeros((10, 10, 1))  # dummy eta distribution
etax_bins = np.linspace(0, 1.0, 11)
etay_bins = np.linspace(0, 1.0, 11)
e_bins = np.array([0.0, 10.0])  # dummy energy bins

# Create the interpolator
interpolator = Interpolator(eta_distribution, etax_bins, etay_bins, e_bins)

# Create a dummy cluster vector
cluster_vector = ClusterVector()
cluster_vector.push_back(Cluster(10, 5, np.ones(shape=9, dtype=np.int32)))
cluster_vector.push_back(Cluster(20, 10, np.ones(shape=9, dtype=np.int32)))

# Compute custom etas for the clusters
cluster_array = np.array(cluster_vector)
etas = np.array([custom_eta(cluster["x"], cluster["y"], cluster["data"]) for cluster in cluster_array])

# Transform eta values to uniform coordinates
uniform_coordinates = interpolator.transform_eta_values(etas)

# Interpolate to get the photon coordinates, i.e. apply the interpolation logic
photon_coordinates_x = cluster_array["x"] + uniform_coordinates["x"]  # add to pixel coordinate
photon_coordinates_y = cluster_array["y"] + uniform_coordinates["y"]  # add to pixel coordinate
```
Advantage: full control over the interpolation logic.
Downside: inefficient, since it involves quite a few loops in Python.
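The per-cluster Python loop above can often be replaced by vectorized numpy operations on the structured cluster array. A minimal sketch, using a hand-built structured array with the same `"x"`, `"y"`, and `"data"` fields as `np.array(cluster_vector)` in the example (the dtype layout is an assumption for illustration, not aare's actual memory layout):

```python
import numpy as np

# Hand-built stand-in for np.array(cluster_vector): two 3x3 clusters
# with fields "x", "y" (pixel coordinates) and "data" (9 pixel values).
clusters = np.zeros(2, dtype=[("x", np.int32), ("y", np.int32), ("data", np.int32, 9)])
clusters["x"] = [10, 20]
clusters["y"] = [5, 10]
clusters["data"] = 1  # all pixel values set to 1

# Vectorized "eta energy": one numpy reduction over all clusters at once,
# instead of calling a Python function per cluster.
eta_sum = clusters["data"].sum(axis=1)
print(eta_sum)  # -> [9 9]
```

The same pattern applies to the eta x/y components whenever they can be expressed as array expressions over the cluster data.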
- Passing pre-computed eta values to the ``interpolate`` function
```
interpolator.interpolate(cluster_vector, etas)
```
Downside: less flexibility in the interpolation logic.
Downside: people might misuse it instead of calling ``interpolate`` directly with a precompiled eta function implemented in C++.
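For background on what such an interpolation step does conceptually: a common eta-correction approach maps each measured eta value through the cumulative distribution of the eta histogram, turning a non-uniform eta sample into approximately uniform sub-pixel coordinates. A toy numpy sketch of that idea (not aare's actual implementation):

```python
import numpy as np

# Toy, non-uniform eta sample in [0, 1] (beta(2, 2) peaks at 0.5).
rng = np.random.default_rng(0)
eta = rng.beta(2, 2, size=10_000)

# Histogram the eta values and build the empirical CDF on the bin edges.
hist, edges = np.histogram(eta, bins=np.linspace(0.0, 1.0, 11))
cdf = np.concatenate(([0.0], np.cumsum(hist) / hist.sum()))

# Mapping each eta through the CDF yields approximately uniform
# sub-pixel coordinates in [0, 1] (probability integral transform).
uniform = np.interp(eta, edges, cdf)
```

The pre-computed-etas option hands exactly this kind of transform over to the library, which is why it trades flexibility for speed.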
# aare
Data analysis library for PSI hybrid detectors
## Documentation
Detailed documentation, including installation instructions, can be found in the Documentation.
## License
This project is licensed under the MPL-2.0 license. See the LICENSE file or https://www.mozilla.org/en-US/MPL/ for details.
## Build and install
### Prerequisites
- cmake >= 3.14
- C++17 compiler (gcc >= 8)
- python >= 3.10
### Development install (for Python)

```
git clone git@github.com:slsdetectorgroup/aare.git --branch=v1 #or using http...
mkdir build
cd build

#configure using cmake
cmake ../aare -DAARE_PYTHON_BINDINGS=ON

#build (replace 4 with the number of threads you want to use)
make -j4
```
Now you can use the Python module from your build directory:

```
import aare
f = aare.File('Some/File/I/Want_to_open_master_0.json')
```
To run from other folders, either add the path to your conda environment using conda-build or add the module to your PYTHONPATH:

```
export PYTHONPATH=path_to_aare/aare/build:$PYTHONPATH
```
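Equivalently, you can extend the module search path from inside Python instead of exporting `PYTHONPATH`. The build directory below mirrors the placeholder above and must be replaced with your own checkout location:

```python
import sys

# Placeholder path; point this at your actual aare build directory.
build_dir = "path_to_aare/aare/build"
if build_dir not in sys.path:
    sys.path.insert(0, build_dir)
# After this, `import aare` can pick up the module built above.
```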
### Install using conda/mamba

```
#enable your env first!
conda install aare -c slsdetectorgroup # installs latest version
```
### Install to a custom location and use in your project

A working example is available at https://github.com/slsdetectorgroup/aare-examples.

```
#build and install aare
git clone git@github.com:slsdetectorgroup/aare.git --branch=v1 #or using http...
mkdir build
cd build

#configure using cmake
cmake ../aare -DCMAKE_INSTALL_PREFIX=/where/to/put/aare

#build (replace 4 with the number of threads you want to use)
make -j4

#install
make install

#Now configure your project
cmake .. -DCMAKE_PREFIX_PATH=SOME_PATH
```
### Local build of conda pkgs

```
conda build . --variants="{python: [3.11, 3.12, 3.13]}"
```
## Developer's guide

We look forward to your contributions via pull requests!

If you want to fix an existing bug or propose a new feature:

- Install the `pre-commit` Python package and set it up with `pre-commit install`
- Create a new branch with `git branch branch_name`
- Implement your changes and make a commit (`pre-commit` will check your code automatically)
- Push your commit and open a pull request if needed