mazzol_a 2736d975c5
Dev/enable custom etas (#305)
- Allowing users more flexibility to experiment with custom eta functions without touching the C++ code

- passing a vector of eta values to ``transform_eta_values``

```
from aare import Interpolator, ClusterVector, Etai, Cluster
import numpy as np 

def custom_eta(cluster_pixel_coordinate_x, cluster_pixel_coordinate_y, cluster_data):
    # dummy custom eta function: fixed coordinates plus the sum of the cluster data
    eta = Etai()
    eta.x = 0.1  # dummy x value
    eta.y = 0.1  # dummy y value
    eta.sum = np.sum(cluster_data)  # sum of the cluster data as the "energy"
    return eta

# Create a dummy eta distribution and bins
eta_distribution = np.zeros((10, 10, 1)) # dummy eta distribution
etax_bins = np.linspace(0, 1.0, 11)
etay_bins = np.linspace(0, 1.0, 11)
e_bins = np.array([0., 10.]) # dummy energy bins

# Create the interpolator
interpolator = Interpolator(eta_distribution, etax_bins, etay_bins, e_bins)

# Create a dummy cluster vector
cluster_vector = ClusterVector()
cluster_vector.push_back(Cluster(10, 5, np.ones(shape=9, dtype=np.int32)))
cluster_vector.push_back(Cluster(20, 10, np.ones(shape=9, dtype=np.int32)))

# Create dummy etas for the clusters
cluster_array = np.array(cluster_vector)
etas = np.array([custom_eta(cluster["x"], cluster["y"], cluster["data"]) for cluster in cluster_array])

# transform eta values to uniform coordinates 
uniform_coordinates = interpolator.transform_eta_values(etas)

# Interpolate to get the photon coordinates e.g. apply interpolation logic 
photon_coordinates_x = cluster_array["x"] + uniform_coordinates["x"] # add to pixel coordinate 
photon_coordinates_y = cluster_array["y"] + uniform_coordinates["y"] # add to pixel coordinate 

```
advantage: full control over the interpolation logic
downside: inefficient, since the per-cluster loops run in Python
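If the per-cluster Python loop becomes a bottleneck, the etas can often be computed with vectorized NumPy operations instead. The sketch below is only an illustration of that idea using a hypothetical structured dtype; the actual dtype returned by ``np.array(cluster_vector)`` may differ.

```python
import numpy as np

# Hypothetical structured dtype mirroring a cluster array: pixel coordinates
# plus a flattened 3x3 data block per cluster (an assumption for this sketch).
cluster_dtype = np.dtype([("x", np.int16), ("y", np.int16), ("data", np.int32, (9,))])
clusters = np.zeros(2, dtype=cluster_dtype)
clusters["x"] = [10, 20]
clusters["y"] = [5, 10]
clusters["data"] = np.ones((2, 9), dtype=np.int32)

# Vectorized eta computation over all clusters at once, no Python-level loop:
eta_dtype = np.dtype([("x", np.float64), ("y", np.float64), ("sum", np.float64)])
etas = np.zeros(len(clusters), dtype=eta_dtype)
etas["x"] = 0.1  # dummy x value, as in the loop-based example above
etas["y"] = 0.1  # dummy y value
etas["sum"] = clusters["data"].sum(axis=1)  # per-cluster "energy"
```

The resulting structured array holds one eta record per cluster, computed in a single pass over the data.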
- passing pre-computed eta values to the ``interpolate`` function
```
Interpolator.interpolate(cluster_vector, etas) 
```
downside: less flexibility in the interpolation logic
downside: people might misuse it instead of calling ``interpolate`` directly
with a precompiled eta function implemented in C++
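For intuition, the "uniform coordinates" used above come from the standard eta-correction idea: mapping each eta value through the cumulative eta distribution yields an approximately uniform coordinate in [0, 1]. The NumPy sketch below illustrates that idea only; it is not aare's actual implementation of ``transform_eta_values``.

```python
import numpy as np

# Toy, non-uniform eta distribution (beta(2, 2) samples stand in for real etas).
rng = np.random.default_rng(0)
eta_samples = rng.beta(2, 2, size=10000)
bins = np.linspace(0.0, 1.0, 11)
hist, _ = np.histogram(eta_samples, bins=bins)

# Normalized cumulative distribution evaluated at the bin edges.
cdf = np.concatenate(([0.0], np.cumsum(hist) / hist.sum()))

# Mapping an eta value through the CDF gives a coordinate in [0, 1] that is
# approximately uniformly distributed.
etas = np.array([0.25, 0.5, 0.75])
uniform = np.interp(etas, bins, cdf)
```

Because the mapping is monotone, the ordering of the eta values is preserved while their distribution is flattened.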

aare

Data analysis library for PSI hybrid detectors

Documentation

Detailed documentation, including installation instructions, can be found in the Documentation

License

This project is licensed under the MPL-2.0 license. See the LICENSE file or https://www.mozilla.org/en-US/MPL/ for details.

Build and install

Prerequisites

  • cmake >= 3.14
  • C++17 compiler (gcc >= 8)
  • python >= 3.10

Development install (for Python)

git clone git@github.com:slsdetectorgroup/aare.git --branch=v1 #or using http...
mkdir build
cd build

#configure using cmake
cmake ../aare -DAARE_PYTHON_BINDINGS=ON 

#build (replace 4 with the number of threads you want to use)
make -j4 

Now you can use the Python module from your build directory

import aare
f = aare.File('Some/File/I/Want_to_open_master_0.json')

To run from other folders, either add the path to your conda environment using conda-build, or add the module to your PYTHONPATH

export PYTHONPATH=path_to_aare/aare/build:$PYTHONPATH
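As an alternative to exporting PYTHONPATH, the build directory can be prepended to ``sys.path`` at runtime; the path below is a placeholder for your actual build directory.

```python
import sys

# Equivalent to exporting PYTHONPATH: prepend the build directory at runtime.
build_dir = "path_to_aare/aare/build"  # placeholder: replace with your build path
if build_dir not in sys.path:
    sys.path.insert(0, build_dir)

# import aare  # would now resolve against the build directory
```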

Install using conda/mamba

#enable your env first!
conda install aare -c slsdetectorgroup # installs latest version

Install to a custom location and use in your project

Working example in: https://github.com/slsdetectorgroup/aare-examples

#build and install aare 
git clone git@github.com:slsdetectorgroup/aare.git --branch=v1 #or using http...
mkdir build
cd build

#configure using cmake
cmake ../aare -DCMAKE_INSTALL_PREFIX=/where/to/put/aare

#build (replace 4 with the number of threads you want to use)
make -j4 

#install
make install


#Now configure your project
cmake .. -DCMAKE_PREFIX_PATH=/where/to/put/aare

Local build of conda pkgs

conda build . --variants="{python: [3.11, 3.12, 3.13]}"

Developer's guide

We are looking forward to your contributions via pull requests!

If you want to fix an existing bug or propose a new feature:

  1. Install the pre-commit python package and set it up with pre-commit install
  2. Create a new branch with git branch branch_name
  3. Implement your changes and make a commit (pre-commit will check your code automatically)
  4. Push your branch and open a pull request