Merge branch '33-update-python-environments-provided-by-anaconda-2019-07' into 'master'

Resolve "Update python environments provided by anaconda/2019.07"

Closes #33

See merge request Pmodules/buildblocks!37
2019-09-04 16:36:54 +02:00
7 changed files with 51 additions and 11 deletions


@@ -0,0 +1,3 @@
Note that dask 2.1.0 with python 3.6 led to a non-functioning configuration (serialization errors in dask); therefore we exclude it here.
It would be better if users switched to the datascience_py37 environment.


@@ -0,0 +1,22 @@
# Clean environment based on pure conda-forge packages
name: datascience_py36
channels:
- conda-forge
dependencies:
- python=3.6
- pandas
- numpy
- scipy
- scikit-learn
- matplotlib
- seaborn
- tensorflow=1.13.1
- pytables
- ipython
- keras=2.1.6
- deap
- nb_conda_kernels
- ipywidgets
# Note that dask 2.1.0 with python 3.6 led to a non-functioning configuration
# (serialization errors in dask); therefore we exclude it here.
# - dask
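
How a definition like the one above is turned into a central environment is sketched below; this is only a sketch, and the path to the YAML file is an assumption (the README in this repository keeps definitions under `conda-env-defs/${myenv}`).

```
# Sketch only: load the central anaconda module so the environment is created
# in its conda/envs/ area, then build it from the YAML definition above.
# The file path is an assumption; the module name is taken from the install
# paths used elsewhere in this repository.
module load anaconda/2019.07
conda env create -f conda-env-defs/datascience_py36/datascience_py36.yml
conda activate datascience_py36
```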


@@ -1,9 +1,9 @@
# Clean environment based on pure conda-forge packages
name: datascience_py36
name: datascience_py37
channels:
- conda-forge
dependencies:
- python=3.6
- python=3.7
- pandas
- numpy
- scipy
@@ -13,8 +13,10 @@ dependencies:
- tensorflow=1.13.1
- pytables
- ipython
- keras=2.1.6
- dask
# keras 2.1.6 not available from conda-forge for py37
# - keras=2.1.6
- deap
- nb_conda_kernels
- ipywidgets
- dask
- dask-jobqueue
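
Since this file changes an already deployed environment, the corresponding update step might look like the following sketch (file path assumed as above; `--prune` removes packages that are no longer listed, here the pinned keras build).

```
# Sketch only: update the existing datascience_py37 environment in place
# after editing its YAML definition.
module load anaconda/2019.07
conda env update -f conda-env-defs/datascience_py37/datascience_py37.yml --prune
```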


@@ -0,0 +1,9 @@
# Clean environment based on pure conda-forge packages
name: hpce-tools
channels:
- conda-forge
- http://conda-pkg.intranet.psi.ch
dependencies:
- python=3.6
- ldapuserdir
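
Because this environment pulls `ldapuserdir` from the PSI intranet channel, it can be useful to check which channel each installed package actually came from; a sketch, assuming the environment has been created under the name `hpce-tools` given in the YAML above:

```
# Sketch only: list the packages of the hpce-tools environment together with
# the channel each one was installed from, to confirm that ldapuserdir comes
# from http://conda-pkg.intranet.psi.ch and not from conda-forge.
conda list -n hpce-tools --show-channel-urls
```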


@@ -17,3 +17,4 @@ dependencies:
- deap
- nb_conda_kernels
- ipywidgets
- chaospy
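
After adding a single package such as `chaospy` to an existing definition, a quick smoke test of the rebuilt environment might look like this sketch; the environment name is a placeholder, since the file this hunk belongs to is not shown here.

```
# Sketch only: verify that the newly added package is importable.
# ${myenv} stands for the environment this YAML file defines.
conda activate ${myenv}
python -c "import chaospy"
```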


@@ -5,11 +5,12 @@
* The anaconda module provides just the **conda** package management tool together with its directory infrastructure, which contains the *conda environments* and a cache of downloaded packages
* Python and user software is provided in **conda environments**. These environments are located within the directory tree belonging to the anaconda module, e.g. `/afs/psi.ch/sys/psi.merlin/Programming/anaconda/2019.03/conda/envs/`
* The software in these environments can be accessed by users through
* loading the anaconda module and then using `conda activate somemodule_py36`
* a separate pmodule that transparently switches to that installed environment by just setting the correct PATH
* jupyter installations running from one of the environments which discover the other environments if they contain the correct packages (**nb_conda_kernels**)
1) loading the anaconda module and then using `conda activate somemodule_py36` (see the sketch after this list)
1) a separate pmodule that transparently switches to that installed environment by just setting the correct PATH to the python binary.
1) jupyter installations running from one of the environments, which discover the other environments if they contain the correct packages (**nb_conda_kernels**)
* The `conda` tool is updated frequently, and our experience shows that these updates should be installed. However, producing a new module for every update would be wasteful, because each new module would also come with a new area for environments. We therefore prefer to update conda in place and only create a new anaconda module if there are special incentives
* Environments are self-sufficient and do not depend on the conda tool at all. All dependent libraries are installed by conda. Conda makes consistent use of **rpath** definitions for executables and libraries, i.e. there is no reason to set `LD_LIBRARY_PATH` at all.
* Most environments are self-sufficient and do not depend on the conda tool at all after the installation: conda takes care of installing all dependent libraries, and the builds that conda provides make consistent use of **rpath** definitions for executables and libraries, i.e. there is no reason to set `LD_LIBRARY_PATH` at all.
* There is one important exception: if your environment needs additional setup steps (activation hooks), then it relies on the `conda activate` call, since these hooks are only run as part of that call.
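
A sketch of access method 1, using the module version and environment names from this repository as examples:

```
# Sketch only: load the central anaconda module, list the centrally
# provided environments and activate one of them.
module load anaconda/2019.07
conda env list
conda activate datascience_py37
```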
## Building a central conda environment
@@ -47,12 +48,14 @@ In most cases you will want to go ahead with `pip` installs. However, after runn
Proceed as above by defining a YAML file and using conda to first install all the conda-based packages.
Even though the YAML file also allows for the specification of pip packages, I advise doing this step separately. The pip steps can fail for various reasons, and it is better to do them interactively. Describe what you have to do in a README.md inside of the `conda-env-defs/${myenv}` folder.
**Note** that if pip triggers compilations, the package may pick up shared libraries from outside the environment. This can lead to problems if the build is done on pmod6.psi.ch, which runs SL6, while most of the production environments are now on RHEL7!
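
A sketch of the interactive pip step described above (the package name is a placeholder); record the commands you actually ran in `conda-env-defs/${myenv}/README.md` afterwards.

```
# Sketch only: activate the target environment first so that pip installs
# into it, then add the packages that are not available from conda channels.
module load anaconda/2019.07
conda activate ${myenv}
pip install some-extra-package   # placeholder package name
```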
### Installation of a conda environment and adding source-compiled packages
**DRAFT!!!**
**This is still a DRAFT!!!**
This works if the Python package has a correct `setup.py`-based build.
@@ -62,7 +65,7 @@ This works if the python package has a correct setup.py build
* document it in `conda-env-defs/${myenv}/README.md`
* download and store the sources in the install area under
`/opt/psi/Programming/anaconda/2019.07/xxxx/mypackage`
* Use pip to install them into the environment (requires the setup.py)
* Use pip to install them into the environment (requires that the package comes with a correct `setup.py`)
```
cd /opt/psi/Programming/anaconda/2019.07/xxxx/mypackage
pip install .
```