---
title: Running ESCalate on HPC clusters
name: escalate_singularity_1
category: escalate
layout: default
---

{% include layouts/title.md %}

- [Singularity images on CVMFS](#singularity-images-on-cvmfs)
- [Building a local singularity container](#building-a-local-singularity-container)
- [Running interactive shells](#running-interactive-shells-inside-the-singularity-container)
- [Running scripts](#running-scripts-inside-the-singularity-container)
- [Running programs](#running-programs-inside-the-singularity-container)
- [Scripts with multiple commands](#scripts-with-multiple-commands-inside-the-singularity-container)
- [Scheduling batch jobs](#scheduling-batch-jobs-with-the-singularity-container)
- [Advanced ways of running](#advanced-ways-of-running)


<br>

#### Singularity images on CVMFS

To run the ESCalate docker container on HPC clusters, we use singularity. This does not require super-user privileges.

Our [docker image](https://gitlab.com/eic/containers) is automatically converted to a singularity image on every update.

These singularity images are available at **BNL** and **JLab**, and anywhere one has access to the EIC repository
on the CVMFS filesystem (at `/cvmfs/eic.opensciencegrid.org`).

```
/cvmfs/singularity.opensciencegrid.org/electronioncollider/escalate:latest
```

**(!)** All instructions below work if you use the above path in place of `electronioncollider/escalate:latest`.
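For example, once you reach the interactive-shell instructions below, the same command run directly from the pre-built CVMFS image would look roughly like this (a sketch; the flags are explained in the sections that follow):
```bash
# Sketch: open a shell from the CVMFS image instead of a locally built sandbox
singularity shell --cleanenv /cvmfs/singularity.opensciencegrid.org/electronioncollider/escalate:latest
```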


<br>

#### Building a local singularity container

Since the docker container is quite large, you will want to build the singularity image in a directory where you have plenty of space (e.g. `/volatile` at Jefferson Lab). You do *not* want to save this image to your home directory!

```bash
mkdir -p /volatile/eic/$USER/singularity
cd /volatile/eic/$USER/singularity
module load singularity
export SINGULARITY_CACHEDIR="."
singularity build --sandbox electronioncollider/escalate:latest docker://electronioncollider/escalate:latest
```

Setting the singularity cache directory `SINGULARITY_CACHEDIR` prevents singularity from filling up a hidden directory in your home directory with 10 GB of docker container layers.
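If you want to see how much space that default cache would otherwise take, you can check from the host; this assumes the cache lives in the usual hidden `~/.singularity` directory:
```bash
# Assumption: the default singularity cache location is ~/.singularity
du -sh ~/.singularity
```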

While building the container, singularity may complain a bit about `EPERM` and `setxattr`, but you can ignore this (the drawback of doing this without super-user privileges).

After lunch, you should have a sandbox container (i.e. a directory which encapsulates an entire operating system) at `/volatile/eic/$USER/singularity/electronioncollider/escalate:latest/`.

Note: If you have already pulled the container into docker locally, you can save some downloading time by building the singularity image from the local container with
```bash
singularity build --sandbox electronioncollider/escalate:latest docker-daemon://electronioncollider/escalate:latest
```

Note: As a temporary fix for older kernels (< 3.15, check with `uname -r`), run the following before using the image:
```bash
strip --remove-section=.note.ABI-tag electronioncollider/escalate:latest/usr/lib/x86_64-linux-gnu/libQt5Core.so.5.12.4
```


<br>

#### Running interactive shells inside the singularity container

You can open a shell in the container with
```bash
singularity shell --cleanenv electronioncollider/escalate:latest
```
Inside the singularity container your home directory will be accessible (and possibly some other shared drives), allowing for easy interaction with files on the host system while using the software inside the container.

The `--cleanenv` or `-e` flag indicates that environment variables should not be kept (this ensures that, for example, `$ROOTSYS` is not pointing to the location in the host system). This does, however, also sever the X11 forwarding that allows graphics windows to appear. While at first it may be advisable to use `-e`, you may find that everything works without it.
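If a host directory you need is not among those mounted by default, you can bind-mount it explicitly with singularity's `-B`/`--bind` option; a minimal sketch using the `/volatile` work area from above as an example path:
```bash
# Sketch: bind-mount an extra host directory into the container
singularity shell --cleanenv -B /volatile/eic/$USER:/volatile/eic/$USER electronioncollider/escalate:latest
```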

In the shell you will not have the shell environment set up until you run
```bash
source /etc/profile
```
Now you should be able to run any commands you would run inside a terminal in the JupyterLab interface, e.g. `root -l` should open a ROOT session.

We can now, for example, run `dire` and generate some pythia8 events using the following command:
```bash
dire --nevents 50 --setting "WeakSingleBoson:ffbar2gmZ = on"
```
This will generate files in the current directory (300 KB, 30 seconds).

You can exit the shell again with `exit` or Ctrl-D.


<br>

#### Running scripts inside the singularity container

Instead of interactively logging in and running commands, we can also create a script and run that from the host system command line. If we have an executable script `script.sh` with content
```bash
#!/bin/bash

dire --nevents 50 --setting "WeakSingleBoson:ffbar2gmZ = on"
```
then we can run this directly as
```bash
singularity run electronioncollider/escalate:latest ./script.sh
```
We need the `./` because that is what we would have typed inside the container, since the current directory is (likely) not in the search path.

Note: You may encounter a warning from `tini`, the master process inside the container (similar to `init`, see what they did there?). You can remove the warning by defining an empty environment variable with `export TINI_SUBREAPER=""`.


<br>

#### Running programs inside the singularity container

Now, if we can run scripts directly with `singularity run`, then why not run `dire` directly, you ask? Indeed, we can also run
```bash
singularity run electronioncollider/escalate:latest dire --nevents 50 --setting "WeakSingleBoson:ffbar2gmZ = on"
```
Note: since `dire` is in the search path, we do not need to specify the full path. Arguments can be passed on as usual.

This can be simplified even further if we just define an environment variable `$ESC` with
```bash
export ESC="singularity run `pwd`/electronioncollider/escalate:latest"
```
where the `pwd` is to ensure that an absolute path is used.

We can now prepend this to any command we want to run inside the container, no matter which directory we are in:
```bash
$ESC dire --nevents 50 --setting "WeakSingleBoson:ffbar2gmZ = on"
```
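If you want `$ESC` available in every new session, one option (purely a shell-setup choice, not something the container requires) is to put the export, with the absolute path spelled out, into your `~/.bashrc`:
```bash
# Sketch for ~/.bashrc: the path assumes the /volatile location used in this guide
export ESC="singularity run /volatile/eic/$USER/singularity/electronioncollider/escalate:latest"
```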


<br>

#### Scripts with multiple commands inside the singularity container

To be able to run batch jobs, we want to submit job scripts. For example, we can create a script with the following content (anywhere you want):
```bash
#!/bin/bash

cd $HOME

export ESC="singularity run /volatile/eic/$USER/singularity/electronioncollider/escalate:latest"

$ESC dire --nevents 50 --setting "WeakSingleBoson:ffbar2gmZ = on"
```
This will use the container to run the `dire` command specified and write the output to your home directory.
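Before handing this to a batch system, it can be worth a quick interactive test from the host; `run_dire.sh` below is just a placeholder name for whatever you called the script:
```bash
chmod +x run_dire.sh   # "run_dire.sh" is a placeholder name for the script above
./run_dire.sh
```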


<br>

#### Scheduling batch jobs with the singularity container

We can extend the previous script to turn it into a SLURM submission script, `job.sh`:
```bash
#!/bin/bash

#SBATCH --time=0:05:00
#SBATCH --mem-per-cpu=512
#SBATCH --account=eic
#SBATCH --partition=production

cd $HOME

export ESC="singularity run /volatile/eic/$USER/singularity/electronioncollider/escalate:latest"

$ESC dire --nevents 50 --setting "WeakSingleBoson:ffbar2gmZ = on"
```
and submit this with `sbatch`.
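For example (standard SLURM commands; the exact queue output depends on the site):
```bash
sbatch job.sh       # submit the job script
squeue -u $USER     # check its status in the queue
```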


<br>

#### Advanced ways of running

One could create a python script `fast.py` for a fast simulation (using eic-smear) like this:
```python
from pyjano.jana import Jana

Jana().plugin('beagle_reader')\
    .plugin('open_charm')\
    .plugin('eic_smear', detector='jleic')\
    .plugin('jana', nevents=20000, output='hepmc_sm.root')\
    .source('beagle_eD.txt')\
    .run()
```
and run
```bash
wget https://gitlab.com/eic/escalate/workspace/-/raw/master/data/beagle_eD.txt
$ESC python3 fast.py
```
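Once the job has finished, one way to take a quick look at the resulting file (reusing the `$ESC` prefix defined earlier) might be:
```bash
# hepmc_sm.root is the output file name set in fast.py above
$ESC root -l hepmc_sm.root
```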

Or for a full simulation, create `full.py` with:
```python
from g4epy import Geant4Eic

g4e = Geant4Eic()\
    .source('beagle_eD.txt')\
    .output('hello')\
    .beam_on(200)\
    .run()
```
and run
```bash
wget https://gitlab.com/eic/escalate/workspace/-/raw/master/data/beagle_eD.txt
$ESC python3 full.py
```

<br>