# Benchmarking JANA2  <!-- {docsify-ignore-all} -->

JANA includes a built-in facility for benchmarking programs and plugins. It produces a scalability curve by repeatedly pausing execution, adding worker threads, resuming execution, and measuring the resulting throughput over fixed time intervals. There is an additional option to measure scalability curves for a matrix of different affinity and locality strategies, which is useful when your hardware architecture has nonuniform memory access.

In case you don't have JANA code ready to benchmark yet, JANA provides a plugin called `JTest` which can simulate different workloads. `JTest` runs a dummy algorithm on randomly generated data, using a user-specified event size and number of FLOPs (floating point operations) per _event_. This gives a rough estimate of your code's performance. If you don't know the number of FLOPs per event, you can still compare the performance of JANA on different hardware architectures just by using the default settings.

Here is how you do benchmarking with `JTest`:

```bash
# Obtain and build JANA, if you haven't already
git clone https://github.com/JeffersonLab/JANA2
cd JANA2
mkdir build
mkdir install
export JANA_HOME=`pwd`/install
cmake -S . -B build
cmake --build build -j 10 --target install
cd install/bin

# Run the benchmarking
./jana -b -Pplugins=JTest
# -b enables benchmarking
# -Pplugins=JTest pulls in the JTest plugin
# Additional configuration options are listed below

# Benchmarking may take a while. You can terminate at any time without
# losing data by pressing Ctrl-C _once or twice_. If you press it three
# times or more, it will hard-exit and won't write the results file.

cd JANA_Test_Results
# Raw data CSV files are in `samples.dat`
# Average and RMS rates are in `rates.dat`

# Show the scalability curve in a matplotlib window
./jana-plot-scaletest.py
```
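
As noted above, the default `JTest` workload is enough to compare how JANA scales on different machines. Below is a minimal sketch of such a comparison; it uses only the `benchmark:resultsdir` parameter documented in the table at the end of this page, and assumes a per-host naming scheme of your choosing.

```bash
# Sketch: run the same default JTest benchmark on each machine of interest,
# writing each run into a results directory named after the host
./jana -b -Pplugins=JTest -Pbenchmark:resultsdir=JTest_Results_$(hostname -s)

# Afterwards, compare the rates.dat files (or the generated plots)
# from the per-host directories side by side
```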

If you already have a JANA project you would like to benchmark, all you have to do is build and install it the way you usually would, and then run:

```bash
jana -b -Pplugins=$MY_PLUGIN
# Or
my_jana_app -b

cd JANA_Test_Results
# Raw data CSV files are in `samples.dat`
# Average and RMS rates are in `rates.dat`

# Show the scalability curve in a matplotlib window
./jana-plot-scaletest.py
```
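
The plotting script opens a matplotlib window, which assumes a display is available. If the benchmark ran on a headless batch or cluster node, one simple option is to copy the results directory back to a workstation and plot it there. The host and path below are placeholders, not part of JANA.

```bash
# Copy the results back from the benchmark node (hypothetical host and path)
scp -r user@benchmark-node:/path/to/JANA_Test_Results .

# Plot locally, where a display is available
cd JANA_Test_Results
./jana-plot-scaletest.py
```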

These are the relevant configuration parameters for benchmarking:

| Name                 | Type   | Default           | Description                                        |
|:---------------------|:-------|:------------------|:---------------------------------------------------|
| benchmark:nsamples   | int    | 15                | Number of measurements made for each thread count  |
| benchmark:minthreads | int    | 1                 | Minimum thread count                               |
| benchmark:maxthreads | int    | ncores            | Maximum thread count                               |
| benchmark:threadstep | int    | 1                 | Thread count increment                             |
| benchmark:resultsdir | string | JANA_Test_Results | Directory name for benchmark test results          |
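
For example, a run that scans 2 to 32 threads in steps of 2, taking 20 samples per thread count and writing to a custom results directory, might look like the following. The parameter names are taken from the table above, and the `-Pkey=value` syntax is the same as for `-Pplugins`.

```bash
./jana -b -Pplugins=JTest \
       -Pbenchmark:minthreads=2 \
       -Pbenchmark:maxthreads=32 \
       -Pbenchmark:threadstep=2 \
       -Pbenchmark:nsamples=20 \
       -Pbenchmark:resultsdir=JTest_Results_32threads
```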