Zarko Pavlovic, 03/21/2019 03:30 PM

Booster Neutrino Beam Systematics

The BNB systematics framework is based on work developed by MiniBooNE. It was initially ported by MicroBooNE into the LArSoft EventWeight module. Here the same MicroBooNE code was extracted, with all LArSoft/ART dependencies removed, and made into a stand-alone module.

Downloading the code

The code is maintained in a git repository associated with this Redmine project. You may clone this repository anonymously and without authentication, but in order to push commits back you must be authenticated.

Authenticated clone (i.e., allows you to modify the software and upload your modifications to this site via git push):

$ git clone ssh://

Anonymous clone (no push):

$ git clone


To set up the code, cd into the top directory. Note that you should run source from the top directory so that the LD_LIBRARY_PATH and FW_SEARCH_PATH environment variables are set correctly.
Source the setup script:

source Setup/


From the top directory, cd into build and run:

cmake ..
make install -j3


To run the code, cd into the top directory and run the rwgh binary:

bin/rwgh -h
  -h [ --help ]                    Print help message
  -i [ --input ] arg               Path pattern for input files. Put it in 
                                   quotes or escape *.
  -f [ --fhicl ] arg               Configuration fhicl file.
  -o [ --output ] arg (=hist.root) Output file name.

For example, to generate histograms using one beam ntuple:

bin/rwgh -f fcl/eventweight_microboone.fcl -i april07_baseline_0001.root -o output_0001.root
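When many beam ntuples need processing, the per-file command above can be generated in a loop. A minimal sketch of that bookkeeping (the file names and numbering scheme are illustrative, following the april07_baseline_NNNN.root pattern above; only the bin/rwgh options shown in the help are used):

```python
# Build the bin/rwgh command line for each beam ntuple.
# Input file names are hypothetical examples; only -f/-i/-o from
# the documented help output are assumed.

def rwgh_command(input_file, fhicl="fcl/eventweight_microboone.fcl"):
    """Return the argument list for one bin/rwgh invocation."""
    # Derive output_NNNN.root from the input file's numeric suffix.
    stem = input_file.rsplit("_", 1)[-1]          # e.g. "0001.root"
    output = "output_" + stem
    return ["bin/rwgh", "-f", fhicl, "-i", input_file, "-o", output]

commands = [rwgh_command(f"april07_baseline_{i:04d}.root")
            for i in range(1, 4)]
for cmd in commands:
    print(" ".join(cmd))
```

Each list can then be passed to subprocess.run, or the printed lines pasted into a shell.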

Running on grid

You can use the provided script.

Scripts/ -h
usage: [-h] [-d] -f FHICLFILE -g GROUP -i INPUTPATH [-j JOBID] -n
                    [1-10000] -o OUTPUTPATH

Submit beam sys jobs.

optional arguments:
  -h, --help            show this help message and exit
  -d, --debug           Will not delete submission files in the end. Useful
                        for debugging and will only print the submission
                        command on screen.
  -f FHICLFILE, --fhicl-file FHICLFILE
                        Configuration fhicl file.
  -g GROUP, --group GROUP
                        Group used to submit jobs.
  -i INPUTPATH, --input-path INPUTPATH
                        Path to input BooNEG4Beam files.
  -j JOBID, --jobidoffset JOBID
                        Id for running job. If submitting multiple jobs each
                        one is offset by its process number (0..n)
  -n [1-10000]          Number of jobs to submit.
  -o OUTPUTPATH, --output-path OUTPUTPATH
                        pnfs path where to copy final output.

Example of how to submit 10 jobs:

Scripts/ -i /pnfs/uboone/persistent/uboonebeam/bnb_mc/ -o /pnfs/uboone/scratch/users/zarko/test_bnb_sys -f fcl/eventweight_microboone.fcl -n 10 -g uboone

Once the jobs are done running, add all output histograms into one ROOT file:

hadd merged_hist.root /pnfs/uboone/scratch/users/zarko/bnb_sys/*/*.root 
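Conceptually, hadd sums the bin contents of identically named histograms across the per-job files. A small illustrative sketch of that merge, with plain Python lists standing in for ROOT TH1 objects (the histogram name "numu" and the numbers are made up):

```python
# Conceptual sketch of what hadd does to the per-job histograms:
# histograms with the same name are added bin by bin.
# Plain lists stand in for ROOT TH1 objects; all values are illustrative.

def merge_histograms(files):
    """Sum same-named histograms (lists of bin contents) across files."""
    merged = {}
    for hists in files:                    # one dict per job output file
        for name, bins in hists.items():
            if name not in merged:
                merged[name] = [0.0] * len(bins)
            merged[name] = [a + b for a, b in zip(merged[name], bins)]
    return merged

job1 = {"numu": [10, 20, 5]}
job2 = {"numu": [7, 3, 1]}
print(merge_histograms([job1, job2]))   # {'numu': [17.0, 23.0, 6.0]}
```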


A simple script that demonstrates building the error matrix is provided in the Scripts directory. For example, to plot the numu flux and errors, run in ROOT:

.x Scripts/analyze.c("merged_hist.root","numu")

Note that for other neutrino species you need to modify the script to include the right set of systematics.
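The error matrix such a script builds follows the standard multiverse prescription: the covariance of the N systematically varied spectra about the central-value spectrum. A minimal sketch of that calculation, with small illustrative numbers in place of the real merged histograms:

```python
# Multiverse error matrix: covariance of N systematically varied
# spectra about the central-value spectrum.
# All numbers are illustrative; real input comes from merged_hist.root.

def error_matrix(cv, universes):
    """E_ij = (1/N) * sum over universes of (u_i - cv_i)*(u_j - cv_j)."""
    n_bins = len(cv)
    n_univ = len(universes)
    E = [[0.0] * n_bins for _ in range(n_bins)]
    for u in universes:
        d = [u[i] - cv[i] for i in range(n_bins)]
        for i in range(n_bins):
            for j in range(n_bins):
                E[i][j] += d[i] * d[j] / n_univ
    return E

cv = [100.0, 50.0]                      # central-value flux per bin
universes = [[110.0, 55.0], [90.0, 45.0]]
E = error_matrix(cv, universes)
# Fractional error in bin i is sqrt(E[i][i]) / cv[i]
```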


Several systematics reweight using flux histograms (horn current, skin depth, nucleon cross sections).
For these systematics, the neutrino spectrum for the input histograms was calculated at a particular location, and this location should be matched to the dk2nu nuray passed to the WeightCalc.
Right now it is hard-coded to use the 1st location (the 0th is usually used for a random direction).
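A sketch of what such a histogram-based calculator does: look up the neutrino energy (at the matched location) in the varied and nominal flux histograms and return their ratio as the event weight. All names, bin edges, and flux values below are illustrative placeholders, not the package's actual API:

```python
# Histogram-ratio reweighting: the event weight is the ratio of the
# varied to nominal flux in the bin containing the neutrino energy.
# Bin edges and contents are illustrative placeholders.

import bisect

def flux_weight(energy, bin_edges, nominal, varied):
    """Return the varied/nominal flux ratio for the bin containing energy."""
    i = bisect.bisect_right(bin_edges, energy) - 1
    if i < 0 or i >= len(nominal) or nominal[i] == 0.0:
        return 1.0                      # outside histogram range: no reweight
    return varied[i] / nominal[i]

edges = [0.0, 0.5, 1.0, 1.5, 2.0]       # GeV
nominal = [100.0, 200.0, 150.0, 80.0]   # nominal flux per bin
varied = [105.0, 190.0, 150.0, 88.0]    # e.g. a horn-current shifted flux

print(flux_weight(0.7, edges, nominal, varied))   # 190/200 = 0.95
```

This is why the location used to fill the input histograms must match the dk2nu nuray location: the ratio is only meaningful when both fluxes are evaluated at the same point.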