Booster Neutrino Beam Systematics

The BNB systematics framework is based on the work developed by MiniBooNE, as described in the MiniBooNE flux prediction publication, and includes all of the subsequent updates. In particular, for charged pions, splines are used to propagate the HARP error matrix, and the K+ production errors were updated to incorporate the SciBooNE measurement.
The code was initially ported by MicroBooNE from the MiniBooNE framework into the LArSoft EventWeight module. Here the same MicroBooNE code has been extracted, with all LArSoft/art dependencies removed, and turned into a stand-alone module.

Downloading the code

The code is maintained in a git repository associated with this Redmine project. You may clone the repository anonymously, but in order to push commits back you must be authenticated.

Authenticated clone (allows you to modify the software and push your modifications back to this site via git push):

$ git clone ssh://

Anonymous clone (no push):

$ git clone


To set up the code, cd into the top directory and source the setup script. Note that you must source it from the top directory so that the LD_LIBRARY_PATH and FW_SEARCH_PATH environment variables are set correctly:

source Setup/


From the top directory, cd into build and run:

cmake ..
make install -j3


To run the code, cd into the top directory and run the rwgh executable:

bin/rwgh -h
  -h [ --help ]                    Print help message
  -i [ --input ] arg               Path pattern for input files. Put it in 
                                   quotes or escape *.
  -f [ --fhicl ] arg               Configuration fhicl file.
  -o [ --output ] arg (=hist.root) Output file name.

For example, to generate histograms using one beam ntuple:

bin/rwgh -f fcl/eventweight_microboone.fcl -i /pnfs/uboone/persistent/uboonebeam/bnb_mc/april07_baseline_0001.root -o output_0001.root


The code is configured using a fcl file. Each systematic is handled by a particular WeightCalc configured in a specific way. The list of systematics (and reweightings) is given by the weight_functions variable:

weight_functions:[bnbcorrection, piplus, piminus, kplus, kzero, kminus, horncurrent, pioninexsec, nucleontotxsec, nucleonqexsec, nucleoninexsec, pionqexsec, 
piontotxsec, expskin]

Each systematic is further configured by setting which type of WeightCalc is used and all the necessary input parameters. For example, the fcl parameters for the horn current variation are:

    type: FluxUnisim
    parameter_list: ["horncurrent"]
    weight_calculator: "MicroBooNE" # "MicroBooNE" OR "MiniBooNE"
    mode: multisim
    number_of_multisims: 1000
    use_MiniBooNE_random_numbers: false

This sets FluxUnisimWeightCalc as the function used to calculate the weight and uses the ROOT files listed in the configuration as the source of input histograms.
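In multisim mode each calculator produces number_of_multisims weights per event, and the flux covariance is estimated from the spread of the reweighted spectra across universes. The sketch below illustrates that standard multisim estimate with NumPy; the array names and toy numbers are hypothetical and not this package's API:

```python
import numpy as np

def multisim_covariance(cv, universes):
    """Estimate the flux covariance from multisim universes.

    cv        : central-value spectrum, shape (nbins,)
    universes : reweighted spectra, one row per universe, shape (nuniverses, nbins)
    """
    diffs = universes - cv              # deviation of each universe from the CV
    # Average the outer product of deviations over all universes
    return diffs.T @ diffs / len(universes)

# Toy example: 3 energy bins, 1000 universes with ~10% uncorrelated scatter
rng = np.random.default_rng(0)
cv = np.array([100.0, 80.0, 60.0])
universes = cv * (1 + 0.1 * rng.standard_normal((1000, 3)))
cov = multisim_covariance(cv, universes)
frac_err = np.sqrt(np.diag(cov)) / cv   # fractional error per bin, ~0.1 here
```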

Note that bnbcorrection is a reweighting that should be applied when running over raw BooNEG4Beam files. It is not necessary when running over redecayed ntuples or newer gsimple files (most gsimple files from 2018 onwards were made using the MiniBooNENtupletoGSimpleConversion code, which does not require this correction). GSimple files made using the standard gsimple scripts do need the correction. It mostly corrects for the muon polarization and affects the nue flux.
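Conceptually, each calculator returns a per-event weight and the corrected spectra are filled using those weights. A minimal sketch of weighted histogram filling, with hypothetical toy energies and weights standing in for the real ntuple contents:

```python
import numpy as np

# Toy neutrino energies (GeV) and per-event correction weights (hypothetical)
energies = np.array([0.3, 0.7, 0.7, 1.2, 1.9, 2.4])
weights = np.array([1.05, 0.98, 1.02, 0.95, 1.10, 1.00])

bins = np.linspace(0.0, 3.0, 4)  # 3 coarse energy bins: [0,1), [1,2), [2,3)
raw, _ = np.histogram(energies, bins=bins)
corrected, _ = np.histogram(energies, bins=bins, weights=weights)
# raw counts events per bin; corrected sums the event weights per bin
```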

Running on the grid

You can use the provided submission script:

Scripts/ -h
usage: [-h] [-d] -f FHICLFILE -g GROUP -i INPUTPATH [-j JOBID] -n
                    [1-10000] -o OUTPUTPATH

Submit beam sys jobs.

optional arguments:
  -h, --help            show this help message and exit
  -d, --debug           Will not delete submission files in the end. Useful
                        for debugging and will only print the submission
                        command on screen.
  -f FHICLFILE, --fhicl-file FHICLFILE
                        Configuration fhicl file.
  -g GROUP, --group GROUP
                        Group used to submit jobs.
  -i INPUTPATH, --input-path INPUTPATH
                        Path to input BooNEG4Beam files.
  -j JOBID, --jobidoffset JOBID
                        Id for running job. If submitting multiple jobs each
                        one is offset by its process number (0..n)
  -n [1-10000]          Number of jobs to submit.
  -o OUTPUTPATH, --output-path OUTPUTPATH
                        pnfs path where to copy final output.

An example of how to submit 10 jobs:

Scripts/ -i /pnfs/uboone/persistent/uboonebeam/bnb_mc/ -o /pnfs/uboone/scratch/users/zarko/test_bnb_sys -f fcl/eventweight_microboone.fcl -n 10 -g uboone

Once the jobs are done running, add all output histograms into one ROOT file:

hadd merged_hist.root /pnfs/uboone/scratch/users/zarko/test_bnb_sys/*/*.root 


A simple script that demonstrates building the error matrix is provided in the Scripts directory. For example, to plot the numu flux and errors in ROOT, run:

.x Scripts/analyze.c("merged_hist.root","numu")

Note that for other neutrino species you need to modify the script to include the right set of systematics.
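For orientation, the error-matrix construction the script performs can be sketched as follows: the covariance matrices of independent systematics add, and the total fractional error comes from the diagonal. This NumPy illustration uses made-up 3-bin matrices rather than the script's actual ROOT histograms:

```python
import numpy as np

# Hypothetical per-systematic covariance matrices for a 3-bin numu spectrum
cov_horncurrent = np.diag([4.0, 2.0, 1.0])
cov_piplus = np.array([[9.0, 3.0, 1.0],
                       [3.0, 4.0, 2.0],
                       [1.0, 2.0, 2.0]])
flux_cv = np.array([100.0, 80.0, 60.0])

# Independent systematics are uncorrelated, so their covariances sum
cov_total = cov_horncurrent + cov_piplus
# Total fractional flux error per bin
frac_err = np.sqrt(np.diag(cov_total)) / flux_cv
```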


Several systematics reweight using flux histograms (horn current, skin depth, nucleon cross sections).
For these systematics the neutrino spectrum for the input histograms was calculated at a particular location, and this location should match the dk2nu nuray passed to the WeightCalc.
Currently the code is hard-coded to use the 1st location (the 0th is usually used for a random direction).