Relative calibration (under construction)¶
The relative calibration procedure is mostly documented in NOvA docdb 13579 (https://nova-docdb.fnal.gov/cgi-bin/private/RetrieveFile?docid=13579&filename=Instruction_AttenuationCalibrationJobs.pdf&version=35). While the procedure itself hasn't changed, apart from the addition of fiber brightness (FB) bins in both ND-main and the FD (in the muon catcher, the relative calibration was not split by FB bins), the instructions may be slightly out of date.
There also exists a proto-framework for doing the relative calibration using SRT rather than mrb (which is what the docdb documents), but as far as I know it hasn't been maintained. A separate technote should therefore probably be written to document everything. The main difference between the SRT and mrb frameworks for this purpose is in the grid job submission scripts, as described below.
Step 1 - Generating Attenuation Profiles¶
The relative calibration procedure differs slightly between data and MC. For data, the calibration is done for each plane and cell, while for MC it is consolidated across diblocks but split by individual fiber brightness bins. The fcls used for generating the profiles are:
Data: `Calibration/attenuation/makeattenuationprofilewithoutconsolidatejob.fcl`
MC: `Calibration/attenuation/makeattenuationprofilewithfiberbrightnessjob.fcl`
The fcls produce an art file containing the data products describing the attenuation profiles that go into the attenuation fit, as well as a ROOT analysis file containing histograms of the same profiles for plotting and analysis.
To generate these files over the grid, one can use the cfg `Calibration/attenuation/grid/attenprofs.cfg`.
You'll need to modify it to suit your own purposes, e.g. the dataset to run over, the output directory, the test release directory, the number of files to run over in each job, etc.
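As a rough sketch, the relevant lines in such a cfg might look like the following. Apart from `--testrel` and `--copyOut`, which appear later in these instructions, the option names and values here are illustrative and should be checked against the options `submit_nova_art.py` actually accepts in your release:

```text
# Illustrative cfg fragment -- dataset name, job counts, and paths are placeholders
--defname my_attenprofs_dataset
--njobs 500
--files_per_job 20
--dest /pnfs/nova/scratch/users/<user>/attenprofs
--testrel /path/to/test/release
--copyOut Calibration/attenuation/grid/copyout_atten.sh
```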
To submit jobs, simply do
submit_nova_art.py -f /path/to/attenprofs.cfg
The cfg uses the copyOut option to copy back output files from the grid. The script used is `Calibration/attenuation/grid/copyout_atten.sh`. Since the number of files in the datasets is often O(10k) and art creates an output file for each of them, it is impractical to copy back all these files from the grid. The script therefore sums the profiles from the art files handled in each job using the SumAttenuationProfiles plugin. The fcls used for summing are:
Data: `Calibration/attenuation/sumattenuationprofilesjob.fcl`
MC: `Calibration/attenuation/sumattenuationprofilesjobwithfiberbrightnessjob.fcl`
The copyOut script uses the MC fcl by default; to run over data, one needs to modify it to use the data summing fcl, or else it does the wrong thing and will probably fail. This modification can be made in a test release, and the cfg can then be updated with the new path to the script in the --copyOut option. For example, if the new script is in `<test release dir>/Calibration/attenuation/grid/mycopyout_atten.sh`, the cfg should include
--testrel <test release dir> --copyOut Calibration/attenuation/grid/mycopyout_atten.sh
This ensures that the grid is able to see the modified script when the test release is copied to the grid nodes.
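The change itself is a one-line swap of the summing fcl. A minimal sketch of that edit is below; the script contents are a stand-in (the real `copyout_atten.sh` does more than this single line), and in practice you would copy the script inside your test release rather than a scratch directory:

```shell
# Work in a scratch directory for this demonstration.
mkdir -p /tmp/atten_demo && cd /tmp/atten_demo

# Stand-in for the relevant line of copyout_atten.sh
# (the real script contains much more than this).
cat > mycopyout_atten.sh <<'EOF'
nova -c sumattenuationprofilesjobwithfiberbrightnessjob.fcl input_files.txt
EOF

# Swap the MC summing fcl for the data one.
sed -i 's/sumattenuationprofilesjobwithfiberbrightnessjob\.fcl/sumattenuationprofilesjob.fcl/' mycopyout_atten.sh

# Confirm the data fcl is now referenced.
grep -c 'sumattenuationprofilesjob\.fcl' mycopyout_atten.sh
```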
Step 2 - Generating Threshold and Shadowing Corrections¶
This part is done only for MC, and the same corrections are applied to the data PE profiles as well. However, a separate job needs to be submitted for each FB bin. The script to do this is `Calibration/attenuation/grid/submit_thresh_jobs.sh`. It modifies `thresholdanajob.fcl`, creates a new fcl in your test release job directory for each FB bin, and then submits jobs running over it. The cfg it uses is `Calibration/attenuation/grid/threshold.cfg`. As in Step 1, this cfg needs to be modified appropriately: the output directory, the path to your test release, and the copyOut script, `Calibration/attenuation/grid/copyout_thresh.sh`.
Once the grid outputs are returned, the threshold files need to be hadded into a single file across all FB bins. This can be quite cumbersome and will take time. I recommend splitting the task up, e.g. first hadd all files within each FB bin, then hadd the resulting files across FB bins. I'd also recommend copying the hadded files to `/nova/ana`, both to avoid overloading `/pnfs` and so they aren't deleted from your scratch area.
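A sketch of the two-stage hadd, assuming one grid output directory per FB bin; the directory layout, file patterns, and number of bins here are hypothetical and should be matched to your actual grid output:

```shell
# First stage: combine grid outputs within each FB bin
# (bin count and paths are illustrative).
for fb in 0 1 2 3; do
  hadd -f threshold_fb${fb}.root /path/to/grid/output/fb${fb}/*.root
done

# Second stage: combine across FB bins into the final file.
hadd -f threshold_low_corr.root threshold_fb*.root
```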
Step 3 - Fitting the Threshold and Shadowing corrections¶
After Step 2, the threshold and shadowing corrections are fit to certain functional forms so that they can be applied for each plane and cell in data and MC. This is done on the output of Step 2 using the ROOT macro `Calibration/macros/fit_thresh_corrs_fb.C`, for example:
root -b -q '$SRT_PRIVATE_CONTEXT/Calibration/macros/fit_thresh_corrs_fb.C("threshold_low_corr.root", "fdmc_low_threshold_fits.root")'
where the output of Step 2 is `threshold_low_corr.root` and the results of the fits are stored in `fdmc_low_threshold_fits.root`.
Step 4 - Fitting the Attenuation Profiles¶
The final step is to do the actual calibration over the art files from Step 1, using the threshold and shadowing corrections from Step 3. The fcl used is `Calibration/attenuation/attenuationfitjob.fcl`.
This fcl has different parameters depending on whether the calibration is for ND-main, the ND muon catcher, or the FD, and for data or MC. The producers are configured in `Calibration/attenuation/AttenuationFit.fcl`, where one can see all the parameters used for the various 2019 calibrations. For your own calibration you will need to modify at least the output-related parameters, in particular PlotsDirectory: the fit produces an enormous set of plots there, so make sure it points to somewhere in `/nova/ana` as well. To run the fcl, as an example, one can do:
nova -c job/attenuationfitdatajob.fcl -s /path/to/profiles/*.root
where `/path/to/profiles/*.root` is the path to the art files generated from Step 1.
This step produces the resulting CSVs, which can be published in a new UPS product for use before any reconstruction steps in various analyses.