Currently Running CI Tests¶
The tests that currently run can be seen itemized at, for example, current CI tests (one randomly chosen set of build results).
As explained elsewhere on this wiki, these CI tests are all configurable from the
ci_tests.cfg file in the uboonecode and lbnecode repositories, in their
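To convey the flavor of such a configuration, presumably an INI-style file, here is a purely hypothetical fragment. The section and key names below are illustrative only and are not the actual lar-ci schema; consult the real ci_tests.cfg in each repository.

```
# HYPOTHETICAL sketch only -- these section and key names are illustrative
# and are NOT the actual lar-ci ci_tests.cfg schema.
[test ci_prodsingle_uboonecode]
fcl     = prodsingle_uboone.fcl   # job configuration to run (illustrative name)
nevents = 1                       # single-event job
```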
One sees that two of the tests have "prodsingle" in their names. These are simple single-event, single-muon jobs: the muons are generated isotropically and homogeneously throughout the detector volume. There is one job for the MicroBooNE ("uboonecode") geometry and one for the 35ton ("lbnecode") geometry. Neither currently ensures that the random-number start seeds are set carefully. Similarly, there is a single-event GENIE job for MicroBooNE.
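The "isotropically and homogeneously" sampling can be pictured with a short stand-alone sketch. This is illustrative Python, not the LArSoft generator module the jobs actually configure via fhicl; the detector extent passed in is an arbitrary box, not a real geometry.

```python
import math
import random

def generate_isotropic_muon(detector_extent, rng=random):
    """Draw one muon start point and direction: position uniform within a
    rectangular detector box (homogeneous), direction uniform on the unit
    sphere (isotropic). Illustrative sketch only, not the LArSoft generator."""
    # Homogeneous: each coordinate uniform within its (lo, hi) extent.
    x, y, z = (rng.uniform(lo, hi) for lo, hi in detector_extent)
    # Isotropic: cos(theta) uniform in [-1, 1], phi uniform in [0, 2*pi).
    cos_theta = rng.uniform(-1.0, 1.0)
    phi = rng.uniform(0.0, 2.0 * math.pi)
    sin_theta = math.sqrt(1.0 - cos_theta**2)
    direction = (sin_theta * math.cos(phi), sin_theta * math.sin(phi), cos_theta)
    return (x, y, z), direction
```

Sampling cos(theta) uniformly, rather than theta itself, is what makes the direction distribution truly isotropic; passing an explicit `random.Random(seed)` instance is how one would pin the start seed, which (as noted above) the current jobs do not do.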
Next, there are the tests with "openold" in their names. These grab one event from a previous run stored on dCache (not necessarily in any formal, experiment-approved place) and run it through the G4, detsim, and reconstruction stages. There is a set of these for MicroBooNE and a set for 35ton. The MicroBooNE reconstruction is split across separate 2D and 3D jobs, whereas the 35ton reconstruction is aggregated into one job. (The "detsim2D" and "detsim3D" strings in the uboonecode job names should really be "reco2D" and "reco3D".)
Last, there are the "hitana" jobs. These are currently implemented only for MicroBooNE; there is no lbnecode equivalent yet. The goal is to run Kolmogorov-Smirnov tests between pairs of distributions: in our case, to compare the hit distributions on each of the U, V, and W planes for the current build/simulation/reconstruction against those from a previous, "canonical" build/simulation/reconstruction.

To make these histograms we run one LArSoft job whose fhicl file runs a simple art::EDAnalyzer module. This is the hitana_tinyana CI test. Its input is a "canonical" file of reco'd muon events generated in a forward 25-degree cone (not isotropic and homogeneous this time). Again, this is not a special, formally declared file, but one on dCache that the author happened to know about: "canonical" is an overstatement. That takes care of the old file.

For the new build against which we compare, we create these events from scratch with individual prod, detsim, reco2D, and reco3D jobs. Ideally, one would be certain that the start seeds are identical; we don't do that here. It turns out this can matter in the K-S tests: failures can arise purely from this lack of exacting care, and it is in fact instructive to see some CI test failures that arise for this reason. Hence, for now, we don't fix the start seeds.

Once the new build's events are reconstructed through Hits (reco2D is sufficient, in other words), we run the same hitana_tinyana job script over the output file to create the needed Hit histograms for the three planes. With the two tinyana root files in hand -- one for the canonical build and one for the new build -- we run the histocomp job.
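The comparison at the heart of these jobs can be sketched without ROOT. Given two binned hit distributions with the same binning, the K-S distance is the maximum gap between their normalized cumulative distributions; ROOT's TH1::KolmogorovTest performs the real test on TH1Ds. The function below is a simplified stand-alone illustration, not the lar-ci code.

```python
def ks_distance(hist_a, hist_b):
    """Maximum absolute difference between the normalized cumulative
    distributions of two histograms with identical binning (bin contents
    given as lists). Simplified analogue of the maximum-distance value
    that ROOT's TH1::KolmogorovTest reports with its "M" option."""
    if len(hist_a) != len(hist_b):
        raise ValueError("histograms must share the same binning")
    total_a, total_b = float(sum(hist_a)), float(sum(hist_b))
    cum_a = cum_b = 0.0
    max_dist = 0.0
    for a, b in zip(hist_a, hist_b):
        cum_a += a / total_a
        cum_b += b / total_b
        max_dist = max(max_dist, abs(cum_a - cum_b))
    return max_dist
```

Identical distributions give a distance of 0 and completely disjoint ones give 1, which is why unmatched start seeds (and low statistics) can push otherwise-equivalent builds toward failure.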
This job takes those two files, plus a third argument -- the minimum acceptable KS value that allows the test to pass -- as inputs and runs the packaged script called histocomp (described elsewhere in the lar-ci instructions). The root files are traversed until identically named TH1Ds are discovered. The resulting three pairs of histograms are plotted, as seen by clicking on the histocomp link from the web page at the top of these instructions.
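The pairing-and-thresholding logic just described can be summarized in a few lines. This mirrors the behavior described above, not the actual histocomp script: histograms are represented here as a simple name-to-contents mapping rather than TH1Ds read from ROOT files, and `ks_test` stands in for whatever K-S comparison is applied to each pair.

```python
def histocomp(hists_old, hists_new, min_ks_value, ks_test):
    """Pair identically named histograms from two builds and apply a K-S
    test to each pair; the job passes only if every pair's K-S value meets
    the minimum. Illustrative stand-in for the packaged histocomp script:
    `hists_*` map histogram name -> histogram, `ks_test` is the comparison
    function (e.g. one wrapping ROOT's TH1::KolmogorovTest)."""
    results = {}
    for name in sorted(set(hists_old) & set(hists_new)):
        results[name] = ks_test(hists_old[name], hists_new[name])
    passed = bool(results) and all(v >= min_ks_value for v in results.values())
    return passed, results
```

For the three-plane case described above, `results` would hold one K-S value per U, V, and W hit histogram, and a single plane falling below the threshold fails the whole test.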
A final comment: this last set of jobs -- the hitana jobs -- dominates the runtime of the CI tests. That is because the statistics of the new build's processing jobs must be sufficient for meaningful K-S tests. Right now we run only 40 events; any fewer and the Hit distributions, with the current (not-unreasonable) binning, are just too ratty. Even so, the full processing takes on the order of 1-1.5 hours.