Beamline in the Offline

This page describes the implementation of the beamline data in our offline analysis framework (NOvASoft) and the conversion algorithms which may be used to read an 'online'-formatted data file (from the DAQ) and reformat the data into something which may be processed 'offline'. This page covers only the beamline components; for a complete overview of the full data stream see TestBeam in the Offline.

Overview

The beamline of the NOvA test beam experiment contains multiple components (see Beamline information) which are read by the dedicated beamline DAQ (see DAQ overview). For data analysis purposes, this information must be incorporated into NOvASoft, the offline processing framework used in NOvA analyses.

In the offline framework, reconstruction and analyses may be developed. The offline processing chain follows these general steps:
offline conversion -> calibration -> reconstruction -> repackage to standard record -> analysis using CAFAna
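As a concrete sketch, the first and third stages map onto the nova invocations described later on this page; the calibration and repackaging stages are still in development, so they appear only as comments. The file names here are placeholders, and the commands are echoed rather than run, since nova requires a NOvASoft environment:

```shell
# Dry-run sketch of the processing chain. Builds each command line as a
# string and prints it; the real commands would be run inside NOvASoft.

# 1. Offline conversion (DAQ file -> art/NOvASoft file)
convert="nova -c beamline2rawdigitjob.fcl -s daq_file.root -o novasoft_file.root"

# 2. Calibration: configuration not yet documented on this page.

# 3. Reconstruction (e.g. time-of-flight)
reco="nova -c tofreco.fcl -s novasoft_file.root"

# 4./5. Repackaging to the standard record and CAFAna analysis follow.

echo "$convert"
echo "$reco"
```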

The NOvASoft specific developments are described briefly below before a description of how to run the software.

NOvASoft

For a full description of the offline data-processing framework used in NOvA, see https://cdcvs.fnal.gov/redmine/projects/novaart/wiki/The_NOvA_Offline_Workbook.
NOvASoft is built upon the art framework. This page is not an art tutorial so please refer to the documentation for information on the framework: https://cdcvs.fnal.gov/redmine/projects/art/wiki

The raw data products are defined in the RawData package of NOvASoft, the reconstruction-level data products specific to the beamline data are defined in BeamlineRecoBase, the offline conversion algorithms are defined in DAQ2RawDigit and the reconstruction algorithms which are being developed live in BeamlineReco. Additionally, there is a TestBeam package containing some new test beam-specific analysis software.

Beamline in the Offline Software

Previous versions of this wiki were complicated. No longer! Everything is now in trunk!
(The only caveat: to run over data files taken before January 2019, the relevant, working code lives on the feature_testbeam2018 branch. I don't imagine we will use this any more, but it's there just in case...)

The relevant packages are
  • RawData, for raw beamline data products -- probably doesn't need touching at all any more
  • DAQ2RawDigit, for offline conversion algorithms -- again, mostly developed
  • BeamlineRecoBase, for reconstruction-level data products -- again, won't be changed much
  • BeamlineReco, for reconstruction algorithms -- active development
  • TestBeam, for test beam specific analysis and other modules -- active development

They can be added to a test release as normal:

addpkg_svn -h (e.g.) TestBeam

and built using the usual
novasoft_build -t

script.

Running Beamline in the Offline

With the offline software, you may read a DAQ-formatted data file into NOvASoft and apply reconstruction and analysis to it. These processes are discussed in this section.

Converting to NOvASoft

The first step is to read in a DAQ file (saved in /daqdata on the beamline DAQ machine, novadaq@novabeamlinedaq00) and convert it into an art-formatted offline data file which may be processed in NOvASoft. There is a configuration ready to do this, and it may be run as:

nova -c beamline2rawdigitjob.fcl -s <input_daq_file.root> -o <output_novasoft_file.root>

This file will contain all the beamline data, formatted using the offline NOvASoft data formats and saved within an art::Event.

There are a couple of ways this can be configured. By default, this online->offline unpacking is strict and places very specific requirements on the data file to ensure good-quality data: for example, an equal number of spills saved by each component, an equal number of triggers within each spill saved by each component, synchronization between the various components, etc. Whilst this is absolutely necessary when processing production data (otherwise we can't be certain that what we have in the output file is correct!), it is not ideal for testing and commissioning purposes. For example, when looking at ToF data, you may not care if there are no TDU timestamps or the MWPC controller was misbehaving (which is its standard behavior).

Some parameters have been added in order to provide control over this:

UnpackTrigger
UnpackDigitizer
UnpackWireChamber
UnpackTDU

where each Boolean defines whether or not to attempt to reformat the data from each of the subdetector front-ends. All are set to true by default.
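In fhicl terms, these defaults would look something like the following sketch; the source.BeamlineUnpack parameter path matches the override example later in this section, and the values reflect the defaults stated above (all true):

```
source.BeamlineUnpack.UnpackTrigger:     true
source.BeamlineUnpack.UnpackDigitizer:   true
source.BeamlineUnpack.UnpackWireChamber: true
source.BeamlineUnpack.UnpackTDU:         true
```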

By default, the number of triggers in the spill is taken from the trigger board. However, since we have now removed the requirement of trigger-board data, we need a new way to define how we structure the events in the output spill. This may be controlled using

TriggerSource: "Trigger" 

where available options are "Trigger", "Digitizer", "WireChamber", "TDU".

All the above parameters are properties of the BeamlineUnpack algorithm, configured in DAQ2RawDigit/beamline2rawdigit.fcl. They may be changed in there, or, as always with fhicl, the preferred method would be to write a local configuration file along these lines:

#include "beamline2rawdigitjob.fcl" 

source.BeamlineUnpack.TriggerSource: "Digitizer" 
source.BeamlineUnpack.UnpackTrigger: false
source.BeamlineUnpack.UnpackWireChamber: false
source.BeamlineUnpack.UnpackTDU: false

if, for example, you wanted to look at a file containing only digitizer data. And, of course, this is all read in at run time, so no compilation is necessary. art is a truly wonderful thing.
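To run with such a local configuration, you would pass it to nova in place of the standard one. The file name digitizeronly.fcl here is hypothetical (use whatever you named your local override file), and the command is echoed rather than run, since nova requires a NOvASoft environment:

```shell
# Hypothetical usage: assuming the fhicl overrides above were saved locally
# as digitizeronly.fcl, run the conversion with that file instead.
cmd="nova -c digitizeronly.fcl -s input_daq_file.root -o output_novasoft_file.root"
echo "$cmd"
```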

These options should provide the flexibility and all the functionality we currently need. Please let Mike know if something doesn't work fully (they have only been tested trivially, they are hard to test!) or if additional support for dealing with a different unforeseen scenario is required.

Basic reconstruction

We are still developing reconstruction algorithms. In order to run reconstruction over your NOvASoft-formatted data file (see above), you can run, for example:

nova -c tofreco.fcl -s <novasoft_file.root>

This will extract the relevant raw information from the data file, run the defined algorithms over it and place the reconstruction-level quantities in the file.

New reconstruction modules may be added to the BeamlineReco package. For more information about reconstruction, see Beamline Reconstruction.