Tracker Test Beam 2015 Simulation And Analysis

For the 2015 tracker test beam we are moving our simulation and analysis to the g-2 offline software framework, using art and geant4.

This page is for keeping track of useful information for developers/users.

Contact Tom Stuttard () or Becky Chislett () with questions.

Information on the design of the simulation and analysis can be found in GM2-doc-2457 (http://gm2-docdb.fnal.gov:8080/cgi-bin/ShowDocument?docid=2457), although some elements will change during development.

Code overview

The code we are developing is:

  • Test beam simulation
    • One straw module placed in vacuum chamber
    • Beam gun representing 120 GeV p+ at FNAL MTest facility (with realistic beam characteristics, especially timing)
    • Straw DAQ readout
      • Straw digitization
      • DAQ cycle (e.g. only record protons during TDC accumulations)
    • Auxiliary silicon telescope detector
  • Test beam real data low-level processing
    • Unpacking, cleaning, time sync, etc
  • Analysis code
    • Should run on both simulated and real test beam data
    • Event selection (time islands, clusters, seeds)
    • Straw hit reconstruction for single module (one 3D hit per proton), including choosing 'good' seeds
    • Straw performance analysis (resolution, beam params, etc)
    • SiT track fit
    • "Event" builder to combine straw, SiT and truth data (maybe with graphical display)

The analysis code will be able to run on either real or simulated data.

Where to find the code

We are developing in git feature branches called trackerTestBeam, which are derived from, and kept up to date with, Tammy's strawTrackReconstruction feature branches.

How to setup the code

In the following it is assumed you have created a base directory for the offline analysis, e.g. /home/username/g-2/analysis; this directory will be referred to as XXX below. YYY is the location of the compiled online code. We presently have copies of the online code on the online gm2strawX machines under /home/nfs/gm2/mtest/daq and on the offline gm2gpvmYY machines under /gm2/app/users/t1042/daq, so YYY should be set to one of those two paths.
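For concreteness, if (purely as an example) XXX were /home/username/g-2/analysis and YYY were /gm2/app/users/t1042/daq, then the online setup line in the setup.sh below would read:

source /gm2/app/users/t1042/daq/gm2-tracker-readout-daq/software/trackerDaq-setup.sh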

Make sure you have a valid Kerberos ticket; you can check by issuing the command "klist".

Create a file setup.sh in XXX with the following contents, changing XXX to your base directory and YYY to the location of the online code:

export VERSION=v6_01_00
export WORKSPACE=gm2Dev_$VERSION

# Set up online environment : YYY is the location of the online code (gm2-tracker-readout-daq)
source YYY/gm2-tracker-readout-daq/software/trackerDaq-setup.sh

# Setup offline configuration
source /cvmfs/gm2.opensciencegrid.org/prod/g-2/setup

# Get ninja
setup ninja v1_5_3a

# Setup local workspace
cd XXX/$WORKSPACE

# Setup mrb for this workspace
source localProducts_gm2_${VERSION}_prof/setup  

# Set build env for this workspace
source mrb setEnv 

# Define useful aliases
alias makeninja='pushd $MRB_BUILDDIR ; ninja ; popd'

Other useful functions to have in this setup.sh are git_pull and hack_cmake; examples of both are in /gm2/app/users/lmark/offline/setup.sh (remember to change the paths). The command "git_pull" then updates all of the checked-out packages to the latest version in one go, and the command "hack_cmake" adds the line INCLUDE_DIRECTORIES( $ENV{GM2_TRACKER_DAQ_INC} ) to srcs/CMakeLists.txt.
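Neither function is reproduced here exactly, but as a rough sketch of what they do (the package list and the sed expression are illustrative assumptions, not copied from that file):

# Sketch only: pull the trackerTestBeam feature branch in each checked-out package
git_pull() {
  for pkg in gm2dataproducts gm2geom gm2ringsim gm2util gm2tracker gm2midastoart gm2unpackers ; do
    if [ -d "$MRB_SOURCE/$pkg" ]; then
      ( cd "$MRB_SOURCE/$pkg" && git pull origin feature/trackerTestBeam )
    fi
  done
}

# Sketch only: insert the tracker DAQ include line above the first ADD_SUBDIRECTORY line, if not already present
hack_cmake() {
  grep -q 'GM2_TRACKER_DAQ_INC' "$MRB_SOURCE/CMakeLists.txt" || \
    sed -i '0,/ADD_SUBDIRECTORY/s//INCLUDE_DIRECTORIES( $ENV{GM2_TRACKER_DAQ_INC} )\n&/' "$MRB_SOURCE/CMakeLists.txt"
}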

Then you need to create (once) the workspace:

 
cd XXX
# v6_01_00 is the $VERSION defined in the setup.sh
mkdir -p gm2Dev_v6_01_00

Then set up the gm2 offline release (also just once), where v6_01_00 is the version defined by $VERSION in setup.sh:

cd XXX
source /cvmfs/gm2.opensciencegrid.org/prod/g-2/setup
setup gm2 v6_01_00 -q prof
cd gm2Dev_v6_01_00
mrb newDev

Then you need to source the setup.sh. This also needs to be done for each new login/bash session:

cd XXX
source setup.sh

Then you need to check out all the offline packages/repositories (just once):

cd XXX/gm2Dev_v6_01_00
cd srcs
mrb g gm2dataproducts
cd gm2dataproducts
git flow feature track trackerTestBeam

cd ..
mrb g gm2geom
cd gm2geom
git flow feature track trackerTestBeam

cd ..
mrb g gm2ringsim
cd gm2ringsim
git flow feature track trackerTestBeam

cd ..
mrb g gm2util
cd gm2util
git flow feature track trackerTestBeam

cd ..
mrb g gm2tracker
cd gm2tracker
git flow feature track trackerTestBeam

cd ..
mrb g gm2midastoart  # Only if converting online data
cd gm2midastoart 
git flow feature track trackerTestBeam

cd ..
mrb g gm2unpackers   # Only if converting online data
cd gm2unpackers
git flow feature track trackerTestBeam

In doing this you will see errors like:

error: unknown option `set-upstream-to=origin/develop'

You can ignore these.
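If you prefer, the per-package checkout and branch tracking above can be scripted in one loop (a sketch; drop gm2midastoart and gm2unpackers if you are not converting online data):

cd XXX/gm2Dev_v6_01_00/srcs
for pkg in gm2dataproducts gm2geom gm2ringsim gm2util gm2tracker gm2midastoart gm2unpackers ; do
  mrb g $pkg                                             # clone the repository
  ( cd $pkg && git flow feature track trackerTestBeam )  # track the feature branch
done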

Before building the code for the first time you need to get your XXX/gm2Dev_v6_01_00/srcs/CMakeLists.txt file up to date. Do this via:

. mrb s
mrb ud
mrb uc

And then edit XXX/gm2Dev_v6_01_00/srcs/CMakeLists.txt and add the line:

INCLUDE_DIRECTORIES( $ENV{GM2_TRACKER_DAQ_INC} )

immediately ABOVE the "ADD_SUBDIRECTORY" lines at the bottom of the file.
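For illustration, the bottom of srcs/CMakeLists.txt should then look roughly like this (the exact ADD_SUBDIRECTORY lines depend on which packages you checked out):

INCLUDE_DIRECTORIES( $ENV{GM2_TRACKER_DAQ_INC} )

ADD_SUBDIRECTORY(gm2dataproducts)
ADD_SUBDIRECTORY(gm2geom)
ADD_SUBDIRECTORY(gm2ringsim)
ADD_SUBDIRECTORY(gm2util)
ADD_SUBDIRECTORY(gm2tracker)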

Now you can build the code using:

. mrb s
mrb b --generator ninja

The majority of the test beam simulation and analysis code is being developed in gm2tracker/testBeam. The changes required to extract a single straw module are in gm2geom and gm2ringsim.

You are now set up, and in future sessions you should only need to do:

cd XXX
source setup.sh
cd gm2Dev_v6_01_00/srcs
mrb b --generator ninja

If you do not add or delete files (i.e. your build dependencies don't change), then instead of the slower "mrb b --generator ninja" build command you can simply type:

makeninja

But if you add or delete files then you should do:

mrb ud # update dependency
mrb b --generator ninja  

If someone has made changes and committed them back to the FNAL repository, you can get these changes via:

git pull origin feature/trackerTestBeam

Similarly, to push your changes to the FNAL repository so they are available to everyone:

git push origin feature/trackerTestBeam

Note that before doing a "push" you should commit your changes to your local git repository via:

git add filename # if file is a new one
git commit -m "comment" filename
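Putting these together, a typical sequence for sharing a change might look like the following (the file name here is just a hypothetical example):

cd $MRB_SOURCE/gm2tracker
git status                                       # see what has changed
git add testBeam/MyNewAnalysis_module.cc         # only needed for new files (hypothetical file name)
git commit -m "Add new analysis module" testBeam/MyNewAnalysis_module.cc
git pull origin feature/trackerTestBeam          # pick up other people's changes first
git push origin feature/trackerTestBeam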

Running the code

art jobs/tasks are run using .fcl files.

To run a job from a given .fcl file type:

gm2 -c <fcl file>

There are many options you can pass to these (number of events, first event, input/output paths, etc.). Use the help option to see them:

gm2 -h
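gm2 accepts the standard art command line options, so, assuming the usual art option names (check against gm2 -h), a job limited to 100 events with explicit input and output files might look like:

gm2 -c <fcl file> -n 100 -s input.root -o output.root -T histograms.root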

Simulation

The sequence of fcl files to run (in order) for the test beam simulation is:

gm2 -c $MRB_SOURCE/gm2tracker/fcl/mtestGun.fcl  # generate a spill of events (each event is one p+)
gm2 -c $MRB_SOURCE/gm2tracker/fcl/mtestGunAnalysis.fcl  # produce histos from the gun output
gm2 -c $MRB_SOURCE/gm2tracker/fcl/mtestSimtoDigits.fcl  # apply readout cycle filters to the generated data and build an accumulation event, then pass to straw digitizer (each event is one DAQ accumulation)
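If you want to keep the outputs of different runs of the chain separate, the same stages can be wired together explicitly with -s/-o (the file names below are just examples):

gm2 -c $MRB_SOURCE/gm2tracker/fcl/mtestGun.fcl -n 1000 -o mtestGun_test.root
gm2 -c $MRB_SOURCE/gm2tracker/fcl/mtestGunAnalysis.fcl -s mtestGun_test.root
gm2 -c $MRB_SOURCE/gm2tracker/fcl/mtestSimtoDigits.fcl -s mtestGun_test.root -o mtestSimtoDigits_test.root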

The output of this chain can be processed using the analysis tools listed below.

Test beam data processing

To process a MIDAS data file from the test beam up to the point where the analysis tools can be used (i.e. to get to the same stage the simulation chain above ends at), use these steps (here for example run number 00340):

gm2 -c $MRB_SOURCE/gm2midastoart/fcl/midasBankInput.fcl -s $MIDAS_ONLINE_DATA/run00340.mid.gz -o midasBankInput_00340.root  # Import MIDAS events to art events
gm2 -c $MRB_SOURCE/gm2unpackers/fcl/MetaData.fcl -s midasBankInput_00340.root -o MetaData_00340.root  # Start MetaData record
gm2 -c $MRB_SOURCE/gm2unpackers/fcl/AllUnpacker.fcl -s MetaData_00340.root -o AllUnpacker_00340.root  # Unpack raw data
gm2 -c $MRB_SOURCE/gm2unpackers/fcl/DataCleaning.fcl -s AllUnpacker_00340.root -o DataCleaning_00340.root  # Clean unpacked data
gm2 -c $MRB_SOURCE/gm2unpackers/fcl/SyncTimings.fcl -s DataCleaning_00340.root -o SyncTimings_00340.root  # Sync timings between detectors
gm2 -c $MRB_SOURCE/gm2unpackers/fcl/RawDigitizer.fcl -s SyncTimings_00340.root -o RawDigitizer_00340.root  # Produce raw digits
gm2 -c $MRB_SOURCE/gm2tracker/fcl/mtestRawtoDigits.fcl -s RawDigitizer_00340.root -o mtestRawtoDigits_00340.root  # Produce digits
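Since each step simply takes the previous step's output file, the whole chain for a given run can be scripted, for example (a sketch; adjust the run number and paths as needed):

RUN=00340
# Import MIDAS events, then run each processing step on the previous step's output
gm2 -c $MRB_SOURCE/gm2midastoart/fcl/midasBankInput.fcl -s $MIDAS_ONLINE_DATA/run${RUN}.mid.gz -o midasBankInput_${RUN}.root
PREV=midasBankInput_${RUN}.root
for STEP in gm2unpackers/fcl/MetaData gm2unpackers/fcl/AllUnpacker gm2unpackers/fcl/DataCleaning gm2unpackers/fcl/SyncTimings gm2unpackers/fcl/RawDigitizer gm2tracker/fcl/mtestRawtoDigits ; do
  NAME=$(basename $STEP)
  gm2 -c $MRB_SOURCE/${STEP}.fcl -s $PREV -o ${NAME}_${RUN}.root
  PREV=${NAME}_${RUN}.root
done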

The following plotters are available if you want to check these steps:

gm2 -c $MRB_SOURCE/gm2unpackers/fcl/AllUnpackerPlots.fcl -s AllUnpacker_00340.root  # Plot unpacker output
gm2 -c $MRB_SOURCE/gm2unpackers/fcl/DataCleaningPlots.fcl -s DataCleaning_00340.root  # Plot data cleaning output
gm2 -c $MRB_SOURCE/gm2unpackers/fcl/SyncTimingsPlots.fcl -s SyncTimings_00340.root  # Plot timing sync output
gm2 -c $MRB_SOURCE/gm2unpackers/fcl/RawDigitizerPlots.fcl -s RawDigitizer_00340.root  # Plot raw digitization output

Analysis

Note that these tools are evolving daily during the ongoing test beam data analysis, so this information may go out of date quickly.

Analysis runs on digits, which are the output of both the simulation and the test beam data processing chains.

A simple example to try following the "Test beam data processing" section above is:

gm2 -c $MRB_SOURCE/gm2tracker/fcl/mtestExamplePlot.fcl -s mtestRawtoDigits_00340.root  # Make an example plot

The full chain for processing the digit data (from sim or test beam) is listed below. Use the "-s" argument to the "gm2" executable to specify the input digit file, and hence whether you are running over simulated (e.g. mtestSimtoDigits.root) or real (e.g. mtestRawtoDigits_00340.root) data:

gm2 -c $MRB_SOURCE/gm2tracker/fcl/mtestDigitsAnalysis.fcl  # Look at digits before reco
gm2 -c $MRB_SOURCE/gm2tracker/fcl/mtestReco.fcl  # perform event selection (time island, cluster, seed, reco seed) and reconstruction (get drift times and then hit position) on straw hits to build a module event 
gm2 -c $MRB_SOURCE/gm2tracker/fcl/mtestRecoAnalysis.fcl  # produce histos from the reconstruction output
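For example, to run this chain on the real data digits produced above (the output file names here are just examples):

gm2 -c $MRB_SOURCE/gm2tracker/fcl/mtestDigitsAnalysis.fcl -s mtestRawtoDigits_00340.root
gm2 -c $MRB_SOURCE/gm2tracker/fcl/mtestReco.fcl -s mtestRawtoDigits_00340.root -o mtestReco_00340.root
gm2 -c $MRB_SOURCE/gm2tracker/fcl/mtestRecoAnalysis.fcl -s mtestReco_00340.root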

A test beam real data analysis is currently being developed.

Software versions

We are using the g-2 offline software framework v6_01_00.