Beam Sim Files » History » Version 78
Beam Sim Files¶
Set up the environment:
source /grid/fermiapp/lariat/setup_lariat.sh
setup lariatsoft v06_05_00 -q e10:prof
export GROUP=lariat
export JOBSUB_GROUP=lariat
setup G4beamline v2_16 -q e6:prof:nu
Move to your working directory (probably within your app area) and set up your beamline simulation working area:
git clone -b develop ssh://firstname.lastname@example.org/cvs/projects/lariatsoft
cd lariatsoft/BeamlineSim/
sed -i "s/MYUSERNAME/$USER/g" jobdir_SurveyedGeom/*
Example: generate 10 jobs of 1k spills with a 64 GeV pi+ secondary beam and the magnets at +100 A.
Now make a new job directory from your template:
./MakeNewJobDir.py jobdir_SurveyedGeom -A 100.0 -E 64 --jobsize 1000 --jobcount 10 --spillsize 1000
You can first run 100 interactive events and check that the output is sensible:
g4bl jobdir_SurveyedGeom_10jobsof1k_64GeV_pos100Amps/LAriaT_13degProdxn_10degAna_SurveyedGeom_10jobsof1k_64GeV_pos100Amps.in first=0 last=100
Make sure you have a directory in /pnfs/lariat/scratch/ for the grid job output files:
mkdir /pnfs/lariat/scratch/users/$USER
mkdir /pnfs/lariat/scratch/users/$USER/MCdata
Now run a number of grid jobs:
cd jobdir_SurveyedGeom_10jobsof1k_64GeV_pos100Amps
./Jobsubmit.sh input LAriaT_13degProdxn_10degAna_SurveyedGeom_10jobsof1k_64GeV_pos100Amps.in $PWD
Keep an eye on your jobs:
If the jobs finish within a few minutes, they probably failed. In that case, use jobsub_fetchlog to retrieve the log files and investigate:
jobsub_fetchlog -J YourJobID
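To watch your jobs while they run, the standard jobsub client tools can be used (this assumes the jobsub tools are set up in your session, as they are after the environment setup above); for example:

```shell
# List the status of your own jobs in the lariat group
jobsub_q --group=lariat --user=$USER

# Once a job has finished or failed, fetch its logs by job ID
jobsub_fetchlog -J YourJobID
```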
Once the jobs are complete, merge all the data files with hadd:
hadd MergedAtStartLinesim_LAriaT_13degProdxn_10degAna_SurveyedGeom_10jobsof1k_64GeV_pos100AmpsAll.root /pnfs/lariat/scratch/users/$USER/MCdata/MergedAtStartLinesim_LAriaT_13degProdxn_10degAna_SurveyedGeom_10jobsof1k_64GeV_pos100Amps*.root
Processing G4BL Into Text Files for LArG4¶
At this point, you should have a number of ROOT files, each typically representing one beam spill (though a file can contain more than one spill).
The next step is to process each spill, find the triggering particles and the close-in-time non-triggering particles, and write them to a text file in a hepevent format that can be parsed by LArG4.
This requires "G4BLTxtJobLauncher.py", "hepevtWriter.py" and "Treescript.sh", all found in lariatsoft/BeamlineSim/jobdir_SurveyedGeom/.
Make a directory to store the textfile output:
This is where grid jobs will send the text files when completed, but the directory must exist beforehand or the jobs will crash.
You can change the output directory, but then you must change the path in Treescript.sh, line 34, to the directory you want to use.
python G4BLTxtJobLauncher.py /path/to/directory/with/G4BLfiles --heppath /path/to/hepevtWriter.py
If --heppath is not set, the grid job will look for hepevtWriter.py in the same directory as the spill ROOT files.
G4BLTxtJobLauncher.py finds the spill ROOT files in the G4BLFiles directory and, for each file, copies the name into Treescript.sh and executes jobsub_submit.
Treescript.sh is what actually runs on the grid: it runs hepevtWriter.py on the spill file and returns the text-file output from hepevtWriter.py to the LArG4Files directory.
Each text file created by hepevtWriter.py has all the "events" found in a single spill file.
This stage is the most time-intensive: jobs can take more than 30 hours to complete.
hepevtWriter.py has flags for processing only part of a file; using them speeds up checking that your jobs run at all.
Understanding the Hepevt Format¶
When hepevtWriter.py finds a particle that passes through all the detectors necessary to issue a trigger in data, it looks for all particles that pass through "StartLine" within ±393 µs of the triggering particle.
The trigger and all close-in-time particles are grouped into one event. Most events contain the trigger, a few other charged particles, and a number of photons.
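The grouping described above can be sketched in Python. This is an illustrative sketch of the logic only, not the actual hepevtWriter.py code; the dict layout and function name are assumptions.

```python
# Sketch of the event-building logic: given StartLine crossings as
# dicts with a time "t" in ns, collect every particle within
# +/-393 us (393,000 ns) of each trigger into one event.

WINDOW_NS = 393_000  # half-width of the window around the trigger

def build_events(triggers, particles):
    events = []
    for trig in triggers:
        close = [p for p in particles
                 if abs(p["t"] - trig["t"]) <= WINDOW_NS]
        events.append({"trigger": trig, "particles": close})
    return events
```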
Here is an example of how an event could look in the output text file:
15 2
1 211 0 0 0 0 -0.16 0.0 0.75 0.779 0.13957 114.0 3.0 -720 -19.7
1 22 0 0 0 0 -0.0029 0.0015 0.025 0.025 0.0 112.0 4.5 -719.3 -3891.8
The first line specifies the event number and the number of particles in the event.
For each particle in the event, a line of numbers is given:
1 PDG 0 0 0 0 Px Py Pz E m X Y Z t
Momentum, mass, and energy are given in GeV; positions in cm, relative to the TPC coordinate system; time in ns, with t = 0 defined as when the triggering particle passes through the DSTOF.
Particles are fired from StartLine. The first particle in the example, a pi+, was the trigger: it crossed StartLine at t ≈ -20 ns, takes 19.7 ns to travel the beamline, and should hit the DSTOF at t = 0.
The other particle, a photon, passed through StartLine about 3.9 µs before the triggering pion.
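A minimal Python reader for one such event block, using the field layout described above, can look like this. It is an illustrative sketch, not part of lariatsoft; the function name and dict keys are made up for the example.

```python
import math

# Field layout per particle line, from the description above:
#   status pdg 0 0 0 0 px py pz E m x y z t
# Momenta, energy, and mass in GeV; positions in cm; time in ns.

def read_event(lines):
    event_no, n_particles = map(int, lines[0].split())
    particles = []
    for line in lines[1:1 + n_particles]:
        f = line.split()
        particles.append({
            "pdg": int(f[1]),
            "p":   tuple(map(float, f[6:9])),    # (px, py, pz), GeV
            "E":   float(f[9]),                  # GeV
            "m":   float(f[10]),                 # GeV
            "pos": tuple(map(float, f[11:14])),  # (x, y, z), cm
            "t":   float(f[14]),                 # ns
        })
    return event_no, particles

event = """15 2
1 211 0 0 0 0 -0.16 0.0 0.75 0.779 0.13957 114.0 3.0 -720 -19.7
1 22 0 0 0 0 -0.0029 0.0015 0.025 0.025 0.0 112.0 4.5 -719.3 -3891.8"""

no, parts = read_event(event.splitlines())
px, py, pz = parts[0]["p"]
beta = math.sqrt(px**2 + py**2 + pz**2) / parts[0]["E"]  # ~0.98 for this pi+
```

The velocity check at the end shows how the stated 19.7 ns flight time is consistent with a near-light-speed pion.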
Processing Text Files in LArG4¶
Now that you have a directory full of text files, one created for each G4BL file, process the text files in LArG4.
This requires "LArG4JobLauncher.py" and "LArG4_Example.xml" in lariatsoft/BeamlineSim/jobdir_SurveyedGeom/ and "prodtext_lariat.fcl" in lariatsoft/JobConfigurations.
This also requires knowing the magnetic field conditions of the G4BL file used to make each text file. This information is probably somewhere in the textfile name.
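Since the field conditions are likely encoded in the file name, a small helper can recover them. The naming pattern below (e.g. "pos100Amps") is an assumption based on the directory names used earlier on this page, and the function is a hypothetical example, not part of lariatsoft.

```python
import re

def magnet_current_amps(filename):
    """Extract the signed magnet current from a name like
    '..._64GeV_pos100Amps_spill3.txt'; return None if absent."""
    m = re.search(r"(pos|neg)(\d+)Amps", filename)
    if m is None:
        return None
    sign = 1 if m.group(1) == "pos" else -1
    return sign * int(m.group(2))
```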
Before this stage can be launched, several edits must be made. These edits are listed in LArG4JobLauncher.py; open it in any text editor for the instructions.
python LArG4JobLauncher.py /folder/where/you/have/saved/text/files/
LArG4JobLauncher.py loops over the hepevent text files in your specified directory and for each text file:
- Gets the name of the text file
- Counts how many events are in the text file
- Edits LArG4_Example.xml so that <numevents> equals the number of events in this file
- Edits prodtext_lariat.fcl to point to this text file
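The "count events" step above can be sketched as follows. This is a hypothetical helper, not LArG4JobLauncher.py itself: in the hepevent text format each event starts with a header line "<event_no> <n_particles>" followed by that many particle lines, so counting headers gives the event count.

```python
def count_events(path):
    n_events = 0
    with open(path) as f:
        lines = iter(f)
        for header in lines:
            if not header.strip():
                continue  # tolerate blank lines between events
            n_particles = int(header.split()[1])
            for _ in range(n_particles):
                next(lines)  # skip this event's particle lines
            n_events += 1
    return n_events
```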
When complete, you should have a set of job directories, numbered from 0 up to the number of text files, each containing the output from LArG4.
Each LArG4 output file has TPC digits, and AuxDetSimChannels for the beamline detectors.
jobsub_fetchlog and the other commands useful for diagnosing misbehaving jobs are documented at: https://cdcvs.fnal.gov/redmine/projects/jobsub/wiki
Studies Awaiting an Author¶
- Particle multiplicity by species (per event)
  - At StartLine and entering the TPC, separately
- WCTrack studies:
  - Hit multiplicity in each instrument (per event)
  - Momentum spectrum (reco vs. truth)
  - WCTrack purity (% of tracks with hits from multiple particles)
  - WCTrack efficiency (show the quality variable, vertical plane linearity)
  - Hit clustering in time-wire space; compare to data
- Time of Flight studies:
  - Hit multiplicity in each instrument (per event)
  - TOFHit purity (% of TOFHits with hits from multiple particles)
- Particle ID studies:
  - Reco mass spectrum (probably too good, so tweak uncertainties on TOF and momentum to match the data reco)