
Using hypotcode for test jobs

We now have a package for test jobs called "hypotcode" (in the style of uboonecode, sbndcode, etc.) for the hypothetical "hypot" experiment.

Like the experiment software packages, hypotcode is centered on an experiment framework executable, named "hypot.exe", which can be configured to generate events, simulate events, reconstruct events, etc.

Unlike actual experiment frameworks, which generate .root files with ART extensions, hypotcode generates simple text files with literal embedded SAM metadata.

Also included in hypotcode is hypot_metadata_extractor, which extracts SAM metadata from hypotcode files, and a config file for fife_launch (in the fife_utils package). That config runs jobs under the "samdev" SAM instance in the plain fermilab VO, using the FTS dropbox on fermicloud045 to drop off output files.
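Since the metadata is embedded as literal text, here is a sketch of what such a file and its extraction could look like. The file layout, marker line, and metadata fields below are entirely hypothetical illustrations, not the real hypotcode format:

```shell
# Hypothetical example only: the real hypotcode file layout may differ.
# Write a hypot-style text file whose tail carries literal SAM metadata,
# then pull the metadata back out, as a metadata extractor would.
cat > demo.troot <<'EOF'
event 1 px=0.12 py=-0.30 pz=1.07
event 2 px=0.45 py=0.22 pz=0.88
=== SAM metadata ===
{"file_name": "demo.troot", "file_type": "test", "data_tier": "generated"}
EOF

# extract everything after the marker line
sed -n '/^=== SAM metadata ===$/,$p' demo.troot | tail -n +2
```

The point is only that plain text files can carry their own SAM metadata, so no ROOT/art machinery is needed to declare them.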

Note that all of the files for this setup are kept in scratch dCache, so previous runs, etc. tend to disappear after a while.

We'll run a few stages of a typical workflow: generating Monte Carlo events, modeling them, simulating the detector response, and reconstructing them.

First interactively:

# generate events
hypot.exe -o gen.troot -T hist_gen.troot -c gen.fcl
# model
hypot.exe -o g4.troot -T hist_g4.troot -c g4.fcl -s gen.troot
# detector simulation
hypot.exe -o sim.troot -T hist_sim.troot -c sim.fcl -s g4.troot
# reconstruct
hypot.exe -o reco.troot -T hist_reco.troot -c reco.fcl -s sim.troot

This of course leaves all the files here in your current directory, and does nothing with SAM.
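The four commands follow one pattern: each stage's -s input is the previous stage's -o output. A dry-run sketch of that chain (DRYRUN=echo just prints each command; clear it to actually invoke hypot.exe):

```shell
# Print (or, with DRYRUN cleared, run) the four hypot.exe stages in order,
# feeding each stage's output file into the next stage's -s option.
DRYRUN=echo
prev=""
for stage in gen g4 sim reco; do
  src=""
  [ -n "$prev" ] && src="-s $prev"
  $DRYRUN hypot.exe -o ${stage}.troot -T hist_${stage}.troot -c ${stage}.fcl $src
  prev=${stage}.troot
done
```

This is only a convenience wrapper around the commands above; the fife_launch configuration below does the same chaining through SAM datasets instead of local files.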

Note that this code is really trivial, and hypotcode does a sleep(300) at the end to make
it take long enough to be a useful test job.

Next we want to submit jobs to do this, and have SAM do our bookkeeping.
This is all configured in the hypot_generic.cfg file in $HYPOTCODE_DIR/fife_launch_config.

setup hypotcode
setup fife_utils
fife_launch -c hypot_generic.cfg --stage=gen2  -Osubmit.N=3  -Ojob_output.add_to_dataset=${USER}_data1

This will run 3 generation-phase jobs and add the generated files to a dataset named ${USER}_data1 (with ${USER} replaced by your username).
The full jobsub_submit command used will be printed, since debug is on.
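The -O flags override individual keys in hypot_generic.cfg, which is an INI-style file read by fife_launch: -Osection.key=value replaces key in [section]. As a hedged sketch (the actual shipped file will differ; these are just the keys overridden above):

```ini
[submit]
; -Osubmit.N=3 overrides this to run 3 jobs
N = 1

[global]
; input dataset for the stage; set per stage with -Oglobal.dataset=...
dataset =

[job_output]
; output files are added to this SAM dataset;
; set with -Ojob_output.add_to_dataset=...
add_to_dataset =
```

Anything not overridden on the command line keeps its value from the file, so a single cfg can drive all four stages.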

You can check your jobs with jobsub_q, etc. as usual; then check the samdev FTS and make sure the output files get copied out.

Next we can model those events:

fife_launch -c hypot_generic.cfg --stage=g4 -Osubmit.N=3  -Oglobal.dataset=${USER}_data1 -Ojob_output.add_to_dataset=${USER}_data2

Next we can simulate those events:
fife_launch -c hypot_generic.cfg --stage=sim -Osubmit.N=3  -Oglobal.dataset=${USER}_data2 -Ojob_output.add_to_dataset=${USER}_data3

And finally reconstruct those events:
fife_launch -c hypot_generic.cfg --stage=reco -Osubmit.N=3  -Oglobal.dataset=${USER}_data3 -Ojob_output.add_to_dataset=${USER}_data4
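The pattern in the three submissions above is mechanical: each stage reads dataset ${USER}_dataN and adds its output to ${USER}_data(N+1). A sketch that prints the whole chain for review (echo only; drop the echo to actually submit):

```shell
# Print the chained fife_launch commands for the g4, sim, and reco stages.
# Stage i consumes ${USER}_data$i and fills ${USER}_data$((i+1)).
i=1
for stage in g4 sim reco; do
  echo fife_launch -c hypot_generic.cfg --stage=$stage -Osubmit.N=3 \
    -Oglobal.dataset=${USER}_data$i \
    -Ojob_output.add_to_dataset=${USER}_data$((i+1))
  i=$((i+1))
done
```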

Or we can go over to POMS development, and run this as a POMS campaign -- see:

fake_demo2