Workflow Case Study -- Monte Carlo

Our first workflow case study is a simulation workflow with three stages:

  • event generation
  • simulation
  • reconstruction

We would like the event generation split across multiple submissions, and then
simulation and reconstruction submissions for each of those, each processing the output
of the preceding stage. In this discussion, we assume you have already defined
Job Types for each of these stages and tested them with a few test data files.

There are two major variations of this: using SAM to track all the files, or letting
your scripts handle file tracking themselves.

Using SAM

A popular way to use SAM for Monte Carlo simulations is to have a dataset of small art framework .fcl files, one for each simulation run; for example, a script can write the .fcl files and declare them to SAM. The NOvA dataset prod_fcl_v01.60_fd_genie_fhc_nonswap_fhc_nova_v08_full_batch1_v5 is one such dataset.
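A generator script along those lines might look like the following minimal sketch, using the samweb_client Python API. The file names, metadata values, and experiment name here are illustrative assumptions; the metadata SAM requires varies by experiment.

    import os
    import samweb_client

    samweb = samweb_client.SAMWebClient(experiment="nova")

    for run in range(2000):
        fname = "prod_fcl_example_run%04d.fcl" % run
        with open(fname, "w") as f:
            # hypothetical fcl contents; real files would configure the generator job
            f.write("physics.producers.generator.runNumber: %d\n" % run)

        # declare the file's metadata to SAM (required fields are experiment-specific)
        samweb.declareFile(md={
            "file_name": fname,
            "file_size": os.path.getsize(fname),
            "file_type": "unknown",
            "file_format": "fcl",
            "data_tier": "fcl",
        })

    # collect the declared files into a named dataset definition
    # (dimension syntax simplified; % is the SAM wildcard)
    samweb.createDefinition("prod_fcl_example_batch1",
                            "file_name prod_fcl_example_run%.fcl")

The files would still need to be copied to storage and given SAM locations (e.g. with samweb.addFileLocation) before jobs can fetch them.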

Now that particular dataset has 2000 files in it; let's say you want to split it into 4 subsets.
We can do that using the dataset splitting feature of POMS. If you edit your campaign in the Edit Campaigns page, this looks like:

Here we're using the "mod" (modulus) split type to break the dataset into 4 parts.
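POMS handles the splitting internally, but conceptually a "mod" split of 4 assigns each file to a subset by its index modulo 4, along the lines of this illustrative sketch:

    # illustrative only: how a "mod" split of 4 partitions a 2000-file dataset
    files = ["prod_fcl_example_run%04d.fcl" % i for i in range(2000)]
    n_subsets = 4
    subsets = [files[i::n_subsets] for i in range(n_subsets)]
    for i, subset in enumerate(subsets):
        print("subset %d: %d files" % (i, len(subset)))  # 500 files each

Each submission in the split then runs over one of those 500-file subsets.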

When using SAM, you will be giving each stage of your workflow a dataset to use, so it is critical that your Job Type commands use %(dataset)s to set the input dataset that the job will use. It
is also important to use a Completion Type of "Located", so we wait for the output files to actually be declared to SAM before starting the next stage of the workflow.
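The %(dataset)s placeholder is expanded with Python-style string substitution when POMS builds the launch command. As a sketch (the fife_launch invocation and config file name here are hypothetical):

    # how POMS expands %(dataset)s in a Job Type launch command (illustrative)
    template = "fife_launch -c mc_sim.cfg -Oglobal.dataset=%(dataset)s"
    print(template % {"dataset": "prod_fcl_example_batch1_slice0"})
    # -> fife_launch -c mc_sim.cfg -Oglobal.dataset=prod_fcl_example_batch1_slice0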

Using "fake" datasets for other parameters

Let's say that instead of having datasets of .fcl files, you have one configuration for your Monte Carlo, but you need to run several batches at different energy levels. We can use the "list" split type to give POMS a list of "datasets", but not actually use them as datasets; rather, they become parameters to our Job Type script.

This looks like the following:

In the Edit Campaign Layer page:

we give the list of energy levels 0.5, 1, 2, 4, 8, 16. Then in the Job Type, we use the %(dataset)s replacement in the parameters editor to pass that energy level on the submission command line.
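So if the Job Type parameters include something like --energy %(dataset)s (a hypothetical flag name), the job script receives the value as an ordinary argument:

    # hypothetical job script: reads the "dataset" value as an energy level
    import argparse

    parser = argparse.ArgumentParser()
    parser.add_argument("--energy", type=float, required=True,
                        help="energy level passed by POMS in the dataset slot")
    args = parser.parse_args()
    print("generating events at energy %.1f" % args.energy)

A launch for the third entry of the list would then run the script with --energy 2.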

When not using SAM, you are relying on each stage of your workflow knowing where the previous stage put its output files. In this case it is important to use a Completion Type of "Completed", so we do not wait for SAM locations to appear on the intermediate files.
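A common convention is for the stages to agree on a directory layout, so each stage can simply list the previous stage's output area. A minimal sketch, with purely illustrative paths:

    # hypothetical convention: each stage writes into a per-stage directory
    import glob

    # the simulation stage picks up whatever the generation stage produced
    inputs = sorted(glob.glob("/pnfs/myexp/scratch/mc_campaign/generation/*.root"))
    for f in inputs:
        print("would process", f)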

Chaining them together

So now that we've configured POMS to split our work into multiple jobs at the simulation phase, we want to trigger job launches to complete the workflow:

  • we want each stage of our workflow to depend on the previous one
  • we want the first stage to depend on itself, so it launches the next batch as each one completes (see the sketch after this list)
  • we want to tag them all with a name for our overall campaign
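Written out as data for clarity, the dependency graph we are after looks like this (the stage names are just the ones from this example):

    # dependency graph for this workflow: stage -> stages it depends on
    depends_on = {
        "generation":     ["generation"],   # self-dependency launches the next split slice
        "simulation":     ["generation"],
        "reconstruction": ["simulation"],
    }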

This is done in the Edit Campaign Stage screens. Starting from the back, we want the reconstruction stage to depend on the simulation stage:

Next we configure the simulation to depend on the generation.

(screenshots omitted)

Finally, we configure the generation stage to depend on itself, launching the next batch of generation as each submission completes.

Tagging them into a group

Next we want to give this whole workflow a name. This involves adding a "tag" to each of the campaign stages. We can do that quickly by going to the Campaign Stages page, filling in some strings to match the campaign stages we want, and tagging them all at once; or by going to the individual Campaign Info pages and adding the tag to each one.

Checking the workflow

Now we can go to the Campaign Tags page and check the dependencies:

Here we can see the later stages depending on the previous ones, and the first stage depending on itself to launch the next batch in the split set.

Watching the workflow

Once we launch jobs on the first stage, we can use the submissions link on the Campaign Tags page to watch all the submissions in the workflow.


Here we can see the submissions for all three stages as the workflow progresses over time.