Older instructions (not recommended)

In general, FLUGG is rather temperamental about having the proper environment variables set. So, if you are having problems performing any of the tasks below, try logging in to a fresh terminal and running only the setup scripts specified in the instructions for that task.

Checking out

To check out the software necessary to run FLUGG, go to the directory where you wish to install it (currently only verified to work in the /nova/app area) and follow the instructions from

Setting Up and Compiling

Compilation of g4numi_flugg must be rerun if you modify the geometry or switch between the MINOS and NOvA geometries. It must also be done after you check out numisoft, before you can run simulations. To do this, perform the following steps.

Login to a nova machine with no setup scripts
Go to your numisoft/ directory

. g4numi_flugg/scripts/ [experiment], where [experiment] is either minos or nova. 
cd ..

The script in g4numi_flugg/scripts/ does several things. It sets several environment variables and runs a setup script in the FLUGG directory so that FLUGG can compile. As originally set up, compiling g4numi_flugg required write access to the FLUGG code area. However, by setting the environment variable G4WORKDIR (this is done in the g4numi_flugg/scripts/ setup script), the code is now compiled and placed in the user's local area. This prevents people from overwriting each other's code. Several other environment variables are set so that FLUGG uses the right versions of the programs on which it depends, such as Fluka and Geant4.
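As a rough sketch, the effect of the G4WORKDIR redirection looks like the following. The path and variable value here are assumptions for illustration only, not the actual contents of the setup script:

```shell
# Hypothetical illustration: pointing G4WORKDIR at the user's own area
# means build products land there instead of the shared FLUGG code area,
# so no write access to the shared code is needed.
export G4WORKDIR="$HOME/g4workdir"   # assumed path; the real script chooses this for you
mkdir -p "$G4WORKDIR"
echo "build products will go under $G4WORKDIR"
```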

The script configures the g4numi source files for either the MINOS geometry with the low energy target, or the NOvA geometry with the medium energy target and modified horn 1. The geometry files are stored, and should be modified, in g4numi/src and g4numi/include. Several of these files end with the suffixes .le_target and .me_target; these are the versions of the code for the MINOS and NOvA configurations, respectively. The setup script copies the proper versions of these files into the g4numi_flugg/src and g4numi_flugg/include directories; the copies do not have the target suffixes. The [experiment] parameter specifies which set of files is copied for compilation.

You need to recompile after running g4numi_flugg/scripts/

Running Simulations

Before running, you must create the directory in which the flux files will be stored. You will probably start by running interactively, off the grid, so create a new directory. For example:

 mkdir /nova/app/users/[username]/fluxfiles 

Once g4numi_flugg has been compiled, it is ready to generate simulated events. The generation is done by running g4numi_flugg/scripts/g4numi. The easiest way to run simulations is with an executable wrapper script that sets the environment variables and calls it.

An example of such a script (for running interactively or on the local condor queue), which we shall call [wrapper script], is:

umask 002

export OUTPUTDIR=/nova/app/users/[username]/fluxfiles

export EVTS=$1
export SPECIAL=$2
export TGT_GEOM=nova
export PERIOD=1
export LOWTH=true
[Insert path to your script directory]/

The meanings of the environment variables set in this script are described in the following section.

There are three ways to run FLUGG simulations: interactively, on the local condor queue, or on the grid.

To run interactively, first make the directory fluxfiles at the same level where numisoft is installed; this is where the output from FLUGG is placed by default. To run the script above interactively:

export PROCESS=0
chmod a+x
./ 1000 _script_test

This will generate 1000 events in a root file in the directory fluxfiles/flugg_mn000z200i_script_test/Run000. For the meaning of mn000z200i and the overall naming convention for simulations, see the Naming Convention section below.

For jobs much larger than this, it is much more convenient to use either the condor queue or the grid. For general instructions on using these queues, see Using Condor and Running on Grid from IF Cluster Machines. The script above can be submitted to the condor queue as follows:

% source /grid/fermiapp/products/nova/etc/
% setup jobsub_tools
% jobsub ./ 100000 _script_test

The output will be placed in the same directory as if it were run interactively.

For the grid, a few lines should be added to the beginning of the script:


umask 002 #Ensure that files are user and group writable by default

mkdir ${_CONDOR_SCRATCH_DIR}/output

export SHAREDDIR=/nova/data/flux/flugg #Use this path for official MC only; 
# you can make your own directory in the /nova/data area
export GRID=1

The resulting script can then be submitted to the grid:

% source /grid/fermiapp/nova/condor-scripts/
% jobsub -g ./ 100000 _script_test

If you have compiled on a machine running Scientific Linux 5, add the flag -onlySL5 to jobsub when submitting to the grid:

% jobsub -g -onlySL5 ./ 100000 _script_test

The output should appear in $SHAREDDIR.

Environment Variables

The script reads several environment variables that can be used to configure the geometry, output, and other options.

The script requires that four environment variables be set, or it will crash with an error message: RUN, EVTS, PERIOD, and TGT_GEOM.

RUN specifies the run number for this simulation. It defines the output directory name and sets the random seed. If running in the condor queue or on the grid, set this variable to $PROCESS.
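For example, a batch wrapper might derive RUN from the batch system's process number. This is a sketch; here PROCESS is given a fallback by hand for illustration, whereas condor sets it for real jobs:

```shell
# Each condor/grid job gets a distinct $PROCESS; using it as RUN gives
# every job a unique run number (and hence a unique random seed).
export PROCESS=${PROCESS:-0}   # condor sets this; default to 0 when run interactively
export RUN=$PROCESS
echo "RUN=$RUN"
```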

EVTS specifies the number of protons-on-target (POT) to simulate.

PERIOD: For the MINOS configuration, this variable sets various conditions (e.g. helium in the decay pipe) to correspond to the different MINOS run periods (currently 1-4). This variable must still be set to something for the NOvA simulation, but it does not yet have any effect there.

TGT_GEOM can be set to either minos or nova; this determines the name of the FLUGG output directory, the default horn current values, and default target z position.

Three environment variables are required if running on the grid:

GRID is a flag to let the script know you are running on the grid. Set it to 1 if running on the grid; leave it undefined otherwise.

OUTPUTDIR is the directory to which the output is written on the grid node running the job. It is temporarily stored here and copied back to a shared area after the job is finished. The standard way to set this is to have the following lines in a wrapper script:

% mkdir ${_CONDOR_SCRATCH_DIR}/output
% export OUTPUTDIR=${_CONDOR_SCRATCH_DIR}/output

Use setenv rather than export for csh scripts.

SHAREDDIR is the shared area to which the files are copied and permanently stored after the grid job finishes. For official MC production, this should be /nova/data/flux/flugg.

Several other optional and useful environment variables are also available. Note the naming convention for SPECIAL.

CURRENT sets the horn current (in kA); 185 by default for MINOS, 200 by default for NOvA.

RHC: Set this to 1 to run reversed horn current simulations. It will make the current set by CURRENT negative.
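The sign flip can be sketched in shell arithmetic. This is an illustration of the behavior, not the script's actual code:

```shell
export CURRENT=200   # NOvA default horn current in kA
export RHC=1         # request reversed horn current
if [ "$RHC" = "1" ]; then
  CURRENT=$(( -CURRENT ))   # RHC negates the value set by CURRENT
fi
echo "horn current: ${CURRENT} kA"   # prints: horn current: -200 kA
```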

TARGETZ sets the target z pull-back in cm; 10 by default for MINOS, 0 by default for NOvA.

STEPL sets the maximum step size in cm. Include a decimal point, for example
% export STEPL="1.0"
The default is 1.0 if it is not set.

SPECIAL: The contents of SPECIAL are appended to the run type. For example, if SPECIAL="_sh", then the files are output to flugg_le010z185i_sh/. For official NOvA MC, SPECIAL should be set to "_[date]_[description]", where [date] is in the format YYYYMMDD and [description] is optional.
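For instance, with a hypothetical date and description, the output directory name is assembled like this (a sketch of the naming only, not the script itself):

```shell
SPECIAL="_20120101_test"          # hypothetical: _[date]_[description]
RUNTYPE="mn000z200i"              # standard NOvA configuration name
OUTDIR="flugg_${RUNTYPE}${SPECIAL}"
echo "$OUTDIR"                    # flugg_mn000z200i_20120101_test
```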

LOWTH: Set to true to remove the 1 GeV tracking threshold. The string "_lowth" will be appended to the output directory name.

TARGFILE: Set to true to output the target hadron ntuples.

Naming Convention

The names of simulation configurations take the form [l/m][e/n][nnn]z[nnn]i; this is an extension of the MINOS convention.

  • [l/m]: l = low energy horn 2 position, m = medium energy horn 2 position
  • [e/n]: e = MINOS target, n = NOvA target. In the event of target design changes, we have 24 letters left in the alphabet
  • [nnn]z = pull back (unchanged for MINOS target, relative to nominal position for NOvA target)
  • [nnn]i = horn current in kA (same as for MINOS)

A standard NOvA run will be mn000z200i.
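As a worked example, the standard NOvA name mn000z200i decodes as follows, using bash substring extraction (a sketch for illustration):

```shell
name="mn000z200i"
horn2=${name:0:1}      # m   -> medium energy horn 2 position
target=${name:1:1}     # n   -> NOvA target
pullback=${name:2:3}   # 000 -> target z pull-back
current=${name:6:3}    # 200 -> horn current in kA
echo "$horn2 $target $pullback $current"
```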

Common Problems and Solutions

FGeometryInit.hh: No such file or directory: This probably means you have the wrong environment variables set. Try logging into a fresh window, running no setup scripts except g4numi_flugg/scripts/, and recompiling.

If running interactively, make sure you have RUN set to something. If you are using the example wrapper files, you can either manually set RUN to a value in the file or set PROCESS on the command line before running the wrapper.

Unresolved Bugs and Issues

Some FLUGG jobs simply stop, without error, before they are complete. So far, we have seen up to 77 jobs with this problem in batches ranging in size from 1 to 1000 runs. When we run the same wrapper script with the same number of jobs more than once, the error occurs in the same run at the same place.