

  DUNE NearDet Design Redmine Project

The purpose of this DUNE redmine subproject is to provide a central location for bringing together the bits and bobs of information needed to facilitate the design of the near detector. This is also where the group can document design decisions that should be followed by all participants (e.g., a standardized hall position).

The intent is to cover all aspects of the design process, including simulation efforts, (pseudo-)reconstruction, and analyses that provide input into design choices. Careful documentation of shortcuts taken at each stage should be included; in the words of Steve Brice, we expect to be "cheating, but not lying".



  General Notes

  • for now we recommend not using the subproject's "primary" git repo dune-neardet-design

  Software Setup

Working Directory Structure

This is an example of Mike Kordosky's working area, with subdirectories corresponding to the different packages described below.


The setup script below lives in the root directory and is sourced when sitting in that directory.

Setup Scripts

Here are some recommended setup scripts for use on the dunegpvm and grid nodes at Fermilab. They are:

  1. for general-purpose simulation work, going from GENIE generation through Geant4
  2. for producing gsimple flux files from dk2nu flux files


  Geometry

Geometry describes the relative placement of physical materials. It is used primarily by GENIE to determine interactions and vertices, and by Geant4 to determine energy losses as particles are propagated. GENIE can make use of ROOT representations and of GDML-format files. Geant4 has its own separate representation but can also make use of GDML files. For commonality, it is therefore proposed that the group use GDML files. GDML files are pure text and are human-readable as well.

It is proposed that all technologies start from a common "world" geometry that represents the hall's structure and surrounding rock. Detector technologies then place elements within the hall.
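As a concrete illustration of this proposal, a minimal world of this kind might look like the following GDML sketch (all volume names, dimensions, and materials here are hypothetical, and the <materials> and <define> sections are omitted for brevity):

```xml
<gdml>
  <solids>
    <!-- illustrative solids only; real dimensions come from the builders -->
    <box name="RockBox" x="100" y="100" z="100" lunit="m"/>
    <box name="HallBox" x="20" y="15" z="30" lunit="m"/>
  </solids>
  <structure>
    <volume name="volDetEnclosure">
      <materialref ref="Air"/>
      <solidref ref="HallBox"/>
      <!-- detector technologies place their elements in here -->
    </volume>
    <volume name="volWorld">
      <materialref ref="Rock"/>
      <solidref ref="RockBox"/>
      <physvol>
        <volumeref ref="volDetEnclosure"/>
      </physvol>
    </volume>
  </structure>
  <setup name="Default" version="1.0">
    <world ref="volWorld"/>
  </setup>
</gdml>
```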

Code elements:

  • gegede
  • dunendggd

Both are pure Python. The second, dunendggd, extends the general base provided by gegede with DUNE-ND-specific geometry builders. They largely interface with any other code only via the GDML they produce.

  Geometry generation for the ND Complex

You will need to clone the two main packages from git to generate the geometry.

First go to a work directory (such as /dune/app/users/<username>/)

git clone <gegede repository URL>
git clone <dunendggd repository URL>

To install them, just follow the instructions in each README file.
For gegede, you need the libxml2 and libxslt1-dev packages (for GDML export) and ROOT available (for ROOT export).

cd gegede/
python setup.py install --user

For dunendggd

cd dunendggd/
python setup.py develop --user

Don't forget to add ~/.local/bin to your PATH variable; for example, for bash:

echo 'export PATH=~/.local/bin/:${PATH}' >> ~/.bashrc

In dunendggd, a script is already provided to generate the geometry (as shown below) with the ND hall (as of 27 Sept 2019) and the ND detectors (ArgonCube + MPD + 3DST-S/K).
Simply execute:

Each sub-detector has its own configuration file(s):
  • ArgonCube: duneggd/Config/ArgonCube/*.cfg (Maintainer: James Sinclair/Jose L. Palomino Gallo)
  • MPD: duneggd/Config/MPD_Concept.cfg (Maintainer: Eldwan Brianne)
  • 3DST-S/K: duneggd/Config/KLOE_with_3DST.cfg (Maintainer: Perri Zilbermann/Guang Yang)
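If one needs to regenerate the GDML by hand rather than via the provided script, gegede ships a command-line driver; a sketch follows (the option spellings are from memory and the world/config names are placeholders, so check `gegede-cli -h` and the dunendggd README):

```shell
# build a GDML file from a set of gegede configuration files;
# the world builder name and the output file name are placeholders
gegede-cli -w world -o nd_hall_example.gdml \
    duneggd/Config/ArgonCube/*.cfg duneggd/Config/MPD_Concept.cfg
```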

  Detector geometries

  Official GDML geometries

A tarball of official ND geometries in gdml is available: source:nd_hall_geometries_11_8_19.tar.gz (up to date as of Nov 8, 2019)

The files inside are:

  • nd_hall_with_dets.gdml: full hall with detectors
  • nd_hall_no_dets.gdml: empty hall
  • nd_hall_only_lar.gdml: only LArTPC (ArgonCube) in the hall
  • nd_hall_lar_antifid.gdml: only the Antifiducial part of ArgonCube, set in the hall (the missing volume is ArgonCubeActive, filled with air).
  • nd_hall_only_mpd.gdml: only the MPD in the hall
  • nd_hall_only_mpd_antifid.gdml: volTPCGas is filled with NoGas
  • nd_hall_only_kloe.gdml: only KLOE with the 3DST and TPCs in the hall
  • nd_hall_only_kloe_antifid.gdml: only KLOE, but with volMainDet_3DST removed
  • nd_hall_lar_mpd.gdml: only the LArTPC and MPD in the hall
  • nd_hall_lar_mpd_antifid.gdml: only the LARTPC and MPD in the hall. The MPD has the fiducial region removed (volTPCGas filled with NoGas)

The geometries can be viewed with the root script source:hallDisplay.C


 root -l 'hallDisplay.C("nd_hall_with_dets.gdml")' 

This will produce a drawing of the hall, which one can zoom and then browse with a TBrowser to draw sub-volumes.

The browser also includes the ability to Print() a volume to see its shape, dimensions (usually half-dimensions), and composition:

See source:hallDisplay.C for some examples of how to access the geometry from a root script.
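Since GDML is plain text (XML), quick inspections do not require ROOT at all. Below is a minimal sketch using only the Python standard library; the inline string is a hypothetical stand-in for reading an official file such as nd_hall_with_dets.gdml:

```python
import xml.etree.ElementTree as ET

# stand-in for open("nd_hall_with_dets.gdml").read()
gdml = """
<gdml>
  <structure>
    <volume name="volDetEnclosure"/>
    <volume name="volArgonCubeDetector"/>
    <volume name="volWorld"/>
  </structure>
</gdml>
"""

root = ET.fromstring(gdml)
# every <volume> element carries a name attribute
names = [v.get("name") for v in root.iter("volume")]
print(names)  # -> ['volDetEnclosure', 'volArgonCubeDetector', 'volWorld']
```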


ArgonCube CDR samples are located at:



The current geometry of the MPD is as follows:

  • An HPgTPC of 2.6 m radius and 5 m length
    • The HPgTPC is surrounded by a chamber of 2.74 m radius and 5.2 m length.
    • The electrodes in the middle of the TPC are included in the geometry.
    • The TPC is filled with HP_ArCH4.
  • A pressure vessel surrounding the HPgTPC, made of aluminium 44.49 mm thick (corresponding to 0.5 X0). The radius is 2.74 m and the length 5.2 m, with a bulge of 100 cm for the endcaps.
  • A barrel ECAL of octagonal shape made of two types of layers:
    • Caveat: the ECAL layers are not segmented; this is handled in the software we use for the MPD (GArSoft).
    • 8 highly granular layers with plastic tiles (polystyrene) of 2x2 cm2 and 5 mm thickness, 2 mm of copper absorber and 1 mm of FR4 for the PCB; a safety margin of 0.01 mm is included.
    • 52 low-granularity layers using 2 mm copper absorber and 5 mm thick plastic strips.
  • An endcap ECAL of octagonal shape, for now made of 4 modules each:
    • 6 highly granular layers with plastic tiles (polystyrene) of 2x2 cm2 and 5 mm thickness, 2 mm of copper absorber and 1 mm of FR4 for the PCB; a safety margin of 0.01 mm is included.
    • 54 low-granularity layers using 2 mm copper absorber and 5 mm thick plastic strips.
  • A magnet of 3.5 m radius, 10 m length and 165 mm thickness. The magnet material is aluminium, and the dimensions correspond to a mass of about 100 metric tons. This will need to be further refined once a final design is decided.

MPD CDR samples (includes the active gas TPC volume, ECAL, and magnet) are located at:


First look at MPD CDR samples overlayed with rock events:


MPD CDR samples with events in active volume of the TPC are located at:


MPD CDR samples with events in passive volume of the TPC are located at:


Official GArSoft production samples overlayed with rock events are located at the following directories:

  • gas TPC, ECAL, Magnet, Pressure Vessel, and Rock: /pnfs/dune/persistent/users/afurmans/dirtProcessingOct2019/output/MPD_Rock_gas_overlay/
  • background-enhanced files (ECAL, Magnet, Pressure Vessel, and Rock background): /pnfs/dune/persistent/users/afurmans/dirtProcessingOct2019/output/MPD_Rock_overlay

The final GArSoft production files include detector response and parameterized reconstruction; they are flat ntuples and follow this naming convention: cafanatree_*.root.


3DST CDR samples are located at:


  Detector Hall & Origin of Coordinates

  • The origin of coordinates ( [0,0,0] in the world geometry) is chosen to be the location where the beam enters the hall. It's shown as a white dot in the figures below.
  • The origin is assumed to be 574 m from the beam's origin (MCzero), directly on the beam axis. (This is not enforced by the geometry, but by the flux window used when generating events; see below.)
  • The figures also show a red line representing the beam centroid.
  • Not all details are shown in the figures. This is particularly true of the LArTPC.

Side view

Top view


  Flux

For the purpose of these studies, we take as "given" a pre-defined selection of LBNF beam-simulation files, based on guidance from the beam group. The primary files here are dk2nu-format files. These record information about the decays of hadrons and muons that give rise to neutrinos in the beam system.

  • Dk2Nu is a package that serves two purposes. Primarily, it is a format for holding the results of neutrino beamline simulations, with sufficient information to recalculate the results of the decays of hadrons and muons that give rise to neutrinos (different relative positions give different neutrino energies and probabilities). The second part is a "flux driver" that integrates into GENIE.
  • GSimpleNtpFlux is a class within GENIE providing a minimal format for storing flux-ray information.

GENIE and dk2nu supply "flux drivers" that deal with calculating the neutrino 4-momentum and weight at the detector location, as well as handling any importance weights found in these files. The transformation from beam coordinates to detector-site coordinates is needed by the dk2nu driver; it is defined in an XML <param_set> that should be used in common for all the near detectors and the near site.
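The transformation is essentially a rotation (the beam dips downward into the ground) plus a translation along the beam axis. A standalone sketch follows: the 574 m baseline is from the section above, but the dip angle and the rotation convention here are placeholders for the values in the official <param_set>:

```python
import math

BASELINE_M = 574.0     # distance from MCzero to the hall origin (see above)
DIP_ANGLE_RAD = 0.1    # placeholder dip angle; use the official <param_set> value

def beam_to_detector(x, y, z):
    """Translate so the hall origin sits on the beam axis, then rotate
    beam coordinates about the x axis by the beam dip angle."""
    zp = z - BASELINE_M
    c, s = math.cos(DIP_ANGLE_RAD), math.sin(DIP_ANGLE_RAD)
    return (x, c * y - s * zp, s * y + c * zp)

# a ray arriving at the hall origin, travelling along the beam axis:
print(beam_to_detector(0.0, 0.0, 574.0))  # -> (0.0, 0.0, 0.0)
```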

The GSimpleNtpFlux assumes rays are in the final coordinate system; it can be used as an intermediate storage format for processed dk2nu files (baking a transformation into the spectrum). This can often be computationally advantageous, as the evaluation of dk2nu files can be expensive, but it comes at the cost of a "loss" (in principle recoverable, but non-trivial) of the extra information, primarily ancestor information, stored in the dk2nu files.

Official flux files

The official flux comes from the optimized & engineered beam design effort of 2017; see Doc-1781 and Doc-4559.

Files are stored here:


File names look like:


Here XXXXX is a job number. The file format is dk2nu.

Each file corresponds to 100k protons on target, and there are 249 files in total. ROOT v6.06.08 was used.

GSIMPLE generation

The primary documentation is in the GENIE wiki

The newly generated gsimple files intended for the various GENIE generations are located at the following pnfs directories, within two corresponding "neutrino" and "antineutrino" subdirectories:

gsimple files for rock generation:


gsimple files for ND hall generation:


gsimple files for various sub-detector generations:



  Neutrino Event Generation

Neutrino event generation will (at least initially) be done by running GENIE's gevgen_fnal with the fluxes described above. The output of the process is a "gntp.*.ghep.root" file with individual GENIE records (and a friend TTree of the flux information).

GENIE takes as input a neutrino flux (in any of a variety of forms) and a geometry representation (GDML or ROOT). The output is an event record which includes sufficient information for reweighting a variety of internal models. Within the event is a list of particles that exit the struck nucleus after the hard-scattering interaction, hadronization, and final-state interactions (FSI); these are handed to the next step, Geant4.

For efficiency reasons, the cross sections for individual sub-processes are pre-calculated as a function of neutrino energy and stored externally as splines in XML format. It is critical that the splines used when generating interactions come from the same combination of GENIE code version and chosen EventGeneratorList. We propose that all groups use the same pre-generated splines. Available sets are documented here for various versions of GENIE and physics configurations.

(additional local documentation: genie)
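A sketch of what an invocation looks like is given below; it is written from memory rather than taken from a tested production script, so every flag, path, and file name here should be checked against `gevgen_fnal --help` and the current production configuration:

```shell
# generate events in the full hall geometry for a fixed POT exposure
# (all paths and file names are placeholders)
gevgen_fnal -e 1e17 \
    -f "gsimple:/path/to/gsimple/neutrino/*.root" \
    -g nd_hall_with_dets.gdml -t volWorld \
    --cross-sections gxspl.xml \
    -o gntp
```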

A general, little-known but important note:

The ROOT TTree gtree in the "gntp.*.ghep.root" file contains within it a critical bit of information: gtree->GetWeight() returns the POT used in generating the file's events, irrespective of whether the stopping condition was a number of events or a POT limit.
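This weight is what one needs for normalization. For example, to scale a sample to a nominal exposure (the numbers below are purely illustrative; in ROOT the generated POT would be read via gtree->GetWeight() and summed over the input files):

```python
# illustrative numbers; in practice generated_pot = sum of
# gtree.GetWeight() over the input "gntp.*.ghep.root" files
generated_pot = 2.5e20   # POT actually generated across the files
target_pot = 1.0e21      # exposure to which we want to normalize

scale = target_pot / generated_pot
print(scale)  # -> 4.0
```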

Mike Kordosky, July 2019: This section would benefit from a few example gevgen invocations with reasonably up to date flux files, software setup, and geometry

  Rock Events

The code and documentation are located at rock_propagation


The goal here has been to produce a common set of GENIE "gntp.*.ghep.root" files that all can use. These use a flux configuration with a larger "window" to capture all the relevant interactions (outside the hall or a predefined inner volume) that contribute particles entering the hall (or inner volume). The particles from the initial ν scatter were propagated (via Geant4) to the point where they enter the hall (or inner volume), and then re-written in "gntp.*.ghep.root" form with the "active" particles (status code = 1) being those that are just entering. Events that contribute no entering particles were discarded. An overall normalization (in POT/file) was provided. This gives all detector configurations a common way to avoid the overhead of this production.

There are two programs in the repository:

  • runRockPropagation: Takes GENIE events, runs them through GEANT, collects particles entering the hall and stuffs them back into a GENIE GHEP record.
  • gntpc_dune: useful for handling the large events (many soft particles) produced in the rock propagation. The original GENIE gntpc will SEGV on such events.

The official documentation is on the github page.

Last Processing

The last pass of this processing was done by Dom Brailsford. Robert Hatcher has historically been involved.

The working area is here:


The propagated files are here:


Updated location of the rock files for CDR:

Rock CDR samples containing all the original neutrino interaction vertices are located at:


The propagated rock samples containing the stable final state particles from original neutrino interaction vertices in rock that make it to the boundary of the detector hall are located at:



  Cosmic Events

Similarly, it would be most convenient here if the lists of particles from the same cosmic ray were coerced into GENIE "gntp.*.ghep.root" format. This would facilitate applying this background to rock & detector pileup. A separate process would need to adjust time offsets to distribute events over the time of the spill. Normalization is not as trivial as it is for rock & detector events.

  Event Pile-up / Overlay

The code and documentation are located at overlay_genie

Event pileup and overlay will be done by combining events (in GHEP format) from multiple sources and writing new files, also consisting of GHEP events. A general-purpose overlaying program, overlay_genie, has been written to do this. The program depends only on ROOT and GENIE, and is controlled with command-line options that specify the sources and how events are to be drawn from each source. It can handle the case in which one wants to generate weighted events from a low-mass detector. It also arranges events in time, according to either a uniform distribution between some tstart and tend, or a spill-profile histogram.
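The time-distribution step can be sketched standalone as follows (an illustration of the idea, not the actual overlay_genie code; the spill-profile case would instead draw bin centers weighted by bin contents):

```python
import random

def distribute_times(n_events, tstart, tend, seed=None):
    """Assign each event a time drawn uniformly in [tstart, tend]."""
    rng = random.Random(seed)
    return [rng.uniform(tstart, tend) for _ in range(n_events)]

# e.g. a 10 microsecond spill, with times in ns
times = distribute_times(5, 0.0, 10000.0, seed=42)
print(all(0.0 <= t <= 10000.0 for t in times))  # -> True
```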

The code can be grabbed, built, and invoked, like so:

# ensure both ROOT and GENIE have been set up
git clone <overlay_genie repository URL>
cd overlay_genie
# build per the instructions in the repository README
./overlay_genie -h

The output is a file consisting of GHEP events in a TTree, with one event per tree entry. The end of each spill is marked with a dummy event consisting of a single "Rootino" (ipdg=0). Downstream consumers, such as edep-sim, will need to be aware of this convention.
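A downstream reader can use the Rootino marker to group events back into spills; a minimal standalone sketch, using bare PDG codes in place of full GHEP records:

```python
def split_into_spills(event_pdgs):
    """Group a flat event stream into spills, treating a Rootino
    (pdg == 0) entry as the end-of-spill marker."""
    spills, current = [], []
    for pdg in event_pdgs:
        if pdg == 0:          # dummy end-of-spill event
            spills.append(current)
            current = []
        else:
            current.append(pdg)
    return spills

# two spills: (nu_mu, proton) then (pi+)
print(split_into_spills([14, 2212, 0, 211, 0]))  # -> [[14, 2212], [211]]
```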

The output file is in GHEP format, but it can easily be converted to the rooTracker format (needed by edep-sim), and to other formats, with the GENIE gntpc program.

  Beam Time Structure

The overlay_genie code comes with a NuMI-like, six-batch spill profile in spill_profile.root. It can be used with overlay_genie to distribute events realistically in time:

A NuMI-like spill profile

  Geant4 Use

  • choices of Geant4 version? extra flags? physics list?

The edep-sim program needs to be provided the output from some sort of event kinematics generator (NEUT, NUANCE, NEUGEN, GPS, etc.). The preferred input format is a rooTracker tree, but you can also use the text-based tracker input format. (NUANCE may be the only program in the world still using this format; it was developed as a standard kinematics format for the IMB III experiment in about 1990.)

The GENIE program gntpc converts a native GENIE (GHEP/ROOT) event-tree file to a host of plain-text, XML, or bare-ROOT formats (including the aforementioned rooTracker tree, via the -f rootracker flag).
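For example, to produce the rooTracker input for edep-sim (the file names are illustrative; see `gntpc -h` for the full option list):

```shell
# convert a GHEP event file into rooTracker format for edep-sim
gntpc -i gntp.0.ghep.root -f rootracker -o gntp.0.rootracker.root
```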

Words are needed here about "overlaying" (i.e. pulling in multiple GENIE records: those in the main sensitive volume, separate "dead" material (magnet yokes, etc.), rock events, ...). These should already have been generated with the correct spatial distributions, but some account must be made for offsetting them in time with respect to the beam spill structure.

  Detector Response




  other detector technologies

  Analyses, Design Studies

  • study 1
  • study 2