  DUNE NearDet Design Redmine Project

The purpose of this DUNE Redmine subproject is to provide a central location for bringing together the bits and bobs of information needed to facilitate the design of the near detector. This is also where the group can document design decisions that all participants should follow (e.g. a standardized hall position, etc.).

The intent is to cover all aspects of the design process, including simulation efforts, (pseudo-)reconstruction, and analyses that provide input into design choices. Careful documentation of the shortcuts taken at each stage should be included; in the words of Steve Brice, we expect to be "cheating, but not lying".

 

 

  General Notes

  • For now we recommend not using the subproject's "primary" git repo dune-neardet-design.

  Geometry

Geometry describes the relative placement of physical materials. It is used primarily by GENIE to determine interactions and vertex positions, and by Geant4 to determine energy losses as particles are propagated. GENIE can make use of ROOT representations and of GDML format files. Geant4 has its own separate representation but can also make use of GDML files. Thus, for commonality, it is proposed that the group use GDML files. GDML files are pure text and human-readable as well.

It is proposed that all technologies start from a common "world" geometry that represents the hall's structure and surrounding rock. Detector technologies then place elements within the hall.
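As a purely illustrative sketch of this "world plus placed hall" arrangement, the snippet below builds and parses a stripped-down GDML skeleton. All names and dimensions are placeholders, and materials are omitted; this is not an agreed DUNE geometry.

```python
import xml.etree.ElementTree as ET

# Minimal, illustrative GDML: a "world" box (rock) containing a placed
# detector hall volume.  Names and sizes are stand-ins only.
gdml = """<?xml version="1.0"?>
<gdml>
  <solids>
    <box name="WorldBox" x="6000" y="3000" z="6000" lunit="cm"/>
    <box name="HallBox"  x="2000" y="1500" z="4000" lunit="cm"/>
  </solids>
  <structure>
    <volume name="volDetEnclosure">
      <solidref ref="HallBox"/>
    </volume>
    <volume name="volWorld">
      <solidref ref="WorldBox"/>
      <physvol>
        <volumeref ref="volDetEnclosure"/>
        <position name="posHall" x="0" y="0" z="0" unit="cm"/>
      </physvol>
    </volume>
  </structure>
  <setup name="Default" version="1.0">
    <world ref="volWorld"/>
  </setup>
</gdml>"""

# Because GDML is plain XML, it can be inspected with any XML tool.
root = ET.fromstring(gdml)
volumes = [v.get("name") for v in root.iter("volume")]
world = root.find("setup/world").get("ref")
print(volumes, world)
```

Detector technologies would then add their own volumes as further physvol placements inside the hall volume.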

Coordinate system proposal: set the world's (0,0,0) to be the point where the beam centerline crosses some detector "center". This closely follows (modulo an overall z offset) a convention used historically by multiple experiments. Also continue with the common convention where the y axis points up (with respect to local gravity), the z axis is horizontal and lies in the beam's y-z plane, and the x axis completes a right-handed coordinate system. The region of z<0 between the front of the detector (0, 0, -z_center) and the upstream wall will be assumed to be similar in all cases, and a hall size will be chosen large enough to house all the detector technologies.
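A sketch of the implied beam-to-detector coordinate transformation is below. The dip angle and beam-origin distance are hypothetical placeholders, not surveyed DUNE numbers; the point is only the structure of the rotation plus origin shift.

```python
import math

BEAM_DIP_DEG = 5.8     # hypothetical downward beam dip angle [degrees]
DIST_TO_ND_M = 574.0   # hypothetical beam origin -> ND origin distance [m]

def beam_to_det(xb, yb, zb, dip_deg=BEAM_DIP_DEG, dist_m=DIST_TO_ND_M):
    """Transform a point from beam coordinates (z along the beam axis)
    into detector coordinates (y up, z horizontal in the beam's y-z
    plane, origin where the beam centerline crosses the detector
    "center"): shift along the beam axis, then rotate about x."""
    t = math.radians(dip_deg)
    zb = zb - dist_m                      # move origin to the ND center
    yd = yb * math.cos(t) - zb * math.sin(t)
    zd = yb * math.sin(t) + zb * math.cos(t)
    return (xb, yd, zd)
```

Since the beam dips downward, points farther along the beam axis come out below the detector origin (negative y), as expected.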

Code elements: gegede and dunendggd.

These are both pure Python. The second one (dunendggd) extends the general base provided by the first (gegede) with DUNE ND specific geometry builders. They largely interface with any other code only via the GDML they produce.

  Detector Hall

  Hall Placement Relative to Beam

Put a schematic diagram here; also document the size of the hall, etc.

  Integrated Detector and Site Geometries


  Flux

For the purpose of these studies we take as "given" a pre-defined selection of LBNF beam simulation files, based on guidance from the beam group. The primary files here are in the dk2nu format. These record information about the decays of hadrons and muons that give rise to neutrinos in the beamline.

  • Dk2Nu is a package that serves two purposes. Primarily, it is a format for holding the results of neutrino beamline simulations with sufficient information to recalculate the results of the decays of hadrons and muons that give rise to neutrinos (different relative positions give different neutrino energies and probabilities). The second part is a "flux driver" that integrates into GENIE.
  • GSimpleNtpFlux is a class within GENIE providing a minimal format for storing flux ray information.
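The position dependence noted above comes from two-body decay kinematics. In the standard small-angle approximation for π → μν decay in flight, the neutrino energy seen at angle θ from the pion direction falls off as 1/(1 + γ²θ²):

```python
import math

M_PI = 0.13957   # charged pion mass [GeV]
M_MU = 0.10566   # muon mass [GeV]

def numu_energy(e_pi, theta):
    """Approximate nu_mu energy [GeV] from a pi -> mu nu decay in
    flight, for a pion of energy e_pi [GeV] observed at small angle
    theta [rad].  This is why rays recalculated for different detector
    positions have different energies (and probabilities)."""
    gamma = e_pi / M_PI
    return (1.0 - (M_MU / M_PI) ** 2) * e_pi / (1.0 + (gamma * theta) ** 2)
```

On axis (θ = 0) a 5 GeV pion yields a ~2.1 GeV ν; even a 10 mrad offset noticeably lowers that energy.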

GENIE and dk2nu supply "flux drivers" that deal with calculating the neutrino 4-momentum and weight at the detector location, as well as handling any importance weights found in these files. The dk2nu driver needs the transformation from beam coordinates to detector site coordinates; this is defined in an XML <param_set> that should be used in common for all the near detectors and the near site.

The GSimpleNtpFlux format assumes rays are already in the final coordinate system; it can be used as an intermediate storage format for processed dk2nu files (baking the transformation into the spectrum). This is often computationally advantageous, since evaluating dk2nu files can be expensive, but it comes at the cost of "losing" (in principle recoverable, but non-trivially) the extra information, primarily ancestor information, stored in the dk2nu files.

The immediate proposal is, once an input dk2nu file set is chosen, to produce a common set of gsimple files that all can use. New common sets will be generated as the group's needs require.


  Neutrino Event Generation

Neutrino event generation will (at least initially) be done by running GENIE's gevgen_fnal with the fluxes described above. The output of the process is a "gntp.*.ghep.root" file with individual GENIE records (and a friend TTree of the flux information).

GENIE takes as input a neutrino flux (in any of a variety of forms) and a geometry representation (GDML or ROOT). Its output is an event record which includes sufficient information for reweighting a variety of internal models. Within the event is a list of particles that exit the struck nucleus after the hard scattering interaction, hadronization, and final state interactions (FSI); these are handed to the next step, Geant4.

For efficiency reasons, cross sections for individual sub-processes are pre-calculated as a function of neutrino energy and stored externally as splines in XML format. It is critical that splines be used only with the same combination of GENIE code version and chosen EventGeneratorList that was used to generate them. We propose that all groups use the same pre-generated splines. Available sets are documented here for various versions of GENIE and physics configurations.

(additional local documentation: genie)

General, little-known but important, note:

The ROOT TTree gtree in the "gntp.*.ghep.root" file contains a critical bit of information:
gtree->GetWeight() returns the POT used in generating the file's events,
irrespective of whether the stopping condition was a number of events or a POT limit.
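A sketch of the resulting POT bookkeeping when combining several files is below. The POT values are stand-in numbers so the arithmetic is visible; in PyROOT the per-file value would come from the file's gtree via GetWeight().

```python
# Hypothetical per-file POT, as returned by gtree->GetWeight() for each
# gntp.*.ghep.root file in a sample (stand-in numbers for illustration).
file_pots = [1.0e20, 1.0e20, 0.5e20]
total_pot = sum(file_pots)

def pot_scale(target_pot, generated_pot):
    """Weight applied to histograms filled from the combined sample to
    normalize it to a desired exposure."""
    return target_pot / generated_pot

# e.g. normalize the combined 2.5e20 POT sample to a 1.1e21 POT exposure
scale = pot_scale(1.1e21, total_pot)
```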


  Rock Events

The immediate proposal is to produce a common set of GENIE "gntp.*.ghep.root" files that all can use. These would use a flux configuration with a larger "window" to capture all the relevant interactions (outside the hall or a predefined inner volume) that contribute particles entering the hall (or inner volume). The particles from the initial ν scatter would be propagated (via Geant4) to the point where they enter the hall (or inner volume), and then re-written in "gntp.*.ghep.root" form with the "active" particles (status code = 1) being those that are just entering. Events that contribute no entering particles would be discarded. An overall normalization would then be provided. This gives all detector configurations a common way to avoid the overhead of this production.
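The skimming logic described above can be sketched as follows. The hall half-dimensions and the event/particle representation here are illustrative stand-ins, not a real event format:

```python
# Hypothetical hall half-dimensions [m] (illustrative only)
HALL_HALF = (10.0, 7.5, 20.0)

def inside_hall(pos, half=HALL_HALF):
    """True if a propagated-to position lies on/inside the hall box."""
    return all(abs(c) <= h for c, h in zip(pos, half))

def skim(events):
    """Keep only events with at least one 'active' (status code 1)
    particle entering the hall; events contributing nothing are
    discarded.  events: list of lists of (status, (x, y, z))."""
    kept = []
    for particles in events:
        entering = [p for s, p in particles if s == 1 and inside_hall(p)]
        if entering:
            kept.append(particles)
    return kept
```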


  Cosmic Events

Similarly, here it would be most convenient if the lists of particles from the same cosmic ray were coerced into the GENIE "gntp.*.ghep.root" format. This would facilitate applying this background to rock & detector pileup. A separate process would need to adjust time offsets to distribute events over the time of the spill. Normalization isn't as trivial as for rock & detector events.


  Event Pile-up / Overlay

Event pileup and overlay will be done by combining events (in ghep format) from multiple sources and writing new files, also consisting of ghep events. A general purpose overlaying program, overlay_genie, has been written to do this. The program depends only on ROOT and GENIE and is controlled by command-line options that specify the sources and how the events are to be distributed from each source. It can handle the case in which one wants to generate weighted events from a low mass detector. It also arranges events in time, either according to a uniform distribution between some tstart and tend, or according to a spill profile histogram.

The code and documentation are located at https://github.com/GENIEMC/OverlayGenie

The code can be grabbed, built, and invoked, like so:

# assure both ROOT and GENIE have been setup
git clone https://github.com/GENIEMC/OverlayGenie overlay_genie
cd overlay_genie
make
./overlay_genie -h

The output is a file consisting of ghep events in a TTree with one event per tree entry. The end of each spill is marked by a dummy event consisting of a single "Rootino" (ipdg=0). Downstream consumers, such as edep-sim, need to be aware of this convention.
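A downstream consumer could honor this end-of-spill convention along these lines (events are represented here only by their PDG codes, purely for illustration):

```python
def split_spills(pdg_stream):
    """Split a flat stream of event PDG codes back into spills, using
    the dummy 'Rootino' (pdg = 0) event as the spill boundary."""
    spills, current = [], []
    for pdg in pdg_stream:
        if pdg == 0:              # Rootino: end-of-spill marker
            spills.append(current)
            current = []
        else:
            current.append(pdg)
    return spills

# two spills: three events, then one event, each terminated by a Rootino
spills = split_spills([14, 14, -14, 0, 14, 0])
```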

The output file is in ghep format, but it can easily be converted to the rootracker format (needed by edep-sim), and to other formats, with the GENIE gntpc program.

  Beam Time Structure

Point at (or repackage) the time distribution code written for the DUNE NDTF effort.

The overlay_genie code comes with a NuMI-like, six-batch spill profile in spill_profile.root. It can be used with overlay_genie to distribute events realistically in time:
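For illustration, a toy six-batch time structure can be sampled directly. The spill length, batch count, and live fraction below are placeholders; overlay_genie itself reads the measured profile from spill_profile.root:

```python
import random

N_BATCH = 6        # batches per spill (NuMI-like)
SPILL_US = 9.6     # illustrative spill length [microseconds]
BATCH_US = SPILL_US / N_BATCH
LIVE_FRAC = 0.8    # illustrative fraction of each batch containing beam

def draw_time(rng=random):
    """Draw an event time: pick a batch at random, then a uniform time
    within that batch's live window (a crude stand-in for the measured
    spill profile histogram)."""
    batch = rng.randrange(N_BATCH)
    return batch * BATCH_US + rng.uniform(0.0, LIVE_FRAC * BATCH_US)

times = [draw_time() for _ in range(1000)]
```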

(figure: a NuMI-like spill profile)


  Geant4 Use

  • choices of Geant4 version? extra flags? physics list?

The edep-sim program needs to be provided the output from some sort of event kinematics generator (NEUT, NUANCE, NEUGEN, GPS, etc.). The preferred input format is a rooTracker tree, but the text-based tracker input format can also be used (NUANCE may be the only program in the world still using this format; it was developed as a standard kinematics format for the IMB III experiment around 1990).

The GENIE program gntpc converts a native GENIE (GHEP/ROOT) event tree file to a host of plain text, XML, or bare-ROOT formats (including the aforementioned rooTracker tree, via the -f rootracker flag).

Words here about "overlaying" (i.e. pulling in multiple GENIE records: those in the main sensitive volume, separate "dead" material (magnet yokes, etc.), rock events, ...). These should already have been generated with the correct spatial distributions, but some account must be made for offsetting them in time with respect to the beam spill structure.


  Detector Response

  LArTPC

  HPGAr TPC

  FGT

  other detector technologies


  Analyses, Design Studies

  • study 1
  • study 2