DUNE-centric Guide to Using LArSoft in git/mrb world

The following is a quick guide to the things you, a member of DUNE, will need to do in order to run and develop applications in LArSoft. Please note that there are more general instructions and guidance at, some of which this page is ripping off! That wiki is definitely worth looking at, perhaps even before reading this document. The following assumes:

  • You have all your computing privileges and accounts, including an account on the dune gpvm nodes (e.g.
  • You are working on one of the dune gpvm nodes. Many of these steps should work on other machines/sites as well, if they are configured in the standard way, but there are no guarantees. Talk to your system/local administrator if you’re having trouble, to make sure there are no differences in the basic setup.
  • You have a working knowledge of UNIX.
  • You are using the bash shell (the csh equivalents of most commands should be clear).
  • You know something about the general structure of LArSoft. If you don’t know what a fcl file is, and/or don’t know how to make/modify one, then you will need a more basic tutorial than what’s here.

Run from an existing (“frozen”) LArSoft release

First, do

source /cvmfs/

Then, to list the available versions of the dune software, do
ups list -aK+ dunetpc

You should see a list (possibly a long list…) of versions of the code to set up. Each line should look something like this:
"dunetpc" "v04_19_00" "Linux64bit+2.6-2.12" "e7:prof" "" 

Pick which version you want to set up (probably the latest, and probably the prof version), and do something like this (with the version and qualifiers you want):
setup dunetpc v04_19_00 -q prof:e7

This will automatically set up larsoft as well, so you don’t need to also do setup larsoft v04_19_00 -q prof:e7: dunetpc of a particular version requires larsoft of that version, and thus sets it up too. You can think of dunetpc as DUNE’s own particular implementation of larsoft. You should then be able to run your usual larsoft job:
lar -c my_fcl_file.fcl ...

Develop and build in LArSoft, based on the latest frozen release

Initial setup

We strongly advise that you create each new development area in a fresh login session; that is, run the mkdir step below only after logging out and logging back in. The reason is that your $PRODUCTS variable tends to grow to include every working area you visit during a login session (and can even carry over between logins), which produces many weird effects when you run mrbsetenv. Check your $PRODUCTS when confused.
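A quick way to inspect $PRODUCTS is to print one entry per line; stale localProducts_* paths left over from earlier working areas are the usual culprits:

```shell
# Print each $PRODUCTS entry on its own line so stale entries stand out.
echo "$PRODUCTS" | tr ':' '\n'

# Flag any entries pointing at a localProducts area (the likely stale ones).
echo "$PRODUCTS" | tr ':' '\n' | grep localProducts || echo "no localProducts entries"
```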

For developing your code, we recommend you use the latest LArSoft frozen release. The LArSoft frozen release is tagged and built once a week. By using the latest frozen release, you get the most up-to-date features and a stable release. We have instructions below on how to update your local release when a new LArSoft release is available.

Like above, the first thing to do is

source /cvmfs/

But now, things get different! First, check the larsoft release: look in this file for the line that starts with 'larsoft' to get the larsoft version.
Here is an example:
product          version
larsoft          v07_10_00

Then set up the correct version of LArSoft, e.g. (note v07_10_00 -q e17:prof is just an example):
setup larsoft v07_10_00 -q e17:prof

You can find all available qualifiers (e.g. e17:prof) by doing
ups list -aK+ larsoft | grep v07_10_00

debug or prof?
prof uses optimized code and runs much faster; debug provides more debugging information if you use gdb or totalview.

Now make a new directory where you intend to put your code. (Do not use your home directory, as the libraries you build can get large; use somewhere beneath /dune/app/users/USER/ instead.) Let’s say that new directory is going to be called larsoft_mydev. Then do

mkdir larsoft_mydev
cd larsoft_mydev

Then, inside the new directory, do
mrb newDev

Do the following to use the local release:
source localProducts_XXXX/setup

Now, pull down dunetpc,
cd srcs
mrb g dunetpc

A word about the remote URL. When mrb g clones the dunetpc repository at this point, it will choose either a read-only URL or a read/write one. If you have developer (or manager) permissions for dunetpc, mrb will use the read/write URL; otherwise it picks the read-only one. To get developer permissions, talk to a repository manager. There is a Redmine group called larsoft users, which you should probably join; it grants developer permissions to dunetpc and all larsoft repositories.

The read-only URL is and the read/write one is ssh:// This URL is stored in srcs/dunetpc/.git/config. If you initially clone the dunetpc repository without write permissions, get write permissions at a later date, and then want to push, you will have to change your URL:

git remote set-url origin ssh://

will do the job. Note that whenever you push, you need a valid Kerberos ticket.
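As a sketch of what this does, here is the same operation in a throwaway repository (the example.com URLs are stand-ins for the real read-only and read/write URLs):

```shell
# Throwaway repo standing in for your dunetpc clone.
cd "$(mktemp -d)"
git init -q demo && cd demo

# Suppose the clone came with the read-only URL:
git remote add origin http://example.com/read-only/dunetpc.git
git remote -v                  # shows the current (read-only) URL

# Switch origin to the read/write URL:
git remote set-url origin ssh://example.com/read-write/dunetpc.git
git remote get-url origin      # now prints the ssh:// URL
```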

Optional: when to add other repositories

You should add other repositories if
  • you need new features in that repository that were added after the last frozen release, or
  • you want to develop code in that repository, or
  • the current head version of dunetpc depends on the head version of another package (watch the announcements). For example:
    cd srcs
    mrb g larreco

    Note that compiling takes much longer if you check out additional packages, so don't check out packages you don't need.

Building your code

Go to the build directory in your development area,

cd larsoft_mydev/build*

or just do

cd $MRB_BUILDDIR

and then, if this is your first time building with these local packages, do

mrbsetenv

You can omit this if you have already done it, but if you at any point add a new package, you should do mrb z; mrbsetenv, which regenerates the Makefiles from the CMakeLists.txt files. Once properly set up, you can build/install:
mrb i -j4

The -j4 tells it to build on 4 cores, if available. Check for build errors! It may be easier to break that step into two parts (mrb build -j4; mrb install) in order to more easily check for compilation issues.

Alternatively, you can use the ninja build system which should automatically determine the optimal number of processes to run (very handy on dunebuild01 and dunebuild02, each of which has 16 cores). Instructions at the bottom of this page.

If you see the following error:
ERROR: Version conflict -- dependency tree requires versions conflicting with current setup of product : full descriptions setpath v1_11 -f NULL -z /grid/fermiapp/products/common/db vs setpath v1_11 -f NULL -z /cvmfs/
INFO: no optional setup of duneutil v01_23_11 -q +e10:+prof
Please refer to Issue #15028.

Set up your code to run

After everything is built and installed, do

mrbslp

to set up your newly built code. (‘slp’ stands for setup local products.) Now you can run your lar job:
lar -c my_fcl_file.fcl ...

Shortening the development cycle

Once you're stable and the code you're working on is well defined, and you're not pulling new repositories, the development cycle can be shortened. Go make your edits in ../srcs/dunetpc/PackageA/AnalysisCode.cpp and jump back to ../build. You can now replace the onerous

 mrb i -j4 

with

 make install -j4

That should cut a few-minute procedure down to under a minute.

There are two build machines available to DUNE collaborators, dunebuild01 and dunebuild02, each with sixteen cores. They are intended for building code only, and there mrb i -j16 and make install -j16 will speed up builds considerably.


The next time you log in to a dunegpvm node, do the following to set things up:

source /cvmfs/
cd larsoft_mydev
source localProducts_XXXX/setup

When a new LArSoft release is available and you want to develop against it

  • Log in to a dunegpvm node again, then do the following:
    source /cvmfs/
    setup larsoft vxx_yy_zz -q e6:prof #set up the latest larsoft release
    cd larsoft_mydev
    mrb newDev -p #this creates a new localProducts_XXXX directory using the new larsoft release and the existing srcs directory
    source localProducts_XXXX/setup  #make sure to use the new localProducts directory; it's best to delete the old one
    cd srcs/dunetpc
    git pull origin develop #this merges develop into your current branch
  • Update all other repositories if they exist in your srcs directory (e.g. larreco)
  • Make a clean build
    mrb z
    mrb i -j4 #rebuild

Useful commands

This command shows the versions of all ups packages and their locations:

ups active

Feature branch

There are many reasons to make a feature branch:
  • You are testing things that you don't want other people to use yet.
  • If you are making breaking changes, e.g. changing code in two repositories (say dunetpc and larreco) such that people need to check out both repositories in order to compile, it is good practice to put your changes in feature branches and ask the release manager to merge them back into develop when they tag a new release.
  • You want to collaborate with other people on code development.

This page has all the useful information on git flow and feature branches:

To create a feature branch, do

git flow feature start user_MyFeatureAddition

It is recommended to include your username in the name of the feature branch so people know who is responsible for it.
Now if you do
git branch

you should see the new feature branch.

Now you can make changes and commit them but do not push.

To publish a feature branch for collaboration purposes, do

git flow feature publish

Now if you do
git branch -a

you should see your feature branch in the list of remote branches as remotes/origin/feature/<feature>.

To update a published feature branch after making commits:

Make sure you are on the correct feature branch:

git branch

If not, do
git checkout <feature>

After making changes, do
git commit -a -m'commit message'
git push

to commit and push your changes to the remote feature branch. Note: ROOT files and other large files will be rejected by the git repository server. If you need to distribute a large file, talk to DUNE software and computing experts, ask on Slack, or submit a service desk ticket to work out a solution.
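A quick sanity check before committing is to look for oversized files in your working tree. This is just an illustration; the 10 MiB threshold here is made up and may not match the server's actual limit:

```shell
# List files over 10 MiB in the current tree, ignoring .git itself.
find . -type f -size +10M -not -path './.git/*'
```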

Work on an existing feature branch created by someone else

First make sure the feature branch is published. Do

git pull
git branch -a

to verify that remotes/origin/feature/<feature> exists. Then do

git checkout feature/<feature>

to check out the feature branch into your local area. Do
git branch

to verify you are on that branch.

Merge develop into a feature branch

If you need to fetch upstream changes from the develop branch:

git checkout develop # go to local develop branch
git pull #update local develop branch to be in sync with remote develop branch
git checkout <feature> # go back to the local feature branch
git merge develop # merge your local develop branch into your local feature branch

If the last step reports failed merges, open the affected files and resolve the conflicts indicated by the "<<<<<<<", "=======", and ">>>>>>>" markers. After all conflicts are resolved, do "git commit -a".
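If you have not seen conflict markers before, here is a self-contained demonstration you can run in a scratch directory:

```shell
# Throwaway repo: create a conflict between develop and a feature branch.
cd "$(mktemp -d)" && git init -q demo && cd demo
git config user.email you@example.com && git config user.name "You"

git checkout -qb develop
echo base > file.txt && git add file.txt && git commit -qm base

git checkout -qb feature/demo
echo feature-change > file.txt && git commit -qam feature

git checkout -q develop
echo develop-change > file.txt && git commit -qam develop

git checkout -q feature/demo
git merge develop || true   # reports the conflict
cat file.txt                # shows the <<<<<<<, =======, >>>>>>> markers
```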

A simpler way of merging develop into a feature branch is:

git pull origin develop

If you want to update remote feature branch, you can do "git push" now.

Merge a feature branch into develop

If you want to merge a feature branch into develop and still keep the feature branch:

git checkout develop # go to your local develop branch
git merge <feature>

Resolve conflicts if necessary.
If you want to push your changes to remote develop, you can do "git push" now. Note: ROOT files and other large files will be rejected by the git repository server. If you need to distribute a large file, talk to DUNE software and computing experts, ask on Slack, or submit a service desk ticket to work out a solution.

Finish a feature branch

If you are done with a feature branch and want to merge it into develop, do

git flow feature finish

This merges your feature branch into develop and deletes the feature branch, leaving you on the local develop branch.
Resolve conflicts if necessary.
If you want to push your changes to remote develop, you can do "git push" now.

More complicated development tasks

The above should be enough to do what you are typically used to doing on a day-to-day basis, and it’s a good place to start. Once you think you have those down, there are some further things you may want to be able to do, outlined below.

Creating a new module in an existing package

Imagine you're working in your directory in your git clone'd, pull'd version of larsim/LArG4. LArG4 is a package: a top-level directory underneath a project, which makes it a package in our parlance. Say you would like to create histograms or a TTree that characterizes some truth-level information. (You're somehow unhappy with the module that already does this and want to start fresh.) You thus want an EDAnalyzer module.

To accomplish this you would proceed like any right-thinking physicist: cp another module source file from somewhere else and pull out all the un-needed bits, leaving the key structure of the module, i.e. the methods you must override to make this thing work. You would then declare, initialize, and Fill your histograms/TTrees in the right places. (You don't need to Write() them because the TFileService in art takes care of that, as usual.)

Now just go back to your build directory and do mrb z; mrbsetenv; mrb i -j4. You hopefully don't need to touch any of the files that drive the build, because the CMakeLists.txt already drills down and finds your new .cc automatically. However, if you added code that needs header files not already included via larsim/LArG4/CMakeLists.txt, you will need to edit that file to add them. The same goes if you call functions/methods from libraries not already listed in that CMakeLists.txt.

Creating a new package in an existing repository

Okay, instead, let's add a new directory under the existing repository, itself containing one or more modules and perhaps other .cpp and .h files. First, to get it to compile, your Makefile needs to know about it, and as yet it does not. Remember, the cmake step in the first layer of an mrb b or mrb i takes your CMakeLists.txt files and creates the Makefiles. That is what happened above, and merely adding a file in an existing directory, as we did there, did not complicate matters much. Here, we have at least one extra step: we need to inform the top-level CMakeLists.txt that there is a new directory to drill down into, and you'll then need a CMakeLists.txt in that directory too.

Let's do this in dunetpc. Add a new add_subdirectory(MyNewDirectory) line in dunetpc/dune/CMakeLists.txt. That is, you are directing it to drill down into dunetpc/dune/MyNewDirectory to look for the next CMakeLists.txt file to generate the Makefile that builds all the fabulous code you've put there. cp a nearby CMakeLists.txt file into the new directory. Use the cetbuildtools macros already in it and in other CMakeLists.txt files to specify any new include directories to compile against and libraries to link against.
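The mechanics can be sketched in a throwaway tree (MyNewDirectory is a made-up name; in a real working area the path would be under srcs/, and you would then hand-edit the copied CMakeLists.txt):

```shell
# Throwaway tree standing in for srcs/dunetpc.
cd "$(mktemp -d)"
mkdir -p dunetpc/dune/MyNewDirectory

# Tell the parent CMakeLists.txt to drill down into the new directory.
echo 'add_subdirectory(MyNewDirectory)' >> dunetpc/dune/CMakeLists.txt

cat dunetpc/dune/CMakeLists.txt
```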

Creating a new repository/product

In fact, we don't see too common a need for this from the lay-collaborator. Imagine coming along with a whole new package of code (which itself might live in nusoft, say) for which you've written specific algorithms, e.g. Pandora. That use case is one for code that might merit a whole new repository. But, any new wire simulation or optical scintillation package or calorimetry reco module can go into an existing repository: dunetpc or larsim or larreco, correspondingly.

If you do need to do this you also need all the "hair" that comes with the package to make it build-able and ups-able by the cetbuildtools system. We leave details to

Multiple build and install areas

Use multiple install areas when you want to build flavors using different ups qualifiers (-q setup option). This method would typically be used when compiling your code using debug compiler options (-q debug:e4) vs. profiled/optimized compiler options (-q e4:prof). Each install area should be initialized using the command mrb newDev.
mrb newDev -v v1_00_04 -q debug:e4 -T debug.v1_00_04 -f
mrb newDev -v v1_00_04 -q e4:prof -T prof.v1_00_04 -f

The -T option instructs mrb to put the build and install areas into the specified subdirectory. The -f option tells mrb that it is OK to use an existing source area. Here are a couple of caveats about this method.
  • You can omit the -v and -q options if you have larsoft setup (mrb will inherit the version and qualifiers from your setup version of larsoft).
  • Your specified qualifiers are hardwired into the generated localProductsXXXX/setup initialization script. This initialization script is not reentrant. That is, you can not switch flavors simply by sourcing a different localProductsXXXX/setup script. You should always source localProductsXXXX/setup in a fresh shell.

Miscellaneous Tips, Tricks and Useful Stuff

David Adams' list of classes and producers used by the standard reconstruction chain

Using the ninja build system instead of make

Ninja is a replacement for make, and at least feels faster.
To set up ninja, do

setup ninja

To use ninja you must start from a fresh build:
mrb z
mrb i --generator ninja

Now, whenever you want to recompile, do
ninja install

or, to cut down the noise,

ninja -C ${MRB_BUILDDIR} -k 0 install | grep -v | grep -v "Up-to-date" 

The final grep keeps your screen from being spammed with useless output. You don't need a -j16 option: ninja will figure out how many cores are available and use that number plus two by default.