DUNE-centric Guide to Using LArSoft in git/mrb world¶
- Table of contents
- DUNE-centric Guide to Using LArSoft in git/mrb world
- Run from an existing (“frozen”) LArSoft release
- Develop and build in LArSoft, based on the latest frozen release
- Feature branch
- More complicated development tasks
- Miscellaneous Tips, Tricks and Useful Stuff
The following is a quick guide to the things you, a member of DUNE, will need to do in order to run and develop applications in LArSoft. Please note that there is more general guidance at https://cdcvs.fnal.gov/redmine/projects/larsoft/wiki/_Quick-start_guide_to_using_and_developing_LArSoft_code_, some of which this page borrows from. That wiki is definitely worth looking at, perhaps even before reading this document. The following assumes:
- You have all your computing privileges and accounts, including an account on the dune gpvm nodes (e.g. dunegpvm01.fnal.gov).
- You are working on one of the dune gpvm nodes. Many of these instructions should work on other machines/sites as well, if they are configured in the standard way, but there are no guarantees. Talk to your system/local administrator if you’re having trouble, to make sure there are no differences in the basic setup.
- You have a working knowledge of UNIX.
- You are using the bash shell (csh users should be able to infer the equivalents).
- You know something about the general structure of LArSoft. If you don’t know what a fcl file is, and/or don’t know how to make/modify one, then you will need a more basic tutorial than what’s here.
Run from an existing (“frozen”) LArSoft release¶
First, set up the DUNE software environment:
source /grid/fermiapp/products/dune/setup_dune.sh
Then, to list the available versions of the dune software, do
ups list -aK+ dunetpc
You should see a list (possibly a long one…) of versions of the code to set up. Each line should look something like this:
"dunetpc" "v04_19_00" "Linux64bit+2.6-2.12" "e7:prof" ""
Pick the version you want to set up (probably the latest, and probably the prof variant), and do something like this (with the version and qualifiers you want):
setup dunetpc v04_19_00 -q prof:e7
This will automatically set up larsoft as well, so you don't need to also do
setup larsoft v04_19_00 -q prof:e7. A given version of dunetpc requires the matching version of larsoft, and so sets it up too. You can think of “dunetpc” as DUNE’s own particular implementation of larsoft. You should then be able to run your usual larsoft job:
lar -c my_fcl_file.fcl ...
Develop and build in LArSoft, based on the latest frozen release¶
We strongly advise setting up at most one new development area per login session. That is, creating a second area (e.g.
mkdir larsoft_mydev2 below) should only happen after logging out and logging back in. The reason is that your $PRODUCTS variable tends to grow to include every working area you visit during a login session, and this can even persist across logins. That makes for many weird effects when you run
mrbsetenv. Check your $PRODUCTS when confused.
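When confused, it helps to print $PRODUCTS one entry per line. A minimal sketch (the value below is a hypothetical example, not a real path on the gpvm nodes):

```shell
# Hypothetical $PRODUCTS after visiting two working areas in one login;
# the paths are illustrative only.
PRODUCTS=/grid/fermiapp/products/dune:/dune/app/users/alice/larsoft_mydev/localProducts_XXXX
# Print one entry per line to spot stale or duplicated working areas:
echo "$PRODUCTS" | tr ':' '\n'
```

If you see entries for working areas you are no longer using, start from a fresh login.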
For developing your code, we recommend you use the latest LArSoft frozen release. The LArSoft frozen release is tagged and built once a week. By using the latest frozen release, you get the most up-to-date features and a stable release. We have instructions below on how to update your local release when a new LArSoft release is available.
Like above, the first thing to do is
source /grid/fermiapp/products/dune/setup_dune.sh
But now, things get different! First, check the larsoft releases:
ups list -aK+ larsoft
Then set up the latest version of LArSoft, e.g. (note that v04_19_00 -q e7:prof is just an example; set up the latest larsoft version you find with this command. Note that the list from the ups command is NOT sorted, so the latest version is not necessarily the last one in the list):
setup larsoft v04_19_00 -q e7:prof
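Since the ups list is unsorted, a version-aware sort helps pick the latest tag. A minimal sketch, using a simulated (hypothetical) three-line dump in place of real `ups list -aK+ larsoft` output:

```shell
# Simulated output of `ups list -aK+ larsoft`; the real list is NOT sorted,
# so extract the version field and sort it with GNU sort's -V (version) flag.
latest=$(printf '%s\n' \
    '"larsoft" "v04_19_00" "Linux64bit+2.6-2.12" "e7:prof" ""' \
    '"larsoft" "v04_09_01" "Linux64bit+2.6-2.12" "e7:prof" ""' \
    '"larsoft" "v04_18_02" "Linux64bit+2.6-2.12" "e7:prof" ""' |
  awk '{print $2}' | tr -d '"' | sort -V | tail -n1)
echo "$latest"   # prints v04_19_00
```

On a gpvm you would pipe the real `ups list -aK+ larsoft` output through the same `awk | tr | sort -V | tail` chain.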
Another way to get the latest version of LArSoft that works with the latest dunetpc is to check this file and use the larsoft version on the line that starts with 'larsoft':
Do not use "nightly", as the dunetpc nightly version is not supported at the moment.
debug or prof:
prof builds optimized code and runs much faster; debug provides more debugging information if you use gdb or totalview.
Now, you should make a new directory where you intend to put your code (Do not use your home directory, as the libraries you build can get large. Use somewhere beneath /dune/app/users/USER/). Let’s say that new directory is going to be called
larsoft_mydev. Then do
mkdir larsoft_mydev
cd larsoft_mydev
Then, inside the new directory, do
mrb newDev
Do the following to use the local release:
source localProducts_XXXX/setup
Now, pull down dunetpc:
cd srcs
mrb g dunetpc
Optional: when to add other repositories¶
You should add other repositories if
- you need new features in that repository that were added after the last frozen release, or
- you want to develop code in that repository, or
- the current head version of dunetpc depends on the head version of another package (watch the announcements).
cd srcs
mrb g larreco
Note that it can take much longer to compile the code if you check out additional packages. Don't check out packages you don't need.
Building your code¶
Go to the build directory in your development area, or just do
cd $MRB_BUILDDIR
and then, if this is your first time building with these local packages, do
mrbsetenv
You can omit this if you have already done it, but if you at any point add a new package, you should do
mrb z; mrbsetenv
That is, you want to regenerate the CMakeLists.txt files and hence the Makefiles. Once properly set up, you can build/install:
mrb i -j4
The -j4 tells it to build on 4 cores, if available. It may be easier to break that step into two parts, mrb build -j4; mrb install, in order to more easily check for compilation issues.
If you see the following error:
ERROR: Version conflict -- dependency tree requires versions conflicting with current setup of product : full descriptions setpath v1_11 -f NULL -z /grid/fermiapp/products/common/db vs setpath v1_11 -f NULL -z /cvmfs/fermilab.opensciencegrid.org/products/common/db
INFO: no optional setup of duneutil v01_23_11 -q +e10:+prof
Please refer to Issue #15028.
Setup your code to run¶
After everything is built and installed, do
mrbslp
to set up your newly built code. (‘slp’ stands for setup local products.) Now you can run your lar job:
lar -c my_fcl_file.fcl ...
Shortening the development cycle¶
Once you're stable, the code you're working on is well defined, and you're not git pull'ing new repositories, the development cycle can be shortened. Make your edits in ../srcs/dunetpc/PackageA/AnalysisCode.cpp and jump back to ../build. You can now replace the onerous
mrbsetenv; mrb i -j4; mrbslp
with just
make install -j4
That should cut a few-minute procedure down to under a minute.
There is a computer available to DUNE collaborators, dunebuild01.fnal.gov, with sixteen cores. It is intended for building code only; there, mrb i -j16 and make install -j16 will speed up builds considerably.
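Rather than hardcoding -j4 or -j16, you can match the job count to the machine you happen to be on. A small sketch (the echo stands in for the real mrb invocation):

```shell
# Pick a parallel-build job count matching the available cores.
# nproc is the common Linux way; getconf is a portable fallback.
jobs=$(nproc 2>/dev/null || getconf _NPROCESSORS_ONLN)
echo "would run: mrb i -j${jobs}"
```

On dunebuild01.fnal.gov this would automatically pick up all sixteen cores.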
The next time you log in to a dunegpvm node, do the following to set things up:
source /grid/fermiapp/products/dune/setup_dune.sh
cd larsoft_mydev
source localProducts_XXXX/setup
mrbslp
When a new LArSoft release is available and you want to develop against the new LArSoft release¶
- Log back in to a dunegpvm node and do the following:
source /grid/fermiapp/products/dune/setup_dune.sh
setup larsoft vxx_yy_zz -q e6:prof   # set up the latest larsoft release
cd larsoft_mydev
mrb newDev -p   # creates a new localProducts_XXXX directory using the new larsoft release and the existing srcs directory
source localProducts_XXXX/setup   # make sure to use the new localProducts directory; it's better to delete the old one
cd srcs/dunetpc
git pull origin develop   # this merges develop into your current branch
- Update all other repositories if they exist in your srcs directory (e.g. larreco)
- Make a clean build
cd $MRB_BUILDDIR
mrb z
mrbsetenv
mrb i -j4   # rebuild
This command shows the versions of all ups packages currently set up and their locations:
ups active
Feature branch¶
There are many reasons to make a feature branch:
- You are testing things that you don't want other people to use yet.
- If you are making breaking changes, e.g. changing code in two repositories (such as dunetpc and larreco) so that people need to check out both repositories in order to compile, it is good practice to put your changes in feature branches and ask the release manager to merge them back into develop when he/she is tagging a new release.
- You want to collaborate with other people on code development.
This page has all the useful information on git flow/feature branch:
To create a feature branch¶
git flow feature start user_MyFeatureAddition
It is recommended to add your user name in the name of the feature branch so people know who is responsible for this feature branch.
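git flow may not be installed everywhere. As a sketch, this self-contained demo in a throwaway repository shows the plain-git equivalent of `git flow feature start user_MyFeatureAddition` (a feature branch is just a branch named feature/… cut from develop):

```shell
# Throwaway demo repository (names and identities are placeholders).
cd "$(mktemp -d)"
git init -q demo && cd demo
git config user.email you@example.com
git config user.name "You"
git commit -qm init --allow-empty
git checkout -qb develop
# `git flow feature start X` == create feature/X from develop and switch to it:
git checkout -qb feature/user_MyFeatureAddition develop
git symbolic-ref --short HEAD   # prints feature/user_MyFeatureAddition
```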
Now if you do
git branch
you should see the new feature branch.
Now you can make changes and commit them but do not push.
To publish a feature branch for collaboration purposes¶
git flow feature publish
Now if you do
git branch -a
you should see your feature branch in the list of remote branches as remotes/origin/feature/<feature>.
To update a published feature branch after making commits:¶
Make sure you are on the correct feature branch:
git branch
If not, do
git checkout <feature>
After making changes, do
git commit -a -m 'commit message'
git push
to commit and push your changes to the remote feature branch.
Work on an existing feature branch created by someone else¶
First make sure the feature branch is published. Do
git pull
git branch -a
to verify remotes/origin/feature/<feature> exists.
git checkout feature/<feature>
to check out the feature branch to your local area. Do
git branch
to verify you are on that branch.
Merge develop into a feature branch¶
If you need to fetch upstream changes in develop branch:
git checkout develop    # go to local develop branch
git pull                # update local develop branch to be in sync with remote develop branch
git checkout <feature>  # go back to the local feature branch
git merge develop       # merge your local develop branch into your local feature branch
If the last step reports failed merges, open the files and resolve the conflicts indicated by the "<<<<<<<" and ">>>>>>>" markers. After all conflicts are resolved, do "git commit -a".
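To see what a conflict actually looks like, here is a self-contained sketch in a throwaway repository (file names and identities are placeholders) that provokes one and counts the conflict markers:

```shell
# Throwaway demo repository producing a deliberate merge conflict.
cd "$(mktemp -d)"
git init -q demo && cd demo
git config user.email you@example.com
git config user.name "You"
echo base > file.txt && git add file.txt && git commit -qm base
base_branch=$(git symbolic-ref --short HEAD)
git checkout -qb feature/demo
echo feature-edit > file.txt && git commit -qam feature
git checkout -q "$base_branch"
echo develop-edit > file.txt && git commit -qam develop
git merge feature/demo >/dev/null 2>&1 || true   # conflict expected
grep -c '^<<<<<<<' file.txt   # prints 1: one conflict hunk to resolve
```

The file now contains both versions between "<<<<<<<" and ">>>>>>>", separated by "======="; edit it to keep what you want, then `git commit -a`.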
A simpler way of merging develop into a feature branch, while on the feature branch, is:
git pull origin develop
If you want to update remote feature branch, you can do "git push" now.
Merge a feature branch into develop¶
If you want to merge a feature branch into develop and still keep the feature branch:
git checkout develop   # go to your local develop branch
git merge <feature>
Resolve conflicts if necessary.
If you want to push your changes to remote develop, you can do "git push" now.
Finish a feature branch¶
If you are done with a feature branch and want to merge it into develop:
git flow feature finish
This merges your feature branch into develop and deletes the feature branch. You are now on the local develop branch.
Resolve conflicts if necessary.
If you want to push your changes to remote develop, you can do "git push" now.
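For reference, here is a sketch of what `git flow feature finish` does in plain git, in a throwaway repository (names are placeholders): merge the feature branch into develop, then delete it.

```shell
# Throwaway demo: plain-git equivalent of `git flow feature finish`.
cd "$(mktemp -d)"
git init -q demo && cd demo
git config user.email you@example.com
git config user.name "You"
git commit -qm init --allow-empty
git checkout -qb develop
git checkout -qb feature/demo
echo work > work.txt && git add work.txt && git commit -qm work
git checkout -q develop
git merge -q --no-edit feature/demo   # merge the feature into develop
git branch -q -d feature/demo         # safe delete: refuses if unmerged
git symbolic-ref --short HEAD         # prints develop
```

Using `-d` (not `-D`) is the safe choice: git refuses to delete a branch whose commits have not been merged.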
More complicated development tasks¶
The above should be enough for typical day-to-day work, and it’s a good place to start. Once you think you have those down, there are some further things you may want to be able to do, outlined below.
Creating a new module in an existing package¶
Imagine you're working in your directory in your git clone'd, pull'd version of larsim/LArG4. LArG4 is a package: a top-level directory underneath a project, which makes it a package in our parlance. Say you would like to create histograms or a TTree that characterizes some truth-level information. (You're somehow unhappy with the module LArG4Ana_module.cc that already does this and want to start fresh.) You thus want an art::EDAnalyzer module.
To accomplish this you would proceed like any right-thinking physicist and cp another *Ana_module.cc file from somewhere else and pull out all the un-needed bits, leaving the key structure of the module: the methods you must override to make this thing work. You would then declare, initialize, and Fill your histograms/TTrees in the right places. (You don't need to Write() them because the TFileService in art takes care of that, as usual.)
Now just go back to your build directory and
mrb z; mrbsetenv; mrb i -j4. You hopefully don't need to touch any of the files that do the building, because drilling down to find your new .cc file is a built-in feature of the CMakeLists.txt. If, however, you added code that needs header files not already included via larsim/LArG4/CMakeLists.txt, you would need to edit that file to add them. The same goes if you call new functions/methods from libraries not already listed in that CMakeLists.txt.
Creating a new package in an existing repository¶
Okay, instead, let's add a new directory under the existing repository, itself containing one or more modules and perhaps other .cpp and .h files. First, to get it to compile, your Makefile needs to know about it, which as yet it does not. Remember, the cmake step that happens in the first layer of an
mrb b or
mrb i takes your CMakeLists.txt files and creates the Makefiles. That is what happened above, and merely adding a file in an existing directory did not complicate matters much. Here, we have at least one more step: we need to inform the top-level CMakeLists.txt that there is a new directory into which to drill down. You'll then need a CMakeLists.txt in that directory too.
Let's do this in dunetpc. Add a new add_subdirectory(MyNewDirectory) line in dunetpc/dune/CMakeLists.txt. That is, you are directing it to drill down into dunetpc/dune/MyNewDirectory to look for the next CMakeLists.txt file to generate the Makefile to build all the fabulous code you've put there. cp a CMakeLists.txt file that is nearby. Use the cetbuildtools macros already in there and from other CMakeLists.txt files to specify any new include directories to compile against and libraries to link against.
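As a sketch of what that nearby file might contain (the directory and library names here are hypothetical; copy the actual macro usage from a neighboring CMakeLists.txt in dunetpc):

```cmake
# dunetpc/dune/MyNewDirectory/CMakeLists.txt -- hypothetical sketch.
# art_make (a cetbuildtools macro) builds every *_module.cc it finds here.
art_make(
  MODULE_LIBRARIES
    larcore_Geometry            # example dependency; adjust to your includes
    ${ART_FRAMEWORK_CORE}
    ${ROOT_BASIC_LIB_LIST}
)
install_headers()
install_fhicl()
install_source()
```

The MODULE_LIBRARIES list is where you add the libraries your new code links against, mirroring the headers you include.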
Creating a new repository/product¶
In fact, we don't see too common a need for this from the lay-collaborator. Imagine coming along with a whole new package of code (which itself might live in nusoft, say) for which you've written specific algorithms, e.g. Pandora. That use case is one for code that might merit a whole new repository. But, any new wire simulation or optical scintillation package or calorimetry reco module can go into an existing repository: dunetpc or larsim or larreco, correspondingly.
If you do need to do this you also need all the "hair" that comes with the package to make it build-able and ups-able by the cetbuildtools system. We leave details to https://cdcvs.fnal.gov/redmine/projects/larsoft/wiki/_Quick-start_guide_to_using_and_developing_LArSoft_code_#Creating-an-entirely-new-product-within-an-existing-work-area.
Multiple build and install areas¶
Use multiple install areas when you want to build flavors using different ups qualifiers (the -q setup option). This method would typically be used when compiling your code with debug compiler options (-q debug:e4) vs. profiled/optimized compiler options (-q e4:prof). Each install area should be initialized using the commands
mrb newDev -v v1_00_04 -q debug:e4 -T debug.v1_00_04 -f
mrb newDev -v v1_00_04 -q e4:prof -T prof.v1_00_04 -f
The -T option tells mrb to put the build and install areas into the specified subdirectory. The -f option tells mrb that it is OK to use an existing srcs area. Here are a couple of caveats about this method.
- You can omit the -v and -q options if you have larsoft set up; mrb will inherit the version and qualifiers from your setup version of larsoft.
- Your specified qualifiers are hardwired into the generated localProductsXXXX/setup initialization script. This initialization script is not reentrant. That is, you cannot switch flavors simply by sourcing a different localProductsXXXX/setup script. You should always source localProductsXXXX/setup in a fresh shell.
Miscellaneous Tips, Tricks and Useful Stuff¶
David Adams' list of classes and producers used by the standard reconstruction chain: https://dune.bnl.gov/wiki/LBNE_Far_Detector_Software_Data_Model¶
Using the ninja build system instead of make¶
Ninja is a replacement for make, and at least feels faster.
To setup ninja do
setup ninja v1_6_0
To use ninja you must do a fresh install.
cd $MRB_BUILDDIR
mrb z
mrbsetenv
mrb i --generator ninja
Now, whenever you want to recompile do the following
cd $MRB_BUILDDIR
ninja install
You don't need the -j16 option, as ninja figures out how many cores are available and uses that number plus 2 by default.