How To Trigger a Test Build

This section is for Admins or Code Coordinators who wish to trigger a build of a branch other than develop, to test their code's readiness for release. Jump to the bottom of this page for an example build trigger.

Triggering a build with default parameters

There are several ways you can trigger a build:

  1. by git push-ing a change to the develop branch of a LAr module
  2. by git push-ing a change with "BuildIt" in the commit message
  3. by filling out the web form on the build service
    To trigger a test of the head of the develop branches, simply go to the request page and fill in the form, and click the Build button at the bottom.
  4. by running the trigger script in the lar_ci ups package (see the example at the bottom of this page)

Parameter options


LAR_DEBUG is a boolean parameter that is false by default. If set to true, the Jenkins log will show the full stdout and stderr of the test scripts rather than the simplified, human-readable version it normally shows.


These parameters are set automatically when the test is triggered by a commit. They have no effect on how the test functions.


The version of LArSoft to use for setting up the architecture and for grabbing modules from numbered versions. By default, this is set to "latest" and the test scripts choose the latest version found with "ups list".


Allows you to specify exactly which modules you want built. The value of this parameter is a comma-separated list of terms, each with the syntax described below.

Syntax for each term: [module]@[location]

Valid syntax types for [module]@:
Replace [module] with the name of a larsoft module to specify that module. (e.g. larcore@[location])
Replace [module] with * to specify all modules. (i.e. *@[location])
Leave out [module]@. If no at sign is detected, *@ will be appended and all modules will be drawn from [location]. (i.e. [location])

Valid syntax types for [location]:
Replace [location] with a branch of the LArSoft repository to specify that [module] will be drawn from that branch. (e.g. [module]@develop)
Replace [location] with ~ to specify that [module] will be drawn from the version specified in LAR_VERSION. (i.e. [module]@~)
Leave out [location]. If nothing is detected after the [module]@, develop will be appended and the modules will be drawn from the develop branch. (i.e. [module]@)

Because both halves of a term can be left out, the empty string is a valid term that evaluates to *@develop. In addition, *@develop is always appended to the end of the term list, so any modules not otherwise specified are drawn from the develop branch. If a module is specified more than once, only the first occurrence counts. For instance, in larcore@~,*@develop, larcore matches both terms, but the larcore@~ term wins because it comes first.
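The first-match resolution above can be sketched as a small shell function. This is a hypothetical illustration only; resolve_module is an invented name, not part of lar_ci:

```shell
#!/bin/sh
# Hypothetical sketch of the first-match term resolution described above.
# resolve_module is an invented name, not a real lar_ci helper.
resolve_module() {
    module=$1 result=''
    set -f                               # keep the '*' in terms from glob-expanding
    old_ifs=$IFS; IFS=','
    for term in $2 '*@develop'; do       # *@develop is always appended as a fallback
        case $term in
            *@*) name=${term%@*} loc=${term#*@} ;;
            *)   name='*'        loc=$term      ;;   # no '@': applies to all modules
        esac
        [ -n "$name" ] || name='*'                   # empty left half: all modules
        [ -n "$loc" ]  || loc=develop                # empty right half: develop
        if [ "$name" = '*' ] || [ "$name" = "$module" ]; then
            result=$loc
            break                                    # first matching term wins
        fi
    done
    IFS=$old_ifs; set +f
    printf '%s\n' "$result"
}

resolve_module larcore 'larcore@~,*@develop'   # first matching term wins: ~
resolve_module larreco 'larcore@~'             # falls through to *@develop: develop
```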

Configuring the builds

The builds are currently done via a tagged version of the lar_ci package,
which is an argument to the trigger script. The lar_ci version can be
set in the hook scripts of the repositories. The build system is configured
to checkout that version of the package, set it up with UPS, and then run the script from that release to do the build.

This script operates off a workflow.cfg file in the $LAR_CI_DIR/cfg directory.


This configuration file specifies the different stages of the build process, and
what commands to run in each phase. A sample config file looks like this:

# main config lists section to get personality from

stages  = eval_n checkout_x_modules build test inst ci_test
modules = larana larcore lardata lareventdisplay larevt larexamples larpandora larreco larsim larsoft

stages  = eval_n checkout_x_modules build test inst ci_test
modules = larana larcore lardata lareventdisplay larevt larexamples larpandora larreco larsim larsoft uboonecode

modules    = larana larcore lardata lareventdisplay larevt larexamples larpandora larreco larsim larsoft ubutil uboonecode lbnecode

eval1 = source /cvmfs/ || source /grid/fermiapp/products/uboone/
eval2 = . `ups setup ups`
eval3 = LAR_VERSION_QUALS="${LAR_VERSION_QUALS:-`get_latest_ups_version_quals larsoft`}" 
eval4 = `mrb newDev -f -v $LAR_VERSION_QUALS | grep source` 
eval5 = source ${MRB_DIR}/bin/mrbSetEnv 
checkoutdir = ${MRB_SOURCE}
checkoutcmd = mrb -g -r $module; git checkout -b `match_args $module "$LAR_REVISION,*@default"`
builddir    = ${MRB_BUILDDIR}
buildsetup  = mrbsetenv
buildcmd    = mrb b -j `grep -c ^processor /proc/cpuinfo`
testdir     = ${MRB_TOP}
testcmd     = mrb test
instcmd     = mrb i
setupci_test =  mrbslp
ci_testcmd   = testrunner default

Basically, this specifies 6 "stages" to the workflow:

  • eval_n -- which will run the eval1, eval2, ... commands
  • checkout_x_modules -- which will run the checkoutcmd command -- mrb g -- for each thing in the "modules" list (after cd-ing to checkoutdir and running checkoutsetup)
  • build -- which will run the buildcmd (after cd-ing to builddir and running buildsetup) -- mrb b
  • test -- which will run testcmd (after cd-ing to testdir and running testsetup) -- mrb test
  • inst -- which will run instcmd (after cd-ing to instdir and running instsetup) -- mrb i
  • ci_test -- which will run ci_testcmd (after cd-ing to ci_testdir and running ci_testsetup) -- testrunner default

So overall you can see the pattern here; each stage in the list can be either

  • a simple name
  • a name with an _n suffix
  • a name with an _x_variables suffix

Then for each name xxx in the stages list, the config is searched (under the [personality] section) for

  • xxxdir (which we cd to),
  • xxxsetup (which we run)
  • xxxcmd (which we run) or
  • xxx1,xxx2,... (if a _n suffix was seen)

If the stage has a _x_variables suffix, we add a

for variable in ${variables}; do
    eval $xxxcmd
done

sort of loop, running the command once for each item in the "variables" list.
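The dispatch described above can be sketched in shell roughly as follows. This is a hypothetical illustration of the pattern, not the real lar_ci driver, and run_stage is an invented name:

```shell
#!/bin/sh
# Hypothetical sketch of the stage dispatch described above; run_stage
# is an invented name, not the real lar_ci driver.
run_stage() {
    stage=$1
    case $stage in
        *_n)                                  # e.g. eval_n: run xxx1, xxx2, ...
            name=${stage%_n}
            i=1
            while eval "cmd=\$${name}${i}"; [ -n "$cmd" ]; do
                eval "$cmd"
                i=$((i + 1))
            done ;;
        *_x_*)                                # e.g. checkout_x_modules
            name=${stage%%_x_*}               # -> checkout
            eval "list=\$${stage##*_x_}"      # -> contents of $modules
            eval "dir=\$${name}dir setup=\$${name}setup cmd=\$${name}cmd"
            [ -z "$dir" ]   || cd "$dir"
            [ -z "$setup" ] || eval "$setup"
            for module in $list; do           # run the command once per item
                eval "$cmd"
            done ;;
        *)                                    # simple stage: xxxdir/xxxsetup/xxxcmd
            eval "dir=\$${stage}dir setup=\$${stage}setup cmd=\$${stage}cmd"
            [ -z "$dir" ]   || cd "$dir"
            [ -z "$setup" ] || eval "$setup"
            eval "$cmd" ;;
    esac
}

# e.g., with eval1='echo setup' and buildcmd='mrb b' defined in the config:
#   run_stage eval_n; run_stage build
```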

Putting it all together to fire off a custom build

To illustrate the customization possible, let's build from some non-develop branches on some of our repositories.

The instructions below are a work in progress; don't try them yet.

# Do one of the below voms-proxy-init's. Dunno why they're not symmetric under uboone<->lbne interchange!
voms-proxy-init -noregen -rfc -voms fermilab:/fermilab/uboone/Role=Analysis
voms-proxy-init -noregen -rfc -voms lbne:/lbne/Role=Analysis
setup lar_ci

# Now, finally trigger the build/workflow to build two non-develop branches
trigger --revisions "uboonecode@feature/mcshower lbnecode@feature/newTriplyWrappedAPAs"

Now for the next level of customization. In this example we will do two things: change the workflow, and again build some non-develop branches on one of our repos. One way to change the workflow is to change which suite of tests in our cfg file is run.

Note the gitology necessary below to capture your change.

git clone ssh://
cd lar-ci
cd cfg
cp workflow.cfg workflow_myname.cfg
emacs -nw workflow_myname.cfg
# edit this file to change the default_uboonecode line to something different
git add workflow_myname.cfg
git commit -a -m "Make my own cfg file."
git tag mynameTag
git push origin mynameTag

trigger --revisions "larreco@feature/newShowerBranch lbnecode@feature/newQuadruplyWrappedAPAs" --version mynameTag --wfcfg workflow_myname.cfg

We could also, instead of fiddling with the test suites that run, do something drastic like drop the ci-tests from the stages of the workflow altogether, and thus not run any ci tests. This will shorten the workflow significantly -- at the expense of not learning whether the crucial tests that check the functionality of recent check-ins passed, or whether we perhaps broke the build! We leave this very simple change as an exercise for the reader.

Triggering build of what you have checked out

If you have some code checked out that you are working on, and you are sitting in your $MRB_TOP or $MRB_SOURCE directory,
you should be able to run

trigger --scan-revisions

and it should look at your checked-out copies, collect the revisions, and build the revision list for you, so
you don't have to write out a long --revisions="larreco@this lbnecode@that ..." list.
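Under the hood, such a scan might look roughly like this sketch. This is hypothetical; scan_revisions is an invented name and the real implementation may well differ:

```shell
#!/bin/sh
# Hypothetical sketch of a --scan-revisions style scan; scan_revisions
# is an invented name, not the real lar_ci implementation.
scan_revisions() {
    srcdir=$1
    revisions=''
    for dir in "$srcdir"/*/; do
        [ -d "$dir/.git" ] || continue              # only git checkouts
        branch=$(git -C "$dir" symbolic-ref --short HEAD)
        revisions="$revisions $(basename "$dir")@$branch"
    done
    printf '%s\n' "${revisions# }"
}

# e.g. trigger --revisions "$(scan_revisions "$MRB_SOURCE")"
```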