The analysis groups produce nue decafs (nue quality and containment cuts applied) and numu decafs (weak preselection, containment, and loose cosmic rejection), plus, in the ND, nue+numu decafs containing every slice that passes either set of cuts. Decafs also drop some of the bulky truth information carried around by full CAFs, so they are usually substantially smaller than the corresponding CAFs.
If the existing decafs are unsuitable for you, e.g. they cut out events you need, or you could make your files substantially smaller by applying more stringent cuts, you may want to make your own. If your decafs might be broadly useful, please discuss the cuts and metadata with the relevant group conveners and look into making the files generally available.
To decaf a single input file (e.g. for testing purposes), use one of the macros in the CAFAna package whose names begin with "reduce". They take an input and an output file name as arguments:
$ cafe -bq reduce_foo_bar.C input.caf.root output.decaf.root
If you need to decaf a large number of files, use:
$ submit_decaf_project.sh
Usage: submit_decaf_project.sh DATASET OUTPUTDIR RELEASE SCRIPT [ROLE] [HEX]

SCRIPT can be a literal CAFAna script ($SRT_PUBLIC_CONTEXT/CAFAna/ implicit)
or one of "nue", "numu", or "nue_or_numu".
ROLE can be "Analysis" or "Production"; "Analysis" is the default.
If HEX is "yes", create hexdirs under OUTPUTDIR. Implied by ROLE=Production.
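For concreteness, an invocation might look like the following. The dataset name, output path, and release below are purely illustrative placeholders; substitute the SAM dataset, output directory, and release you actually intend to use:

```shell
# All argument values here are hypothetical examples:
#   DATASET   - a SAM dataset definition of input CAF files
#   OUTPUTDIR - where the output decafs should land
#   RELEASE   - the software release to run the job with
#   SCRIPT    - here the built-in "nue" selection
#   ROLE      - omitted, so it defaults to "Analysis"
submit_decaf_project.sh \
  my_caf_dataset_def \
  /pnfs/nova/scratch/users/$USER/decaf \
  development \
  nue
```

Since ROLE is left at its "Analysis" default and HEX is not given, no hexdirs are created under OUTPUTDIR.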