Reduce disk space usage?
The output canvases written by MakeCanvases.py can occupy an awful lot of disk space (especially when there are multiple categories that all get multiplied together). Are there any ways of reducing it?
One idea: storing gzipped versions of the plots (this will help for .svg, at least). Not sure if this is workable, though -- Apache would need to serve the gzipped files with a Content-Encoding: gzip header so that browsers decompress them transparently -- and even if that is doable, it might be more work (and Service Desk tickets) than it's worth.
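For a rough sense of how much gzipping would save, a minimal sketch using Python's gzip module (the filename and SVG content here are made up for illustration -- SVG is verbose XML and typically compresses very well):

```python
import gzip
from pathlib import Path

def write_gzipped(path: str, text: str) -> tuple[int, int]:
    """Write both a plain and a gzipped copy of a text file;
    return (plain_size, gzipped_size) in bytes."""
    plain = Path(path)
    plain.write_text(text)
    gz = plain.with_suffix(plain.suffix + ".gz")
    with gzip.open(gz, "wt") as f:
        f.write(text)
    return plain.stat().st_size, gz.stat().st_size

# Fake a repetitive SVG, standing in for a plot with many drawn markers.
svg = "<svg>" + '<circle cx="1" cy="2" r="3"/>' * 5000 + "</svg>"
plain_size, gz_size = write_gzipped("canvas.svg", svg)
```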
Probably need a better idea to move forward.
#1 Updated by Christopher Backhouse about 3 years ago
Are there json (or svg?) minifiers that help at all? Or are the files already pretty dense?
https://feeding.cloud.geek.nz/posts/serving-pre-compressed-files-using/ for apache configuration?
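The approach in that post boils down to rewriting each request to a pre-compressed sibling file when the client accepts gzip. A sketch of the relevant directives (untested here; assumes mod_rewrite and mod_headers are enabled, and note #2 below on why this would have to go in the server config rather than .htaccess):

```apache
# Serve foo.svg.gz / foo.json.gz in place of foo.svg / foo.json
# whenever the browser advertises gzip support.
RewriteEngine On
RewriteCond %{HTTP:Accept-Encoding} gzip
RewriteCond %{REQUEST_FILENAME}.gz -f
RewriteRule ^(.*\.(svg|json))$ $1.gz [L]

# Tell the browser the payload is gzipped, with the original MIME type,
# so it decompresses and renders it correctly.
<FilesMatch "\.svg\.gz$">
    ForceType image/svg+xml
    Header set Content-Encoding gzip
</FilesMatch>
<FilesMatch "\.json\.gz$">
    ForceType application/json
    Header set Content-Encoding gzip
</FilesMatch>
```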
#2 Updated by Jeremy Wolcott about 3 years ago
Unfortunately, you can't change the Apache headers sent with content without changing the directives in the master config file (it's not supported in .htaccess; I checked). It might be worth fighting with SCD over that if we can't find a better way, but I'm not keen on it.
If I had to guess, I'd say minifying the JSONs might get us a factor of two, but not ten; compression would do much better. I've also discovered that the main culprit is actually SVGs corresponding to TH2s drawn with the "p" option, which I didn't realize was happening and have now disabled. I need to recheck the disk usage for realistic projects after that change.
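As a back-of-the-envelope check on those estimates, a sketch comparing pretty-printed, minified, and gzipped sizes of a JSON payload (the histogram-like structure below is made up; json.dumps with compact separators is the usual way to minify in Python):

```python
import gzip
import json

# Fake a plot payload: bin indices and contents, the kind of arrays a
# serialized histogram might contain (hypothetical structure).
payload = {
    "bins": list(range(1000)),
    "contents": [i * 0.5 for i in range(1000)],
}

pretty = json.dumps(payload, indent=2)                 # human-readable export
minified = json.dumps(payload, separators=(",", ":"))  # whitespace stripped
gzipped = gzip.compress(minified.encode())             # compression on top

sizes = {"pretty": len(pretty), "minified": len(minified), "gzipped": len(gzipped)}
```

Minification only strips whitespace, while gzip also removes redundancy in the data itself, which is why it typically wins by a much larger factor.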
#4 Updated by Jeremy Wolcott about 3 years ago
#6 Updated by Jeremy Wolcott about 3 years ago
- Status changed from Feedback to Closed
Hopefully switching to ROOT file serialization (novaart:r23056) will be a good enough solution, so I'm going to close this. (It will certainly eliminate the >1 GB (!) 2D histogram problem, which stemmed from drawing too many points and/or too many bins in vector format.)