- Table of contents
- Available releases
- Augmented Reality
- Official procedures
- Getting useful data from Argo
- The Team
Welcome to the VENu wiki! Here you can find documentation about the development of VENu.
VENu, a Virtual Environment for Neutrinos, is a multi-platform event display for the MicroBooNE neutrino experiment. VENu is built and rendered in 3D using the Unity game engine, and was designed with both virtual- and augmented-reality features for displaying actual neutrino interactions from the MicroBooNE detector.
This page contains information about the development of VENu, and more specifically about what is currently in the build. It also includes some disclaimers about what we have learned through trial and error, to spare future developers the same headaches in the design areas below.
The following are the tools that we used to create everything in VENu. Everything is open source, and doesn't cost a dime.
- Blender - Open-source 3D modelling software whose files import directly into Unity. The arbitrary units/dimensions in Blender carry over to Unity unchanged. (e.g. if you make a square with an area of 100 in Blender, it will have an area of 100 in Unity as well.)
- Gimp - An open source image manipulation program. Easy (enough) to use, and great for making certain colors transparent, which is very important for things like logos and interface images.
Github and Unity¶
Github and Unity do NOT play well together. Unless you are working on the project by yourself, you will most likely run into issues when trying to sync repositories. You can opt to use separate branches, but this can slow development down, especially when you have to merge multiple times a week (or day).
- Occasionally you may have to 'nuke' (delete) your repository entirely and re-clone it. This can be a pain, but sometimes there is no other way to fix the Unity project when things suddenly break (e.g. none of your models are displayed, or GameObjects are missing all of their scripts).
Download link. Installation is simple: extract the files and run the executable.
Download link. Installation is simple: extract the files and run the app.
Download link. Installation requires you to mark the .x86 file as executable. On Ubuntu it is as simple as right-clicking the file->Properties->Allow executing file as a program.
Download link. Installation requires you to allow installation from "unknown sources"; how you do this depends on your version of Android.
Sadly, the iOS version is currently unavailable for download.
- The MicroBooNE detector was created in Blender using dimensions in decimeters (don't ask why..). The cryostat, TPC, and PMT locations are dimensionally correct. Everything else (e.g. the feedthroughs) was done by eye.
- When placing the detector in a Unity scene, you must rotate it by 270 degrees about the y-axis. Otherwise the events will not be displayed correctly, because the detector's origin/coordinate plane will not be aligned with the event coordinates.
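This rotation can be applied in the editor's Transform inspector, or from a script at scene load; a minimal sketch (the class name here is illustrative, not an actual VENu script):

```csharp
using UnityEngine;

// Attach to the detector model so it is rotated into the event
// coordinate system when the scene loads.
public class DetectorAlignment : MonoBehaviour
{
    void Start()
    {
        // Rotate 270 degrees about the y-axis so the detector's origin
        // lines up with the coordinate system used by the events.
        transform.rotation = Quaternion.Euler(0f, 270f, 0f);
    }
}
```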
The following are some miscellaneous interface scripts.
fading.cs - Fades in/out from an image. Use a black square to create fluid transitions between scenes.
inGameMenuScript.cs - Controls the overall in-game menu functionality.
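A scene fade like fading.cs can be sketched as a coroutine that lerps the alpha of a full-screen black UI Image; this is a minimal sketch of the idea, not the actual fading.cs implementation (all field names are assumptions):

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.UI;

// Fades a full-screen black UI Image in or out over fadeTime seconds,
// giving fluid transitions between scenes.
public class ScreenFader : MonoBehaviour
{
    public Image fadeImage;      // a black Image covering the whole canvas
    public float fadeTime = 1f;

    public IEnumerator Fade(float targetAlpha)
    {
        Color c = fadeImage.color;
        float start = c.a;
        for (float t = 0f; t < fadeTime; t += Time.deltaTime)
        {
            c.a = Mathf.Lerp(start, targetAlpha, t / fadeTime);
            fadeImage.color = c;
            yield return null;   // wait one frame
        }
        c.a = targetAlpha;       // snap exactly to the target at the end
        fadeImage.color = c;
    }
}
```

Calling `StartCoroutine(Fade(1f))` fades to black; `StartCoroutine(Fade(0f))` fades back in.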
Control Schemes¶
There are multiple control schemes in VENu to accommodate different users.
- A mini-map (Cinematic) viewing mode, where a main camera orbits a target and users control the target's transverse movement by clicking/touching an overlay of the detector. Vertical height controlled by a height control slider.
- One-Joystick scheme where the single joystick controls transverse movement of the camera, with touch and drag controls for the camera rotations. Vertical height controlled by a height control slider.
- Two-Joystick scheme where one joystick (on the left) controls transverse movement of the camera, and another joystick for the camera rotations. Vertical height controlled by a height control slider.
- WASD first person mode. Similar to any first person control scheme, though camera rotation will be disabled unless the user is right-clicking. Vertical height controlled by a height control slider.
controlSwitcher.cs - Gives the ability to switch to the control scheme that the user prefers. Attached to the
mouseInterface.cs - The first person control scheme for STANDALONE versions only.
twoJoyControl.cs - The two-joystick control scheme for MOBILE versions only.
oneJoyDragInterface.cs - A first person control scheme for MOBILE versions only. Has one joystick that controls movement, and swipes on the screen will control the camera motion.
The following scripts are for the "cinematic viewing mode" which essentially uses clicks/touches on a minimap to move an object that the camera is orbiting around.
moveTarget.js - Moves the target to the location of the click/touch on the assigned orthographic camera.
smoothCameraOrbit.js - Forces the main camera to look at and orbit around a target. The distance from the target can be changed by using "pinch to zoom" or the mouse scroll-wheel.
tooltip.js - Displays a tooltip for the track that was touched/clicked on. (For now, just displays filler info).
mainMenuScript.cs - Controls the functionality of the main menu.
exitButtonScript.js - Very simple script to allow the person to (sadly) exit the program.
onlineEventsMenu.cs - Controls the different panels for displaying online events.
onlineFileGatherer.js - Creates url arrays for the online events.
eventButtonScript.js - Attached to the EventButton prefab, displays the title of the event and draws it in the canvas.
pointDensitySlider.cs - Places a limit on the total number of spacepoints allowed, and gives the user the ability to change the percentage of how many are drawn on the scene.
showPercentage.js - Displays the percentage of the slider on a UI.Text object.
The event information is taken from a JSON string that can be downloaded, or parsed directly from "ARGO":argo-microboone.fnal.gov. To parse it we use the SimpleJSON plug-in, which provides a JSONNode class that we use to read out the information. For our particular case, we drew the spacepoints with respect to the MicroBooNE origin (see below).
You should draw your tracks and spacepoints with respect to the exact same origin as MicroBooNE's, unless you are adapting this for another detector.
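With SimpleJSON the parsing boils down to JSON.Parse and indexing into the resulting JSONNode; a sketch of the pattern (the key names "spacepoints", "x", "y", "z" are illustrative, the actual Argo schema may differ):

```csharp
using UnityEngine;
using SimpleJSON;  // the plug-in used for parsing

public class EventParser : MonoBehaviour
{
    // Parses a JSON event string into spacepoint positions,
    // shifted so they sit relative to the MicroBooNE origin.
    public Vector3[] ParseSpacePoints(string json, Vector3 origin)
    {
        JSONNode root = JSON.Parse(json);
        JSONNode points = root["spacepoints"];     // illustrative key name
        Vector3[] result = new Vector3[points.Count];
        for (int i = 0; i < points.Count; i++)
        {
            result[i] = origin + new Vector3(
                points[i]["x"].AsFloat,
                points[i]["y"].AsFloat,
                points[i]["z"].AsFloat);
        }
        return result;
    }
}
```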
- Notice that there are two coordinate systems visible in the screenshot. If you are unsure which one to trust: the correct one is in the upper right-hand corner.
- The detector must be rotated by 270 degrees with respect to the y axis, otherwise the events appear outside the detector.
- The events and their information are parsed from ARGO; we are currently displaying the MCC6 data.
- The cached events are downloaded from ARGO and placed in the StreamingAssets folder, so that they survive as-is when the project is compiled and built.
- Some events are very large, so the parsing may take some time. You might consider running the parse function on a background thread or a coroutine, to avoid the catastrophic halt (the entire program freezes) while parsing is occurring.
- There is sometimes an enormous number of spacepoints, so a cap is placed on the total number allowed.
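One way to avoid the freeze without real threads is to spread the parse over frames with a coroutine; a sketch of that idea (helper and class names hypothetical, not an existing VENu script):

```csharp
using System.Collections;
using UnityEngine;

public class ChunkedParser : MonoBehaviour
{
    // Processes items in small batches, yielding between batches so the
    // frame loop keeps running while a large event is being parsed.
    public IEnumerator ParseInChunks(string[] items, int chunkSize)
    {
        for (int i = 0; i < items.Length; i++)
        {
            ProcessItem(items[i]);       // hypothetical per-item work
            if (i % chunkSize == 0)
                yield return null;       // give a frame back to Unity
        }
    }

    void ProcessItem(string item) { /* parse one spacepoint/track */ }
}
```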
parseEvent.js - Parses the event and then calls the spacepoint- and track-drawing functions.
drawSpacePoints.js - Draws the designated amount of spacepoints (determined from the pointDensitySlider). This is used with the particle dot prefab, which has a script to scale and face the camera.
drawTracks.js - Draws the particle tracks using the line renderer class.
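Track drawing with Unity's LineRenderer follows a standard pattern; a minimal sketch of that pattern (not the actual drawTracks.js, which is written in UnityScript):

```csharp
using UnityEngine;

public class TrackDrawer : MonoBehaviour
{
    public Material trackMaterial;  // material/colour used for the line

    // Creates one LineRenderer per track from its trajectory points.
    public void DrawTrack(Vector3[] trajectory)
    {
        var go = new GameObject("track");
        var line = go.AddComponent<LineRenderer>();
        line.material = trackMaterial;
        line.startWidth = 0.1f;
        line.endWidth = 0.1f;
        line.positionCount = trajectory.Length;  // one vertex per point
        line.SetPositions(trajectory);
    }
}
```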
Most of the scripts used for the AR mode can be found on the Vuforia developer portal.
ARDrawParticleTracks.js - Draws the tracks, similar to the way they are drawn in the Display mode.
Main Menu¶
Note: the following differences between the mobile and standalone main menu screens are accounted for automatically:
- The TPC is disabled on mobile, and instead replaced with a screen shot.
- To get the screenshot to look correct and undistorted, a second (orthographic) camera is overlaid on the main camera.
- There is an extra button for the AR mode on mobile.
Display¶
Note: the following are the differences between the mobile and standalone display scenes:
- The field cage is disabled on the TPC model in the mobile version.
- The feedthroughs (1&2) are disabled on the TPC model in the mobile version.
- There are additional control schemes in the mobile version, though they are automatically activated/deactivated by inGameMenuScript.cs.
Mobile Main Menu:
Standalone Main Menu:
For information on how to get the Vuforia extension hooked up with a Unity project, please refer to this page.
Once the extension is installed, it is fairly easy to set everything up.
Targets being used¶
Link to the GitHub repository. Once you have cloned it, check out the Development branch.
All development should happen within the Development branch; the master branch is strictly for releases.
To keep things simple, we stick to a consistent naming convention: script names contain no spaces, and the first word is always lowercase.
Scripts whose names start with an acronym are exempt from this rule.
Ex: ARIsAnAcronym.cs or UIIsAlsoAnAcronym.cs
Official build structure¶
First and foremost, the ordering of the scenes is very important so they are as follows:
0 - splashScreen.unity
1 - MainMenu.unity
2 - Display.unity
3 - AR.unity
0 - splashScreen.unity
1 - MainMenu.unity
2 - Display.unity
Getting useful data from Argo¶
We rely on the Argo event display to produce data in JSON format, which we feed into VENu. However, a typical LArSoft data file can be enormous, which makes the JSON string slow to load. To make this manageable, I wrote a LArSoft fcl file that drops the data products we don't use. This limits the data that goes into the JSON string and speeds up reading it in VENu. I have attached an example script that performs the dropping, venu_skimmer_uboone.fcl.
If the data is truncated, as was done for MCC7, the above skimmer won't work. Use venu_skimmer_uboone_truncated.fcl instead.
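The dropping itself is done with RootOutput's outputCommands; the general shape such a skimmer fcl takes is sketched below (the product pattern and file names here are illustrative, not the contents of the actual venu_skimmer files):

```
physics: {
  stream1:   [ out1 ]
  end_paths: [ stream1 ]
}

outputs: {
  out1: {
    module_type: RootOutput
    fileName: "%ifb_venuskim.root"
    outputCommands: [
      "keep *_*_*_*",                 # keep everything by default...
      "drop raw::RawDigits_*_*_*"     # ...then drop products VENu never reads (illustrative)
    ]
  }
}
```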
- Sam Zeller - Fermilab
- Tia Miceli - New Mexico State University
- Steve Pate - New Mexico State University
- Alistair McLean - New Mexico State University
- Thomas Wester - University of Chicago
- Owen Crawford - Bradley University
- Ben Carls - Fermilab
- Matt Bass - Oxford University
- Ariana Hackenburg - Yale University
- Gene Kim - Illinois Math and Science Academy
- Sean Ngo - Illinois Math and Science Academy
Redmine formatting documentation: [[https://cdcvs.fnal.gov/redmine/help/en/wiki_syntax_detailed.html]]