When I was experimenting with liquid physics in AR previously I had a lot of ideas for possible games, so I wanted to try it again and see what was now possible.
Back then I had used Nvidia Flex in Unity for liquid, which was pretty limited in how many liquid particles it could handle. Any game I made with it would have to use a very limited number of particles – so I was going in the direction of food cooking simulators rather than games with a lot of flowing water or pools.
Looking into liquid options in 2022 I found Zibra Liquids. Like Nvidia Flex it’s entirely GPU based, but it can handle around 5-10x more particles. Flex uses Smoothed Particle Hydrodynamics for liquid simulation, while Zibra uses the Material Point Method – though I’m not sure that’s really the reason for the difference. Zibra also has a much better raymarching based liquid renderer – though I had to use its mesh based renderer for VR.
I first tried it out on the TiltFive headset, using Mixcast to record. I used the wand controller’s trigger to create liquid and another button to suck it back in.
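Roughly how that input mapping looked – note the emitter / void names here are my own placeholders, a minimal sketch rather than the actual TiltFive or Zibra API:

```csharp
using UnityEngine;

// Hypothetical sketch: an "emitter" object spawns liquid particles and a
// "void" object deletes any it overlaps. Both follow the wand tip; the
// trigger and a second button toggle them.
public class WandLiquidControl : MonoBehaviour
{
    [SerializeField] GameObject emitter;  // creates liquid at the wand tip
    [SerializeField] GameObject suckVoid; // removes any liquid it overlaps

    // Called once per frame with the wand state read from the TiltFive SDK.
    public void UpdateWand(Pose wandTip, bool triggerHeld, bool suckHeld)
    {
        transform.SetPositionAndRotation(wandTip.position, wandTip.rotation);
        emitter.SetActive(triggerHeld); // hold trigger -> spray liquid
        suckVoid.SetActive(suckHeld);   // hold other button -> suck it back in
    }
}
```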
Because of the way TiltFive works (you only actually see the liquid when looking at the board, not the liquid floating above it), this Mixcast video isn’t really representative of the in-headset experience. My biggest problem with TiltFive, though, was the input. The wand just isn’t precise – it doesn’t seem to be able to track its depth away from the headset. Without good input I really don’t feel I can make interesting physics games. I think this was also the biggest problem with MagicLeap – I can accept optical AR headset limitations in terms of FOV (or in TiltFive’s case only rendering on the board; its FOV is massive), but without good input it’s impossible to make good games. All good VR games (like BeatSaber) completely depend on the precision of the controllers.
First liquid test I did with the Varjo XR-3 was just creating / moving around the liquid with my hands:
Hand tracking here is done with Varjo’s UltraLeap support. For this test I added some quick box liquid colliders to each finger of the hand – this was before the skinned mesh SDF collider Zibra added later on, which let me create an SDF of my hand mesh and use that as the liquid collider.
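Something like this minimal sketch, assuming the UltraLeap rig exposes the finger bone transforms (registering each box with Zibra’s sim is only hinted at in a comment, since its collider API may differ):

```csharp
using UnityEngine;

// Parent a thin, invisible box to each tracked finger bone so the liquid
// has something to collide with.
public class FingerColliderRig : MonoBehaviour
{
    [SerializeField] Transform[] fingerBones; // from the UltraLeap hand rig
    [SerializeField] Vector3 boxSize = new Vector3(0.015f, 0.015f, 0.03f); // ~one finger segment

    void Start()
    {
        foreach (var bone in fingerBones)
        {
            var box = GameObject.CreatePrimitive(PrimitiveType.Cube);
            box.transform.SetParent(bone, false); // follows the tracked hand
            box.transform.localScale = boxSize;
            box.GetComponent<MeshRenderer>().enabled = false; // collider only, don't draw it
            // Add Zibra's analytic box collider component to `box` here
            // and register it with the liquid instance.
        }
    }
}
```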
Here is the same ‘pinch to spray’ but now room scale. I manually created the office collision mesh for this since the headset’s mesh generation was inaccurate:
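The pinch detection itself is simple – a sketch of the idea, assuming fingertip transforms from the hand tracking (the emitter object is a stand-in):

```csharp
using UnityEngine;

// When thumb tip and index tip get within a couple of centimeters, treat
// it as a pinch: enable an emitter at the pinch point, aimed along the
// index finger.
public class PinchSpray : MonoBehaviour
{
    [SerializeField] Transform thumbTip;
    [SerializeField] Transform indexTip;
    [SerializeField] GameObject emitter;          // stand-in for the liquid emitter
    [SerializeField] float pinchDistance = 0.02f; // ~2cm between fingertips

    void Update()
    {
        bool pinching = Vector3.Distance(thumbTip.position, indexTip.position) < pinchDistance;
        emitter.SetActive(pinching);
        if (pinching)
        {
            // Spray from between the fingertips, along the index direction.
            emitter.transform.position = (thumbTip.position + indexTip.position) * 0.5f;
            emitter.transform.rotation = Quaternion.LookRotation(indexTip.forward);
        }
    }
}
```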
Same room scale, but now with radial gravity so it’s a little less messy:
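The radial gravity math is simple – a CPU-side sketch of the per-particle idea (the actual sim applies the equivalent on the GPU):

```csharp
using UnityEngine;

// Radial gravity: replace the constant (0, -9.81, 0) acceleration with a
// pull toward a fixed center point.
public static class RadialGravity
{
    public static Vector3 ApplyStep(Vector3 velocity, Vector3 particlePos,
                                    Vector3 center, float strength, float dt)
    {
        Vector3 toCenter = (center - particlePos).normalized;
        return velocity + toCenter * strength * dt; // accelerate toward the center
    }
}
```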
A lot of the possible game ideas I wanted to try out involved ‘sculpting’ the liquid into place. I was imagining making a building game with the liquid – mostly it was just fun to move it around with my hands.
You can also now see a bit of my office reflected in the water here. This was made by just taking a quick photosphere of my office with my phone, then putting it into Unity as a skybox:
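If you want to do the same, it’s just a panoramic material assigned as the scene skybox – a minimal sketch using Unity’s built-in Skybox/Panoramic shader:

```csharp
using UnityEngine;

// Build a material from the equirectangular phone photo and assign it as
// the skybox, which the liquid's reflections then pick up.
public class OfficeSkybox : MonoBehaviour
{
    [SerializeField] Texture2D photosphere; // equirectangular capture from the phone

    void Start()
    {
        var mat = new Material(Shader.Find("Skybox/Panoramic"));
        mat.SetTexture("_MainTex", photosphere);
        RenderSettings.skybox = mat;
        DynamicGI.UpdateEnvironment(); // refresh ambient / reflection data
    }
}
```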
The problem with this is that the zero-G liquid would just get everywhere – there was no ‘damping’ on the liquid or air friction. Since I couldn’t figure out how to add that yet, I had the idea to sculpt by just removing whatever liquid my hand SDF mesh touched:
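Conceptually the deletion rule is just “inside the hand SDF → remove”. A CPU-style sketch (the hand SDF query is a stand-in; the real version runs inside the GPU sim, not a C# loop):

```csharp
using System;
using System.Collections.Generic;
using UnityEngine;

// A particle whose signed distance to the hand surface is <= 0 is inside
// the hand, so it gets removed from the simulation.
public static class SdfSculpt
{
    public static void RemoveTouchedParticles(List<Vector3> particles,
                                              Func<Vector3, float> sampleHandSDF)
    {
        particles.RemoveAll(p => sampleHandSDF(p) <= 0f);
    }
}
```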
I was then able to figure out how to add velocity damping with a custom compute shader. Each liquid particle has its velocity multiplied by 0.95 each frame.
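The dispatch side looks roughly like this – assuming access to the sim’s per-particle velocity buffer, which isn’t something Zibra officially exposes (the kernel itself is the one-liner in the comment):

```csharp
using UnityEngine;

// Dispatches the damping pass over the particle velocity buffer. The
// kernel body is a single line of HLSL:
//   velocities[id.x] *= 0.95;  // per frame, so it's frame-rate dependent
public class VelocityDamping : MonoBehaviour
{
    [SerializeField] ComputeShader dampShader; // asset containing the "Damp" kernel
    public ComputeBuffer velocityBuffer;       // one float3 per liquid particle
    public int particleCount;

    void LateUpdate()
    {
        int kernel = dampShader.FindKernel("Damp");
        dampShader.SetBuffer(kernel, "velocities", velocityBuffer);
        // 64 threads per group; round up so every particle is covered.
        dampShader.Dispatch(kernel, (particleCount + 63) / 64, 1, 1);
    }
}
```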
Then lastly I wanted to check out another feature of Zibra’s effects package, Smoke and Fire. They gave me an early package of it with VR support so I could test it out with hand tracking:
More smoke and fire tests:
The ring is an inverted donut SDF collider. The dragon also used an inverted collider to keep the smoke in place (it’s removed a split second before my hand touches the smoke).
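The inversion trick is just flipping the SDF’s sign, which swaps inside and outside so the collider holds particles in instead of pushing them out. A sketch with the standard donut (torus) distance function:

```csharp
using UnityEngine;

// Standard torus SDF: R = ring radius, r = tube radius. Negating it
// turns the solid donut into a donut-shaped container.
public static class TorusSdf
{
    public static float Torus(Vector3 p, float R, float r)
    {
        var q = new Vector2(new Vector2(p.x, p.z).magnitude - R, p.y);
        return q.magnitude - r; // negative inside the tube
    }

    public static float InvertedTorus(Vector3 p, float R, float r)
        => -Torus(p, R, r); // negative *outside* the tube -> traps smoke within
}
```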
Next post I’ll go over some of the other Varjo XR-3 SDK features, like eye tracking. After that, a post on the liquid physics puzzle game I finally landed on.