In October I tried out the Nvidia Flex particle engine in AR:
Got fluid physics in AR working 👍 pic.twitter.com/P0W0tsVZ8g
— Lee Vermeulen (@Alientrap) October 24, 2018
I’ve used the Nvidia Flex particle engine in VR before, but this new version only became possible with Nvidia’s latest Unity SDK. My 1080 GPU was able to maintain 45 FPS with 5000 particles, which is just barely enough for me to tolerate the ZedMini’s AR passthrough. Before the last zero-g shot in the test I made a cube map of my office and added an environment probe for reflections (something I hope will be generated automatically by future AR SDKs). The cube map was made just by taking six quick photos of my office with my phone in all directions.
The same Flex particle engine can also be used for soft body physics:
Trying out squishy/sticky softbody physics in AR. Could really use sound effects pic.twitter.com/fP5Qq2d9k5
— Lee Vermeulen (@Alientrap) October 29, 2018
To allow grabbing of the particles I added a new interaction system. When the VR controller’s trigger button is pressed, it checks for any particles within a certain distance of the controller, then on every frame after that sets their velocity to move toward the controller’s position. Directly setting particle positions on a soft body would completely wreck it.
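The actual project is Unity/C# on top of Flex, but the grab logic above can be sketched in a few lines of Python. The function names, grab radius, and pull speed here are my own illustrative assumptions, not the real implementation:

```python
import numpy as np

GRAB_RADIUS = 0.15  # metres around the controller (assumed value)
PULL_SPEED = 4.0    # how strongly grabbed particles chase the controller (assumed)

def begin_grab(positions, controller_pos):
    """On trigger press: record indices of particles near the controller."""
    dists = np.linalg.norm(positions - controller_pos, axis=1)
    return np.where(dists < GRAB_RADIUS)[0]

def update_grab(positions, velocities, grabbed, controller_pos):
    """Each frame while the trigger is held: steer grabbed particles toward
    the controller by setting their velocity, not their position, so the
    soft body's internal constraints stay intact."""
    to_controller = controller_pos - positions[grabbed]
    velocities[grabbed] = to_controller * PULL_SPEED
```

The key design choice is in `update_grab`: writing velocities lets the solver keep resolving the soft body's constraints each step, whereas teleporting positions would tear it apart.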
Another fluid test, this time using a much larger scale of the world/camera (to get higher particle density but keep it stable):
More zero-g AR fluid! Now with better physics pic.twitter.com/QiFMliTVZU
— Lee Vermeulen (@Alientrap) December 5, 2018
For this I added an accurate collision mesh of the Vive controller, and changed the controller’s physics to be velocity-based (rather than directly setting its position as well).
I then tried messing with the fluid viscosity and adhesion, giving it a slimy look:
turned my AR fluid into green slime for 90s Nickelodeon / flubber nostalgia pic.twitter.com/e8aL8e6wuh
— Lee Vermeulen (@Alientrap) December 6, 2018
Then I tried out what hand interaction would feel like with the LeapMotion:
Trying out what it's like to have giant floating AR hands with a @LeapMotion pic.twitter.com/dF2K7th2ty
— Lee Vermeulen (@Alientrap) February 1, 2019
I initially tried setting up the LeapMotion hands to just match my own, but found the tracking inaccuracy really frustrating – it wasn’t nearly precise enough to feel like I was in control. It felt a lot better to have them as giant floating hands – the inaccuracy then became more fun and challenging. Having them offset also meant no visible occlusion errors with my fingers.
I had a problem with the low-gravity fluid before where it would either float away or fall to the ground. So I decided to try out radial gravity on a little virtual planet:
AR Radial Gravity 👍 pic.twitter.com/oox4Xm70Dy
— Lee Vermeulen (@Alientrap) February 5, 2019
The last shot of this video was the first experiment, before any velocity damping was added. Making fluid orbits was fun, but without damping the chaos at the end just never stops.
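Radial gravity plus damping can be sketched as a simple per-frame velocity update. This is a minimal illustration, assuming explicit integration; the strength and damping constants are made up, not the values from the video:

```python
import numpy as np

G_STRENGTH = 3.0  # pull toward the planet's center (assumed value)
DAMPING = 0.5     # per-second velocity damping (assumed value)

def apply_radial_gravity(positions, velocities, center, dt):
    """Accelerate every particle toward `center`, then bleed off a little
    velocity so orbits settle instead of staying chaotic forever."""
    offset = center - positions
    dist = np.linalg.norm(offset, axis=1, keepdims=True)
    dirs = offset / np.maximum(dist, 1e-6)       # unit vectors toward center
    velocities += dirs * G_STRENGTH * dt         # radial pull
    velocities *= max(0.0, 1.0 - DAMPING * dt)   # damping tames the chaos
```

Without that last damping line, this is the first-experiment behavior: orbits form, but the energy never dissipates.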
Once I had radial gravity working, I set the gravity center to the VR controller’s position, then had the grip button turn it on/off:
Waterbending pic.twitter.com/7ffAaErFfZ
— Lee Vermeulen (@Alientrap) February 23, 2019
Using that same fluid-particle-moving code, I tried storing each fluid particle’s start position and moving it back toward that:
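The shape-memory idea is the same steering trick again: record a "home" position per particle, then set velocities toward home each frame, never touching positions directly. A hedged Python sketch, not the actual Flex/Unity code (the return speed is an assumption):

```python
import numpy as np

RETURN_SPEED = 2.0  # how quickly particles drift home (assumed value)

def remember_shape(positions):
    """Snapshot each particle's current position as its home."""
    return positions.copy()

def update_return(positions, velocities, home, strength=RETURN_SPEED):
    """Each frame: steer every particle back toward its stored home,
    proportional to how far away it has drifted."""
    velocities[:] = (home - positions) * strength
```

Because the pull is proportional to displacement, splashed-away particles race back while nearby ones creep in, which gives the T-1000-style reassembly look.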
Test with AR fluid that remembers it's shape. Very T-1000 inspired pic.twitter.com/85OCEi74dq
— Lee Vermeulen (@Alientrap) April 2, 2019
I then extended this to store not only the fluid’s start position, but also where each particle would have been on different meshes (the Nvidia Flex API includes voxelizing a mesh into individual particles):
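With per-particle target positions stored for each voxelized mesh, morphing just means blending each particle's target between two shapes before steering toward it. A rough sketch under the same assumptions as above (the blend function is mine, not part of the Flex API):

```python
import numpy as np

def morph_targets(shape_a, shape_b, t):
    """Blend per-particle target positions between two voxelized shapes.
    t=0 gives shape_a, t=1 gives shape_b; both arrays are (N, 3) and
    assume the same particle count/ordering for each shape."""
    return shape_a * (1.0 - t) + shape_b * t
```

The blended result is then fed to the same velocity-toward-target steering used for the shape-memory fluid.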
Some more AR fluid morphing 🕴️✋🐕 pic.twitter.com/VGCRI5WYvu
— Lee Vermeulen (@Alientrap) April 10, 2019