When I made a zero-G liquid prototype I made a quick cubemap of my office and used that for the liquid reflections. I wanted to see how I could expand on that with better, more up-to-date reflections – ideally being able to roughly see myself in the reflection.
The idea I had then was to update the reflection every frame, and include the tracked versions of my body and hands:
The hands shown in the reflection are the Ultraleap hand models, and the humanoid is the ‘Pilot’ model from the FinalIK Unity package, set up with VRIK to roughly match my stance (based on just my head + hand positions).
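Roughly, the setup looks something like this in Unity – a disabled camera re-renders a small cubemap from the headset position each frame and feeds it to the liquid material. The field names and the `_ReflectionCube` shader property below are just placeholders, not the actual project's:

```csharp
using UnityEngine;

// Sketch: re-render a small cubemap from the headset position every frame and
// feed it to the liquid material, so the tracked hands/body show up in reflections.
// "_ReflectionCube" is a hypothetical shader property name.
public class LiveReflectionCubemap : MonoBehaviour
{
    public Camera reflectionCamera;   // disabled camera used only for cubemap capture
    public Transform headTransform;   // tracked HMD pose
    public Material liquidMaterial;   // liquid shader sampling a cubemap for reflections
    public int faceSize = 128;        // keep small; six faces are rendered per update

    RenderTexture cubemap;

    void Start()
    {
        cubemap = new RenderTexture(faceSize, faceSize, 16);
        cubemap.dimension = UnityEngine.Rendering.TextureDimension.Cube;
        reflectionCamera.enabled = false; // rendered manually below
    }

    void LateUpdate()
    {
        // Capture from roughly the viewer's position so the reflected
        // hands/body line up with what the user expects to see.
        reflectionCamera.transform.position = headTransform.position;
        reflectionCamera.RenderToCubemap(cubemap, 63); // 63 = all six faces
        liquidMaterial.SetTexture("_ReflectionCube", cubemap);
    }
}
```

Rendering all six faces every frame is the expensive part – the face mask argument lets you spread the faces over multiple frames if it gets too heavy.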
I had done a test using mirrors in AR back in 2018, and wanted to see how I could incorporate the idea into the liquid physics game. The trick is that the AR headset doesn’t treat it as a mirror – to it, the mirror is just a window into more open space, so you can place AR content inside that space.
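Under the hood that mostly comes down to reflecting poses across the mirror plane. Something like this, assuming the mirror quad’s forward vector is the plane normal:

```csharp
using UnityEngine;

// Sketch: reflect a world-space pose across a mirror plane so a duplicate can be
// placed "inside" the mirror space. The mirror's forward vector is assumed to be
// the plane normal, with the plane passing through the mirror's position.
public static class MirrorUtil
{
    public static Vector3 ReflectPoint(Vector3 point, Transform mirror)
    {
        Vector3 n = mirror.forward;
        float dist = Vector3.Dot(point - mirror.position, n);
        return point - 2f * dist * n;
    }

    public static Quaternion ReflectRotation(Quaternion rotation, Transform mirror)
    {
        // Reflect the forward and up axes, then rebuild the rotation.
        Vector3 n = mirror.forward;
        Vector3 fwd = Vector3.Reflect(rotation * Vector3.forward, n);
        Vector3 up = Vector3.Reflect(rotation * Vector3.up, n);
        return Quaternion.LookRotation(fwd, up);
    }
}
```

One caveat: a true mirror image flips handedness, so visually mirrored meshes also need a flipped scale on one axis – but for placing content and colliders, reflecting the position and rotation like this is usually close enough.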
I noticed in the above video how the mirrored version of my hands passed straight through the liquid. So for another test I added physics simulation for my mirrored hands as well, so I could push liquid back and forth from the mirror world:
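A rough sketch of how the mirrored hands could drive the physics – a kinematic collider per hand (or per bone), moved to the mirrored pose each physics step, reusing the helpers from the previous snippet. The actual liquid sim may interact with colliders differently; this only shows the mirrored-pose part:

```csharp
using UnityEngine;

// Sketch: drive a kinematic copy of a hand collider at the mirrored pose, so the
// "mirror hands" can push the liquid just like the real tracked hands do.
[RequireComponent(typeof(Rigidbody))]
public class MirroredHandCollider : MonoBehaviour
{
    public Transform trackedHand;   // e.g. an Ultraleap palm or finger bone transform
    public Transform mirror;        // the mirror plane transform
    Rigidbody body;

    void Awake()
    {
        body = GetComponent<Rigidbody>();
        body.isKinematic = true;    // moved explicitly, but still pushes dynamic objects
    }

    void FixedUpdate()
    {
        body.MovePosition(MirrorUtil.ReflectPoint(trackedHand.position, mirror));
        body.MoveRotation(MirrorUtil.ReflectRotation(trackedHand.rotation, mirror));
    }
}
```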
In August a paper was released introducing Gaussian Splatting – a way to render captured real-life environments and objects in real time. Pretty soon after that, Aras Pranckevičius (one of the very first Unity engine devs) made a Unity plugin that added Gaussian Splat rendering (and later editing).
The immediate thing I wanted to try this for was better reflections and portals – basically replacing my skybox office capture with an actual 3D Gaussian Splat version of my office.
I was able to get the Gaussian Splats to render in AR, and did a quick test using some of the splat samples I could find online (it was actually quite difficult at this point to find any splat samples that were single objects rather than large environments, and no editing tools existed yet).
Next I made a clone of myself:
To capture it I used Polycam, which had recently added Gaussian Splat creation – but I found the end result to be way lower quality than a lot of the samples I had found online. So I followed some online instructions on how to train Gaussian Splats yourself, and after the usual Python dependency hell that comes with doing anything AI-related, I was able to train this captured version of myself. I then used the newly added editing abilities of the GS Unity package to remove the background and floating bits.
As a fun test, when I was in Vienna for my talk on AR prototypes I went to a wax museum, and used PolyCam to capture some wax statues:
The end goal for trying Gaussian Splats, though, was still reflections and portals. So my first test in this direction was capturing my office, then making a ‘virtual mirror’ of it:
In the mirror you can also see my tracked hands, and the terribly tracked GS splat clone of myself (just following my tracked headset position).
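That “just following my tracked headset position” part really is as simple as it sounds – something along these lines, with the height offset being a made-up placeholder:

```csharp
using UnityEngine;

// Sketch: the splat clone just follows the tracked headset position on the floor
// and faces the same way, which is why it only roughly matches the real body.
public class FollowHeadset : MonoBehaviour
{
    public Transform head;                 // tracked HMD pose
    public float headHeightOffset = 1.6f;  // hypothetical: head height above the capture's origin

    void LateUpdate()
    {
        // Keep the clone's feet on the floor, directly under the headset.
        Vector3 pos = head.position;
        pos.y -= headHeightOffset;
        transform.position = pos;

        // Face the same horizontal direction as the headset.
        Vector3 fwd = head.forward;
        fwd.y = 0f;
        if (fwd.sqrMagnitude > 0.001f)
            transform.rotation = Quaternion.LookRotation(fwd.normalized, Vector3.up);
    }
}
```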
Unfortunately for this test, the performance of the office rendering made the whole thing look glitchy. GS splat rendering is already incredibly fast for what it is – but even so, I always knew it would be impractical to use it just for reflections/portals in AR. Hopefully I can get it working better so this test feels more seamless.