I was lucky enough to be one of the developers to receive an HTC Vive kit from Valve (after some begging of my only Valve contact).
I tried it for the first time at PAX 2015. The previous version, which I tried at Steam Dev Days and wrote about here, had a completely different setup and no input system. Without input I couldn't see it working well for games, but it definitely made me realize the potential of room-scale VR at the time.
For this new demo I tried five games, some of which were disappointing in how little they used the Vive's amazing input system to its full extent (I imagine since they were started/designed before this input was even possible) – so Tilt Brush was the application that really made me see the Vive's full potential. I felt more in control of the painting in Tilt Brush than I ever have with any input device – better precision than even a mouse.
As soon as I left the demo I knew -exactly- the game I wanted to make, and spent the rest of my trip thinking about it and planning it. I've always enjoyed physics building games, so this idea is just an extension of that – but I think with the Vive's input it can really be taken to a new level. I now see it as a general holodeck builder in a sense – you build an environment/game and then share it with others.
I previously had no interest in doing VR work, for a variety of reasons – mostly the lack of a good input system. I felt that if all I could do was cinematic work, then the best games for the platform would come from large teams of artists – basically high-budget, film-like experiences. Without a good input system I couldn't imagine a systems-based design, so making a good product with limited resources didn't seem possible.
I also had no interest in having another shitty dev kit experience like I've had with all Oculus hardware. I barely even tried to develop anything with Oculus – mostly just running demos – and it was a horrible setup experience. Having to drag a window to another monitor while looking into the Oculus headset is something I never want to do again. Their old Unity integration also made quickly iterating on game ideas incredibly painful – usually having to build the project and then test it. Sudden frame rate drops and crashes (both of which are common when developing) also meant constant nausea.
So not only did the Vive solve my problems with input, its integration with Unity is fantastic. I can test things in the editor itself, and even adjust the environment in scene view and hot-swap code while the Vive is still running. And when there was a frame rate drop or the application crashed, it wouldn't dump me back to the desktop (which is painful on the eyes), but rather into a blank VR room.
Before I started on my dream physics VR Vive game, though, I wanted to quickly try a project to get a feel for the development kit. So within a few days I made this Godzilla-sorta simulator using the game assets of another in-development Alientrap game:
It took a bit to get the scale right – at first I was imagining the player as a much bigger monster, towering above everything, but I realized it was a lot more fun to be about the same height as the buildings you were destroying.
Instead of monster hands, or a hammer to destroy the city, I decided to add a morning star type weapon. This was because there is no way to give tactile feedback to the player – if they have a hammer in their hand and smash a building with it, it will go right through with no physical feedback. It was important that the environment feel tangible for destruction to really mean something. With a morning star, the player would have to wind up to get a large force, and would have the feedback of the buildings stopping the morning star – so the world would still feel like it took effort to destroy.
Hopefully I'll be able to show off our Vive game soon – I'm kind of just waiting to see the best way to announce it. I should probably just give up on the idea of a big announcement and be entirely open with development – but we'll see.