Along with working on a Magic Leap version of Modbox, the majority of my time lately has been spent developing Modbox’s visual scripting system. I’ve rewritten it multiple times now, but it’s finally at the point where it’s powerful enough while still being intuitive for non-programmers. An old version of the scripting system (which I’ve been calling MBScript) is shown in this test of an AR virtual assistant:
Demo of programming an AR assistant to switch off my lights with Modbox's visual scripting. Has sound! AI uses speech APIs to respond/listen pic.twitter.com/ZhfKezeqcv
— Lee Vermeulen (@Alientrap) February 12, 2019
In this demo, a blank Humanoid is added, then the edit tool is used to give it a speech-to-text ‘Listen’ component and a text-to-speech ‘Talk’ component. A script is added showing MBScript code: if the player ‘greets’ the Humanoid (says its name), the AI speaks a line (‘hello’ plus the player’s name) and listens for that player’s next command. The other event in the MBScript is a ‘Heard’ event: if the AI hears a line with the word ‘Light’ in it, it runs the ‘Turn Off’ command on the ‘Lever’ entity (a virtual switch hooked up to my real lights through Philips Hue).
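If you can’t make out the video, the script boils down to two event handlers. Here’s a rough Python rendering of the same logic; the class and method names are hypothetical stand-ins for the actual MBScript components, not a real Modbox API:

```python
# Rough Python sketch of the two MBScript events from the video. The class
# and method names are hypothetical stand-ins, not the real Modbox API.

class Assistant:
    def talk(self, text):          # text-to-speech 'Talk' component
        print(f"[TTS] {text}")
    def listen(self, player):      # speech-to-text 'Listen' component
        print(f"[STT] listening for {player}'s next command")

class Lever:
    def run_command(self, cmd):    # virtual switch wired to Philips Hue
        print(f"[Lever] {cmd}")

assistant, lever = Assistant(), Lever()

def on_greeted(player_name):
    # Fires when a player says the Humanoid's name
    assistant.talk(f"Hello {player_name}")
    assistant.listen(player_name)

def on_heard(player_name, line):
    # Fires when the 'Listen' component transcribes a line
    if "light" in line.lower():
        lever.run_command("Turn Off")

on_greeted("Lee")
on_heard("Lee", "turn the light off")
```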
All of this was just a test to develop MBScript, though; the AI focus for Modbox will be on things like deathmatch bots and game NPCs. Still, the way I’d like to do this now is with a natural language processing system: extract the main object of the sentence and check against that, rather than matching raw keywords. Check whether the subject is ‘Light’, figure out which verb describes the action (off/on), and automatically handle slight variations on those words.
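As a sketch of what that extraction could look like (spaCy is just one off-the-shelf option here, not anything Modbox actually uses):

```python
# Sketch: pull the object and action out of a voice command with spaCy
# (one off-the-shelf NLP option; needs `python -m spacy download en_core_web_sm`).
import spacy

nlp = spacy.load("en_core_web_sm")

def parse_command(text):
    doc = nlp(text)
    # The thing being acted on: direct object (or subject, for statements)
    obj = next((t.lemma_ for t in doc if t.dep_ in ("dobj", "nsubj")), None)
    # Particle verbs like "turn off" carry the action in the particle
    action = next((t.lemma_ for t in doc if t.dep_ == "prt"), None)
    verb = next((t.lemma_ for t in doc if t.pos_ == "VERB"), None)
    return obj, action or verb

print(parse_command("Turn off the light"))    # -> ('light', 'off')
print(parse_command("Switch the lights on"))  # -> ('light', 'on')
```

Lemmatization already absorbs simple variations like ‘lights’ vs ‘light’; true synonyms would need a word-similarity check on top.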
If someone wanted to take the ‘AR virtual assistant’ idea further, I’d really want to see an AI character feel like they actually have a presence in my house. Maybe even by faking their existence while you’re offline: you put on the headset, look around the house, and find them in the kitchen eating, as if they had been there the whole time. They would need a real sense of the actual environment, rather than just showing up on a flat plane, Pokémon Go style.
World scaling
Concept for AR editing with BingMaps API in Modbox. Can zoom out and pan with controllers – world map is then shown to place a building size sign across the street from my condo pic.twitter.com/fpLjVyknx6
— Lee Vermeulen (@Alientrap) April 15, 2019
Previously I tried letting the player scale themselves for easier AR editing: using the controllers you could pan around the room and scale yourself up, so the entire room was editable within arm’s reach. I wanted to keep going with this idea, but rather than using a scanned version of the environment, use GPS location data to show the actual neighborhood at scale. With my current headset I can’t really run around my entire neighborhood, but I wanted to show the game possibilities of letting an AR player edit large environments like this, using Google/Bing maps data to give them a preview.
This uses the Bing Maps API to get a 3D model of my neighborhood (fetched at runtime given a GPS location), which I then place into the virtual world at 1:1 scale. The model becomes visible after I use the controllers to scale up (by the end I am about 100x normal scale), so I can easily place giant things I wouldn’t otherwise be able to reach.
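The 1:1 placement is mostly a coordinate-conversion problem: the model’s GPS origin has to be mapped into local meters around the player. A minimal sketch of that conversion, using an equirectangular approximation that is accurate enough at neighborhood scale (the function is my own framing, not part of the Bing Maps API):

```python
# Sketch: convert a GPS coordinate into local meters relative to the player,
# so a neighborhood model can be dropped into the scene at 1:1 scale.
# Equirectangular approximation; fine over a few kilometers.
import math

EARTH_RADIUS_M = 6_371_000

def gps_to_local_meters(lat, lon, origin_lat, origin_lon):
    d_lat = math.radians(lat - origin_lat)
    d_lon = math.radians(lon - origin_lon)
    x = EARTH_RADIUS_M * d_lon * math.cos(math.radians(origin_lat))  # east
    z = EARTH_RADIUS_M * d_lat                                       # north
    return x, z

# A point ~0.001 degrees north-east of the headset's GPS origin:
print(gps_to_local_meters(43.652, -79.382, 43.651, -79.383))
# -> roughly (80.4, 111.2) meters
```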
It’s hard to tell the scale of the sign in this video, but it ends up around 20 meters wide. What I wanted to do was place a giant, Godzilla-sized monster and have it clearly read as the size of the building. Unfortunately, due to the glass window and my condo railing, I couldn’t get the occlusion looking right, so I settled for a sign. Really I should have tried this in a park, but taking my setup outside my condo is way too much effort.
I can imagine being in the park with friends in AR headsets, scaling up in edit mode to place enemies/obstacles for a quick game; showing a 3D map like this would be absolutely necessary there.
Mirrors
AR let's you see into the MIRROR WORLD pic.twitter.com/9aPCLdQwoI
— Lee Vermeulen (@Alientrap) February 17, 2019
A Wired article came out about AR being the ‘Mirrorworld’, which gave me the idea to try this. It required no extra coding or setup in Modbox; all I had to do was move my mirror into the office. The great thing about how AR headsets map space is that they have no concept of a mirror: to the spatial meshing, it is exactly the same as a window into another room.
In some ways this is a bug with current AR headsets (on the Magic Leap you’re advised not to have large mirrors in the room when scanning), but I think the concept could be really useful for everyday AR. I can imagine looking at a bathroom mirror and using all of that additional space for information and virtual items. It’s a way to add virtual space where there isn’t any, while still staying entirely anchored to the real world (rather than an entirely virtual window).
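If you wanted to actually place virtual items ‘inside’ the mirror, the geometry is just a reflection across the mirror plane. A minimal sketch of that mapping, assuming you already know the plane’s position and normal (nothing here comes from a headset SDK):

```python
# Sketch: reflect a real-world point across a mirror plane, to map positions
# "inside the mirror" back into the real room (or vice versa).
import numpy as np

def reflect_point(point, plane_point, plane_normal):
    n = plane_normal / np.linalg.norm(plane_normal)
    # Signed distance from the point to the mirror plane
    d = np.dot(point - plane_point, n)
    return point - 2 * d * n

# Mirror on the x=0 wall, facing +x:
p = np.array([1.5, 1.0, 0.5])
print(reflect_point(p, np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0])))
# -> [-1.5  1.   0.5]
```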
Drones
My $100 Tello drone has a Unity API! Was able to control it with Knuckles/Index controllers in AR passthrough pic.twitter.com/sjpxPQsYVn
— Lee Vermeulen (@Alientrap) June 12, 2019
I had wanted to buy a cheap drone to experiment with for a while, but only did so after I saw there was a pretty simple Unity API for the $100 Tello drone.
The original concept was using the VR controllers to give the drone a path: trace a complex route with the controller, then have the drone follow it exactly and land. When I tried this, I found I couldn’t reliably track where the drone was in virtual space; its simulated position and real position would mismatch after a second of flying. I thought about ways to track its position in flight, but nothing seemed easy (attaching a Vive tracker would mean dealing with vibration messing up SteamVR tracking, and image tracking with QR codes was just too much additional work).
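For context, the path-following idea reduces to replaying the traced controller positions as relative move commands. A minimal sketch of that loop, written here with the community djitellopy Python library rather than the Unity API from the video, and with the drift problem left unsolved since the Tello only dead-reckons:

```python
# Sketch: fly a Tello along a pre-traced path as a series of relative moves.
# Uses the community djitellopy library; the video used a Unity API instead.
# The Tello only dead-reckons, so the real and simulated positions drift.
from djitellopy import Tello

# Path traced with the VR controller, converted to centimeters relative
# to the takeoff point (hypothetical sample data).
waypoints_cm = [(0, 0, 150), (100, 0, 150), (100, 100, 200)]

tello = Tello()
tello.connect()
tello.takeoff()

prev = (0, 0, 100)  # assume roughly a 1 m hover after takeoff
for wp in waypoints_cm:
    dx, dy, dz = (wp[i] - prev[i] for i in range(3))
    tello.go_xyz_speed(dx, dy, dz, 30)  # relative move at 30 cm/s
    prev = wp

tello.land()
```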
Really, the main thing this was testing was how a drone interface could work in AR. It’s common to control drones in a first-person view with the camera stream completely taking over your vision, but I wanted to still see the drone in the real world as I flew it.
Instead of a camera display on the controller, I tried something more like an Iron Man-style transparent heads-up display that followed my head (sketched after the clip below):
Trying out some more ideas for controlling a drone in AR. Single knuckles VR controller for motion controls and a transparent HUD pic.twitter.com/iL9ec7r2Tw
— Lee Vermeulen (@Alientrap) August 2, 2019
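A head-following HUD like that is typically just a damped chase of the head pose. A quick sketch of the per-frame update, in plain vector math rather than any particular SDK:

```python
# Sketch: per-frame update for a HUD panel that lazily follows the head
# by easing toward a point fixed in front of the gaze. Plain vector math,
# not tied to any particular headset SDK.
import numpy as np

def update_hud(hud_pos, head_pos, head_forward, distance=0.6, smoothing=0.1):
    # Target sits 0.6 m in front of the eyes along the gaze direction
    target = head_pos + head_forward * distance
    # Exponential smoothing: the HUD trails the head instead of sticking to it
    return hud_pos + (target - hud_pos) * smoothing

hud = np.zeros(3)
head_pos = np.array([0.0, 1.6, 0.0])  # standing eye height
head_fwd = np.array([0.0, 0.0, 1.0])  # looking straight ahead
for _ in range(60):                   # one second at 60 fps
    hud = update_hud(hud, head_pos, head_fwd)
print(hud)  # converging toward [0, 1.6, 0.6]
```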
I also tried controlling the drone with the VR controller’s motion controls. This was better, but due to latency it still didn’t feel natural. After testing it a few times, almost crashing the thing into my head, and freaking out my cats, I gave up and realized I hate flying drones.
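For completeness, the motion-control mapping itself was simple: controller tilt becomes the drone’s virtual RC sticks. A sketch of that mapping with djitellopy’s send_rc_control (the pitch/roll values here are hypothetical; in the video they came from the Index controller via SteamVR):

```python
# Sketch: map controller tilt onto the Tello's virtual RC sticks using
# djitellopy's send_rc_control (each axis takes -100..100). The pitch/roll
# values are hypothetical; in the video they came from a SteamVR controller.
from djitellopy import Tello

def tilt_to_rc(pitch_deg, roll_deg, max_tilt=30.0):
    # Scale tilt linearly into the stick range and clamp
    scale = 100.0 / max_tilt
    clamp = lambda v: max(-100, min(100, int(v * scale)))
    return clamp(roll_deg), clamp(pitch_deg)  # left/right, forward/back

tello = Tello()
tello.connect()
tello.takeoff()

# One control tick; in practice this runs every frame with live pose data
lr, fb = tilt_to_rc(pitch_deg=10.0, roll_deg=-5.0)
tello.send_rc_control(lr, fb, 0, 0)  # no vertical or yaw input
```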