Icaros is a fitness game where you use your body to fly through virtual worlds.
Since the first prototype, Hyve had been busy refining the Icaros hardware. They also wanted to try mobile VR, to get rid of the wires and the PC that the old DK2 prototype required.
We decided to start with the Samsung Gear VR, with a view to expanding out to Zeiss/Google Cardboard if it went well.
Hyve had been invited to the WIRED 2015 Testlab, a great showcase for new technologies. This gave me seven weeks of development time, which was going to be tight.
The Icaros was conceived as a way to make exercising more fun and gaming more healthy. Michael & Johannes came up with a course design with sections to cater for differing abilities and to work different muscles: the canyon would tone your core muscles, a tunnel would take this further and push your upper body, and an open mountain landscape would give you a rest and be a good place for beginners to get used to the controls.
One of the difficulties in designing environments for VR is getting the sense of scale right. You can use real-world units, but a lack of small details or unfamiliar terrain can make your brain think things are bigger or smaller than intended.
While the Samsung Galaxy S6 is a pretty powerful phone, we were concerned about frame rate on mobile and originally thought that going for a low-poly art style would work best.
I built a workflow for a low-poly terrain and some Unity editor tools to go with it, but it became clear that with the scale of the world and detail needed it wasn’t going to work very well.
I’d used some Unity-based tools for the previous prototype, but needed something much better for this one and eventually settled on World Machine.
World Machine is a standalone tool that is very good at generating realistic-looking terrain. It takes a little while to get used to, but before long I was creating good-looking landscapes.
I tried various ways to transfer the World Machine landscape to Unity and in the end exported it as 16 mesh tiles in a 4x4 grid. Each tile had about 6000 triangles plus a 2k color map and 2k normal map to give it some detail.
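Placing exported tiles like this is simple to script. Here's a minimal Python sketch of how the 4x4 placement could work; the 512 m tile size and the `terrain_row_col` naming are assumptions for illustration, not the actual export settings:

```python
# Hypothetical sketch: compute world-space offsets for a 4x4 grid of
# terrain tiles exported from World Machine. Tile size is assumed.
TILE_SIZE = 512.0  # metres per tile (assumption)

def tile_positions(grid=4, tile_size=TILE_SIZE):
    """Return (name, x, z) placement for each tile, row-major,
    with tile (0, 0) at the world origin."""
    positions = []
    for row in range(grid):
        for col in range(grid):
            name = f"terrain_{row}_{col}"
            positions.append((name, col * tile_size, row * tile_size))
    return positions
```

In Unity each tile mesh would then sit under a common root object at its computed offset, so the 16 pieces line up seamlessly.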
The canyon section needed even more detail than this, so I re-exported it with more triangles and 4k textures. I also needed to make a hole in the canyon for the tunnel section, so I brought it into 3ds Max to do that and optimised the mesh as well.
I’d built a Unity Editor path tool to arrange course markers and expanded this to procedurally generate the tunnel section.
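Procedural tunnel generation of this kind boils down to sweeping a ring of vertices along the path. Here's a rough Python sketch of the idea; the radius, segment count, and fixed world-up reference are assumptions, and a real tool would also connect consecutive rings into triangles:

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def tunnel_rings(path, radius=4.0, segments=12):
    """For each point on the path, emit a ring of `segments` vertices
    in the plane perpendicular to the local path direction.
    Assumes the path is never exactly vertical (world-up reference)."""
    rings = []
    for i, p in enumerate(path):
        # Approximate the tangent from neighbouring path points.
        a = path[max(i - 1, 0)]
        b = path[min(i + 1, len(path) - 1)]
        tangent = normalize(tuple(b[j] - a[j] for j in range(3)))
        side = normalize(cross((0.0, 1.0, 0.0), tangent))
        up = cross(tangent, side)
        ring = []
        for s in range(segments):
            t = 2.0 * math.pi * s / segments
            ring.append(tuple(
                p[j] + radius * (math.cos(t) * side[j] + math.sin(t) * up[j])
                for j in range(3)))
        rings.append(ring)
    return rings
```

Stitching each ring to the next with two triangles per segment then gives the tunnel surface.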
Originally I just had a flat plane for water and wasn’t expecting to be able to have real-time reflections on a phone, but after a few tests I realised this was going to be possible and still run at 60fps.
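The usual trick for real-time planar reflections is to render the scene a second time from the camera's position mirrored through the water plane, into a texture that the water surface then samples. The mirroring itself is just a reflection across the plane; a minimal sketch, assuming a horizontal water plane at `water_height`:

```python
def reflect_point(p, water_height=0.0):
    """Mirror a point across the horizontal plane y = water_height.
    For a planar reflection, the reflection camera sits at the main
    camera's mirrored position, looking up through the water."""
    x, y, z = p
    return (x, 2.0 * water_height - y, z)
```

On mobile the cost comes from rendering the scene twice, which is why it was a surprise that this still hit 60fps.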
Optimising for mobile VR can be a frustrating process, as unexpected things have a surprisingly large or small impact. I had expected to need proxy collision meshes for the terrain, but adding Mesh Colliders directly to the tiles barely registered when profiling the app. It's also quite easy to accidentally leave something on that kills performance: at one point the tunnel section was being lit by two lights, which almost halved the frame rate.
I didn’t have time to bake the lighting properly and while the end result isn’t as good as I’d like, it was fine for the conference with one real-time light.
Pushing the performance is one thing, but with mobile VR you also need to pay attention to power consumption and heat. Through the Oculus SDK you can change the CPU and GPU clock speeds, turning them up to gain more performance and down to save battery. I experimented with various combinations, and setting the CPU to level 1 and the GPU to level 3 worked well. We were demoing all day, alternating between two phones every 45 minutes or so, and didn't have any issues with overheating.
It's incredibly helpful when demoing VR to be able to see what the player is seeing. I did this using a PC version of the same app, synchronised via wifi. Unity's new networking takes a little more setting up than before, but it's still quite straightforward, and with Unity 5.2 you also get local network discovery, so the PC app finds the Gear VR acting as a server on the same wifi network.
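Conceptually, a spectator view like this only needs the head pose streamed each frame. Here's a hypothetical sketch of such a wire format in Python; Unity's networking layer handles serialisation differently, and the seven-float layout is purely illustrative:

```python
import struct

# Hypothetical wire format for a per-frame head pose sent from the
# Gear VR (server) to the PC spectator app: position (x, y, z) plus
# an orientation quaternion (x, y, z, w), as seven little-endian floats.
POSE_FORMAT = "<7f"

def pack_pose(position, rotation):
    """Serialise a (position, quaternion) pair into 28 bytes."""
    return struct.pack(POSE_FORMAT, *position, *rotation)

def unpack_pose(data):
    """Deserialise the 28-byte payload back into (position, quaternion)."""
    values = struct.unpack(POSE_FORMAT, data)
    return values[:3], values[3:]
```

At 60fps this is under 2 KB/s, so even a busy conference wifi network copes easily.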
During development there were concerns about getting on and off the Icaros with the Gear VR on. I thought I'd try enabling the phone's camera so people could see through it, and this Augmented Reality mode does actually help. I've not yet managed to get the frame rate as good as the native Gear VR pass-through camera, but hopefully I'll manage that in a future version, or Oculus will provide it through their API.
Here you can see how it all works. Johannes gets onto the Icaros wearing the Gear VR, as he is able to see through the phone's camera. Once on, he switches to VR mode using the controller on the handlebars and starts flying. He shifts his body weight left and right, backwards and forwards to steer.
The feedback from people attending WIRED 2015 was incredibly positive. Icaros is now a separate company spun out from Hyve, and they are gathering early-stage funding to go into production and take the project forward. It's clear that virtual reality can make exercise more fun, and the team at Icaros have loads of ideas for games and experiences to add to the platform.