For the past few months I’ve been working with Masters of Pie as lead developer on a VR experience for their client Siemens.
This experience was run at the Euroblech conference in Hanover from 21st to 25th October 2014.
Siemens wanted an immersive way to explain the benefits of their solutions in the manufacturing industry. Participants put on an Oculus DK2 headset and headphones and sat on a chair with a ButtKicker attached. Initially they saw a factory with a sheet metal press running in it. The realistic-looking scene running at 75fps, the audio and the vibrations from the ButtKicker all contributed to the immersion and gave visitors a sense of being in a real factory, away from the trade-show floor.
To demonstrate some of the special features of Siemens’ products, Masters of Pie came up with the concept of ‘data-mode’. This would give the visitor layers of contextual information depending on where they looked, and show the effect a Siemens product would have.
Initially we experimented with novel natural user interfaces, but in the end we opted for mouse control. While I’m keen to move beyond existing controllers for VR, in this case the mouse actually worked well. It also removed the need for users to learn a new input device in the short time available at the trade show.
When users right-clicked the mouse, an interface appeared. With this user interface they could change various parameters, such as the speed and motion profile of the press and the level of energy management, and move to certain locations on a map.
Looking at certain objects also let users zoom in on them. Moving a user in VR like this can cause motion sickness if not managed well. To deal with this, we used a very flat easing curve to keep acceleration to a minimum and made sure the movement speed wasn’t too high either.
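The actual Unity code isn’t shown here, but the idea can be sketched in a few lines. This is an illustrative Python version (the function names and the choice of smoothstep as the easing curve are my assumptions, not the project’s actual code): an S-shaped curve with zero velocity at both endpoints, so the camera never starts or stops abruptly, and a long duration to keep the peak speed down.

```python
def smoothstep(t):
    """Illustrative easing curve (3t^2 - 2t^3): velocity is zero at
    both endpoints, so motion ramps up and down gently."""
    t = max(0.0, min(1.0, t))
    return t * t * (3.0 - 2.0 * t)

def eased_position(start, end, elapsed, duration):
    """Interpolate a 1D position along the eased curve. A longer
    duration flattens the curve and lowers peak speed, which is what
    keeps the zoom comfortable in VR."""
    t = smoothstep(elapsed / duration)
    return start + (end - start) * t
```

Halfway through the move the user is exactly halfway there (`eased_position(0, 10, 2, 4)` gives `5.0`), but the first and last moments of the move are much slower than the middle.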
While users could control everything themselves, we also added keyboard controls so that Siemens staff at the trade show could switch modes and trigger animations if they needed to.
At the trade show, Siemens wanted other people at the stand to see what the user was seeing and doing. While the current Oculus SDK lets you mirror to a second screen, what you see is a heavily distorted image designed to be resolved back by the lenses in the Oculus headset. We all wanted the second screen to show an undistorted image, so I added the ability for a networked app on a second, separate PC to show the same view as the user.
We built the app with Unity, whose built-in networking support made getting the camera view in sync quite trivial. More complex, however, was the state of the virtual world, the animations and in particular the user interface. I added the necessary networking calls so that every screen state, button rollover animation and the mouse position stayed in sync between the two machines. Since there was just a single network cable between the machines, there was hardly any lag, and you could easily use the UI while looking at the second screen.
Pretty much everything in data-mode was either semi-transparent or needed to be faded in or out at some point. This created some z-ordering challenges, which I resolved with custom shaders and scripts that forced a few objects to render in front of or behind others.
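The forced ordering can be pictured as a priority override on top of the usual back-to-front sort for transparent objects. This is a simplified Python sketch of the principle, not the actual shader or script code (the `queue` field is my stand-in for something like a render-queue tweak):

```python
def render_order(objects, camera_z):
    """Sort transparent objects back-to-front by distance from the
    camera, but let an explicit per-object 'queue' override push an
    object to render earlier or later regardless of depth.
    Objects are dicts with a 'z' position and an optional 'queue'."""
    # Default queue 0; a higher queue renders later, so it appears
    # on top even if it is physically further away.
    return sorted(
        objects,
        key=lambda o: (o.get("queue", 0), -abs(o["z"] - camera_z)),
    )
```

With everything at queue 0 this is an ordinary painter’s sort; bumping one object’s queue forces it to draw last, which is the kind of targeted fix the scripts applied.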
Masters of Pie wanted interactive objects to be outlined when you looked at them, and I implemented this with a stencil-buffer technique, which worked well.
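A common form of this technique is: draw the object while writing to the stencil buffer, then draw an enlarged silhouette that passes only where the stencil is still clear, leaving just a rim. The actual implementation ran on the GPU; the Python below merely simulates that two-pass logic on a 2D boolean grid, using neighbourhood dilation as a stand-in for the scaled-up mesh (my simplification, not the project’s code):

```python
def outline_mask(object_mask):
    """Simulate a stencil-buffer outline on a 2D grid.
    Pass 1: 'render' the object into the stencil.
    Pass 2: draw a dilated silhouette, but only where the stencil
    test passes (cells the object did not cover), leaving the rim."""
    h, w = len(object_mask), len(object_mask[0])
    stencil = [[bool(object_mask[y][x]) for x in range(w)] for y in range(h)]
    outline = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if stencil[y][x]:
                continue  # stencil test fails where the object was drawn
            # Dilated silhouette: any of the 8 neighbours is the object.
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w and stencil[ny][nx]:
                        outline[y][x] = True
    return outline
```

Run on a single filled cell, the result is a ring of outline cells around it with the interior left untouched, which is exactly the visual effect of the stencil approach: the highlight never draws over the object itself.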
One of the features that needed to be demonstrated was the effect of different energy management systems. We worked closely with the Siemens engineers to map what was happening with the energy flow, and managed to present it in a clearer and more informative way than had been achieved before.
It all went very well at the trade show, with the stand proving very popular and users getting a much more immersive experience than they expected.