3rd April 2014

Better interaction in VR - Oculus + Myo + iPhone

The new Oculus DK2 looks like it will be a big step forward with better resolution, head tracking and low persistence. While I'm eagerly waiting for my pre-order, I wanted to look at improving the way people can navigate and interact in VR. This hack involves moving all the navigation controls to an iPhone, leaving a free hand for interaction. In this case, throwing grenades around the Oculus Tuscany demo scene.

Navigation

To navigate a real-world environment, people need to be able to move along the plane they're standing on in two dimensions: forward and back, left and right (strafing). They also need to rotate their body to decide what 'forward' is.
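Concretely, those two movement axes plus body rotation boil down to rotating a local (strafe, forward) input vector by the body's yaw to get a world-space step. A minimal sketch of that maths (the function name and the Unity-style clockwise-from-+Z yaw convention are my assumptions):

```python
import math

def move_delta(forward, strafe, body_yaw_deg, speed=1.5, dt=1 / 60):
    """Convert stick input in the body's frame into a world-space step.

    forward/strafe are in [-1, 1]; body_yaw_deg is the direction the body
    faces, measured clockwise from world +Z (Unity's convention).
    Returns (dx, dz) on the ground plane.
    """
    yaw = math.radians(body_yaw_deg)
    # Rotate the local (strafe, forward) vector by the body yaw.
    dx = (strafe * math.cos(yaw) + forward * math.sin(yaw)) * speed * dt
    dz = (-strafe * math.sin(yaw) + forward * math.cos(yaw)) * speed * dt
    return dx, dz
```

With yaw at 0° 'forward' moves you along +Z; turn the body 90° clockwise and the same forward input moves you along +X instead, which is exactly the decoupling of body direction from movement discussed below.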

Most Oculus VR demos and the Unity integration use a similar setup.

They utilise the standard keyboard/mouse controls PC game players are used to: WASD (or the arrow keys) to move and strafe, with the mouse to turn.

With a gamepad it's the left analogue stick to move and strafe, and the right stick to turn.

On top of this there are modifiers for running instead of walking and other buttons for jumping or ducking.

If you’ve got a VR headset on, I think keyboard control is a non-starter. Yes, experienced game players can do this, but for novices it’s easy to lose where your fingers are and you end up lifting the headset up to check. Something handheld is the way to go.

The gamepad controls are OK and easy if you're used to playing games, but for me they have a fundamental problem: they take up both hands!

To interact in VR you’re going to want at least one hand free to do things that aren’t navigating.

Complete body immersion

One approach is to replicate what your entire body is doing. The five-sensor STEM System from Sixense looks like an amazing way to represent your body in a VR world, and combined with the Virtuix Omni you can use your feet to do all the navigation.

This will be great for installations, but not many people will have this rig at home. Also, I think a lot of people will want to experience VR sitting down.

Freeing up hands for interaction

Ideally you’d have both hands free. To do that you could use some sort of brainwave detection for navigation. That might be cool one day, but today it would need a lot of training and I want to reduce the learning curve for novice users. Another option is using your feet like the Omni; perhaps you could have some pedals if you were sitting down.

For the moment though, I’ll live with just one free hand and I've tried to get all the navigation controls into the other hand.

So what does that look like?

I haven't found a one-handed navigation controller that does what I'm looking for yet. You can hold a standard gamepad with one hand, but it's not ideal and it's very hard to turn your body and move at the same time.

At some point I'll try and hack some hardware together, but for this VR hack I've used an iPhone. It's something people are used to holding in their hands, and it's got a gyroscope, compass, accelerometer and a touch screen, so you'd hope it would be of some use as an input device.

After a little experimentation, I ended up using drags on the touch screen as a virtual analogue stick for forward/back and strafing, and the gyroscope's roll for body rotation.
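A minimal sketch of treating a touch drag as a virtual analogue stick, with a dead zone so small finger wobble doesn't cause drift (the pixel radii and function name are my assumptions, not what I actually shipped):

```python
import math

def touch_to_stick(dx_px, dy_px, dead_zone_px=15, max_radius_px=120):
    """Map a touch drag (pixels from the initial touch point) onto a
    virtual analogue stick reading in [-1, 1] on each axis.

    Inside the dead zone the stick reads zero; beyond max_radius_px it
    saturates at full deflection.
    """
    dist = math.hypot(dx_px, dy_px)
    if dist < dead_zone_px:
        return 0.0, 0.0
    # Deflection ramps from 0 at the dead-zone edge to 1 at max radius.
    magnitude = min((dist - dead_zone_px) / (max_radius_px - dead_zone_px), 1.0)
    return magnitude * dx_px / dist, magnitude * dy_px / dist
```

Measuring the drag from wherever the finger first lands, rather than from a fixed on-screen stick, means the user never has to look at the phone to find the control.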

This input was sent to Unity via a UDP socket over Wi-Fi. It could be better, as I'll explain below, but it does the job for now.
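The phone-to-Unity link can be sketched as fire-and-forget UDP datagrams; the comma-separated packet format below is a hypothetical illustration, not the one I actually used:

```python
import socket

# Hypothetical packet format: "forward,strafe,yaw_delta,gesture",
# sent ~60 times a second from the phone to the machine running Unity.

def encode_packet(forward, strafe, yaw_delta, gesture=0):
    return f"{forward:.3f},{strafe:.3f},{yaw_delta:.3f},{gesture}".encode()

def decode_packet(data):
    forward, strafe, yaw_delta, gesture = data.decode().split(",")
    return float(forward), float(strafe), float(yaw_delta), int(gesture)

def send_input(sock, addr, forward, strafe, yaw_delta, gesture=0):
    # UDP is fine here: a dropped input frame is simply replaced by the
    # next one ~16 ms later, and there's no head-of-line blocking.
    sock.sendto(encode_packet(forward, strafe, yaw_delta, gesture), addr)
```

Choosing UDP over TCP matters for input: stale input frames are worthless, so retransmission would only add latency.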

Interaction!

Now we have a free hand, how can we capture that input? At the most basic level, you need to be able to point at things and then perform some sort of action. This is where I think the Myo fits in quite well. Its gyroscope can be used to approximate the movements of a mouse, but in 3D, and its gesture-sensing ability allows you to perform actions.

I've set things up so doing a fist gesture makes a grenade appear roughly where your hand is. Then doing a throw gesture releases the grenade in the direction your hand is moving in.
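The release step amounts to estimating the hand's velocity from its recent positions and handing that to the physics engine as the grenade's initial velocity. A rough sketch (the sample format and function name are my assumptions):

```python
def release_velocity(samples, scale=1.0):
    """Estimate throw velocity from recent hand-position samples.

    samples: list of (t, x, y, z) tuples, oldest first, covering
    roughly the last 100 ms before the throw gesture fires.
    Returns (vx, vy, vz) to give the grenade's rigidbody on release.
    """
    (t0, *p0), (t1, *p1) = samples[0], samples[-1]
    dt = t1 - t0
    # Finite difference over the window; scale lets you exaggerate
    # throws so gentle arm movements still feel powerful in VR.
    return tuple(scale * (b - a) / dt for a, b in zip(p0, p1))
```

Averaging over a short window rather than using the last two frames keeps one noisy sample from sending the grenade off at a wild angle.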

Using natural gestures like this with your primary hand/arm feels quite satisfying, and really helps with the sense of immersion.

For this hack I paired the Myo with the iPhone and sent its data through the same UDP socket for simplicity.

Testing

I demoed the Oculus at the Evolution of Gaming event and used it as an opportunity to test novice users' reactions to the one-handed navigation controller with around 50 people.

Most people had never used VR before and were all pretty impressed with the DK1, even though it made them feel a bit dizzy. What was really encouraging was that they could all navigate well with a single hand after only a little instruction.

There were some issues, however, which I'll get into below.

Conclusions

While I think the touch screen fills in for an analogue stick surprisingly well, ultimately a 2D analogue stick has better affordance for the task of moving around in 2D. The gyroscope roll for body rotation also worked surprisingly well, but for longer experiences I think using the orientation of the device will be too much of a strain and will be restrictive. A 1D analogue stick could fill this requirement quite nicely, but that may be difficult to use, say, with your forefinger at the same time as the 2D analogue stick under your thumb. Another option could be to require you to hold down a button to turn and still use the controller's orientation.
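The hold-a-button-to-turn option is essentially a clutch: while the button is held, changes in the device's roll steer body yaw, and on each press the current roll becomes the neutral point so you can re-grip the phone without spinning. A sketch of that state machine (class and parameter names are my assumptions):

```python
class TurnClutch:
    """Hold-to-turn: device roll only affects body yaw while engaged."""

    def __init__(self, gain=1.5):
        self.gain = gain
        self.engaged = False
        self.base_yaw = 0.0
        self.neutral_roll = 0.0

    def update(self, body_yaw, device_roll, held):
        """Return the body yaw for this frame (angles in degrees)."""
        if not held:
            self.engaged = False       # released: roll is ignored again
            return body_yaw
        if not self.engaged:
            self.engaged = True        # press: recentre on current roll
            self.base_yaw = body_yaw
            self.neutral_roll = device_roll
        return self.base_yaw + self.gain * (device_roll - self.neutral_roll)
```

The recentring on press is the important part: without it, resting the phone at an angle in your lap would slowly spin you round.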

In a more controlled study it might be interesting to swap the strafing and rotation controls to see which should be primary. In big open spaces and driving-style scenarios you don't need strafing, but as soon as you get indoors or into tight spaces it becomes much more necessary. The difference between strafing and turning is, I think, just going to be one of those things users have to learn, and it doesn't seem to be that big a hurdle to get over.

For some simple games you could just go in the direction you're facing, but for a more general-purpose VR controller, I think you need head direction decoupled from movement.

I didn't include a jump or crouch control, but some people suggested this could either be done by tapping or using the accelerometer to detect a sharp movement up or down.
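The accelerometer suggestion could be as simple as thresholding the vertical acceleration (with gravity already removed, as the device's motion APIs typically provide). A hypothetical sketch, with made-up threshold values:

```python
def detect_vertical_gesture(accel_y, up_threshold=8.0, down_threshold=-8.0):
    """Classify a sharp vertical accelerometer reading (gravity removed,
    m/s^2) as 'jump', 'crouch', or None for ordinary hand movement.

    The thresholds are guesses and would need tuning against real
    readings so that normal walking input doesn't trigger jumps.
    """
    if accel_y > up_threshold:
        return "jump"
    if accel_y < down_threshold:
        return "crouch"
    return None
```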

I didn't let the users try the Myo, but considering it's pre-release hardware, it performed quite well and was easy to set up and integrate.

What's next

I'd like to hack together a few analogue sticks into a one-handed controller and test that with users. While it may seem preferable to re-use existing input devices for VR, I think something purpose-built for navigation will be much easier to use and doesn't need to be that expensive.

Ultimately the interaction hand input ought to have positional tracking as well as orientation and gestures and I'd like to experiment further with the Myo to see if that can do all three well.

Side note

All of this assumes realistic real-world navigation, i.e. users sliding around a 2D plane and going up/down steps and gradients etc. Obviously VR doesn't have to be restricted to that, but I think users need to have mastered basic real-world navigation before attempting unnatural movement. In theory, though, this could still be managed with one hand by making the one-dimensional body rotation control two-dimensional.
