27th July 2015

Penguin Hustle - HTC Vive London Game Jam

When the chance to go to a GameJam using the HTC Vive VR system came up, I jumped at it. I’d become increasingly convinced that tracked physical controllers were the way forward after building some R&D prototypes for Masters of Pie, including Gepetto, a direct-manipulation animation tool. Those were built with the Razer Hydra, which, while the best option available at the time, suffers from limited range, calibration issues and easily tangled cables. The prospect of wireless controllers and a significantly larger tracking space for both the controllers and the HMD (head-mounted display) sounded like it would open up lots of opportunities.

The Team

I had recently met Marco (the other Unity developer on the team) while doing some work at Happy Finish, and I knew he’d be able to hit the ground running in the limited time of a GameJam. I knew Richard from HammerheadVR through some work we’d done previously, and he was also one of the participants in the Big Data VR Challenge. Richard brought in Jasper, an ace animator who normally works on feature films but had also worked on Colosse, a winning entry in the Oculus Mobile VR Jam, and Rob, a great director starting to work in VR, who unfortunately couldn’t make it down to London for the weekend.

The Kit

The HTC Vive is a VR system built by HTC and Valve, due out later this year. We got to play with the dev kit at the GameJam organised by Bossa Studios and playhubs at Somerset House. The HMD is great: you can hardly see any pixels or screen-door effect, and there’s barely any chromatic aberration either. If you’ve only used an Oculus DK2 or Gear VR, it’s way better than that. It does run at a pretty high resolution and at 90fps, which I was concerned might give us performance problems in the GameJam, but that didn’t turn out to be an issue. Positional tracking is done with lots of sensors picking up laser sweeps from two Lighthouse base stations placed at opposite corners of the room. This has lots of benefits, the main one being that you can move around in a roughly 4m x 4m area and still be accurately tracked.

Being able to walk a few steps in any direction is liberating, but perhaps the biggest game-changer is the pair of wireless controllers that comes with it. The dev kit ones look a little odd, but using them in a VR space is a joy: they’re just where you expect them to be and the tracking is spot on. They’re tools you hold in each hand that bridge the gap between the physical world, where you hold them (and get real haptic feedback), and the virtual world, where they can perform a variety of actions.

We also got to try out a few demos, including TiltBrush, where you can paint light in 3D space, and Aperture, an insanely detailed demo using robots from Valve’s Portal. A surprisingly compelling demo simply used the controllers to blow up balloons and bat them around realistically.

Initial ideas

We had a few different ideas for story, game and interaction mechanics before the GameJam, but were well aware that these might not work so well once we saw them in VR. Our plan was to knock up a load of quick demos and try them as soon as possible to inform the direction we took. We were given some sample source code from Valve, and we each built a scene based on that to try out interactions and art styles and to get a sense of the right scale to use. Amazingly this early build worked first time on the Vive hardware and gave us a rough sense of what we could achieve in the time.

It was obvious from this build that:

We went back to our white-board and wrote up a big list of things to work on next.

Penguins screenshot

Making it a game

The brief for the GameJam was very open, with the only criteria being that there should be a playable level at the end. This did force us to focus though, and we had to quickly come up with a simple objective and scoring system that we could get done in the time.

We decided that you had to give each penguin only one fish and you’d be scored on how many you under-fed, fed correctly and over-fed.
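
In code terms that boils down to counting, at the end of the level, how many fish each penguin ate. A minimal sketch of that tally (illustrative names, not our actual jam code):

```csharp
using System.Collections.Generic;

// Illustrative only, not the actual jam code: tally the end-of-level score
// from a list of "fish eaten" counts, one entry per penguin.
public class FeedingScore
{
    public int UnderFed;      // ate no fish
    public int FedCorrectly;  // ate exactly one fish
    public int OverFed;       // ate two or more

    public void Tally(IEnumerable<int> fishEatenPerPenguin)
    {
        UnderFed = FedCorrectly = OverFed = 0;
        foreach (int eaten in fishEatenPerPenguin)
        {
            if (eaten == 0) UnderFed++;
            else if (eaten == 1) FedCorrectly++;
            else OverFed++;
        }
    }
}
```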

Penguins would jump onto the iceberg from the sea and then would ‘smell’ the fish in your hand or the buckets and move towards them. If you threw a fish, they would run after it and race to eat it. The water pistol could be used to make the already-fed penguins go away while you tried to feed the hungry ones.
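
The ‘smell’ behaviour was essentially a nearest-target search each frame. A rough sketch of the idea, assuming fish objects are tagged "Fish" in the scene (this is a simplified illustration rather than our jam code):

```csharp
using UnityEngine;

// Illustrative sketch of the penguin 'smell' behaviour, not the jam code.
// Each frame the penguin heads for the nearest fish it can "smell",
// whether that fish is in the player's hand, in a bucket or mid-throw.
public class PenguinSmell : MonoBehaviour
{
    public float moveSpeed = 1.5f;   // metres per second
    public float smellRadius = 10f;  // how far away a fish can be noticed

    void Update()
    {
        Transform target = FindNearestFish();
        if (target == null) return;

        Vector3 toFish = target.position - transform.position;
        toFish.y = 0f;                            // stay on the iceberg surface
        if (toFish.sqrMagnitude < 0.01f) return;  // already at the fish

        transform.rotation = Quaternion.LookRotation(toFish);
        transform.position += toFish.normalized * moveSpeed * Time.deltaTime;
    }

    Transform FindNearestFish()
    {
        Transform nearest = null;
        float best = smellRadius * smellRadius;
        foreach (GameObject fish in GameObject.FindGameObjectsWithTag("Fish"))
        {
            float d = (fish.transform.position - transform.position).sqrMagnitude;
            if (d < best) { best = d; nearest = fish.transform; }
        }
        return nearest;
    }
}
```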

Our feature list was very long, and unfortunately quite a lot of it got dropped as we had less than 36 hours from start to finish. Anything that was going to take much more than 30 minutes had to be really important to justify the time.

We did get some of Jasper’s great animations in though and some nice touches like splashes as the penguins jumped out of the water.

The end result got some great feedback and it was fun to see people’s different approaches to feeding the penguins.

Performance

Chet from Valve was very clear that 90fps was mandatory and that games wouldn’t be shown if they didn’t hit it. SteamVR comes with a handy frame monitor, though, and you can even get it to show on the back of the virtual controller in the VR world. To help us out further, Nvidia had supplied monster demo machines with GTX 980s in them.

In one iteration we had 40+ penguins all animating on the little iceberg you stand on, which pushed us under 90fps. This turned out to be a CPU bottleneck, as GPU skinning wasn’t turned on by default in Unity, so it was easily fixed. We also reduced the number of penguins on the iceberg at any one time, as it was a little crowded with that many anyway.
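
GPU skinning is just a checkbox in Unity’s Player settings, and the penguin cap was just a check before spawning another one. A rough sketch of the cap, with an illustrative `maxPenguins` limit (not the actual jam code):

```csharp
using UnityEngine;

// Illustrative sketch, not the jam code: only let a new penguin hop onto the
// iceberg if we're below a hard cap, to keep the skinning cost bounded.
public class PenguinSpawner : MonoBehaviour
{
    public GameObject penguinPrefab;
    public int maxPenguins = 20;      // illustrative limit
    public float spawnInterval = 2f;  // seconds between spawn attempts

    float nextSpawnTime;

    void Update()
    {
        if (Time.time < nextSpawnTime) return;
        nextSpawnTime = Time.time + spawnInterval;

        if (transform.childCount >= maxPenguins) return; // iceberg is full

        var penguin = (GameObject)Instantiate(penguinPrefab, transform.position, Quaternion.identity);
        penguin.transform.SetParent(transform); // parent under the spawner so childCount tracks the count
    }
}
```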

Towards the end we had another brush with the 90fps line. We hadn’t really done much optimising and still had Unity’s dynamic lighting and GI on. We didn’t want to start baking things at this late stage, so I went through and turned off a load of things we didn’t really need, like shadow casting on the penguins’ beaks and eyeballs, and a lot of logging we had been using for debugging. Something in all of this got us back on the right side of the line, and luckily it stayed comfortably there for the whole level.
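
The shadow change is just the Cast Shadows setting on each Mesh Renderer, and logging can be silenced globally. A sketch of the same idea in code (illustrative, not what we actually committed at the jam):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Illustrative late-jam optimisation pass, not the actual jam code:
// stop tiny renderers (beaks, eyeballs) casting shadows and silence logging.
public class JamOptimisations : MonoBehaviour
{
    public Renderer[] tinyRenderers; // e.g. beak and eyeball renderers, assigned in the inspector

    void Awake()
    {
        foreach (var r in tinyRenderers)
            r.shadowCastingMode = ShadowCastingMode.Off; // same as Cast Shadows: Off in the inspector

        // Skip the cost of all the Debug.Log calls sprinkled around for debugging.
        // (This is Debug.logger in Unity 5; Debug.unityLogger in newer versions.)
        Debug.unityLogger.logEnabled = false;
    }
}
```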

Penguins screenshot

Lessons learned

While we learned loads about the Vive and its potential over the weekend, we also learned some other lessons about GameJams in general:

Use the tools you know

Both Marco and I knew Unity very well, which allowed us to iterate very quickly and get lots done.

Ask for help if it’s available

I ended up using a couple of new Unity 5 features I’d not touched before. Unity had someone there helping developers, and he pointed me in the direction of some extra docs, which again sped up development.

Use source control

Richard and Jasper hadn’t used git much before, but with a quick intro to SourceTree they were up and running in no time. This, combined with each of us working in our own Unity scene, meant we could all work rapidly on the same project at the same time.

Get the asset pipeline going as soon as possible

Jasper had quite a few problems getting the character animations into Unity from Maya. This turned out to be Unity assuming the units were cm even though he’d set them to something else, but we lost some valuable time to it.
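
The fix lives in the model import settings (scale factor and file units). If the same problem keeps biting, those settings can also be forced from an editor script; a sketch with illustrative values, which would need adjusting to however the files were actually exported:

```csharp
#if UNITY_EDITOR
using UnityEditor;

// Illustrative sketch (place in an Editor folder): force a consistent import
// scale for incoming models so Maya's working units and Unity's metres agree.
// The exact values depend on how the files were exported from Maya.
public class ModelScaleFixer : AssetPostprocessor
{
    void OnPreprocessModel()
    {
        var importer = (ModelImporter)assetImporter;
        importer.useFileUnits = false;  // ignore the units baked into the file
        importer.globalScale = 0.01f;   // e.g. treat cm-authored files as metres
    }
}
#endif
```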

Follow the fun

I’d seen this line a few weeks before and think it’s from Marc LeBlanc. In the tight time constraints of a GameJam it’s good to try a few different directions at the beginning, but as soon as you have something that’s fun, focus on that and make it even better.

Next steps

It turns out there’s already a game called Penguin Hustle, so currently we’re calling it Penguins VR until we can think of anything better.

Marco’s been busy polishing up some of the rough edges left from the GameJam. If you’re lucky enough to have a Vive dev kit and want to have a go, download it here. Feedback is very welcome.

We’re discussing the best ways to expand the demo level into a full game at the moment, so watch this space for more developments!
