6th November 2015

Big Data VR Challenge - Part 2 - Develop conference, Brighton

Arcs screen grab

This article follows on from part 1.

Our team LumaPie (myself and Masters of Pie) won the Big Data VR Challenge, judged at the Develop conference in Brighton. This was also the first time our researchers had a chance to try the app, so it was great that it went down well.

New Data

After further discussions, ALSPAC identified a good data set for us to work with, but given the strict security around their data, they weren't able to give us any of it! To resolve this, Paul from ALSPAC simulated a set of 1000 records that closely matched some of the real data.

The fields we had were: gender, age, height, sitting height, waist circumference, hip circumference, weight, systolic blood pressure, diastolic blood pressure, pulse and BMI. We also had these at two rough ages per child: 7 years old and 11 years old.

For much of the project I used UE4's DataTable to bring this data in, but realised towards the end that importing CSV data this way only works in the Editor. I was already converting the data into a 2D array of floats for easier filtering, so I switched to loading the csv file directly and parsing it into the array myself.
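The parsing step is straightforward. A minimal sketch of it, written here with the standard library rather than UE4's file and string types (the function name and the header-skipping assumption are mine, not the project's actual code):

```cpp
#include <cassert>
#include <sstream>
#include <string>
#include <vector>

// Parse CSV text into a 2D array of floats, one inner vector per record.
// Assumes the first line is a header row and every other field is numeric.
std::vector<std::vector<float>> ParseCsv(const std::string& text)
{
    std::vector<std::vector<float>> rows;
    std::istringstream lines(text);
    std::string line;
    bool header = true;
    while (std::getline(lines, line))
    {
        if (header) { header = false; continue; } // skip the column names
        if (line.empty()) continue;
        std::vector<float> row;
        std::istringstream fields(line);
        std::string field;
        while (std::getline(fields, field, ','))
            row.push_back(std::stof(field));
        rows.push_back(std::move(row));
    }
    return rows;
}
```

Once everything is a float, every field can be filtered and mapped to the visualisation the same way, which keeps the rest of the code simple.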

New Unreal Engine version

Since Part 1, Epic released a new version of Unreal Engine (4.8) which included significant changes to the way procedural meshes are created, so ignore most of the code in the last article!

The new ProceduralMeshComponent is still marked as experimental, but worked well for us on this project.

Fundamentally, you create a ProceduralMeshComponent, assign a material, and create a mesh section, passing in all the properties of the mesh as a series of arrays.


// create the mesh component
UProceduralMeshComponent* mesh = CreateDefaultSubobject<UProceduralMeshComponent>(TEXT("ProceduralMesh"));

// find the material and assign to the mesh
static ConstructorHelpers::FObjectFinder<UMaterialInterface> Material(TEXT("Material'/Game/DataMaterial2.DataMaterial2'"));
mesh->SetMaterial(0, Material.Object);

// set the mesh as the root component
RootComponent = mesh;

// define the arrays
TArray<FVector> Vertices;
TArray<int32> Triangles;
TArray<FVector> Normals;
TArray<FVector2D> UV0;
TArray<FColor> VertexColors;
TArray<FProcMeshTangent> Tangents;

// fill all the arrays 

// create the mesh section, 0 for the first section, false at the end to stop collision being added
mesh->CreateMeshSection(0, Vertices, Triangles, Normals, UV0, VertexColors, Tangents, false);
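The "fill all the arrays" step is where the actual geometry comes from. As a hedged illustration of what filling the vertex and index arrays looks like, here is a four-sided pyramid (the shape used for the boy nodes), sketched with a plain struct instead of FVector so it runs outside the engine; the function and layout are my own, not the project's code:

```cpp
#include <cassert>
#include <vector>

struct Vec3 { float x, y, z; };

// Fill vertex and triangle-index arrays for a four-sided pyramid of the
// given size: a square base plus an apex, 6 triangles in total
// (4 sides + 2 for the base). The index buffer references vertices
// by their position in the vertex array, three indices per triangle.
void BuildPyramid(float size, std::vector<Vec3>& verts, std::vector<int>& tris)
{
    const float h = size * 0.5f;
    verts = {
        { -h, -h, 0.0f },    // 0: base corners
        {  h, -h, 0.0f },    // 1
        {  h,  h, 0.0f },    // 2
        { -h,  h, 0.0f },    // 3
        { 0.0f, 0.0f, size } // 4: apex
    };
    tris = {
        0, 1, 4,  1, 2, 4,  2, 3, 4,  3, 0, 4, // four sides
        0, 2, 1,  0, 3, 2                      // base (two triangles)
    };
}
```

In the engine the same arrays would then go straight into CreateMeshSection, with normals, UVs, colours and tangents filled alongside.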

New features

We’d had a few video conference sessions with the ALSPAC team and got some good feedback on what would be useful functionality. Masters of Pie then came up with a new way of visualising the data by using spirals arranged in arcs around the user. Each object or node on the spiral is a person represented by a pyramid for boys and an icosahedron for girls.

Arc screen grab

Spiral arcs were chosen as a way to pack a lot of nodes into a small space, but also for them all to be a similar distance away from the user to counter the effect of perspective on the node size.
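To make the packing idea concrete, here is one way the node positions could be computed: sweep an angle around the user at a fixed radius while rising slowly, so the spiral stacks many nodes at a near-constant viewing distance. This is a sketch under my own assumptions (function name, parameters and axis conventions are illustrative), not the project's placement code:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

// Place `count` nodes along an arc of `radius` centred on the user,
// sweeping `arcDegrees` horizontally while rising by `turnHeight`
// per full 360-degree turn. Horizontal distance from the user stays
// constant, so perspective shrinks all nodes roughly equally.
std::vector<Vec3> SpiralArcPositions(int count, float radius,
                                     float arcDegrees, float turnHeight)
{
    std::vector<Vec3> out;
    const float stepDeg = arcDegrees / count;
    for (int i = 0; i < count; ++i)
    {
        const float deg = stepDeg * i;
        const float rad = deg * 3.14159265f / 180.0f;
        out.push_back({ radius * std::sin(rad),
                        radius * std::cos(rad),
                        turnHeight * (deg / 360.0f) });
    }
    return out;
}
```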

Each field in the data could be visualised in one of 5 ways:

Ring Position

Lower values would make the nodes appear on the left, higher values on the right.

Ring Height

There were four arcs, so assigning a field to this would group the nodes into quarters, with the highest 25% in the top ring, going down to the lowest 25% in the bottom ring.

Node size

The value of the field would affect the relative size of the node.

Heatmap

The value of the field would affect the colour of the node ranging from red to blue.

Pulsing

If the value of the field was in the top 30% the node would flash on and off.
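Three of these mappings (ring height, heatmap and pulsing) reduce to simple functions of a field value normalised to [0, 1] across the data set. A hedged sketch of what those mappings might look like (the function names and the exact blue-to-red blend are mine; the 30% pulse threshold and four-ring grouping are from the description above):

```cpp
#include <cassert>

struct Color { float r, g, b; };

// Heatmap: blend from blue (low values) to red (high values).
Color Heatmap(float t)
{
    return { t, 0.0f, 1.0f - t };
}

// Pulsing: flash only nodes in the top 30% of the value range.
bool ShouldPulse(float t)
{
    return t > 0.7f;
}

// Ring height: four arcs, so group values into quarters.
// Returns 0 for the bottom ring up to 3 for the top ring.
int RingIndex(float t)
{
    return t >= 1.0f ? 3 : static_cast<int>(t * 4.0f);
}
```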

Matt from Masters of Pie was going to be building the front-end UI of the app so I set out to build a simple API he could hook into via Blueprints. His easy-to-use petal UI changed the field that was used for each part of the visualisation and a new mesh was automatically created.

We used the Razer Hydra as an input device, which allowed the user to point with a wand-like cursor and easily slide filters and change fields.

Petal UI screen grab

Filtering & Saving

Some studies will only want to look at a subset of the data, so we created a simple slider UI to filter each field before it goes into the visualisation.

Filter UI screen grab

Becca, one of the ALSPAC team, wanted to see how the project would work end to end. We were already loading the data dynamically from a csv file and so we added a button to export the filtered data ids to another text file. This export could be used by another researcher to load up a saved view of the data or to request more detailed information on the people from ALSPAC.

Networking

One feature we really wanted to get in was networking as we saw great potential in multiple scientists being able to point at areas of the data and discuss trends whilst still in VR.

UE4 has some nice networking features, and thanks to some last-minute help from Osman, one of Epic’s developers, we got it working the night before judging and were able to show a networked view on a second PC with another DK2.

Summary

It was great to win the competition as a lot of hard work had gone into making something that would be useful. We’re now talking to ALSPAC about the future direction of the tool and hope to develop it further.
