Questions about the differences in graphics between the Quest and the Rift keep coming my way – both about Richie’s Plank Experience and other titles. To clarify a few things and provide some insight into what goes on behind the scenes, I’m putting this blog post together so anyone interested can learn what it takes to avoid dropping too many frames on the Quest while still looking adequate next to the PC version.
I’m Danny – the artist on the Richie’s Plank Experience (RPE) team – and this is the complete story of how we got to have trees and reflections in RPE on the Quest.
Early days of porting
Richie’s Plank is developed in Unity, which gives us access to amazing tools and default shaders. So we used those, just like on the Rift version, and put all the basic functionality and graphical assets into the game. It ran, but the frame rate wasn’t consistent and there were still many graphical improvements to be made.
To solve the FPS problem, we started taking assets away: removing a few cars, removing the hedges, simplifying the materials. But all of that was not enough – there were still points of view from which the experience lagged a lot. The two major ones were looking at the city from the plank and flying around the city. Basically, everything our experience is about.
Doing some more research, we found that the Oculus Developer Blog has amazing articles with graphs and tons of information about what impacts the frame rate on the Quest.
In short: the fewer materials, textures, and shaders you have, the fewer draw calls the device has to process each frame, and the faster the game runs. If we could somehow have 1 material for everything in the city and cut down on textures, we would be able to fix our performance.
One day of intense UV unwrapping
A 3D artist has many steps to take to make an asset that gets put into the game. Generally, it goes from Concept -> Modelling -> UV unwrapping -> Texturing -> Importing and setting up in the game. What is UV unwrapping? Imagine the process of taking an earth globe and flattening it out to make a map. Instead of using a sphere, we use whatever object we want to ‘flatten out’. This flattening of the 3D model is needed so we can use 2D textures on it.
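The globe analogy maps directly onto a bit of math. As a minimal sketch (not our actual pipeline – unwrapping is done by hand in a 3D package), here is how a point on a sphere flattens onto a 2D map, which is exactly what UV coordinates are:

```python
import math

def sphere_to_uv(x, y, z):
    """Flatten a point on a unit sphere onto a 2D map, like peeling an
    earth globe into a world map (equirectangular projection).
    Returns (u, v) texture coordinates in the 0..1 range."""
    u = 0.5 + math.atan2(z, x) / (2.0 * math.pi)  # longitude -> horizontal
    v = 0.5 + math.asin(y) / math.pi              # latitude  -> vertical
    return u, v

# A point on the "equator" lands in the middle of the map:
print(sphere_to_uv(1.0, 0.0, 0.0))  # -> (0.5, 0.5)
```

For anything less regular than a sphere, the artist decides where to cut seams so the flattened pieces distort as little as possible – that judgment is the part that took the whole day.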
Let’s take these two types of textures: tileable and non-tileable. The first can be applied to an object repeatedly and the object still looks good. The second is made to fit a specific size, and if you change that, everything looks broken (example in image 7).
The city of RPE uses a lot of tileable textures because that gives us more control. But there is something cheaper – a texture atlas. Instead of having 1 texture per object, you pack multiple smaller textures into one big texture and then map as many objects as you need onto it.
If we do this, we can reduce the number of materials and textures to 1, instead of having a different material for each type of object.
Remember that concept of UV unwrapping I explained earlier? That is what had to be done to every asset in the game in order to go from individual textures to this one master atlas texture.
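To make the idea concrete, here is a sketch of the UV remapping involved. It assumes a simple grid-layout atlas – our real atlas is packed by hand, so the tile layout here is hypothetical – where each object’s old 0–1 UVs get squeezed into its assigned tile:

```python
def to_atlas_uv(u, v, tile_col, tile_row, tiles_per_side=4):
    """Remap an object's original 0..1 UVs into its assigned tile of the
    master atlas. A 4x4 atlas packs 16 of the old per-object textures
    into a single texture, so many objects can share one material."""
    scale = 1.0 / tiles_per_side
    return (tile_col * scale + u * scale,
            tile_row * scale + v * scale)

# An object whose texture was packed into tile (1, 2) of a 4x4 atlas:
print(to_atlas_uv(0.5, 0.5, 1, 2))  # -> (0.375, 0.625)
```

In practice the remapping is done in the 3D package while re-unwrapping, not at runtime – the math above is just what “move this object’s UVs into its corner of the atlas” means.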
It comes with a few visual sacrifices – more obvious tiling and reduced asset resolution – but by the end of that day we had optimised the game so much that we could think about bringing back assets like the trees in the park, and even normal maps.
Bye, bye normal maps, hello custom shaders
Well, not really, we still have normal maps in the game, but it is more like: “we have 1 normal map”. Let’s take it from the start – what are normal maps?
Normal maps are those purplish, wacky-looking textures that describe at what angle light should bounce off a surface. They are an amazing tool for simulating bumps and smaller details without adding more complexity to the models.
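For the curious, the purplish look has a simple explanation: a normal map stores a direction in its RGB channels, each remapped from the -1..1 range into 0..255. The real decode happens in the shader; this is just the arithmetic:

```python
def decode_normal(r, g, b):
    """Unpack a tangent-space normal from a normal map's RGB channels.
    Each 0..255 channel maps back to the -1..1 range; the flat 'purple'
    colour (128, 128, 255) decodes to a normal pointing almost exactly
    straight out of the surface, i.e. no bump at that texel."""
    return tuple(c / 255.0 * 2.0 - 1.0 for c in (r, g, b))

# Flat purple -> roughly (0.0, 0.0, 1.0):
print(decode_normal(128, 128, 255))
```

Since most of a surface is flat, most texels sit near that (128, 128, 255) value – hence the overall purple tint.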
Using a custom shader built in Amplify Shader Editor, I created a way for us to use a single tiling normal noise map that we could apply only to the surfaces that needed it, without adding another 4K map like the color texture.
The only surfaces that use normal maps for creating bumpy surfaces are the windows of the buildings.
Flying up close to a building and seeing all those reflection waves is one of the details we added to make the flying experience more epic and realistic. We just couldn’t go without it.
You have most probably noticed that there are not many transparent surfaces in VR, and even fewer in Quest titles. For a good reason – transparency is one of the heaviest effects to render. Add a simple glass to a scene and it could wreck the performance before you pour water into it.
But we needed to have trees, because they are nice and they are the most important part of the RPE park. There are many ways to make trees, but the clear winner for the Quest was a method called “alpha to coverage”, which is a more efficient way to draw transparency.
As you can see in the image, there is again a loss of quality, but that was the only solution we could find for having trees in RPE. The way it works is that instead of blending each transparent pixel separately, the alpha value is converted into an MSAA coverage mask, so the foliage can be drawn together with the opaque geometry – which is the key to saving performance on the Quest.
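A rough sketch of the idea, not actual GPU code: with alpha to coverage, a fragment’s alpha decides how many of the pixel’s MSAA samples get written, so no blending or sorting is needed. Real hardware also dithers the mask across neighbouring pixels, which this toy version skips:

```python
def alpha_to_coverage(alpha, msaa_samples=4):
    """Toy version of what the GPU does with alpha to coverage enabled:
    the fragment's alpha is turned into a coverage mask, i.e. how many
    of the pixel's MSAA samples get written, instead of being blended."""
    covered = round(alpha * msaa_samples)  # samples this fragment covers
    return (1 << covered) - 1              # bitmask with `covered` bits set

# A half-transparent leaf edge covers 2 of 4 samples:
print(bin(alpha_to_coverage(0.5)))  # -> 0b11
```

Because the leaves behave like opaque geometry as far as the depth buffer is concerned, the GPU never has to sort them or blend overlapping layers – that is where the savings come from.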
Post-processing is the process of adjusting your colors after everything has been rendered. In this step you can easily add contrast, add bloom, and tone your shadows to be bluer and your highlights to be warmer.
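As an illustration of what such a grading pass does – a toy single-pixel version, not the Unity post-processing stack – here is the kind of adjustment applied across every pixel of the frame:

```python
def grade_pixel(r, g, b, contrast=1.2):
    """Toy single-pixel colour grade: push contrast around mid-grey,
    then cool the shadows and warm the highlights a touch -- the kind
    of adjustment a post-processing pass does across the whole frame."""
    def stretch(x):  # contrast around the 0.5 midpoint, clamped to 0..1
        return min(max((x - 0.5) * contrast + 0.5, 0.0), 1.0)
    r, g, b = stretch(r), stretch(g), stretch(b)
    luma = 0.299 * r + 0.587 * g + 0.114 * b  # perceived brightness
    if luma < 0.5:
        b = min(b + 0.05, 1.0)  # bluer shadows
    else:
        r = min(r + 0.05, 1.0)  # warmer highlights
    return r, g, b
```

Cheap per-pixel math like this is free on a PC GPU, but on the Quest every extra full-screen pass has a real cost.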
That technique is expensive on the Quest, so we had to drop it. In my opinion, this is the main reason most Quest games look off compared to their PC counterparts. We can use lighting to create contrast, but there are still cases where pushing it a bit further in post is what gives that cinematic feeling.
All these techniques and sacrifices, plus a lot of manual labour refactoring assets from the Rift version to the Quest, were rather time consuming. But every time we see videos of people sharing our title with their friends and family, we know it was worth it.
Bonus: trailer and accurate representation of the game on each platform
We shot a whole new trailer and separate mixed reality footage, so everyone watching the trailer can have more realistic expectations of what the title looks like on that platform.
Soon Victor and I will be working on a blog post describing that process in depth. 3 days of shooting and 2 months of editing and digital re-shooting. Stay tuned on Discord!