Friday, 11 April 2014

Oculus Rift on the Raspberry Pi

I already approached this subject a while ago, when I got the Oculus Rift SDK compiled for the Raspberry Pi and successfully read the head orientation.

This time, I wanted to render simple 3D geometry for the Rift using the Pi. Is this adorable little machine powerful enough to support the Rift?

As explained in the official Oculus SDK document, rendering for the Oculus Rift implies several things:
  • Stereo rendering: rendering the 3D scene twice, once for each eye, side by side
  • Distortion correction: distorting the rendered image in such a way that viewing it through the Rift lenses makes it look correct 
  • Chromatic aberration correction: a bonus step that reduces the color fringes introduced by the lenses

Accelerated 3D on the Raspberry Pi means using OpenGL ES, much like on any mobile platform these days. The Pi supports OpenGL ES 2.0, which is enough to run the shaders that implement the corrections described above. In fact, the SDK comes with "regular" OpenGL shaders that work perfectly on OpenGL ES 2.0. Smashing!

So after some learning and testing, I finally got "Rift-correct" rendering on the Pi. The scene is extremely simple: just a rotating cube floating in front of the user (the head tracking is there too). And here is how it looks. Note that I removed the lenses of the Rift in order to film its screen.

Now for some benchmarks
All the tests were done on a Release build using the official vertex and fragment shaders. No attempt was made to optimize anything (I'm not sure there's much to be done honestly).

  • Rendering the scene twice (no correction shaders): 16 ms/frame
  • Rendering the scene into a texture and drawing that texture on a full-screen quad (no correction shaders): 27 ms/frame
  • Same, with the distortion correction shader: 36 ms/frame
  • Same, with the combined distortion and chromatic aberration correction shader: 45 ms/frame
Note: the render-target texture used in these tests was exactly the size of the screen (1280x800). Because of the pinching effect of the distortion shader, it should be larger than that, about 2175x1360 (see the "Distortion Scale" section of the Oculus doc). Unfortunately this was too much for the Pi. As a result, a good part of the FOV is lost (hence the visible pink border). I haven't tried to find out what the maximum texture size is, so I stuck to a scale of 1.

The good news is that using the Rift with the Pi can be done! However, don't expect amazing results: with distortion correction on (the minimum needed to see the scene correctly), the frame rate on the simplest possible scene is about 27 fps. At first glance, this doesn't seem that bad. But when head tracking is on, it does feel choppy and uncomfortable. Indeed, the Oculus team says that 60 fps is a minimum (in fact, I believe the next development kit will have a screen with an even higher refresh rate than that).

Getting the code
The code for this little experiment can be found at
Building instructions are provided there.


Edit - 11/26/2014
Since I did this test, things have moved on considerably on the Oculus Rift front. They now seem to provide rendering code that does much cleverer things, like vertex-shader-based distortion correction with a precomputed map (instead of complex pixel shader code). Hopefully this will be very beneficial for the Pi.