Friday, 11 April 2014

Oculus Rift on the Raspberry Pi

I already touched on this subject a while ago, when I got the Oculus Rift SDK compiled for the Raspberry Pi and successfully read the head orientation.

This time, I wanted to render simple 3D geometry for the Rift using the Pi. Is this adorable little machine powerful enough to support the Rift?

As explained in the official Oculus SDK documentation, rendering for the Oculus Rift involves several things:
  • Stereo rendering: rendering the 3D scene twice, once for each eye, side by side
  • Distortion correction: distorting the rendered image in such a way that viewing it through the Rift's lenses makes it look correct 
  • Chromatic aberration correction: this is a bonus step that aims to reduce the color fringes introduced by the lenses

Accelerated 3D on the Raspberry Pi means using OpenGL ES, much like on any mobile platform these days. The Pi supports OpenGL ES 2.0, which is enough to run the shaders that implement the corrections described above. In fact, the SDK comes with "regular" OpenGL shaders that work perfectly on OpenGL ES 2.0. Smashing!

So after some learning and testing, I finally got "Rift-correct" rendering on the Pi. The scene is extremely simple: just a rotating cube floating in front of the user (head tracking is there too). And here is how it looks. Note that I removed the lenses of the Rift in order to film its screen.

Now for some benchmarks
All the tests were done on a Release build using the official vertex and fragment shaders. No attempt was made to optimize anything (honestly, I'm not sure there's much to be done).

  • Rendering the scene twice (no correction shaders): 16 ms/frame
  • Rendering the scene to a texture and drawing that texture on a full-screen quad (no correction shaders): 27 ms/frame
  • Same thing with the distortion correction shader: 36 ms/frame
  • Same thing with both the distortion and chromatic aberration correction shaders: 45 ms/frame
Note: the render-target texture used in the tests was exactly the size of the screen (1280x800). Because of the pinching effect of the distortion shader, it should be larger than that, about 2175x1360 (see the "Distortion Scale" section of the Oculus documentation). Unfortunately this was too much for the Pi. As a result, a good part of the FOV is lost (hence the visible pink border). I haven't tried to find the maximum texture size, so I stuck to a scale of 1.

The good news is that using the Rift with the Pi can be done! However, don't expect amazing results: with distortion correction on (the minimum needed to view a scene correctly), the frame rate on the simplest possible scene is about 27 fps. At first glance that doesn't seem too bad, but with head tracking on it does feel choppy and uncomfortable. Indeed, the Oculus team says that 60 fps is a minimum (in fact, I believe the next development kit will have a screen with an even higher refresh rate than that).

Getting the code
The code of this little experiment can be found on GitHub (the RiftOnThePi project).
Building instructions are provided there.


Edit - 11/26/2014
Since I did this test, things have moved a lot on the Oculus Rift front. It seems they now provide rendering code that does much cleverer things, like vertex-shader-based distortion correction with a precomputed map (instead of complex pixel shader code). Hopefully this will be very beneficial for the Pi.


  1. thanks for sharing... it works!

Is it possible to use the x86 Linux apps for the Oculus on a Raspberry Pi?

  2. thanks.
Unfortunately, the Linux apps from the Oculus SDK are built for the x86 architecture (an Intel PC, for example). They need to be compiled for ARM to run on the Pi.

  3. This comment has been removed by a blog administrator.

  4. Impressive work, thank you! Everything works fine until I try to run it - then I get the error: "Failed to create Device". In your other post you wrote: '...he Sensor is properly detected, and by adding few lines to the initialization code...' - is that why it doesn't work? I followed your instructions on GitHub to the letter and I am clueless as to what I did wrong. Hope you can help!

    1. Hi Alex.
      It's been a while since I wrote this code...
      First question: are you using the exact same Oculus SDK version (0.2.5c) as I did? Not sure how easy it is to get old versions nowadays, but make sure you've got the same one (more recent versions have changed quite a bit and I'm not sure my code would work with them).
      Then, make sure you've connected the Oculus with both USB and HDMI (or DVI), obviously :)
      Try "lsusb" in a terminal to check that the device is detected by the Pi.
      Then maybe try to run the app with "sudo"... but I think it should work without.
      I can't remember which version of Raspbian I had, but I doubt a recent version would cause a problem.
      Hope it helps!

    2. Thank you for the very fast reply! I am not sure how to determine which SDK I have, but my documentation pdf says 0.2.3...
      lsusb shows the Rift, and everything is plugged in :)
      sudo doesn't change anything.
      One thing I forgot to mention was that when running cmake with the -DC... option, the terminal returns that this manually specified variable was ignored. Could that be a problem? Thanks again!

    3. It was in fact the wrong SDK version! Now everything works! Thank you once again!

    4. Hi Alex. Glad it finally works! Enjoy :)

    5. Hi Jacques. Although I'm using SDK 0.2.5c, I'm getting the same error message: "Failed to create Device". I'm using an Oculus DK2 - do you think this may be the issue? Since my Pi B+ has only one HDMI port, I connected the Oculus to the HDMI port and used VNC to connect to the Pi display through my laptop. The Oculus light glows orange when only the USB is connected and alternates between orange and blue when HDMI is also connected. lsusb seems to show that the Pi detects the Oculus as two devices: Bus 001 Device 007: ID 2833: 2021 and Bus 001 Device 008: ID 2833: 0021. Using sudo leads to the same error message. Any idea? :)

    6. Hi Rémi,
      Looking at the Oculus website, it seems they've introduced support for the DK2 starting from SDK 0.3.1.
      Unfortunately that means this RiftOnThePi project most probably doesn't work with the DK2, as it's using the old 0.2.5c SDK - which explains the error you experienced :-(
      I might get back to the code at some point and update it to the most recent SDK, but I don't know when.

    7. Now I understand better. Thank you for this clarification Jacques! :)

  5. Any insight as to why mine is failing to link?

    pi@raspberrypi ~/RiftOnThePi/Build $ make
    [ 10%] Built target OpenGLESSandboxLib
    [ 92%] Built target LibOVR
    Linking CXX executable RiftOnThePi
    /usr/bin/ld: ../Dependencies/LibOVR/libLibOVR.a(OVR_Linux_HMDDevice.cpp.o): undefined reference to symbol 'XFree'
    //usr/lib/arm-linux-gnueabihf/ error adding symbols: DSO missing from command line
    collect2: ld returned 1 exit status
    RiftOnThePi/CMakeFiles/RiftOnThePi.dir/build.make:137: recipe for target 'RiftOnThePi/RiftOnThePi' failed
    make[2]: *** [RiftOnThePi/RiftOnThePi] Error 1
    CMakeFiles/Makefile2:194: recipe for target 'RiftOnThePi/CMakeFiles/RiftOnThePi.dir/all' failed
    make[1]: *** [RiftOnThePi/CMakeFiles/RiftOnThePi.dir/all] Error 2
    Makefile:72: recipe for target 'all' failed
    make: *** [all] Error 2

    1. The error is in libX11 - did you install it? Did you remember to link it in the Makefile with -lX11? I had the same problem at some point and I think this was why it didn't work for me... also try Google, it's a common problem :)
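    For the record: that "DSO missing from command line" error means LibOVR uses X11 symbols (XFree) but libX11 itself is not on the link line. With a CMake build like this one, a sketch of the usual fix (assuming the executable target is named RiftOnThePi, as in the log above) is:

    ```cmake
    # LibOVR calls XFree() but libX11 is not on the link line.
    # Find X11 and link it explicitly into the final executable.
    find_package(X11 REQUIRED)
    target_link_libraries(RiftOnThePi ${X11_X11_LIB})
    ```

    The X11 development package (libx11-dev on Raspbian) must be installed for find_package(X11) to succeed.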

    2. Hi! I have the same problem - could you explain again how to solve it? (I have tried installing libX11.)


    3. I've fixed this X11 link problem. If you get the latest version from GitHub it should work (the breakage was probably due to a change in the Raspbian distribution).