Friday, 11 April 2014

Oculus Rift on the Raspberry Pi

I already touched on this subject a while ago, when I got the Oculus Rift SDK compiled for the Raspberry Pi and successfully read the head orientation.

This time, I wanted to render simple 3D geometry for the Rift using the Pi. Is this adorable little machine powerful enough to support the Rift?

As explained in the official Oculus SDK documentation, rendering for the Oculus Rift involves several things (sketched in code after the list):
  • Stereo rendering: rendering the 3D scene twice, once for each eye, side by side
  • Distortion correction: distorting the rendered image in such a way that viewing it through the Rift lenses makes it look correct 
  • Chromatic aberration correction: this is a bonus step that aims at reducing color fringes introduced by the lenses
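
Put together, this boils down to a per-frame loop along the following lines. This is only a minimal sketch: drawScene() and drawFullScreenQuadWithDistortionShader() are placeholders I made up for the application-specific parts, not SDK or project functions.

    #include <GLES2/gl2.h>

    // Placeholders for the application-specific parts (not actual SDK calls).
    void drawScene(int eyeIndex);                               // renders the 3D scene for one eye
    void drawFullScreenQuadWithDistortionShader(GLuint eyeTexture);

    // One frame of "Rift-correct" rendering: the scene is drawn twice into a
    // texture (one half per eye), then a full-screen pass applies the
    // distortion (and optionally chromatic aberration) correction.
    void renderRiftFrame(GLuint eyeFramebuffer, GLuint eyeTexture,
                         GLsizei texWidth, GLsizei texHeight,
                         GLsizei screenWidth, GLsizei screenHeight)
    {
        // 1+2. Stereo rendering into the off-screen texture
        glBindFramebuffer(GL_FRAMEBUFFER, eyeFramebuffer);
        for (int eye = 0; eye < 2; ++eye)
        {
            glViewport(eye * texWidth / 2, 0, texWidth / 2, texHeight);
            drawScene(eye);   // uses the per-eye view/projection matrices
        }

        // 3. Correction pass on the real screen
        glBindFramebuffer(GL_FRAMEBUFFER, 0);
        glViewport(0, 0, screenWidth, screenHeight);
        drawFullScreenQuadWithDistortionShader(eyeTexture);
    }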

Accelerated 3D on the Raspberry Pi means using OpenGL ES, much like on any mobile platform these days. The Pi supports OpenGL ES 2.0, which is enough for the shaders that implement the corrections described above. In fact, the SDK comes with "regular" OpenGL shaders that work perfectly on OpenGL ES 2.0. Smashing!

So after some learning and testing, I finally got "Rift-correct" rendering on the Pi. The scene is extremely simple: just a rotating cube floating in front of the user (the head tracking is there too). And here is what it looks like. Note that I removed the lenses of the Rift in order to film its screen.



Now for some benchmarks
All the tests were done on a Release build using the official vertex and fragment shaders. No attempt was made to optimize anything (I'm not sure there's much to be done honestly).

  • Rendering the scene twice (no correction shaders): 16 ms/frame
  • Rendering the scene into a texture and rendering that texture on a quad filling the screen (no fancy correction shaders): 27 ms/frame
  • Same thing with distortion correction shader: 36 ms/frame
  • Same thing with both distortion and chromatic aberration correction: 45 ms/frame
Note: the render-target texture used in these tests was exactly the size of the screen (1280x800). Because of the pinching effect of the distortion shader, it should be larger than that, about 2175x1360 (see the "Distortion Scale" section of the Oculus doc). Unfortunately this was too much for the Pi. As a result, a good part of the FOV is lost (hence the visible pink border). I haven't tried to find out what the maximum texture size is, so I stuck to a scale of 1. A minimal render-target setup is sketched below for reference.
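
For reference, here is roughly how such a render-target texture can be created with standard OpenGL ES 2.0 calls. This is a sketch only; error handling is minimal and the depth renderbuffer is optional depending on the scene.

    #include <GLES2/gl2.h>

    // Create a width x height render target (texture + framebuffer object).
    // Returns the framebuffer id, or 0 if the size is not supported.
    GLuint createRenderTarget(GLsizei width, GLsizei height, GLuint* outTexture)
    {
        GLuint texture = 0;
        glGenTextures(1, &texture);
        glBindTexture(GL_TEXTURE_2D, texture);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, NULL);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

        GLuint depthBuffer = 0;
        glGenRenderbuffers(1, &depthBuffer);
        glBindRenderbuffer(GL_RENDERBUFFER, depthBuffer);
        glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16, width, height);

        GLuint framebuffer = 0;
        glGenFramebuffers(1, &framebuffer);
        glBindFramebuffer(GL_FRAMEBUFFER, framebuffer);
        glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                               GL_TEXTURE_2D, texture, 0);
        glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                                  GL_RENDERBUFFER, depthBuffer);

        if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
            return 0;   // the Pi may refuse large sizes, hence the scale of 1

        *outTexture = texture;
        return framebuffer;
    }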

Conclusion
The good news is that using the Rift with the Pi can be done! However, don't expect amazing results: with distortion correction on (the minimum needed to see a scene correctly), the frame rate on the simplest possible scene is about 27 fps. At first glance, this doesn't seem that bad. But when head tracking is on, it does feel choppy and uncomfortable. Indeed, the Oculus team says that 60 fps is a minimum (in fact, I believe the next development kit will have a screen with an even higher refresh rate than that).

Getting the code
The code of this little experiment can be found at https://github.com/jbitoniau/RiftOnThePi
Building instructions are provided there.

Enjoy!



Edit - 11/26/2014
Since I did this test, things have moved a lot on the Oculus Rift front. It seems they're now providing rendering code that does much cleverer things, like vertex-shader based distortion correction with a precomputed map (instead of complex pixel shader code). Hopefully this should be very beneficial for the Pi.
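
I haven't tried it yet, but as I understand it the idea is along these lines (an illustrative sketch only, with made-up names; nothing here comes from the actual SDK):

    // Mesh-based distortion: instead of evaluating the distortion per pixel in
    // the fragment shader, a coarse grid of vertices is built once at startup.
    // Each vertex stores the texture coordinates it should sample, precomputed
    // on the CPU with the same distortion function. The vertex shader then only
    // passes these values through, and the GPU interpolates them across each
    // triangle, removing most of the expensive per-pixel math.
    struct DistortionVertex
    {
        float screenX, screenY;   // position on the Rift screen
        float texU, texV;         // precomputed "distorted" texture coordinates
    };

    // A buildDistortionMesh() step would evaluate the distortion polynomial for
    // each grid vertex and fill a vertex buffer with DistortionVertex entries.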

Thursday, 27 February 2014

Pan-Tilt FPV using the Oculus Rift

In a previous experiment, I used the head-orientation of the Oculus Rift to drive two servos moving an FPV camera. That was a good start but not very useful as the FPV video feed wasn't displayed in the Rift.

After putting more work into this project, I finally got a functional FPV or tele-presence system that makes the most of the Rift's immersivity (if that's a word). The system relies on various bits and pieces that can't possibly be better explained than with a diagram!


The result
The result is indeed surprisingly immersive. I initially feared that the movement of the FPV camera would lag behind, but it's not the case: the servos react quickly enough. Also, the large field of view of the Rift is put to good use thanks to the lens I used on the FPV camera.



Some technical notes
The wide-FOV lens I use on the FPV camera causes significant barrel distortion in the captured image. After calibrating the camera (using Agisoft Lens), I implemented a shader to correct this in real time.
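
The correction itself is essentially the classic radial (Brown) distortion model applied per pixel. Written as plain C++ rather than shader code (and with the understanding that the sign and magnitude of the k1/k2 coefficients come from the calibration, the values in the usage comment being pure placeholders), it looks roughly like this:

    // Given a pixel position in normalized coordinates (centered on the image),
    // compute where to sample the captured image so that the radial distortion
    // of the lens is compensated for.
    void correctRadialDistortion(float x, float y,
                                 float k1, float k2,      // from camera calibration
                                 float& outX, float& outY)
    {
        float r2 = x * x + y * y;                          // squared distance from center
        float factor = 1.0f + k1 * r2 + k2 * r2 * r2;
        outX = x * factor;
        outY = y * factor;
    }

    // Example with made-up coefficients:
    //   float u, v;
    //   correctRadialDistortion(0.5f, 0.25f, -0.18f, 0.03f, u, v);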

I use Ogre 3D and Kojack's Oculus code to produce the type of image expected by the Rift. In the Ogre 3D scene, I simply create a 3D quad mapped with the captured image and place it in front of the "virtual" head. Kojack's Rift code takes care of rendering the scene in two viewports (one for each eye). It also performs another distortion correction step which, this time, compensates for the Rift lenses in front of each eye. Lastly, it provides me with the user's head orientation, which is translated further down the chain into servo positions for moving the FPV camera.
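
For the curious, creating such a quad in Ogre 3D is straightforward with a ManualObject. This is a minimal sketch: the material name, sizes and node names are made up, and the exact parenting depends on how the camera rig is set up.

    // Build a textured quad showing the captured video and place it in front
    // of the "virtual" head.
    Ogre::ManualObject* quad = sceneManager->createManualObject("VideoQuad");
    quad->begin("VideoFeedMaterial", Ogre::RenderOperation::OT_TRIANGLE_STRIP);
    quad->position(-1.6f,  1.0f, 0.0f);  quad->textureCoord(0.0f, 0.0f);
    quad->position(-1.6f, -1.0f, 0.0f);  quad->textureCoord(0.0f, 1.0f);
    quad->position( 1.6f,  1.0f, 0.0f);  quad->textureCoord(1.0f, 0.0f);
    quad->position( 1.6f, -1.0f, 0.0f);  quad->textureCoord(1.0f, 1.0f);
    quad->end();

    Ogre::SceneNode* quadNode =
        sceneManager->getRootSceneNode()->createChildSceneNode("VideoQuadNode");
    quadNode->attachObject(quad);
    quadNode->setPosition(0.0f, 0.0f, -2.0f);   // a couple of units in front of the head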

As the camera is physically servo-controlled only in yaw and pitch, I apply the head roll to the 3D quad displaying the captured image (in the opposite direction), as shown in the snippet below. This actually works really well (thanks Mathieu for the idea!). I'm not aware of any commercial RC FPV system that does that.
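
Applying the counter-roll then comes down to one line on that quad's node (again just a sketch; headRoll is assumed to be the roll angle, in radians, coming from the Rift):

    // Rotate the video quad around the view axis by the opposite of the user's
    // head roll, so the horizon stays level inside the Rift.
    quadNode->setOrientation(Ogre::Quaternion(Ogre::Radian(-headRoll),
                                              Ogre::Vector3::UNIT_Z));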

And ideas for future developments...
One of the downsides of the system is the poor video quality. This comes from several things:
  • the source video feed is rather low resolution,
  • the wireless transmission adds some noise,
  • the analog-to-digital conversion is performed with a cheap USB dongle.
Going fully-digital could theoretically solve these problems:
  • for example, using the Raspberry Pi camera as a source: the resolution and image quality would be better. It is also much lighter than the Sony CCD. It doesn't have a large FOV though (but this can be worked around)
  • transmitting over WiFi would avoid using a separate wireless system. But what kind of low-latency codec to use then? Also, range is an issue (though a directional antenna and tracking could help)
  • the image manipulated by the receiving computer would directly be digital, so no more composite video capture step.
Another problem with the current system is that the receiving end relies on a PC. It would be far more transportable if it could run on a small computer like the Raspberry Pi (which could probably be worn on the user's belt).

I should also get rid of the Pololu Maestro module on the transmitter end, as I've already successfully used the Raspberry Pi GPIO to generate PWM signals in the past.

Lastly, it would be fantastic to capture with two cameras and use the Rift stereoscopic display.

So there's still some room for improvement! Any advice is welcome.

The receiver-end (A/V receiver on the left, Wifi-Pi on the right)

Friday, 21 February 2014

Video latency investigation Part 2

In a previous article I investigated the latency of different video capture and display systems. Wireless video transmission was left aside at that time.

Since then, I got my hands on a brand new RC video transmission kit and also a very old CRT TV...

Portable CRT TV
(composite input)
TS832 and RC832
5.8GHz AV Transmitter & Receiver


I (re)did some of the tests on the LCD TV I used in the previous article, just to check the reproducibility of the previous measurements. Unfortunately its composite input died when I started the tests, so I switched to the SCART input using an adapter.

Here are the results:
Setup                                                             Average latency in ms (4 measurements)
Sony CCD + LCD TV (composite) (same test as previous article)     41
Sony CCD + LCD TV (SCART)                                         31
Sony CCD + AV transmission + LCD TV (SCART)                       31
Sony CCD + CRT TV                                                 15

Some more conclusions:

  • The latency induced by a CRT display is incredibly small!
  • The 5.8GHz AV transmission kit doesn't add any measurable latency to the system
  • Weirdly enough (at least on this particular LCD TV), the latency decreases by about 10 ms when using the SCART input instead of the composite one.

Thursday, 20 February 2014

Raspberry Pi Flight Controller Part 2 : Single-axis "acro" PID control

After dealing with inputs/outputs it's time for some basic flight control code!

The best explanation I've found about quadcopter flight control is Gareth Owen's article called "How to build your own Quadcopter Autopilot / Flight Controller". Basically, you can control a quadcopter in two different ways:

  • In "Acrobatic" or "Rate" mode: the user's input (i.e. moving the sticks on the transmitter) tells the controller at what rate/speed the aircraft should rotate on each axis (yaw, pitch and roll). In this mode the user is constantly adjusting the quadcopter rotational speed which is a bit tricky but allows acrobatic maneuvers, hence it's name! The "acro" controller is the easiest you can implement and it only requires a gyroscope in terms of hardware. The basic KK board I was using previously is an Acro controller.
  • In "Stabilized" mode: this time the user's inputs indicate the angles on each axis that the aircraft should hold. This is far easier to pilot: for example: if you center the sticks, the aircraft levels. Technically, a stabilized flight controller internally uses an Acro flight controller. In terms of hardware, in addition to the gyroscope, this controller needs an accelerometer (to distinguish up from down), and optionally a magnetometer (to get an absolute reference frame).
So let's start at the very beginning: an Acro flight controller working on a single axis. Here's the kind of loop behind such a controller (a code sketch follows the list):
  • Get the latest user's inputs:
    • The desired rotation rate around the axis (in degrees per second)
    • The throttle value (in my case, unit-less "motor-power" between 0 and 1)
  • Get a reading from the gyroscope (a rotation rate in degrees per second)
  • Determine the difference between the measured rotation rate and the desired one: this is the error (still in degrees per second)
  • Use the PID magic on this error to calculate the amount of motor-power to apply to correct it. This is in the same unit as the throttle.
  • Add this correction to the throttle value and send the result to one motor; subtract the correction from the throttle value and send the result to the second one (the [0..1] value is converted into a pulse width in microseconds, i.e. a PWM signal that the ESCs understand)
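
Putting those steps together, one iteration of the loop looks roughly like this. It's a simplified sketch with made-up names: the read/send functions are placeholders for the GPIO/I2C code from the previous post, not the actual project code.

    // Placeholders for the I/O described in the previous post (GPIO / I2C).
    float readUserDesiredRateDegPerSec();
    float readUserThrottle();                               // unit-less [0..1]
    float readGyroRateDegPerSec();
    void  sendPulseWidthMicroseconds(int motorIndex, float microseconds);

    static float clamp01(float v) { return v < 0.0f ? 0.0f : (v > 1.0f ? 1.0f : v); }

    // Very simplified single-axis "acro" rate controller (P, I and D terms).
    struct RatePid
    {
        float kp, ki, kd;
        float integral = 0.0f;
        float previousError = 0.0f;

        // error is in degrees/second, the result is a unit-less motor-power correction
        float update(float error, float dt)
        {
            integral += error * dt;
            float derivative = (error - previousError) / dt;
            previousError = error;
            return kp * error + ki * integral + kd * derivative;
        }
    };

    void flightControlStep(RatePid& pid, float dt /* 1/125 s */)
    {
        float desiredRate  = readUserDesiredRateDegPerSec();   // user input
        float throttle     = readUserThrottle();               // [0..1] motor-power
        float measuredRate = readGyroRateDegPerSec();           // gyroscope reading

        float error = desiredRate - measuredRate;               // deg/s
        float correction = pid.update(error, dt);                // motor-power units

        float motorA = clamp01(throttle + correction);           // one side of the axis
        float motorB = clamp01(throttle - correction);           // the other side

        // Convert [0..1] motor-power into a pulse width the ESCs understand
        sendPulseWidthMicroseconds(0, 1000.0f + motorA * 1000.0f);
        sendPulseWidthMicroseconds(1, 1000.0f + motorB * 1000.0f);
    }
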
And repeat this loop as frequently as possible (for me: the flight control is done at 125Hz, the gyro reading at 250Hz, and the user's input reading at 50Hz). And this is what the result looks like:


Graph colors:

  • green: gyroscope reading
  • blue: user's "desired" angular rate 
  • red: throttle value
Notice how the quadcopter rotates at constant speed when the desired angular rate stays at the same non-zero value.

Saturday, 18 January 2014

Raspberry Pi Flight Controller Part 1 : Inputs and Outputs via GPIO

I'm now focused on my next mission: turning the Raspberry Pi into a quadcopter flight controller (this has already been done by many people, but I'd like to try it myself for the fun of it). If successful, I'll be able to get rid of the good old KK board.

The first step is to handle the I/O, namely:
  • getting inputs from an IMU (gyroscope, accelerometer and more...)
  • sending output PWM signals to the ESCs/motors

In the past, I managed to do both using USB modules (a Phidget Spatial as an IMU and a Pololu Maestro as a PWM generator). Now I'd like to get rid of these too and do everything using the Raspberry Pi GPIO.

For the IMU part, I bought a Drotek IMU 10DOF V2. This I²C board has the following components:
  • MPU6050 gyroscope and accelerometer
  • HMC5883 magnetometer
  • MS5611-01BA barometer/temperature sensor
For only 20 euro-bucks, that's a bargain!

The code I wrote to support these components uses Jeff Rowberg's excellent I2C Device Library. This lib was originally written for Arduino, but someone ported the I2C communication code to the Raspberry Pi. In theory, all the devices supported by the I2C Device Library should work on the Pi with minimal effort!
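
Reading the gyroscope through this library is then pretty simple, something along these lines (a sketch based on the library's MPU6050 class; the raw-value scale of 131 LSB per degree/second assumes the default ±250°/s full-scale range):

    #include "I2Cdev.h"
    #include "MPU6050.h"

    MPU6050 mpu;     // default I2C address (0x68)

    void setupImu()
    {
        mpu.initialize();
        // mpu.testConnection() can be used to check that the device answers
    }

    void readGyro(float& xDegPerSec, float& yDegPerSec, float& zDegPerSec)
    {
        int16_t gx, gy, gz;
        mpu.getRotation(&gx, &gy, &gz);
        // At the default +/-250 deg/s range, the scale is 131 LSB per deg/s
        xDegPerSec = gx / 131.0f;
        yDegPerSec = gy / 131.0f;
        zDegPerSec = gz / 131.0f;
    }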

For the PWM part, the Raspberry Pi can generate PWM signals in hardware without any problem. I used the C code from the also excellent RPIO.PWM library.
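
Driving an ESC from the GPIO with that C code then looks roughly like this. This is only a sketch: the function names are the ones I remember from the library's C demo and may differ between versions, and GPIO 18 / the 20 ms subcycle are just example values.

    #include "pwm.h"   // the C part of RPIO.PWM

    // Set up DMA-assisted PWM and send a 1500 us pulse to an ESC on GPIO 18.
    void setupEscOnGpio18()
    {
        setup(10, DELAY_VIA_PWM);         // 10 us pulse-width increment granularity
        init_channel(0, 20000);           // DMA channel 0, 20 ms subcycle (50 Hz)
        // The pulse width is expressed in increments of 10 us:
        add_channel_pulse(0, 18, 0, 150); // start at 0, 150 * 10 us = 1500 us
    }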

I've wrapped all this code in my own hardware abstraction library called Piha. This allows me, for example, to swap PWM generation from the GPIO to the Pololu Maestro in pretty much one line of code. Piha has its own user interface to display recognized devices and play with them. It now supports quite a wide variety of devices/components/APIs; I shall present it in another post someday.
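
I won't detail Piha here, but the idea behind that one-line swap is a classic abstract interface (illustrative only; these are not the actual Piha class names):

    // Hypothetical illustration of the abstraction: the flight code only ever
    // talks to PwmGenerator, so the concrete backend can be swapped freely.
    class PwmGenerator
    {
    public:
        virtual ~PwmGenerator() {}
        virtual void setPulseWidth(int channel, float microseconds) = 0;
    };

    class GpioPwmGenerator : public PwmGenerator   // uses RPIO.PWM's C code
    {
    public:
        void setPulseWidth(int channel, float microseconds) override;
    };

    class MaestroPwmGenerator : public PwmGenerator   // uses the Pololu Maestro
    {
    public:
        void setPulseWidth(int channel, float microseconds) override;
    };

    // Swapping the backend really is one line:
    // PwmGenerator* pwm = new GpioPwmGenerator();
    // PwmGenerator* pwm = new MaestroPwmGenerator();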

Here are some pictures of the setup. Only one ESC is connected to the GPIO for now.




And here it is live!