Saturday, 16 November 2013

Android remote for camera servos

Here is my first step in the Android world...

The app sends yaw/pitch angles as UDP packets to a remote computer that controls two servos (via a Pololu Maestro module).

In fact, the remote computer runs the exact same code I used for my Raspberry-Pi-WiFi-Quadcopter. Indeed, the packets I send also contain throttle, rudder, elevators and ailerons controls (all set to 0 here). So in theory, I could just add 4 sliders to the app and it could fly the quadcopter :) but that wouldn't be very practical or safe! It'd be better to use virtual joysticks (touchscreen-based), or maybe sensors like the gyro.

There are still a few things I need to improve in the code (like a loop that sends the packets regularly instead of in response to UI events only), but this is already a good starting point...
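
For the curious, here is roughly what the sender side could look like in C++. This is a minimal sketch, not my exact code: the struct layout, port and address are illustrative, and it shows the fixed-rate loop mentioned above instead of UI-driven sends.

#include <netinet/in.h>
#include <arpa/inet.h>
#include <sys/socket.h>
#include <unistd.h>

// Illustrative packet layout: yaw/pitch drive the camera servos,
// the 4 flight controls are set to 0 for now.
struct ControlPacket {
    float yaw, pitch;
    float throttle, rudder, elevators, ailerons;
};

int main() {
    int sock = socket(AF_INET, SOCK_DGRAM, 0);

    sockaddr_in dest = {};
    dest.sin_family = AF_INET;
    dest.sin_port   = htons(5000);                      // hypothetical port
    inet_pton(AF_INET, "192.168.1.10", &dest.sin_addr); // hypothetical address

    ControlPacket pkt = {};
    for (;;) {
        pkt.yaw   = 10.0f;  // in the real app, updated from the UI
        pkt.pitch = -5.0f;
        sendto(sock, &pkt, sizeof(pkt), 0, (sockaddr*)&dest, sizeof(dest));
        usleep(20 * 1000);  // send at a steady 50 Hz, not just on UI events
    }
}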

Sunday, 3 November 2013

GoPro Hero 3 Black Edition battery life

In this test, the GoPro is recording a video for as long as possible (the camera shuts itself down when there's no battery left). Here are the results:

Video settings     WiFi   Time
1920x1080 50fps    Off    1h20
1920x1080 50fps    On     55min
1920x1080 24fps    Off    1h38
1920x1080 24fps    On     1h19

A bit disappointing. You'd better have spare batteries!

Monday, 28 October 2013

Video latency investigation

Following my first FPV flight, I wanted to get a better idea of the latency of the video feed. How does the Oculus Rift compare to a regular RC FPV system? What's the latency of different cameras and displays?

In this experiment, I only consider the image capture and display stages, not the wireless transmission stage involved in an FPV system. Why not? Because I fried my 5.8GHz transmitter, so I'm waiting for a new one!!!

OK, so the experiment is pretty straightforward:
  • A display (TV or monitor) shows "in real time" what a camera sees
  • I drop a ping-pong ball in front of the camera
  • At the same time, I film the actual physical event AND what the display shows with another camera
  • Then, by watching the recorded video, I can spot the exact frame where the ball bounces on the floor in real life, and the frame (a few milliseconds later) where the display shows the same event. The difference between these two times gives me the latency of the entire image-capture-and-display system
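
Here's that last step as a quick worked example, with made-up frame numbers (the camera and its frame rate are described just below):

#include <cstdio>

int main() {
    const double fps = 240.0;            // recording frame rate (see below)
    const int realBounceFrame   = 1200;  // made-up frame indices
    const int screenBounceFrame = 1209;
    double latencyMs = (screenBounceFrame - realBounceFrame) * 1000.0 / fps;
    printf("latency = %.1f ms\n", latencyMs); // 9 frames -> 37.5 ms
}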

The camera I use for filming the whole scene is a GoPro Hero 3 set to its minimum resolution (WVGA 848x480) and maximum frame rate (240 frames per second). This rate is huge: each frame of the video represents only about 4 milliseconds! This sounds great in theory in terms of precision, but in practice things are not that clear when it comes to measuring the exact time of the "second" bounce on the TV/monitor. Indeed, due to the nature of image sensors and LCD screens, images of a rapid event look blurry. This can be seen in the following sample footage (resampled at 25 fps so it plays back in slow motion):


Here are the parts I've combined in the tests:


  • Sony CCD: a traditional analog camera used in FPV systems
  • GoPro Hero 3: also used in FPV systems (analog live output)
  • EasyCap USB adapter: analog video input for the PC
  • Sony PS3 Eye: often used in augmented reality or vision applications
  • Raspberry Pi camera
  • Funai LCD TV: an old-school 4:3 TV
  • LG PC monitor
  • Oculus Rift

And here are the combinations I've tried and the corresponding results. Once again, these are pretty "crude" latencies, but they allow comparison of various pieces of equipment.

Setup                                           Average latency (ms)
Sony CCD + LCD TV                               37
GoPro Hero 3 + LCD TV                           54
Sony CCD + EasyCap + PC + Monitor               120
GoPro Hero 3 + EasyCap + PC + Monitor           143
Sony CCD + EasyCap + PC + Oculus Rift           123
GoPro Hero 3 + EasyCap + PC + Oculus Rift       131
Sony PS3 Eye (640x480 75Hz) + PC + Monitor      160
Sony PS3 Eye (320x240 187Hz) + PC + Monitor     58
Pi Camera (default HD) + Raspberry Pi + LCD TV  98
Pi Camera (160x120) + Raspberry Pi + LCD TV     90

In conclusion:

  • You can't beat a traditional "analog" RC FPV system, as its latency is around 40 ms. Note again that we're not counting the wireless transmission here, just the image capture and display.
  • Adding a PC between the camera and the display (USB capture + rendering) adds 80 to 100 ms... In total we get about 150 ms. That's 3 to 4 times the latency of an RC system!
  • The webcams I've tried (i.e. cameras that provide the computer with digital images directly) don't do very well either. Decreasing the resolution can help though: that's the case for the PS3 Eye at 320x240.

Once I receive my 5.8GHz video transmitter, I'll be able to do a couple more tests with it (inserting it between the camera and the display). We'll see how much latency it adds (I suspect not much).



Wednesday, 9 October 2013

My first Oculus-based FPV!

At last, I was able to do my first FPV flight ever! It's not the usual FPV system here, as I'm using an Oculus Rift for the display. The camera on the tricopter is a GoPro Hero 3. Both of these have a huge field of view, so the feeling of immersion is just great!

On the downside though, this requires quite a bit of hardware (a laptop, a USB video capture device, the Rift itself, batteries...) and software (home-made in my case). So it looks quite messy and is not (yet!) as practical as a commercial out-of-the-box FPV solution, but I'm working on it. There's also a bit of lag in the video feed, most probably due to the video capture stage. It's still flyable though.

The reception/display setup

Here are some of the videos recorded by the GoPro. Note that these are not what I'm seeing live when flying: the transmitted video is noisier and lower in resolution.







Monday, 7 October 2013

Testing the GoPro Hero 3 on a tricopter

Here are my first videos taken with a GoPro Hero 3 (Black Edition) mounted on a tricopter (RCExplorer 2.5 style). These are still visual flights. It was getting dark, so the quality is not great.


I managed to lose one of the propellers at the end... (ahem!).

Thursday, 3 October 2013

Getting closer to FPV!

The standard waterproof GoPro case is very nice, but it's quite heavy (about 105 grams with the mount at the base) and you can't use the live video output cable at the same time. Nonetheless, I tried to attach it to the tricopter just to see how it'd fit...


In this configuration it would work fine for taking videos but it wouldn't be possible to fly FPV. So I designed an alternative casing out of foam (the kind used to protect things in transport boxes). Pretty crude but it does the job...



I'll try to come up with a safer way of fixing the camera to the tricopter, something less rubber-bandy! I couldn't resist doing a quick flight test (in my living room. Don't try this at home!). It's looking quite promising: there doesn't seem to be much vibration.


Saturday, 14 September 2013

Powering the Oculus Rift from a lipo battery

In order to make the Rift more transportable I wanted to get rid of the wall power supply. As I had a lipo battery and a spare UBEC lying around I decided to give them a go.
A 2-cell lipo battery (1200 mAh)
The battery output is 7.4V, so it needs to be reduced to around 5V. This is done using a UBEC from the RC world.
UBEC with 5V or 6V output (selectable)
The UBEC comes with bare output wires. As I use JST connectors a lot, I've soldered a male connector for the output (same as the battery output connector).

And finally, I just had to make a small JST-DC coaxial adapter cable. I think the DC connector I used is a type M (2.1mm inside diameter).

JST to DC coaxial cable
Once connected the whole thing looks like this (a bit cumbersome)...

The output reads about 5.2V which is very close to the output of the wall power supply, and the Oculus is pretty happy with it!

It's important not to power the Rift with an empty battery as the voltage would be too low. To detect that, you can use a low-voltage alarm plugged into the charge lead of the battery. Something like this:

So all this works OK, but I'll probably go for a simpler solution, like a USB-to-DC cable as mentioned on the Oculus forums. It'll draw power from a spare USB port on the laptop/computer. A computer is necessary anyway for producing the images in the first place.

Sunday, 14 July 2013

Quadricopter training school!

Controlling a quadricopter with a computer joystick is cool, but controlling it with two joysticks doubles the fun! The goal here is of course to use the Wi-Fi transmission system as a way to teach people how to fly.

Maverick feels good about it
Here is how it works:
  • On the ground, two joysticks are connected to the transmitter computer (the laptop or the Raspberry Pi as explained in a previous post)
  • One joystick goes to the "instructor" (me!) and the other one to the "student"
  • The instructor can take total control of the aircraft at any time by holding a particular button down. 
  • When this button is not held, the commands are shared between the instructor and the student. The student is assigned a well-defined subset of the 4 basic commands (throttle, rudder, elevators, ailerons) and the instructor is implicitly assigned the remaining ones.
Here's a concrete example: let's say we're at the beginning of the "training". I set up the transmitter so the student is only given one command on his joystick, say the elevators. His goal is to understand its effect on the aircraft (basically, it makes the aircraft lean forward and backward). This means that while he's concentrating on it, the instructor (still me!) has to handle the three other commands (throttle, rudder and ailerons). Typically I'll try my best to hover a meter or two above ground as steadily as possible, so the role of the elevators in this example is clear to the student.

After a while, when the student appears to be comfortable with a command, I give him another one to try (combinations are possible).

Of course, if he makes a mistake and the aircraft starts going nuts, I can take back full control with the push of a button! 
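
The sharing logic on the transmitter boils down to a few lines. Here is the gist of it in C++ (a sketch: the names and the bitmask representation are mine for illustration, not the actual code):

struct Commands {
    float throttle, rudder, elevators, ailerons;
};

enum { THROTTLE = 1, RUDDER = 2, ELEVATORS = 4, AILERONS = 8 };

// Blend the two joysticks: the student only contributes the channels
// he's been assigned; the instructor gets everything else, and
// everything when the override button is held.
Commands mix(const Commands& instructor, const Commands& student,
             unsigned studentChannels, bool overrideHeld) {
    if (overrideHeld)
        return instructor;
    Commands out = instructor;
    if (studentChannels & THROTTLE)  out.throttle  = student.throttle;
    if (studentChannels & RUDDER)    out.rudder    = student.rudder;
    if (studentChannels & ELEVATORS) out.elevators = student.elevators;
    if (studentChannels & AILERONS)  out.ailerons  = student.ailerons;
    return out;
}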

Here's my 6-step weight loss training program. 
The joysticks use the RC Mode 2 layout

  1. Elevators only (pitch)
  2. Elevators and ailerons (roll). The whole right stick on the controller.
  3. Throttle only
  4. Throttle and rudder (yaw). The whole left stick.
  5. Rudder, elevators and ailerons
  6. EVERYTHING!
The whole experiment took place with three different guinea pigs (thanks Daniel, Cédric and Cécile!) and it worked like a charm! Well almost... only a single benign crash at step 6 :) I won't give names!

Here are some photos and footage!






Three, two, one, go!




Saturday, 13 July 2013

Oculus Rift, servo-controlled camera and a bit of Raspberry Pi

I got my hands on an Oculus Rift this week (thanks Paul!). The OVR SDK is really neat and clean!

This is what I've done so far:


This runs on my desktop PC. Just out of curiosity, I tried the OVR SDK on the Raspberry Pi as well.

The ARM architecture isn't officially supported by the Oculus guys, but it compiled almost OK. The only thing I had to change was a bit of assembler code. As far as I understand, the "dmb" instruction doesn't exist on armv6 (which is the Pi's instruction set). After a bit of research, here is the black-magic alternative I've found:

In file Kernel/OVR_Atomic.h, struct AtomicOpsRawBase (around line 100):

#elif defined(OVR_CPU_ARM)
// http://www.jonmasters.org/blog/2012/11/13/arm-atomic-operations/
// ARMv6 equivalent of "dmb": a Data Memory Barrier via the CP15 coprocessor
#define MB()  __asm__ __volatile__ ("mcr p15, 0, r0, c7, c10, 5" : : : "memory")

struct FullSync { inline FullSync() { MB(); } ~FullSync() { MB(); } };
struct AcquireSync { inline AcquireSync() { } ~AcquireSync() { MB(); } };
struct ReleaseSync { inline ReleaseSync() { MB(); } };

Another thing I didn't fix was the name of the output directory for the lib which is still "i386".

Unfortunately, the OculusWorldDemo doesn't work. First, it doesn't find the HMD device. Also there seems to be a more global issue with X11+OpenGL on the Raspberry Pi. I haven't had a chance to look at that yet.

On the good side though, the Sensor is properly detected, and by adding a few lines to the initialization code, I could get some readings...  (oh, and thanks Synergy for the remote mouse! Instructions to build it for the Pi here)


The Sensor is not particularly useful on its own, but it means that the servo-controlled camera test above would also work on the Pi.
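
For reference, "a few lines" means roughly the snippet below with the 0.2.x SDK. I'm writing this from memory, so treat it as a sketch and check it against the SDK headers rather than copy-pasting it:

#include "OVR.h"
#include <cstdio>
#include <unistd.h>

int main() {
    OVR::System::Init(OVR::Log::ConfigureDefaultLog(OVR::LogMask_All));
    {
        OVR::Ptr<OVR::DeviceManager> manager = *OVR::DeviceManager::Create();
        OVR::Ptr<OVR::SensorDevice> sensor =
            *manager->EnumerateDevices<OVR::SensorDevice>().CreateDevice();
        if (sensor) {
            OVR::SensorFusion fusion;
            fusion.AttachToSensor(sensor);
            for (int i = 0; i < 100; ++i) {
                OVR::Quatf q = fusion.GetOrientation();
                float yaw, pitch, roll;
                q.GetEulerAngles<OVR::Axis_Y, OVR::Axis_X, OVR::Axis_Z>(
                    &yaw, &pitch, &roll);
                printf("yaw=%.2f pitch=%.2f roll=%.2f\n", yaw, pitch, roll);
                usleep(100 * 1000);
            }
        }
    }   // release the Ptr's before shutting the SDK down
    OVR::System::Destroy();
    return 0;
}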

Now I need to work on displaying useful stuff on the Rift. You've probably guessed where this is all going...

Tuesday, 4 June 2013

Using two Raspberry Pis to control a quadricopter

After the tricopter adventure I decided to go quadri! This means more payload and it also makes it easier/cleaner to strap various electronics on it and prototype things. Here's the result:


Back to the Raspberry Pi + WiFi transmission system: I've made some good progress. Initially the transmitter software only worked on Windows, so I ported it to Linux. It now runs on a second Raspberry Pi, aka "the ground station"! I'm using the Linux joystick API, via my own C++ object wrapper, and it works nicely with every game controller I have in store, including the XBox one...


...and an RC transmitter connected as a USB joystick (so no radio transmission involved!). The laptop in the picture is used to SSH into the two Pis.
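
By the way, the Linux joystick API is refreshingly simple: you just read fixed-size js_event structs from /dev/input/jsX. A minimal sketch of what my wrapper does underneath (no error handling):

#include <fcntl.h>
#include <unistd.h>
#include <linux/joystick.h>
#include <cstdio>

int main() {
    int fd = open("/dev/input/js0", O_RDONLY); // first joystick
    js_event e;
    while (read(fd, &e, sizeof(e)) == sizeof(e)) {
        if ((e.type & ~JS_EVENT_INIT) == JS_EVENT_AXIS)
            printf("axis %d -> %d\n", e.number, e.value);   // -32767..32767
        else if ((e.type & ~JS_EVENT_INIT) == JS_EVENT_BUTTON)
            printf("button %d -> %d\n", e.number, e.value); // 0 or 1
    }
    close(fd);
}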


I had to do a few quick tests before finding decent values for the gains I apply to the different controls on the joysticks (throttle, rudder, etc...). In a nutshell, a stick returns a value between, say, -0.5 and 0.5, and I had to convert that into adequate PWM values for the KK board. The problem is that a thumbstick on an XBox controller is physically very different from a stick on an RC transmitter: it's much smaller, so it moves faster under your thumb and you're less precise. It needs to be less sensitive than the RC stick.
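
The conversion itself is nothing fancy. Something along these lines (the gain/expo shaping is one example of how to tame a small thumbstick, not my exact tuned settings):

// Map a normalized stick value (here in [-0.5, 0.5]) to an RC-style PWM
// pulse width in microseconds (1000-2000, centred on 1500).
// 'gain' scales overall sensitivity; 'expo' softens the response around
// the centre, which helps with the short, fast throw of an XBox thumbstick.
int stickToPwm(float stick, float gain, float expo) {
    float x = stick * 2.0f;                    // [-0.5, 0.5] -> [-1, 1]
    x = (1.0f - expo) * x + expo * x * x * x;  // classic RC expo curve
    x *= gain;
    if (x >  1.0f) x =  1.0f;                  // clamp to valid range
    if (x < -1.0f) x = -1.0f;
    return 1500 + static_cast<int>(x * 500.0f);
}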

Anyway, the flying experience proved to be pretty decent.


Oh, I forgot to say that in order to investigate latency, I first tried a wired version of the transmission system: the flying Pi was controlled with a joystick plugged directly into it. In other words, this was a WC-quadricopter!


Next up: dual-command and quadricopter training school!

Sunday, 31 March 2013

Using a Raspberry Pi + WiFi as transmission for a tricopter

I finally completed my little challenge of replacing the standard 2.4GHz RC transmission system of my tricopter with a home-made WiFi-based system. This is completely overkill and useless, but it was pretty amusing to put in place! By the way, that's the exact opposite of what some people have done with their Parrot drone transmission!

Here are a few diagrams to illustrate what I've done (I like diagrams!). The following is a block diagram of the "standard" tricopter:


And here is the same thing with the new "transmission" block:

Concretely, I've replaced these simple and effective pieces of equipment:

With this (on the transmitting end):

And this (on the receiving end):

Funnily enough, it's possible to connect an RC transmitter to the laptop via USB: it's perceived as a gaming device... Yes, I realize how silly this all is :)!

Strapping all this onto the tricopter was a bit of a mess (the USB cable takes up so much space and weight!)


But here's a successful 5-second test flight :)

The overall control felt a bit sluggish... It might be the extra weight or, more likely, latency introduced by the whole system. Anyway, I need to run some more tests!


Tuesday, 15 January 2013

Steady data transmission over WIFI - Part 4 - Other weirdnesses

So far, 1024 bytes every 20 ms represents a meagre 51 KB/s. That's more than enough for transmitting RC commands, but with 802.11g WiFi it should be possible to peak at about 1 MB/s in perfect conditions (which might be incompatible with my need for low-latency, steady transmission, but anyway...).

Just out of curiosity, I naively tried to increase the amount of data sent per second (x4). Unfortunately it didn't go too well. Here is an example of a catastrophic result :)

Weird increase of latency on the reception side
And after a few seconds the latency decreases
It's not always broken! Here's a pretty acceptable transmission

Not sure what happens here, but the ugliness seems to occur only in one direction: from the Windows 7 netbook to the Raspberry Pi. Things are relatively normal in the other direction.

Amongst the things that might be worth looking into:
- The MTU
- The way I create the two-machine WiFi network (I've been using Connectify so far). Maybe I could try an ad-hoc network
- Nagle's algorithm. It's supposed to be TCP-only, but this weird increase/decrease latency phenomenon looks a bit like an algorithm trying to do something clever
- Network settings/parameters at the OS level on Windows and Linux
- Or to simplify things a lot: using identical hardware and OS on both sides

I'll investigate that later and try to cover it in another article. If anyone's got advice on the matter, please let me know!

Saturday, 12 January 2013

Steady data transmission over WIFI - Part 3 - Socket buffering

Here is another issue that showed up on the transmission graphs:


On the reception side (the top timeline on the graph), a packet loss seems to translate into a long silence (longer than what would be expected from the lost packets alone) followed by a big burst of data. There's some buffering at play here, and I don't want it: it wouldn't help at all with controlling an RC model!

Fortunately there's some control over that: it is possible to set the size of the send and receive buffers on a per-socket basis. This is done using the setsockopt function with the SO_SNDBUF and SO_RCVBUF options.
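
In code it looks like this (a sketch; note that on Linux the kernel doubles the value you pass and enforces a minimum, so the effective size won't be exactly one packet):

#include <sys/socket.h>

// Shrink the socket buffers to roughly one packet's worth so the
// kernel can't accumulate a backlog of stale control packets.
void setSmallBuffers(int sock, int packetSize) {
    setsockopt(sock, SOL_SOCKET, SO_SNDBUF, &packetSize, sizeof(packetSize));
    setsockopt(sock, SOL_SOCKET, SO_RCVBUF, &packetSize, sizeof(packetSize));
}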

So I went on to expose these buffer sizes as tweakable parameters in my test definition file.
By setting these buffers to the size of the packets I transmit, I expected to achieve a steadier transmission. And indeed, in the same test conditions, it seemed to fix this particular burst problem.