When the documentation can't be trusted

Jun 16, 2013

As of late, this thing has basically turned into Per’s Gaming Blog. I don’t really mind, but sometimes it’s nice to have a little change of scenery, so for once I’m going to talk about something other than video games.

The Roomba, the eye tracker and our application in a nice group photo.

I recently finished my bachelor’s project in software development with a fellow student. It concerned controlling a Roomba vacuum cleaning robot with gaze by using an eye tracker, and while the project worked out alright, it didn’t go as smoothly as one could have hoped.

The most egregious problem was getting the Roomba control to work. Basically, the Roomba robots have a serial port through which you can send commands (like drive or stop) and receive sensor data (like distance and angle traveled). All this is specified in the documentation of the Roomba Open Interface (OI), which contains thorough descriptions of all available OI commands and sensor values.

The Plan

To navigate the Roomba around a given room, our application needed to be aware of the robot’s position at all times. This was achieved by combining an absolute origin position (which had to be manually placed) with continuous distance and angle measurements relative to this point.

Luckily, the robot itself contains an odometer that enables recording of both distance and angle, and the values of these can be requested by a serial OI command. The idea is that every time one of these sensor values is requested, the robot’s internal counter is reset to 0, meaning that our application needed to keep track of the total distance and angle by accumulating the received values.
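
Since the counters reset on every read, the bookkeeping on our side amounts to a simple dead-reckoning accumulator. Here's a Python sketch of the idea (the class and names are my own, not from our actual C# code):

```python
import math

class Odometry:
    """Accumulates the incremental distance/angle readings the Roomba
    reports; each read resets the robot's internal counters to 0."""

    def __init__(self):
        self.x = 0.0            # mm, relative to the manually placed origin
        self.y = 0.0            # mm
        self.heading_deg = 0.0  # degrees

    def update(self, distance_mm, angle_deg):
        # Apply the reported turn first, then advance along the new heading.
        self.heading_deg = (self.heading_deg + angle_deg) % 360
        rad = math.radians(self.heading_deg)
        self.x += distance_mm * math.cos(rad)
        self.y += distance_mm * math.sin(rad)
```

Applying the turn before the translation is a simplification, but with readings arriving every few dozen milliseconds the error stays negligible.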

To actually request the data, the OI documentation specifies several options, most notably single requests and streaming. The former is a one-off request, simply returning the value once, while the latter initiates a streaming mode, i.e. sending an updated value every 15 ms until explicitly stopped. The OI doc states that

[Streaming] sensor data is best if you are controlling Roomba over a wireless network (which has poor real-time characteristics) with software running on a desktop computer.

It makes sense, and since the quote above described our exact scenario, streaming was the obvious choice.
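
For concreteness, the command bytes themselves are simple to construct. Here's a Python sketch of both request styles as we understood the OI doc (the helper names are my own):

```python
# Opcodes and packet IDs as given in the OI doc:
SENSORS = 142   # one-off sensor request
STREAM = 148    # start streaming sensor packets every 15 ms
DISTANCE = 19   # packet ID for distance traveled
ANGLE = 20      # packet ID for angle turned

def single_request(packet_id):
    """One-off request: [142][packet ID]."""
    return bytes([SENSORS, packet_id])

def start_stream(*packet_ids):
    """Start streaming: [148][number of packet IDs][IDs...]."""
    return bytes([STREAM, len(packet_ids), *packet_ids])
```

So for our distance-and-angle case, starting the stream means sending the four bytes [148][2][19][20] down the serial port.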

The Problem

But the streaming didn’t work. According to the OI doc, the format of the data returned is

[19][Length][Packet ID 1][Packet data][Packet ID 2][Packet data][Checksum]

which in our use case (distance and angle) would translate to

[19][6][19][Distance in mm][20][Angle in degrees][Checksum]

After writing some stream-alignment code and running it, something strange happened. The structure we received was exactly as specified, but the actual value bytes were all zero, all the time. Calculating the checksum confirmed that something was wrong; the values surely weren't supposed to be zero. The OI doc mentions that

It is up to you not to request more data than can be sent at the current baud rate in the 15 ms time slot. If more data is requested, the data stream will eventually become corrupted. This can be confirmed by checking the checksum.

However, this wasn't the cause of our problem, as we requested far too little data to be limited by the transmission rate. Alas, after much troubleshooting (as well as asking a Roomba hacking forum and iRobot's own support), we couldn't identify the issue beyond suspecting that our robot was somehow defective.
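
In case it helps anyone attempting the same thing, the frame validation boils down to very little code. Here's a Python rendition (our project was in C#); the checksum rule, as we read the doc, is that the low byte of the sum of every byte in the frame, header included, should be zero:

```python
def checksum_ok(frame):
    # Low byte of the sum over the whole frame (header, length,
    # payload and checksum) should be 0 for an intact frame.
    return sum(frame) & 0xFF == 0

def parse_distance_angle(frame):
    """Parse a [19][6][19][dist][20][angle][checksum] frame into
    signed 16-bit (big-endian) distance and angle values."""
    if len(frame) != 9 or frame[0] != 19 or not checksum_ok(frame):
        raise ValueError("corrupt or misaligned frame")

    def s16(hi, lo):
        value = (hi << 8) | lo
        return value - 0x10000 if value & 0x8000 else value

    return s16(frame[3], frame[4]), s16(frame[6], frame[7])
```

In our case the value bytes were all zero while the checksum byte didn't add up, which is what confirmed the stream itself was corrupt.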

We had to get it working, though, so we tried the single-request approach to see whether we could get any usable sensor values back. This turned out to be another roadblock. According to the documentation, the distance and angle values are supposed to be returned in millimeters and degrees, respectively, but here's an example of what we got:


Actual movement              Reported value
1440° (clockwise)            -652
1440° (counter-clockwise)    653
5030 mm (forward)            -631
4980 mm (forward)            -615

Though reasonably linear in nature, the reported measurements were completely off. To this day, we have no idea what units they use, but it's certainly not millimeters and degrees. Furthermore, forward movement produced negative values, in direct contradiction to the documentation.

The Patch

The problem was twofold: how do we get distance and angle data from the robot in a continuous manner, and how do we get it in an actually usable format?

For the first part, we ended up recreating the broken streaming feature by running a separate program thread that performs a single request at a fixed interval, in our case 60 ms, to allow for the extra data traffic this approach requires.

while (true) {
    byte[] buf = { 142, (byte) packetID }; // opcode 142 = single sensor request
    Send(buf, 0, buf.Length);
    // ... read and accumulate the returned sensor bytes here ...
    Thread.Sleep(60); // wait 60 ms before the next request
}

As for converting the reported values to a usable format, we had to come up with some constant such that

actual = reported * constant

and a bit of simple math shows that this could be found using

constant = actual / reported

After a number of experiments, this resulted in a rotation constant of 2.2 and a translation constant of 8.1. Furthermore, the translation values had to be negated to make forward translation produce positive values, which is not necessary for the application but is more intuitive for humans.
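
Put together, the conversion is as trivial as it sounds. A Python sketch with our empirical constants (approximations from our own calibration runs, not anything the OI doc sanctions):

```python
ROTATION_CONSTANT = 2.2     # degrees per reported angle unit (empirical)
TRANSLATION_CONSTANT = 8.1  # mm per reported distance unit (empirical)

def actual_angle_deg(reported):
    return reported * ROTATION_CONSTANT

def actual_distance_mm(reported):
    # Negated so forward motion comes out positive, as the
    # documentation promised in the first place.
    return -reported * TRANSLATION_CONSTANT
```

Sanity check against the table above: a reported -631 converts to about 5111 mm, in the right ballpark of the 5030 mm we actually drove.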

Our application ultimately worked well, save for some inaccuracies likely caused by the constants, which were really just approximations based on empirical data. The initial confusion caused by the strange behavior, as well as the multiple workarounds we had to implement, delayed the project considerably, compounded by the fact that we started somewhat later than we probably should have.

I guess the moral of the story is to get started early and check any external factors for problems as soon as possible, so they can be fixed or the project can be adjusted to account for them. Also, use devices that actually work.