Thursday, January 16, 2014

Gamepad-controlled Mindstorms NXT robots.

Yeah, yeah, the EV3 has a fancy iPad remote control. Infrared too. But I happen to have an NXT in perfect working condition, and I had an old gamepad in the attic. So why not try and connect all of that gear? It's a nice evening project.

Step 1. Get the Mac (and Python) to read gamepad input

I connected the generic USB gamepad to my Mac, but not much happened. So I tried connecting a PS3 Sixaxis controller instead. That one at least showed up in the list of Bluetooth devices. OK, now Python. Some googling gave me three options for libraries: jaraco.input, PyGame and PySDL2. There were some more, but I couldn't get them to install. It turned out that jaraco.input only works on Linux and Windows. PyGame took a very long time to compile, and in the meantime I tried SDL2. Installing SDL2 via homebrew was really fast and the code examples worked right away. It turned out I could read both the cheap generic USB controller and the nice PS3 Sixaxis controller. Sweet!

Once you get it to install with homebrew and pip, it's as simple as this:
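A minimal sketch of the idea, assuming PySDL2 installed via pip on top of the homebrew SDL2 library; which axis number maps to which stick depends on the controller:

    import time
    import sdl2

    # initialise only the joystick subsystem and open the first gamepad found
    sdl2.SDL_Init(sdl2.SDL_INIT_JOYSTICK)
    print("joysticks found:", sdl2.SDL_NumJoysticks())
    stick = sdl2.SDL_JoystickOpen(0)

    while True:
        sdl2.SDL_JoystickUpdate()                   # poll the hardware
        x = sdl2.SDL_JoystickGetAxis(stick, 0)      # left stick horizontal, -32768..32767
        y = sdl2.SDL_JoystickGetAxis(stick, 1)      # left stick vertical
        print(x, y)
        time.sleep(0.05)

Dividing the raw values by 32768 gives the normalised -1..1 range used in step 3.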


Step 2. Send motor commands to the NXT via Bluetooth

For sending motor commands I had already found the jaraco.nxt library; I used it for sending Bluetooth messages in the Sumo project, and I saw that it contained motor commands too. As it turned out, sending the motor commands wasn't as easy as I thought. jaraco.nxt comes with pretty meager documentation and examples, so I had to do some digging in the source code. It turned out that you have to explicitly turn motor regulation on, otherwise the motors only seem to run at half power. That regulation flag is an undocumented feature.

So here's the code for sending a simple motor command: turning a motor on at a certain speed.
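A sketch of what that looks like with jaraco.nxt. The exact enum and keyword names of SetOutputState differ between versions, so treat these as approximate and check the messages module in the source; the serial port name is just an example:

    from jaraco.nxt import Connection
    from jaraco.nxt.messages import (SetOutputState, OutputPort,
                                     RunState, RegulationMode)

    # the NXT shows up as a Bluetooth serial port on the Mac (name will differ)
    conn = Connection('/dev/tty.NXT-DevB')

    cmd = SetOutputState(
        OutputPort.a,                               # motor port A (member name may vary)
        set_power=75,                               # -100..100
        motor_on=True,
        use_regulation=True,                        # without this: roughly half power
        regulation_mode=RegulationMode.motor_speed,
        run_state=RunState.running,
    )
    conn.send(cmd)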

Step 3. Convert gamepad input into motor speeds that work with the robot configuration

Since I was driving around an omnibot, the great HTRotabot, I still needed to convert the (normalised) joystick input into speeds for the three motors. A simple sin() distribution of the power over the motors does the trick, as the motors all sit at 120 degrees (2/3*pi radians) to each other. Here's the code:
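A sketch of that sin() distribution; x and y are the normalised joystick values from step 1, and rotation is an optional extra input for spinning in place:

    import math

    def motor_speeds(x, y, rotation=0.0):
        """Map a joystick vector (x, y in -1..1) onto three omniwheels
        placed 120 degrees (2/3*pi radians) apart."""
        angle = math.atan2(y, x)                    # direction the stick points
        magnitude = min(1.0, math.hypot(x, y))      # how far it is pushed
        speeds = []
        for i in range(3):
            wheel_angle = i * 2 * math.pi / 3       # 0, 120 and 240 degrees
            power = magnitude * math.sin(angle - wheel_angle) + rotation
            power = max(-1.0, min(1.0, power))      # clamp to -1..1
            speeds.append(int(round(100 * power)))  # scale to the NXT's -100..100
        return speeds

Feeding those three values into the motor command from step 2, one per output port, closes the loop from gamepad to robot.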




Abandoned routes for the Lego sumo project

During the development of the Sumo Arena with camera tracking, I also tried a lot of approaches that DIDN'T work out. I think they are just as interesting as the final result, so I'll detail them here.

Processing.org

My first attempt was with Processing. I went through great pains to extend the open source library NXTComm. I got it working, and contributed the changes back to the project. But then I abandoned the route for two reasons: Java and OpenCV.
A while ago I discovered Python, and now I'm spoiled. I dislike Java's semicolons, curly braces, variable types etc. etc. And I was curious about OpenCV, which incidentally plays very well with Python and its number-crunching libraries SciPy and NumPy. In retrospect it's a bit of a shame, as processing.org is easy to install and works well cross-platform.

OpenCV - Haarcascades

I first played around with facial recognition and the algorithms behind it. What if I could train the computer to recognize robots instead of faces? After lots of code experiments it turned out that the haarcascades used for face detection don't work very well on rotated faces. Stand on your head and it doesn't work anymore. For the sumo match I needed to calculate each robot's rotation in every frame, so this route was a dead end.
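For reference, those experiments looked roughly like this: plain OpenCV haarcascade face detection on webcam frames. The cascade XML ships with OpenCV; the path here is just the file name:

    import cv2

    cascade = cv2.CascadeClassifier('haarcascade_frontalface_default.xml')
    cap = cv2.VideoCapture(0)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # detects upright faces only; rotate the face and the hits disappear
        faces = cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
        for (x, y, w, h) in faces:
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.imshow('faces', frame)
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break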

OpenCV - Feature recognition

Next I tried feature recognition. Computers can read QR codes and can detect transformed images inside other images, so wouldn't they be able to recognize two Mindstorms robots? Yes they could, but... the process was too slow. I achieved a frame rate of 15 fps while detecting one robot, and I needed two robots at 30 fps. Another 16 hours of coding wasted.
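The idea, in a rough sketch: extract keypoints from a reference photo of the robot and match them against every frame. 'robot.jpg' is a placeholder for such a photo, and ORB is simply my pick of a free detector here; the slow part is doing this per frame, per robot:

    import cv2

    robot = cv2.imread('robot.jpg', 0)              # reference photo, grayscale
    orb = cv2.ORB_create()                          # cv2.ORB() on old OpenCV 2.x
    kp_ref, des_ref = orb.detectAndCompute(robot, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    cap = cv2.VideoCapture(0)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        kp, des = orb.detectAndCompute(gray, None)
        if des is None:
            continue
        matches = matcher.match(des_ref, des)       # enough good matches = robot found
        print(len(matches), 'matches')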

OpenCV - Blob recognition

First I figured that blob recognition would work best if the robots carried a color that really stood out. I thought that little LED lights would work nicely: they would make for two spots much brighter than their surroundings, and they look cool on robots too. Again I was wrong. Because of the autogain on most webcams the LEDs came in as white pixels, devoid of any color. Even with gain control at the lowest setting they stayed white.

The final solution was a couple of simple post-its. They stand out without triggering the camera gain. They are like very small 'green screens': easy to filter out.
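Filtering a post-it out of a frame then boils down to an HSV threshold plus a centroid. A sketch of that; the color bounds are rough guesses for a green post-it and need tuning for your camera and lighting:

    import cv2
    import numpy as np

    lower = np.array([40, 80, 80])                  # H, S, V lower bound (guess)
    upper = np.array([80, 255, 255])                # upper bound

    cap = cv2.VideoCapture(0)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, lower, upper)       # white where the post-it is
        m = cv2.moments(mask)
        if m['m00'] > 0:                            # blob found: compute its centroid
            cx = int(m['m10'] / m['m00'])
            cy = int(m['m01'] / m['m00'])
            print('post-it at', cx, cy)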