Saturday, February 11, 2017

The math behind the vertical plotter

Many people are asking about the math behind the rope plotter. I only used second grade high school math to do it. Here's how it works.

Target motor positions

For all calculations I use a coordinate system with the top left of the door as (0,0) and cm as units. So the top right of the door is (90,0). The positive y-axis points down, so I only have positive numbers. The first goal is to calculate the correct lengths of the ropes L and R in terms of the coordinates (x,y) of the target location for the robot. For this you can use Pythagoras. The left rope is the easiest: there is a right triangle with sides x, y and L, where `L = (x**2 + y**2)**0.5` (in Python the double asterisk means a power). The right rope is also part of a right triangle. This triangle has y, 90-x and R as sides. Therefore `R = ((90-x)**2 + y**2)**0.5`.

Now that we have the target lengths we have to calculate the degrees each motor has to rotate to reach our desired coordinate. The diameter of the bush is about 0,7 cm, so a complete 360 degree rotation changes the rope length by about 0,7 cm * 3,14 ≈ 2,1 cm. That is about 170 degrees of rotation to move one centimeter. I mounted the motors so that forward rotation is upwards. Therefore the motor target for the left motor is L * -170 and the right target is R * -170.
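Putting the formulas together, a minimal Python sketch of the rope-length and motor-target calculation could look like this (the 90 cm anchor distance and 170 degrees per cm are the values from my setup; measure your own):

```python
# Rope lengths and motor targets for the vertical plotter.
# Anchor points: left rope at (0, 0), right rope at (90, 0), units in cm.

ANCHOR_DISTANCE = 90   # cm between the two rope attachment points
DEGREES_PER_CM = 170   # roughly 360 / (0.7 * 3.14), from the bush diameter

def rope_lengths(x, y):
    """Return (L, R), the rope lengths in cm for target point (x, y)."""
    left = (x**2 + y**2) ** 0.5
    right = ((ANCHOR_DISTANCE - x)**2 + y**2) ** 0.5
    return left, right

def motor_targets(x, y):
    """Return (left, right) motor positions in degrees.

    Forward rotation winds the rope up, so the targets are negative.
    """
    left, right = rope_lengths(x, y)
    return left * -DEGREES_PER_CM, right * -DEGREES_PER_CM
```

For example, the point (45, 60) sits on a 45-60-75 triangle from both anchors, so both ropes come out at 75 cm and both motor targets at -12750 degrees.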

Navigating the drawing canvas

Now we have a formula to navigate the complete door. But what we really want is to navigate a piece of paper in the middle of the door. I wanted the robot to be usable independently of the space between the attachment points or the size of the paper. Therefore I defined a second coordinate system that has (0,0) in the top left of the paper and is normalised. This means all coordinates are between 0 and 1. This coordinate system is easy to scale to different plotting surfaces and paper sizes. It works like this: let m be the margin left of the paper, n the margin on top and C the width of the canvas. The coordinates (x,y) in terms of the normalised coordinates (u,v) then simply become: `x = C * u + m` and `y = C * v + n`.
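As a sketch, the mapping from normalised paper coordinates to door coordinates is one multiply-add per axis (the default margin and canvas values below are made-up examples, not measurements):

```python
def paper_to_door(u, v, m=20.0, n=30.0, c=50.0):
    """Map normalised paper coordinates (u, v), both in [0, 1],
    to door coordinates (x, y) in cm.

    m = left margin, n = top margin, c = canvas width.
    The defaults here are example values; measure your own setup.
    """
    x = c * u + m
    y = c * v + n
    return x, y
```

With these example values, (0, 0) lands on the top left of the paper at (20, 30) and (1, 1) on the bottom right at (70, 80).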

At this point it's simply a matter of reading two files, x.rtf and y.rtf with normalised coordinates and moving the robot towards each of these coordinates. 

Ev3 program

You can also download the Lego Mindstorms Ev3 program to see the details. The ev3 main program expects 2 rtf files (robot text files), one named x.rtf and the other y.rtf. Both contain target coordinates. x.rtf has the number of coordinates as its first line. Note that you can't open these files with a Rich Text Format editor!
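The Ev3 program itself is graphical, but reading the same files back in Python looks roughly like this. It's a sketch relying on the format described above: one number per line, lines ending in ascii 13, and the coordinate count on the first line of x.rtf (`read_targets` is my own name for it):

```python
def read_targets(xpath='x.rtf', ypath='y.rtf'):
    """Read paired coordinates from the Ev3 'robot text files'.

    x.rtf holds the coordinate count on its first line; both files
    then hold one number per line, separated by ascii 13.
    """
    with open(xpath) as xf, open(ypath) as yf:
        # Python's universal newlines treat ascii 13 as a line ending
        xlines = xf.read().splitlines()
        ylines = yf.read().splitlines()
    count = int(float(xlines[0]))
    xs = [float(n) for n in xlines[1:count + 1]]
    ys = [float(n) for n in ylines[:count]]
    return list(zip(xs, ys))
```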

All the stupid mistakes I made when building 3nsor, the vertical plotter

After long nights of hard work I finally have a vertical plotter that makes recognisable illustrations! In this article I'm sharing all the trouble I ran into. There's also an LDD file, 3nsor.lxf, for this 5th generation.

Mistake 1: the wrong spindle

My first hunch was to use a large spindle, so the relative increase in diameter with each layer of cord would be minimal. A varying diameter makes the math a lot more complicated. The second spindle looked best because it had the smoothest surface; the others had all kinds of ribs. I made several generations with the second spindle. The problem, though, was that the motor had to push a really big load: as the spindle diameter grows, so does the lever arm and thus the torque the motor has to deliver. So I had to add two gears to increase the force. Also, when just using the large spindle without the gears, the internal motor friction is not enough to keep the robot in place on the vertical surface. It would slowly uncoil and descend. Another problem with that spindle was that it wasn't part of the original 31313 set. So I went looking for a smaller spindle. The simple red bush proved to be the solution, but only after I found the right thread...

Mistake 2: the wrong cord

I tried many different cords. I was looking for thin and strong, because a thick thread makes the diameter of the spool change a lot when it's rolled up. As I was counting degrees for movement, this would mean a bad distortion of the drawing. The first thread (the brown one) was nice and thin, but broke after 3 drawings. It nearly cost me my brick. The second one, the thick white, was stronger, but so hard that it dented the soft Lego parts, and so thick that I had to use a big spool to avoid the change in diameter. This in turn required a gearbox, which was imprecise and lossy. Only after a year of failed experiments did I have the idea of using dental floss. Thin, strong, cheap, and not all too elastic. A little elasticity is ok, as the increased length of the thread due to stretch compensates for the increased diameter on the spool.

Mistake 3: running the program from the command line

During development I was coding on my laptop while the robot was drawing. After some time, though, when the robot was running stably, I closed my laptop and went off to other stuff while the robot was drawing. That was a bad idea. Because I was coding on ev3dev, I ran the program from the command line. And Linux terminates programs started from the command line when the terminal session disconnects. The result was that the program stopped in the middle of a drawing. And when an ev3dev program stops, the motors keep running at the speed they were set to; they don't stop automatically. This resulted in the robot driving all the way up to the top right corner of the door, making a hard-to-clean sharpie stain, and then being catapulted all the way to the left onto the floor. I was really lucky that it landed on a relatively thick rug...
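If you run into the same thing, starting the program detached from the terminal avoids the problem. A minimal sketch, assuming your main script is called plotter.py (a made-up name):

```shell
# Start the plotter detached from the terminal, so closing the
# laptop or dropping the ssh session won't kill it.
nohup python3 plotter.py > plotter.log 2>&1 &
```

Tools like screen or tmux work too, and as a safety net the program itself can catch the termination signal and stop the motors before exiting.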

Mistake 4: Putting the pen way below the point where the cords meet

I thought gravity would keep the pen in position but I forgot about momentum. A pen mounted way below the point where the cords meet gave really imprecise and wobbly lines. That was cool for the very first experiment. It even made my beard look better. But ultimately it was a bad idea.

Sunday, August 23, 2015

How to get the Mindstorms Ev3 brick to read coordinates from a file

In order to plot drawings I needed to read coordinates from a file while running a program on the Ev3 brick. It turned out to be a very tricky task that is not well documented. The limitations I knew about were:

  • Ev3 can only read a whole line per file access
  • There are no text manipulation blocks that allow me to split lines by commas
  • There is no way to detect the end of a file
  • I will have to put x coordinates and y coordinates in separate files

What is the file format used by the Ev3 for writing numbers?

I started by generating a file full of numbers from an Ev3 program so I could reverse engineer the file format. I made a simple program that divides 1 by 2 about 20 times.
Next I opened the file in vi to see what was in there.
Aha! This is very enlightening. My conclusions from this little test:
  • It seems that numbers have 4 decimal places
  • Numbers are separated by ascii 13
  • The file extension is .rtf
  • ...but the file format is plain text. It has nothing to do with Rich Text Format. 
TextEdit on Mac will not open the file. Only vi and Sublime Text work for me. On to the next phase.

How to generate number files that the Ev3 can read

Now it's just a matter of writing my list of coordinates to file. In Python that looks like this:
xfile = open('x.rtf', 'w')
yfile = open('y.rtf', 'w')

for x, y in pointlist:
    # write each number on a new line (lines end in ascii 13)
    xfile.write('%.4f\r' % x)
    yfile.write('%.4f\r' % y)

xfile.close()
yfile.close()

Test if the Ev3 can read the numbers

Finally I wrote a little program that reads from the file, performs some math and plots the result to the screen. And behold, when I compare it to the original text file, it works!

Building a Mindstorms plotter with two ropes

Today I finally succeeded in building a Mindstorms robot that plots pictures, suspended by two ropes. Check the video. I ran into so much trouble building this that I'll share what I learned.

These are the problems I ran in to:

  • Making the robot lay flat on the door. Gravity can be a bitch. (coming soon)
  • Selecting rope and pulleys for the robot (coming soon)
  • Generating the coordinates for the portrait (coming soon)
  • Getting the Ev3 to read coordinates from a file
  • Building a PID controller that does not reset the motor sensor (coming soon)
  • The math behind a two-rope plotter (coming soon)
  • Making the pen go up and down (coming soon)
Enjoy! I hope you can avoid my mistakes and build an even better plotter.

Wednesday, December 24, 2014

Modifying the original BrickPi case to fit a Raspberry Pi model B+

The standard acrylic plates that come with a BrickPi do not accommodate a Raspberry Pi model B+. The screws don't fit and the holes are in the wrong locations. You need to do some modding to make it work. But then you have 4 USB ports on your lego creation, and a nice and compact micro SD!

What you'll need

  • A 3mm drill
  • An M3 screw with a small head

Step 1: The extra hole

The first thing to do is drilling a 3mm hole in the corner of the bottom acrylic plate, the one without the BrickPi logo. It doesn't have to be super accurate, as we'll be using a large hole for the other screw. Just put your B+ on the acrylic plate to mark the hole.

Step 2: Enlarge the holes on your Raspberry Pi

Somehow the new B+ has smaller holes than the old model B. Carefully enlarge the holes on the circuit board with a 3mm drill.

Step 3: Mount the RPi on the acrylic plate

In the top left corner you'll need that smaller M3 screw. The heads of the screws that come with the BrickPi are too large: they cover the micro USB port and the black 4R7 thingy, so you can't tighten them.
The bottom right screw goes into a larger hole in the acrylic plate that is meant for a Lego peg. So you have some play there.

Step 4: Slide on the BrickPi and assemble the rest

Here's a completed assembly in an unfinished Lego robot.

Optional: Add a bevel to the holes in the acrylic plates

It's hard to insert Lego pegs into the acrylic plates that come with the BrickPi. Using a large drill you can manually add a little bevel so they go in more smoothly. You can also use a slow-turning Dremel tool.

Thursday, December 11, 2014

Writing blog posts on blogger with code snippets made easy

In my Mindstorms hacking projects I need quite some code. And I want to blog about it, but the default editor in Blogger doesn’t have a button to mark text as code. You can abuse the blockquote, or add the <code> tag by hand, but it’s quite a hassle. The solution proved to be the brilliant StackEdit web app! It’s so amazing, I’m going to pay the little fee they ask.

With it you can write your blog post using markdown, with code blocks just like GitHub has them. Actually I got the idea to use markdown for blogging when I was writing a readme. Markdown is so much easier to use than a WYSIWYG editor or typing all the HTML tags manually.

Here’s how to use it on your blog.

1. Edit the html of your blogger blog

In the Blogger dashboard go to ‘Template’ and click the ‘Edit HTML’ button. Then, just above the closing </head> tag, insert this:

<link href='' rel='stylesheet'/>
  <script src=''/>

2. Go to StackEdit and write something interesting about code

StackEdit is mostly self-explanatory. So start writing and then click the hash icon on the top left. Choose: Publish > Blogger and there you are! A great blogpost with minimal typing and layout effort!

Realtime video stream with a Raspberry Pi and PiCam

I want to build remote controlled Lego robots with an onboard camera so I can drive around with them without having to see them. I did a lot of research to get a lagless video stream from the Raspberry Pi to my computer. It proved to be quite a challenge. But I found a way, it works!
Actually there are two methods that work: gstreamer and netcat. Both are detailed below. VLC and Mjpeg player are alternative methods that I didn’t get to work, at least not lagless. My favorite method is gstreamer.


Gstreamer

This proved the most stable, lag-free and flexible solution for me. The reasons being that gstreamer has nice Python bindings and the order in which you start the sender or the receiver doesn’t matter. Gstreamer installation should be really easy, both on the RPi and on your Mac. I will assume you installed and enabled the PiCamera already. On your RPi just do:
$ sudo apt-get update
$ sudo apt-get upgrade
$ sudo apt-get install gstreamer1.0
On the mac, the easiest way to install gstreamer is using homebrew. I prefer it over macports. Just do:
$ brew install gstreamer gst-plugins-base gst-plugins-good
Easy as Pi. On windows I wasn’t able to get gstreamer to work. If you know a good installation tutorial, let me know.
Now it’s time to stream. These are the commands you need.
On Raspberry Pi do (change the IP address to the address of your target computer):
$ raspivid -t 999999 -b 2000000 -o - | gst-launch-1.0 -e -vvv fdsrc ! h264parse ! rtph264pay pt=96 config-interval=5 ! udpsink host= port=5000
On your mac do:
$ gst-launch-1.0 udpsrc port=5000 ! application/x-rtp, payload=96 ! rtpjitterbuffer ! rtph264depay ! avdec_h264 ! fpsdisplaysink sync=false text-overlay=false
On my setup I had a near realtime stream over wifi. I didn’t measure it exactly but the lag was below 300ms.


Netcat

The alternative to gstreamer is using netcat to dump the camera data over a network pipe. This requires installing mplayer on your Mac/PC. Again, it’s easy with brew. The trick is to read at a higher framerate than the Pi is sending. This way the buffer stays empty and the video is real-time.
Here the order in which you execute the commands is important. First do this on the mac:
$ nc -l 5001 | mplayer -fps 31 -cache 1024 -
Then, do this on the RPi - insert the correct IP address, of course.
$ raspivid -t 999999 -w 640 -h 480 -fps 20 -o - | nc 5001
It’s also possible to do this on Windows. For this you have to download netcat and mplayer and put them in the same directory. Go to that directory using the command prompt and execute this:
> nc -l -p 5001 | mplayer -cache 32 -demuxer lavf -


VLC

Streaming with VLC from the Raspberry Pi is fairly straightforward. I was unable to do it lagless, but the cool thing is that you can pick up the stream on an iPad with VLC installed, or on a Mac using just the VLC app. No need for brewing.
First install VLC on the RPi
$ sudo apt-get install vlc
Then start streaming on the RPi
$ raspivid -o - -t 0 -hf -b 1000000 -w 640 -h 480 -fps 24 |cvlc -vvv stream:///dev/stdin --sout '#standard{access=http,mux=ts,dst=:8160}' :demux=h264
To pick up the stream, open the VLC app and pick up the stream with a URL like this: Here insert the name or IP address of the RPi.


Mjpg-streamer

Mjpg-streamer is also sometimes cited as an alternative, but I haven’t gotten it to work. The installation instructions are arcane and require v4l drivers.
Written with StackEdit.