Last Fall, Anthony (@anthonyslai) and Noel (@noelportugal) wanted to get some experience with their new Leap Motion gestural controllers.
So, naturally, they decided to use the Leap to control the OWI 535 robot arm via its USB interface.
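For the curious, the OWI 535’s USB protocol has been reverse-engineered by hobbyists: the arm takes a vendor control transfer carrying a three-byte payload, two bytes of motor bits plus one for the LED. Here’s a minimal PyUSB sketch along those lines; the vendor/product IDs and the bit mapping come from community write-ups, not from Anthony and Noel’s code, so treat them as assumptions:

```python
# Minimal sketch of driving the OWI 535 over USB with PyUSB.
# IDs and command bytes are from community documentation (assumed).
import time
import usb.core  # pip install pyusb

arm = usb.core.find(idVendor=0x1267, idProduct=0x0000)
if arm is None:
    raise RuntimeError("OWI 535 not found; is the USB interface attached?")

def command(byte1, byte2, led=0):
    # Vendor control transfer (bmRequestType 0x40, bRequest 6, wValue 0x100)
    # carrying the 3-byte payload: two motor bytes plus the LED byte.
    arm.ctrl_transfer(0x40, 6, 0x100, 0, [byte1, byte2, led])

command(0x00, 0x01)  # spin the base (assumed bit mapping)
time.sleep(1)
command(0x00, 0x00)  # all stop
```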
Shortly after they finished that, they began to complain that the OWI 535 wasn’t good enough. They conspired to convince me that we needed the Lynxmotion AL5A because it was “better.”
Everyone loves the robot arm demo, and it’s quite memorable. So, I made them a deal: get the Lynxmotion, but make it controllable by Leap over the intertubes.
Challenge accepted.
A couple of weeks ago, Noel took the box of parts that is the unassembled AL5A and turned it into a robot arm, no easy task. There are 21 steps in the assembly instructions for the base alone.
This week, he, Anthony and Raymond attached the arm to a Raspberry Pi, which controls it, and they’re working on all the other pieces, including the Leap code, written in Python.
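I haven’t seen their code, but the Leap side of a rig like this tends to be small: subscribe to frames, read the palm position, ship it across the network. A rough sketch using the Leap SDK’s Python bindings, with a made-up host, port, and wire format:

```python
# Sketch of the Leap side: send palm position to the Pi over UDP.
# The host, port, and JSON message format are hypothetical.
# The Leap SDK v1 bindings are Python 2, hence raw_input below.
import json
import socket
import Leap  # Leap Motion SDK must be on the Python path

PI_ADDR = ("raspberrypi.local", 9999)  # hypothetical address and port
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

class ArmListener(Leap.Listener):
    def on_frame(self, controller):
        frame = controller.frame()
        if not frame.hands.is_empty:
            pos = frame.hands[0].palm_position  # Leap.Vector, millimeters
            msg = json.dumps({"x": pos.x, "y": pos.y, "z": pos.z})
            sock.sendto(msg.encode("utf-8"), PI_ADDR)

listener = ArmListener()
controller = Leap.Controller()
controller.add_listener(listener)

raw_input("Wave over the Leap to drive the arm; press Enter to quit.\n")
controller.remove_listener(listener)
```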
If all goes well, I should be able to open a browser to watch a Dropcam stream of the AL5A. I’ll be able to attach my Leap Motion, run a script, and then control that arm remotely by waving my hand over the Leap. Pretty cool stuff.
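On the other end, the Pi just has to turn those coordinates into servo commands. Here’s one plausible version, assuming an SSC-32 servo controller (the board Lynxmotion pairs with the AL5A) hanging off a USB serial port; the port, channel mapping, and scaling are guesses for illustration:

```python
# Sketch of the Pi end: receive palm positions, drive one servo channel.
# Assumes an SSC-32 on /dev/ttyUSB0; wiring and scaling are hypothetical.
import json
import socket
import serial  # pip install pyserial

ssc32 = serial.Serial("/dev/ttyUSB0", 115200)  # assumed port and baud
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 9999))  # must match the sender's port

def clamp(v, lo, hi):
    return max(lo, min(hi, v))

while True:
    data, _ = sock.recvfrom(1024)
    pos = json.loads(data)
    # Map palm height (roughly 50-400 mm above the Leap) onto a servo
    # pulse width (500-2500 us); channel 1 is the shoulder in this
    # assumed wiring.
    pulse = int(clamp(500 + (pos["y"] - 50) * 2000 / 350.0, 500, 2500))
    ssc32.write(("#1P%dT200\r" % pulse).encode("ascii"))  # SSC-32 move
```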
I had high hopes that the old and busted robot arm would get a post describing the technical workings that went into bringing it to life, and I’ve been promised a post on the details of the new hotness as well. We’ll see. I’m not holding my breath.
Even so, there’s a lot of ingenuity that should be documented. Noel says he thinks this might be the first documented use of a Raspberry Pi to control this particular arm, so there’s that.
So, if you see any of us at a conference this year, ask for a demo of the new hotness.
Find the comments.