Robots Controlled by Text

After a bunch of travel and short hiatus, I’m back in the saddle.

In case you were wondering, I had three conferences in a two-week span: GSummit, Kscope 12 and jQuery Conference. Since I got back, I’ve been taking it easy and manning the grill, as is the custom for summers in America.

Anyway, before I left, I teased our Kscope session. If you were at the show, you might know what we did, or maybe you saw Noel (@noelportugal) and me wandering around with the Rock ’em Sock ’em Robots. They attracted a lot of attention from the moment we checked into the JW Marriott in beautiful (and very hot) San Antonio.

While Noel parked the rental car, I hurried inside to get out of the 109-degree heat with all our stuff, including luggage, laptop bags, a box of flying monkeys and the robots.

I walked into the lobby, and two bellhops immediately started asking about the robots. One even offered to buy the rig once he heard it was controlled by text.

I’m getting ahead of myself a little.

When I submitted the session abstract, I wanted to focus on REST, i.e. its ease of use, flexibility, ubiquity, platform and language agnosticism, etc., and the RESTful APIs available in the WebCenter Suite (@oraclewebcenter) of products, i.e. Content, Portal, Sites, and Oracle Social Network (OSN).

I also wanted to showcase Noel’s chops as a maker; he’s done some pretty fantastic things, e.g. his Christmas lights and Halloween candy dispenser.

Because Kscope is a developer event, I wanted to create a combination of the Internet of Things and REST to build a fun and innovative demonstration. I wasn’t going for an obvious enterprisey demo, but rather something that would captivate the audience and get people thinking about how they could apply the principles to their work and/or personal lives.

I don’t know about you, but I like to be entertained at conferences. I can do the heavy thinking on my own time.

Anyway, we initially started with a Toy Story theme, but after an hour’s worth of brainstorming and some help from a few friends, we didn’t have anything we really liked. Somehow, we got on the Rock ’em Sock ’em Robot topic, and went from there.

Happily, there is a link between Toy Story and Rock ’em Sock ’em Robots, believe it or not.

Developers love this toy, and it’s been hacked previously with some very cool results.

Noel loves Twilio, and we knew everyone in the room would have a cell phone, making for a perfect control mechanism.

All that was left was to tie in the RESTful API of a WebCenter product, and we decided on OSN.

So here’s how it works:

  1. We provided a phone number for texting or calling. If you texted, all you had to do was include the keyword red or blue to trigger that robot. Callers were prompted through a voice menu.
  2. Requests were queued for each robot. Unfortunately, the robots alternated punches, which wasn’t very realistic, but we plan to change this in later versions.
  3. Each robot executed a randomized combination of punches for a specified time period.
  4. After the combination, the next queued request executed.
  5. Texters received replies that their robot had executed a given combination of punches.
  6. Each combination created a conversation entry in OSN.
  7. When the bout ended, i.e. the head popped up on one of the robots, texters received a victory reply, and the robot declared victory in the OSN conversation.
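The flow above can be sketched in a few lines of Python. To be clear, this is my own hypothetical illustration, not Noel’s actual implementation: the function names, the per-robot queues, and the punch-combination logic are all assumptions, and in the real rig the incoming texts arrived via a Twilio webhook while each combination was posted to an OSN conversation over its REST API.

```python
import random
from collections import deque

# Hypothetical sketch of the control flow described in the steps above.
# Names and structure are assumptions, not the actual demo code.

ROBOTS = ("red", "blue")

# Step 2: one queue of pending requests per robot.
queues = {color: deque() for color in ROBOTS}

def parse_keyword(sms_body):
    """Step 1: return 'red' or 'blue' if the text contains a robot keyword."""
    words = sms_body.lower().split()
    for color in ROBOTS:
        if color in words:
            return color
    return None

def enqueue_request(sms_body, sender):
    """Queue a texter's request for the robot named in the message."""
    color = parse_keyword(sms_body)
    if color is not None:
        queues[color].append(sender)
    return color

def punch_combination(n_punches=5):
    """Step 3: a randomized combination of left/right punches."""
    return [random.choice(["left", "right"]) for _ in range(n_punches)]
```

In the live demo, a Twilio webhook would call something like `enqueue_request` for each inbound text, a worker loop would pop the next queued request, drive the robot through `punch_combination`, text the sender a confirmation, and POST the combination to the OSN conversation.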

Seems pretty complicated, but Noel managed to pull this off, from concept to working demo, in less than a month. I’ll leave the details for him to describe, but suffice to say, it was a lot of work.

Sure, there were hiccups, e.g. the queue backed up during the session, leaving a lull in the action, and the red robot punched himself out by moving around too much and had to be tweaked on site. But overall, the demo worked, and a lot of people at the show stopped to play with it, even if they didn’t come to our session.

Despite our session ambassador’s failure to capture any video during the session, Noel did get video, which he’ll be sharing soon.

Looking ahead, we’re hoping to do more of these kinds of demos to showcase the interaction of software and hardware via the intertubes. There are lots of good use cases where physical objects can communicate status to software, and software can be used to control physical objects. The possibilities are many.

We also may tweak this demo to include real gestures, perhaps with an Xbox Kinect, paging John Sim (@jrsim_uix).

Anyway, watch this space for technical details and video from Noel.

Find the comments.




  1. I met Noel in Colombia and Ecuador. He’s a top bloke. I hope he can get to OOW so we can meet up again. Can you smuggle him in in a suitcase? 🙂



  2. I saw that you were on that tour and considered tweeting you to meet Noel IRL, then I forgot. Glad you two met. Not sure Noel would fit in my suitcase, but maybe I’ll have an accident and either a) need a replacement or b) need someone to help me 🙂

  3. Noel did a fantastic job. Reach out to him for details and to chat about the Dublin Maker Faire. I’d love to go, but I’m useless to you 🙂
