00:00:20.400
Who likes robots? Who likes robots that don't kill people? Okay, great! Well, you're in for a treat. Our next session is on doing robotics in Ruby.
00:00:33.760
The leader of this gang is Ron Evans. He's from L.A., and you may know him from L.A. Ruby stuff and also KidsRuby. He's the founder of the Hybrid Group. He's been doing Ruby since about 2005; it seems like he's always been around.
00:00:48.879
Okay, great! That's about eight years. And he's got a lovely singing voice, so I'll let him introduce the rest of his crew. So, Ron, take it away!
00:01:06.240
Thank you. Good afternoon, everybody! This is the Golden Gate Ruby Conference 2013.
00:01:11.520
Amazing! Before we get started, I want to give special thanks to Josh, Jim, and Leah, and all the other attendees and conference organizers. Let's give it up for them! They work really hard.
00:01:24.880
A true labor of love.
00:01:31.280
In the real world, I'm Ron Evans, but on Twitter, I am @deadprogram. I'm the ringleader of the Hybrid Group. This other guy over here is Adrian Zankich. He's the serious programming guy at the Hybrid Group. He does all the work, and I take all the credit!
00:01:44.720
Just like grad school! So we're with the Hybrid Group, a software development consultancy based in Los Angeles, California. Among other things, we are the original creators of KidsRuby, which this year was fortunate enough to be one of the recipients of the Fukuoka 2013 Ruby Awards. Thank you so much to all the contributors!
00:02:00.240
But today, we are talking about Ruby on Robots. This is not one of our robots. So, is innovation dead? Are we just going to be doing web development and figuring out how to disable Turbolinks for the rest of our development careers? I say nonsense!
00:02:20.080
"The future is already here; it's just not very evenly distributed." At least, that's what William Gibson, the famous author and futurist, said. And isn't that really true? Innovating is actually very, very hard, especially when you're working with hardware.
00:02:39.519
My brother Damon and I started doing open-source hardware development around 2008. We would order obscure parts from China that would be shipped to us in mysterious unmarked packages. We’d put them together, and they would immediately melt. Then we'd order them again. Nowadays, though, you can buy drones at retail stores and robots at the Apple Store. It's a whole new era; the future has already arrived!
00:03:03.360
So we are here to tell you about Artoo. Artoo is Ruby on Robots. It is a Ruby framework for robotics and physical computing. It supports multiple hardware devices and can integrate different devices simultaneously in Ruby.
00:03:31.200
Are we serious? Yes! Because we use Celluloid, the brilliant creation of Tony Arcieri and his team. Celluloid is a remarkable piece of software; if you haven't checked it out, you're probably running it in production right now anyway if you use Sidekiq or Adhearsion.
00:03:36.640
Ah, yes! Thank you! Thank you! So it’s just Ruby. That’s the beautiful thing about Artoo; it works in Ruby 1.9.3 and 2.0 on MRI. It works even better in JRuby with concurrency, and our favorite Ruby, the one we’re primarily going to show today, is Rubinius. If you are not using Rubinius, you should try it!
00:04:08.480
So, Artoo is to robotics what Rails is to web development. Let me say that again because it's a very important idea: Artoo is to robotics what Rails is to web development.
00:04:32.800
Here's a little example of some Artoo code. It actually looks a little more like Sinatra, but let's let that go. First, we require Artoo. Then we make a connection to an Arduino using the Firmata adapter, which is a serial protocol for communicating with Arduino devices.
00:04:49.919
Then we connect to a device, which is an LED connected on pin 13 of our Arduino. The work we’re going to do is toggle the LED on and off every one second, depending on the current state of the LED. We will see this demo in a minute, but first, a bit about Artoo's architecture.
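From that description, the blink program would look roughly like this; this is a sketch based on the canonical Artoo example, and the serial port name is an assumption (it varies by OS and board):

```ruby
require 'artoo'

# Port name is an assumption: on Linux the Arduino usually shows up as
# /dev/ttyACM0 or /dev/ttyUSB0; on OS X it's something like /dev/tty.usbmodem1421.
connection :arduino, adaptor: :firmata, port: '/dev/ttyACM0'
device :led, driver: :led, pin: 13

work do
  # Toggle the LED between on and off once per second.
  every(1.second) { led.toggle }
end
```

The `work` block is Artoo's main loop; `every` schedules the toggle on a recurring timer rather than blocking with `sleep`.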
00:05:12.720
The architecture of Artoo uses two interesting design patterns. The first one is the adapter pattern. You see we have the robot, which is the primary class in our class model. Robots have connections to specific hardware devices.
00:05:20.080
Just like ActiveRecord is able to connect to different kinds of databases using the same interface, Artoo's adapters allow connections to different kinds of hardware devices. Devices, in turn, have drivers, while the connections handle the actual communication.
00:05:34.960
Drivers encapsulate behaviors, so our drivers are decoupled from our adapters. You can use the same drivers on different adapters, allowing you to have the same LED or buttons work on a Raspberry Pi, on an Arduino, or on any number of other devices.
00:05:57.039
The other interesting aspect is we use the Publish/Subscribe pattern. We detect different events when they occur in these drivers, and then we notify the robot so it can respond. This is something you're probably used to seeing in other languages that deal with event-driven programming.
00:06:08.400
So, the API of Artoo. What good is a robot without an API? Your application can communicate with Artoo's API using either a REST HTTP interface or WebSockets, the interface of the future.
00:06:25.919
This API communicates with the MCP, which is the master control program, that then communicates with all the different robots—kind of like a robotic routing system.
00:06:44.880
Historically, the way you do robotic development is you install your software, you hit the on switch, and you step back real fast! Would you try that in production? Yes!
00:06:57.199
Also, there's a command line interface, because all the things you need to do to connect with different robotic devices are a lot easier to achieve with a CLI.
00:07:11.360
So, without further ado, let’s see a demo of some real stuff. We're going to start with the Digispark.
00:07:11.360
So, what is the Digispark? Let's see if I can manipulate my user interface efficiently. All right, so we've got our camera. The Digispark is a very small microcontroller, really tiny—it’s a USB device. It's probably the minimum viable microcontroller that you could have.
00:07:28.560
It has a few pins, and you plug it into your USB interface. Let’s take a quick look at some code here. Here’s the sample code: again, you see we require Artoo.
00:07:41.039
We're going to connect to the Digispark using the LittleWire adapter. LittleWire is another protocol, kind of like Firmata, that's a serial communication protocol for microcontrollers. The difference is LittleWire runs on low-power, small microcontrollers, like the ATtiny that's in this Digispark.
00:07:52.479
So we have two devices: we have the board itself for device info and again our canonical LED. In this case, the work we're going to do is display the firmware name and version off the Digispark and then flash the LED every second.
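A sketch of what that Digispark program would look like; the adapter name follows the talk, but the driver names and pin number are assumptions following Artoo's conventions, not the exact code from the slide:

```ruby
require 'artoo'

# LittleWire speaks USB to the ATtiny on the Digispark; no serial port
# name is needed. Driver names and pin are illustrative assumptions.
connection :digispark, adaptor: :littlewire
device :board, driver: :device_info
device :led, driver: :led, pin: 1

work do
  # Report what firmware the Digispark is running, then blink forever.
  puts "Firmware name: #{board.firmware_name}"
  puts "Firmware version: #{board.version}"
  every(1.second) { led.toggle }
end
```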
00:08:10.319
So let's jump over to our camera and point it at our universal video adapter here, and there’s the Digispark, and he is going to run this code. If all goes well, it'll start to blink.
00:08:16.240
Voila! A 2,500-lumen lamp! All right, so you're probably wondering by now, what good is all this stuff, anyway? I mean, it's neat and fun, but what can you actually do with it? Well, you know, I'm glad you asked that very question!
00:08:38.839
Here's another sample we wrote. It also uses the Digispark but adds another piece of hardware, an RGB LED. This program also requires the travis gem. See if you can guess what it does. Yes, it is a continuous integration build notifier that uses this hardware.
00:09:04.399
If I can get my mouse over there: it's got three different LEDs connected to different pins. The work it does is it turns on the blue LED, connects to a repository called Broken Arrow, which has a failing test, and then polls Travis for the build status every 10 seconds.
00:09:33.839
If the build is green, it turns on the green LED. If the build is just building, it turns on the blue, and if it fails, it turns on the red. We have a couple of methods: one that turns on a specific LED and one that turns all of them off. That's all there is to it.
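The state-to-LED mapping described above boils down to a small pure-Ruby helper; the method and state names here are illustrative, not the exact code from the slide:

```ruby
# Maps a Travis CI build state onto the LED to light up:
# a passing build is green, an in-progress build is blue,
# and anything else (failed, errored) is red.
def led_for(build_state)
  case build_state
  when 'passed'             then :green
  when 'created', 'started' then :blue
  else                           :red
  end
end
```

The other method the talk mentions, turning all LEDs off before lighting one, keeps stale colors from lingering between polls.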
00:09:54.160
Let’s jump over to our camera and run this code. If the gods of the internet favor us... okay, now it’s checking Travis to see if Broken Arrow is failing or passing. And it should turn red once it queries. Yes, our build is failing! Every 10 seconds, it keeps checking.
00:10:26.800
Because we're really pressed for time, they assure me I would be kicked off stage with a large hook, so we’re just going to move on to the next thing, but it does actually work!
00:10:32.720
We really love toys and cats, don't we? I mean, who doesn’t love cats? But you know, slides of cats are really boring. What's interesting are internet-enabled cats. So, what we decided we were going to do was write another little sample. This one still uses our Digispark.
00:10:51.600
We have two servos and support for the Leap Motion. In this case, we see Artoo's syntax for event handling: an `on` handler for the Leap Motion's hand event calls this wave function. So every tenth of a second, we're going to move the servos according to the current X and Y.
00:11:15.200
We have an X and a Y with default values that center the servo, and then our wave is basically querying the data coming from the Leap Motion using a couple convenience methods to scale from Leap Motion to servo requirements.
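The scaling step is plain interval arithmetic; a minimal pure-Ruby version might look like this (the input range below is an assumption for illustration, not the Leap Motion's actual coordinate bounds):

```ruby
# Linearly rescales a Leap Motion coordinate onto the 0..180 degree
# range a hobby servo expects, clamping out-of-range input.
def to_servo_angle(value, in_min, in_max)
  clamped = [[value, in_min].max, in_max].min
  ((clamped - in_min) * 180.0 / (in_max - in_min)).round
end
```

With defaults at the middle of the range, the servos start centered at 90 degrees, matching the default X and Y described above.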
00:11:39.840
So let's take a look. Here is the actual device itself. I'll get a little closer. You can see this was hand-built by my brother Damon. It’s got two servos, and it’s connected up for us to run it. You see here is the Leap Motion, and there’s the cat toy on the end of it.
00:12:00.160
Because we’re very serious people!
00:12:02.800
All right, internet-enabled cat toys!
00:12:13.200
Well, that was pretty fun! Now, let's do something completely different. You know, when you hear "99," many people of a certain generation think of red balloons. Others, who are hip-hop fans, think of problems. We, on the other hand, think that there are 99 people in this room who are going to be getting a free Digispark beginner kit by the end of this session!
00:12:41.280
Right after we're done, some of our wonderful members of the Hybrid Group are by the doors. We have 99 Digispark kits that include everything you need (with some basic soldering) to build that Travis notifier and about half of the cat toy.
00:12:49.840
So, 99 problems but the microcontroller is not one. All right, moving on! Now we're going to show the Sphero. The Sphero is a very cool robotic device that has become really popular. It seems like a toy; it's just a sphere with a couple of microcontrollers in it.
00:13:01.200
One of them has an accelerometer, and there are a couple of motors. It rolls around, and it's really fun to play with! Let's take a look at how it works in Artoo.
00:13:13.360
All right, so here we see again, we require Artoo. This time we're making a connection to the Sphero using the Sphero adapter, connecting to an IP address and port, because we are using a socket-to-serial interface that receives IP packets and turns them into serial.
00:13:38.560
Every second, it’s going to set its color to a random RGB color and roll around in a random direction. By the way, we're using the Sphero 2.0, which is connected through a BeagleBone Black.
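The Sphero program described above would look roughly like this, a sketch following the artoo-sphero conventions; the bridge address and the roll speed are assumptions:

```ruby
require 'artoo'

# The port points at the socket-to-serial bridge running on the
# BeagleBone Black; the address here is an illustrative assumption.
connection :sphero, adaptor: :sphero, port: '127.0.0.1:4560'
device :sphero, driver: :sphero

work do
  every(1.second) do
    # Pick a random RGB color, then roll at a fixed speed
    # in a random direction (0-359 degrees).
    sphero.set_color(rand(255), rand(255), rand(255))
    sphero.roll(60, rand(360))
  end
end
```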
00:14:03.199
The BeagleBone Black is a great low-cost Linux microcomputer: it has a lot of I/O capabilities and is based on an ARM Cortex processor, all for about 40 bucks.
00:14:30.880
So now we’re connecting to the Sphero, which we will see in a powder-blue color when it's connected via Bluetooth. Then, Adrian here is going to run the code.
00:14:54.720
And voila! More robots! The Sphero 2.0 is a lot more powerful. You'll see it's almost about to escape from this corral that we built; this is only half speed.
00:15:01.919
Sphero 2.0 is very cool! One of the things that we find really interesting is Conway's Game of Life. So, who is familiar with Conway's Game of Life?
00:15:17.760
Okay, about half of the people. John Conway is a mathematician who enjoys playing with ideas, and one of them is what we call cellular automata.
00:15:31.520
The concept here is that by using very simple rules that apply to individual elements, we can have some emergent behaviors. What that looks like is typically played on a pad of graph paper, where each filled square is a cell.
00:15:51.760
The rules are as follows: if a cell has fewer than two neighbors, it dies in the next turn due to loneliness; if it has more than three neighbors, it dies due to overpopulation; on the other hand, if an empty square has exactly three neighbors next to it, a new cell is born.
00:16:20.640
We wanted to do Conway's Game of Life but with robots. However, we realized we would need to make some tweaks. One issue is that these robots can’t see each other. However, they do have accelerometers and can detect collisions.
00:16:45.760
If we stretch out the square and use those collisions, we could simulate Conway's Game of Life using an inverse Fourier transform for the mathematically minded.
00:17:06.560
Let’s take a look at the code. We see here that we're not just using the Sinatra syntax but also a class-oriented model. The work is that it measures its collisions every three seconds. If it’s still alive after measuring, it moves.
00:17:42.160
Every ten seconds, it has a birthday because life is short and hard in Sphero-land. We have some tests here to see if it’s still alive. If it dies, the Sphero turns red, and while it’s alive, it’s green.
00:17:54.720
If the Sphero has enough contacts, then it's reborn; otherwise, it dies, and we reset our contacts. Let's connect to six different Spheros and do the work.
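The birth and death rules reduce to one pure function. In plain Ruby, with collision counts standing in for neighbors as described (this is an illustrative distillation, not the code from the slide):

```ruby
# Conway's rules on a per-cell basis: a live cell survives with two or
# three neighbors; fewer is loneliness, more is overpopulation; an
# empty cell is born with exactly three neighbors.
def alive_next_turn?(alive, neighbors)
  if alive
    neighbors == 2 || neighbors == 3
  else
    neighbors == 3
  end
end
```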
00:18:06.080
If all goes as planned, the BeagleBone Black and the Spheros are going to have a very fun time together.
00:18:27.760
Wow, look at them go! I'm going to have to tape down the cage! So far, so good. It's a sunny day, but death has just come to the Spheros. It's looking bad. I don't want to kill the last one right now, but it's trying to find love in all the wrong places.
00:18:54.400
Now it’s moving away somewhere else. Should we kill it? Please, little Sphero, get out! Save yourself!
00:19:07.440
Well, in the interest of avoiding the hook, I think we should just terminate the program. I’m so sorry, we’re terminating the program.
00:19:19.840
So, now we're going to do something completely different. I'm going to bring out our test pilot and my brother, Damon Evans, who is also a member of the Hybrid Group.
00:19:32.400
What he doesn’t know is that he is the test pilot and test subject today. All right, so let’s take a look at some code. What are we going to do now? We’re going to fly this AR Drone using Artoo.
00:19:53.840
In this case, you see the same pattern you've been seeing. We require Artoo and make a connection to the AR Drone via Wi-Fi, and the device is the drone itself.
00:20:11.840
The work that we’re going to do is the drone will start, take off after 15 seconds, and land after 20.
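That program would be a few lines of Artoo; this is a sketch following the artoo-ardrone conventions, and the address is an assumption (the drone's usual one once you join its network):

```ruby
require 'artoo'

# The AR Drone acts as its own Wi-Fi access point; 192.168.1.1 is its
# customary address, used here as an illustrative assumption.
connection :ardrone, adaptor: :ardrone, port: '192.168.1.1'
device :drone, driver: :ardrone

work do
  # Start the drone, take off after 15 seconds, land after 20.
  drone.start
  after(15.seconds) { drone.take_off }
  after(20.seconds) { drone.land; drone.stop }
end
```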
00:20:38.480
Is it alive? Oh, it’s not plugged in.
00:20:45.840
Once it can plug itself in, what then? I’ll try not to think about that too often. Let's take a look!
00:21:03.840
We’re connecting!
00:21:18.880
This is why he’s the serious programming guy. We call this robot ops! Thrusters engage!
00:21:31.840
All right, so a bunch of you have seen that before in other places and thought, 'Oh, that’s cool; I’ve already seen it.'
00:21:39.840
Now we’re going to show you something actually impressive. One of the things we've added to Artoo is support for OpenCV.
00:21:49.919
OpenCV is probably the standard computer vision library used in serious robotics projects by serious humans. So what we're going to do is connect to the capture device to pull the streaming video off the AR Drone.
00:22:07.840
We'll then put that through the processing device for analysis and feed the result back into the AR Drone code. The difference here is we will use a facial recognition library included with OpenCV.
00:22:24.000
So, what will happen is that after taking off, the drone will increase altitude to approximately face level, hover, and start a new timing loop.
00:22:41.840
Every half second, it will try to detect Damon’s face and adjust its position to look directly at him.
00:22:53.760
After 30 seconds, it should land. If all goes well, that’s what will happen, so let’s take a look using our universal video adapter.
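A rough outline of that face-tracking program; the OpenCV driver and helper names below are hypothetical and illustrative, not the exact Artoo API shown on the slide:

```ruby
require 'artoo'

connection :ardrone, adaptor: :ardrone
device :drone, driver: :ardrone

# Hypothetical OpenCV wiring: capture pulls the drone's video stream,
# window displays frames with detection boxes drawn on them.
connection :opencv, adaptor: :opencv
device :capture, driver: :capture
device :window,  driver: :window

work do
  drone.take_off
  # Rise to roughly face level, then hover (values are assumptions).
  after(5.seconds) { drone.up(0.3); drone.hover }

  every(0.5.seconds) do
    frame = capture.frame
    faces = detect_faces(frame)          # hypothetical helper wrapping an
    turn_toward(drone, faces.first) if faces.any?  # OpenCV Haar cascade
  end

  after(30.seconds) { drone.land }
end
```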
00:23:14.080
We can actually see what’s going on and now you’ll be able to see what it sees.
00:23:42.560
We’re just going to reboot the server real fast. Did you try rebooting the drone? Can you imagine that type of customer service?
00:24:01.760
We mean it! All right, one more time; this time you know it’s real.
00:24:19.760
We’re going to try this again, hopefully.
00:24:34.560
As soon as the Wi-Fi reconnects, this is why we were looking for that USB adapter!
00:24:58.880
Standing by; Red Five standing by.
00:25:11.840
Count down! Let’s go.
00:25:31.840
Uh, we're still standing by here, Houston.
00:25:41.440
Is there a problem on the launch pad? Standing by. Oh! We are connected!
00:26:01.440
See our video! Let’s zoom in!
00:26:16.960
You see, it's recognizing his face; every time, there's a red square around it!
00:26:28.480
Easy, boy! Easy! Now, we’re going to find out if the part that’s supposed to make it land actually works.
00:26:44.560
Let’s leave that for a minute. But there’s more! I know!
00:27:01.760
All right, so here we are. We've seen the AR Drone fly, and we’ve seen it fly autonomously.
00:27:15.520
Now we're going to put it back under human control. We’re adding some additional hardware, which includes the AR Drone, Arduino, and a Wii classic controller.
00:27:30.080
So, using the Artoo framework, we’re controlling the AR Drone with video coming from the camera and streaming through OpenCV.
00:27:47.920
Okay, so let's try it. I think we have two choices: fly without video or try again. I know I said two choices, but I lied!
00:28:02.720
Let's go for the same thing again, except this time, it works! Fresh batteries! The Parrot business model is all about selling repair parts and batteries!
00:28:18.520
Plugging in. They have a new high-capacity battery; I think we’ll pick one up.
00:28:37.760
One more attempt. Can we do it? Connecting again—the AR Drone is actually its own Wi-Fi access point.
00:28:56.720
This proves to be very convenient when flying with your phone but quite tricky when integrating with other devices. There are other drones coming; that’s all I'm going to say.
00:29:20.480
Standing by for takeoff! We are on the on-ramp to the highway to the danger zone!
00:29:36.480
Um, if it’s got facial recognition, do this!
00:29:49.520
I swear, it actually worked! All right, we're ready! Standing by!
00:30:13.919
Careful, you guys up front are really close.
00:30:29.440
There we go! Now that is air flight!
00:30:39.840
This just proves Ruby air superiority, my friends!
00:31:01.440
I’ll take on those newcomers; bring it on!
00:31:06.320
I think we should challenge them! I view it more like Voltron: forces combined.
00:31:29.040
And here's a great time to mention robots: RobotsConf is coming up in December in Florida. It's organized by Chris Williams, who also organized JSConf.
00:31:42.480
A bunch of NodeBots folks will be there, along with some Pythonistas and Clojure folks; it's going to be the ultimate open-source robotics conference ever!
00:31:58.080
Well played, sir! Was that fun? Thanks, guys!
00:32:14.880
We want you to join the robot evolution. Go check us out at artoo.io. We're also on Twitter, because I, for one, say welcome to the machines.
00:32:45.440
But what about robot economics? I mean, what happens when all the jobs are done by robots? What about robot ethics? Do they even have ethics?
00:32:55.040
And what is going to happen? The answer is: I don't know. However, I do have friends who are professional futurists.
00:33:12.000
One of them is Daniel Rasmus, who wrote a great book called 'Listening to the Future.' He talks about scenario analysis. So, we’re going to do a little brief scenario analysis.
00:33:25.600
We’re going to look at two axes: one is robot friendliness, are they friendly or not, and the other is robot sentience: are they intelligent or not?
00:33:42.720
Because we're based in Los Angeles, we think of everything in terms of movies. If the robots are not intelligent and not friendly, you get the movie 'Brazil'—nothing works!
00:33:56.800
If they are intelligent but not friendly, you end up with 'Terminator.' Enough said.
00:34:07.760
If the robots are friendly but not intelligent, we get 'Power Rangers.' If they are intelligent and friendly, we reach the singularity.
00:34:19.040
This guy, Isaac Asimov, spent a lot of time thinking about the interactions between humanity and robots, and he came up with the three laws of robotics.
00:34:32.320
So I would like to share these with you now:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
00:34:52.640
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
00:35:10.880
So, how is that working out for us? I don't really think it's fair because a drone can be flown by a human pilot located far from the battlefield.
00:35:24.400
We would like to suggest one little tweak—just one pull request—one patch revision to Asimov's first law.
00:35:31.680
We propose Law 1.1: A human may not injure a human being or, through inaction, allow a human being to come to harm. Imagine that future!
00:35:54.720
Thank you!