Ruby is For Fun and Robots

by Michael Ries

In the presentation titled "Ruby is For Fun and Robots" by Michael Ries at MountainWest RubyConf 2016, the speaker explores the playful side of programming with Ruby, focusing on robotics. Ries aims to highlight how programmers can engage in fun projects rather than solely working on serious applications, suggesting that many in the Ruby community started programming out of a desire to create exciting experiences, such as robots.

Key Points Discussed:

- Introduction to the Speaker: Michael Ries introduces himself and shares his passion for programming, emphasizing a shift towards enjoyment in coding, particularly through robotics.

- The Role of Robots in Fun Programming: Ries discusses his childhood fascination with robotics, noting that many people enter programming with the dream of building robots or video games. He highlights that robots can be both interactive and entertaining.

- Creating Interactive Robots: Ries elaborates on his vision for robots that can engage with users, similar to characters like R2-D2, suggesting that these robots should respond to human emotions and actions.

- Rubyists and Community Skills: He emphasizes how Rubyists excel at integrating various components and using libraries effectively, suggesting that their tendency to ‘glue’ code together makes them suitable for building robots.

- Tools and Platforms for Robotics: Ries recommends the iRobot Create platform and Raspberry Pi as accessible starting points for building robots. He describes how Ruby and various libraries can be employed to create robot behaviors without needing to delve deeply into lower-level programming languages.

- Using OpenCV and Google Cloud Vision: Ries mentions employing OpenCV for computer vision tasks and the recent introduction of Google Cloud Vision API to extend the capabilities of his robot to recognize faces and tweet about interactions.

- Demo Presentation: The talk culminates in a live demo of "Friendly Bot," which showcases the robot’s functionalities and its ability to perform tasks, tweet, and interact with attendees.

- Community and Inclusiveness: Ries concludes his presentation by appreciating the supportive nature of the Ruby community and encourages attendees to pursue fun programming projects, emphasizing the importance of staying engaged and inclusive.

Main Takeaways:

- Programming with Ruby can be a playful and enjoyable activity beyond professional obligations.

- Engaging in fun robotics projects can reconnect programmers with their passion for coding and creativity.

- Leveraging collective community knowledge and resources is essential for creating innovative and exciting programming projects.

00:00:00.000 Hey.
00:00:22.289 Welcome back, everybody! We're going to get going. Next up, we have Mike Ries, who is one of the locals here and the organizer of the Utah Valley meetup. After Brandon bailed and went to Austin, I didn't want to do it again, so we made Michael do it. But Michael is a good friend of mine. He actually lives in my neighborhood. We go to church together and we're home teaching in the canyons. It's pretty awesome, it's pretty epic.
00:00:43.690 So, there's no nepotism involved in Michael speaking, but actually, he was our last alternate. It quickly turned into, 'Can you do the full 45-minute session?' and he said, 'Sure, I have no stress at work; it's totally fun.' So I'm really looking forward to this. This is going to be fun; it's all about robots and similar topics.
00:01:28.140 It's always good to be known as the B-string, and I'm trying to carry that tradition on. As Mike mentioned, I really just want to talk about robots and fun stuff today. I hope that no one gets anything useful out of this presentation at all. You shouldn't be able to tell your boss about anything you learned during the next 45 minutes, and I'll make that as true as I can.
00:01:50.400 So yeah, there’s going to be lots of fun stuff like this. Quick introduction: I'm Mike Ries. I go by 'mmmries' (pronounced 'memories') pretty much everywhere I happen to be on the internet. I work at MX, but my real calling in life is making babies that have ridiculously good hair! I don’t have any stickers, but if you want to swap pictures of babies, just be prepared to be embarrassed because my babies are really good-looking. This is Winston; thank goodness he takes after me!
00:02:27.850 I'm really happy to be here at the last Mountain West RubyConf. I’ve spoken here once before and it is bittersweet for many of us. The other day, I was remembering the first time I attended a U-RUG meetup; I was still a student at BYU at the time. I remember that the week before, I had just gotten Ubuntu running on my ThinkPad laptop and thought I would be one of the cool kids when I showed up with it, but instead, I got there and everyone had a Mac. I felt so insecure about my hardware. I didn’t know it at the time, but I was just a hipster ahead of my time!
00:03:11.620 Now, it’s all about overthrowing our corporate overlords, and using Ubuntu would be cool again, but I was just too ahead of my time. Specifically within this U-RUG universe, I want to call out Mike. As he said, we have had plenty of time together. I've never gotten to cry into his shoulder, but I'm actively looking for a reason to start crying, so if anyone has a suggestion, feel free to let me know.
00:03:36.220 Some of you may not know this, but Mike is a secret millennial. I suspect that he’s actually made up of a group of little rascals hiding under a trench coat. Or perhaps Mike would prefer the metaphor of a group of ponies. This is actually why Mike loves to troll everyone so much—because at heart, he’s just a kid and loves to make fun of all the rest of us adults for the adult things that we do.
00:04:24.250 I also wanted to quickly call out something interesting: Tender Love at this conference has been looking really buttoned up. He’s looking almost enterprise-y, dare I say, wearing a jacket and a button-up shirt! I was thinking we might get the weird-hat Tender Love, or perhaps the wizard in a bathrobe, or maybe colonial Tender Love, but none of those ended up panning out. He sort of looks professor-like, so I propose we call him 'Professor Love' for the remainder of this last Mountain West RubyConf.
00:05:02.620 At the beginning, when I was thinking about this talk—when Mike was like, 'Hey, can you do a 20-minute talk? Okay, just kidding, do a 30-minute one. Okay, just kidding, you’re doing a 45-minute talk.' I was thinking about the useful and fun aspects of what I want to talk about. Then I just decided to throw away the useful part and focus on fun. This matters to me because, similar to one of our earlier talks from Jameson, I’ve experienced some ups and downs in my career where I get home at night, play hard with my kids, and as soon as they're asleep, I think, 'Boom! Gotta get on GitHub, got to get some PRs going, pad those stats!'
00:05:39.099 Then a month goes by and all I really want to do is watch reruns of 'Psych.' I’ve been on that roller coaster, and recently, a really good way for me to reconnect with the fun of programming has been through robots. Many people I know got into programming because they wanted to design video games, which is a fun endeavor, but for me, it was always about making robots.
00:06:21.930 I always wanted to make robots that could do cool things, like busting out some sweet dance moves, or fight each other awkwardly! I wanted to create robots that could play sweet jams. In fact, I have to say—this robot in the top right here—what I don’t understand is its job in this robot band because it’s not playing any instruments! I thought maybe it was a backup dancer, but it doesn't move at all; it just has giant saw blades on its head. Maybe that’s all you need to be in a robot band!
00:06:53.950 Also, wouldn't it be cool to make a robot that could raise its arms but somehow look really scary and intimidating in the process? Finally, I dream of building a chef bot that would create some sweet farm-to-table health Oreos—very finely diced and dedicated to that job. It would bring a lot of happiness into my life, but I’m not there yet; Chef Oreo is not quite ready for demo day.
00:08:05.150 I was preparing for this talk when I heard from another attendee about a place near here called 'God Hates Robots.' A little intimidating—maybe I should have thought about that before deciding to give my talk on robots! If anyone is out of town, though, I recommend this other place here, Tin Angel—it has super good food! Pro tip: go eat dinner there tonight and save room for panna cotta at the end. I didn’t realize this was, like, a theme, and it throws a wrench into my understanding of things. I’m feeling pretty intimidated. Hopefully, not many of you share similar feelings about robots in this crowd.
00:09:04.000 The robot I set out to build had the main goal of interacting with people. A lot of the robots I’ve experimented with in the past tried to accomplish practical jobs, like vacuuming a floor or moving something from one place to another with the minimum interference from humans. This time, I didn't want that kind of robot; I wanted to build something more interactive.
00:09:52.800 I envisioned a robot that would feel like R2-D2: when it ran into a good guy, it would beep happily at them and maybe shoot out a lightsaber from its head. If it encountered a bad person, it would somehow recognize that and back up while making nervous beeping sounds. Like Chef Oreo, this isn't totally feasible yet, but it's what I was aiming for.
00:10:11.130 I began by asking myself what skills I could fall back on as a Rubyist and what we as a community have acquired over our history. Right off the bat, Rubyists are really good at taking things and gluing them together. We may not understand how all the pieces work or worry about failure scenarios, but we make mash-up websites and projects proudly.
00:11:18.220 Thus, I didn’t want to get bogged down by the details of how many things I would need to order from SparkFun and Adafruit before I actually had something that moved. I wanted to focus on behavior instead, so I found this amazing platform that I highly recommend for anyone wanting to play with robots: the company that makes Roombas also sells a product called the iRobot Create. They are on the second version of it, and it comes with batteries included, motors already set up, as well as infrared sensors, bump sensors, cliff sensors, wheel encoders, and buttons.
00:12:47.560 It even tells you the charging state of the battery and how full it is. If you tried to build all of this from scratch, it would take several months of learning. Now, that wasn’t the part of the project I wanted to work on; I wanted to focus on its behavior. On top of that, I didn’t want to learn C—it’s a great language, I'm sure it would do a magnificent job of running this robot, but it means a lot of new tools and gotchas. That's when I turned to projects like the Raspberry Pi. It has an SSH daemon that runs when you first boot it up.
00:13:28.750 I know how to do that; I know how to SSH to things, and I understand how Linux processes work and communicate. It’s like having my own mini-cloud on a card, and that’s loads of fun! It saves a ton of time. I also thought more about what Rubyists are really good at: we excel at grabbing code snippets from Stack Overflow and mindlessly copying and pasting them into our projects, so that seemed like a good direction.
00:14:42.000 I explored Ruby Gems, and one of the tricky things about a robot like this is knowing how to detect if there are people around. That’s a tough problem to solve! There's a great set of libraries known collectively as OpenCV, which handles a lot of computer vision tasks, but it’s written in C for performance and historical reasons. I don’t want to learn computer vision theory, but fortunately, I can use Spyglass! Thank goodness for Andre Medeiros and the brilliant Ruby community we have.
00:15:35.950 This is an example pulled from their GitHub project’s examples directory. You can see we create a new video capture, then use the reverse shovel operator—I don’t know if there’s a cool name for this besides ‘chevron’—to pull frames from the video capture device into a variable, and then we pass them into a cascade classifier. I don’t know what that is or how it works, but I'm a Rubyist and I just steal this code because it works, and that’s great!
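For reference, the example being described looks roughly like the sketch below. The class and method names are reconstructed from the description above rather than copied from the Spyglass gem's examples directory, so treat this as an illustration of the shape of the code, not as the gem's exact API.

```ruby
# NOTE: names below are assumptions based on the talk's description,
# not verified against the spyglass gem itself.
require 'spyglass' # Ruby bindings for OpenCV

# One of the frontal-face Haar cascade files that ships with OpenCV.
detector = Spyglass::CascadeClassifier.new('haarcascade_frontalface_alt.xml')

capture = Spyglass::VideoCapture.new(0) # device 0: the first attached webcam
frame   = Spyglass::Image.new

capture >> frame                 # the "reverse shovel": pull one frame off the camera
faces = detector.detect(frame)   # bounding rectangles for likely faces

puts "found #{faces.size} face(s)"
```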
00:16:19.070 Oh, but I didn’t know how to tell which faces are 'Jedi' and which ones are from the Empire! I needed to differentiate the faces I was seeing, and luckily, during the prep for this talk, Google Cloud Vision API was announced! I was able to extend my project by using that! So, a big thank you to Mike for that—I'll shamelessly steal more of your code.
00:16:55.800 This API is super fun to play around with if you haven’t tried it yet. It does OCR to extract text from images and recognizes taggable entities—like sailboats, dogs, kittens, and probably even dinosaurs! I haven’t tested that yet, but I’m sure it can recognize them. It also performs facial analysis. While I still can’t tell the difference between a stormtrooper who’s thrilled to destroy my robot and a Jedi who is happy because they understand the force, at least if they’re happy, I know generally how to respond!
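As a rough illustration of the facial analysis being described, here is a minimal sketch that posts an image to the Cloud Vision REST endpoint and reads back the joy likelihood for the first detected face. The API key and image path are placeholders, and this calls the public images:annotate endpoint directly rather than whichever client library the talk actually used.

```ruby
require 'net/http'
require 'json'
require 'base64'

# Ask Cloud Vision whether the first face it finds looks happy.
def annotate_face(image_path, api_key)
  uri  = URI("https://vision.googleapis.com/v1/images:annotate?key=#{api_key}")
  body = {
    requests: [{
      image:    { content: Base64.strict_encode64(File.binread(image_path)) },
      features: [{ type: 'FACE_DETECTION', maxResults: 1 }]
    }]
  }

  response = Net::HTTP.post(uri, body.to_json, 'Content-Type' => 'application/json')
  face = JSON.parse(response.body).dig('responses', 0, 'faceAnnotations', 0)
  face && face['joyLikelihood'] # e.g. "VERY_LIKELY" means a probably-happy Jedi
end
```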
00:17:56.100 Next, Rubyists are also very good at having strong opinions. For example, some people in this room might have strong opinions about the merits of MiniTest versus RSpec, hypothetically speaking! And we may also have preferences about VMs, or Vim versus Emacs, or other editors. This is something we excel at as a community. Beeping and moving around are great ways for my robot to interact, but it'd be even better if it was more like a Rubyist. So, it has a Twitter account and will tweet somewhat trolling opinions about the people it sees.
00:20:00.200 A little note here: I have a good friend, Chris, in the audience, and I’m fairly certain if I ever posted a picture of him on social media, it’d require a signature with the GPG key. So, it’ll only tweet pictures of faces that are very close to the camera. If it scans the room, it won’t magically capture all your faces. The webcam it uses is terrible and can’t see much beyond a few feet. But if you get up close, like Tyler did, you’ll get a good picture with some opinion from a robot posted publicly.
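For a sense of what the tweeting side might look like, here is a hedged sketch using the `twitter` gem. The credentials are placeholder environment variables, the canned opinions are invented for illustration, and `update_with_media` is the image-attaching call older versions of the gem exposed.

```ruby
require 'twitter'

client = Twitter::REST::Client.new do |config|
  config.consumer_key        = ENV['TWITTER_CONSUMER_KEY']
  config.consumer_secret     = ENV['TWITTER_CONSUMER_SECRET']
  config.access_token        = ENV['TWITTER_ACCESS_TOKEN']
  config.access_token_secret = ENV['TWITTER_ACCESS_TOKEN_SECRET']
end

# Invented examples of "somewhat trolling" opinions.
OPINIONS = [
  'This human probably indents with tabs.',
  'Definitely a MiniTest person. Definitely.',
  'New Ruby friend acquired!'
].freeze

# Post a close-up picture with a random canned opinion attached.
def tweet_face(client, image_path)
  client.update_with_media(OPINIONS.sample, File.new(image_path))
end
```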
00:21:00.000 The current version of the friendly bot looks like this: unsurprisingly, like a bunch of stuff that was glued and taped together by an amateur! The Raspberry Pi is riding on the back, sitting on top of an external battery. It's possible to power the Pi from the Roomba's own battery, but then you have to deal with level shifting the power supply and worry about power surges; I didn’t want to bother with that, so I bought another battery instead. It has an awful webcam on the front. At this point, it’s worth pausing, before my inevitable demo failure, to point out that every one of these little pieces was a complete blast to work on!
00:22:06.000 The first time I sent a binary command to this Roomba was around 2 a.m. when my kids were asleep, and I didn’t know what the unit of measure was for the speed parameter. I accidentally commanded it to drive at half a meter per second, and it went tearing across the kitchen floor, disconnecting from the plugs I had wired into the back. I couldn't send the stop command, and it started crashing into everything! I was certain it would wake everyone, and when I finally wrestled it to the ground and unscrewed its battery to turn it off, I experienced a feeling of pure joy, a little different than debugging a production system, but a similar excitement!
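To make the anecdote concrete: the iRobot Open Interface speaks single-byte opcodes followed by big-endian signed 16-bit parameters, and the Drive command's velocity is given in millimeters per second, so 500 really is half a meter per second. Below is a minimal sketch of sending that command from Ruby with the `serialport` gem; the device path is a guess for a USB serial cable on a Raspberry Pi.

```ruby
require 'serialport'

STRAIGHT = 0x7FFF # special radius value meaning "drive straight"

# /dev/ttyUSB0 is an assumed path; the Open Interface talks at 115,200 baud.
port = SerialPort.new('/dev/ttyUSB0', 115_200)

port.write [128].pack('C') # Start: open the Open Interface
sleep 0.2
port.write [131].pack('C') # Safe mode: motors enabled, cliff safety still on

# Drive: opcode 137, then velocity and radius as signed 16-bit big-endian.
# 500 mm/s is half a meter per second, which is how the 2 a.m. kitchen
# incident happened; 100 mm/s is a much friendlier speed.
velocity_mm_s = 500
port.write [137].pack('C') + [velocity_mm_s, STRAIGHT].pack('s>s>')

sleep 1
port.write [137].pack('C') + [0, STRAIGHT].pack('s>s>') # stop
```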
00:23:13.500 Since this presentation can’t get any worse, let's move on to the demo! Just a quick recap: for this demo to work, the Raspberry Pi has to boot without its SD card acting up and refusing to boot at all, which happens maybe one time in ten. It needs to communicate over a serial interface at 115,200 baud to a hopefully charged robot and connect to Wi-Fi. Luckily, we’re using a hotspot. Then, if the Cloud Vision API is also up and responsive, it should move and take pictures! If the lighting is okay, it might get a decent picture and tweet about it!
00:24:56.730 So yes, many things could go wrong, and it’ll probably go something like this: here is its Twitter account. Definitely follow it; hopefully, it will tweet soon! Please, dear robot, work for the love of all things good! I’m going to have it come this way so that the lights are on my face because the low light is the hardest part for the camera.
00:25:43.200 It takes about a minute to start because it has to boot a Ruby VM and the Erlang VM for the Elixir side, along with a couple of other tasks, and bring up the camera. Here we go. If I get close, it might play a sound...
00:27:20.000 Oh, it made a sound! Alright, no random shouting, please; this is not a Republican debate. Can someone please help me find my robot later? Mike Moore is going to rescue Friendly Bot. That went much less terribly than I expected, and as a result, I have a little time to explain how this actually works. I promise not to go too deep, but I'll roll through this quickly.
00:28:26.100 The first thing I had to do to get this bot moving was to send binary commands. The iRobot folks provided a really great PDF that shows all these commands and their structures. They use standard encodings, like signed integers. It turns out that Elixir and Erlang have a great syntax for parsing and generating this data. So, it was fun to write in another language, and Roomba-Ex was born. You can use that if you'd like! Here’s what the code looks like: I'm parsing a single byte representing the state of all the buttons on the Roomba. There are eight buttons!
00:29:10.490 By saying, 'Hey, bring in this one byte and pattern match it,' I can assign bits to the respective buttons. This was a lot of fun! I also needed something stateful to manage the commands, because I wanted to know whether the robot should currently be driving or stopped. I aptly named it 'DJ' after our dear departed friend, DJ Roomba.
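The actual parsing in Roomba-Ex is written with Elixir binary pattern matching; the sketch below is a Ruby rendering of the same idea for illustration only. The bit-to-button mapping is my reading of the Open Interface spec and should be treated as an assumption.

```ruby
# Assumed bit order (least significant first); check the OI spec before relying on it.
BUTTONS = %i[clean spot dock minute hour day schedule clock].freeze

# Turn the single buttons byte into a hash of button name => pressed?
def parse_buttons(byte_string)
  bits = byte_string.unpack1('C') # one unsigned byte holds all eight buttons
  BUTTONS.each_with_index.with_object({}) do |(name, i), pressed|
    pressed[name] = bits[i] == 1  # Integer#[] reads bit i
  end
end

parse_buttons("\x05") # => { clean: true, spot: false, dock: true, ... }
```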
00:30:05.579 DJ handles most of the code for this project. It receives sensor updates 30 times a second, checks the sensors, and uses pattern matching to recognize common scenarios, like when the bump sensor is pressed or released. It then alters the driving command accordingly, shuffling itself around. It drives somewhat painfully slowly, but that makes it easier to find faces in the crowd after my talk! The Ruby program, meanwhile, looks for faces and tells the bot when it finds one.
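DJ itself lives on the Elixir side, so the following is only a Ruby-flavored sketch of the reaction logic just described; the `Roomba` wrapper and its `drive` and `spin` methods are hypothetical stand-ins for the real serial commands.

```ruby
# Sketch of DJ's reaction logic: sensor frames arrive ~30 times a second,
# and the driving command changes when a bump sensor fires.
class DJ
  WANDER_SPEED = 100 # mm/s: painfully slow on purpose, easier to find faces

  def initialize(roomba)
    @roomba = roomba # hypothetical wrapper around the serial commands above
  end

  def handle_sensor_frame(sensors)
    if sensors[:bump_left] || sensors[:bump_right]
      @roomba.drive(-WANDER_SPEED) # back away from whatever we hit
      @roomba.spin(rand(90..180))  # shuffle around to a new heading
    else
      @roomba.drive(WANDER_SPEED)  # keep wandering
    end
  end
end
```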
00:31:11.020 Here’s a neat piece of code to share with your team: global variables galore! Long live globals! In true Ruby fashion, if nobody needs to read or understand your code, global variables can be a lot of fun! The main loop essentially instructs the bot to move while sampling frames. It processes around 10 frames per second using about half a core on the Raspberry Pi, which is better than I expected. If it detects a face, it stops to capture a better image. If it doesn’t find anything, it plays a sad tune! You may hear those later in the lobby.
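In the same spirit, here is a hedged sketch of the shape of that main loop, globals and all; `$camera`, `$detector`, `$dj`, and `analyze_and_tweet` are hypothetical stand-ins for the real objects described in the talk.

```ruby
$frames_seen = 0

loop do
  frame = $camera.read_frame # roughly 10 frames per second on the Pi
  $frames_seen += 1

  faces = $detector.detect(frame)
  if faces.any?
    $dj.stop                      # hold still for a less blurry shot
    close_up = $camera.read_frame
    analyze_and_tweet(close_up)   # Cloud Vision + Twitter, as sketched above
  else
    $dj.keep_wandering
    $dj.play_sad_tune if $frames_seen % 300 == 0 # mope occasionally
  end
end
```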
00:32:20.670 As for analysis, I pass images to this analyzer object, sending it off to Google Cloud to get results. It checks for blurriness, and if everything looks good, it’ll post to Twitter with a pseudo-random, somewhat trolling remark!
00:33:38.110 I've spent most of this presentation humorously addressing our community's strengths: stealing code snippets and blindly using resources. However, there are wonderful aspects of our community that shine, like the Friday hug, celebrating our heroes, and the 'Why the Lucky Stiff' legacy! These are individuals who have emphasized whimsy, playfulness, and joy over technical excellence or superiority. Initiatives like Ruby Friends aim to make social interactions less awkward among Rubyists.
00:34:52.290 I'd like to believe that Friendly Bot will help Rubyists connect with each other at conferences, fulfilling a neat purpose as well. In closing, the gentle support in this community, the tender love, if you will, that gives rise to nice interactions is something I appreciate. Our community believes in the importance of kindness and of welcoming newcomers, and that's what draws me to it. I hope that, moving forward, we continue to embody these values and remain inclusive, regardless of how long we've been in the trenches. Thank you!
00:36:12.890 I shouldn't repeat it because it’s a troll, but I should because it needs to be noted for future humanity. The question was, did I get a lot of design input on the robot or was it designed in a vacuum?
00:36:26.450 Thank you for that. Any other questions? Yes, did I look into frameworks like Artoo? There's a robotics framework for Ruby called Artoo, made by the same people behind Cylon.js and Gobot. It's a really cool framework! I did consider it, but I was excited about understanding how these bits and bytes fly over the wire, and since it intrigued me, I decided to write it myself.
00:37:01.230 I haven’t been asked a lot of questions, so let me go back to this one! There are only so many hours in the week, and we have families and other commitments to work around. How do I find time for these projects? I’d say I’m a victim of my own interests. I always wanted to build a robot, but earlier on, the projects felt too overwhelming, or I didn't have the right materials to start.
00:37:54.420 At this point, I saw this Roomba project, and I realized I could dive into it quickly without layers of distractions. I'd estimate that this endeavor required about 90 to 100 hours of coding, debugging, and just playing around. It's significant work, yet my wife and I plan our weeks together. She tells me which evenings she has dinners or classes, and I tell her which evenings I'm reserving for programming. This sets our expectations, so once the kids are in bed, I know I’ll be focusing on my projects.
00:38:58.560 Let’s wrap with a fun note! The robots I make have a delightful quirk: you need to watch them, and there's something charming about their limited capabilities combined with their spirit of effort! So, as I close, I’d like to remind everyone: I'll be letting Friendly Bot roam around here, so it can take pictures and tweet about the lovely people attending this conference! If you’re intrigued by robots, let’s chat. Thank you, everyone!