Keynote: The Signal

Ron Evans • February 21, 2013 • Burbank, CA

Summarized using AI

In this keynote presentation titled "The Signal" at the LA Ruby Conference 2013, Ron Evans explores the intersection of robotics, communication, and the future of technology. Evans emphasizes the historical impact of significant thinkers in the fields of information theory and programming, drawing on the contributions of Claude Shannon, Ada Lovelace, and other influential figures. The talk is structured into three main parts: the creation of communication machines, the application of Ruby in robotics, and the implications of automation in society.

Key Points Discussed:
- Creating the Machines: Evans discusses the transformation of raw noise into meaningful signals, invoking the foundational work of Claude Shannon in information theory and the legacy of pioneers like Ada Lovelace and Grace Hopper.
- Ruby on Robots: He introduces Artoo, a Ruby-powered microframework for robotics that emphasizes concurrent communication and handles device interactions through message-passing paradigms inspired by thinkers such as Alan Kay.
- Practical Demonstrations: Evans provides live coding examples showing how Artoo interacts with hardware such as Arduinos and the Sphero, emphasizing the simplicity of communication between the framework and various devices. He highlights capabilities such as toggling LEDs and controlling robotic toys for educational and prototyping purposes.
- The Future of Robotics: In the final section, Evans reflects on the ethical implications of increasingly autonomous robots. He addresses concerns over job security, moral programming decisions, and the future relationship between humans and intelligent machines. He concludes with a call for developers to engage deeply with the ethical considerations of robotics to create a balanced future.

The talk ultimately encourages attendees to innovate responsibly while keeping humanity's values in focus amid the rapid advances in robotics and automation. Evans invites participation in developing the Artoo framework, stressing collaboration within the robotics community.

Visit http://artoo.io for more details of the framework Ron presents in this talk!

Help us caption & translate this video!

http://amara.org/v/FGdX/

LA RubyConf 2013

00:00:24.040 Ron and Damon Evans presented at the Marconi in 2009. They gave an interesting presentation on flying robots, featuring a flying blimp. If you watch the video online, you'll find it quite fascinating. Ron is known for his incredibly high energy. For the past four years, besides his work at Hybrid Group, he has been driving a program called KidsRuby. Let's give a round of applause for that!
00:01:03.790 If you haven't seen KidsRuby, I encourage you to check it out. There's an event happening tomorrow, and another in two weeks up in Bend, Oregon. KidsRuby focuses on teaching programming to children, starting from around age six, though more realistically eight, as long as they can use a mouse.
00:01:17.360 I wanted to share a quick story. For those who don't know, I run a company called Confreaks, which records this conference and about a dozen others each year. Back at RubyConf 2009, we were using robotic pan-tilt-zoom cameras, which was an impressive setup but caused quite a headache: we ended up with the wrong set of cables.
00:01:34.070 So there we were at a software development conference, and I had to trek 45 minutes away to buy cables. This was especially challenging because the conference was held at a hotel near the San Francisco airport, with no electronics stores nearby. We made the trek and returned with the wrong type of cables, which had molded plastic ends, making it impossible to rewire them. I posted a tweet asking if anyone had a voltmeter. Fifteen to twenty minutes later, Ron appeared with his multimeter, and he saved the day.
00:02:05.180 We managed to rewire the cables and record the sessions. You can watch those sessions online if you’re interested. Ron has been a tremendous friend throughout the Ruby community and to both Junior and me at LA RubyConf. So, when we discussed who the keynote speaker should be this year, Junior and I agreed that Ron, who has contributed so much energy to our community and given outstanding presentations, was a perfect choice.
00:02:30.020 We're delighted to have him here and hope you all enjoy what he has to share. Junior has something to add.
00:03:14.840 $395,000. $250,000. $180,000. If you break it down, you have over a million dollars' worth of cars within 30 feet of you. Have fun! Just call me Magnum. Well, welcome everyone! This is the Los Angeles Ruby Conference, and I want to thank Junior, Connie Fendt, and Coby for the extraordinary effort they put into organizing this conference. Let's give them a big round of applause!
00:03:37.340 I'm @deadprogram on Twitter, and I'm Ron Evans. I work at Hybrid Group here in Los Angeles, a software development company. This keynote is in three parts, because I don't know if I will ever get asked to keynote again, and I really wanted to get this message across: The Signal.
00:05:06.510 In the beginning, there was noise, and it had no distinguishable form, so it was not very useful. Then man made the signal, and it was good. This is what we need to communicate. We took these signals, combined them, and created messages. From these messages, the entire universe of human thought sprang into being.
00:05:35.280 This man, Claude Shannon, is the creator of information theory. He understood that with just two states, on and off, the entire universe of thought can be represented. Yet, this idea was rooted in earlier concepts introduced by Ada Lovelace, who recognized the difference between things and their representation.
00:06:00.530 This fundamental idea laid the foundation for later concepts of information and entropy posited by Stephen Hawking. Grace Hopper further developed this idea of communication abstraction, allowing us to communicate using the same language with different types of machines.
00:06:16.530 Kevin Ashton is the man who coined the term the Internet of Things, which defines the connected future we are part of today. Rob Pike, who helped build much of what we call UNIX today, gave an influential talk on the differences between parallelism and concurrency that shaped my thinking. Then there's John Little, of Little's Law in queueing theory, and Alan Kay, the creator of object-oriented programming, who realized that the message is the fundamental unit of communication.
00:06:30.780 Lastly, Ray Kurzweil sees beyond our current reality into the future. All of these thinkers are examples of Isaac Newton's idea: 'If I have seen further, it is by standing on the shoulders of giants.' These individuals have influenced the thoughts and ideas that will be shared in this talk today.
00:06:48.580 As you embark on your journey of intellectual creation, remember that every thought you ever had was a message sent by someone else. So, be humble.
00:07:07.940 That concludes part one: Creating the Machines. Now, for part two, Ruby on Robots.
00:07:15.500 For this section, I would like to introduce some of my colleagues at Hybrid Group. We have Adrian Zankich, our flight engineer; my brother Damon Evans, the test pilot; and Sean Engorley, our photographer at large.
00:07:29.770 There’s a question many have asked: Is innovation dead? Some claim we’ve invented everything there is to invent. But I say that's nonsense. Innovation is not dead; it's all around us. As William Gibson said, 'The future is already here; it's just not evenly distributed.' In 2009, when my brother Damon and I showcased the first Ruby-powered unmanned aerial vehicle, we had pieced together various parts ordered from China. Times have changed since then; you can now buy a drone at Sharper Image or a robotic vacuum at Best Buy. The future is already here, but the challenge lies in making it do something useful.
00:08:15.120 When we built our Ruby-powered flying robot, it was incredibly difficult. Achieving anything useful requires integrating not just hardware but also software, and that is what is holding back advances in robotics. Today, I am proud to introduce Artoo, a Ruby-powered microframework for robotics and physical computing. It supports multiple hardware platforms and lets you communicate with many devices concurrently.
00:08:35.300 Many believe that Ruby doesn't scale, but they haven't come across something called Celluloid. Celluloid is a framework created by the brilliant Tony Arcieri, and it incorporates powerful ideas from the thinkers I mentioned earlier: Rob Pike's talks on concurrency and parallelism, and Alan Kay's ideas about message passing. Celluloid takes a different approach to message handling in Ruby.
00:09:19.890 In Ruby, when we call a method, we expect it to execute immediately. Celluloid, however, lets things happen at different times and in different places. This is concurrency: structuring a program as independently executing tasks. It should not be confused with parallelism, which is actually running those tasks at the same time. Celluloid changes the way we think about handling messages in Ruby: every Ruby object can respond to these messages and act independently.
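To make that concrete, here is a minimal Celluloid sketch (the class and method names are invented for illustration): calling through `.async` or `.future` sends a message to the actor's mailbox instead of running the method in the caller's thread.

```ruby
require 'celluloid'

class Greeter
  include Celluloid   # turns instances of this class into actors

  def greet(name)
    sleep 1           # slow work happens inside the actor's own thread
    "hello, #{name}"
  end
end

greeter = Greeter.new
greeter.async.greet("robot")            # fire-and-forget: returns immediately
future = greeter.future.greet("world")  # a promise for the eventual result
puts future.value                       # block only when we actually need it
```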
00:10:05.600 By combining Celluloid with Celluloid::IO, an extension that adds evented IO to Celluloid actors, we get a framework that combines actor-based message handling with TCP and UDP sockets. This gives us a clean and efficient way to connect to hardware over event-driven IO.
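A rough sketch of what that combination looks like, assuming a device that happens to expose a plain TCP socket (the host, port, and protocol here are hypothetical):

```ruby
require 'celluloid/io'

class DeviceConnection
  include Celluloid::IO   # gives this actor an evented reactor for its sockets

  def initialize(host, port)
    # Celluloid::IO's TCPSocket cooperates with the actor's event loop,
    # so reads and writes don't block other messages sent to this actor.
    @socket = Celluloid::IO::TCPSocket.new(host, port)
  end

  def send_command(bytes)
    @socket.write(bytes)
    @socket.readpartial(1024)   # return whatever the device answers with
  end
end

# conn = DeviceConnection.new("127.0.0.1", 4567)
# puts conn.send_command("LED ON\n")
```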
00:10:56.900 Artoo runs on JRuby or Rubinius because we need a Ruby implementation that supports real threading. My personal favorite is Rubinius, which has a powerful threading model and is built on LLVM, making it very fast and portable across hardware platforms. Artoo is organized around a few design patterns, most importantly the adapter pattern. Much as Ruby on Rails manages database connections through adapters, Artoo lets a robot communicate with various types of machines through an adapter.
00:12:15.750 The adapter establishes how we communicate with a device: the basic communication and how we control its behavior. On top of that, Artoo uses a publish-subscribe architecture, so events flow back to the robot when things happen on the device, enabling it to respond and act appropriately.
00:12:50.750 In Artoo, device drivers define what a device can do and how we interact with it, on top of whatever adapter carries the communication. Together, adapters and drivers let us script robots at a high level, making robotics development and communication much more accessible and efficient. A plain-Ruby sketch of this split follows.
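The sketch below is illustrative only: the class and method names are invented to show the adapter, driver, and publish-subscribe roles described above, and are not Artoo's actual internals.

```ruby
# Adapter: knows *how* to move bytes to and from a particular device.
class SerialAdaptor
  def initialize(port)
    @port = port
  end

  def connect
    @io = File.open(@port, 'r+')   # e.g. a serial device node like /dev/ttyACM0
  end

  def read
    @io.readpartial(64)
  end
end

# Driver: knows *what* the device can do and which events it emits.
class ButtonDriver
  def initialize(adaptor)
    @adaptor = adaptor
    @subscribers = Hash.new { |h, k| h[k] = [] }
  end

  def on(event, &handler)          # subscribe to an event, e.g. :push
    @subscribers[event] << handler
  end

  def poll
    publish(:push) if @adaptor.read.include?("1")
  end

  private

  def publish(event)               # notify everything subscribed to this event
    @subscribers[event].each(&:call)
  end
end
```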
00:13:26.000 Now, let's move on to the demos: the practical applications of our framework.
00:14:10.020 I'm excited to show you a demo today! Let's dive into some practical examples of using R2. We’ll start with showcasing a robot system that relies on a series of hardware components. My brother Damon will introduce our favorite piece of hardware, the Arduino. Many of you might be familiar with them; they’re versatile and accessible.
00:14:29.740 Initially, when we started exploring hardware and robotics, Arduinos were new gadgets, usually available only online or through DIY methods. Thankfully, you can now find them at local electronics stores like RadioShack, which has gotten back into that business. We've created a prototyping board where we link various components together for our demonstrations.
00:14:58.030 Now, let's take a look at some code. Here's a very simple Artoo program. I hope you can see the code clearly. We're requiring the Artoo gem and using an adapter called Firmata, which establishes a connection to the Arduino so we can talk to its digital and analog input/output.
00:15:17.670 This example demonstrates how to communicate with the Arduino using the socket-to-serial interface. We’ll toggle an LED on the board every second, providing a straightforward demonstration of our hardware communication.
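A blink program like the one described might look roughly like this in Artoo's DSL (a sketch; the serial port path and pin number are assumptions for illustration):

```ruby
require 'artoo'

# Connect to the Arduino through the Firmata adaptor over its serial port.
connection :arduino, adaptor: :firmata, port: '/dev/ttyACM0'
device :led, driver: :led, pin: 13

work do
  every(1) { led.toggle }   # toggle the LED once per second
end
```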
00:15:43.050 Next, we'll run another example, this one interactive: a button is wired up, and we query Firmata for button events so that the LED toggles on each press. This illustrates how easily Artoo handles real-time hardware input; a sketch of that program follows.
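A minimal version of that button example, again as a sketch (the pin assignments and port are assumptions):

```ruby
require 'artoo'

connection :arduino, adaptor: :firmata, port: '/dev/ttyACM0'
device :led,    driver: :led,    pin: 13
device :button, driver: :button, pin: 2

work do
  # Subscribe to the button's :push event and toggle the LED on each press.
  on button, push: proc { led.toggle }
end
```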
00:16:02.490 Building off that functionality, we’ve set the stage to introduce another piece of innovative hardware: Sphero. This is an incredible robotic toy that offers a lot of potential for interactive projects. Damon will now provide insight into Sphero’s functionalities and features.
00:16:30.890 The Sphero is essentially a ball that can be controlled wirelessly. Within it, you'll find various sensors and motors that facilitate movement through a change in weight distribution. It also has LED lights that allow for visual effects, enhancing our robotic applications.
00:17:10.000 As we explore further applications and demonstrate interaction with the Sphero, we can link it over Bluetooth. With the proper setup, we can issue commands and receive responses, allowing us to create interesting and engaging interactions.
00:17:32.890 By programming the Sphero to roll in a random direction using the Artoo framework, we can create less predictable behavior while demonstrating robotics principles. Controlling it through physical interactions, such as buttons and movements, further enhances the user's experience. A sketch of the random-roll program follows.
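A random-roll Sphero program in Artoo looks roughly like this (a sketch; the Bluetooth serial port is an assumption):

```ruby
require 'artoo'

connection :sphero, adaptor: :sphero, port: '/dev/rfcomm0'
device :sphero, driver: :sphero

work do
  every(3) do
    # Roll at a fixed speed in a freshly chosen random heading (0-359 degrees).
    sphero.roll 60, rand(360)
  end
end
```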
00:18:12.560 Beyond demonstrating a variety of features, we should emphasize the code itself. Understanding how to drive these robotic toys programmatically is crucial for developers who want to embed intelligence in their creations and harness the true capabilities of the hardware.
00:19:05.220 We can program multiple Spheros to interact with one another, allowing for more dynamic and complex setups and paving the way for swarm behaviors in robotics. Artoo makes it easy to set up multiple devices and to build up these interaction patterns incrementally; a multi-robot sketch follows.
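Running several Spheros from one program can be sketched by defining a robot class and instantiating it once per device (the port names are placeholders, not from the talk):

```ruby
require 'artoo'

class SpheroRobot < Artoo::Robot
  connection :sphero, adaptor: :sphero
  device :sphero, driver: :sphero

  work do
    every(3) { sphero.roll 60, rand(360) }
  end
end

# One robot per connected Sphero; each runs its work loop concurrently.
robots = ['/dev/rfcomm0', '/dev/rfcomm1'].map do |port|
  SpheroRobot.new(connections: { sphero: { port: port } })
end

SpheroRobot.work!(robots)
```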
00:19:46.000 Now, let's take a step forward and see how we can combine Artoo with the AR.Drone. The drone shows how different hardware platforms can be controlled and operated together: it uses a slightly different interface but the same underlying principles of our robotics framework.
00:20:22.820 As we issue commands to the AR.Drone, we can see real-time responses through its navigation capabilities, combining visual feedback with command execution. Seeing all of these systems work in tandem really brings our understanding of robotics to life.
00:20:56.300 By showing how the drone can take off, operate, and return safely, we highlight the advantages of our programming framework in managing complex tasks in real-time. Plus, with visual and interactive feedback, audiences can see how every component fits together into the larger robotics picture.
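An Artoo take-off-and-land script for the AR.Drone is roughly this shape (a sketch; the drone's address and the timings are assumptions):

```ruby
require 'artoo'

connection :ardrone, adaptor: :ardrone, port: '192.168.1.1:5556'
device :drone, driver: :ardrone

work do
  drone.take_off
  after(10) { drone.hover.land }   # hover briefly, then come back down
  after(15) { drone.stop }         # cut the motors once safely landed
end
```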
00:21:42.000 In conclusion, there's so much practicality and potential in using Artoo and combining it with various hardware platforms. We've just scratched the surface of its capabilities today. As developers and enthusiasts, it's our duty to explore these possibilities further and expand the boundaries of what robotics can achieve.
00:22:58.000 If you're interested in this technology, you can find more about what we've developed. Be sure to visit artoo.io for additional resources and information. This also opens the door for collaboration: we invite you to check out the project on GitHub, contribute, and help us improve this framework for others in the robotics community.
00:23:21.000 Now, it’s time for the final part of my keynote titled 'Welcome to the Machines.' Here, we will explore the impact of robotics and automation in our modern society and what that means for humanity moving forward.
00:23:43.750 As we embrace the robotic revolution, it is essential to consider the ramifications for the traditional workforce. The integration of intelligent robots into various industries raises questions about job security, ethical programming decisions, and the future interaction between humans and robots.
00:24:40.000 Importantly, will robots become sentient, and how will they behave towards us? Robot ethics arises from questions about whether robots will have their own intuition and moral compass, and how that will shape our coexistence. The future looks both exciting and uncertain.
00:25:20.000 To illustrate these points, I’d like to reference a visualization I created based on scenario analysis of future interactions between robots and humans. If robots remain non-sentient and unfriendly, we remain in a dysfunctional cycle, reminiscent of the film Brazil. If they become intelligent yet adversarial, we may find ourselves facing the consequences portrayed in the Terminator movies.
00:26:10.000 Conversely, if robots are friendly but not intelligent, we stay in charge, directing them almost as extensions of ourselves, the way cyborgs are portrayed in sci-fi narratives. Ultimately, the ideal scenario would be friendly robots achieving consciousness, which could usher in an era of collaboration where we are no longer adversaries but allies.
00:26:50.000 Isaac Asimov articulated three essential laws of robotics long ago. These laws were intended to govern robot behavior and interactions with humans, promoting a safer integration of technology into society. We must reflect on how these principles have evolved and consider their relevance as we embrace the future.
00:27:30.000 Of course, it’s crucial to remember that current robotic capabilities are human-controlled processes, often resulting in outcomes beyond our intentions. Perhaps it’s more important to focus on how we, as creators, choose to implement robotics instead of just focusing on the technology itself.
00:28:12.000 In rethinking Asimov’s first law—that robots may not harm humans—let's explore how this may influence our technological landscape. Perhaps the future necessitates a deeper understanding of our own ethical frameworks as we venture deeper into the realm of robotics and automation.
00:28:52.000 As we look towards these possibilities, it becomes increasingly important for us as developers and innovators to engage actively in the discussion on robotics and automated systems. It's our responsibility to embrace not only the technology but also the principles of ethics, morality, and empathy that will ensure a positive future.
00:29:32.000 This journey into robotics is as much about the human experience as it is about the technology itself. Let’s strive to keep humanity at the forefront of our advancements as we move into an era populated by machines.
00:30:10.000 Thank you for allowing me to share this vision with you today. I am excited about the future, and I hope you are too! Let’s work together to shape a hopeful and constructive future for robotics technology.