Mobile Applications

Creating AR Apps with RubyMotion

by Lori Olson

The video, titled 'Creating AR Apps with RubyMotion' and presented by Lori Olson at RubyConf 2019, explores the development of augmented reality (AR) applications using RubyMotion, which is being rebranded as DragonRuby. The presentation begins with a brief introduction to RubyMotion and its advantages over alternatives like Objective-C, Swift, and React Native, highlighting its ease of use and accessibility. Below are the key points covered in the session:

  • Introduction to RubyMotion: Lori Olson introduces RubyMotion as a friendly alternative for mobile native development without the complications of Xcode. She emphasizes the convenience of using favorite text editors and rapid iteration through an integrated Read-Eval-Print Loop.

  • Comparison with Other Technologies: Lori critiques Objective-C and Swift for their complexity and backward compatibility issues, and expresses a strong distaste for JavaScript-based React Native.

  • Key Features of RubyMotion: The framework allows functional and integration testing with tools like Bacon, offers libraries to ease app deployment, and supports ARKit for augmented reality development.

  • Augmented Reality Applications: The presentation shifts focus to the applications of AR in real-world scenarios, including military and commercial uses. Examples range from apps that help veterans manage anxiety to informative consumer applications.

  • Live Coding Demo: Lori provides a live coding demonstration where she builds a simple AR application, 'Hello World', utilizing RubyMotion's functionalities to create an AR scene. She demonstrates the creation of various components, including AR view controllers and visual geometry, while detailing the necessary code and configurations.

  • Complex Applications: Beyond the simple demo, she discusses more complex applications like 'Places of Interest', integrating mapping functionality, and 'Survive AR', a game developed by an intern, emphasizing community engagement through AR.

  • Encouragement for Diverse Development: Lori highlights the lack of diversity in AR app development and urges the audience to create applications that address various demographics beyond typical gaming and tech audiences.

In conclusion, Lori Olson advocates for leveraging RubyMotion to delve into AR development, encouraging developers at all levels to explore this innovative technology and share their own unique applications. The session not only sheds light on the technical aspects of RubyMotion but also encourages community involvement in building meaningful AR applications.

00:00:12.380 Okay, well it's four o'clock, so let's get started. I hope you all know that you're in the session on creating augmented reality apps with RubyMotion, because that's what we're doing this afternoon. My name is Lori Olson, my company is the WNDX Group, and I run the WNDX School.
00:00:20.310 So, how many of you actually know or have used RubyMotion before? Hands up. Okay, good. I have about five or six minutes of introductory material to let you know what RubyMotion is all about.
00:00:40.639 So, why RubyMotion? Well, heck, because it's Ruby, right? That's what Matz said this morning; that's why we're all here. Augmented reality apps mean mobile native development, so why wouldn't we use things like Objective-C, Swift, or even React Native?
00:01:06.150 Let's take these one at a time. Why not Objective-C? I've used it, and the syntax is pretty ugly and hard to read. It's the old, uncool language at this point. So why not the new cool language, Swift? Swift as a language has been evolving really fast. There's been a new version every year or two, and some of those versions break backward compatibility. I must admit the latest release did something to fix that, but in previous years you were out of luck: as new Swift versions came out, the apps you wrote against the previous version broke in the new one.
00:01:31.140 In the last five years, they've managed to raise the complexity of the Swift language to the point that it's challenging Java for complexity. That’s not a compliment. I will say one nice thing about Swift: its syntax is nicer than Objective-C.
00:02:00.750 Okay, so why not React Native? I'm sorry... no, I'm really not. I really hate JavaScript. And despite the fact that 'native' is in the name, they can't really fool us; it's not actually native. From my experience, every mobile app that I ever really hated turned out to be written using JavaScript. I don't want my apps to be hated.
00:02:43.129 So, let's do our super lightning intro to RubyMotion. Actually, RubyMotion is in the process of being rebranded to DragonRuby, which is a way cooler name. RubyMotion apps are created from the command line, so we don’t have to use Xcode. And I don't know… like Xcode is not gonna win anybody’s 'favorite editor of the year' award; it’s just not that great. With RubyMotion, you can use your own favorite editor to write your code, which makes RubyMotion really accessible and friendly.
00:03:22.730 You can iterate fast with RubyMotion because it has a REPL (Read-Eval-Print Loop), and you can actually make changes to the code from the console and see those changes in your running application. Testing is baked right into RubyMotion with a testing framework called Bacon, allowing you to do functional and integration testing right from the get-go without much trouble. We've also worked hard to address the things people find really difficult, such as deploying apps to devices and uploading them to the App Store.
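As an aside on the Bacon testing mentioned above, the spec that a freshly generated RubyMotion project ships with looks roughly like this (a minimal sketch, run with 'rake spec'; the app name string is just a placeholder):

```ruby
# spec/main_spec.rb -- stock RubyMotion/Bacon spec (sketch)
describe "Application 'AR Demo'" do
  before do
    @app = UIApplication.sharedApplication
  end

  it "has one window" do
    @app.windows.size.should == 1
  end
end
```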
00:04:11.389 We’ve created libraries to help deal with that. There’s Motion Provisioning that helps you provision and get apps onto your own device for free, and there’s Motion App Store, which isn’t free, but lets you submit your apps to the App Store. So, RubyMotion in a nutshell — it’s real native code, statically compiled, running on top of the Objective-C runtime.
00:05:00.760 Why did I choose RubyMotion? Well, when the App Store first came out, I wanted to write an app, and I had this concept in mind called 'Wimpy'. If you're interested, you can go to that URL. But when I started to write it, I had to use Objective-C, which was the only game in town. I really disliked it. I mean, I used Objective-C back before the iPhone existed, writing plug-ins for Photoshop, and it was always hard to read and ugly.
00:05:42.970 Switching from my day job as a Rails developer to write an app was brutal. If I left it for a couple of weeks, it felt like starting fresh every time, which I hated. Then RubyMotion came out, and it was so much better because I could write in Ruby and use all of the Ruby toolset, like Rake and Bundler. That made a huge difference in my experience as I finished my app. There are difficulties, though; there's a lack of good, deep examples for RubyMotion.
00:06:35.830 You often find yourself needing to go and find examples written in Objective-C or Swift to translate, which isn’t fun — really not fun. So I thought, why not teach? I’ve been teaching Rails to people from outside the Ruby community for years now, and I enjoy teaching developers from other frameworks. It eventually occurred to me that I should start teaching RubyMotion stuff.
00:07:17.020 Honestly, I love teaching, but the courses I taught over weekends, standing in front of people for two days, were exhausting. Talking makes me dry! So I thought, if I don't want to do live teaching, maybe I could do online teaching. That's when the WNDX School was born. Yes, I do work from home, and it's fun!
00:08:01.990 Now, I got through all the intro stuff. You did come here for augmented reality, right? In my school, I want to cover the topics people want to see, and augmented reality is, of course, the new cool. There are many applications for augmented reality.
00:08:29.320 For example, there are apps helping veterans manage anxiety, and companies like Panera have created applications that let customers see nutrition facts on their products. Even the local university where I live has a whole program on virtual and augmented reality. They're not just teaching the technologies, but how to apply them: to journalism students, for example, and to train police recruits to respond to dangerous situations. Augmented reality is truly the wave of the future!
00:09:18.420 If I was gonna teach in my school, I needed some good augmented reality examples. By the way, this is my sticker. If you guys want a sticker at the end of the talk, I have a bunch! Of course, for augmented reality, you have to start somewhere. Let's do a 'Hello, World'. I know nobody wants to see 'Hello, World', so we're going to do it live coding style.
00:10:00.820 Okay, here we go! Creating a new Motion app. Oh, I'm in the wrong directory already.
00:10:10.310 I run 'motion create ARDemo', which creates an actual usable application right off the bat, but I'm going to immediately start messing with it. Let's bring it up in RubyMine. We'll start with our Gemfile. I need to add a couple of gems right off the bat; first of all, I need a gem to get around some annoying nagging messages from the iOS setup about not having a splash screen for my app. One of the issues with augmented reality is that it doesn't run in the simulator, because the simulator doesn't have a camera. So right off the bat, we have to use Motion Provisioning to run our application on a device.
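As a rough sketch, the Gemfile at this point looks something like the following; motion-provisioning is the gem named in the talk, while the splash-screen gem's name isn't captured clearly in the recording, so that line is left as a placeholder:

```ruby
# Gemfile (sketch)
source 'https://rubygems.org'

gem 'rake'
gem 'motion-provisioning'  # provisions and installs the app on a real device without Xcode
# gem '...'                # plus the splash-screen gem mentioned in the talk (name not captured)
```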
00:11:11.550 Let’s get those installed... You guys, get off the Wi-Fi! Okay, backup internet on the phone because I'm not completely crazy. Oh come on! I’m going to try this one more time. You can do it! You can reach RubyGems; it's all installed already, but I will need it for Motion Provisioning later. Yay, thanks to whoever just got off the Wi-Fi! Time for a drink now.
00:12:14.710 Okay, we've got those installed. Next, we run the splash-screen gem's generator with bundle exec to generate our simple splash screen. It's telling me that I need to add a line to my Rakefile, so it's a good thing I wanted to work on the Rakefile next. We'll go down to that section. There are a few other things we need to do in our Rakefile. This is just the standard Rakefile that comes with Motion applications, and there are a whole lot of comments in here that you mostly don't need after getting started, but they can be handy.
00:13:35.589 One of the things it says, in comment number four, is that an identifier is needed to deploy to an actual device. Yes, I need an app identifier, and it needs to be somewhat unique when Apple registers it. I always format it like this, with my company name, since it only has to be unique within my company's namespace. Next, we need to specify a framework, because we're using ARKit, and we'll also be using something called SceneKit. To keep you from getting bored watching me type, I'm going to paste a pre-prepared block of code that sets up your app's code signing certificate and the provisioning profile necessary for getting your app onto a device; this is all you have to do, and the Motion Provisioning gem takes care of the rest.
00:14:51.040 I also need to pull up some more stuff because when we use AR on our phones, we need to request permissions for things like the user's location and the camera. You can't use these features without requesting access from the user.
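Putting those Rakefile pieces together, a sketch looks something like this; the identifier and usage strings are placeholders, and the MotionProvisioning calls follow the shape documented in the motion-provisioning README rather than the exact code pasted in the talk:

```ruby
# Rakefile (sketch)
$:.unshift('/Library/RubyMotion/lib')
require 'motion/project/template/ios'
require 'bundler'
Bundler.require

Motion::Project::App.setup do |app|
  app.name = 'AR Demo'
  app.identifier = 'com.example.ardemo'    # placeholder; only has to be unique within your account
  app.frameworks += ['ARKit', 'SceneKit']  # ARKit for tracking, SceneKit for the 3D geometry

  # iOS refuses camera/location access unless these prompt strings exist.
  app.info_plist['NSCameraUsageDescription'] = 'The camera feed is the background of the AR scene.'
  app.info_plist['NSLocationWhenInUseUsageDescription'] = 'Used to place points of interest around you.'

  app.development do
    # motion-provisioning fetches (or creates) the signing certificate and profile,
    # which works with a free Apple Developer account.
    app.codesign_certificate = MotionProvisioning.certificate(
      type: :development,
      platform: :ios)
    app.provisioning_profile = MotionProvisioning.profile(
      bundle_identifier: app.identifier,
      app_name: 'AR Demo',
      platform: :ios,
      type: :development)
  end
end
```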
00:15:21.030 That's it for our Rakefile! Now, let's move on to the code. This is the only code in the project. It creates a window and shows it, but I'm going to create a new file. Let's call it ARViewController. This will be our view controller class, which is just a UIViewController, and I know that I'm going to need a couple of instance variables: the scene view for our AR scene, and the scene configuration.
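Wiring the new controller into the stock app delegate (the window-creating code she mentions) is a small change; a minimal sketch, with file layout following RubyMotion's conventions, might look like this:

```ruby
# app/app_delegate.rb -- sketch: make the AR controller the root of the window
class AppDelegate
  def application(application, didFinishLaunchingWithOptions: launchOptions)
    @window = UIWindow.alloc.initWithFrame(UIScreen.mainScreen.bounds)
    @window.rootViewController = ARViewController.alloc.init  # the controller built below
    @window.makeKeyAndVisible
    true
  end
end
```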
00:16:40.290 UIViewController is an Objective-C class, and there's some different stuff around creating new instances. We need an init function; I don't really need to do much in it, but I am going to call super. Yes, I have notes; you didn't think I was going to do this from memory, did you? Next, the scene view gets created in a method we get from UIViewController called viewDidLoad, and of course we call super and, in it, create our AR scene view (an ARSCNView).
00:17:40.560 When we create these new things that are Objective-C classes, instead of calling 'new' we use 'alloc' and 'init'; it's a reminder that we're calling into an Objective-C class. We also need to set up our scene view to auto-enable default lighting, because without that we just get ugly-looking objects. This way, we get nice lighting on the objects we add to our scene.
00:18:39.179 We also have to specify the delegate; I'm not actually going to implement any delegate functions, but I'll still say that my class is the delegate. Lastly, we need to tell it what its frame is, basically what part of the screen it will occupy. We can do that with nice little arrays in RubyMotion instead of ugly Objective-C macros. I could have installed another gem to make this prettier, but I thought I'd give you an appreciation for everything you have to do when you dig down to Objective-C sometimes: to get the size of the screen, this is what I have to do, pulling out both the width and the height.
00:19:59.320 Now we’re specifying dimensions, which sets the frame for our scene view, and last but not least, we need to set 'self.view' to our newly created scene view. That’s it for our viewDidLoad function, but we still need to create some configuration for our AR session, and we’ll do that in viewWillAppear. It takes a parameter, but I don't care about it, so I’m just going to proceed.
00:21:28.280 In viewWillAppear (remembering to call super, of course), our scene configuration gets created; it's an ARWorldTrackingConfiguration. We need to tell our scene view about this configuration, and we do that by calling 'session.runWithConfiguration'. So far, that's all basically boilerplate; we've created our scene view and configuration (the controller up to this point is sketched just below), but we don't have anything interesting to render yet. So now we'll add something! Let's start with an easy example and add a box to our AR scene.
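Assembled from the steps just described, and before the box gets added, the controller sketch looks roughly like this (a non-authoritative reconstruction of the live-coded class; the frame handling matches what was described on stage, and the rest follows standard ARKit/SceneKit calls):

```ruby
# app/ar_view_controller.rb -- sketch of the live-coded controller
class ARViewController < UIViewController
  def init
    super  # UIViewController is an Objective-C class, so call through to its initializer
    self
  end

  def viewDidLoad
    super
    @scene_view = ARSCNView.alloc.init
    @scene_view.autoenablesDefaultLighting = true  # without this, objects render flat and ugly
    @scene_view.delegate = self                    # no delegate methods implemented yet
    size = UIScreen.mainScreen.bounds.size
    @scene_view.frame = [[0, 0], [size.width, size.height]]  # RubyMotion arrays instead of CGRectMake
    self.view = @scene_view
  end

  def viewWillAppear(animated)
    super
    @configuration = ARWorldTrackingConfiguration.alloc.init
    @scene_view.session.runWithConfiguration(@configuration)
  end
end
```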
00:22:42.420 First, let’s create our box. This will just be a simple geometry defined by width, height, and length. Unfortunately, this method takes another parameter, which is the chamfer radius — that’s just to round off the corners, but I don’t care about that, so I’ll just set it to zero. Now, I have a box, but that’s just geometry! I need a node to put into the scene, so let’s create a node with geometry of our box.
00:24:00.380 I have a node; it has dimensions, but now I need to specify where it’s going to be positioned, and again, I can do that nicely with arrays in RubyMotion. I want it to be in front of me, not right in my face, so we’ll adjust its position to be just a little lower. Now, I have a node, and I need to add that node to the scene, but first, I need to create a scene. Thankfully, RubyMine helps a lot; I just want the scene!
00:25:05.640 Now I can add my node to the scene. There's a default node called the root node, and I just add child nodes to it. We're almost there! Now let's add the scene to the scene view. That's it! We've completed coding up an AR example (sketched below), but you want to see this in action, right? I do have my phone here; it's a real device. I could run the app on the simulator, but that won't do anything, so I'm going to run it on my device and hope it doesn't crash.
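The geometry and scene setup from the last few minutes, as a sketch appended to the end of viewDidLoad (the dimensions and position are example values, not necessarily the ones typed on stage):

```ruby
# A 10 cm cube, half a metre in front of the camera and slightly below eye level.
box  = SCNBox.boxWithWidth(0.1, height: 0.1, length: 0.1, chamferRadius: 0)
node = SCNNode.nodeWithGeometry(box)
node.position = [0, -0.2, -0.5]    # metres, relative to the camera at session start

scene = SCNScene.scene
scene.rootNode.addChildNode(node)  # everything hangs off the default root node
@scene_view.scene = scene
```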
00:26:39.310 This is Motion Provisioning, and you do need to have an Apple Developer account, but those are free; you don't have to pay for one. Okay, you've made the connection! Now it’s compiling… it’s compiling the splash screen from the storyboard and putting it on the phone. Unfortunately, right now there’s a bug in the REPL; it can’t connect to real devices because of some changes Apple made. They should fix it within the next week or two, but for now, you need to use the simulator.
00:27:07.090 You can see that there’s a new app on the screen that wasn’t there before; it's called 'AR Demo'. Let’s bring that up; it’s going to ask for access to the camera, so we’ll say 'okay'.
00:28:01.250 There it is, guys! The live coding worked. That’s the only part that matters, right? Okay, now ‘Hello World’ is interesting, but if I want to do this in my school, I need it to be a little more interesting than that. So, I like to call on other tutorials. For instance, Ryan Winter has some really good tutorials, and he has an augmented reality iOS tutorial that is location-based. I said, 'Yes! I'm good at maps; I can do that!'
00:28:44.930 So we created an app called 'Places of Interest'. It has a lot more functionality; I’m going to quickly walk through it. This app has three controllers instead of one. The master view controller manages flipping back and forth between the two other controllers. One of them is a map controller, which is a standard map controller in iOS development that can put pins on the screen. I'm going to go really fast through this since you can look at the code later.
00:29:30.140 Our AR view controller is also not a whole lot different from the previous one, but it does set a world alignment, because this app involves moving around, so the session needs information about gravity and heading (a sketch of that one-line change follows below). We pull some data and do some math. Let's bring this app up; it's a simplified version of the app, showing places of interest.
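The session-configuration difference she mentions is small; a hedged sketch of that change, using the standard ARKit world-alignment constant, would be:

```ruby
# In viewWillAppear: align the session to gravity and compass heading so nodes
# can be positioned at real-world bearings (needed for location-based AR).
@configuration = ARWorldTrackingConfiguration.alloc.init
@configuration.worldAlignment = ARWorldAlignmentGravityAndHeading
@scene_view.session.runWithConfiguration(@configuration)
```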
00:30:09.110 You can see as I swing the phone around, it’s orienting. I can choose one of these pins — “the Black 13 Tattoo Parlor” down the street. Okay, that looks interesting; let’s select that. Now, what’s this? It’s a pointer that indicates, 'Oh, okay, the Black 13 Tattoo Parlor is over there.' That little green thing you see is only little because of how far away it is! We can use this pointer to follow it all the way to the tattoo parlor...
00:31:05.890 The green pylon is quite large in reality, and that is our point of interest application. However, I know you may not find that example particularly compelling, but I think you can make it your own. You don't have to use the Google data or API keys; you could look for some open data from your local government, whether it's civic, state, or federal. There's tons of interesting open data out there, and I'm sure you can find something interesting to do with this.
00:32:06.210 But this is not the end of the examples! I have one more. How about a game? Some of the most interesting AR applications so far have been games. One notable early example is Pokémon Go. When it was first released, it got people off their couches and into the streets; it was crazy! So, I created an app called 'Survive AR.'
00:32:51.080 In Survive AR, you are the center of the game and you have flaming demons coming at you. You target them with a little reticle, and then tap on the screen to shoot. Because it’s AR, the physics are real, so unless they’re really close, you need to be careful targeting. You can also see behind yourself using the mini-map, as AR things can come from behind!
00:33:12.730 In the demo recording, you can see some gameplay where you shoot the demons. One fun feature is that if you run out of ammo, the game spawns an ammo truck, the little blue dot you see in the video, and you have to run for it. Just remember not to play this game inside, or you might end up running into walls.
00:34:29.740 Now, here's the twist: I have to admit that I didn't actually create this game; I only built the apps I demoed in front of you earlier. My intern, a first-year computer science student named Derrick, built it. He didn't have much experience outside of web programming and Java, and he came to me asking for a project. I taught him Ruby at the beginning, and while he was learning, I gave him a list of APIs to choose from, and he picked ARKit.
00:35:39.540 Derrick spent his time reviewing all the resources on RubyMotion, and while he did run into problems and needed help, he really succeeded in building an engaging AR game over the summer. So, I managed to teach my intern Ruby and how to create a fairly engaging AR game. However, there’s a lack of diverse apps out there. Most apps are being created by corporations and young single males.
00:36:22.410 Young single guys are great, but they often don't see the problems faced by young parents, middle-aged people, seniors, people with disabilities, or those dealing with aging relatives. You really need to experience a problem to understand it well enough to build something helpful. That's why I want to help people write those diverse apps, and that's why I have the WNDX School: to help people get their apps out of their heads and into the App Store.
00:37:22.450 So do me a favor: get out there and build your own apps! And if you don’t have an idea, help somebody else build theirs. Thank you for coming! Here’s a URL that I’ll be tweeting, which will have links to all the resources, including all the source code for the apps mentioned.