00:00:15.720
Hey guys, my name is Haris, and I work for a company called Daily Burn, which is located right near here in the Frank Gehry building, right across from Chelsea Piers. If you're visiting town, you should definitely check it out; Frank Gehry is an amazing architect, and the building itself is a great piece of art, worth viewing from the outside. At Daily Burn I wear many hats and get involved in various projects, including hackathons and competitions, but most of my time is spent coding. I built this app with one of my colleagues: he handled the iOS part while I focused on the backend, creating a system for face detection and recognition, integrated with Facebook, in a Terminator-like interface.
00:00:34.930
In this project, my colleague was the iOS developer, and I was responsible for the backend. I thought face recognition and detection was awesome, but I wanted to do it in Ruby. I've been experimenting with MacRuby for almost two years now, creating various toy apps, and I hope to push some of them to the Mac App Store soon. I want to get people excited about writing desktop applications again, because it can be a lot of fun. The beauty of desktop apps is that you can be a bit sloppy: with 4GB of RAM and a terabyte of solid-state storage, we can afford some laziness.
00:01:09.009
MacRuby is a fantastic implementation of Ruby that uses the LLVM compiler. It was the first alternative Ruby implementation to be compatible with Ruby 1.9; Rubinius and JRuby are also working toward this, but MacRuby made significant progress early on. It was initially supported by Apple, although that isn't the case anymore. I won't go into RubyMotion too much, but it was pioneered by Matt and Eddie, who have created some fantastic tools based on what they learned from MacRuby.
00:01:47.020
What you see here will mostly translate to RubyMotion in terms of code structure. The first step is to download the MacRuby framework; I prefer the nightly builds because they are quite stable. After downloading and installing it, when you create a new Mac project in Xcode there will be a MacRuby application template available, and you'll be good to go. Now, let's start building the app. Since I have limited time, I will move quickly. I will open source the project right after this talk, so take a look at it later at your leisure.
00:02:01.800
First, we need a main Ruby file for the app, main.rb. This file typically doesn't require much modification except when you're including frameworks, as we will be doing: we are going to require the AVFoundation framework, the core Apple framework for audio and video applications. After that, we'll set up a preview and complete all our initialization tasks. When a MacRuby application launches, a delegate method called applicationDidFinishLaunching is invoked, and that is where most of your boilerplate setup code will go.
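As a rough sketch, this is what the MacRuby Xcode template's main.rb looks like once the AVFoundation require is added; the exact template varies by MacRuby version, so treat this as illustrative:

```ruby
# main.rb -- entry point generated by the MacRuby Xcode template (sketch).
framework 'Cocoa'
framework 'AVFoundation' # the extra framework we need for camera capture

# Load every other Ruby source file bundled with the app.
main = File.basename(__FILE__, File.extname(__FILE__))
dir_path = NSBundle.mainBundle.resourcePath.fileSystemRepresentation
Dir.glob(File.join(dir_path, '*.{rb,rbo}')).map { |x|
  File.basename(x, File.extname(x))
}.uniq.each do |path|
  require(path) if path != main
end

# Hand control over to Cocoa; the AppDelegate takes it from here.
NSApplicationMain(0, nil)

# app_delegate.rb -- the delegate that receives the launch callback.
class AppDelegate
  attr_accessor :window

  def applicationDidFinishLaunching(notification)
    # All of the capture-session setup below goes here.
  end
end
```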
00:02:34.380
In this setup, we will initialize a capture session and a device to grab the actual camera on your MacBook or iMac, and then define the width and height of the output. For the capture session we will use the AVCaptureSessionPreset640x480 preset. These specifications might confuse you at first when working with MacRuby, because you are still writing Ruby syntax but referring to many camel case methods and constants; once you get used to it, though, it's not as complicated as it seems.
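A minimal sketch of that setup, inside applicationDidFinishLaunching; the instance-variable names are my own choice:

```ruby
# Create the capture session and pin its output size to 640x480.
@session = AVCaptureSession.alloc.init
@session.sessionPreset = AVCaptureSessionPreset640x480

# Grab the default video device -- the built-in camera on a MacBook/iMac.
@device = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)

# Dimensions of the output we'll be working with.
@width  = 640
@height = 480
```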
00:04:01.300
Next, we'll create an input device while handling any errors appropriately, using Ruby's key-value notation for hashes, which is quite straightforward. Additionally, we need to set up an AVCaptureVideoDataOutput object to capture the raw frame data. This isn't for our preview; we need this data for processing. We'll add both the input and the output to our session, remembering that the method names on these Cocoa objects stay camel case.
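Here is a sketch of that step, assuming MacRuby's Pointer class for the by-reference error argument (note how the lowercase-initial C constants are capitalized when accessed from Ruby):

```ruby
# Wrap the camera device in an input, capturing any error in a Pointer.
error = Pointer.new(:object)
input = AVCaptureDeviceInput.deviceInputWithDevice(@device, error: error)
raise "Could not open camera: #{error[0]}" if input.nil?

# A video data output hands us raw frames for processing (not for preview).
output = AVCaptureVideoDataOutput.alloc.init
output.videoSettings = {
  KCVPixelBufferPixelFormatTypeKey => KCVPixelFormatType_32BGRA
}

@session.addInput(input)
@session.addOutput(output)
```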
00:04:46.320
Next, we're going to create the preview layer and set its dimensions. It's crucial to actually add it as a sublayer of your view; if you don't, the camera preview won't show up even though everything else is configured. Once we've set everything up correctly, we can start the session, and the capture will begin.
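A sketch of the preview wiring, assuming the window outlet from the app delegate:

```ruby
# Attach a live preview of the session to the window's content view.
preview = AVCaptureVideoPreviewLayer.layerWithSession(@session)
preview.frame = @window.contentView.bounds

# Without addSublayer the preview silently never appears.
@window.contentView.wantsLayer = true
@window.contentView.layer.addSublayer(preview)

# Everything is wired up -- start capturing.
@session.startRunning
```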
00:05:28.680
Now we will use Grand Central Dispatch (GCD), an amazing queuing library that Apple provides, to sample the video output buffers. It's incredibly simple because MacRuby maps onto the APIs very nicely: we create a dispatch queue named 'cameraQueue' and set the output's sample buffer delegate, which will be responsible for the detection work on each frame from the app's camera.
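A sketch of the GCD hookup, assuming MacRuby's Dispatch::Queue wrapper exposes the underlying native queue via dispatch_object, as in its GCD examples:

```ruby
# Route each captured frame to us on a serial GCD queue
# ('cameraQueue' is just an illustrative label).
queue = Dispatch::Queue.new('cameraQueue')
output.setSampleBufferDelegate(self, queue: queue.dispatch_object)

# ...and elsewhere in the delegate class, the callback AVFoundation
# fires for every frame the camera delivers:
def captureOutput(captureOutput, didOutputSampleBuffer: sampleBuffer, fromConnection: connection)
  # per-frame detection work goes here (see below)
end
```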
00:06:33.820
For face detection we will leverage the Core Image framework, which provides the essential pieces. From detection we receive coordinates for facial features: the left eye, the right eye, and the mouth. It's currently limited to these features, but there is potential for expansion in the future. We instantiate a CIDetector with a specified accuracy level, enabling us to detect facial features in real time. The sample buffer delegate method gets triggered each time a new frame arrives from the camera, so that's where we process the captured image for features.
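A sketch of the Core Image side; these CIDetector and CIFaceFeature calls are the standard API, with ci_image standing in for the current frame:

```ruby
# Build a face detector once, with high accuracy (slower but better).
@detector = CIDetector.detectorOfType(CIDetectorTypeFace,
                                      context: nil,
                                      options: { CIDetectorAccuracy => CIDetectorAccuracyHigh })

# Given a CIImage for the current frame, pull out the face features.
@detector.featuresInImage(ci_image).each do |face|
  left  = face.leftEyePosition  if face.hasLeftEyePosition
  right = face.rightEyePosition if face.hasRightEyePosition
  mouth = face.mouthPosition    if face.hasMouthPosition
  # face.bounds gives the rectangle around the whole face
end
```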
00:07:36.850
Next, I capture the sample image and extract the features; I'll leave the details as an exercise for you when you explore the code later. By the end of this setup, we have working face detection. Face recognition is harder: Ruby lacks efficient linear algebra libraries, which complicates implementing recognition algorithms yourself.
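The talk leaves this as an exercise, but one plausible sketch of the capture step converts the CMSampleBuffer into a CIImage inside the delegate callback, assuming the CoreMedia C function is exposed to MacRuby via BridgeSupport:

```ruby
# Convert each incoming frame into something the CIDetector can consume.
def captureOutput(captureOutput, didOutputSampleBuffer: sampleBuffer, fromConnection: connection)
  image_buffer = CMSampleBufferGetImageBuffer(sampleBuffer)
  ci_image = CIImage.imageWithCVImageBuffer(image_buffer)
  features = @detector.featuresInImage(ci_image)
  # hand the features off for drawing overlays / recognition
end
```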
00:08:55.640
Currently, the only notable Ruby library for this is outdated. Instead, I've used the Face.com API in past projects, which provides face recognition as a service: you send it an image and receive identification results in return. Here we integrate it through a Ruby gem, which works perfectly with MacRuby, as long as you install and bundle it appropriately. Finally, we configure the deployment arguments in Xcode as we prepare to run the app.
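For flavor, a raw HTTP sketch of the kind of call the gem makes; the Face.com service has since shut down, the endpoint and parameter names follow its old REST docs, and the keys and namespace here are placeholders:

```ruby
require 'net/http'
require 'json'

# Illustrative Face.com recognition call (historical API; sketch only).
def recognize_faces(image_url)
  uri = URI('http://api.face.com/faces/recognize.json')
  uri.query = URI.encode_www_form(
    'api_key'    => 'YOUR_API_KEY',     # placeholder credentials
    'api_secret' => 'YOUR_API_SECRET',
    'urls'       => image_url,          # image to recognize faces in
    'uids'       => 'all@my_namespace'  # who to match against
  )
  JSON.parse(Net::HTTP.get(uri))
end
```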
00:09:39.740
Now, I'm going to quickly demonstrate the application. As a note, I've added a hipster-filter overlay, so hopefully my app will get attention, perhaps even from Google or Facebook. As you can see, I'm overlaying whimsical features like a bowler hat, glasses, and a mustache, and the recognition call is being made through Face.com.
00:09:46.610
Unfortunately, the lighting is not ideal, and the system is struggling to locate my face; perhaps the bright stage light is overpowering the detection. But you get the idea! The last takeaway for you all: learning Cocoa and Objective-C is beneficial because you interact with those APIs directly, but there's nothing wrong with our love for Ruby. Let's not be monogamous; we can share, and be a bit polyglot in our coding journeys.