00:00:25.680
Wow, there have been lots of great talks today! I feel really honored to be on the stage with such smart people. Instead of talking about smart stuff, I made a thing, and I'm going to talk about that.
My name is Barrett Clark, and I'm from Texas. Are there any fellow Texans here? Hot damn, alright!
You can find me on Twitter or GitHub at Barrett Clark. I've been a Rubyist for about five years and have a long history as a web developer. I am a recovering Perl developer, but these days, I consider myself more of a polyglot. However, I still prefer Ruby where it makes sense. I work for Sabre Labs, which is part of Sabre. Does anybody know Sabre? Okay, fantastic!
00:00:50.960
If you've heard of a little website called Travelocity, that's our biggest consumer-facing brand. We are a global distribution system, and if you have booked air travel or a hotel in North America, chances are that inventory went through our infrastructure. I don't mess with any of that stuff. I work for Sabre Labs, where we experiment with technology, ideas, and business models that might or might not make an impact on the travel industry in the near future.
So that's what's great about my job. This project began as an Android NFC (Near Field Communication) experiment. We were thinking, how do we make the travel process better? How can we make going through the airport suck less? We thought about having tap-in points or doing all kinds of cool stuff.
00:01:37.520
We came up with a big map and a lexicon to discover all the different stages and points during the travel process. While none of that stuff is necessarily important, it gives you an idea of where we were going. We introduced ideas like the wristband, tap endpoints, and biometric sensors. What we really wanted to do was answer a few key questions:
How long is it going to take me to get through the security line? When do I need to leave my house to get to the airport so that I don't miss my flight? From a business perspective, more importantly, the gate agent needs to know if they need to hold this flight for anyone. You may not realize this, but it costs the airlines money; they pay fees if they are late pushing off from the gate.
00:02:03.120
So they aren't going to hold the flight for just anybody—they're not going to hold it for me. However, if they know that they have someone with status who is in the building and that person has made it through security, and is probably just minutes away, then there's a good chance they will hold the flight for that person. Gate agents are really good at maximizing the efficiency and effectiveness of flights. In the airline industry, status is everything.
Then iOS 7 was announced. Just as we were thinking NFC was going to be the way forward, it became clear that from Apple's perspective NFC is not a thing, and Bluetooth, specifically Bluetooth Low Energy, is the future. So we pivoted our approach.
00:02:58.079
To answer those questions, we needed to know where you are. Location in general is pretty easy, but indoor location is a different animal. Geolocation is accurate within a few feet and is really easy to store and query. PostgreSQL with the PostGIS extension is fantastic for that!
I've heard you can do it with MySQL, but why would you want to? PostGIS is easy to store and query with, and pretty easy to set up: you can install it with any of the major package managers, such as Homebrew, APT, or Yum. It's not difficult to stand up on a server either, and you can host it on Heroku with little pain.
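For a Rails app, wiring that up can be as small as the following sketch; it assumes the rgeo and activerecord-postgis-adapter gems and a hypothetical readings table, not the actual schema from this project.

```ruby
# Minimal sketch (assumed names, not the project's schema): enable PostGIS and
# create a table with a geographic point column via activerecord-postgis-adapter.
# Gemfile:      gem "rgeo"; gem "activerecord-postgis-adapter"
# database.yml: adapter: postgis
class CreateReadings < ActiveRecord::Migration
  def change
    enable_extension "postgis" # CREATE EXTENSION IF NOT EXISTS postgis

    create_table :readings do |t|
      t.st_point :lonlat, geographic: true # longitude/latitude point column
      t.timestamps
    end

    # A GiST index keeps spatial queries (containment, distance) fast.
    add_index :readings, :lonlat, using: :gist
  end
end
```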
00:03:41.440
The way your phone knows where you are is by combining GPS (from satellites), Wi-Fi (known access points), and cellular towers. Indoors, satellites are probably not visible, so it relies on those access points. There's also a third dimension that matters: the Z-axis, or elevation. We had an experience at the SFO baggage claim, on the second level, where our Uber driver called asking where we were. They thought we were on the first level, because that's where Uber directed them based on latitude and longitude, but we were on the second level.
So, it’s important to recognize that for indoor location, it’s not just about where in the world you are, but also where in the building you are. A fun fact is that Android can get more information from access points that it doesn’t know anything about; it has more flexibility than an iPhone, which is a little more restrictive. The point is, indoor location is tricky. It’s not the location of the access point that matters, it’s your proximity to it.
00:05:02.000
Okay, moving on, let's talk about specifics. I made this project, and I want to focus on the hardware. The latest version of the Bluetooth spec, 4.1, includes Bluetooth Low Energy, which is a subset of the 4.0 spec. iBeacon is just an implementation of BLE; there's no magic, it doesn't do anything particularly cool, it's just an implementation.
Here are some of the beacons that I created. The top device is a Raspberry Pi with a Bluetooth 4.0 dongle attached to it. The Raspberry Pi is an incredibly fun device to work with, and I recommend it. I believe it runs on a version of the Debian distribution, and it’s an affordable Linux computer that runs headless.
00:05:36.960
I set it up using a blog post I found from Radius Networks. It was really easy to configure, and I created an init.d script so that when the Pi boots, Bluetooth is configured and it just works. There are also three Estimote beacons that I modified. They originally broadcast on different channels, which I didn't like, but I found a way to change that, and now they work exactly how I want them to. Finally, the little red board you see is a RedBear BLE Mini, which is really tiny. You can solder a watch battery to it, but I opted not to.
00:06:02.560
It uses an Arduino-based sketch that you upload to the board, which allows you to configure it. I got to play with hardware, and as a software guy, that was a lot of fun! Now that we have our beacons, we need to interact with them. I wrote both iPhone and Android apps to accomplish this, but today I'm focusing specifically on the iPhone implementation, which was more enjoyable to work with.
Apple integrated the iBeacon functionality into Core Location. This makes sense in some ways but creates some awkwardness in other ways. To understand Core Location, does anyone here work with Xcode and Objective-C? If so, I apologize in advance!
00:06:49.920
To begin, we configure some settings, ensuring that we have the best accuracy possible—even though more accuracy can mean more expense. You get many readings, each with different accuracy levels, and your phone works harder for better readings. You can monitor movements and set a distance threshold, then initiate updates. A delegate method called didUpdateLocations is triggered when new location data is available. This allows you to know where you are or post the data to a service.
Now, let’s understand the iBeacon specs. There are three critical pieces of information: a proximity UUID, a major number, and a minor number. The UUID is a channel that the beacon broadcasts on, which you listen for. The major number represents a group of beacons, while the minor number refers to a specific beacon within that group.
00:07:30.720
For example, think of the major number as the terminal at an airport and the minor number as a specific gate. In terms of the Core Location implementation, getting location updates is straightforward, but working with beacons is a two-step process: first you monitor for the beacon region, then you range the beacons within it. You start monitoring by specifying the proximity UUID, optionally including a major or minor number to filter the results.
When the app detects beacons, it determines their state and proximity. It returns an array of detected beacons sorted by proximity, which can be either unknown, immediate (right next to you), near (within two or three meters), or far (up to 50 meters). Thus, you might have an invalid reading if the closest beacon is too far away, which requires some smart handling to interpret the data correctly.
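To make that concrete, here is a rough Ruby sketch, with made-up names, of the shape of a beacon reading and one way that smart handling might ignore unknown-proximity readings; it is not code from the talk.

```ruby
# Hypothetical sketch: a beacon reading as a backend might see it, plus a simple
# rule for picking the closest usable beacon while discarding "unknown" readings.
BeaconReading = Struct.new(:uuid, :major, :minor, :proximity)

PROXIMITY_RANK = { "immediate" => 0, "near" => 1, "far" => 2 }.freeze

def closest_beacon(readings)
  readings
    .reject { |r| r.proximity == "unknown" }             # unknown = unusable
    .min_by { |r| PROXIMITY_RANK.fetch(r.proximity, 3) } # closest first
end

uuid = "11111111-2222-3333-4444-555555555555" # placeholder proximity UUID
readings = [
  BeaconReading.new(uuid, 1, 12, "far"),    # terminal 1, gate 12
  BeaconReading.new(uuid, 1, 14, "near"),   # terminal 1, gate 14
  BeaconReading.new(uuid, 1, 15, "unknown") # discarded
]

closest_beacon(readings) # => the terminal 1 / gate 14 reading
```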
00:08:06.560
Now, that was a lot of Objective-C. Since this is a Ruby conference, let's shift gears. With our setup, we determine where you are and what you're near. I created a Rails backend to help interpret that data, splitting it into two different services. One is a reading collection server that is write-heavy and super simple, designed to store readings as quickly as possible and return a reading ID. I used MQTT to publish those readings.
The reading collection service doesn’t need to be overly complicated. It does two primary things: it creates readings when the phone posts data and has a ‘show’ method to allow command-line queries for reading details. The service creates a reading and provides a backdoor in case the pub-sub method fails, although it rarely did.
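A minimal sketch of that write path might look like this, assuming a Reading model, the ruby-mqtt gem, and an MQTT_URL environment variable; it is my illustration, not the actual service code.

```ruby
# Sketch of the reading collection service (assumed names): store the reading as
# fast as possible, return its id, and publish it for the location server.
require "mqtt"

class ReadingsController < ApplicationController
  def create
    reading = Reading.create!(reading_params)

    # Publish the new reading's id so a subscriber can pick it up and interpret
    # it. (Connecting per request keeps the sketch short; a real service would
    # reuse a connection.)
    MQTT::Client.connect(ENV["MQTT_URL"]) do |client|
      client.publish("readings", reading.id.to_s)
    end

    render json: { id: reading.id }, status: :created
  end

  # The backdoor: lets a client ask about a reading if the pub/sub path fails.
  def show
    render json: Reading.find(params[:id])
  end

  private

  def reading_params
    params.require(:reading).permit(:uuid, :major, :minor, :proximity,
                                    :latitude, :longitude)
  end
end
```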
00:09:29.840
I also built a location server to interpret the readings. Initially, both services were combined, but I separated them to allow for the different scaling needs since not every reading needs immediate interpretation. The location server uses PostGIS and subscribes to the reading topic, which allows it to tell the native app what each reading means. So, if you’re at a gate, the server might return data indicating that you’re in a terminal at a specific airport. This info can be very helpful for the application to determine where you are and what services to provide.
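The subscriber side could be as small as the sketch below; it again assumes the ruby-mqtt gem, and LocationLookup is a hypothetical stand-in for the PostGIS queries described next.

```ruby
# Sketch of the location server's subscriber loop (assumed names, not the talk's
# code): wait on the readings topic, load each reading, and record what it means.
require "mqtt"

MQTT::Client.connect(ENV["MQTT_URL"]) do |client|
  # `get` subscribes to the topic and yields each message as it arrives.
  client.get("readings") do |_topic, message|
    reading = Reading.find(message)
    place   = LocationLookup.interpret(reading) # hypothetical PostGIS-backed lookup
    reading.update!(airport:  place.airport,
                    terminal: place.terminal,
                    gate:     place.gate)
  end
end
```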
00:10:19.760
Here are some examples of how PostGIS is used; if you're not familiar, it may look strange at first. I use the RGeo helpers and the activerecord-postgis-adapter, which sits on top of the RGeo gem. I'm quite comfortable with PostGIS and have used it for several years, often writing extensive SQL queries that perform complex tasks. However, I'm starting to explore some of the nice helpers provided by RGeo.
In the realm of geographical information systems (GIS), certain functions allow you to do checks like whether this point is within a geofence or whether two geofences intersect. By using these queries, the server can determine where each reading occurs based on the provided coordinates.
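Here is an illustrative sketch of what those checks can look like, with made-up table and column names (a geofences table with a geographic polygon column called fence); it is not the project's actual query code.

```ruby
# Illustrative only: a geofences table with a geographic polygon column :fence.
class Geofence < ActiveRecord::Base
  # Which fences contain this point? ST_Covers does the point-in-polygon test on
  # the database side; ST_MakePoint takes longitude first, then latitude.
  def self.containing(longitude, latitude)
    where(
      "ST_Covers(fence, ST_SetSRID(ST_MakePoint(?, ?), 4326)::geography)",
      longitude, latitude
    )
  end

  # Does this fence overlap another one at all?
  def intersects?(other)
    self.class
        .where(id: other.id)
        .where("ST_Intersects(fence, ST_GeogFromText(?))", fence.as_text)
        .exists?
  end
end

# RGeo can also build geometries on the Ruby side, e.g. a point for a reading:
#   factory = RGeo::Geographic.spherical_factory(srid: 4326)
#   point   = factory.point(longitude, latitude)
#
# Usage: Geofence.containing(-97.0403, 32.8998) would return the fence(s) that a
# reading's coordinates fall inside (a gate, a terminal, an airport).
```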
00:10:56.320
This is the initializer for MQTT, which is very straightforward. It simply listens for information on the specified topic without much hassle. At this point, we have the hardware, the phones, and the Rails app. Now, where does the airport fit into all of this? With everything we've established, we can provide valuable information during a person's journey through the airport, especially once they get past security.
00:11:27.040
For instance, we could inform a passenger that their gate is to the right, 200 feet away, and there are two bathrooms between them and their gate. Here's the Android version of the app. When developing for Android, you often encounter layouts that don't exactly meet your needs or work as intended.
More importantly, remember that as a business, we’re very interested in generating revenue. Gate agents can monitor where passengers are and confirm if boarding has completed so they can determine if the flight is good to go. They can also assess status and see if there's an issue with a particular flight.
00:12:05.920
We added a stage at the gate counter to allow agents to know who is approaching, for personalized service. Status is incredibly important, and being able to assist passengers efficiently is part of improving the travel experience. This prototype doesn't have to be limited to airports, either; it showcases the potential of location awareness, both where in the world you are and where inside a building you are, with the Rails backend processing all that information to work out what it means.
Most importantly, it can help answer critical questions regarding a passenger's journey.
00:13:02.880
I'm Barrett Clark, and I work at Sabre Labs.