00:00:24.410
Okay, before I start: both Grain and Josh mentioned that I work at Excel, which is a Gold Sponsor here, and I want to clear that up before people think I bought my way onto the stage. I've also been an organizer here for ten years, so I'm up here the proper way.
00:00:36.470
I started out as an attendee, got drawn into organizing, and then strong-armed my employer into sponsoring. So here we are. This is a look at the Crystal programming language, and I want to give some context for why I thought to look at it in the first place.
00:00:48.080
Josh mentioned that I run a non-profit in Loudoun County called Loudoun Codes, where I teach K-12 computer science. If you were here last year, you saw two of my students speak; they've been part of the event regularly for the past few years, and I have two here today. I find teaching K-12 computer science amazing. I think I missed my life's calling by not becoming a teacher; of course, I would have to move out of Loudoun County to be able to teach in Loudoun County, which is a sad story about teachers.
00:01:05.089
In teaching students, especially at the high-school level, we teach Java, because the AP exam is in Java and they need the fundamentals. But I often show my students concepts in Ruby. At first they don't even realize it's a different language, and then I run it and they accuse me of cheating. The point I want them to see is that in Ruby, finding all the combinations of three things is a one-liner, while in Java we have to write several nested loops for the same task.
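To make that concrete, here is a rough sketch of the kind of one-liner I mean, written in Crystal since that's where this talk is headed (Ruby spells the method `combination`; this is an illustration, not code from my class):

```crystal
# All three-element combinations of a list in a single call, where Java
# would need nested index loops to do the same job.
items = ["ant", "bee", "cat", "dog", "elk"]
items.each_combination(3) do |combo|
  puts combo.join(", ")
end
```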
00:01:27.740
I love Ruby, but in teaching computer science we're often talking about the efficiency of computation and algorithms, and Java beats Ruby hands down on that kind of work. I often write something in Ruby for clarity, then have to rewrite it in Java to make it fast enough to make the point, which creates a tension between my love of Ruby and the performance that algorithmic work demands.
00:02:10.729
So that's why I was invested in Ruby. Now let me tell you a bit about the MacGuffin we're going to chase today: something I wrote last year for Papers We Love. We hold those events in Washington, DC, and the lead organizer asked me to speak at the meeting last October. She gave me about six months' notice, which I thought was cool; it would give me a lot of time to research a good topic.
00:02:53.110
I did what every good nerd would do: I went to my standard-issue copy of 'Selected Papers on Discrete Mathematics.' I got to the very first paper and found a problem that has captivated me for the last year: the mutually orthogonal Latin square. You might know it as a thoughtful little card game where you take the aces and face cards from a deck and try to arrange them in a four-by-four grid so that every row and column contains each face value exactly once and each suit exactly once.
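Here is one valid arrangement of those sixteen cards, along with a quick Crystal check of the row and column properties (an illustration for the slides, not my actual program):

```crystal
# One known 4x4 arrangement: every row and every column contains each of
# A, K, Q, J exactly once and each suit (s, h, d, c) exactly once.
grid = [
  %w(As Kh Qd Jc),
  %w(Qc Jd Ah Ks),
  %w(Jh Qs Kc Ad),
  %w(Kd Ac Js Qh),
]

valid = (grid + grid.transpose).all? do |line|
  line.map { |card| card[0] }.uniq.size == 4 &&
    line.map { |card| card[1] }.uniq.size == 4
end
puts valid # => true
```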
00:03:40.210
That arrangement is one solution, but it raises interesting questions: how many solutions are there? The paper deals with orthogonal Latin squares of order ten. I can hand kids a deck of cards and they'll find a four-by-four solution in about twenty minutes; imagine doing it with ten-by-ten squares. The story involves Euler, who conjectured that squares of certain orders, ten among them, could not exist at all. It wasn't until 1963 that a computer found one of order ten. Before that, even with the best available machines, it didn't look like a solvable problem until a clever algorithm was developed.
00:04:31.729
One fascinating property is that you can swap any two rows or any two columns of a valid square and still have a valid square, so finding one solution really means finding a whole family of them. There's a children's game that is a five-by-five version of this concept, which I'll have available for people to play with at lunch. The paper is titled 'A Computer Investigation of Orthogonal Latin Squares of Order Ten,' published in 1963, and at the time it took about 40 hours of machine time to find an orthogonal square of order ten.
00:05:10.210
Searching for this paper online was a challenge. I couldn’t find it easily—only references to it in other papers. I resorted to going to the neighboring county’s library to search their system. Finally, two days later, I received an email with a PDF and instructions to pay three dollars and forty cents for the research they did. It was awesome! Supporting your local library is still very relevant.
00:06:14.389
The paper itself wasn't easy to digest; it might be the hardest, geekiest thing I have ever done for fun. It presents a naive algorithm alongside the optimized solution, and understanding the difference was a fascinating exercise in refactoring: transforming a huge combinatorial problem into a more manageable one. The naive implementation is drastically less efficient, while the optimized algorithm cuts the search time down dramatically from the original 40 hours.
00:07:18.490
This led me to wonder how quickly I could solve the problem in Ruby. My first Ruby program expressed the square in terms of numbers and letters, requiring that each letter and each number occur exactly once in every row and column. The original discovery of an order-ten square took 40 hours; my Ruby program, single-threaded on my machine, finds new solutions every two to five minutes, and if I ran eight threads on my eight-core machine, it could potentially turn up a new solution roughly every thirty seconds.
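The core constraint the program checks is simple to state; a minimal sketch of it in Crystal (illustrative names, not my actual code) looks like this:

```crystal
# A candidate square of letter+number cells is valid when no row or column
# repeats a letter or a number, and no letter+number pair appears twice.
def valid_square?(square : Array(Array(String))) : Bool
  n = square.size
  latin = (square + square.transpose).all? do |line|
    line.map { |cell| cell[0] }.uniq.size == n &&
      line.map { |cell| cell[1] }.uniq.size == n
  end
  latin && square.flatten.uniq.size == n * n
end

puts valid_square?([["A1", "B2", "C3"],
                    ["B3", "C1", "A2"],
                    ["C2", "A3", "B1"]]) # => true
```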
00:08:05.150
That realization sparked my curiosity about how Ruby's performance compares to Java's. When I started porting the code to Java, it became cumbersome quickly, mostly because of the combination and permutation operations I lean on throughout the code. I briefly explored JRuby as well, but concluded it wasn't a good fit for what I wanted to accomplish.
00:08:39.110
So I started looking into Crystal. The language is inspired by Ruby's syntax, but API compatibility with Ruby is explicitly not a goal. It's a compiled language, so let's demystify that a little. A compiler is like having every unit test your language designer could have conceived of for your program, which buys you a baseline level of correctness.
00:09:19.240
One significant consequence of compilation is the type system, which dictates how data types may interact in your code. In Java, for instance, we declare variable types so we can use features like interfaces, but that often leads to verbose code and complicated situations, particularly with generics, which can clutter everything with type ceremony. Initially I was hesitant about Crystal because of those concerns, but I found features that intrigued me, such as its nil checks and guards.
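Before we get to nil, here is what I mean about the ceremony around generics: a small Crystal sketch (the names are mine, purely for illustration) where types are inferred and a generic method is a one-line `forall` rather than a wall of angle brackets:

```crystal
# A generic method: works for an Array of any element type T.
def middle(items : Array(T)) : T forall T
  items[items.size // 2]
end

numbers = [3, 1, 4, 1, 5] # inferred as Array(Int32); no declaration needed
puts middle(numbers)      # => 4
puts middle(%w(a b c))    # => b
```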
00:10:07.639
In Crystal, if I declare a variable with its normal type, it can never hold nil, which eliminates the nil-reference errors that crash applications at runtime. There's also a concurrency model, though it isn't parallelized yet; we can come back to that later. Crystal has plenty of other valuable features, which we can discuss after I show you an example.
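First, a minimal sketch of the nil behavior I just described (illustrative lines, not code from my program):

```crystal
count : Int32 = 0
# count = nil          # would not compile: Nil can't be assigned to Int32

maybe : Int32? = nil   # Int32? is shorthand for the union Int32 | Nil
if value = maybe
  puts value + 1       # inside the guard, value is known to be an Int32
else
  puts "nothing yet"
end
```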
00:10:51.760
Here is my mutually orthogonal Latin square program in Ruby. I won't dive into too much detail: it's 180 lines of code, and I define an array of whatever size I need, which gives the flexibility to look for these squares whether they're small or large. As it runs, it prints a dot to show progress, and once a valid solution is identified it does a breadth-first pass to exhaust that particular search path.
00:11:54.580
My Ruby code uses a depth-first search to build up a valid configuration, testing each candidate as it goes. This naive algorithm does eventually find valid Latin squares, but how we structure the search dramatically influences the results. The essence of the optimization is to search for reduced forms of the squares we find and to iterate through candidates in a much tighter search space.
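The shape of that depth-first search, stripped down to a plain Latin square over a single symbol set (a sketch with made-up names, not my 180 lines):

```crystal
# Build the square row by row; keep a candidate row only if it doesn't
# repeat a symbol in any column, and recurse until n rows are placed.
def extend_square(rows : Array(Array(Int32)), n : Int32, &found : Array(Array(Int32)) ->)
  if rows.size == n
    found.call(rows)
    return
  end
  (0...n).to_a.each_permutation do |candidate|
    ok = (0...n).all? { |col| rows.none? { |r| r[col] == candidate[col] } }
    extend_square(rows + [candidate], n, &found) if ok
  end
end

extend_square([] of Array(Int32), 4) { |_square| print "." } # one dot per square found
```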
00:12:52.800
The inefficient implementation looked for orthogonal Latin squares by exhaustively searching all possible arrangements. The optimized solution works with transversals instead, which dramatically reduces the space we need to search and lets us find valid arrangements much more quickly. The point is to process only meaningful candidates rather than every permutation.
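To pin down the key term: a transversal of an order-n Latin square is a set of n cells, one in each row and one in each column, whose symbols are all different. A toy sketch of finding them (an illustration, not the paper's code):

```crystal
# Represent a transversal by the column chosen in each row; keep it when
# the symbols it picks up are all distinct.
def transversals(square : Array(Array(Int32))) : Array(Array(Int32))
  n = square.size
  results = [] of Array(Int32)
  (0...n).to_a.each_permutation do |cols|
    symbols = (0...n).map { |row| square[row][cols[row]] }
    results << cols if symbols.uniq.size == n
  end
  results
end

latin = [[0, 1, 2, 3], [1, 0, 3, 2], [2, 3, 0, 1], [3, 2, 1, 0]]
puts transversals(latin).size # => 8
```

Roughly speaking, finding an orthogonal mate then comes down to picking n of these transversals that share no cells, which is a far smaller search than trying every possible second square.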
00:14:12.600
As you can see, there is a significant amount of computation happening: I'm generating permutations to identify potential candidates, then sorting and organizing them. Restricting the search to valid candidates is what buys the efficiency; for example, when searching for transversals, we validate them as we go so we never have to revisit a path we've already examined.
00:15:06.610
Let's watch it in action. I can run my moles script and ask it for all the valid squares of order four. It turns up only a couple of solutions, which shows how constrained the problem is; order five, on the other hand, is a sizeable puzzle with many more possible arrangements.
00:16:01.000
Finding valid solutions at these sizes is interesting, but larger orders quickly become computationally intensive, and exploring that landscape drives home how much resource availability and algorithm efficiency matter. I'll fire up order ten in the background while I give the rest of the talk, and we'll look at the Crystal version next, which I expected to be the challenging part because of the type annotations it needs.
00:17:24.020
Now, let's switch gears and look at how to express some of this in Crystal. Notice what happens when I work with collections, arrays in particular. When I run this snippet in Crystal, type inference comes into play and the compiler obliges me to classify my arrays explicitly, unlike Ruby, where things are far more flexible. The compiler's errors, and its demands for explicit declarations, force me to be precise about what my code actually holds.
00:18:50.840
As you can see, working in Crystal I have to declare that an array will contain a specific type, say Int32. Ruby is much more lax: I can drop any mix of objects and values into an array with no rules at all. The safety that comes with Crystal's type declarations turns out to be tremendously helpful for debugging and for keeping shared code clear.
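The difference in a nutshell (a sketch, not my program's actual lines):

```crystal
rows = [] of Array(Int32)   # an empty literal has to name its element type
rows << [1, 2, 3]
# rows << ["a", "b", "c"]   # would not compile: expected Array(Int32)

mixed = [1, "two", 3.0]     # non-empty literals get an inferred union type
puts typeof(mixed)          # the element type is a union of Int32, String, and Float64
```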
00:19:29.950
This also extends to handling nil. In Crystal I can explicitly include nil in a type when I want it, which heads off a whole class of errors, and I'll show how switching between types works inside the compiler's machinery. You can see how it influences method calls and execution: assignments and calls have to state clearly whether nil is a possibility.
00:20:45.660
Crystal's compile-time requirements impose a structured approach to error handling. Keep in mind that I can still allow nil in my collection when that's a deliberate design decision; being explicit about those type guarantees is what leads to safer, more predictable behavior.
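For example, a sketch of opting in to nil inside a collection (illustrative names, not my code):

```crystal
slots = Array(Int32 | Nil).new(4, nil) # the same thing as Array(Int32?)
slots[2] = 7

slots.each do |slot|
  if slot
    puts slot * 2   # inside the guard, slot is narrowed to Int32
  else
    puts "empty"
  end
end
```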
00:21:22.460
Returning to my moles program for comparison, you can set the earlier Ruby code next to the Crystal code. You'll see where I've spelled out types, declared my arrays, and made the handling of return values unambiguous. Working directly from the original Latin squares code, we've kept the agile problem-solving approach of the Ruby version and layered more constraints on top of it.
00:22:55.460
Crystal's API differs from Ruby's in several ways. I kept discovering methods that aren't directly equivalent, like Ruby's include? versus Crystal's includes?. The subtleties around blocks complicate a direct port further. My Ruby implementation leaned on that flexibility, but expressing things through Crystal's typed constructs makes it clear exactly what each call returns and what has to be handled.
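A few of the renamings I kept tripping over, with the Ruby spellings in the comments (a sketch, not an exhaustive list):

```crystal
suits = %w(spades hearts diamonds clubs)
puts suits.includes?("hearts")               # Ruby: suits.include?("hearts")
puts suits.size                              # Ruby also accepts suits.length
puts suits.map(&.upcase)                     # Ruby: suits.map(&:upcase)
suits.each_combination(2) { |pair| p pair }  # Ruby: suits.combination(2) { |pair| p pair }
```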
00:24:35.600
As my understanding of Crystal has developed, I've been able to improve on the Ruby code by creating custom types. The syntax for defining them is simple but powerful, and it keeps the definitions correct. Crystal's type system helps exactly where developers tend to struggle: the unknowns in method signatures. Documenting the expected data types right in the code smooths the whole development process.
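Two of the lightweight tools for that, sketched with illustrative names rather than the ones in my code:

```crystal
alias Grid = Array(Array(Int32)) # a readable name for a nested array type

record Cell, row : Int32, col : Int32, symbol : Int32 # an immutable value type

def describe(cell : Cell) : String
  "row #{cell.row}, column #{cell.col} holds #{cell.symbol}"
end

puts describe(Cell.new(0, 3, 7))
```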
00:25:50.950
To summarize, I've ported a substantial amount of Ruby code, refining it and improving its performance while keeping the core algorithm intact. Despite the small differences between the languages, the experience showed me how much an understanding of the type system fine-tunes the way a solution is structured. If you're considering a port of your own, engage with it thoroughly and weigh your needs against what each language offers.
00:26:52.310
Let me come back to the comparative speed of the Ruby and Crystal versions. The results suggest the port has yielded substantially faster solutions. The more structured approach proved valuable: it let me get to efficient algorithms for the more complex problems quickly without sacrificing clarity. As we can see, Crystal generally offers better performance with less overhead than Ruby.
00:28:33.760
Reflecting on all this, teaching situations value both speed and clarity. Having lived in Ruby through numerous projects, picking up Crystal builds on that knowledge base while addressing the performance side. Crystal is evolving rapidly, with an ongoing bounty program encouraging contributions to refine its support for parallelism.
00:29:39.449
So, in conclusion: if you're a Ruby enthusiast, Crystal may be the next technology worth investing in, for both your programming toolkit and your programs' efficiency. With the landscape around the language growing, it's worth getting familiar with its capabilities. If you'd like to talk more about my experience or share insights about development practices, come find me at lunch.