Why Recursion Matters

by James Coglan

In this talk, James Coglan presents the significance of recursion in programming, shedding light on its historical context and its prevalent use in modern programming practices. He emphasizes that while recursion was once controversial in programming language design, it has become a fundamental tool that programmers utilize to simplify complex structures and processes.

Key points discussed include:
- Historical Context: Recursion was a contentious topic in early programming, but it is now commonly accepted and used. Coglan highlights how intuitive recursive tasks, like deleting directories or web crawling, are often taken for granted.
- Recursive Structures: The speaker differentiates between recursive functions and recursive structures, noting that they are omnipresent in tasks like generating HTML or managing file organization.
- Correctness via Recursion: Coglan explains how recursion helps prove program correctness through mathematical induction. He discusses how recursive definitions in functional programming languages, such as Haskell, enable more straightforward proofs and understanding of data structures like lists.
- Performance Optimizations: He uses the compilation of CoffeeScript files into JavaScript as an example of how recursive structure can lead to performance optimizations. By describing the dependencies between files in a Makefile, programmers can avoid redundant work, leading to much faster compilation times.
- Time Travel in Programming: The concept of running programs backwards is introduced, with an explanation of how certain languages manipulate structures to achieve reversible computations. Coglan uses microKanren as a case study to illustrate how expressing relationships rather than mechanics allows for more natural compositions in programming.

In conclusion, Coglan advocates for a shift towards understanding recursion not just as a programming technique but as a method to express structures and relationships clearly, arguing that this approach leads to better solutions in problem-solving. He encourages viewers to consider how they design their applications and embrace a structural approach to programming.

The talk demonstrates that recursion is not just a tool, but a powerful concept that intersects correctness, efficiency, and expressiveness in programming.

00:00:14.910 Thank you all very much for coming. This is actually the first conference I ever spoke at, back in 2010, and it's really nice to come back here and talk about recursion today. By way of a brief introduction, I'm James. In terms of Ruby, I created a project called Faye, which includes a bunch of WebSocket functionality and is part of the stack that makes up Action Cable. However, I'm not really going to focus on Ruby so much; instead, I'll discuss other languages and several ideas connected by the concept of recursion and why it is important.
00:00:37.180 At one time, recursion was a highly contentious idea in programming. There was a lot of debate in early programming language design about whether it should be included, but now we take it for granted, and we're quite comfortable with it. For example, if you want to recursively delete a directory, you start with a path: if it's an ordinary file, you just delete it. If it's a directory, you must delete everything inside it first, as you can't delete a directory that still has contents. You look up the children of the directory, delete each of those recursively, and then remove the directory itself.
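As a rough Ruby sketch of that recursive delete (illustrative code, not taken from the talk):

```ruby
# Delete a path: plain files are removed directly; directories are
# emptied recursively first, because a non-empty directory can't be removed.
def delete_recursively(path)
  if File.directory?(path)
    Dir.children(path).each do |child|
      delete_recursively(File.join(path, child))
    end
    Dir.rmdir(path)
  else
    File.delete(path)
  end
end
```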
00:01:06.560 Likewise, if you want to spider a website or even the whole internet, you start with a URL. You access that URL and retrieve the corresponding page, then find all the links within that page. For each of those links, you repeat the process recursively. By doing this, you can end up reading the whole internet if you run the program correctly. We are accustomed to writing recursive functions but often overlook recursive structures.
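A toy Ruby crawler follows the same shape (a hedged sketch; a real spider would need robots.txt handling, rate limiting, and a proper HTML parser rather than a regex):

```ruby
require "net/http"
require "uri"

# Fetch a page, pull out absolute links with a crude regex, and recurse
# into each URL we haven't seen before.
def crawl(url, seen = {})
  return if seen[url]
  seen[url] = true

  body  = Net::HTTP.get(URI(url))
  links = body.scan(/href="(https?:\/\/[^"]+)"/).flatten

  links.each { |link| crawl(link, seen) }
rescue StandardError
  # Skip pages that fail to load.
end
```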
00:01:22.390 We create recursive structures all the time, such as when generating HTML or organizing files on disk. However, we don't necessarily think about these structures in terms of their power and meaning. Let's focus more on these structures and what they can achieve. For example, in JavaScript, inheritance works through a special field called '__proto__'. When you create an object, its prototype points to another object, creating a chain between them.
00:01:56.490 When you look up a property or invoke a method on an object, JavaScript checks whether the property exists in that object. If it doesn’t, the interpreter looks up the prototype chain to find it. This recursive chain of prototypes helps implement inheritance, and the structure itself allows you to understand how it functions. I would like to explore recursion's importance in three main areas: correctness, performance, and meaning.
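To make the recursive shape of that lookup concrete, here is a small Ruby model of a prototype chain (an illustration of the idea, not how JavaScript is actually implemented):

```ruby
# Each "object" is a hash of properties plus an optional :__proto__ link
# to another object. Lookup checks the object itself, then recurses up
# the chain until it finds the key or runs out of prototypes.
def lookup(object, key)
  return nil if object.nil?
  return object[key] if object.key?(key)

  lookup(object[:__proto__], key)
end

animal = { speak: "generic noise" }
dog    = { speak: "woof", __proto__: animal }
puppy  = { __proto__: dog }

lookup(puppy, :speak)  # => "woof", found one link up the chain
```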
00:02:39.670 Let’s start with correctness, specifically, the ability to prove that programs behave as expected. In Ruby, we often use unit testing to confirm that our programs are correct, but unit testing alone is not proof. You can run many unit tests for a program and still not discover bugs. Proof means knowing that something is correct for all possible inputs, and recursion provides a method for proving this.
00:03:02.890 The principle of mathematical induction follows a recursive argument: you prove the property for a base case, then assume it holds for some 'n' and use that assumption to prove it for 'n + 1.' For example, if you want to find the sum of the numbers from 1 to 'n', you observe that the sum follows the pattern of 'n * (n + 1) / 2.' While you may confirm this with numerous examples, it's the recursive argument that leads to a proof.
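Written out, the standard inductive argument for that formula looks like this (a textbook sketch, not the talk's exact wording):

```latex
\text{Claim:}\quad \sum_{i=1}^{n} i = \frac{n(n+1)}{2}

\text{Base case } (n = 1):\quad \sum_{i=1}^{1} i = 1 = \frac{1 \cdot 2}{2}

\text{Inductive step: assume the claim holds for } n.\ \text{Then}\quad
\sum_{i=1}^{n+1} i = \frac{n(n+1)}{2} + (n+1) = \frac{(n+1)(n+2)}{2}
```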
00:03:31.360 For arrays in Ruby, the definition is relatively straightforward: they are flat lists of values. In functional programming languages, like Haskell, lists are defined recursively: a list is either empty or consists of a value paired with another list. For example, the list [1, 2, 3, 4] can be represented as 1 paired with (2 paired with (3 paired with (4 paired with the empty list))). This representation forms a recursive structure: a chain of nested pairs, rather like a lopsided binary tree.
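In Ruby terms, that recursive definition might be sketched as a toy cons cell (illustrative code; this is not how Ruby arrays actually work):

```ruby
# A list is either nil (the empty list) or a value paired with another list.
Cons = Struct.new(:head, :tail)

# [1, 2, 3, 4] as nested pairs ending in the empty list:
list = Cons.new(1, Cons.new(2, Cons.new(3, Cons.new(4, nil))))
```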
00:04:14.460 In imperative languages like Ruby, we typically define processes for calculating the length of a list or mapping over it. For length, we might start with zero and increment it for each element. To map over a list, we create a new array and compute the result of applying a function to each item, preserving the order. By contrast, in functional languages, definitions express relationships: the length of an empty list is zero, while the length of any other list is one plus the length of its remainder.
00:04:49.439 Mapping over an empty list produces an empty list, and mapping a function over any other list involves applying that function to the first element and then mapping it over the remainder. In Haskell, programming is less about procedural instructions and more about stating that certain relations exist; you express the relationships and let the language handle computation. For example, take the functor composition law: if you map a function G over a list and then map another function F over the result, it is equivalent to mapping the composition of F and G over the original list.
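Using the Cons struct from the sketch above, those relational definitions, and the law they obey, can be written directly in Ruby (again purely illustrative):

```ruby
# The length of the empty list is zero; the length of any other list
# is one plus the length of its remainder.
def length(list)
  list.nil? ? 0 : 1 + length(list.tail)
end

# Mapping over the empty list gives the empty list; otherwise apply the
# function to the head and map it over the tail.
def map(list, &fn)
  list.nil? ? nil : Cons.new(fn.call(list.head), map(list.tail, &fn))
end

list = Cons.new(1, Cons.new(2, Cons.new(3, nil)))
f = ->(n) { n + 1 }
g = ->(n) { n * 2 }

# Functor composition law: mapping g and then f is the same as mapping
# their composition in a single pass.
map(map(list, &g), &f) == map(list) { |n| f.call(g.call(n)) }  # => true
```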
00:06:38.869 To prove this, you argue by induction over the list: assume the law holds for some list, then consider that list with one extra element on the front. Using the definition of map and the assumption about the smaller list, both sides reduce to the same expression, so the law holds for the larger list too. This works even for structures like trees and graphs. Such definitions let you express operations concisely through recursion, whereas imperative languages require explicit sequences of steps and intermediate state.
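A sketch of that induction step, writing x : xs for a value x paired with a list xs and using the recursive definition of map (the empty-list case is immediate):

```latex
\begin{aligned}
\mathrm{map}\,f\,(\mathrm{map}\,g\,(x : xs))
  &= \mathrm{map}\,f\,(g\,x : \mathrm{map}\,g\,xs) \\
  &= f\,(g\,x) : \mathrm{map}\,f\,(\mathrm{map}\,g\,xs) \\
  &= (f \circ g)\,x : \mathrm{map}\,(f \circ g)\,xs \qquad \text{(induction hypothesis)} \\
  &= \mathrm{map}\,(f \circ g)\,(x : xs)
\end{aligned}
```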
00:07:35.830 Other list operations, such as filtering or appending, are defined the same way in Haskell: the relationships are stated in terms of the structure of the underlying type. Filtering a list, for example, yields another list that keeps or drops each element depending on whether it satisfies the condition. The Haskell approach also inherently supports lazy evaluation, which allows computations to be deferred until their results are needed.
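Filter follows the same recursive pattern (again using the Cons struct from the earlier sketch; the laziness mentioned above is a Haskell feature this eager Ruby version does not capture):

```ruby
# Filtering the empty list gives the empty list; otherwise keep the head
# only if it satisfies the predicate, and filter the tail either way.
def filter(list, &pred)
  return nil if list.nil?

  rest = filter(list.tail, &pred)
  pred.call(list.head) ? Cons.new(list.head, rest) : rest
end
```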
00:08:21.160 Now, let’s turn to performance. In performance-driven programming, we often want things to complete quickly. Optimizing performance manually can be quite difficult; however, recursion can be an effective way to express solutions that allow compilers to leverage optimizations. Let's discuss a practical example regarding compiling CoffeeScript files into JavaScript.
00:08:54.950 Imagine you are compiling multiple CoffeeScript files into JavaScript to serve on a website. One approach is to utilize a Bash script. In this scenario, you would set up a directory to store compiled files and iterate through the source files to run the compiler, concatenating the generated files into one artifact.
00:09:37.110 Alternatively, a Makefile illustrates a more optimized approach. The Makefile describes the source files and their relationship to the compiled JavaScript outputs. When you run Make, it works out which outputs are missing or out of date relative to their sources and performs only the necessary tasks. If you modify just one source file and rerun Make, it recompiles only that specific file, greatly reducing execution time.
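The transcript doesn't include the Makefile itself, but the same dependency structure can be sketched with Rake's file tasks, Ruby's make-like tool (a rough analogue with made-up file names; the CoffeeScript compiler flags may vary, and the build directory is assumed to exist):

```ruby
# Rakefile — a sketch of the dependency idea, not the talk's actual build.
SOURCES  = Rake::FileList["src/*.coffee"]
COMPILED = SOURCES.pathmap("build/%n.js")

# Each compiled file depends on its source and is rebuilt only when the
# source is newer than the existing output.
SOURCES.each do |src|
  file src.pathmap("build/%n.js") => src do |t|
    sh "coffee --compile --print #{src} > #{t.name}"
  end
end

# The bundled artifact depends on every compiled file.
file "build/app.js" => COMPILED do
  sh "cat #{COMPILED.join(' ')} > build/app.js"
end

task default: "build/app.js"
```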
00:10:37.480 The benefit of using the Makefile lies in its ability to describe project structures and dependencies effectively. This allows Make to determine the necessary compilation tasks efficiently, reducing redundant processing. As a result, relative changes can yield substantial improvements in performance, while still ensuring correct output. Recursive structures are prevalent in many programming paradigms, yet not all environments exploit this potential as fully as others.
00:11:17.520 The recursive nature of many programming languages allows for optimizations and operations that only become available once you express your program in terms of structure and dependencies rather than explicit steps. Another example of this can be seen in functional programming languages like Haskell, where data structures are defined recursively, which naturally supports lazy evaluation.
00:12:14.660 The discussion now leads us to the intriguing topic of time travel in programming, the notion that you can run programs backwards. In general, in Ruby, functions take inputs and yield outputs, and you cannot simply provide an output to ascertain what inputs might have led to it. However, certain languages allow for this type of reverse engineering based on the structures involved.
00:13:14.320 For example, consider a language called microKanren. In this language, variables and pairs are defined in simple terms: a variable has a name, and a pair consists of left and right fields. These form the base structure from which other data structures are built. The state that records variable bindings is immutable: whenever a change is required, you build a new state rather than modifying the existing one.
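A minimal Ruby sketch of those building blocks (illustrative names and representation; this is not Coglan's actual code):

```ruby
# A logic variable is just a name; a pair has a left and a right field.
Variable = Struct.new(:name)
Pair     = Struct.new(:left, :right)

# A state is a substitution: a map from variables to the terms they are
# bound to. It is never mutated — extending it produces a fresh hash.
EMPTY_STATE = {}.freeze

x = Variable.new(:x)
next_state = EMPTY_STATE.merge(x => 42)  # EMPTY_STATE itself is unchanged
```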
00:14:09.630 In microKanren, the two core operations are walking and unification. Walking resolves a variable to its value by following the chain of bindings in the current state, while unification attempts to return a new state in which two terms are equal, working through the structure of pairs. By implementing goals as functions that express conditions over states, you can compose these operations in various ways.
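Building on the Variable and Pair structs sketched above, walking and unification might look roughly like this:

```ruby
# Resolve a variable by following its chain of bindings in the state;
# anything that is not a bound variable is returned as-is.
def walk(term, state)
  term.is_a?(Variable) && state.key?(term) ? walk(state[term], state) : term
end

# Try to return a new state in which the two terms are equal.
# Returns nil if no such state exists.
def unify(a, b, state)
  a = walk(a, state)
  b = walk(b, state)

  if a == b
    state
  elsif a.is_a?(Variable)
    state.merge(a => b)
  elsif b.is_a?(Variable)
    state.merge(b => a)
  elsif a.is_a?(Pair) && b.is_a?(Pair)
    left = unify(a.left, b.left, state)
    left && unify(a.right, b.right, left)
  end
end
```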
00:15:05.620 By composing goals, you can form general-purpose computations, akin to a Turing machine. Because you assert relationships and definitions rather than specifying imperative instructions, the resulting computations appear to 'know' how to operate even though no such knowledge was written down: the answers follow naturally from the expressed relationships and constraints.
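A goal can then be sketched as a function from a state to a list of states in which the goal holds; composing goals composes those relationships (still building on the helpers above):

```ruby
# eq succeeds in any state where the two terms can be unified.
def eq(a, b)
  ->(state) { (s = unify(a, b, state)) ? [s] : [] }
end

# conj: both goals must hold; feed each state from the first into the second.
def conj(g1, g2)
  ->(state) { g1.call(state).flat_map { |s| g2.call(s) } }
end

# disj: either goal may hold; collect the states produced by both.
def disj(g1, g2)
  ->(state) { g1.call(state) + g2.call(state) }
end

x = Variable.new(:x)
y = Variable.new(:y)

goal = conj(eq(x, Pair.new(1, y)), eq(y, 2))
goal.call(EMPTY_STATE)
# => one state, in which x is bound to the pair (1, y) and y is bound to 2
```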
00:16:01.440 In summary, recursion and its related structures present significant opportunities for programming. The big picture involves understanding the different methods of utilizing computers to solve problems. One approach involves explicit problem-solving through prescribed instructions, while the other favors expressing relationships and leveraging these structures to derive solutions automatically.
00:16:51.630 The latter represents a movement towards this second style of programming, where the emphasis lies in shaping the structure of a problem rather than crafting the mechanics of a solution. I encourage you to reflect on how you represent problems and find opportunities for your applications to adopt a more structural approach to programming. Thank you very much for listening.