Algorithms

Summarized using AI

It's More Fun to Compute

Julian Cheal • September 08, 2016 • Kyoto, Japan

In the presentation 'It’s More Fun to Compute' by Julian Cheal at RubyKaigi 2016, the speaker explores the fascinating world of analog and digital synthesis, connecting coding with music creation. Cheal, a Ruby developer, discusses various synthesizer components and functionalities, emphasizing how Ruby can be used to generate music.

Key Points:

  • Introduction to Synthesizers: Cheal introduces different types of synthesizers, including the Arturia MiniBrute and Korg Volca series, explaining that synthesizers use oscillators to create sound waves such as square waves, sine waves, triangle waves, and sawtooth waves.
  • Understanding Envelopes: The speaker explains envelopes, detailing terms like attack, decay, sustain, and release, which define how sound evolves over time after a key is pressed.
  • Digital Synthesizers and Sonic Pi: Cheal highlights Sonic Pi, a Ruby application designed for educational purposes, allowing users to compose music through simple Ruby code. He shares his initial struggles with the tool and how practical experience helped him grasp the concepts better.
  • MIDI Communication: He introduces MIDI, explaining how it revolutionized music production by standardizing communication between instruments, allowing for easier integration of hardware with software.
  • Creative Coding with Ruby: Cheal demonstrates how Ruby can control MIDI devices and generate musical patterns, including a unique approach of translating test results into melodies, making coding more engaging.
  • Sound Experiments with Algorithms: The speaker shares fun experiments comparing the audio outputs of sorting algorithms like bubble sort and quicksort, exploring how they differ in sound.
  • Live Coding Demonstrations: The presentation features live coding sessions where Cheal interacts with synthesizers through Sonic Pi, showcasing the seamless integration between coding and music.
  • Future of Sonic Pi: Concluding thoughts focus on the potential of Sonic Pi to evolve further, including direct MIDI support and open-source contributions that enhance its capabilities. Cheal also highlights a device that facilitates spontaneous music composition.

Conclusions and Takeaways:

  • The intersection of programming and music is a rich field for exploration and creativity.
  • Tools like Sonic Pi democratize music production, making it accessible for learners and creators alike.
  • Engaging with sound through coding can enhance understanding and enjoyment of both disciplines, fostering innovative interactions between technology and art.

It's More Fun to Compute
Julian Cheal • September 08, 2016 • Kyoto, Japan

http://rubykaigi.org/2016/presentations/juliancheal.html

Come with us now on a journey through time and space. As we explore the world of analog/digital synthesis. From computer generated music to physical synthesisers and everything in between.
So you want to write music with code, but don’t know the difference between an LFO, ADSR, LMFAO, etc. Or a Sine wave, Saw wave, Google wave. We’ll explore what these mean, and how Ruby can be used to make awesome sounds. Ever wondered what Fizz Buzz sounds like, or which sounds better bubble sort or quick sort? So hey Ruby, let’s make music!

Julian Cheal, @juliancheal
A British Ruby/Rails developer, with a penchant for tweed, fine coffee, and homebrewing. When not deploying enterprise clouds, I help organise fun events around the world that teach people to program flying robots. I also occasionally speak at international conferences on the intersection of programming and robotics.

RubyKaigi 2016

00:00:00 Hi everybody, I'm Julian, and you might not recognize me because I'm not wearing tweed today. It's been far too warm here, but you can normally find me wearing tweed.
00:00:05 I'm a Ruby developer from Bath in England. It's a beautiful city, and no, it's not London. It's called Bath because we have many baths; we do sometimes have showers as well.
00:00:18 I work at a small open-source company called Red Hat on a project called ManageIQ. You can find this on GitHub. We’re always looking for contributors and hiring, so if you want to work at ManageIQ and Red Hat, come speak to me afterwards, and we can see what we can do.
00:00:42 Thank you to everyone who came to Ruby karaoke last night. I woke up this morning after getting in very late, and I was so tired. I just lay there with my alarm going off: 'Five more minutes, five more minutes.'
00:01:01 The name of this talk is 'It’s More Fun to Compute.' It comes from a track from an album called 'Computer World' by a German band called Kraftwerk, which is well-known for producing lots of electronic music using synthesizers.
00:01:24 Today, we're going to talk about synthesizers, but not the kind from Fallout 4. We’ll discuss electronic instruments. I have a few of my instruments here with me. From home, I have my Arturia MiniBrute, my Korg Volca Keys, which is a polyphonic synth, and my Korg Volca Beats, which is an analog drum machine.
00:01:50 I also have my littleBits, which teamed up with Korg to make the Korg littleBits synth. The first thing you need to know when you want to learn about synthesizers is that they’re basically created using oscillators.
00:02:21 Oscillators are small pieces of electronics that generate a repeating pattern. Those patterns are called waves, and there are a variety of different waves that synthesizers can generate.
00:02:43 We might all be familiar with the square wave, which is a simple on-off signal, commonly used in computing. You may also know the sine wave, which has a more fluid shape, but we don't tend to use it in everyday computing because we can only have zeros or ones.
00:03:01 Another wave that you might not be aware of is the triangle wave, which looks like triangles. We also have the sawtooth wave, which can sound very different depending on the direction it's played in.
00:03:20 Of course, we also have noise, which is just that—a mess of sounds, but it can be beautiful. Let’s have a quick demo to hear what those waves actually sound like.
00:03:48 To show you what the waves look like, I have a demo written in Processing. I didn’t write this; someone else made it. It visualizes sound, currently capturing the sound of my voice, and hopefully showing the sine wave from my synthesizer.
00:04:20 Here’s my synthesizer. I can generate different wave shapes using the oscillator. This is what a sawtooth would sound like, and you may notice that it doesn't show the visuals in an ideal way. This is using Ruby to perform a Fourier transformation to visualize the audio.
00:04:58 And now we hear a square wave, which obviously doesn't look very square right now; if I took more samples, the visualization would be better, but as you can hear the waves sound very different from one another. A triangle wave is much quieter and more subtle. Noise, which you might have heard when tuning into the wrong television channel, is a little echo of the Big Bang; it's essentially every frequency at once, which can be a beautiful sound.
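For reference, each of those shapes is just a repeating function of the oscillator's phase. A minimal Ruby sketch (mine, not code from the talk) that samples one cycle of each wave looks roughly like this:

    # Rough sketch: one cycle of each basic oscillator shape, sampled at
    # 64 points, with amplitude in the range -1.0..1.0.
    SAMPLES = 64

    waves = {
      sine:     ->(p) { Math.sin(2 * Math::PI * p) },
      square:   ->(p) { p < 0.5 ? 1.0 : -1.0 },
      sawtooth: ->(p) { 2.0 * p - 1.0 },
      triangle: ->(p) { 4.0 * (p - 0.5).abs - 1.0 },
      noise:    ->(_) { rand(-1.0..1.0) }
    }

    waves.each do |name, wave|
      samples = (0...SAMPLES).map { |i| wave.call(i.to_f / SAMPLES) }
      puts format("%-8s min=%+.2f max=%+.2f", name, samples.min, samples.max)
    end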
00:05:53 Next, let’s learn about envelopes in synthesizers. Envelopes change the structure of sound with terms like attack, decay, sustain, and release. The attack is the amount of time it takes for the sound to reach its maximum level after pressing a key.
00:06:15 The decay is the time it takes to drop from that maximum to the sustain level, which is the volume the note holds for as long as you keep the key pressed. Once you take your finger off, the release phase determines how long the sound takes to fade away.
00:06:56 Let’s test this with the sawtooth wave. When I take my finger off the note, it stops. If I increase the attack and decay, it takes longer for the sound to start. We can also adjust the release, making the sound continue after I let go. By manipulating these parameters, we can create numerous sounds, using noise to make it sound like cymbals instead.
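Sonic Pi, which comes up later in the talk, exposes these same envelope stages as options on play. A hedged sketch of the idea (not the talk's code), assuming Sonic Pi's standard attack/decay/sustain/release options:

    # Sonic Pi sketch of the same ADSR idea (times are in beats).
    use_synth :saw

    play 60,
      attack: 0.5,         # time to climb to full volume
      decay: 0.3,          # time to fall back to the sustain level
      sustain_level: 0.6,  # level held while the note sustains
      sustain: 1.0,        # how long it is held at that level
      release: 2.0         # fade-out time once the note ends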
00:08:00 The essence of synthesizers is that by changing waveforms and envelopes, we can replicate pretty much any kind of instrument. This fascinated composers in the 60s, as it let them produce orchestral music without hiring an entire orchestra.
00:09:00 You might now be thinking, 'This is super cool, but what does this have to do with Ruby?' Not everyone can carry around large analog synthesizers, and they need maintenance. The great thing is that we now have digital synthesizers like Sonic Pi, which is a Ruby application built on top of Overtone, a digital synthesizer.
00:10:03 Sonic Pi was created in the UK by a gentleman named Sam Aaron. The name is a nod to the Raspberry Pi: it was built to teach kids programming, and it runs happily on that little computer. In Sonic Pi you write simple Ruby code to generate music.
00:10:59 Initially, when I coded in Sonic Pi, I was intimidated. I didn’t know what a saw was or what the 'attack' and 'decay' meant. But after studying and building actual instruments where I could physically manipulate buttons and see how they affect sound, the code started to make sense.
00:11:50 Let's have a look at an example in Sonic Pi. If I borrow this, we can see how we can generate notes. A simple line of Ruby plays MIDI note 60, which is middle C.
00:12:29 As expected, this code produces a sound akin to what we did with the knobs on the synthesizer. With a few simple patterns, we can create amazing pieces of music.
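The snippet in the demo is along these lines (reconstructed from the description, so only an approximation):

    # In Sonic Pi, notes are MIDI numbers: 60 is middle C.
    play 60
    sleep 0.5
    play 64                  # E
    sleep 0.5
    play 67                  # G
    sleep 0.5
    play chord(:c4, :major)  # the same notes together as a chord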
00:12:54 You might be excited about using Ruby to make music in Sonic Pi. However, wouldn't it be even cooler if your instruments could communicate with the computer? You can indeed accomplish this through a protocol called MIDI, created in the 80s.
00:13:42 Before MIDI, every manufacturer had different ways for their instruments to communicate and they weren't interoperable. With MIDI, you can connect your instruments using a simple cable, allowing them to communicate effectively.
00:14:32 MIDI is a messaging protocol that indicates whether a note should be turned on or off, along with specifying a channel for up to 16 instruments. Note velocities indicate how hard you press the note, which impacts the sound.
00:15:19 You might have previously associated MIDI with those awful ringtone sounds from old mobile phones, but that’s a misconception—MIDI simply transmits messages. The sound quality depends on the instruments used.
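Under the hood a MIDI note message is only three bytes, which is easy to sketch in plain Ruby (my illustration, not from the talk):

    # Status byte: 0x90 = note on, 0x80 = note off; the low nibble carries
    # the channel (0-15, i.e. 16 channels). Then note number and velocity.
    def note_on(note, velocity: 100, channel: 0)
      [0x90 | channel, note, velocity]
    end

    def note_off(note, channel: 0)
      [0x80 | channel, note, 0]
    end

    p note_on(60)   # => [144, 60, 100]  middle C, pressed fairly hard
    p note_off(60)  # => [128, 60, 0]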
00:16:05 I will demonstrate how we can use Ruby to communicate with MIDI devices. We will connect to a MIDI device and then play notes using Ruby code.
00:16:58 I’ll choose my MiniBrute device to play some music using Ruby.
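A hedged sketch of how that connection can look in Ruby, assuming the unimidi gem (which the MIDI demos in the talk appear to build on); this is not the exact demo code:

    require "unimidi"

    output = UniMIDI::Output.gets   # prompts you to pick a device, e.g. the MiniBrute

    [60, 64, 67, 72].each do |note| # a C major arpeggio
      output.puts(0x90, note, 100)  # note on, velocity 100
      sleep 0.3
      output.puts(0x80, note, 0)    # note off, otherwise the note drones on
    end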
00:17:31 Unfortunately, I forgot to turn some notes off; if I had left it like that, the note would have kept playing indefinitely.
00:17:42 In addition, we can create rhythm patterns using a library called Topaz to set the tempo, simulating drum beats through Ruby code.
00:18:05 By tweaking the tempo, we can create different drumbeats that are either upbeat or slow for a more relaxed vibe.
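The demo uses Topaz for the clock; as a rough plain-Ruby sketch of the underlying idea, a tempo is just a beat length derived from the BPM (play_drum here is a hypothetical helper standing in for the actual MIDI message):

    BPM  = 120
    beat = 60.0 / BPM                 # seconds per quarter note

    pattern = [:kick, :hat, :snare, :hat]

    4.times do
      pattern.each do |drum|
        play_drum(drum)               # hypothetical: send the MIDI note for that drum
        sleep beat / 2                # eighth notes
      end
    end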
00:18:57 When I realized I could control my instruments using Ruby, I began to think about how to generate music while running tests in the background—a satisfying way to gamify the coding experience.
00:19:50 I decided to write a minitest reporter that generates a MIDI melody from our test results: an interesting way to mix coding with composing.
00:20:32 For example, if your tests pass, it might sound one way, whereas a failure could play a sad note. So you’d quickly know the state of your code just by listening to the music generated.
00:21:15 Imagine how enjoyable that would be in your office: happy notes for passing tests, sad notes for failures. I plan to make this available with configuration options, so you can also have a musical office environment.
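A hedged sketch of the reporter idea (not the actual code from the talk): a Minitest reporter only has to implement record, and can decide what to play from each result. play_note is the same kind of hypothetical helper as in the earlier sketches:

    require "minitest"

    class MelodyReporter < Minitest::AbstractReporter
      HAPPY = [60, 64, 67]  # a cheerful C major for passing tests
      SAD   = [59, 62, 65]  # something gloomier for failures

      def record(result)
        notes = result.passed? ? HAPPY : SAD
        play_note(notes.sample)   # hypothetical: send a short MIDI note
      end
    end

    # Hooked up from a minitest plugin init, which appends it to the
    # composite reporter that minitest runs.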
00:22:06 Now, as I was experimenting further, I was curious if quicksort sounds better than bubblesort. These sorting algorithms are often compared on their efficiency, but I wondered how they might sound while doing their sorting job.
00:23:12 So, I created a demo with both bubble sort and quicksort playing notes as they process. I chose to keep the values between 50 and 100 to match the instrument range.
00:23:52 First, let's try the quicksort demo and see how it sounds. It isn't very musical at all. I was also curious about the bubble sort demo, so let's take a listen to that now.
00:24:35 That one sounds a bit better—it has more pleasant notes compared to the quicksort. This way, we get a fun comparison of the algorithms based not only on performance but also on their audio characteristics.
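A rough sketch of how the bubble sort version can be sonified (my reconstruction, not the talk's code): every swap emits the swapped value as a note, so the melody calms down as the array gets more sorted. Values stay in 50..100 to suit the instrument, and play_note is again a hypothetical helper:

    def sonified_bubble_sort(values)
      loop do
        swapped = false
        (values.length - 1).times do |i|
          next unless values[i] > values[i + 1]
          values[i], values[i + 1] = values[i + 1], values[i]
          play_note(values[i])    # hypothetical: hear each swap as it happens
          swapped = true
        end
        break unless swapped
      end
      values
    end

    sonified_bubble_sort(Array.new(16) { rand(50..100) })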
00:25:18 Have you ever wondered how the Fibonacci sequence sounds? It’s fascinating how sequences in mathematics can translate into musical patterns. I'd love to see more music inspired by mathematical concepts in the future.
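One hedged way to try that yourself in Sonic Pi (my sketch, not something shown in the talk) is to fold each Fibonacci term into a playable MIDI range:

    fib = [0, 1]
    14.times { fib << fib[-1] + fib[-2] }

    fib.each do |n|
      play 50 + n % 36   # wrap each term into a three-octave range
      sleep 0.25
    end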
00:26:00 As we’ve seen, we can use Ruby to make instruments communicate effectively. It's amazing to see how coding and music can come together in fun, creative ways.
00:26:20 Now, wouldn't it be cool if we could use Sonic Pi to control our analog synthesizers? Fortunately, we can, and I have a great example of that.
00:27:00 This is a song by New Order called 'Blue Monday.' Let's play it in Sonic Pi first, without the analog instruments.
00:27:39 Now let's try to get my instruments to play it instead. By using DRb, Ruby's distributed object system, I can call code running in another process that is connected to the instruments.
00:28:25 So here’s the same song playing through my synthesizers. I can now interact with my instruments using the same MIDI patterns and code.
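A hedged sketch of that DRb bridge (names, port, and structure are mine, not the talk's): one plain-Ruby process owns the MIDI connection and exposes it over DRb, and Sonic Pi calls it like a local object:

    # midi_server.rb -- plain Ruby process that owns the hardware connection
    require "drb/drb"
    require "unimidi"

    class MidiBridge
      def initialize
        @output = UniMIDI::Output.gets
      end

      def play(note, velocity = 100, length = 0.2)
        @output.puts(0x90, note, velocity)
        sleep length
        @output.puts(0x80, note, 0)
      end
    end

    DRb.start_service("druby://localhost:8787", MidiBridge.new)
    DRb.thread.join

    # Inside Sonic Pi (the DRb client):
    #   require "drb/drb"
    #   synth = DRbObject.new_with_uri("druby://localhost:8787")
    #   synth.play(60)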
00:29:00 As we play with the settings, we can create interesting soundscapes. Ultimately, the excitement lies in the interaction between Ruby and live performance; it brings a whole new dimension to both coding and music.
00:30:00 In future versions of Sonic Pi, we will see direct support for MIDI instruments, removing the need for workarounds. So keep an eye on Sonic Pi as it evolves.
00:30:42 Moreover, there’s open-source collaboration that drives Sonic Pi forward with many valuable contributors. One of them transcribed a famous song into Ruby, showcasing how versatile this tool can be.
00:31:33 This shows how inclusive and accessible music can be when combined with coding. I believe the next step is to allow spontaneous composition in real-time using these tools.
00:32:25 Now I have a touchpad MIDI device that I can use to send notes to Sonic Pi. This device enables me to control the music without relying solely on coding.
00:33:00 With this, I can easily create music by simply pressing buttons. It’s a great way to spontaneously compose, making music more interactive and fun!
00:33:40 Additionally, I have a small keyboard that can also send notes to Sonic Pi. By plugging the keyboard in, I can incorporate its music into my sonic creations seamlessly.
00:34:55 This integration allows me to use my analog synthesizers and create sounds with a digital twist. There's a slight delay, but overall it adds an enjoyable dimension to the music.
00:35:30 Unfortunately, my brain wave scanner broke on the way here, so I couldn’t show how it could be used to generate music based on brain activity. If it’s working by the end of the day, I’ll demonstrate it.
00:36:01 With that, we've reached the end of this session. Thank you very much for joining me today. Domo arigato!