Talks

Summarized using AI

Unconventional Computing

Paolo "Nusco" Perrotta • April 21, 2016 • Earth

In the presentation titled "Unconventional Computing" by Paolo "Nusco" Perrotta at the Ancient City Ruby 2016 event, the speaker explores the stagnation in traditional computing advancements and suggests various unconventional approaches to overcome these limitations.

Key points include:
- Stagnation of Performance: Compares the past rapid growth in computing speed to the current stagnation, referencing Moore's Law and Rock's Law, as well as the physical limitations met by shrinking transistors.
- Need for Alternative Computing Methods: The speaker emphasizes the necessity for new computing paradigms, highlighting that simply adding more CPUs may not be sufficient for complex problems.
- Diverse Computation Ideas:
  - Biological Computing: Examples include using living organisms like crabs and slime molds for computation, showcasing their unique abilities to solve problems.
  - Photonic Computing: Discusses the potential of computers utilizing light, mentioning IBM's development of a fully photonic chip.
  - Chemical Computation: Explores chemical reactions, such as the Belousov-Zhabotinsky reaction, for building logic gates.
  - Quantum Computing: Highlights the allure and challenges of quantum technology and its potential to operate faster than classical computers through innovative algorithms.
- Emerging Trends in Programming: The speaker foresees a rise in functional programming and a return to mainframes as a response to the need for centralized, powerful computing resources in the medium term.
- Future Speculations: Envisions a diversifying landscape in computing technologies, leading to specialized coding frameworks that cater to unconventional computational processes.

In conclusion, Perrotta emphasizes the importance of exploring these varied computational methods to innovate beyond current limitations. He encourages an appreciation for the evolution of computing, connecting the discussion back to foundational figures like Charles Babbage.

Unconventional Computing
Paolo "Nusco" Perrotta • April 21, 2016 • Earth

Our computers have taken us to a dead end. We can't make them faster than they are. It may be time to go back to the drawing board and challenge the notion of what a computer should be like. What about computers made of light, fluid, or living beings?

Believe it or not, people are actually trying all of those ideas - and more. Let's make a sightseeing tour through the most unexpected and crazy approaches to computing.

Ancient City Ruby 2016

00:00:05.210 Hello. So, when I wrote my first professional program around 1995, it was a Java applet with a lot of graphics. It really had many shortcomings, and I couldn't even find a working copy on the Internet today. This gives you a hint as to how successful it wasn't, and it was really, really slow and unusable.
00:00:14.639 So, I did what a developer does; I optimized it. Here is how we optimized stuff in 1995. It was essentially a two-step process. Step one involved checking the turbo button. I'm actually surprised that so many people know what the turbo button is.
00:00:32.730 For those who don't know, it wasn't actually a button that made your computer faster; if anything, it made it slower. The idea was that computers were evolving so fast that today's machines would make yesterday's applications, particularly games, run unusably fast. So you slowed down the computer by pushing that button, and sometimes you forgot to release it.
00:00:51.630 If that didn't work, then there was another option: just sit and wait. Within six months to one year, your application would run just fine because you would have a new computer that was much faster than the previous one. It was just crazy how fast they evolved.
00:01:08.880 Why is that? Well, Moore's Law: the number of transistors on a chip doubles every couple of years, so on a logarithmic scale you get this nice straight line. More transistors mean more computation, and more computation means more speed. Clock speeds were climbing in the same way, and the heat dissipated per transistor kept going down.
00:01:29.010 Long story short: they got faster really reliably. Every six months to one year, you could see the difference— a huge difference. Then, around 2005, someone wrote an article, which many of you have probably read already. If you haven't, I recommend doing so; it essentially said it's over.
00:01:51.630 Did you enjoy just sitting on your hands, waiting for new computers so your software would run faster for free? Not anymore! This is a dead end. We can't keep making computers faster like that. The article was a little ahead of its time, especially regarding transistor sizes and clock speeds.
00:02:06.250 You can see how it suddenly levels out. The computer I have here has pretty much the same clock speed as my previous one; that's the first time this has happened in my entire lifetime. Now it's also happening with transistors: a very recent article from Ars Technica noted that the semiconductor industry is essentially throwing in the towel on keeping up that pace.
00:02:26.260 Why is that? Well, one issue is that transistors are becoming incredibly small. I did some back-of-the-napkin calculations. I went to the reception and asked how large these rooms were, and I got way more information than I asked for. I now feel I need to organize a conference there, otherwise, they will be very disappointed.
00:02:46.480 If you took a strand of hair and blew it up to the size of this room, a transistor at the same scale would be about one millimeter across; you would have to squint just to see it. At that magnification, squinting hard enough, you could even make out your own DNA double helix. In reality, of course, you can't see any of this: these features are well below the wavelength of visible light.
00:03:03.160 Now, we are reaching physical limits where we can't make transistors smaller because of quantum effects. But that's not entirely true; we can probably still make them one order of magnitude smaller before physics starts to catch up with us. The stumbling block right now is actually something much more powerful than physics.
00:03:19.600 I'm not talking about love here. What we have is a grim counterpart to Moore's Law, sometimes called Rock's Law. It states that the money needed to build a transistor fabrication plant roughly doubles every four years. Right now, it costs about 14 billion dollars to build such a facility.
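To get a feel for what that doubling means, here is a minimal back-of-the-envelope sketch in Python. It is my own illustration, assuming only the roughly 14-billion-dollar figure and the four-year doubling period mentioned in the talk; the projected numbers are illustrative, not industry forecasts.

```python
# Rough projection of fab cost under a "doubles every four years" rule.
# The starting figure (~$14B around 2016) and the doubling period come from
# the talk; everything else here is just illustrative arithmetic.
base_year = 2016
base_cost_billion = 14.0
doubling_period_years = 4

for year in range(base_year, base_year + 21, 4):
    periods = (year - base_year) / doubling_period_years
    cost = base_cost_billion * 2 ** periods
    print(f"{year}: ~${cost:.0f} billion per fabrication plant")
```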
00:03:34.840 Transistors are getting too expensive. Sure, the next generation of transistors will be smaller, but you probably won't be able to afford it. So, we need to come up with other ways to make computers faster. You might say, 'Hey, wait a minute! I don't need a faster CPU, maybe I just need more CPUs.' If we have big data, we can use map-reduce techniques.
00:03:54.320 Just give me more computers, right? I mean, what’s fast enough? Tinder is already fast enough, so what else do you need? Well, there are actually a few cases where faster computers are needed, primarily in artificial intelligence.
00:04:07.370 Everybody is amazed by what we're doing with artificial intelligence. Computers can wipe the floor with grandmasters at Go, and we've seen Google's robots walking through snow. But sorry to break it to you: those are relatively easy scenarios, constrained environments with well-defined rules.
00:04:18.230 If you look at tasks like navigating unconstrained environments, there is a competition called the DARPA Robotics Challenge, where robots are supposed to simulate a rescue operation: drive up to a building, enter it, and perform tasks inside.
00:04:40.370 Well, some robots could actually drive to the building, but when it came to entering it, things were kind of a disaster. This is the painful result: sometimes they couldn't even open the door. We are not going to be enslaved by these robots any time soon.
00:04:56.120 I know that showing you those robots could get me in trouble someday, but for now, as someone said, if you're worried about the Terminator, just close the damn door; they don't stand a chance. So we need faster computers, but we don't know how to build them with the technology we have.
00:05:12.600 This is what I’m going to talk about: how can we build computers that are different and possibly approach problems from entirely different points of view, potentially leading to faster solutions than current transistor-based systems?
00:05:27.360 I need to give a brief introduction to how computers work. This may be offensively basic for some of you, and I'm simplifying things, particularly the electrical side, to the point where some of it is technically wrong. But bear with me for a moment.
00:05:39.400 This is a transistor: it's essentially a switch. If current flows towards it, it doesn't get through, because the switch is open. If I apply current to the control, the switch closes and the current gets through. I know I'm playing fast and loose with current and potential here, but this is the basic idea of a transistor.
00:05:59.640 So, what do we do with this? For example, we can put two of these in line, wire them up, and connect them to ground. If I now apply current, nothing happens, because the circuit is open. If I apply current to just one of the two controls, nothing changes; the circuit is still open.
00:06:27.630 However, if I apply current to both controls, both switches close, the circuit is complete, and the current flows straight down to ground, so nothing is left at the output. The only time there is no current at the output is when the whole circuit is closed.
00:06:45.390 This gives us a truth table: the only way not to have current at the output is to have current on both inputs. Now, let's replace current with 'true' and no current with 'false.' What we get is a logic gate: a device that takes inputs and produces an output according to a fixed logical rule.
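Written out, the behavior just described gives the following truth table (this rendering is mine, but it follows directly from the circuit above):

```
 input A | input B | output
 --------+---------+--------
  false  |  false  |  true
  false  |  true   |  true
  true   |  false  |  true
  true   |  true   |  false
```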
00:07:01.240 This particular one is called a NAND gate ('not and'). So, what can I do with a NAND gate? I can tie its two inputs together so they always carry the same value; the output is then the opposite of that value. That gives us a NOT gate.
00:07:22.950 What can I do with a NOT gate? I can take a couple of them and put one on each input of a NAND gate. Now, to get a false output from this combination, both inputs of the NAND must be true, which means both original inputs must be false; any other combination gives true at the output.
00:07:39.360 Can you recognize this logic gate? It's an OR gate. In the same way, I can build the whole range of gates, assemble them into a CPU, and build memory with them.
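As a quick sanity check of this "everything from one gate" idea, here is a small Python sketch (my own illustration, not code from the talk) that models the NAND gate as a function and derives NOT, AND, and OR from it, exactly as described above:

```python
# A NAND gate as a pure function: the output is false only when both inputs are true.
def nand(a: bool, b: bool) -> bool:
    return not (a and b)

# Tie both inputs of a NAND together and you get a NOT gate.
def not_gate(a: bool) -> bool:
    return nand(a, a)

# Invert the output of a NAND and you get an AND gate.
def and_gate(a: bool, b: bool) -> bool:
    return not_gate(nand(a, b))

# Put a NOT on each input of a NAND and you get an OR gate.
def or_gate(a: bool, b: bool) -> bool:
    return nand(not_gate(a), not_gate(b))

if __name__ == "__main__":
    # Print the truth tables to check that the derived gates behave as expected.
    for a in (False, True):
        for b in (False, True):
            print(f"a={a!s:5} b={b!s:5}  NAND={nand(a, b)!s:5} "
                  f"AND={and_gate(a, b)!s:5} OR={or_gate(a, b)!s:5}")
```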
00:08:02.900 That’s how computers work. And the point I want to make is: it’s not about the transistors. We don’t care about the transistors; we care about the gates. Those are the essential components.
00:08:14.630 So, it’s about the gates. If we can build gates with anything that is not a transistor, we are set! The real question becomes: how can we build gates with other materials? We indeed can build gates with a variety of different things.
00:08:32.040 For example, we can build gates with domino pieces: a fallen domino is true, a standing one is false. An AND gate would then be an arrangement where the output piece falls only when both input chains have fallen.
00:08:49.609 Can I design an AND gate with domino pieces? That's more challenging. Should I spoil it for you, or would you prefer to spend a geeky afternoon designing these things yourself? But fine, if you don’t want it spoiled, close your eyes.
00:09:05.730 This piece inhibits the input line by knocking out these two pieces, and for that to happen, this other line must not be inhibited, which means it must be false. It gets a bit complex.
00:09:18.470 Nonetheless, we can assemble logic gates from domino pieces that work for real. They might not be the fastest computers, nor the most reliable, but they do perform basic functions.
00:09:30.830 There are even more bizarre examples: people are actually working on building logic gates with live crabs. Researchers in Japan are publishing real papers on the subject!
00:09:41.810 These crabs swarm in a very predictable way, and you can encourage a swarm to move by projecting a shadow over it. When swarms coming from two sides meet, they merge and keep moving together, and that behavior can be used to implement an AND gate.
00:09:58.470 Now, I'm not advocating for building computers out of stressed-out crabs. It's just about coming up with ideas that go beyond conventional logic, and there are many possibilities.
00:10:15.300 For example, if we want computers to be faster and smaller, we're using electrons right now. What’s faster and smaller than an electron? A photon!
00:10:28.057 Yes, we can build computers that operate using light. I mean, we are already using light to transmit information, especially over long distances, to move vast amounts of data. This is a significant area of research.
00:10:42.370 Just last year, IBM built the first fully photonic chip, one that works with light. There are still technical difficulties in making such systems reliable, though: light-based components are sensitive, and converting back and forth between light and electricity wastes energy.
00:11:05.590 Another fun avenue for computation is chemistry. One example is the Belousov-Zhabotinsky reaction: it's an oscillating reaction, so it keeps cycling for a long time, and that unusual property makes it possible to build logic gates out of chemical processes.
00:11:21.650 Finally, we cannot overlook the buzz around quantum computing. Quantum computing is intriguing for several reasons. Firstly, the computers look cool! They have this steampunk aesthetic to them.
00:11:37.950 Secondly, their functioning largely eludes our intuition. It's challenging for most of us to grasp concepts like superposition, where a qubit can be zero, one, or a combination of both until it is observed.
00:11:54.030 Wrapping your head around Schrödinger's cat is an enormous challenge. But the fast algorithms that could potentially run on quantum devices offer a glimpse into a whole new realm of computation.
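To make "zero, one, or a combination of both" slightly more concrete, here is a tiny numerical sketch of a single qubit. This is my own illustration using the standard textbook formulation, not anything shown in the talk: a qubit's state is a pair of amplitudes, and measuring it yields 0 or 1 with probabilities given by the squared magnitudes of those amplitudes.

```python
import math
import random

# State of one qubit: amplitudes for |0> and |1>. Here, an equal superposition,
# the state you get by applying a Hadamard gate to |0>.
amp0 = 1 / math.sqrt(2)
amp1 = 1 / math.sqrt(2)

# The probability of each measurement outcome is the squared magnitude of its amplitude.
p0 = abs(amp0) ** 2
p1 = abs(amp1) ** 2
print(f"P(measure 0) = {p0:.2f}, P(measure 1) = {p1:.2f}")

# Simulate a few measurements: each one collapses the superposition to 0 or 1.
outcomes = [0 if random.random() < p0 else 1 for _ in range(10)]
print("Ten simulated measurements:", outcomes)
```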
00:12:10.450 Companies are already working on actual quantum computers, however, and the debate goes on as to whether those machines are true quantum computers or essentially behave like conventional ones.
00:12:24.130 The main way to tell whether they are true quantum computers is to check their speed, because on the right problems they should ideally be millions of times faster than current systems. But first we need to develop them to that point!
00:12:38.330 I’ve wandered a bit from discussing the basic logic gates. What I want you to take away is that we are not just talking about classic Boolean logic; this is different from what we're traditionally used to.
00:12:53.930 We are building computers that may not be general-purpose but can solve specific problems through tailored computation approaches. For instance, using DNA as a computational medium is becoming a thing, given that it stores information and can duplicate with remarkable efficiency.
00:13:09.090 There have been some successful examples of utilizing DNA to solve classic computational problems, and if you’re curious, look up articles from credible sources like The Economist detailing how they achieved this.
00:13:20.770 Another fascinating possibility is using living organisms like slime molds in computation. These creatures can display incredible problem-solving capabilities.
00:13:34.290 When fed, a slime mold will stretch and grow towards food sources, navigating its environment efficiently and displaying surprisingly good pathfinding abilities.
00:13:46.340 They have been used in experiments where oat flakes were placed at the two ends of a maze. The slime mold explored the maze and ended up leaving a single strand of itself along the shortest route between the two flakes.
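In conventional terms, what the slime mold is doing here is solving a shortest-path problem. Purely as a point of comparison, and as my own illustration rather than anything from the talk, this is how a classical program would tackle the same kind of maze with a breadth-first search:

```python
from collections import deque

# A tiny maze: '#' is a wall, '.' is open space, 'A' and 'B' mark the two oat flakes.
MAZE = [
    "A.#....",
    ".#.#.#.",
    ".#...#.",
    "...#..B",
]

def find(grid, target):
    # Locate the cell containing the given marker.
    for r, row in enumerate(grid):
        for c, cell in enumerate(row):
            if cell == target:
                return (r, c)

def shortest_path_length(grid, start, goal):
    # Breadth-first search: explore cells in rings of increasing distance,
    # so the first time we reach the goal we have taken the fewest possible steps.
    queue = deque([(start, 0)])
    seen = {start}
    while queue:
        (r, c), dist = queue.popleft()
        if (r, c) == goal:
            return dist
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < len(grid) and 0 <= nc < len(grid[0]) \
                    and grid[nr][nc] != "#" and (nr, nc) not in seen:
                seen.add((nr, nc))
                queue.append(((nr, nc), dist + 1))
    return None  # no route between the two points

print(shortest_path_length(MAZE, find(MAZE, "A"), find(MAZE, "B")))
```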
00:14:01.260 There are many interesting experiments where scientists and amateur developers alike are using slime molds for computation, and there are whole talks that explore this in fascinating depth.
00:14:18.810 All of this is part of a collective effort, bigger than any single project, to explore unconventional computing methods outside traditional transistor-based systems.
00:14:33.340 As for predictions, I suppose I should try to say what's going to happen in the short term, the medium term, and the long term. In the short term, functional programming is garnering more and more attention.
00:14:51.660 How is that related? Well, since we can't make individual computers faster right now, we have to parallelize our programs, and our mainstream imperative languages, built around shared mutable state, don't make that easy. Functional programming does it better.
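As a small, hedged illustration of why that matters (my own example, not the speaker's): when a computation is expressed as a pure function mapped over data, with no shared mutable state, spreading it across cores is almost mechanical.

```python
from concurrent.futures import ProcessPoolExecutor

def score(n: int) -> int:
    # A pure function: the result depends only on the input, and nothing outside
    # the function is modified, so calls can safely run in any order or in parallel.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    inputs = [100_000, 200_000, 300_000, 400_000]

    # Sequential version: an ordinary map.
    sequential = list(map(score, inputs))

    # Parallel version: the same map, spread across worker processes.
    with ProcessPoolExecutor() as pool:
        parallel = list(pool.map(score, inputs))

    assert sequential == parallel
    print(parallel)
```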
00:15:06.930 In the medium term, we're witnessing a notable trend where mainframes, in a sense, are making a comeback. It might sound absurd, but it's true: rather than keeping everything on personal machines, heavy processing is moving back to centralized, powerful computers, simply because of the immense processing power required.
00:15:19.740 For the long term, we might witness many specialized technologies addressing specific issues. This might lead to a Cambrian explosion of different computing approaches.
00:15:36.200 What would programming look like for biologically-driven processes? If we gain new technologies for specific computing problems, we could expect a diverse and specialized coding landscape to emerge.
00:15:50.780 Overall, computing keeps evolving, and it's going to be exciting to see how things develop from here, especially compared to how things looked just a few decades ago.
00:16:02.310 Thank you for your attention, and don't forget to have respect for the legacy of Charles Babbage.
Explore all talks recorded at Ancient City Ruby 2016