The Not So Rational Programmer
by Laura Eck

Summarized using AI

In her talk titled "The Not So Rational Programmer" presented at RubyConf 2015, Laura Eck explores how cognitive biases affect programmers' decision-making processes. Eck likens the human brain to an outdated legacy system, highlighting its strengths and limitations in processing information. The core theme is the recognition of cognitive biases—mental shortcuts that can lead to suboptimal decisions—and how understanding these biases can enhance a developer's effectiveness and collaboration.

Key points discussed include:
- Cognitive Biases Defined: Eck explains that cognitive biases are heuristics our brains use to process information rapidly, which, while useful, can lead to poor decisions.
- Understanding Brains as Legacy Systems: The speaker draws a parallel between human brains and outdated systems, suggesting that, like software, our cognitive processing has flaws and lacks proper debugging tools.
- Importance of Rationality in Programming: Though programmers often believe themselves to be more rational due to their work with machines, Eck emphasizes that they are still subject to human biases.
- Examples of Cognitive Biases:
  - Confirmation Bias: The tendency to search for or interpret information in a way that confirms existing beliefs, often ignoring contradictory evidence.
  - Mere Exposure Effect: A preference for familiar items over unfamiliar ones, which can lead to poor decision-making based on comfort rather than objective analysis.
  - False Consensus Effect: The assumption that others share the same opinions or beliefs, which can result in communication issues within teams.
  - Groupthink: A phenomenon where the desire for harmony in a group leads to a lack of critical thinking, potentially stifling innovation and dissenting views.
- Strategies to Mitigate Biases: Eck suggests strategies for reducing the influence of bias in decision-making, such as challenging personal opinions, establishing objective criteria for evaluations, and encouraging open communication in teams.

Eck concludes her talk by urging attendees to embrace the complexities of cognitive biases without attempting to eliminate them. By recognizing and understanding their biases, programmers can improve their teamwork and decision-making skills, ultimately leading to greater success in their development careers. The key takeaway is the importance of being open to changing one’s opinions in response to new information, as this is a mark of growth and adaptability in the field of programming.

00:00:15.280 All right, hi everyone, I'm Laura, and I'm a developer. I work for testCloud in Berlin; I live in Tokyo and work remotely from there.
00:00:26.480 This is me, just in case you wanted to see another image of myself.
00:00:33.760 To start out with, I have a question for you—or actually, two questions. But I'll start with question number one: Who in here has ever had to work with a really weird old legacy system? Not even rebuild it completely, just had to deal with it somehow. Please raise your hand if you have.
00:00:45.079 All right. Second question: Who here in this room has a brain? Please raise your hand if you do. Not everyone has a brain? That's kind of surprising, but I assumed all of you have one.
00:01:10.439 So everyone who raised their hand for question number two should also have raised their hand for question number one because what our brain really resembles is a big, fat old legacy system. It's been a while since the last hardware update.
00:01:31.840 The good news is that our brain is extremely powerful, and it can do a lot of amazing things. The bad news is that the documentation is pretty lackluster, and its error handling isn't that great either. We can't even debug it because we don't have access to the code base at all.
00:01:49.439 That sounds like the nightmare of every programmer, doesn't it? The problem is we can't just walk out on our project manager and quit. We're kind of stuck with this obnoxious and brilliant heap of jelly in our skulls that helps us process and react to our environment. It allows us to reason about abstract problems, communicate, and even program.
00:02:13.760 On the other hand, it constantly forgets people's names, reminds us of awkward situations from three years ago in a random fashion, and it makes decisions for us without even asking. So today, I would like to talk about how we can understand our brains better, the way they work, these weird little flaws called cognitive biases, and what to do about them.
00:02:56.280 You see, as programmers, we like to think of ourselves as a group of people that is somehow more rational than others. After all, we earn our living by talking to machines every day, and machines aren't exactly known for being super emotional. So, when we make technical decisions or plan projects, it's only fair to assume that we adhere to rational standards, right? Well, I have a surprise for you: programmers are human, and they have human brains.
00:03:19.560 To be fair, most of the time our brains do an amazing job. They process vast amounts of information and provide us with appropriate reactions to deal with everything coming our way. The human brain is really old, and many parts developed when social coherence was critical for survival. Back then, a quick and accurate assessment of threats and of your standing with your peers mattered; being ostracized could potentially mean death.
00:03:34.840 So, what is cognitive bias? Cognitive biases are heuristics that the brain uses to process information quickly and come up with appropriate reactions. They're usually pretty useful, but they aren't perfect processes. They can lead to suboptimal behavior or decisions.
00:03:53.960 The thing is this: we are all biased. It's essential to understand that this is natural and how our brains work. It's not inherently a bad thing—our brains use shortcuts to manage the information we receive and provide reasonable outcomes to avoid information overload.
00:04:06.480 In fact, our brains function in different modes, and I will show you two simple examples to illustrate this. When you look at the next slide, some things will happen without you even realizing it. You recognize immediately that this is a person, specifically a child. You can generally tell they are unhappy without even knowing them personally. If that child stood in front of you right now, you might prepare for them to cry or shout.
00:04:36.640 This process of recognition and perception occurs quickly and effortlessly. It's evolutionarily important to understand who is around us and their emotions. In this mode, the brain makes quick, automated decisions we're often unaware of, sacrificing accuracy for speed and approximated results that are acceptable most of the time.
00:04:59.680 As an opposite example, unless you're a mental math whiz or remember your multiplication tables from elementary school, your brain probably went blank when faced with a multiplication problem. This isn't something our brains are evolutionarily designed to handle, and they struggle with it.
00:05:18.960 You can recognize the multiplication problem visually, but calculating the actual result requires active thought, which is slow and more difficult in comparison to fast thinking.
00:05:30.240 Put simply, when our brain is in fast thinking mode, it uses cognitive biases to approximate solutions. Like any approximation, this mode does not prioritize optimal solutions. Instead, it focuses on delivering feasible solutions within a reasonable time frame. We cannot turn this off—it is hardwired into our brains. However, there are situations where we can work around it.
00:06:03.360 Nonetheless, sometimes we might not recognize that it's happening, and even when we do, it might not always be feasible to act on it. If we questioned every single move our brain made, we would be paralyzed.
00:06:10.720 Yet, there are instances when it’s valuable to work around biases actively. Decision-making is one such instance. We constantly make decisions, from small ones like choosing lunch to significant ones about how to make our lives meaningful.
00:06:26.240 Most of our work decisions fall somewhere in between: how to implement a new feature, which frameworks to use for this new project, or which candidate to hire.
00:06:37.360 These are vital decisions, and our cognitive biases affect each one of them. So what can we do to make our decisions as good as possible?
00:06:48.160 I will start by examining cognitive biases that influence our personal decision-making. Since we as programmers don't always work alone but often in teams, I will then cover some biases that affect decision-making within a team as well.
00:07:12.720 The first cognitive bias to address is confirmation bias. This bias means that when we search for information or interpret it, we tend to do so in ways that confirm our already-held beliefs. This affects both our information searching and interpretation.
00:07:40.560 We often disregard potential alternatives while seeking to confirm our existing assumptions. If we feel strongly about something, we might get upset if someone challenges our viewpoint. Many individuals hold strong opinions on emotionally charged topics like abortion or gun control.
00:07:57.440 In such cases, regardless of correctness, if we encounter information that contradicts our beliefs, we may dismiss it as nonsense rather than consider it.
00:08:05.920 Technical decisions are usually less emotionally charged, but let's be honest: if you strongly believe that Rails is the best framework for the new project, you are less likely to heed those pointing out its drawbacks.
00:08:16.720 You might listen briefly but will quickly discard their opinions in favor of your own and fail to explore them deeply. Moreover, you will likely seek out information reinforcing your belief that Rails is the optimal choice.
00:08:38.560 Confirmation bias exemplifies how brains often choose speed and less effort at the expense of accuracy and correctness. We carry numerous preconceived notions about the world, which often hold true enough; hence, our brains don’t constantly check if our opinions are correct.
00:09:03.056 However, the problem arises when the opinion we are reinforcing is not a suitable solution for the challenges we encounter.
00:09:18.360 So, what can we do about confirmation bias? One effective strategy is to challenge your own opinions. Try to prove yourself wrong when making important decisions.
00:09:32.280 Put yourself in the shoes of someone assigned to demonstrate that your approach is flawed. This isn't easy, but it can offer you a fresh perspective and reveal considerations you might have missed.
00:09:50.440 If you're uncertain of your ability to do this sincerely, ask a trusted coworker to play devil’s advocate and challenge your opinions. Avoid being defensive; consider what this person communicates.
00:10:11.720 Adjust your opinion if necessary. This isn't simple, but if we aren't ready to change our ideas, there is ultimately little benefit in trying to work around our cognitive biases at all.
00:10:28.000 If it turns out your original thought still seems to be the best choice, that’s not a bad outcome either.
00:10:42.720 Next, let’s consider another cognitive bias strongly influencing our decision-making: the mere exposure effect. This effect suggests that we tend to prefer things simply because they are familiar to us.
00:10:58.720 This preference likely stems from survival instincts; things we know and understand create cognitive ease, which makes us feel safe.
00:11:06.880 Cognitive ease results in a more casual and superficial way of thinking, while cognitive strain occurs when we face unfamiliar concepts, prompting us to engage more deeply with the material.
00:11:15.600 In challenging situations, cognitive strain results in more careful and accurate thinking, although it also slows us down and reduces creativity.
00:11:38.640 Naturally preferring familiar things is an instinctive behavior shaped in our brains. However, as with confirmation bias, we can mitigate this tendency by questioning our preferences.
00:11:55.120 For example, when choosing a framework, ask yourself: Do we like it because we are familiar with it or is it genuinely the best option? Do we dismiss an applicant based on actual reasons, or simply because they're not what we are used to?
00:12:14.080 While it’s important to not disregard familiarity entirely—being accustomed to something helps us hit the ground running—there should be a reflective process to ensure we aren’t operating on autopilot simply because it feels comfortable.
00:12:32.640 To handle the mere exposure effect, establish a list of objective criteria for significant decisions. Use these criteria to evaluate your options before diving into subjective impressions.
00:12:52.960 Take notes on your personal impressions separately after that; a hybrid of objective evaluation and subjective reflection has been shown to produce good results.
00:13:11.480 When making decisions, focus primarily on objective criteria before referring to your personal impressions. If possible, separate the evaluator from the final decision-maker to limit personal biases in the process.
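To make this concrete, here is a minimal Ruby sketch of such a criteria-first evaluation (not from the talk; the criteria, weights, and scores are invented for illustration):

```ruby
# Hypothetical example: score each option against the same agreed, objective
# criteria first; subjective impressions are recorded separately afterwards.
CRITERIA_WEIGHTS = {
  team_experience:   0.20,
  documentation:     0.30,
  performance:       0.25,
  community_support: 0.25
}.freeze

# Scores from 1 (poor) to 5 (excellent), filled in by the evaluators.
scores = {
  "Framework A" => { team_experience: 5, documentation: 5, performance: 3, community_support: 5 },
  "Framework B" => { team_experience: 2, documentation: 4, performance: 4, community_support: 3 }
}

def weighted_score(option_scores)
  CRITERIA_WEIGHTS.sum { |criterion, weight| option_scores[criterion] * weight }
end

scores.each do |option, option_scores|
  puts format("%-12s %.2f", option, weighted_score(option_scores))
end
```

Ranking options this way first makes it harder for the mere exposure effect to smuggle "we already know it" in as the deciding factor; familiarity can still count, but as an explicit, weighted criterion rather than a gut feeling.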
00:13:33.080 When discussing decisions, it’s essential to engage with the topic clearly. Often, we assume everyone is on the same page, which can lead to confusion.
00:13:54.560 The false consensus effect is an example of this phenomenon, where one assumes their views are widely shared, resulting in misalignment of understanding in group discussions.
00:14:10.640 This overestimation of agreement stems from poor social judgments and our tendency to project our attitudes onto others. Without clear communication, we could assume that everyone thinks as we do.
00:14:29.960 Additionally, a dominating individual can lead to false consensus; their convincing discourse might deter others from expressing dissenting opinions. Consequently, decisions may reflect only one person's perspective.
00:14:50.920 To address the false consensus effect, be explicit about the focus of discussions. Ensure everyone understands the topic in question and the decision at hand, so we can proceed cohesively.
00:15:06.920 Encourage inquiries to clarify understanding before diving into discussion. Additionally, collect opinions in silence before engaging in open discussion to ensure all voices contribute.
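As a toy illustration of collecting opinions in silence before the open discussion, a sketch along these lines (the roster and prompt are made up) keeps every position hidden until all of them have been committed:

```ruby
# Toy sketch: gather each team member's position independently before any
# are revealed, so the first or loudest voice can't anchor the others.
team = %w[alice bob carol]
positions = {}

team.each do |member|
  print "#{member}, your position (hidden until everyone has answered): "
  positions[member] = gets.chomp
end

# Only now, with all positions committed, are they revealed for discussion.
positions.each { |member, position| puts "#{member}: #{position}" }
```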
00:15:22.080 Another bias affecting teams is groupthink. This phenomenon occurs when group cohesion compromises critical evaluation of differing viewpoints, potentially diminishing the creativity and independence necessary for informed decisions.
00:15:40.720 Groupthink can result in members deferring to what they perceive as the majority opinion or suppressing contrary viewpoints—loyalty to the group leads to a lack of sincere discourse.
00:16:01.120 To counteract groupthink, cultivate diversity in teams. Group cohesion is beneficial for decision-making, yet without diverse perspectives, a team tends to keep producing the same ideas rather than genuinely new ones.
00:16:19.240 Encourage critical evaluation and a safe environment for individual expression. If members fear consequences for speaking out, they may choose silence over constructive debate.
00:16:39.720 If you lead a team, avoid stating your opinions at the beginning of discussions. This approach can unintentionally stifle contributions from others. Rather, let team members express their viewpoints first.
00:16:56.960 Moreover, invite outside expertise into discussions. Exploring perspectives beyond your group can mitigate echo chambers.
00:17:16.960 Finally, consider appointing a devil's advocate to challenge ideas regularly. This practice can institutionalize critical thinking and keep discussions healthy and open.
00:17:35.360 Remember, if everyone agrees without discussion, view it with suspicion. Could this be groupthink?
00:17:53.920 Let's recap the cognitive biases discussed: We explored confirmation bias, which influences personal decision-making by reconfirming existing opinions, the mere exposure effect, which steers us to favor what we know, and the false consensus effect, which leads teams to overestimate shared views.
00:18:09.680 Lastly, we covered groupthink, which arises when preserving group harmony compromises critical evaluation.
00:18:26.680 An important point to remember is that it's okay to change our opinions with new information. It's a sign of growth, not weakness.
00:18:36.800 Daniel Kahneman, a prominent psychologist and expert on cognitive bias, emphasizes our inclination to ignore our ignorance. We can't eradicate cognitive bias, and trying to do so could hinder our cognitive function.
00:18:56.800 Instead, we should strive to recognize and learn about them, allowing us to address biases effectively.
00:19:13.520 By understanding ourselves better, cultivating teamwork, and enhancing decision-making skills, we become more successful developers.
00:19:29.520 Let us embark on the journey toward reducing our ignorance, and I hope this talk has provided you with a solid starting point. Thank you for listening.