Delivering fast and slow - Ethics of quality

Lena Wiberg • June 03, 2021 • Helsinki, Finland (online)

Summarized using AI

In her presentation at Euruko 2021, Lena Wiberg discusses the ethical implications of software delivery amid growing pressure to ship quickly and efficiently. She emphasizes that speed should not come at the cost of quality, safety, or ethical responsibility in software development, highlighting the potentially disastrous outcomes of poor decisions in technology.

Key Points:

- Background of Lena Wiberg: Lena has transitioned from a developer to an engineering manager, emphasizing continuous improvement and the importance of team growth.
- The Reality of Fast Delivery: The demand for faster, cheaper, and more efficient software delivery often creates a precarious environment where the risk of failure increases dramatically due to decisions made under pressure.
- Hypothetical Team Scenario: Lena presents a hypothetical situation involving team members with different roles, illustrating how miscommunication and the fear of speaking up can result in negative outcomes, such as a security flaw that leads to identity theft.
- Real World Examples of Failure: She references major incidents like the Boeing 737 MAX crashes and the Mars Climate Orbiter failure, drawing parallels to the software industry, where quick decisions without thorough testing can have severe consequences.
- Ethics in Technology: Lena introduces the 'Ten Commandments for Ethical Techies,' which outline the importance of understanding professional, legal, and ethical obligations in software development.
- Cognitive Bias in Decision Making: She discusses Kahneman's concept of "thinking fast and slow," which highlights how quick, impulsive decision-making can lead to oversights and poor judgment in software solutions.
- Balance in Development: Emphasizing the need to balance rapid delivery with ethical considerations, Lena warns against prioritizing speed at the expense of safeguarding user interests and software integrity.
- Responsible Software Development: Lena calls on every developer to recognize their role and responsibility in maintaining ethical standards, suggesting that raising concerns about potential issues is crucial to the integrity of their work and the safety of users.

Conclusion: Wiberg urges software developers and stakeholders to examine their commitment to ethical practices in delivery processes, ensuring that their decisions contribute not only to competitive advantage but also to the safety and well-being of users. Documenting decisions, maintaining transparency, and being willing to raise the alarm about potential issues are essential takeaways from her talk.

In the end, the importance of making sound choices, prioritizing ethics, and upholding safety in technology is clear: actions must always align with the intent to protect users and maintain professional standards.

Delivering fast and slow - Ethics of quality
Lena Wiberg • June 03, 2021 • Helsinki, Finland (online)

Daily, we are pushing the boundaries of how fast we can deliver software. Constantly running on a knife’s edge between great success and horrible failure.

Delivering something new, better, faster than our competition can mean incredible payoff, and we are constantly being asked to cut costs and deliver more, faster, cheaper. But then suddenly, you fall off the other side of the edge and wake up to 189 dead in a plane crash, or having to take down and redesign your entire banking service because the architecture didn't hold up to the load. It probably wasn't your decision to push that to production, but one can imagine that a long chain of people must have made a number of small (or huge) decisions that led up to that result.

EuRuKo 2021

00:00:00.160 Let's talk a little about Lena Wiberg. Lena started out as a wide-eyed developer in 1999. After a decade of coding, she found testing.
00:00:08.720 Later, in 2017, she moved into management. The skills she gained from building and testing software also work wonders when making people, teams, and organizations grow.
00:00:15.519 She believes in continuous improvement, keeping up to date, and always challenging our assumptions and the way things are done. She is also an avid speaker, facilitator, and creator of the "Would You Risk It" card deck. Currently, she is working as an engineering manager at Mentimeter.
00:00:35.040 There's a lot going on with her. About Lena's talk today: daily, we are pushing the boundaries of how fast we can deliver software, constantly walking a tightrope between great success and horrible failure.
00:01:06.240 Delivering something new, better, and faster than our competition can yield incredible payoffs, and we are often asked to cut costs and deliver more—faster and cheaper. But suddenly, we can fall off the edge and wake up to 189 dead in a plane crash or having to take down and redesign our entire banking service because the architecture didn't hold up under load.
00:01:37.759 It probably wasn't your decision to push that to production; one can imagine a long chain of people making a series of small—or huge—decisions that led to that disastrous result. Please, let's welcome Lena Wiberg, delivering fast and slow: ethics of quality.
00:02:25.040 Hello! Basically, you got the too long; didn't read (TL;DR) of my presentation in that introduction, but let's see if I can expand a bit on what I mean with the different parts.
00:02:38.160 As mentioned, I started out as a developer, building software for 11 years before finding testing, which presents a different kind of challenge. For the last five years, I've also worked more on building teams and people.
00:03:06.319 I currently work as an engineering manager at Mentimeter, which I hope many of you know about. If you don't and want to know more, just connect with me on any social platform, and I can tell you more.
00:03:20.159 I would like to start by sharing a hypothetical problem that a team faces. In our team, we have Ian, a junior developer; Sam, our senior tech-savvy code ninja; Alex, our business analyst; Morgan, our tester; and Kim, our project manager. Also connected to this team is Chris, who is our user.
00:04:06.319 It all starts out with Sam. In this case, Sam is a real techie who loves trying out new things. So, she picks up a feature that allows her to try a new framework and tool.
00:04:15.200 However, she gets caught up in experimenting with this new tool, and in the end, we end up with technical debt because Sam has to implement a quick and dirty solution to move the project forward.
00:04:30.240 Sam and our business analyst, Alex, enjoy pairing and discussing different solutions to problems. As you all know in agile, we value personal conversations over documentation, and Sam and Alex are a perfect example of this.
00:05:03.600 They decide to change several aspects of a feature, but they forget to document what exactly they changed. The result is that different people in the team have different interpretations of what this feature is actually supposed to do.
00:05:23.039 Next, cue Ian, our junior developer. Ian is new to the team, and I'm not sure if this is an international thing or just a Swedish one, but in Sweden you have a long probation period during which you can be let go quite easily.
00:05:41.440 Ian is a bit afraid of losing his job. So, when Ian doesn't understand how to implement a feature and struggles to map the chosen framework onto the user stories, he doesn't speak up.
00:06:05.280 This means that the actual implementation turns out to be slightly off. Morgan, our tester, notices the problem while testing and goes to talk to Kim, our project manager, about it.
00:06:24.479 They agree that this is likely a very improbable edge case. Morgan sees herself as a sharer of information, not a decision-maker: she informs Kim of the issue and leaves the call to him, and Kim chooses not to address it.
00:06:41.919 Kim has this huge project and the responsibilities surrounding it. Kim believes that the cost of delaying the project would be too significant, and he thinks the risk is worth taking to just release to production.
00:07:03.199 Thus, Kim hides this potential bug from the board, and unfortunately, the bug goes into production.
00:07:28.480 Now, on the other hand, we have Chris, the user of this new software. For Chris, the effect is that her data is leaked: the bug turns out to be a security flaw that grants someone access to her data.
00:08:02.560 Of course, we all know users don't do what we expect. Many users out there aren't as tech-savvy as we assume them to be, and Chris happens to reuse her username and password on other sites.
00:08:34.080 In this hypothetical scenario, Chris actually gets her identity stolen. As I mentioned, this is a hypothetical team and project, but we can all agree that things like this happen all the time.
00:09:10.640 So, what is the worst that can happen? Well, hopefully, you've heard of the Boeing 737 MAX tragedies. The consequence in this case was the loss of more than 300 lives.
00:09:25.760 The situation began in 2010, when a competitor announced that they would produce a new, cost-efficient airplane. Boeing panicked and rushed into production an adaptation of a very popular existing model.
00:10:12.960 To accommodate larger engines, they had to shift their placement on the aircraft, which made the plane slightly unbalanced. To correct this, they introduced software to adjust the aircraft's nose angle.
00:10:53.838 However, this was rushed, and they skipped adequate pilot training and pushed all paperwork, analysis, and reports aside, resulting in two fatal crashes.
00:11:05.600 Their competitor, Airbus, was not immune either: one of their models had an internal timer issue that required the aircraft to be rebooted every six days. The cost of properly fixing the problem was deemed too high, so the periodic reboot became the workaround.
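The precise cause inside that avionics software was never made public, but a tick counter that overflows after a fixed amount of uptime is a classic way such a bug arises. Here is a minimal Ruby sketch of the arithmetic; the 1 kHz tick rate and the 32-bit counter width are purely illustrative assumptions, not details from the real system:

```ruby
# Hypothetical sketch: how long a signed 32-bit uptime counter lasts.
# Tick rate and counter width are assumptions for illustration only.
INT32_MAX = 2**31 - 1     # largest value a signed 32-bit counter can hold
TICKS_PER_SECOND = 1_000  # assumed 1 kHz system clock

hours_until_wrap = INT32_MAX / (TICKS_PER_SECOND * 3600.0)
puts format("counter wraps after ~%.1f hours of uptime", hours_until_wrap)
# => counter wraps after ~596.5 hours (about 24.8 days)
# Past the wrap, any elapsed-time arithmetic built on the counter silently
# goes wrong, hence the blunt workaround of rebooting before the limit.
```

A faster tick rate shrinks the window proportionally, which is how a fleet can end up with a hard reboot deadline measured in days rather than weeks.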
00:11:50.640 It does raise concerns about the safety of flying, doesn't it? We have also seen incidents like the Mars Climate Orbiter, where one part of the software used the imperial system while another used the metric system, leading to the loss of the spacecraft.
00:12:26.159 This failure cost about $550 million. In Sweden, we had a journaling system for a particular hospital that couldn't handle Swedish characters, so the system froze whenever those characters were entered.
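The orbiter failure was, at its core, a units mismatch at a module boundary. As a hedged illustration in Ruby, and not the mission's actual code, here is one way to carry units alongside values so that a mix-up fails loudly instead of silently corrupting a calculation (the Force struct, unit symbols, and constant name below are our own inventions):

```ruby
# Standard conversion factor: one pound-force is about 4.448222 newtons.
NEWTONS_PER_POUND_FORCE = 4.448222

# A value that knows its own unit; conversions happen at the boundary.
Force = Struct.new(:value, :unit) do
  def to_newtons
    case unit
    when :newtons     then value
    when :pound_force then value * NEWTONS_PER_POUND_FORCE
    else raise ArgumentError, "unknown unit: #{unit}"
    end
  end
end

thrust = Force.new(100.0, :pound_force) # one team reports in pound-force
puts Force.new(thrust.to_newtons, :newtons)
# => #<struct Force value=444.8222, unit=:newtons>
```

The design choice is simply to convert once at the boundary and keep everything downstream in a single unit; an unrecognized unit raises an error instead of passing a bare number through.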
00:12:46.880 And you might think, 'Well, I don't work with hardware; I only work with banking systems.' But Australia had to refund a lot of money because of missed use cases in its automated debt-collection system.
00:13:25.840 Then there's the Robinhood incident, where a student's trading app misleadingly showed him a massive debt he did not actually owe, and he sadly took his own life.
00:13:46.560 Ethics is something we truly need to care about when building software, and I will discuss the Ten Commandments for Ethical Techies by Fiona Charles from Quality Intelligence.
00:14:11.760 It starts with understanding our professional obligations. For testers, this means discovering information and relaying it; for developers, it involves building software and architecture that are safe, fast, and effective.
00:14:37.920 This may differ based on your role. We also need to comprehend our legal and contractual obligations, including standards and regulations we must follow, such as GDPR.
00:14:58.000 While it may not be necessary to know all details about GDPR, we should understand enough to ensure we do not implement something illegal. Individually, we also need to know our own bottom line.
00:15:38.000 What lines are we unwilling to cross? How far are we willing to go? Will we hide a potential bug because we don’t think it’s important? We need to consider the worst outcomes for ourselves and others.
00:15:54.880 Next, we must understand the interests we serve. For example, if we are building X-ray machines, we likely serve doctors and patients rather than shareholders expecting profit.
00:16:03.840 If we're developing voting applications, our true users are voters and the government, not the company seeking profit.
00:16:10.720 We need to be aware of the potential harm our software may cause, considering both the worst and the most likely consequences. For instance, if we skimp on security, we must understand what risks might arise from unauthorized data access.
00:16:38.799 It used to be easier to keep track of all the bits and pieces of our software ecosystem, but today we rely on numerous algorithms, components, and frameworks whose inner workings we might not fully understand.
00:17:13.280 We collect vast amounts of data for various purposes, heightening the risk of misuse. For example, consider the Cambridge Analytica case, which started out as a promising idea but ended in large-scale misuse of personal data.
00:17:27.920 We must maintain critical awareness of our entire work environment, including applications, data, and the ecosystem where the application lives, as well as the company's hidden goals.
00:17:57.440 For example, Volkswagen did not initially set out to deceive about emissions, but they took actions that led to exactly that. Additionally, we need to stay up to date with known ethical issues, such as bias in machine learning that can produce racist applications.
00:18:31.199 It is important to understand how we construct user personas so that we avoid excluding groups and keep our systems equitable. Furthermore, we need to practice saying no more often.
00:19:32.000 We tend to encourage positivity and compliance, leading to a culture where saying no can feel difficult. But we must train ourselves to say no, especially when it matters.
00:20:12.480 In my experience as a tester, I've often not been the most popular person in the room for raising problems and risks, but it's essential to learn how to voice these concerns. It's crucial to speak truth to power.
00:21:00.560 However, we can't simply hide behind the excuse that we are 'only following orders'; events can go horribly wrong if we do. We must know how to escalate when necessary: if someone dismisses our concerns, we need pathways to raise them higher.
00:21:45.000 Know your own risk tolerance. Assess whether raising a concern may jeopardize your job, your reputation, or potentially someone’s life. Additionally, weigh the risks associated with not raising an issue.
00:22:13.440 If your assessment leads you to decide that your values concerning these risks are in conflict with your role, you might need to reconsider your position. There should be a balance between your responsibilities and your ethical obligations.
00:22:53.440 Lastly, make sure to document important matters adequately. Have a paper trail that you can refer to later, as this can provide necessary context during discussions.
00:23:05.840 I encourage everyone to save this slide with the full list and keep it visible, as these are questions we should consistently consider. Earlier, we talked about a hypothetical team's problem and the potential worst outcomes.
00:23:44.560 Now, I want to transition to the part about being pushed to deliver more, faster, and cheaper. To do this, I will reference Daniel Kahneman's 'Thinking, Fast and Slow,' which, while not about software, explains how our brains work.
00:24:25.920 Kahneman's theory states that humans are governed by two systems. System one is automatic, fast, and impulsive; it makes assumptions and drives our daily tasks.
00:24:53.520 We employ it for routine activities such as finding our way home, simple math, or recognizing friends. On the other hand, we have system two, which is thoughtful, slow, and needs energy and effort.
00:25:06.880 We resort to using system two for analyzing complex issues or tackling difficult math. System one, in an attempt to save energy, leads us to make quick assumptions.
00:25:38.720 Why is this relevant? When solving problems by relying too heavily on system one, we tend to resort to past solutions even if they are not applicable to our current challenges.
00:26:20.000 Finding innovative or better solutions requires engaging system two for fresh ideas. At the same time, it's essential to recognize the value of rapid delivery; small and frequent deliveries are indeed beneficial.
00:27:05.000 Failing fast allows us to learn fast and respond to change rather than having to rework large portions of our applications. However, this same logic can be twisted to justify cutting costs and prioritizing being first to market.
00:27:46.560 The pressures placed on agile teams can divert focus from customer value and team health, pushing for speed and profits instead. We've often attempted to address this by having cross-functional teams; however, that's not enough.
00:28:21.440 Although we might want each team member to embody multiple roles, a heavy focus on cross-functionality leads to context switching, which dilutes flow and drains energy.
00:29:07.200 While cross-functionality can sometimes be productive, handovers often introduce delays and weaken feedback loops, which emphasizes the need to maintain a delicate balance.
00:29:40.000 In doing so, we need to be aware of the trade-offs within our systems and processes. We are fortunate to have frameworks and libraries that simplify the development process.
00:30:10.000 However, relying on third-party components means we may not entirely comprehend how everything operates, complicating repairs when issues arise.
00:30:38.720 In summary, planes don't crash as a result of one catastrophic decision; they fall apart due to a series of small poor choices made along the way. I urge you all to contemplate the reasons behind prioritizing speed.
00:31:07.520 Push back if you sense that speed is prioritized for the wrong reasons, and know your bottom line so that you can act when necessary, or you might lose your way, unsure of how to respond.
00:31:23.440 Never attribute your unsound decisions solely to others or use the excuse of simply following orders. If you're in a position to exert pressure on others, be mindful and refrain from compelling anyone to justify poor decisions.
00:32:03.280 We are all accountable for our actions, but we also have a duty to call out wrong decisions when we witness them occurring. Ending on an unsettling note, we all occasionally take liberties with the truth.
00:32:42.640 We might hide parts of the truth or even lie outright, typically believing that these omissions cause less harm than good. Sometimes, this is done to protect those we cherish, out of fear, or with the intention of gaining something.
00:33:20.800 Constantly making these judgments can lead us astray. I encourage you to reflect on ensuring your risk assessments do not overlook the most critical issues and to be mindful of the motivations behind our decisions.
00:34:02.080 Thank you so much! If you have questions that you don’t want to tackle during the Q&A, feel free to email me, tweet me, or connect with me on any social media!