Ethics
Contractualism + Software Engineering: We're All In This Together
Summarized using AI


by Katya Dreyer Oren

In the talk "Contractualism + Software Engineering: We're All In This Together," presented by Katya Dreyer Oren at RubyConf 2021, the concept of contractualism is explored as an ethical framework for software engineers. The speaker emphasizes the importance of treating users as part of a larger community and making ethical decisions in software development.

Key points discussed include:
- Understanding Morality: Katya begins by asking where morality comes from, setting aside divine authority as a foundation because of the paradox it creates. Instead, she focuses on contractualism as a way to evaluate moral actions against principles that everyone involved could reasonably accept.
- Ethical Frameworks: Katya briefly explains the main families of ethical theory: agent-centered ethics, which assesses the morality of the individual; consequentialism, which considers the outcomes of actions; and non-consequentialism, which considers the intentions behind actions. She argues that contractualism synthesizes the strengths of all three.
- User Stack Concept: Introducing the term "user stack," Katya categorizes users into different groups (employees, engineers, customers, end users, and those indirectly affected by the product) and highlights how decisions affect each group when designing software ethically.
- Responsibility of Software Engineers: Katya discusses the need for a code of ethics in software engineering, similar to other professions like law and medicine, which can enforce standards through licensing. She advocates for using contractualism as a guideline when making tech decisions.
- Real-World Examples: Several examples illustrate the discussion:
  - The consequences of poor documentation in code and its ripple effect on teams.
  - YouTube's pricing discrepancies based on platform choice, which disproportionately affect low-income users.
  - A critique of shift-scheduling software that includes a toggle allowing employers to avoid providing benefits to workers, thereby exploiting them.
  - The ethical implications of facial recognition technology and its misuse, exemplified by the wrongful arrest of Robert Williams.
- Practical Applications: Katya concludes by encouraging engineers to reflect on their actions and consider the feelings and potential objections of others, promoting the idea that ethical considerations should be integral to the software development process.
- Takeaway Messages: The crux of the presentation is that software engineers have not just the capacity but also a responsibility to ensure their work benefits their communities, advocating for ongoing engagement with ethical considerations in technology development.

In summary, the talk stresses that achieving ethical outcomes requires collective moral deliberation and a commitment to considering the impact of technology on all user groups involved.

00:00:10.960 Welcome.
00:00:12.240 My name is Katya, and this is a presentation on contractualism and software engineering.
00:00:14.639 We’re all in this together.
00:00:16.320 We’ve all seen this ad, right? We know it well, and if you don’t have YouTube Premium, something about it doesn’t feel quite right.
00:00:20.240 In this talk, we’re going to discuss why this feels wrong and what you, as software engineers, can do when you’re presented with something similar that doesn’t sit right.
00:00:24.560 Just a couple of things about me: I'm Katya, a software engineer at Heroku.
00:00:27.920 I am a second-career engineer; in a previous life, I was a part-time jazz singer.
00:00:32.480 Additionally, I am an enthusiast of all things chocolate and I am owned by a cat.
00:00:37.360 Before we begin, we need to zoom way out to understand where morality actually comes from. How do we, as human beings, know what is moral and what is immoral? We need to comprehend this before we can figure out how it applies to us as technologists.
00:00:49.600 So, here’s one thing we won’t use as a foundation for morality: God. This is not a referendum on religion, but the idea of deriving morality from God presents us with a paradox.
00:00:51.920 If an action is good simply because God has commanded it, then God could demand all sorts of terrible actions, and we might have to comply even if they feel wrong. Conversely, if God requires certain actions because they are good, then those actions would inherently be good regardless of God's demands. Therefore, we're leaving God out of today’s conversation.
00:01:01.680 Now, why did I choose contractualism as the ethical framework for this discussion? There are many great ethical frameworks available, but I believe contractualism encapsulates some of the best principles.
00:01:06.479 Let’s categorize the different ethical theories briefly. There are three main buckets: agent-centered ethics, which focuses on the moral status of individuals; consequentialism, which emphasizes the outcomes of actions; and non-consequentialism, which centers on the intentions behind actions.
00:01:13.680 The flaw with agent-centered ethics is that it judges the person rather than the action, so good people can still perform bad actions. Consequentialism permits individuals to carry out heinous acts as long as the overall outcome is perceived as good, a problem that is prevalent in the tech industry.
00:01:20.800 Non-consequentialism, while focused on intentions, neglects outcomes, which matter just as much. What I appreciate about contractualism is that it takes the best aspects of these ethical theories and integrates them into a practical ethical framework.
00:01:27.520 Let’s watch a quick video from one of my all-time favorite shows, The Good Place. Hopefully, it will buffer in time.
00:01:39.200 [Video Clip from The Good Place]
00:02:03.680 As Chidi explains in this clip—highly recommend The Good Place if you aren't already watching it—the book he refers to is T.M. Scanlon’s seminal work, 'What We Owe to Each Other.' Essentially, contractualism states that an action is wrong if any principle that permits that action could be reasonably rejected.
00:02:15.200 For example, in the clip, Eleanor posits a rule that no one can veto her decisions; that would likely get rejected on principle. We determine whether something is moral by asking our community, highlighting the importance of human connection.
00:02:25.680 So, how does this apply to software engineering?
00:02:30.640 Let’s consider several professions—doctors, lawyers, and architects—all of which have a code of ethics. Violating these codes has severe consequences, including losing one’s job or license. In contrast, technology has a much lower barrier to entry.
00:02:41.600 As a senior software engineer at Heroku, I recognize that while this lower barrier can be beneficial, it makes a code of ethics hard to enforce in our field; we lack the licensing bodies that give other professions that kind of leverage.
00:02:53.680 Since we cannot mandate a code of ethics, I propose that we adopt contractualism as a guiding framework. So how can we leverage contractualism to enhance our decision-making as software engineers?
00:03:07.360 I’ve coined a term, 'user stack'; it borrows from a Linux kernel term but means something different here.
00:03:12.080 In tech, we frequently hear about a tech stack—like a React front end and a Rails back end. However, on the user side, we have both internal and external users. When we think of a user, we often envision the end-user—those interacting with our products.
00:03:24.080 But I want to expand our definition of users. Each user group can be considered a mini contractualist community, allowing us to use contractualism to evaluate whether we are treating these groups ethically.
00:03:35.680 First, we have the company as an entity. We must ask whether the company treats its employees well and if the benefits, salary, and work environment are adequate.
00:03:46.240 As an individual, I am also a user—not just of the product but of my own code. When I look back at my earlier work, am I satisfied with it six months later? Have I documented my code adequately for future use?
00:03:57.680 Next, we examine fellow employees. Is the codebase maintainable and readable? Are our tools up to date? Are new engineers able to onboard seamlessly? These considerations affect the entire organization.
00:04:07.040 Customers, another group in this user stack, have different needs. Are our products appropriate for their budget and do they provide utility?
00:04:18.560 End users are the most straightforward group—we need to ensure our products are intuitive and accessible for all types of users. This means creating experiences not just for users who resemble our engineering team but for everyone.
00:04:30.760 Then we have users affected by those utilizing our product—those who may not have consented directly but are influenced by the outcomes. Are we treating these individuals ethically?
00:04:39.120 Finally, we need to evaluate whether our actions are beneficial for the world at large. In tech, it is often challenging to conclude whether our work is positively impacting society.
00:04:53.440 Bringing this back to contractualism, actions are morally acceptable when they can be justified to each affected person on principles that no one could reasonably reject. I view contractualism as a collaborative effort.
00:05:05.920 Moral individuals form and understand themselves through their interactions with others. Thus, as engineers, we must consider every type of user in this stack and make decisions that are defensible across the board.
00:05:14.560 This can be challenging, especially when we’re asked to build specific features or fix bugs very quickly. Still, contractualism encourages us to look for solutions that everyone can accept, ones that no one could reasonably reject on the grounds that our work harms them.
00:05:26.400 This is not about finding a perfect solution; rather, it’s about finding one that can be justified across various perspectives. Our humanity informs our work as engineers, and it’s vital that we treat our users with respect.
00:05:37.920 Let’s explore a couple of examples. For instance, we often come across comments in code like 'I don’t know what this does' or 'I’m not sure if this is documented or tested.' Comments like these signal poor practices that will confuse future developers.
00:05:48.960 In our user stack, we should analyze who benefits from such coding practices and who is harmed. Poor documentation or hard-to-read code may benefit the person writing it at the time but ultimately harms everyone else who has to work with it later.
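To make this concrete, here is a small hypothetical Ruby sketch (it is not from the talk, and the method names and discount values are invented): both versions behave identically, but the first leaves a comment that only records confusion, while the second explains itself to the next reader.

```ruby
# Hypothetical before/after; the names and values here are invented for illustration.
DISCOUNT_RATES = { "launch" => 0.10, "loyalty" => 0.05 }.freeze

# Before: the comment documents confusion instead of behavior.
def adjust(total, code)
  # I don't know what this does -- not sure if it's tested
  total - (total * DISCOUNT_RATES.fetch(code, 0))
end

# After: the same logic, named and documented for whoever reads it next.
# Applies the percentage discount registered for `code`; unknown codes
# leave the total unchanged.
def apply_discount(total, code)
  total - (total * DISCOUNT_RATES.fetch(code, 0))
end

puts apply_discount(100, "launch") # => 90.0
```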
00:06:00.880 Returning to the earlier YouTube example, users often don’t realize that the price of a subscription differs depending on how they sign up, for example on the web versus through the mobile app. That discrepancy benefits the company but harms users who may not realize they are paying more.
00:06:12.240 Research shows that individuals with lower incomes are less likely to own desktop computers, so they miss out on the cheaper way to subscribe. What seems like a minor annoyance to some carries real costs for those who can least afford it.
00:06:24.560 Another example can be found in the book 'Weapons of Math Destruction' by Cathy O’Neil, which discusses shift scheduling software that allows managers to reduce employee hours to avoid providing benefits.
00:06:36.960 Product teams may convince themselves that such features are beneficial, yet they harm employees by denying access to vital benefits such as paid time off or healthcare, ultimately burdening both the workers and the broader society.
00:06:50.960 My final example involves facial recognition software, which has led to wrongful arrests due to its inaccuracies. The case of Robert Williams highlights the potential dangers of relying solely on technology without ethical considerations.
00:07:00.960 Law enforcement officers should treat a facial recognition match as one investigative lead rather than as the sole basis for an arrest. When the software is not used responsibly, the consequences for the people it misidentifies are serious.
00:07:12.240 Despite guidelines recommending this more limited use, the producers of this software often do little to ensure that law enforcement agencies actually follow them. That neglect contributes to wrongful imprisonment and reflects systemic issues in how the technology is deployed.
00:07:24.560 As a community, tech professionals must examine how their work contributes to societal issues and make sure that even minor decisions don’t ripple out into harmful outcomes. We can’t allow convenience to overshadow rigorous ethical responsibility.
00:07:35.760 Throughout this talk, I urge all software engineers to reflect on the decisions made in their work. As engineers, we can choose not to build something when it is unethical or harmful and instead pursue responsible practices.
00:07:49.040 Some practical tips I recommend: foster a blameless atmosphere where ethical concerns can be discussed openly, and consider running ethics retrospectives in the same spirit as post-incident reviews, without placing blame.
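As one way to structure such an ethics retrospective, here is a hypothetical Ruby sketch (not something prescribed in the talk; the grouping and prompts are invented) that walks the user stack from earlier and attaches a question to each group, which a team could answer per feature the same way it would review an incident.

```ruby
# Hypothetical ethics-retro prompt list built around the "user stack"
# described earlier in the talk; the structure and wording are invented.
USER_STACK_PROMPTS = {
  "company"            => "Does this decision treat our own employees fairly?",
  "me"                 => "Will I still be able to maintain and stand behind this in six months?",
  "fellow engineers"   => "Can teammates read, test, and onboard onto this code?",
  "customers"          => "Is this useful and fairly priced for the people paying for it?",
  "end users"          => "Is it intuitive and accessible for everyone who touches it?",
  "affected non-users" => "Who is impacted without ever consenting to use the product?",
  "the world at large" => "Could anyone reasonably reject the principle behind this feature?"
}.freeze

def ethics_retro(feature)
  puts "Ethics retro for: #{feature}"
  USER_STACK_PROMPTS.each do |group, prompt|
    puts "- #{group}: #{prompt}"
  end
end

ethics_retro("shift-scheduling benefits toggle")
```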
00:08:01.200 In our current tech landscape, engineers have significant power. It's essential to use that power wisely and advocate for ethical practices, especially if you find yourself in a troubling ethical environment.
00:08:12.800 There are tools for those in uncomfortable situations, such as seeking support from a trusted manager or reporting suspected violations in line with company policy. Collective action often proves more effective than going it alone.
00:08:24.400 As engineers, we provide the building blocks for the technology that shapes societies. With that comes a responsibility—if we discover harmful practices, we possess the power to halt or correct them.
00:08:36.400 Despite challenges, do not give up on ethical decision-making even if outcomes do not always favor your ideals. Every effort made in promoting ethics contributes to more conscientious practices in the future.
00:08:48.960 In conclusion, there may not be external rewards for ethical behavior, but true fulfillment comes from knowing you’ve acted in alignment with your values and those of your community.
00:09:00.640 Thank you so much.