Talks

Summarized using AI

Finding Responsibility

Caleb Thompson • November 28, 2017 • New Orleans, LA

The video titled 'Finding Responsibility' by Caleb Thompson, presented at RubyConf 2017, focuses on the ethical responsibilities of software developers when creating technology that can have harmful implications.

Key Points:
- Introduction and Title Change: Thompson opens with a content warning and explains that he originally titled the talk 'Finding Responsibility' but renamed it 'Don't Get Distracted'. The core theme is how engrossing technical challenges can distract developers from the ethical implications of their work.
- Internship Experience: Caleb shares his early career experience as an intern at a Department of Defense contractor, where he built a Wi-Fi geolocation application. Initially, he focused on the technical aspects and the excitement of developing a novel tool without considering its potential misuse.
- Technical Details: The software utilized algorithms to locate Wi-Fi signals based on signal strength and the phone's location, ultimately optimizing performance through machine learning, but Thompson repeatedly emphasizes that the software was intended for lethal purposes.
- Realization of Ethics: He eventually realized that the technology could be used to locate and target individuals, prompting reflection on ethical practice within tech development.
- Case Studies: Thompson cites other ethical dilemmas in software development, such as the consequences faced by fellow developers and companies, like Uber's 'Greyball' tool, which was designed to circumvent law enforcement.
- Moral vs. Ethical Frameworks: He distinguishes between ethics, imposed by society, and morals, which are personal beliefs. This distinction is crucial for developers to navigate their responsibilities in their work.
- Call to Action: At the conclusion, Thompson urges developers to consider the implications of their work, prioritize ethical considerations, and actively think about potential misuses of their software. He calls for a culture where developers critically assess the purpose of their projects and the potential harm they may cause, advocating for safeguards against misuse.
- Final Thoughts: He reflects on how even when developers are distracted by intriguing technical problems, it is essential to ask critical questions about the ultimate goals of their projects.

Thompson stresses that as technology increasingly impacts lives, developers hold a significant responsibility to ensure their creations do not cause harm or serve malevolent purposes.

Finding Responsibility
Caleb Thompson • November 28, 2017 • New Orleans, LA


In 2011, with a team of interns at a Department of Defense contractor, I created a Wi-Fi geolocation app to locate hotspots. It could find the location in 3D space of every hotspot near you in seconds. We made formulas to model signal strength and probable distances. We used machine learning to optimize completion time and accuracy.

I was so caught up in the details that it took me months to see it would be used to kill people. What do we do when we discover that we're building something immoral or unethical? How can we think through the uses of our software to avoid this problem entirely?

RubyConf 2017

00:00:10.670 This talk is called 'Don't Get Distracted.' It used to be called 'Finding Responsibility,' but this name worked better. You should be able to tell why after my brief introduction.
00:00:16.109 I'm going to hit the ground running with some potentially disturbing content. It includes references to, but not descriptions of, killing and suicide. Please take the next few minutes to decide on another talk if you'd like to. There won't be any judgment, and if you're sitting in the middle, people will be happy to get out of your way.
00:00:34.410 Yes, these slides are blank. No, they are not broken. I've been a developer for about ten years. I have a bachelor's degree in software engineering. I've worked in software jobs in multiple industries, including at a consultancy where I worked with many clients with varying business needs. I built everything from a social support network for people with Type 1 diabetes to a shipping rate comparison service.
00:00:47.399 I've even worked for banks, building task management software. Now I work for Heroku on the support team, where I help all sorts of developers like you run your code and solve your interesting problems on our service. For the past four years, I've organized 'Keep Ruby Weird,' a community-oriented Ruby conference in Austin, Texas.
00:01:07.310 I'm going to tell you about how I took a job building software to kill people, but don't get distracted by that. I didn't know it at the time. Even before I walked across the stage at graduation, I accepted an offer for an internship. It paid half again as much as the most I'd ever gotten at my highest-paying job up to that point, not to mention that I had just spent four years working low-paid student jobs or living on student loans.
00:01:37.170 I'd be joining a contracting company for the Department of Defense. The Department of Defense, or DoD, is the part of the government made up of the military in the United States. The DoD outsources all sorts of things, from multi-mission maritime aircraft to blue shade 451 poly wool cloth. They also outsource a lot of software projects. At the time, I thought nothing of the fact that I would be helping the military. Besides, they're the good guys, right? My dad was in the military; so was my grandfather. I have great respect for those who serve in our militaries.
00:02:02.790 It was good money, a great opportunity, and a good friend of mine had gotten me the gig. Life was good. I showed up for my first day of work in Northern Virginia, or Nova, as the industry likes to call it. I met the team of other interns and learned about what we would be building: a tool to find Wi-Fi signals using your phone.
00:02:38.160 It seemed pretty cool compared to what I had built up to that point. The most complicated thing I had ever done was an inventory management service. The app didn't concern itself much with persistence; who really needs to stop and restart a program anyhow? The data was right there in memory, and if you forgot how many Aerosmith CDs you had, who cares? Honestly, the idea of finding Wi-Fi routers based on signal strength seemed pretty intimidating.
00:03:04.260 The idea impressed me, but don't get distracted by all this. The software was intended to kill people. I joined the team after they had already started on the project. The gist of the tool was that they would look at how Wi-Fi signal strength changed as your phone moved around. If the signal strength got stronger, you were probably moving closer to the source; if it got weaker, you were probably moving away.
00:03:42.840 To find this information, we'd collect two pieces of data for each Wi-Fi signal in range: your phone's location and the signal strength from that Wi-Fi source. We used those to predict the actual location of the signal. We used a convolution of two algorithms. The first was R-squared: for every point in the search grid, it measured the difference between the signal strength we'd observed and the signal strength we'd expect at that distance.
00:04:06.750 Locations that had the lowest R-squared error were the most likely to be the source of the Wi-Fi signal. We'd combine that calculation with a Gaussian estimation, which builds a probability curve from the normal distribution, or the bell curve that you've probably seen all over the place. It started with an inverted curve, a probability 'hole' of likelihoods that the Wi-Fi signal originated at those distances, representing the idea that the phone was probably not standing right next to the thing you were trying to find.
00:04:39.030 It then added a normal bell curve farther out, representing a high probability that the source was some distance away. The algorithm adjusted the width and height of each of those curves by consulting past measurements. It created a heat map of probabilities for the signal source. We normalized those two probabilities for each location in the search grid and combined them. The combination of the two algorithms was much more accurate than either individually.
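To make the shape of that computation concrete, here is a rough Ruby sketch of the two-factor grid search. This is an illustration under assumptions, not the original MATLAB or Java: the log-distance path-loss model, the curve parameters, and all the names are mine.

```ruby
# Hypothetical sketch of the two-factor grid search. All names and
# constants are illustrative, not from the original project.

# Expected signal strength (dBm) at a distance, using a log-distance
# path-loss model (an assumption; the original model is not described).
def expected_rssi(distance_m, tx_power: -40.0, path_loss_exp: 2.5)
  tx_power - 10.0 * path_loss_exp * Math.log10([distance_m, 0.1].max)
end

# Normal (bell-curve) probability density.
def gaussian_pdf(x, mean, sigma)
  Math.exp(-((x - mean)**2) / (2.0 * sigma**2)) / (sigma * Math.sqrt(2 * Math::PI))
end

# readings: [{ x:, y:, rssi: }, ...]; grid: candidate source locations.
def most_likely_source(readings, grid)
  scored = grid.map do |cell|
    fit_error = 0.0  # R-squared-style error between observed and expected
    prior     = 0.0  # a "hole" near the phone plus a bump farther out
    readings.each do |r|
      d = Math.hypot(cell[:x] - r[:x], cell[:y] - r[:y])
      fit_error += (r[:rssi] - expected_rssi(d))**2
      prior += gaussian_pdf(d, 30.0, 10.0) - 0.5 * gaussian_pdf(d, 0.0, 5.0)
    end
    { cell: cell, fit: 1.0 / (1.0 + fit_error), prior: [prior, 1e-9].max }
  end
  # Normalize each factor across the grid, then combine them per cell.
  fit_total   = scored.sum { |s| s[:fit] }
  prior_total = scored.sum { |s| s[:prior] }
  scored.max_by { |s| (s[:fit] / fit_total) * (s[:prior] / prior_total) }[:cell]
end
```

The scored grid here plays the role of the probability matrix the next paragraph refers to; normalizing each factor over the grid before multiplying keeps either one from dominating.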
00:05:13.020 We stored this probability matrix for each location collected by the phone. Using these, we could give you the distance you were from that Wi-Fi source if you moved in a straight line. If you turned a corner, we could also tell you the direction so that you could find it in two-dimensional space. If you climbed some stairs, we would give you altitude as well.
00:05:32.340 The technology was the most interesting thing I had built up to that point. It may still be the most interesting thing I've built, but don't let that distract you. It was designed to kill people. I mentioned earlier that I have a software engineering degree. At this point, my teammates were much earlier in their educational careers; most of them were a year or two into their four-year programs and were math or computer science majors.
00:05:54.570 My expertise was in the design and process of building software, while theirs was in the theory of mathematics or how computers are used. I'd help translate the working algorithms they built in MATLAB into the Java code that would need to run on Android phones. And let's be honest, we spent plenty of time deciding whether we preferred Eclipse or NetBeans. Can I say how happy I am that as a Ruby developer, I've not had to figure out where to put a JAR file in almost five years?
00:06:39.060 I also spent a lot of time bikeshedding code organization and pairing on performance improvements to the code. It worked, but it took almost seven minutes to find the Wi-Fi sources. This was partially because of the code we translated from MATLAB, which is optimized for working in matrices of numbers; we wrote it in nested loops, quadruple-nested loops with some pretty expensive calculations in the middle.
00:06:58.199 One example is that we were calculating the distance between two points. We used Great Circle distance, which measures the shortest distance between two points on a sphere, such as Earth. The function performing that calculation was hit hundreds of thousands of times for each collection point, often with the same two locations. It was very slow.
00:07:21.180 We solved that by implementing a hash with the keys as the two locations and the values as the distances between them. This at least meant that we didn't have to redo those calculations. That, along with other optimizations we made, sped up the performance from seven minutes to a few seconds.
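Here is a minimal Ruby sketch of that caching trick, assuming latitude/longitude pairs and the haversine formula for great-circle distance; the helper names are illustrative rather than the original code.

```ruby
EARTH_RADIUS_M = 6_371_000.0

# Great-circle distance between two [lat, lon] points, via haversine.
def great_circle(p1, p2)
  to_rad = ->(deg) { deg * Math::PI / 180.0 }
  dlat = to_rad.(p2[0] - p1[0])
  dlon = to_rad.(p2[1] - p1[1])
  a = Math.sin(dlat / 2)**2 +
      Math.cos(to_rad.(p1[0])) * Math.cos(to_rad.(p2[0])) * Math.sin(dlon / 2)**2
  2 * EARTH_RADIUS_M * Math.asin(Math.sqrt(a))
end

# Cache keyed on the pair of points: the expensive trig runs once per pair.
DISTANCES = Hash.new { |cache, (p1, p2)| cache[[p1, p2]] = great_circle(p1, p2) }

def distance_between(p1, p2)
  DISTANCES[[p1, p2].minmax] # order the pair so (a, b) and (b, a) share an entry
end
```

Ordering the pair before the lookup means the cache hits regardless of argument order, which matters when the same two locations recur hundreds of thousands of times.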
00:07:44.969 But don't get distracted; that performance made it faster to kill people. The accuracy of the locations wasn't fantastic either. I don't remember exactly what it was, but the number 45 feet sticks in my head, which is about the length of a shipping container. That's significant when the range of an 802.11n Wi-Fi signal is only about a hundred feet.
00:08:03.719 That meant that we could be almost half the range of that Wi-Fi router away from where it actually was. It's a big error rate. I talked about the Gaussian estimation; the two curves were part of the second algorithm. We hard-coded the numbers that defined the width and depth of those curves. They were only starting points, but they were starting points that we used every time we made this calculation.
00:08:30.569 Does anyone know what a genetic algorithm is? It's a type of machine learning program that produces a set of values that optimize for a desired result. Each of the Gaussian estimation values is a gene; the set of values in genetic algorithm parlance is a genome. The fitness function is what the genetic algorithm uses to measure the performance of a genome.
00:09:11.300 For my data sets of readings that I was using, I knew the actual location of the Wi-Fi source. So, I was able to plug that genome into the geolocation algorithm and measure the distance between the location it found and the location I knew was accurate. That was my fitness score—the smaller, the better.
00:09:45.600 Genetic algorithms take a set of genomes called a generation and keep a certain percentage of top performers. These performers survive to the next generation as clones. Sometimes the algorithm mutates those clones by adding a Gaussian random value to each gene, which means that each of them had a slight chance of performing better or worse.
00:10:10.610 New random genomes were created for the remainder of the population. We saved the top performers across all generations, meaning that when the genetic algorithm finished, I could take the best performer and plug it into our geolocation algorithm permanently. I let the genetic algorithm run over the weekend.
00:10:28.600 It improved the error from around 40 feet to about 10.3 feet, a quarter of the original. That's better than the GPS accuracy of the phones we were collecting data from, which means the result probably wasn't genuinely that accurate. That's called overfitting, and it's solved by keeping separate test and training data sets.
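As a toy illustration of that loop (a hedged sketch, not the original implementation), assume a `fitness` callable that runs the geolocation algorithm with a candidate genome of curve parameters and returns its error against the known location; smaller is better.

```ruby
# Hypothetical genetic-algorithm loop; population sizes, mutation rate,
# and the fitness interface are illustrative assumptions.
def evolve(fitness, genome_size:, population: 50, generations: 100,
           survivors: 10, mutation_sigma: 0.1)
  rng = Random.new
  random_genome = -> { Array.new(genome_size) { rng.rand(-1.0..1.0) } }
  # Box-Muller transform: turns uniform randoms into a Gaussian random value.
  gauss = -> { Math.sqrt(-2 * Math.log(1 - rng.rand)) * Math.cos(2 * Math::PI * rng.rand) }

  pop  = Array.new(population) { random_genome.() }
  best = pop.first

  generations.times do
    ranked = pop.sort_by { |g| fitness.(g) }
    best   = ranked.first if fitness.(ranked.first) < fitness.(best)
    elite  = ranked.first(survivors)  # top performers survive as clones
    # Mutate the clones by adding a Gaussian random value to each gene.
    mutants = elite.map { |g| g.map { |gene| gene + mutation_sigma * gauss.() } }
    # Fill the rest of the next generation with fresh random genomes.
    fresh = Array.new(population - elite.size - mutants.size) { random_genome.() }
    pop = elite + mutants + fresh
  end
  best
end
```

Scoring the winning genome against a held-out set of readings, rather than the readings it was trained on, is what guards against the overfitting just described.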
00:10:54.890 I love this stuff—genetic algorithms, R-squared, Gaussian random values—it's the sort of thing that you hear about and learn in school but are told you'll never use again after you graduate. But we were using it in a real project! That's great, but don't let that distract you; this made it easier to kill people.
00:11:45.210 The location algorithm now worked accurately and quickly. The next feature was to add tracking for a moving Wi-Fi access point. I briefly wondered why a Wi-Fi access point would be moving, but that wasn't as interesting as figuring out how to find it with our code. We made use of Kalman filters to observe three state variables: the position, velocity, and acceleration of the Wi-Fi signal source.
00:12:04.370 Given these values and the time since the last measurement, a Kalman filter can improve the current prediction surprisingly well. It throws away values that have low accuracy automatically. Each time we ran the real-time algorithm, we'd also run the Kalman filter with only that information.
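Here is a minimal sketch of that kind of filter, assuming a single dimension, a constant-acceleration model, and Ruby's standard-library Matrix class. The noise values are illustrative assumptions; this is not the original code.

```ruby
require 'matrix'

# Tracks position, velocity, and acceleration from noisy position readings.
class PositionTracker
  def initialize(process_noise: 0.1, measurement_noise: 25.0)
    @x = Vector[0.0, 0.0, 0.0]        # state: position, velocity, acceleration
    @p = Matrix.identity(3) * 500.0   # large initial uncertainty
    @q = Matrix.identity(3) * process_noise
    @r = measurement_noise            # variance of one position measurement
    @h = Matrix[[1.0, 0.0, 0.0]]      # we only observe position
  end

  # dt: seconds since the last measurement.
  def update(measured_position, dt)
    # Predict: advance the state assuming constant acceleration over dt.
    f = Matrix[[1.0, dt, 0.5 * dt * dt],
               [0.0, 1.0, dt],
               [0.0, 0.0, 1.0]]
    @x = f * @x
    @p = f * @p * f.transpose + @q

    # Correct: the Kalman gain weighs the reading by its expected accuracy,
    # so low-accuracy readings are largely discounted automatically.
    innovation = measured_position - (@h * @x)[0]
    s = (@h * @p * @h.transpose)[0, 0] + @r
    k = @p * @h.transpose * (1.0 / s)
    @x += (k * innovation).column(0)
    @p = (Matrix.identity(3) - k * @h) * @p
    @x[0] # current best position estimate
  end
end
```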
00:12:34.150 The filter was able to produce an estimate that was more accurate than the raw calculated value. At the same time, we added the ability to track more than one Wi-Fi signal. We'd filter our collection of readings by the unique identifier of each signal, known as the MAC address.
00:12:58.220 The filtered data sets each went through the full algorithm to produce predictions. We used the phones' APIs to pull data for every Wi-Fi signal in range: basically, anything you can see in your Wi-Fi connection list. It was all very exciting! These seemed like academic problems, but we were getting to use them in a real-world project.
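The per-signal split is essentially a group-by on MAC address. A short sketch, reusing the hypothetical `most_likely_source` helper from the earlier sketch and an assumed reading shape:

```ruby
# readings: [{ mac:, x:, y:, rssi: }, ...], one entry per observation.
# grid: the candidate source locations, as in the earlier sketch.
predictions = readings
  .group_by { |r| r[:mac] }                                  # one bucket per signal
  .transform_values { |set| most_likely_source(set, grid) }  # full pipeline per source
```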
00:13:26.050 Being a programmer was going to be great, but don't let that distract you. This meant that we were able to kill multiple people with our software. We had been working with the project owner throughout this process, who was fairly low-key about most things. He'd check in maybe once a day and then go back to the part of the building dedicated to undercover and classified work.
00:14:13.979 Whenever we hit one of these milestones, he'd be happy about it, but a question always came up. He wanted us to sniff out signals put out by phones in addition to those put out by Wi-Fi hotspots. It's a much harder problem from a technical perspective. The functionality necessary to do this is called promiscuous mode; it's a setting on the wireless network controller.
00:14:56.060 Neither Android nor iPhones support this option, so we'd have to either root or jailbreak the phones. We looked for packages that would help us sniff out these packets as they were sent back to the router. We found something on SourceForge, but it wasn't well documented, and we didn't understand it very well.
00:15:53.790 We told the project owner we'd get back to this later. None of us thought it was that important; we had the technology to find these Wi-Fi sources, and that was the goal. Each time we demonstrated a new and exciting tech, though, the same question came up: 'Great, does it find phones?'
00:16:22.610 It's taking seconds instead of minutes. Great! Does it find phones? We looked into finding phones, and it didn't seem likely, but we'd get back to it later. We got moving targets working. Great! Does it find phones? I had been distracted by all the cool problems we were solving—finding nodes, speeding things up, making more accurate predictions. It was all so much fun.
00:16:53.300 I hadn't thought about why we were putting all this work into finding a better place to sit and get good Wi-Fi. That doesn't even make sense if you think about it for more than a few seconds. 'Does it find phones?' This was never about finding better Wi-Fi. It was always about finding phones.
00:17:22.110 Phones carried by people. Remember, I said I worked for a Department of Defense contractor. The DoD is the military. I was building a tool for the military to find people based on where their phones were and kill them.
00:17:51.890 I tried to rationalize this. The military is in place to protect truth, justice, and the American way. But this is the same time that we found out that the government had been spying on us in the United States with drones and that they'd lent out that technology to local, state, and federal government officials—law enforcement officials—over 700 times to run their own missions.
00:18:11.990 The military is the tool of the government, and it seemed like we couldn't trust the government as much as we thought we had. I didn't want to be a part of something that was going to be used to kill people, especially since I would never know who it was used against, let alone have a say in that decision.
00:18:43.790 I rationalized it then too: we were interns, we didn't have a clearance. The projects that this company did for the government were top secret. I wasn't allowed to know what they were. My work could probably get shelved and forgotten about. This was an extreme example of code that was used in a way that the developer did not intend.
00:19:21.800 The project owner conveniently left out its purpose when he was explaining the goals, and I conveniently didn't look too hard into that. It was great pay for me at the time, a great project. Maybe I just didn't want to know what it would be used for.
00:19:54.850 I let myself get distracted. I was distracted by the technology, but it would be just as easy to be distracted by a cool framework that the company was using, the great design of an app that would look super good on your portfolio, or some really nice office amenities.
00:20:32.680 Maybe others are doing the same thing, and clearly they've already thought about it, so it must be fine. There are other examples of code used in ways it wasn't intended, and of code that just does bad things. A year and a day ago, a developer named Bill Sourour wrote a blog post.
00:21:07.680 It opened with the line, 'If you write code for a living, there's a chance that at some point someone will ask you to code something a little deceitful, if not outright unethical.' Bill had been asked to create a quiz that would almost always give a result that benefitted the client. Bill worked in Canada, and in Canada, there are laws in place that limit how pharmaceutical companies can advertise prescription drugs to patients.
00:21:45.500 Anyone can learn about the general symptoms addressed by a drug, but only those with a prescription can get specific information. Because of this law, the quiz was posing as a general information site rather than an advertisement for a specific drug. Unless the user answered that they were allergic to the drug or already taking it, every result said, 'Ask your doctor about this drug.' That's what the requirements said to do, and that's what Bill coded up.
00:22:27.920 The project manager did a quick check and told Bill that it didn't seem to be working; it always gave the same answer. Bill pointed to the requirements, and the project manager said, 'Oh, okay.' A little while later, Bill got an email from a colleague. It contained a link to a news article: a young woman had taken the drug that Bill had written this quiz for—she had killed herself.
00:22:51.360 It turns out that some of the side effects of this drug are severe depression and suicidal thoughts. Nothing Bill did was illegal. Like me, Bill was a young developer, making great money and doing what he was told. The purpose of the site was to push a specific drug; that's why it was being built. Bill had chalked it all up to marketing. He never intended for this to happen. Maybe Bill got distracted too.
00:23:23.320 In his conclusion, Bill writes, 'As developers, we're often one of the last lines of defense against potentially dangerous and unethical practices. We're approaching a time where the software we build will drive vehicles that transport your family to soccer practice. There are already AI programs that help doctors diagnose disease. It's not hard to imagine them recommending prescription drugs soon too.'
00:24:06.370 The more software continues to take over every aspect of our lives, the more important it is for us to take a stand and ensure that our ethics are ever-present in our code. Since that day, I always try to think twice about the effects of my code before writing it. I hope that you will too.
00:24:30.250 I think it's poignant that all of the examples that Bill listed as something that might happen in the future already happened today. Bill's story isn't that far off from mine. But there are still other examples. Earlier this year, a story came out about Uber.
00:24:55.460 It had built into its ride-sharing app code called Greyball: a tool that could populate the screen with fake cars when the app was opened by users Uber believed were violating its terms of service. In a statement, Uber said this program 'denies ride requests to users who are violating our Terms of Service,
00:25:30.490 whether that's people who seek to physically harm our drivers, competitors looking to disrupt our operations, or opponents who collude with officials on secret stings meant to entrap drivers.'
00:25:56.100 The New York Times reported it was used in Portland to avoid code enforcement officers looking to build a case against Uber for operating without a license. When triggered by Uber's logic, it populates the app with cars that don't exist, with fake drivers who quickly cancel after accepting a ride.
00:26:16.030 I'm not a lawyer, but this seems like an obstruction of justice itself—a crime. Outside of the illegal activities in Portland, Greyball is used even today, though mostly outside the United States. I'm a huge fan of ride-sharing services. It's not uncommon to see in the news these days articles about drivers doing heinous things.
00:27:01.035 Greyball may have enabled some of those things to happen. Again, this is an unintended consequence of a tool that was built. Maybe the internal pitch was that Greyball would only target users violating the terms: people under 18, people who didn't pay to clean up their explosive accident last night. Rather than blocking them, which might just prompt them to create a new account, Uber could put them into an alternate dimension where they could never get a ride.
00:27:35.050 For some reason, that’s fine. If these developers had thought about the worst possible use for this code, this circumvention of justice might have come up earlier and could have been addressed. Maybe they were distracted by the face value of the request, rather than looking deeper at its purpose and users.
00:28:29.840 There are all sorts of things that aren't as black and white; apps that listen to the microphone to tailor ads to you based on what you say near your phone, websites designed to exploit psychology to take up as much of your free time as possible, which have been linked to exploding rates of depression, and any number of apps that opt you into an email newsletter.
00:29:04.110 These aren't as obviously bad as the previous examples, but at least in my opinion, they're still kind of shady. This value system is different for others. Richard Stallman, for example, believes that eBooks are unethical. Others may think that's a little eccentric, but that viewpoint agrees with his overall system of beliefs.
00:29:48.240 There are actually words for what society decides is good or bad versus what you, or someone like Richard Stallman, individually believe: ethics and morals. Modern psychology more or less uses these terms interchangeably, but a shared understanding between you and me will be useful later on. Ethics are imposed by an outside group: a society, a profession, a community like ours, or even where you live.
00:30:37.200 Religions provide ethical systems; so do groups of friends. Society, in whatever form, determines right and wrong, good and bad, and imposes those definitions on its members. At the local, state, and national levels, ethics are often codified into law.
00:31:11.060 Morals, on the other hand, are a more personal version of the same thing. Society as a whole imposes mores on smaller communities, and all of that trickles down to the individual level. That's not to say that your morals can't conflict with society's ethics. Maybe you believe that freedom of speech is a core tenet of human rights, but you live somewhere where expressing political or religious dissent is considered wrong.
00:31:58.390 Let's not get distracted by morals and ethics just yet, though; we'll come back to those later. The unifying factor of all these stories is that developers built code that did these unethical or immoral things. As a profession, we have a superpower: we can make computers do things. We build tools, and ultimately, some responsibility lies with us to think through how they're going to be used, not just what their intention is, but what misuses might come out of them.
00:32:34.540 None of us wants to be building things that will be used for evil. The Association for Computing Machinery is a society dedicated to advancing computing as a science and profession. The ACM includes this in its Code of Ethics and Professional Conduct: 'Well-intended actions, including those that accomplish assigned duties, may lead to harm unexpectedly. In such an event, the responsible person or persons are obligated to undo or mitigate the negative consequences as much as possible.'
00:33:01.320 One way to avoid unintentional harm is to carefully consider potential impacts on all of those affected by decisions made during design and implementation. So, how can we carefully consider potential impacts? Honestly, I don't have the answers. I don't think there is a universal answer because I have to believe that if there were, we wouldn't have this problem; we wouldn't be building this code in the first place.
00:33:42.740 I do have a couple of ideas, though. One I got from my friend Nimes is to add to the planning process a step where we come up with the worst possible use for our software. For example, opting in folks to a mailing list by default—the worst case is probably that we send them a bunch of emails and they unsubscribe, or maybe they stop being our customer.
00:34:56.450 Ash, a team member, said, 'Am I willing to sell my hypothetical startup's soul for a bigger mailing list, especially when that might be all that keeps my company afloat? Yeah, why not?' And I can see that; I understand it. I still think it's a little bit shady. It's not a best practice, but it's not physically hurting anyone.
00:35:24.640 If I had sat down and thought through what my code would be used for when I was building the Wi-Fi location app, I would have come to a much different conclusion. Actually, I think that thinking through the worst possible uses of code could be a fun exercise; you may come up with some pretty wacky examples.
00:35:50.230 If we send Batman an email and he has notifications for new emails on his phone, then he might be looking at his iPhone when the Riddler drives by in the Riddler car, and he might miss getting off his witty one-liner at the crime scene. Instead it'll be the Riddler who says, 'Who's afraid of the big black bat?' It's not so plausible, but it shows that you can come up with some pretty wild, out-there examples that aren't obvious at first glance.
00:36:41.350 Another thing that I think I should have done, and that we can all do more of, is just not take the request at face value. The project owner at the defense contractor didn't spell out what the code would be used for, but at least in retrospect, it wasn't a big jump in logic. 'We're going to build an app to find Wi-Fi signals,' is all true, but it's not the whole truth.
00:37:17.540 Asking myself or them why often enough might have led me to an earlier understanding. Why find the Wi-Fi signal source? Why go to them? Why, why, why? Comedian Kumail Nanjiani, best known for the TV show Silicon Valley and his recent film The Big Sick, took to Twitter recently on this subject.
00:37:56.120 He said, 'I know there's a lot of scary stuff in the world right now, but this is something that I've been thinking about that I can't get out of my head. As a cast member on a show about tech, our job entails visiting tech companies, conferences, etc. We meet people all eager to show off the tech. Often, we'll see stuff that's scary. I don't mean weapons; I mean altering videos, stuff with obvious ethical issues.'
00:38:33.450 And we'll bring up our concerns to them. We're realizing that zero consideration seems to be given to the ethical implications of tech. They didn't even have a pat rehearsed answer—they're shocked at being asked—which means that nobody is asking these questions. 'We're not making it for that reason, but if people choose to use it that way, it isn't our fault.' Safeguards will develop, but tech is moving so fast, there's no way humanity or laws can keep up.
00:39:36.650 We don't even know how to deal with open death threats online. What can we do? At the very least, 'it isn't our fault' is something we should never say. We've seen that same blasé attitude in how Twitter and Facebook deal with abuse and fake news. Tech has the capability to destroy us. We see the negative effects of social media, and no ethical considerations are going into the development of these technologies.
00:40:07.730 You can't put this stuff back in the box; once it's out there, it's out there. And there are no guardians—it's terrifying. It's a major problem when we're given so much power in tech, but we can't do anything to ensure that its uses are safe.
00:40:39.520 Thinking about what we're doing and being careful not to build things that can be used maliciously is really important, and it's the least that we can do. This is Chelsea Manning, in an interview with The New Yorker Radio Hour, discussing the ethics of what developers do: I guess technologists should realize that we have an ethical obligation to make decisions that go beyond just meeting deadlines and creating a project.
00:41:17.839 Let's actually take some chunks of time and ask, 'What are the consequences of this system? How can this be used? How can it be misused? Let's try to figure out how we can mitigate a software system from being misused or decide whether we want to implement it at all.' There are systems that, if misused, can be very dangerous.
00:41:59.940 Don't get distracted by deadlines and feature requests. Think about the consequences of what you're building. Build in safeguards to prevent misuse, or don't build it at all because it's too dangerous. I'm asking you to do something about this. Well, I guess I'm asking you not to do something because of this.
00:42:37.040 It's only fair that we talk about when and where to take that stand. Let's say I had a time machine and could go back to 2011 and do it all over again. The tool has just been explained to me for the first time. What do I say? I think the first thing is to establish a mutual understanding of the task.
00:43:02.270 It's entirely possible at this point that I don't understand what the actual thing is, and I'm overreacting. I ask, 'Why are we finding these signals?' The project owner says, 'We want to find people's cell phones.' 'Who is finding them and why?' 'I don't know; probably some soldiers in the Middle East.' 'Why?' I repeat.
00:43:42.490 'I can't tell you that.' I got that a lot from the project owner. It's code for 'I have a clearance and I know things about this project. I know what you're asking, and I know the answer, but I'm not allowed to tell you.' At this point, I think we have our mutual understanding: the task is to help soldiers find people's phones, probably attached to those people.
00:44:24.410 The reason is left unsaid, but I think we both know it. This organization is a defense contractor; they build things for the military—that is their core competency. They're not going to stop doing that. On the other hand, I care a lot about not killing people.
00:44:51.930 The company's goal is to build things for the military. The military is necessary, and sometimes they need to use force. Personally, however, I don't want to be a part of that. If my goal is not to build those tools, then there isn't a good fit for me at this company. That probably means that the worst-case scenario here is that I'm going to leave today without a job.
00:45:22.770 Either I'll say no, and they'll fire me, or I'll say I'm not comfortable with this, and I'll quit. Those are worst-case scenarios—they're not necessarily what's going to happen. So before I do this, I need to consider some questions: can I afford to leave here without this job financially?
00:45:58.610 Can I rely on my personal network to get me a new job? Have I built up enough trust with my employer where I can go to them and be heard out? The answer for me at the time was no to all of those questions. Sometimes something is important enough that you still need to do it, but that's a very personal decision. There's a lot to go into these decisions, and they have consequences.
00:46:30.920 I would like to think that I would still say no now. Let's look at another situation, where someone did the ethical thing. A developer we'll call Alice received a strange request: 'We want to identify weak passwords in our system and notify users to change them. We'd like you to run a password-cracking tool on our very large database of users.'
00:47:07.320 Alice thought this was kind of a strange request, but she said that if the appropriate paperwork were filed, she would do it. The paperwork came through, and she ran the crack. The next request was: 'We'd like the list of users' email addresses, as well as their passwords.'
00:47:43.050 Alice knew that her co-workers had a valid desire to help customers improve their passwords. She also knew that a lot of users reused credentials across websites. If a report was created combining those two pieces of information, it could be misused to log into those users' accounts on other websites.
00:48:31.540 Alice pointed this out to her manager, and together with the customer success team, they designed an email that didn't include the password. Customers received notifications about their weak passwords and were able to change them. Nobody got fired, and Alice built up trust within her team.
00:48:56.560 Different scenarios require different ways of thinking about what you should do. Sometimes the right thing to do is to say nothing and just do the work. It isn't a simple thing, and it has consequences, but don't get distracted by having to think about it; sometimes your code can kill people.