Dishonest Software: Fighting Back Against the Industry Norms


Jason Meller • November 08, 2021 • Denver, CO

The video, "Dishonest Software: Fighting Back Against the Industry Norms," presents a talk by Jason Meller at RubyConf 2021 focusing on the ethical implications of software development and the pervasive issue of dishonest software across industries. Meller emphasizes that while many developers enter the field with good intentions, they often find themselves contributing to dishonest practices because of prevailing industry norms and incentives.

Key Points Discussed:

- Definition of Dishonest Software: Meller differentiates between dishonest and merely legal software, underscoring that legality does not equate to ethicality.
- Personal Story: He shares a personal anecdote from his time at General Electric, where his team mistakenly deleted a contractor's personal photos, believing them to be part of a cybersecurity threat. This incident highlighted the potential harm of unexamined practices in the name of security.
- Advanced Persistent Threats (APT): Meller explains the concept of APT, particularly in the context of cyber espionage, detailing how attackers gather intelligence from a company like GE involved in defense contracts.
- Legal Oversight: He discusses the Electronic Communications Privacy Act, which grants companies broad rights to monitor employees. He notes that such practices can be rationalized under the guise of protecting national security but can lead to unethical outcomes.

- Informed Consent: Meller advocates for transparency and informed consent in software design, suggesting that if a product's core function would break under full disclosure, it is inherently dishonest.
- Industry Examples: He cites FullStory, a session-replay vendor whose tool led to a lawsuit against its customer Nike for allegedly recording users without their knowledge.

- Advocacy for Ethical Practices: Meller encourages engineers to leverage their skills to foster a culture of honesty and accountability in software design, stating that doing so can become a competitive advantage in the market.

Conclusions and Takeaways:

- Developers possess the ability to enact meaningful change within their organizations by prioritizing honesty over compliance.
- Organizations should strive for a culture of transparency, enabling better relationships with users and clients.
- It is crucial for developers to recognize the ethical dimensions of their work and act as advocates for privacy rights.

Meller's talk underlines the importance of being vigilant and ethical in software development, urging attendees to reflect on their practices and the potential impact of their work on both users and society at large.


From daemons that conceal themselves, to apps which lie to us, every day you're impacted by software with dishonest intentions.

No one starts their career building dishonest tools, but over time, the norms & incentives in specific industries (ex: infosec, advertising) can compromise the ethics of even the most principled developer.

In this talk we will...

- Define dishonest software using examples & counter-examples
- Arm you with compelling arguments to convince product leadership to build ethical software
- Explore how engineers can advocate for the data privacy rights of others

RubyConf 2021

00:00:10 We made it! We're right at the end. Did everybody have a good time? This is my first RubyConf. I've been going to RailsConf for a few years, but I'm so excited about how many people I got to meet and the camaraderie here.
00:00:22 I'm really excited that you guys actually stuck it out to attend my talk. This is a really important topic to me. My talk is entitled 'Dishonest Software: Fighting Back Against Industry Norms'.
00:00:40 My name is Jason Meller. I'm a reformed script kiddie. When I was about 13, I liked to call myself a hacker, but really I was just downloading tools off the internet and breaking things, which, I found out later, did not feel so good.
00:00:51 I ended up starting my cybersecurity career right after college, but I'm also a Ruby developer, specifically working with Rails apps. I probably brought Ruby on Rails to every single cybersecurity company I've ever worked for.
00:01:09 I founded my own company called Kolide, which is a security app for devices. Our philosophy is to message employees on Slack when their device has a security or policy issue, instead of locking devices down and preventing people from using them.
00:01:28 We try to take a user-focused approach: be honest about the issue, and fix the problems that can't be addressed with automated remediation alone. It's really about forming a relationship with the end user, so honesty is a huge component of that.
00:01:45 In fact, it's so significant that I authored a mini book titled 'Honest Security.' It's kind of sad that the URL was available for us to register, which speaks volumes about the industry I'm in today.
00:02:11 In this talk, we will discuss the differences between honest software versus legal software and provide you with things you can take back to your workplaces to advocate for honesty in the products we build and use.
00:02:30 There are cynics who might not agree with this, but I contend that most people in this room never intended to build or benefit from dishonest software. And yet I think many of us already have, or will. I'm going to share a personal story that changed my perspective on this.
00:02:48 We won't be talking about classically malicious software like malware or viruses, things that are obviously evil at a glance. We'll discuss how I, someone who thought they were a good person, fell into building and promoting dishonest software.
00:03:09 It all started at my first job at General Electric, which I joined right after I graduated from college. I entered their management program and soon found myself on their security team, the GE Computer Incident Response Team.
00:03:36 This was an exciting time for the team, as we were battling something known as the Advanced Persistent Threat, or APT. The term has since been turned into a marketing buzzword, but at the time it referred specifically to threat actors sponsored by the Chinese government.
00:04:02 These threat actors waged a long-running and extensive cyber espionage campaign against Western interests, particularly Fortune 500 companies. Their mission was to exfiltrate valuable information to advance China's military and economic interests.
00:04:42 This was mostly proprietary intellectual property, and it isn't just an abstract concept. We could see where these actors worked; their actual location in Beijing is shown on the next slide.
00:05:06 The building shown houses People's Liberation Army Unit 61398. We could see the guards outside; these are the people actually attacking the company and aiming to exfiltrate its data. You might wonder what value they could really gain - what's at stake?
00:05:54 Well, GE isn't just about appliances; they are a Department of Defense contractor. Back in the late 90s, they were awarded a joint strike fighter engine contract by the Department of Defense to build an advanced turbofan engine used in many aircraft.
00:06:45 This was part of a program to create the next generation of aircraft for Western militaries, crucial for defending interests and enhancing military capability. If the adversary could obtain the schematics for this engine, they could determine vital information.
00:07:34 This includes things like how long aircraft could remain in the air between refuelings and how they would perform in dogfights, which is highly valuable intelligence that the Chinese government sought.
00:08:00 So how does APT work? They compromise arbitrary servers ahead of time to use as staging points for the data they want to steal, and they use phishing campaigns to trick someone inside the target into installing malware.
00:08:16 Once they have access, they achieve persistence on the device, spread to other devices, collect credentials, and identify more targets to exfiltrate. Eventually, they bundle the data and send it off using FTP to a remote server.
00:08:55 They typically use a VPS host that wouldn't raise alarms, something inexpensive obtained with a prepaid card. Eventually, it all funnels back to a server physically located in China.
00:09:38 They utilize the RAR archive format instead of ZIP because RAR can split an archive into many small volumes. This clever strategy lets them resume a transfer where it left off if a transmission is interrupted. This is the modus operandi we were looking for.
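To make the resumability point concrete, here is a minimal pure-Ruby sketch of splitting a file into fixed-size volumes, the same property multi-volume RAR archives provide (the filename and chunk size are hypothetical; the actual attackers used RAR itself):

```ruby
# Split a file into fixed-size parts, mimicking the property that made
# multi-volume RAR attractive: if a transfer dies, only the parts that
# never arrived need to be re-sent.
CHUNK_SIZE = 50 * 1024 * 1024 # 50 MB per volume

def split_into_volumes(path, chunk_size: CHUNK_SIZE)
  part = 0
  File.open(path, 'rb') do |src|
    while (data = src.read(chunk_size))
      File.binwrite(format('%s.part%03d', path, part), data)
      part += 1
    end
  end
  part # number of volumes written
end

# split_into_volumes('backup.rar') # => e.g. 14 parts for a ~700 MB file
```

Because each part stands alone, an interrupted transmission costs only the volumes not yet sent, which is exactly why the format appealed to these actors.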
00:10:26 Now, let's pivot to our job at the GE Computer Incident Response Team. Our goal was to detect this activity and perhaps stop it in real time. So, in 2010, we built a massive detection apparatus.
00:10:54 We installed network taps throughout all the data centers and monitored egress points. The data generated by every employee was being mirrored, scanned, and stored.
00:11:24 We used a format called PCAP, which allowed us to analyze every single part of the packets we were capturing. At the time, only about five percent of traffic used HTTPS, meaning we could see nearly everything in the clear.
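As an illustration of the kind of signature that apparatus could run over captured traffic, here is a minimal Ruby sketch; it works on simplified, pre-extracted flow records rather than raw PCAP, and every field name and address in it is hypothetical:

```ruby
# A toy version of the heuristic described above: flag FTP control-channel
# commands (port 21) that upload RAR volumes, the staging pattern the
# team was hunting for.
FlowRecord = Struct.new(:src_ip, :dst_ip, :dst_port, :payload, keyword_init: true)

def suspicious?(flow)
  flow.dst_port == 21 && flow.payload.match?(/\ASTOR\s+\S+\.(rar|r\d{2})\b/i)
end

flows = [
  FlowRecord.new(src_ip: '10.1.2.3', dst_ip: '203.0.113.9',
                 dst_port: 21, payload: 'STOR schematics.part01.rar'),
  FlowRecord.new(src_ip: '10.1.2.4', dst_ip: '198.51.100.7',
                 dst_port: 443, payload: ''),
]

flows.select { |f| suspicious?(f) }.each do |f|
  puts "ALERT: possible staged exfiltration #{f.src_ip} -> #{f.dst_ip} (#{f.payload})"
end
```

Note how blunt this rule is: it cannot distinguish a state-sponsored actor staging stolen schematics from an employee backing up personal files over FTP, which is exactly the failure the rest of the story describes.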
00:12:14 We could monitor each employee's online activity in real time. This led to a lot of questions: How could this possibly be legal?
00:12:39 In the United States, it is legal due to the Electronic Communications Privacy Act, passed in the mid-80s. While intended to protect individuals from wiretapping, it carved out exceptions for businesses.
00:13:07 This law allows businesses to legally monitor assets they own, effectively permitting a lot of surveillance measures. It allows practices that most people wouldn't consider lawful.
00:13:37 For instance, businesses can open physical mail delivered to an office without it being considered a crime, and they have the right to track you on devices they control, including devices in your personal possession.
00:14:15 They can record keystrokes, take screenshots, and save network traffic on company devices. Recently, a court ruled that companies could not remotely activate webcams or microphones without prior consent.
00:15:10 This was challenged by parents of students at a school district whose devices had remote monitoring software installed. The school district attempted to defend their actions, but ultimately had to settle out of court.
00:15:54 So most of this surveillance is legal, but we still had to rationalize it to ourselves. At GE, we justified our data monitoring under the belief that our mission was pure. We felt we were patriotic, fighting a foreign enemy and protecting the country.
00:16:17 Yes, we could see all employees' activities, but we were looking for significant threats. We convinced ourselves we only sought sophisticated attacks, not ordinary employee activity.
00:17:09 We thought there wouldn't be ramifications because we knew we were good people. Most of us had background checks or security clearances. Our surveillance felt justified because we could potentially save lives.
00:18:01 After running this system for months, one dark and stormy night, we received an alert that matched the pattern we had been hunting for. Someone was staging RAR files on a server, preparing to transfer them to an FTP server.
00:18:39 We took action immediately, deleting all RAR files and blocking the FTP transmission. Our executives were notified, and they authorized us to hack back and delete files from the FTP server.
00:19:12 We were ecstatic, feeling like heroes for thwarting a real-time exfiltration attempt by a state-sponsored actor.
00:19:41 However, a couple of days later, we learned something rather sobering. What we believed to be APT activity was just a simple contractor backing up personal photos to their own FTP site, using RAR files.
00:20:37 The scrutiny that arose from executives at a Fortune 5 company was overwhelming, leading the contractor’s boss to fire them. Everything we deleted were their personal family photos, which were being backed up.
00:21:26 News of this spread throughout GE, tarnishing our reputation. Employees began to fear what we might do next. As a junior member of the team, I felt terrible about the impact of our actions.
00:21:59 Although no one faced tangible consequences, we lost credibility within the organization. I firmly believe this tarnished our credibility and negatively affected the overall security of the company.
00:22:55 This led me to realize an important truth: "trust us because we are the good guys" is not honesty.
00:23:01 What is honesty? Honesty is trusting us because you can independently verify that we are telling the truth. At Kolide, we believe that you have the right to know what we can see.
00:23:48 At Kolide, every single end user, not just the buyers, receives a roadmap explaining what the company is looking for, why, and what privacy impacts it may have on them. Users have that transparency before deciding to opt in.
00:24:41 When building software, you can create new relationships that weren't there before. Our entire product focuses on messaging people on Slack to help solve problems that can't be automated.
00:25:40 One such task is asking developers to set passphrases on unencrypted SSH keys found on their machines. We must ensure that developers understand why they are being messaged and how our system works in order to establish that trust.
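As a sketch of how such a check might work (an illustration only, not Kolide's actual implementation): `ssh-keygen -y -P ''` can derive the public key only when the private key has no passphrase, so its exit status tells you whether a key is unencrypted.

```ruby
require 'open3'

# True if the private key at `path` has no passphrase: `ssh-keygen -y`
# prints the public key, and with an empty passphrase (-P '') it only
# succeeds when the key is unencrypted.
def unencrypted_key?(path)
  _stdout, _stderr, status = Open3.capture3('ssh-keygen', '-y', '-P', '', '-f', path)
  status.success?
end

# Scan the usual key locations, skipping the public halves.
Dir.glob(File.expand_path('~/.ssh/id_*')).reject { |p| p.end_with?('.pub') }.each do |key|
  puts "#{key}: no passphrase set; consider adding one" if unencrypted_key?(key)
end
```

In the spirit of the talk, the check itself is the easy part; the honest part is telling the end user what was examined and why before asking them to fix it.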
00:26:37 Let's zoom out with the time we have left and try to devise a test for dishonest software. The previous metric was whether the software breaks the law, which, as we've seen, is a misleading test, because legality is not honesty.
00:27:13 So, does the introduction of informed consent compromise the software's value proposition? If you inform users and require their consent, does the software still function properly?
00:27:50 I want to discuss a real company as an example of this. You might be familiar with FullStory, which offers a pixel-perfect session replay tool. This little JavaScript widget tracks every action individual users take on your website.
00:28:21 A recent legal action against Nike surfaced due to their usage of FullStory. They were sued for allegedly recording sessions without users' knowledge, which many interpreted as wiretapping.
00:29:06 FullStory's stance was that it is up to developers to ensure compliance with the law. Shifting that responsibility onto customers, however, reveals a troubling aspect of their approach.
00:29:53 Steve Jobs once asserted that people are smart and that companies should ask for consent every time they handle sensitive user data. This principle resonates with Apple’s current practices, particularly since the iOS 14 updates.
00:30:34 The anatomy of informed consent starts with asking users in plain English and requiring a response. For instance, Kolide has a feature called 'lost mode' that helps locate devices.
00:31:25 We don't have to seek permission for geolocation data collection, but we do, and we explain what information the administrators can see.
00:31:50 We ensure users must approve via a clear action before their location data is tracked. Once consent is granted, users can access that data themselves, so they aren't required to request it from another person.
00:32:20 Finally, users must be allowed to revoke consent at any time without needing to talk to anyone. In our case, this is done right in Slack, where users can simply click a button to turn off lost mode.
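Put together, the three steps he describes (explicit approval, gated collection, self-service revocation) could look like the following minimal Ruby sketch; the class and method names are hypothetical, not Kolide's actual code:

```ruby
# Consent is an explicit, revocable object, and data collection is
# impossible without it.
class LocationConsent
  def initialize
    @granted_at = nil
  end

  # Step 1: the user approves via a clear action (e.g. a Slack button).
  def grant!
    @granted_at = Time.now
  end

  # Step 3: revoking needs no conversation with an administrator.
  def revoke!
    @granted_at = nil
  end

  def active?
    !@granted_at.nil?
  end
end

class Device
  def initialize(consent)
    @consent = consent
  end

  # Step 2: collection is gated on active consent, and the user sees the
  # same data the administrators see.
  def report_location
    raise "lost mode requires the user's informed consent" unless @consent.active?
    { lat: 39.7392, lng: -104.9903 } # Denver, for illustration
  end
end

consent = LocationConsent.new
device  = Device.new(consent)
consent.grant!
p device.report_location # => {:lat=>39.7392, :lng=>-104.9903}
consent.revoke!
# device.report_location would now raise, by design.
```

The design choice that matters: `report_location` is unreachable without an active grant, and `revoke!` requires no one's permission.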
00:33:16 If your product would fail to function with this process in place, it possesses dishonest properties.
00:33:57 As we approach the end, I want to highlight your role. I’m glad you’re here because we are entering a new era of privacy awareness both in the country and globally.
00:34:28 This newfound consciousness allows companies like Kolide to establish a strong foothold against dishonest competitors. Each of you is likely a developer, and you have more power than you realize.
00:35:18 You have the unique ability to leverage your technical understanding to analyze software's honesty or dishonesty. Advocate not just for yourself, but also for your family, coworkers, and friends who lack similar knowledge.
00:36:13 Often, when a company announces a new software rollout, developers can analyze the implications behind it. Understand what you're advocating for and ensure others around you do too.
00:36:48 Building honest software has turned into a competitive advantage. There are applications today that expose the dishonest nature of many existing tools; examples include hey.com by DHH, which surfaces and blocks email tracking pixels.
00:37:29 Security concerns surrounding dishonest software are critical. With privacy laws expanding, especially the European Union's GDPR, the landscape is changing.
00:38:14 Companies must prepare for upcoming regulations instead of being caught unaware. Remember that when organizations justify dishonest software, they foster a pervasive culture of dishonesty.
00:39:02 Creating a culture of honesty and transparency in your organization will have positive effects beyond software development.
00:39:51 Thank you for being here. Go forth and try your best to be as honest as possible.
00:40:34 For those who arrived when I started, we have Kolide swag in the front: t-shirts and things like that. Feel free to grab it on your way out.
00:40:56 I have a minute for questions, and I can take one or two if anyone has any.
00:41:10 So the question was: How do I feel about losing access to something if I'm not providing my information? This can be viewed from two angles, particularly in B2C cases.
00:41:53 Should consumers expect to benefit from a social network without sharing information? This is a legitimate business transaction, provided there's confirmed consent.
00:42:30 Conversely, when it comes to employment and the pressure to submit information, I see potential for collective action. Workers can unite and push back against coercion.
00:43:12 It often requires groups working together, as individuals lack the strength to defend against such tactics on their own. Thank you.
00:43:54 Any other questions? We've got time for maybe one more. All right. Thank you so much! Enjoy the last keynote.