Talks

Security Doesn’t Have To Be a Nightmare

by Wiktoria Dalach

In the video "Security Doesn’t Have To Be a Nightmare," Wiktoria Dalach, a security engineer, shares essential insights on enhancing code security. Drawing from her experiences transitioning from a software engineering role to the security team, she presents practical tips to make software development more secure and manageable. Dalach elaborates on the challenges faced by developers when security reviews are conducted right before the release, often leading to undue stress and last-minute changes. She emphasizes the need for early security involvement in the development lifecycle.

Key points discussed include:
- Sanitizing Input: Developers must never trust user input blindly. Implementing input sanitization helps prevent severe attacks such as cross-site scripting (XSS) and remote code execution. Tools like Rails’ sanitize helper or Ruby’s sanitize gem can aid in this effort.
- Validating Data: It is imperative to validate all incoming data to maintain database integrity. Every field must undergo validation to protect against threats like SQL injection.
- Managing Credentials: Storing credentials in repositories poses significant security risks. Developers are urged to avoid this practice, as forgotten credentials can lead to unauthorized access and data breaches.
- Automating Security: Dalach introduces static application security testing (SAST) and dynamic application security testing (DAST) tools, emphasizing their importance in automating the discovery of vulnerabilities during the development process. Integrating these tools into the workflow can provide timely feedback on potential security issues.
- Understanding the CIA Triad: Dalach introduces the CIA triad—Confidentiality, Integrity, and Availability—as a framework for understanding security threats. By focusing on these three categories, developers can prioritize security discussions and decisions effectively.

She illustrates the applicability of the CIA triad by discussing various scenarios and the importance of security reviews at the design stage rather than at release time. The conclusion of her talk emphasizes an industry shift; developers must take on the responsibility of securing sensitive data in software systems. Dalach advocates for a culture of security awareness among developers, encouraging them to consult security experts early and often throughout the development process.

In summary, adopting secure practices early in the software development lifecycle helps mitigate risks and improve overall security posture. Wiktoria Dalach’s advice empowers developers to treat security as an integral part of their development responsibilities.

00:00:07.259 Imagine this: it's Monday, the last sprint of your project. You just finished the planning, and you know that you need to deliver two bug fixes and add some documentation. You feel good; it seems achievable. You are also excited for the release and for the new project that will come later. It's Monday, but it's a good Monday.
00:00:44.219 On Tuesday, your engineering manager comes into the room, and she says, "You know, it would be great if we could run this through security just before the release." In this case, "great" means we must do it. So, you attend a meeting with the security team. They ask you questions about the design and the documentation that you haven't finished yet. After this meeting, you end up feeling lost, and if you ask me, I also have no idea what the snail is doing up there.
00:01:53.340 It's a nightmare—a complete nightmare. And my friends, I will be honest with you; I've been in this situation way too many times. After the release, we would have this wonderful meeting called a project retrospective, and we would all agree that having a security review in the last sprint, just before the release, is a nightmare. But the moment we finished that meeting, we would completely forget our agreement, and the next time, the situation would haunt us all over again.
00:02:43.800 Last year, after about eight years of writing software, I decided to switch to another team and joined the security team. During my transition from software engineer to security engineer, I read some books and took some online classes on cybersecurity. When I started working with engineering teams, I gained a new perspective. I realized that there are some things you can do, low-hanging fruits that can improve the security of your product. Today, I will share the absolute lowest-hanging fruits of security. In the second part of my presentation, I will share a piece of theory that completely blew my mind. Are you with me?
00:03:53.060 Awesome, I even got a response! So, first, let's talk about sanitizing the input. You should never trust what you get from users. Sanitization means cleaning potentially executable code out of the input, and we are lucky because we have the sanitize helper in Rails and the sanitize gem for Ruby. In practice, input is fairly easy to sanitize; even when you want to allow users to insert links or style their descriptions, it is straightforward to implement, and it can have a huge impact on preventing attacks like cross-site scripting and remote code execution.
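As a rough illustration of this point, here is a minimal Rails sketch; the helper, model, and field names are hypothetical, and the allowlist of tags is just one possible choice:

```ruby
# app/helpers/products_helper.rb (hypothetical)
module ProductsHelper
  # Allow a small set of tags so users can add links and basic styling,
  # while <script> tags, inline event handlers, etc. are stripped out.
  def safe_description(product)
    sanitize(product.description, tags: %w[a p strong em], attributes: %w[href])
  end
end

# Outside Rails, the sanitize gem offers a similar API, e.g.:
#   Sanitize.fragment(untrusted_html, Sanitize::Config::BASIC)
```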
00:05:04.620 Another low-hanging fruit is validating data. You need to remember that the database is the most important asset you have, and you need to protect it. People may forgive you if you cannot access your service for 15 hours, but they will not forgive you if you lose their cat pictures. Therefore, make sure to validate all the data you receive from users and all the data you want to save in the database. Sometimes, we tend to validate only a few fields, but it's essential to validate all of them. By implementing sanitization and validation, you will reduce the risk of cross-site scripting, remote code executions, SQL injection, and many more.
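For illustration, a minimal ActiveRecord sketch of validating every persisted field; the model and the specific constraints are hypothetical:

```ruby
# app/models/order.rb (hypothetical)
class Order < ApplicationRecord
  # Validate every field that ends up in the database, not just the obvious ones.
  validates :email,    presence: true, format: { with: URI::MailTo::EMAIL_REGEXP }
  validates :quantity, numericality: { only_integer: true, greater_than: 0 }
  validates :currency, inclusion: { in: %w[EUR PLN USD] }
  validates :note,     length: { maximum: 500 }
end
```

Validations keep bad data out of the database; combined with parameterized queries, which ActiveRecord builds when you pass hashes or placeholders, they reduce the attack surface described here.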
00:06:11.220 Another important point is to avoid storing credentials in the repository. I know that some of you might be thinking, 'But it's okay for testing!' Well, it's not okay because we tend to forget those credentials easily. By credentials, I mean usernames, passwords, API tokens, and access keys. We don't want those in the repositories because you might lose control over what this token is responsible for and what resources can be accessed with it. Trust me, you don't want to wake up to a lovely email from a security researcher who found an API token in a testing repository that would enable an attacker to access your entire organization.
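A minimal sketch of keeping secrets out of the repository, assuming an environment variable or Rails encrypted credentials are available; the names are hypothetical:

```ruby
# Hypothetical: read the token from the environment (populated by your
# deployment tooling or a secrets manager), never from a committed file.
PAYMENT_API_TOKEN = ENV.fetch("PAYMENT_API_TOKEN")

# With Rails encrypted credentials, the secret lives in an encrypted file
# and only config/master.key (kept out of git) can decrypt it:
#   Rails.application.credentials.dig(:payment, :api_token)
```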
00:07:51.180 The good news is that many of these things can be automated. I really wanted to include a robot emoji in this presentation, and I'm glad I managed to do it! You may ask me, 'Okay, but this isn't a strawberry, it's a pear, and pears aren't low-hanging fruit.' Well, it's still reachable. It requires you to talk to the people in your company who can allocate resources to implementing scanning tools. The most popular scanning tools are SAST and DAST tools. DAST stands for dynamic application security testing: a DAST tool interacts with the front end of your running web application, feeds it malicious input, and inspects the responses for potential vulnerabilities.
00:08:50.460 The other type of scanner is the SAST tool, which stands for static application security testing. These tools analyze your code base and search for potential vulnerabilities. You can integrate SAST tools directly into your workflow on GitHub or GitLab, so when you create a PR or commit new code, you receive instant feedback, like, 'Oops, you committed a credential' or 'Hey, you just introduced a potential SQL injection.' These tools are fantastic, and I think we should use them: development is hard enough on its own, so adding security on top can feel overwhelming, and automation takes some of that load off.
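As an example of the kind of finding a SAST tool reports, here is a classic Rails pattern that a scanner such as Brakeman flags as a potential SQL injection, along with parameterized alternatives; the model and parameter names are hypothetical:

```ruby
# Unsafe: string interpolation puts user input straight into the SQL.
User.where("name = '#{params[:name]}'")

# Safe: let ActiveRecord parameterize the value instead.
User.where("name = ?", params[:name])
User.where(name: params[:name])
```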
00:10:51.779 I've been speaking for 10 minutes already—how are you feeling? That low-hanging fruit was not super controversial, right? I'm so glad I can see your faces; it's much more reassuring. Now, let me take a sip.
00:11:23.760 I hope you didn't hear that. Now I want to share something that completely blew my mind when I learned about it, and I hope it ignites a fire in your heart. Before I explain, let’s be honest: what is the problem with security? Security is just this ocean of topics. There is no one-size-fits-all solution, and if I asked each of you if your product is secure, you would all give me different answers. Security encompasses such a wide range of areas: application security, infrastructure security, IT security, password management, secret management, phishing vulnerabilities, and much more.
00:12:44.760 I think it's a bit unfair. As developers, we have so much work to do: write good code, test it, make it accessible, and then, at the end, we need to address security. It's a massive task because security is such a big ocean of topics, which can be overwhelming and intimidating. When I mentored younger people, I noticed how much developer lingo we use. I'm Polish, for example, so I would translate technical terms directly into Polish, which made them even more difficult for non-native speakers to understand.
00:13:39.360 Security experts use acronyms for everything, and I kid you not, every acronym includes either a C or an S, so good luck! They even use them as verbs. I spent my first few months on the team Googling new acronyms every day. Security is intimidating, and security experts are partly to blame. We have this ocean of topics, and the security community agrees on the core problem: when you have an application to secure, there is an infinite number of threats against it.
00:15:01.980 Are you with me? All of those threats can actually be categorized into just three categories. The good news is you already know the acronym for these categories: confidentiality, integrity, and availability, which collectively make up the CIA triad. I won't make a spy joke because the only spies I like to talk about are the Spice Girls. Let's focus on these three categories.
00:15:54.000 Three categories can cover an infinite number of threats. Let's not get distracted by pop culture; this is significant. Before I explain why this blew my mind, let’s clarify the definitions of these three concepts so we're on the same page. Confidentiality means that secrets must remain confidential. For instance, if I send you an email, I want that to be accessible only to us. Integrity means we get what we expect. If you log into your Twitter account, you want to see all your tweets—not an empty page.
00:17:05.240 Availability means being able to access information at all times. We understand that 100% availability is not the goal; 99.99% is basically what we aim for. Now that we know what confidentiality, integrity, and availability stand for, let’s consider how to apply this concept.
00:17:47.200 The CIA triad can completely transform how we approach security. Instead of being overwhelmed by countless threats, we need to focus on just these three things. For each project and user story, ask the question: how can the confidentiality, integrity, or availability of this project be compromised? This concept is beautiful because it is applicable to any technology you may work with—whether you're a back-end, front-end, mobile developer, or in DevOps.
00:18:33.780 Of course, the specific questions will vary based on what you’re working on, but the baseline remains the same. Let’s clarify with examples. For confidentiality, you can ask: who has access to this resource? How do we store credentials? Additionally, what do we log? Do we log any sensitive data, tokens, or usernames? For integrity, key questions include who can create, update, or delete this resource. If something goes wrong, is there a way to audit who did what?
00:19:10.620 What happens if a malicious actor sends bad data to our API or via a form? When we mention availability, everyone's mind jumps to DDoS attacks. However, we, as developers, are highly responsible for maintaining availability because we introduce new dependencies into our products. Thus, it's vital to ask: what happens if a crucial piece of software we depend on goes down?
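One way to make the triad concrete is to keep the review questions listed above in a small, shared checklist; a hypothetical sketch:

```ruby
# Hypothetical design-review checklist, organized by the CIA triad.
CIA_REVIEW_QUESTIONS = {
  confidentiality: [
    "Who has access to this resource?",
    "How do we store credentials?",
    "Do we log any sensitive data, tokens, or usernames?"
  ],
  integrity: [
    "Who can create, update, or delete this resource?",
    "If something goes wrong, can we audit who did what?",
    "What happens if a malicious actor sends bad data to our API or via a form?"
  ],
  availability: [
    "Which external dependencies can take this feature down?",
    "What happens if a crucial piece of software we depend on goes down?"
  ]
}.freeze
```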
00:20:54.580 Here's an anecdote: A few years ago, I implemented a billing system for selling products in Europe. If a customer provides their VAT ID, they pay the product's net price without tax. To validate that VAT number, we had to query a European system called VIES. During a successful product launch, we got rate-limited just an hour in. What were we supposed to do? We couldn't just stop orders after a successful launch. Luckily, we had a workaround prepared. This is just one example related to availability.
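A minimal sketch of guarding against that kind of outage: call the external service with strict timeouts and a pre-agreed fallback. The URL, response handling, and fallback policy here are hypothetical, not the real VIES integration:

```ruby
require "net/http"
require "uri"

# Placeholder endpoint, not the real VIES API.
VAT_CHECK_URL = URI("https://vat-check.example/check")

def vat_id_confirmed?(vat_id)
  Net::HTTP.start(VAT_CHECK_URL.host, VAT_CHECK_URL.port,
                  use_ssl: true, open_timeout: 2, read_timeout: 3) do |http|
    res = http.get("#{VAT_CHECK_URL.path}?vat_id=#{URI.encode_www_form_component(vat_id)}")
    res.is_a?(Net::HTTPSuccess) && res.body.include?('"valid":true')
  end
rescue Net::OpenTimeout, Net::ReadTimeout, SocketError, Errno::ECONNREFUSED
  # Fallback: treat the VAT ID as unverified for now and re-check it
  # asynchronously later, so a rate limit never blocks the checkout.
  false
end
```

The important design choice is deciding the fallback before launch day, so a rate-limited or offline dependency degrades the feature gracefully instead of stopping orders.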
00:22:24.480 Using the CIA approach will help you create much more secure solutions and software. I encourage you to apply these principles for major projects, user stories, and especially in the design phase. Shifting security left means addressing security early in the software development lifecycle. When you consult security engineers while designing, it builds a much stronger foundation for your software.
00:23:30.700 Security experts agree that having security reviews right before release is stressful for everyone, including security engineers, who want to contribute positively to the development process. The ‘hot topic’ of shifting security left means that, instead of doing security reviews at the end, you engage with security teams in the design phase. The CIA triad can provide a framework for this approach, so I strongly encourage you to adopt it.
00:24:42.780 Every change that touches existing code is expensive, especially if that code is already in production. Shifting security left allows you to make changes early in the design phase—these changes are quick, cheap, and easy because they often happen on paper rather than in a production code base.
00:25:55.740 We hear about data breaches and vulnerabilities in popular libraries every week. Customers are concerned and do not want to risk their reputation, money, or data. Consequently, security is becoming an increasingly relevant topic in sales negotiations, and organizations are more hesitant to spend money on products they deem insecure.
00:27:05.240 Additionally, we, as engineers, need to accept that the careless days of 'move fast and break things' are over. Customers trust us with very personal, often sensitive data, and as engineers, we owe it to them to create the most secure solutions possible. I hope my talk has brought security a little closer to you, and I wish you a pleasant afternoon. Thank you for listening.
00:28:49.860 Wiktoria, thank you so much. There are plenty of questions. Please, feel free to ask more. The first one is from Stefan: What are your thoughts on storing credentials in encrypted form, as is possible in Rails? I don't have much experience with that. Secure secret management is a vast topic, and if you'd like, we can discuss it later.
00:29:58.799 There’s an anonymous question: How do you apply the CIA Triad to a big and old codebase? You’d want to do it step by step. One effective method is to automate security with scanners that can greatly aid older codebases. Injecting the CIA principles into your design process is also vital. This won't happen overnight, but there are many ways to integrate security into your software development lifecycle.
00:31:38.720 Regarding automation, one question from a lazy person: Is there a VS Code plugin for this? Yes! For instance, if you use SAST tools, some providers indeed offer plugins for VS Code. When addressing secrets in production, my recommendation is not to put them in repositories.
00:33:12.039 There are several services like AWS Secrets Manager, and there are vault solutions that help manage secrets effectively. Thank you for this discussion!