Caching strategies on dev.to

by Ridhwana Khan

In this video presentation titled "Caching strategies on dev.to" by Ridhwana Khan, a lead engineer at DEV, the central focus is on how effective caching can improve performance on the platform. Ridhwana begins by sharing her background and passion for technology and community building, before providing an overview of DEV as an online community for software developers.

Key Points Discussed:
- Caching Importance: Caching is emphasized as a critical factor in DEV's fast performance, especially given that the site has a high read-load with infrequently changing content.
- Types of Caching: The talk highlights various caching techniques including response caching, edge caching, fragment caching, and browser caching.
- Response Caching: This method reduces server load and speeds up requests by storing responses from the server.
- Edge Caching: Utilized at DEV through Fastly, it caches content closer to users worldwide, significantly boosting loading speeds.
- Fragment Caching: Important for complex views, ensuring that only individual fragments of a view need to be re-rendered (and their database calls re-run) rather than the entire page.
- Browser Caching: Aims to prevent unnecessary requests from leaving the user's browser, enhancing speed.
- Caching Implementation: Examples of DEV’s implementation include:
  - Using a warmed edge cache to improve user experience by accelerating loading times for articles.
  - Storing comments for SEO enhancements while rendering dynamic content like user reactions separately to ensure accuracy.
- Decision-Making for Caching: Ridhwana shares reflective questions she considers when deciding whether to implement caching, such as the frequency of data changes and the most advantageous stack layer for caching.
- Miscellaneous Techniques: Additional strategies mentioned include counter caching, which helps display counts efficiently without extra database queries, and the use of tagging systems to categorize content effectively.

Conclusion and Takeaways:

Ridhwana concludes by stressing the importance of constant monitoring and adjustment of caching strategies based on performance metrics. By encouraging teams to implement features and closely observe their performance, DEV maintains its commitment to an efficient platform for its users. Ultimately, effective caching not only enhances the speed and efficiency of the DEV platform but also contributes to a better overall user experience.

00:00:00.199 Hi everybody, thank you so much for being here today. My name is Ridhwana Khan and today I'll be talking to you about caching on dev.to.
00:00:06.600 Before we dive into that, let me tell you a little bit about myself. I am a lead engineer at DEV, which is also known as Forem. At DEV, I get to pursue my passion for building really useful things. I enjoy speaking at conferences and writing articles, as I believe they provide me with opportunities to meet interesting folks like yourselves. I'm super passionate about building communities both online and in person.
00:00:26.359 This is a photo from a Women in Tech community that I was running in South Africa. I'm also involved in other communities where design and development intersect, which is a fun endeavor. On a personal note, I’m a cat mom. This little long-haired fellow is named Zeus. I give him lots of love, attention, treats, and food in return for the pleasure of his company, which often includes hair on my clothes and the occasional mishap on the couch.
00:01:11.680 Have any of you been to South Africa? Yes, a few people. I'm from Cape Town, South Africa. I’d like to give you a quick one-minute virtual tour. This is an aerial view of Cape Town, where I live, featuring one of the most prominent landmarks, Table Mountain. It's called Table Mountain because it looks like a tabletop. On the right is the Atlantic Ocean and on the left is the Indian Ocean, and they meet at Cape Point.
00:01:32.680 We’re really proud of our wildlife; we refer to the Big Five, which includes the lion, leopard, buffalo, elephant, and rhino. We also celebrate our diverse people, which include various religions and races. Now that I've convinced you to visit South Africa, just so you know, today is also my birthday. I feel grateful to be celebrating it here with you all. Thank you to the organizers for giving me this opportunity to speak.
00:02:39.440 How many of you have heard of DEV or are familiar with it? A few people. For those who haven't heard about DEV, it's an online platform that forms a community of software developers. These developers use the platform to share their knowledge and provide support to each other. We currently have over 1 million users signed up on the platform. We believe that the software industry relies on collaboration and networked learning, and DEV aims to be the place for that to happen.
00:03:34.680 Another nice aspect of DEV is that it’s open source. Anyone can visit the repository, contribute features, and report bugs. Our code is written in Ruby on Rails on the server side, with sprinkles of JavaScript for interactivity and partial hydration. We often get complimented on the speed of DEV; people say that it's really snappy. If you opened dev.to on your phone right now, you'd see many articles with very little loading time.
00:04:26.680 When you go to an article page, you won't even see a loading spinner—it loads up quickly. So why is it so fast? The main contributing factor is caching, which can be intimidating. I've felt scared about caching for a long time. However, I've recently started dipping my toes in and playing around with it. While I'm not an expert, I'd like to share some of those learnings within the context of DEV.
00:05:13.840 One major reason DEV is a perfect use case for caching is that it’s a read-heavy site. Tens of thousands of users come to DEV daily to read articles, and once created, these articles rarely change. This makes caching a good use case. Although there may be some interactive elements—like reactions—they present an interesting challenge regarding caching.
00:05:43.200 There are many types of caching. I will focus on response caching and its different forms, along with some miscellaneous caching techniques. Let's start by discussing response caching.
00:06:01.560 Response caching accounts for the majority of use cases on DEV. As the name suggests, it is simply caching a response from a request. This reduces request time and server load by decreasing the number of requests sent to the server. In a typical Rails application, we send an HTTP request with information about the resource we want, along with some headers.
00:06:58.520 The web server receives that request, matches it to a route, and determines which controller action it maps to. The controller runs the corresponding code, likely calling a model that makes a database request. The controller packages this data and generates a view. Rails then sends that view back up through a response message along with additional headers and metadata, which the browser renders. The browser may also request additional scripts or images.
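To make response caching concrete, here is a minimal Rails controller sketch (the controller and model names are generic placeholders, not DEV's actual code) showing how a response can be marked cacheable with HTTP headers so that later requests can skip this whole controller/model/view cycle:

```ruby
class ArticlesController < ApplicationController
  def show
    @article = Article.find(params[:id])

    # Mark the response as cacheable by shared caches (CDN, proxy) and the
    # browser for up to an hour, so repeated requests can be answered
    # without re-running the controller, model, and view work.
    expires_in 1.hour, public: true
  end
end
```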
00:07:49.680 This flow puts tremendous pressure on the web server, with thousands of requests coming through every minute. We can ease that load by identifying static content that can be cached. There are multiple layers where we can implement caching, and one critical decision for developers is determining where to add that caching.
00:08:14.920 At DEV, we implement several types of response caching, including edge caching, fragment caching, and browser caching. Edge caching allows us to avoid hitting the web server, while fragment caching is implemented at the view layer for complex rendering. Browser caching restricts requests from ever leaving the browser. Each of these strategies has its specific use cases, and sometimes we combine them for optimal results.
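As a rough illustration of the view-layer piece, Rails fragment caching looks something like this in an ERB template (a generic sketch, not DEV's actual markup):

```erb
<%# app/views/articles/show.html.erb %>
<%# The rendered HTML inside this block is stored in the Rails cache, keyed
    on the article and its updated_at timestamp, so editing the article
    invalidates the fragment automatically. %>
<% cache @article do %>
  <article>
    <h1><%= @article.title %></h1>
    <%= @article.body %>
  </article>
<% end %>
```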
00:09:00.360 For instance, on the article page at DEV, we use fragment caching for complex views, edge caching to minimize hits to the web server, and browser caching to enhance performance. Now, let’s discuss each of these caching techniques in more detail.
00:09:34.760 Edge caching lives between a user's browser and the origin server. It helps reduce the need to continuously access the origin server. At DEV, we use Fastly for edge caching. Fastly caches our content at points of presence located around the world. This setup brings the cached content closer to the user, which significantly decreases request time. For example, as a user in South Africa, I’d prefer my requests to be handled locally rather than always routing to a server in the US.
00:10:39.960 The edge network retains a cached version of the page, minimizing the need to recompute this for each request. The benefits include reduced server load, improved content delivery, faster response times, and lower network load by cutting down on duplicated data.
00:11:43.960 Now, regarding how edge caching functions: When a user navigates to a page, we first check the edge cache, which could be cold or warm. A cold cache indicates no existing data, meaning the request must go all the way to the origin server. Once the information is retrieved and sent back to the user, it will be stored in the edge cache for future visits. This is known as warming the cache.
00:12:19.599 Once the cache is warmed, subsequent requests will access this data without needing to hit the origin server, reducing loading times significantly. This process ensures that users can retrieve data quickly, especially for frequently accessed pages.
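The cold/warm flow is the same fetch-or-compute pattern Rails offers at the application layer; as a small analogy (a sketch with a made-up cache key and query, not DEV's code):

```ruby
# Cold cache: the block runs and the result is stored, "warming" the cache.
# Warm cache: subsequent calls return the stored value without doing the work.
popular_articles = Rails.cache.fetch("popular_articles", expires_in: 12.hours) do
  Article.order(reactions_count: :desc).limit(10).to_a
end
```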
00:13:02.800 For example, on DEV, when a specific article loads, the content is drawn from the warmed edge cache. If you checked this page multiple times, you would see it load almost instantaneously after the first visit. We ensure that our caching mechanism works efficiently, allowing for a seamless user experience.
00:13:58.000 We also cache comments on articles for SEO purposes, while more dynamic content, like user reactions, isn’t edge cached. Instead, we serve the page from the edge cache and then render the reactions sidebar asynchronously, keeping performance fast without compromising accuracy.
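One way to sketch that split (with hypothetical controller and column names, not DEV's actual implementation): the article HTML is served from the edge cache, while a small per-user endpoint fetched asynchronously by JavaScript is explicitly marked as uncacheable:

```ruby
class ReactionsController < ApplicationController
  # Called by JavaScript after the cached article HTML has rendered, so each
  # user still sees their own up-to-date reaction state.
  def show
    reaction = Reaction.find_by(user: current_user, article_id: params[:article_id])

    # Never let a shared cache store this per-user response.
    response.headers["Cache-Control"] = "private, no-store"

    render json: { reacted: reaction.present? }
  end
end
```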
00:15:05.720 As for caching headers, marking a response as publicly cacheable indicates that it can be shared across different users, which is appropriate for requests that aren’t authenticated or tied to user accounts. We set long expiry times and purge the cache explicitly rather than relying solely on time-based expiration.
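In header terms, that policy might look roughly like this in a Rails controller (a hedged sketch; Surrogate-Control and Surrogate-Key are headers Fastly understands, but the exact values and key names here are illustrative, not DEV's configuration):

```ruby
class ArticlesController < ApplicationController
  def show
    @article = Article.find(params[:id])

    # Shareable across users, since the response isn't tied to an account.
    # Keep the browser TTL short, but let the edge hold the response for a
    # long time; we purge explicitly when the article changes.
    response.headers["Cache-Control"] = "public, max-age=60"
    response.headers["Surrogate-Control"] = "max-age=86400"

    # A key we can purge by whenever this article is updated.
    response.headers["Surrogate-Key"] = "articles article-#{@article.id}"
  end
end
```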
00:15:47.280 Alongside response caching, we use some miscellaneous caching techniques at DEV, like counter caching. A counter cache stores the count of associated records so it can be displayed without loading those records. For example, a user can have many articles and reactions, and counter caching allows us to display those counts without redundant database queries.
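Counter caching is built into Active Record; a minimal sketch with generic model names looks like this:

```ruby
class Article < ApplicationRecord
  # Keeps users.articles_count up to date whenever an article is created or
  # destroyed, so displaying the count never needs a COUNT query.
  belongs_to :user, counter_cache: true
end

class AddArticlesCountToUsers < ActiveRecord::Migration[7.1]
  def change
    add_column :users, :articles_count, :integer, default: 0, null: false
  end
end

# user.articles_count now reads a column instead of running user.articles.count
```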
00:16:55.560 We also implement a tagging system to categorize content effectively. Instead of adding tags as a column on the article table, we use a gem called ActsAsTaggableOn, which adds necessary associations and supports caching the tag list dynamically, keeping everything current and efficient.
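A hedged sketch of how that gem is typically wired up (generic model name; see the acts-as-taggable-on README for the exact caching setup, which relies on a cached_tag_list column):

```ruby
class Article < ApplicationRecord
  # Adds the tags/taggings associations and a tag_list accessor.
  acts_as_taggable_on :tags
end

# With a cached_tag_list column on articles, the gem stores the serialized
# tag list on the record itself, so rendering tags avoids extra queries.
article = Article.new(title: "Caching on dev.to")
article.tag_list = ["rails", "caching", "performance"]
article.save!
article.tag_list # => ["rails", "caching", "performance"]
```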
00:17:41.159 To conclude, I’d like to share some questions I ask myself when determining whether to implement caching for a feature. The first question is about how often the data for that feature changes. If the data changes frequently, caching may not be beneficial. However, for more static content, such as that on an article page, caching can significantly enhance performance.
00:18:44.360 Next, I evaluate which layer of the stack would benefit from caching. For complex views that take longer to render, I lean towards fragment caching. We often edge cache various elements at DEV to minimize web server hits. The chosen strategy depends on traffic and request patterns, and the effectiveness of caching can vary, so monitoring data is essential.
00:19:49.440 Our team encourages shipping features and then monitoring their performance to make informed decisions on whether to cache or how to implement it. This process is valuable in ensuring our platform remains fast and responsive, ultimately benefiting the DEV community.
00:20:15.320 Thank you so much for your attention. Do we have any questions?
00:20:23.560 While you compile your questions, I hope I can provide answers to them. If anyone has any inquiries specifically about edge caching or shielding, I'm happy to clarify.
00:20:24.880 Please feel free to leave me any feedback; I would really appreciate it.
00:21:02.000 To clarify, while we don’t implement shielding in the typical sense, we can enable it via configuration settings on Fastly. Whether we need that extra layer of caching is an informed decision based on what we can reasonably manage without overcomplicating our processes.
00:22:19.279 Fastly allows us to expire the cache through its API. We set a default expiration time, typically one day, and we have explicit functions in our code to manage cache purging. These purge requests ensure that when an article is updated, the associated cache is expired promptly.
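For purging, Fastly exposes an HTTP API; a rough sketch of purging everything tagged with an article's surrogate key (the environment variable names and key format are placeholders, and this isn't DEV's actual purging code) might look like:

```ruby
require "net/http"
require "uri"

# Purge every cached object tagged with this article's surrogate key.
def purge_article_cache(article_id)
  service_id = ENV.fetch("FASTLY_SERVICE_ID") # placeholder configuration
  api_token  = ENV.fetch("FASTLY_API_TOKEN")  # placeholder configuration
  uri = URI("https://api.fastly.com/service/#{service_id}/purge/article-#{article_id}")

  request = Net::HTTP::Post.new(uri)
  request["Fastly-Key"] = api_token

  Net::HTTP.start(uri.host, uri.port, use_ssl: true) do |http|
    http.request(request)
  end
end
```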
00:22:41.240 If there are any final questions, please feel free to ask them now, or we can connect during the coffee break later.
00:23:05.080 Thank you very much for your attention and engagement!