Let's make the web faster - tips from trenches @ Google

Summarized using AI

Ilya Grigorik • April 23, 2012 • Austin, TX

The video "Let's make the web faster - tips from trenches @ Google," presented by Ilya Grigorik at RailsConf 2012, focuses on strategies to improve web performance based on experiences and methodologies developed at Google. In this talk, Grigorik highlights the importance of speed in web applications and provides valuable insights into optimizing user experience through various techniques.

Key Points Discussed:

  • User Experience and Speed: Grigorik begins by discussing how users perceive latency, emphasizing that responses under 100 milliseconds feel instant, while delays over one second break the user's flow of thought.
  • Current State of Web Performance: He shares statistics from Google Analytics, noting that average page load times exceed six seconds for desktop and ten seconds for mobile, revealing a concerning state for user experience.
  • Measuring Web Performance: Introducing the Navigation Timing API, Grigorik explains how it provides developers with detailed metrics on page loading times, including DNS resolution and TCP connection delays, allowing for better performance analysis.
  • Optimization Techniques: The speaker discusses various optimization techniques, such as leveraging browser caching, applying compression, and loading JavaScript asynchronously (see the sketch after this list), to improve page speed and reduce loading times.
  • Tools for Performance Analysis: Grigorik showcases tools such as Chrome DevTools, Google PageSpeed Insights, and WebPageTest, underscoring their capabilities in diagnosing and resolving performance bottlenecks. He stresses the importance of histograms over averages for a more accurate understanding of performance distributions.
  • PageSpeed Family: Lastly, he introduces Google's PageSpeed family, including mod_pagespeed for Apache, which automates optimizations by applying best practices on the fly, without requiring code changes.
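
The asynchronous JavaScript loading mentioned above deserves a concrete illustration. Below is a minimal sketch of the dynamic script injection pattern common at the time (the Google Analytics async snippet used the same idea); the script path is a placeholder, not something from the talk.

    // Minimal sketch: load a script without blocking HTML parsing or rendering.
    // "/javascripts/widgets.js" is a placeholder path, not a file from the talk.
    (function () {
      var script = document.createElement('script');
      script.src = '/javascripts/widgets.js';
      script.async = true; // also achievable declaratively with <script async src="...">
      // Insert before the first existing script tag so it works anywhere in the page.
      var first = document.getElementsByTagName('script')[0];
      first.parentNode.insertBefore(script, first);
    }());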

Conclusions and Takeaways:

  • Developing a speedy web presence requires a combined approach of measuring performance, understanding user perception, applying optimization techniques, and utilizing the right tools.
  • Mobile optimization is critical, with slower connections impacting performance metrics significantly.
  • Grigorik urges attendees to focus on user experience, utilize available tools, and build their own methods for ongoing speed improvements, ensuring the web becomes faster overall.

By the end of the session, participants are encouraged to consider these insights as a checklist for optimizing their own applications, ultimately aiming for a faster web experience for all users.

Let's make the web faster - tips from trenches @ Google
Ilya Grigorik • April 23, 2012 • Austin, TX

Google loves speed, and we want to make the entire web faster - yes, that includes your Rails app! We'll explore what we've learned from running our own services at scale, as well as cover the research, projects, and open sourced tools we've developed in the process.

We'll start at the top with website optimization best practices, take a look at what the browser and HTML5 can do for us, take a detour into the optimizations for the mobile web, and finally dive deep into the SPDY and TCP protocol optimizations.

We'll cover a lot of ground, so bring a coffee. By the end of the session, you should have a good checklist to help you optimize your own site.

RailsConf 2012

00:00:24.679 So I think we'll go ahead and get started. We have a lot to cover today, and what I want to talk about is, of course, speed, which is something we care a lot about at Google. Hopefully, that's something that you guys have experienced yourselves across many of our applications. Although I know we can still make them faster, in the process, we've spent quite a bit of time and effort developing better tools and methodology for making our web pages faster. The intent of this talk is to give you a glimpse into the process we use to analyze our own products and pages.
00:01:02.250 In terms of logistics, we're going to look at a variety of different apps, and I'm going to try to provide as many demos as possible, assuming the demo and Wi-Fi gods are with us. Hopefully, that will go well. At the end, there will be many links and resources embedded in these slides, with a link to the slides at the very end. So, don’t rush to write down every short link; it will all be provided.
00:01:35.909 Larry Page made an interesting statement some time ago: "Browsing should be as simple and fast as turning a page in a magazine." Now, that’s kind of odd. Is that really what we aspire to achieve—making web pages feel like handling dead media? Not quite. The metaphor here is that when you think of the user experience of turning a page in a magazine, it works quite well. There are usability issues that accompany magazines, but the fact is that when you flip the page, it doesn’t just stop halfway or fail to let you continue. When you flip it over, all the content is there; you don’t have to wait 10 seconds for the page to load, only to have the entire content shift down because an iframe kicked in. That’s the experience we want to emulate. We shouldn't stop at that; we should aim for even faster load times.
00:02:36.390 With that in mind, we have an entire team at Google that is quite large, called "Make the Web Fast." You can infer what we're trying to do. Perhaps the most interesting aspect of this team is that, while we spend a lot of time optimizing our own products, the actual mandate for this team is to make the web faster as a whole. We measure the performance of the entire web, which is amazing when you think about it.
00:03:04.320 The team's work spans kernel and networking optimizations, mobile, Chrome, infrastructure, data centers, and routers. It involves research, working with standards bodies, building open-source tools, and collaborating with communities like the Ruby community to help optimize the web. This is an exciting field to work in. Before we dive into the specifics of optimization, it's valuable to establish some baselines.
00:03:36.300 Every month or so a new case study or paper appears saying, "Hey, we lowered our page loading times, and our conversions or sales went up." It's great to hear these success stories, and I think we need to be reminded of them constantly. But the underlying thresholds are usability engineering 101: Jakob Nielsen published them back in 1993, after running studies to find usability constants for computer interfaces, and grouped them into a few high-level timing buckets. Here's the interesting takeaway: if you respond to a user within 100 milliseconds, it feels instant; humans can't discern much difference between 50 milliseconds and 200 milliseconds. Once you exceed 300 milliseconds, the interface starts to feel sluggish, like pressing a sticky button. Anything over one second significantly disrupts the user's mental engagement and can lead to loss of context.
00:05:14.340 We conducted some studies and shared these findings on the Google Analytics blog, analyzing hundreds of millions of web sessions—not just Google pages, but the web as a whole. We investigated average page load times for desktop versus mobile. The mean loading time for a desktop page today exceeds six seconds, while for mobile, it surpasses ten seconds. Though the medians are lower and that's a good sign, these figures represent a dire state of web performance.
00:05:58.080 Even if your average page load time is three to five seconds, that's still pretty bad. Think back to the magazine metaphor we discussed earlier: what kind of experience would it be if you had to wait three to five seconds every time you flipped a page? That's a terrible experience, and it shows how much room there is for improvement.
00:07:09.880 One hypothesis we might consider is whether the size of the pages could be contributing to this slow experience. We have a separate project maintained by Google employees called HTTP Archive. Every month we crawl some of the most popular pages on the web—thousands of them, including CNN and BBC—to collect data on all resources. We track the average size of a page, which has increased from 702 kilobytes in November of last year to just over 1,000 kilobytes. This means that the average page size today is about one megabyte, which is quite significant. Moreover, it takes approximately 84 requests to render a single average page of this size.
00:09:00.610 The requests break down into various types: roughly eight HTML documents (the page itself plus iframes pulling in additional HTML), a vast number of images that dominate the download size, along with JavaScript and CSS. The challenge is that while our pages are getting larger, our networks are not improving in speed at the same pace.
00:09:39.220 This isn't pessimism; it's simply a hypothesis based on observations. Page slowness can often be traced back to a series of interconnected factors. When a user types in a URL or clicks a link, the first step for the browser is to unload the existing content from the Document Object Model (DOM). Then it has to resolve DNS, a delay most users never realize is there. On top of that comes the TCP connection handshake, repeated across many of our 84 requests. For each request we have to send it, wait for the response, and parse the data that comes back.
00:10:13.800 The beautiful thing about web apps is that there is no install process, but the flip side is that the "install" effectively happens all over again every time a user loads the page. It's crucial to identify where the bottlenecks are in your pages, and the only way to do that is to measure everything.
00:10:58.180 Before we can optimize, we must understand what we're trying to fix. However, certain data, like DNS resolution time or TCP connection latency, has historically been out of reach because browsers didn't expose that information directly. Fortunately, the Navigation Timing specification, introduced last year, changes this: it gives us access to precise timing data right from the browser, without the need for plugins.
00:12:11.270 Each of these timers tells us how long a particular operation took, from DNS lookup to fetch time, and gives us insight into the user's actual connectivity. It's vital to keep in mind that what the user perceives is the sum of all these components; the time spent in your Rails stack or in the browser is only one slice of it.
00:12:59.279 What does the state of the Navigation Timing API look like today? It's supported in Internet Explorer, Firefox, and Chrome, with over 85 percent penetration. While the latest Android browser also supports it, some mobile browsers are still catching up. What this means is developers can access detailed timing information on how long pages take to load. For instance, inserting some JavaScript can help gather timing metrics and log them for analysis.
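As a rough sketch of what that instrumentation can look like, the snippet below reads window.performance.timing after the load event and beacons a few derived phases to a collector; the /perf-log endpoint is a made-up example, not something from the talk.

    // Minimal Navigation Timing sketch: compute a few phases and beacon them home.
    // Run after onload via setTimeout, otherwise loadEventEnd is still 0.
    window.addEventListener('load', function () {
      setTimeout(function () {
        var t = window.performance && window.performance.timing;
        if (!t) return; // older browsers without Navigation Timing

        var metrics = {
          dns:      t.domainLookupEnd - t.domainLookupStart,
          tcp:      t.connectEnd - t.connectStart,
          ttfb:     t.responseStart - t.navigationStart,
          domReady: t.domContentLoadedEventStart - t.navigationStart,
          pageLoad: t.loadEventEnd - t.navigationStart
        };

        // "/perf-log" is a hypothetical endpoint; swap in your own collector.
        new Image().src = '/perf-log?' +
          Object.keys(metrics).map(function (k) { return k + '=' + metrics[k]; }).join('&');
      }, 0);
    });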
00:14:30.660 With Google Analytics, we can automatically collect this data with no need for extra instrumentation. This service samples up to 10,000 unique visits per day, averaging 5 percent of all visits. If your site is smaller—say less than 10,000 daily unique visitors—you can set your sampling to 100 percent and receive reports on how users interact with your site, providing invaluable insights into performance.
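In the classic asynchronous ga.js snippet of that era, raising the sampling rate looked roughly like the following; UA-XXXXX-Y stands in for your own property ID.

    // Classic asynchronous ga.js configuration, circa 2012 (UA-XXXXX-Y is a placeholder).
    var _gaq = _gaq || [];
    _gaq.push(['_setAccount', 'UA-XXXXX-Y']);
    // Site speed is sampled by default; smaller sites can raise the rate to 100%.
    _gaq.push(['_setSiteSpeedSampleRate', 100]);
    _gaq.push(['_trackPageview']);
    // ...followed by the usual asynchronous ga.js loader snippet.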
00:15:34.200 So, I've been exploring the Google Analytics site speed reports on my own blog, and while I preach site speed, my average load time was a staggering 9.7 seconds. Google Analytics breaks this down further in its technical reports, which detail metrics such as average page load time, redirection time, server response time, and much more, all without any additional instrumentation.
00:16:54.790 Before I dive into specifics, I want to highlight how Google Analytics allows segmentation of traffic based on various criteria, like country, device type, browser, and more. This enables granular analysis of performance data, such as comparing page load times between visitors in Asia versus those in Europe. Such insights reveal that discrepancies are often due to varying resource availability or load times, offering plenty of room for optimization. I encourage you to explore these reports within your own Google Analytics accounts.
00:18:45.800 I'd also like to stress a point about averages in performance metrics: they can be very misleading. Web performance data tends to follow skewed distributions rather than normal ones, so it's essential to look at histograms of performance metrics rather than relying solely on average values. Averages obscure valuable information about the range of experiences real users actually encounter.
00:20:20.680 You can access histograms for specific metrics through Google Analytics; for example, page timings, performance, and responsiveness. These histograms reveal peculiar patterns, offering deeper insights into user experiences rather than just a simple average. One of my earlier experiences involved a complete redesign and migration of my blog, resulting in significant performance improvements visible in both histograms and average load times.
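To make the averages-versus-histograms point concrete, here is a toy sketch: the same set of load times yields an average that no user actually experienced, while simple bucketing exposes two distinct populations. The numbers are invented for illustration.

    // Toy illustration: an average hides a bimodal distribution of load times (in ms).
    var loadTimes = [900, 1100, 1000, 950, 8200, 7900, 8400, 1050, 8100, 1000];

    var average = loadTimes.reduce(function (a, b) { return a + b; }, 0) / loadTimes.length;
    console.log('average:', average); // 3860 ms, which no real user actually experienced

    // Bucket into 2-second-wide bins to approximate a histogram.
    var histogram = {};
    loadTimes.forEach(function (ms) {
      var bucket = Math.floor(ms / 2000) * 2;            // bucket start, in seconds
      var label = bucket + '-' + (bucket + 2) + 's';
      histogram[label] = (histogram[label] || 0) + 1;
    });
    console.log(histogram); // { '0-2s': 6, '6-8s': 1, '8-10s': 3 } — two distinct populations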
00:22:11.560 To recap, measuring user-perceived latency is crucial for optimizing user experience. Navigation Timing gives you unprecedented visibility into your metrics, while Google Analytics offers powerful advanced segments for further analysis. It's essential to set up daily or weekly reports to consistently track performance. Now that we've covered measurement, let's transition into optimization strategies, starting with several tools available in Google Chrome.
00:24:37.230 Google Chrome provides several built-in tools to help optimize web pages. For example, the Network tab displays a waterfall of requests, allowing you to analyze resource load times and the impact of latency on performance. You can also leverage the timeline profiler to visualize resource loading and execution that affects load times, capturing data like JavaScript execution, download times, and layout timings.
00:25:47.840 One helpful feature within Chrome’s developer tools is the ability to trace browser behavior, revealing the underlying processes and pinpointing inefficiencies. You can see how many times different functions are called, where there are delays, and how resources affect loading speed. This level of detail is invaluable for optimizing your pages, especially when it comes to understanding JavaScript execution patterns.
00:27:25.290 Moreover, Chrome DevTools is itself a web app, making it accessible and easy to understand. This means that if you are developing an application, you can leverage the same tools that power Chrome to analyze and optimize your application. This aspect is key, as developers continuously seek ways to enhance user experience while maintaining the responsiveness of their applications.
00:28:56.410 Remote debugging is also possible with Chrome, allowing you to connect your desktop to your mobile browser and view real-time network performance metrics right on your desktop. Such flexibility allows you to not only analyze performance but also troubleshoot issues rapidly as they arise in mobile viewing contexts.
00:30:56.300 Additionally, I want to emphasize that Chrome exposes a WebSocket- and JSON-based remote debugging protocol that lets developers extract detailed metrics from the browser. These metrics provide a wealth of information about resource loading and timing, bridging the gap between developer tooling and end-user experience.
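As a small sketch of what talking to that protocol looks like: if Chrome is launched with --remote-debugging-port=9222, it serves a JSON list of debuggable pages that a Node.js script can read; each entry carries a webSocketDebuggerUrl you can then attach to and send protocol commands such as Network.enable to stream timing events.

    // Sketch: list Chrome's debuggable tabs via the remote debugging protocol.
    // Assumes Chrome was launched with: chrome --remote-debugging-port=9222
    var http = require('http');

    http.get('http://localhost:9222/json', function (res) {
      var body = '';
      res.on('data', function (chunk) { body += chunk; });
      res.on('end', function () {
        JSON.parse(body).forEach(function (tab) {
          // Each tab exposes a webSocketDebuggerUrl; attach a WebSocket client to it
          // and send protocol commands (e.g. Network.enable) to stream timing events.
          console.log(tab.title, '->', tab.webSocketDebuggerUrl);
        });
      });
    }).on('error', console.error);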
00:32:19.730 The question then arises: how can we build better tools on top of the data these interfaces expose? Taking the conversation further, I will introduce you to webpagetest.org, another invaluable resource. This platform lets you run tests from different geographic locations and a variety of browsers, even IE 8 over a DSL connection, giving you insight into performance under real-world conditions.
00:34:06.790 WebPageTest's waterfall charts and content breakdowns provide granular views of the loading process, showing completed requests, DNS and connection times, and more, which allows you to identify specific areas for improvement. Furthermore, it's critical to measure visual completeness, that is, how quickly users can actually see and use your page, not just the overall load time.
00:36:05.640 We introduced the Speed Index metric, which quantifies how quickly the page visually populates across the timeline. One way to represent this visually is through film strips showing captures of the page states as it loads. This provides an insightful comparison between optimized and unoptimized loading patterns, revealing how swiftly key elements of a page are rendered.
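The idea behind Speed Index is simple enough to sketch: sample visual completeness over time (WebPageTest derives it from the film strip captures) and accumulate the area above the completeness curve, so pages that paint most of their content early score lower, which is better. The sample points below are invented for illustration.

    // Sketch of the Speed Index idea: integrate (1 - visual completeness) over time.
    // samples: pairs of [time in ms, fraction of the final visual state rendered].
    function speedIndex(samples) {
      var index = 0;
      for (var i = 1; i < samples.length; i++) {
        var interval = samples[i][0] - samples[i - 1][0];   // ms between captures
        var incomplete = 1 - samples[i - 1][1];             // how unfinished the page still was
        index += interval * incomplete;                     // area above the progress curve
      }
      return index; // lower is better: content appeared sooner
    }

    // A page that renders most content early scores better than one that pops in at the end.
    console.log(speedIndex([[0, 0], [500, 0.8], [1000, 0.9], [3000, 1]]));  // 800
    console.log(speedIndex([[0, 0], [500, 0.1], [1000, 0.2], [3000, 1]]));  // 2550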
00:37:49.209 To facilitate better load performance on your site, Google's PageSpeed offers a variety of tools, including an online service where you can analyze specific URLs for performance scores and granular recommendations to enhance your site’s speed. PageSpeed can evaluate caching practices, compression, and image optimizations, all critical factors for loading time improvements.
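The online analyzer could also be scripted. A rough sketch against what was then the PageSpeed Insights v1 API, assuming the endpoint and response fields shown here (treat them as an approximation) and an API key from the Google APIs console, might look like this from Node.js.

    // Sketch: query the PageSpeed Insights API (v1 era) for a URL's score.
    // Endpoint and response fields are approximations of the period API.
    var https = require('https');

    var target = encodeURIComponent('http://www.example.com/');
    var key = 'YOUR_API_KEY'; // placeholder: issued via the Google APIs console
    var path = '/pagespeedonline/v1/runPagespeed?url=' + target + '&key=' + key;

    https.get({ host: 'www.googleapis.com', path: path }, function (res) {
      var body = '';
      res.on('data', function (chunk) { body += chunk; });
      res.on('end', function () {
        var result = JSON.parse(body);
        console.log('PageSpeed score:', result.score);   // 0-100, higher is better
        // result.formattedResults.ruleResults lists the individual recommendations
      });
    }).on('error', console.error);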
00:39:51.210 For developers working locally, PageSpeed offers an open-source SDK and extensions compatible with various browsers, enabling you to run performance metrics directly from your workspace. Additionally, we're working on a service called PageSpeed as a Service, which automates optimizations for your site without requiring extensive developer intervention.
00:41:24.510 That said, I want to stress that optimizing web performance isn't a one-off activity; it's essential to keep maintaining speed and responsiveness over time. Although we have explored several tools today, there are many topics we haven't covered, such as mobile-specific strategies and JavaScript, CSS, and image optimizations. To conclude, remember to keep measuring user-perceived performance, aim for user-facing speed, and keep improving the tools at our disposal. I appreciate your time today and encourage all of you to explore these resources further. Thank you, and I would be happy to take any questions.
00:52:11.510 That's all, and I'm glad to have piqued your interest in optimizing web performance at your own sites, and I wish you all the best in your optimization endeavors.