One question we always get asked is, “How do I speed up my site?” Unless your pages are empty there’s always more you can do to speed them up; the real questions are how fast your site is now, and how much time & effort to spend improving it.
Before trying to optimize you need to set a baseline for how fast your site is now. Maybe it works great on your high-speed connection from your office with the latest version of Chrome but is a disaster on mobile Safari. Maybe it seems really slow, but it turns out that’s only one or two pages out of thousands. How can you know whether you made it faster without setting that baseline?
We break that baseline down into two parts:
#1 – HTML page download speed.
Here we’re talking about how long it takes just to get the plain ol’ HTML of the page. No external images, JS, CSS, etc. Just like this from the command line:
$ time curl -s -o /dev/null http://www.yrsite.com
Or what you see from Google Webmaster Tools Crawl Stats:
This is also probably what you see from your uptime monitor (Pingdom, New Relic, Cloudwatch, etc.).
No doubt this number is critically important for many things, like checking whether your site is up and how fast Googlebot can crawl it. However, it’s an easy number to get and not really what you want as an end-user baseline (unless you have a text-only site).
The page download speed is usually only a fraction of the total time it takes before the user can interact with the page. For example, checking the amazon.com homepage, the HTML download took me ~400ms, but downloading the rest of the page took an additional 1.5s. The HTML was only about 20% of the total time, so clearly we should be looking at the total load time, which leads us to:
#2 – Full page load speed.
Now we’re talking about the real world. But, since it’s the real world, it’s more complicated too. It’s our real users who can save us, though! With the advent of HTML5’s Navigation Timing standard from 2012, available in modern browsers (sorry, IE 8), we can actually get reporting on how long our real users take to use the site. We no longer need the older generation of third-party monitoring tools that tell us how long they took to download our full page, now that we can get actual statistics from our actual users.
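As a sketch of what the browser exposes (this assumes the original `performance.timing` interface from the Navigation Timing spec; newer browsers also offer a richer `PerformanceNavigationTiming` entry), you can compute both the HTML-only time and the full load time your users actually experience:

```javascript
// Compute user-experienced timings from a Navigation Timing record.
// All fields are epoch milliseconds per the Navigation Timing spec.
function pageTimings(t) {
  return {
    // Time to fetch just the HTML (roughly what `time curl` measures)
    htmlDownloadMs: t.responseEnd - t.requestStart,
    // Time until the DOM is parsed and ready
    domReadyMs: t.domContentLoadedEventEnd - t.navigationStart,
    // Full page load, including images, CSS, JS
    fullLoadMs: t.loadEventEnd - t.navigationStart,
  };
}

// In a browser you'd read the real object once the page has loaded:
//   window.addEventListener('load', () => {
//     setTimeout(() => console.log(pageTimings(performance.timing)), 0);
//   });

// Sample record (hypothetical numbers) so the math is visible:
const sample = {
  navigationStart: 1000,
  requestStart: 1050,
  responseEnd: 1450,
  domContentLoadedEventEnd: 2200,
  loadEventEnd: 2900,
};
console.log(pageTimings(sample));
// → { htmlDownloadMs: 400, domReadyMs: 1200, fullLoadMs: 1900 }
```

Note the shape of those sample numbers: 400ms for the HTML out of a 1.9s total load, just like the amazon.com example above.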
If you’re using tools like Pingdom’s full page test or ShowSlow for ongoing metrics, you can do better. Those can still be very useful tools, but they don’t measure what your real users are seeing, so they’re not the baseline (and ongoing metric) we want.
This real user monitoring API is great, but we still need something to turn that data into actionable reporting. There’s one place you’re probably already getting it: Google Analytics’ Site Speed reports. These are built on that real browser data, the reports are pretty good as you’d expect from Google Analytics, and you get them for free with GA without having to turn on any new options or code. You’ve probably already looked at the data. But in my opinion these stats are not what I’d call user-friendly, with no clear path to actionable reporting.
First, they default to sampling only 1% of users. This is fine if you have a ton of traffic, but for small sites it gives noisy results. You can raise the sampling rate in your tracking code, but you’re still limited to a relatively fixed number of overall samples.
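For reference, raising the rate looks roughly like this with the classic analytics.js tracker (`siteSpeedSampleRate` is the relevant field; `UA-XXXX-Y` is a placeholder for your property ID). This is a config fragment, and note that GA still caps the total number of timing hits it will accept per day, which is the ceiling I mean above:

```javascript
// Sketch, assuming the standard analytics.js snippet is already on the page.
ga('create', 'UA-XXXX-Y', 'auto', {
  // Sample site-speed timings from 100% of pageviews instead of the 1% default.
  // GA still enforces a daily cap on timing hits, so a high-traffic site
  // won't actually record every user.
  siteSpeedSampleRate: 100
});
```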
My main problem is that the GA dashboard is just not geared towards reporting on speed in this way. It’s focused on site traffic, trends, and segmenting, not on being a speed and uptime operational dashboard. For this I really like where Pingdom’s Real User Monitoring is at (and now you know why I didn’t feel bad about calling out their full page test as outdated above).
To me the RUM dashboard is much clearer and really tells you, at a glance, where you stand on speed. I especially like the green/yellow/red breakdown of users, which can really help convey what the straight average from GA cannot.
They even differentiate between median and average, which is a great feature for getting an idea of what the typical user sees. That’s a nice touch if, for example, you had a bunch of back-end admin pages that were very slow to load by design and would otherwise throw off your stats.
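To see why that distinction matters, here’s a toy illustration with hypothetical load times: one very slow admin page drags the average way up, while the median stays close to what a typical visitor experiences:

```javascript
// Load times in ms for nine ordinary page views plus one slow admin page.
const loadTimesMs = [800, 850, 900, 900, 950, 1000, 1000, 1100, 1200, 12000];

const average = xs => xs.reduce((a, b) => a + b, 0) / xs.length;

const median = xs => {
  const s = [...xs].sort((a, b) => a - b);
  const mid = Math.floor(s.length / 2);
  return s.length % 2 ? s[mid] : (s[mid - 1] + s[mid]) / 2;
};

console.log(average(loadTimesMs)); // 2070: skewed by the one slow outlier
console.log(median(loadTimesMs));  // 975: close to the typical page view
```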
It’s just a well-thought-through dashboard that can help you set that baseline, so you can move on to the next step of actually improving your speed.
GA has a lot of functionality that Pingdom’s RUM will never be able to match: seeing speed broken down by area of your site, or filtering out internal traffic, or tying experienced speed to conversion rates — but if you want a quick clear view of where you are at for a speed baseline I think Pingdom is a great easy option.