I’ve been discussing, and sometimes fighting, the site speed “battle” for several years, sometimes to open ears and sometimes not. My position has always been that the speed of your site matters to users, who have only so much patience, and that it matters to Google, because the longer your site takes to crawl, the more expensive it is for them to index. Producing proof was always difficult because there was little internal data to support these positions, however much common sense they contained. As a result, the conversations typically either went nowhere or took months to even happen, let alone reach a resolution.
For those of you who are still having this discussion, I’ve put together some information for a couple of clients, and I hope it helps you illustrate some points.
Site Speed & Page Load
First, let’s talk about what exactly I mean by site speed. It’s important to fully understand this because each element is adjusted and improved in a different way. I break the discussion of site speed into three categories:
- Time to First Byte
- Download Time
- Page Load Time
The first item, Time to First Byte, is actually made up of four equally important measures, each covering some element of the technical environment delivering your pages. These measures are:
- Domain Lookup
- Redirection Time
- Server Connection Time
- Server Response Time
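To make these components concrete, here is a minimal sketch that times the first three of them using only the Python standard library. This is an illustration, not a production measurement tool: the hostname and path are whatever you pass in, and redirection time is left out since measuring it would require following `Location` headers.

```python
import socket
import time

def measure_ttfb(host, port=80, path="/"):
    """Return a dict of TTFB components, in seconds (redirects not followed)."""
    timings = {}

    # Domain lookup: resolve the hostname to an IP address.
    t0 = time.perf_counter()
    infos = socket.getaddrinfo(host, port, type=socket.SOCK_STREAM)
    ip, resolved_port = infos[0][4][:2]
    timings["domain_lookup"] = time.perf_counter() - t0

    # Server connection: open the TCP socket.
    t1 = time.perf_counter()
    sock = socket.create_connection((ip, resolved_port), timeout=10)
    timings["server_connection"] = time.perf_counter() - t1

    # Server response: send a bare HTTP request and wait for the first byte.
    t2 = time.perf_counter()
    request = f"GET {path} HTTP/1.1\r\nHost: {host}\r\nConnection: close\r\n\r\n"
    sock.sendall(request.encode("ascii"))
    sock.recv(1)  # blocks until the first byte of the response arrives
    timings["server_response"] = time.perf_counter() - t2

    sock.close()
    timings["total_ttfb"] = sum(timings.values())
    return timings
```

In practice you would run something like this (or a tool such as curl) repeatedly, from the regions your users are in, since a single measurement from your own desk tells you very little.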
The image below is a pretty good illustration of the full delivery time of your page(s):
The reason I break page speed into these three buckets is fairly simple. The TTFB metrics on the left are all indicators of how the technical environment is set up and optimized: how much time do your servers take to wake up and start sending page content to the user?

The second item, Download Time, is an indicator of how the pages are coded, how complex they are and how much material you are delivering to your users. Remember that about half or more of your users are coming in on a phone, and some of them are not on wi-fi.

The final arrow in the diagram represents Page Load Time, which takes the data delivered in stage two and pulls in all of the elements of your page, whether from your servers or from third parties. This is typically the most complicated stage to have a conversation about, since you may not control the elements being delivered here: ads and items like social share buttons all rely on third parties. Those conversations can be tough, since ads help you monetize and stay in business, and share buttons or content distribution network widgets help drive additional traffic.
All of the page timings illustrated above are available in Google Analytics, for all of your traffic, including the communication time needed to deliver page content from the server to users wherever they are. Keep in mind that these are blended numbers: if your site is hosted on a machine in Virginia while half of your audience is in the US and half is in India, the averages mix two very different experiences. If you are most concerned with your US users (or your Indian users), you will need a separate testing environment to give you that information.
Google’s recommendations are likely to keep getting stricter. The current recommendation is a TTFB under 0.2 seconds; 0.5 seconds is considered ‘not ideal’, and anything above that figure is likely to be considered bad. If your site falls into the bad group, you really need to start thinking about how to change things for the better.
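The thresholds above can be captured in a trivial helper, useful when bucketing a log of measured TTFB values. The labels are my own shorthand for the three bands just described, not official Google terminology.

```python
def grade_ttfb(seconds):
    """Bucket a TTFB measurement per the thresholds discussed above."""
    if seconds <= 0.2:
        return "good"       # within the recommended 0.2 s
    if seconds <= 0.5:
        return "not ideal"  # tolerated, but worth improving
    return "bad"            # likely to hurt both users and crawling

# Example: grade a batch of measurements (values are invented).
samples = [0.15, 0.42, 1.3]
print([grade_ttfb(s) for s in samples])
```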
I highly recommend reviewing several of your pages (obviously the important ones) with the Google PageSpeed Insights tool available here. It will point out pieces of your code that could be improved, which can help you decrease the time it takes to download your pages. Some typical suggestions I’ve seen for pages that take too long to download are:
Speed Up Page Download
- Enable compression on your server
- Minify all JS, CSS and HTML
- Minimize or eliminate page-specific CSS and JS
- Reduce redirects
- Utilize browser caching
- Use a CDN (e.g. Akamai)
- Optimize all images
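The first suggestion on the list, server compression, is usually the cheapest win, because text assets like HTML, CSS and JS compress very well. A quick sketch of the effect, using Python's standard `gzip` module and a made-up stand-in for a real page:

```python
import gzip

# Invented sample markup; real pages are full of similarly repetitive tags.
html = ("<html><head><title>Example</title></head><body>"
        + "<div class='row'><p>Repeated page content</p></div>" * 200
        + "</body></html>").encode("utf-8")

compressed = gzip.compress(html)
ratio = len(compressed) / len(html)
print(f"original: {len(html)} bytes, gzipped: {len(compressed)} bytes "
      f"({ratio:.0%} of original)")
```

On real pages the savings are smaller than on this deliberately repetitive sample, but reductions of well over half the transfer size are common, which directly shrinks the Download Time bucket.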
If you receive any “bad grades” here, the tool offers a lot of suggestions; I will review some of them in a future post.
Now let’s finally get to the point of this post: why it’s important to have a fast site. Let’s talk about why Google wants your site to be fast and what they do if they deem it slow. Google always talks about the user experience and thinking about the user. When users get impatient, they are very likely to abandon a site, go back to Google and click on the next most interesting link. This is a ‘bad user signal’ that Google uses to adjust the results on a SERP over time.
Google also has a selfish reason for this. They have something often referred to as a crawl budget: an allotment of time they will spend crawling and analyzing your site. If your pages take too much time, they will crawl less of your site over time. Below is an illustration of what this looks like.
This image shows information available within Google Search Console (formerly known as Webmaster Tools). The top chart shows the number of pages Google crawled per day during the 90-day period, and the bottom chart the time spent downloading each individual page, in milliseconds.
The black trend lines indicate that as the time per page increased during the middle one-month block, the number of pages crawled per day decreased. In this illustration, the average time per page went from about 1.3 seconds up to nearly 2.0 seconds, and then moved higher. Remember that Google’s recommendation is 0.2 seconds. The declining crawl trend continued into the third month.
If you have a large site that you want fully represented in the Google index, you need to stay within the guidelines Google recommends. This matters because when you make site-wide modifications, you will want Google to update the index as quickly as possible; if your crawl budget covers only a small subset of your site, it can take weeks for Google to read, analyze and refresh your site within its index.
I think this is a pretty good illustration of how page speed matters from a crawl perspective. Your own internal data is not likely to depict this as vividly, so I encourage you to pass this around to people who can help you improve things. Now let’s move on to the user.
There are a number of metrics you can use to illustrate what users are doing. If your server timing shows little to no fluctuation, it can be difficult to demonstrate problems with your own data, since there are few variations to point to. What I generally do is look at the engagement-timing measures within my analytics package. Most packages, like Google Analytics and Omniture, break engagement time into the following buckets: 0-10 seconds, 11-30 seconds, 31-60 seconds, 61-180 seconds, 181-600 seconds, 601-1800 seconds and over 1800 seconds. I then collapse these into three groups. The first three buckets, covering 0 to 60 seconds, I label ‘Abandoned’. The last group, over 1800 seconds, is typically employees, trolls, or bots that continually monitor what is going on. The middle group, from 61 up to 1800 seconds (one minute to 30 minutes), is what I label the ‘Sweet Spot’ of your visitors. This is where your real visitors are represented: users who have (hopefully) let your page load and actually begun to engage with your content.
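The regrouping described above is a one-line mapping once you export the per-bucket visit counts from your analytics package. A small sketch, with invented visit counts for illustration:

```python
# Map each analytics engagement-time bucket to one of the three groups.
BUCKET_GROUPS = {
    "0-10s": "Abandoned",
    "11-30s": "Abandoned",
    "31-60s": "Abandoned",
    "61-180s": "Sweet Spot",
    "181-600s": "Sweet Spot",
    "601-1800s": "Sweet Spot",
    "1800s+": "Other",  # employees, trolls, monitoring bots
}

def regroup(visits_by_bucket):
    """Collapse per-bucket visit counts into the three groups."""
    groups = {"Abandoned": 0, "Sweet Spot": 0, "Other": 0}
    for bucket, visits in visits_by_bucket.items():
        groups[BUCKET_GROUPS[bucket]] += visits
    return groups

# Invented example export from an analytics package:
sample = {"0-10s": 5200, "11-30s": 900, "31-60s": 400,
          "61-180s": 1500, "181-600s": 700, "601-1800s": 200, "1800s+": 60}
print(regroup(sample))
```

The resulting three numbers are what I chart over time: a shrinking Sweet Spot relative to Abandoned is the user-side signal that speed is costing you visitors.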
The illustration to the right shows how poor site speed can affect your overall traffic picture. It depicts two sites hosted at the same facility with very similar content. Site A had ‘some’ speed improvements made during the period illustrated, while Site B did not. As you can see, the majority of Site B’s users never even let the content load before abandoning the site, losing over three quarters of visits before anything is read. Site A is a little better, but still not optimal, with nearly half of the audience abandoning within a minute.
This is a somewhat extreme illustration of what can and will happen to your traffic with a slow or unresponsive website. My recommendation to this client was to port the speed improvements over to Site B and continue improving performance on Site A, so that all of the work being done to generate this traffic does not go to waste.
If you’ve been involved in user experience, product or search optimization over the last several years, it’s highly likely you’ve been in one of these discussions. I hope these illustrations help you have some of the difficult conversations about your traffic, your site and how to make things better. Your own data might not be helpful, but the information here is from live sites that have faced the same challenges you likely have.
More to come and have a great holiday.