When you think of page speed, what comes to mind?
For most people, page speed conjures up images of trying to access a page, only to be met with a frustratingly blank screen that’s loading… loading… loading… for more seconds than you’re willing to wait.
That’s certainly a huge component of page speed, but did you know that speed impacts your bot visitors as well as your human visitors?
Keep reading to learn what page speed really is and how it impacts both humans and bots, or jump to a specific section.
Slow page speeds can have negative ramifications for both users and bots, and both effects can hurt your organic search rankings and traffic.
Let’s dive into why that is.
Pages that load very slowly can cause users to get frustrated and leave your site, leading to higher bounce rates and lower conversions.
But what does that have to do with SEO?
According to Google’s Martin Splitt, “You don’t want to frustrate your users, and we as a search engine don’t want to have users frustrated. So for us, it makes sense to consider fast websites as a little more helpful to users than very slow websites.”
Because Google wants to provide a good experience to their users (searchers, AKA your potential website visitors), they consider speed as a factor in their ranking algorithms.
💡 When did Google announce speed as a ranking factor?
– April 9, 2010 (Desktop) “Today we’re including a new signal in our search ranking algorithms: site speed.”
– January 17, 2018 (Mobile) “Today we’re announcing that starting in July 2018, page speed will be a ranking factor for mobile searches.”
When it comes to algorithms, not all signals are created equal. Page speed, for example, is less important than the relevance of a page’s content.
Hearing again from Martin Splitt, “If you have bad content but you’re the fastest website out there, then that won’t help you.”
In other words, what good is a fast page if it doesn’t contain what the user is looking for?
Google’s John Mueller puts it this way, “We try to differentiate between sites that are significantly slow and sites within a normal range. When we’re looking at things that are really, really slow… that’s where algorithms might take action as far as how they show it in the search results. If you’re within the reasonable range… a couple of seconds or even half a minute, tweaking that isn’t going to have a direct effect on your rankings.”
Google also recently announced that they would be improving the way they factor page experience into rankings by creating a new signal that combines page experience factors like mobile-friendliness with Core Web Vitals metrics (Largest Contentful Paint, First Input Delay, and Cumulative Layout Shift).
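To make those Core Web Vitals metrics concrete, here’s a minimal sketch that buckets metric values using Google’s published “good” / “needs improvement” / “poor” boundaries (LCP at 2.5s/4.0s, FID at 100ms/300ms, CLS at 0.1/0.25). The function names are illustrative, not part of any official tool:

```python
# Illustrative sketch: classify Core Web Vitals values against
# Google's published "good" / "needs improvement" / "poor" boundaries.

def classify(value, good, poor):
    """Bucket a metric value using its good/poor boundaries."""
    if value <= good:
        return "good"
    if value <= poor:
        return "needs improvement"
    return "poor"

def core_web_vitals_report(lcp_seconds, fid_ms, cls):
    return {
        "LCP": classify(lcp_seconds, 2.5, 4.0),  # Largest Contentful Paint (seconds)
        "FID": classify(fid_ms, 100, 300),       # First Input Delay (milliseconds)
        "CLS": classify(cls, 0.1, 0.25),         # Cumulative Layout Shift (unitless)
    }

print(core_web_vitals_report(2.1, 150, 0.3))
# {'LCP': 'good', 'FID': 'needs improvement', 'CLS': 'poor'}
```

A page needs to clear the “good” threshold on all three metrics to pass the Core Web Vitals assessment.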
Although we tend to focus on the impact slow page speeds have on real human visitors, and how Google factors that into their ranking algorithm, there’s another side to the page speed coin.
Page speed impacts search engine bots too.
Just like human visitors, bots make requests to view your pages. This process is known as crawling, and it’s a necessary step if you want your content to get added to the search engine’s index where it can be found and clicked on by searchers.
But search engines have limited time and resources. They can’t crawl all the billions of pages on the web all the time. That’s why they give each site a crawl budget, which is the amount of time they can and will spend on your site in a given session. (Important note: crawl budget typically isn’t a problem for smaller sites).
Because there’s a limit on how much time Googlebot will spend on your site, page load times can impact your crawl budget. This means that new pages may not be discovered and existing pages may not be updated frequently enough to keep up with the pace of actual page changes.
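One practical way to see how much of your crawl budget Googlebot actually spends is to count its requests per day in your server access logs. The sketch below uses a simplified common-log-style line and a hypothetical bot check via the User-Agent string; a production version would adjust the regex to your server’s real log format and verify bot IP addresses:

```python
import re
from collections import Counter

# Hypothetical sketch: count Googlebot requests per day from access-log
# lines. The log format and regex are simplified for illustration.
LOG_LINE = re.compile(r'\[(\d{2}/\w{3}/\d{4})[^\]]*\] "GET [^"]*" \d+ .*Googlebot')

def googlebot_hits_per_day(log_lines):
    hits = Counter()
    for line in log_lines:
        match = LOG_LINE.search(line)
        if match:
            hits[match.group(1)] += 1  # key on the date portion of the timestamp
    return dict(hits)

sample_log = [
    '66.249.66.1 - - [01/Mar/2021:10:00:00 +0000] "GET /page-a HTTP/1.1" 200 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [01/Mar/2021:10:00:05 +0000] "GET /page-b HTTP/1.1" 200 "-" "Googlebot/2.1"',
    '203.0.113.9 - - [01/Mar/2021:10:00:07 +0000] "GET /page-a HTTP/1.1" 200 "-" "Mozilla/5.0"',
]
print(googlebot_hits_per_day(sample_log))  # {'01/Mar/2021': 2}
```

If the daily hit count trends down as your page load times trend up, that’s a signal your speed is eating into your crawl budget.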
We’ve run tests on this before, and can definitely confirm that page load times impact how Google crawls your site. In the example below, you can see that page load time has a drastic impact on crawl ratio, particularly for sites with more than 10,000 pages.
💡 Page Speed & The Mobile-First Index: Based on our research, we’ve also seen a positive correlation between page speed and the mobile-first index (MFI). Google has been transitioning very slow sites to the MFI at a lower rate than faster sites. You can read more on our MFI research here.
Bots and users have different needs. Bots, for instance, don’t need the images, styling, and interactive JavaScript that make pages pleasant for humans, they just need the content. Removing those things might make pages faster for bots, but could compromise the human experience. So, how do we reconcile the two?
One Google-approved solution to this problem is dynamic rendering.
Dynamic rendering sends fully-rendered content to search engines while serving human visitors with normal, client-side rendered content. It’s pre-rendering for search engine bots, and Google likes it because they get the same content that you’re sending to human visitors — just in a format that’s easier and faster for them to view! In fact, when done well, Googlebot wouldn’t even be able to detect that you’re using dynamic rendering.
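At its core, dynamic rendering is a routing decision made per request. Here’s a minimal, hypothetical sketch of that decision (this is not Botify’s implementation): inspect the User-Agent, and serve pre-rendered HTML to known bots while everyone else gets the normal client-side app. The bot list is a small sample; real setups use a fuller list and often verify bot IPs too:

```python
import re

# Hypothetical sketch of the dynamic-rendering routing decision.
# The bot list below is a small, illustrative sample.
BOT_PATTERN = re.compile(r"googlebot|bingbot|duckduckbot", re.IGNORECASE)

def should_prerender(user_agent: str) -> bool:
    """Return True if the request looks like a search engine bot."""
    return bool(BOT_PATTERN.search(user_agent))

def serve(user_agent: str) -> str:
    # Bots get fully rendered HTML; humans get the client-side experience.
    if should_prerender(user_agent):
        return "prerendered HTML"
    return "client-side app shell"

print(serve("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # prerendered HTML
print(serve("Mozilla/5.0 (Windows NT 10.0)"))            # client-side app shell
```

Because both branches ultimately deliver the same content, Google does not consider this cloaking.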
💡 Read more: What Is Dynamic Rendering & How Does It Impact SEO?
But does it work for preventing speed-related crawl budget issues? Based on real data from our SpeedWorkers customers, yes!
The end result? More unique pages crawled, pages refreshed in the index more often, new pages discovered sooner, and increases in organic traffic and revenue.
There are lots of tools on the market that can help you evaluate your pages’ performance.
Tools like Lighthouse and PageSpeed Insights measure lab data (collected in a controlled test environment, not what actual users see) as well as field (real-world) data. Both types of data can inform your optimization work, but Google has clarified: “We’re not using the Lighthouse score for ranking. We’re bucketing sites into ones that are really problematic, ones that are OK, and fast ones. You can see that in the speed report as well in Google Search Console.”
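The PageSpeed Insights v5 API returns both kinds of data in one JSON response: the Lighthouse lab result and the field data from real Chrome users. The sketch below pulls one signal of each kind out of a trimmed, made-up sample response; the JSON paths follow the public API’s documented shape, but verify them against the API reference before relying on them:

```python
# Hypothetical sketch: extract a lab signal and a field signal from a
# PageSpeed Insights v5 API response. `psi_response` is a trimmed,
# made-up sample; a real response has many more fields.

def summarize_psi(result):
    lab_score = result["lighthouseResult"]["categories"]["performance"]["score"]
    field = result.get("loadingExperience", {}).get("metrics", {})
    lcp_bucket = field.get("LARGEST_CONTENTFUL_PAINT_MS", {}).get("category")
    return {"lab_performance": lab_score, "field_lcp": lcp_bucket}

psi_response = {
    "lighthouseResult": {"categories": {"performance": {"score": 0.92}}},
    "loadingExperience": {
        "metrics": {"LARGEST_CONTENTFUL_PAINT_MS": {"category": "FAST"}}
    },
}
print(summarize_psi(psi_response))  # {'lab_performance': 0.92, 'field_lcp': 'FAST'}
```

Field data may be absent for low-traffic pages, which is why the sketch uses `.get()` with fallbacks for the `loadingExperience` branch.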
Botify’s performance reports can also show you load time distribution across your entire site.
You can also use Google Analytics to see which devices your visitors are using, and what load times those devices experience. You may even want to identify the most common device among your visitors and regularly test your site on it.
In the end, search engine optimization is all about providing the best experience for both your human visitors and search engine bots.
By being mindful of your page load times, and taking measures like dynamic rendering to make it faster for bots to view your pages, you’ll be in good shape for SEO success.