We know that Google has massive crawling, rendering, and indexing capabilities, and great big banks of servers all over the world. That doesn’t mean the search engine’s resources are unlimited. Vast, but not unlimited.
When you have a finite resource – like time, money, or bandwidth – you budget it. When the SEO world talks about “crawl budget,” they’re talking about the resources that Google will spend on crawling, rendering, and indexing a specific website.
From an SEO perspective, the real-world impact of crawl budget shows up in how many of your pages get indexed and how quickly updates are reflected in the search results and cache.
The following metrics can help you learn more about your site’s crawl budget:
Crawl budget has lagging indicators as well. If a page isn’t crawled, for example, it won’t be indexed. If it isn’t indexed, it won’t rank or earn any organic search traffic. On the flip side, crawl budget improvements can improve not only your leading metrics (e.g. total number of pages crawled) but also your lagging indicators (e.g. rankings and traffic).
If you’re not sure where to start, take a look at the deep pages on your site — the pages that take 5+ clicks to get to from the home page. Typically, the deeper a page is on the site, the less frequently search engine bots will crawl it.
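Click depth is straightforward to compute once you have your site’s internal link graph. Below is a minimal sketch that runs a breadth-first search over a toy in-memory graph (the URLs are hypothetical); in practice you would build the graph from a crawl export or your sitemap rather than hard-coding it.

```python
from collections import deque

def click_depths(link_graph, home="/"):
    """Breadth-first search from the home page: each page's depth
    is the minimum number of clicks needed to reach it."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in link_graph.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Toy link graph: each key links out to the pages in its list
# (all URLs here are hypothetical examples).
site = {
    "/": ["/category", "/about"],
    "/category": ["/category/page-2"],
    "/category/page-2": ["/category/page-3"],
    "/category/page-3": ["/deep-product"],
}

depths = click_depths(site)
deep_pages = [url for url, d in depths.items() if d >= 4]
print(deep_pages)  # pages 4+ clicks from the home page
```

Pages that never appear in the result at all are orphans — not reachable by internal links — which is usually an even bigger crawl problem than depth.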
Google is budgeting its resources to determine how frequently to return to your site, how much time to spend on it, and how deep to go into it. If you want to improve those searchbot behaviors, you’ve got to influence them.
According to Google, they determine crawl budget by looking at crawl rate limit and crawl demand.
Crawl rate limit is a combination of factors that, when taken together, determine when Google will stop crawling your website. Factors that influence crawl rate limit include:
This is where collaborating with your development team is crucial, since they are often the ones that can help make page speed improvements, server improvements, and other technical changes to the website.
The other factor Google looks at to determine crawl budget is crawl demand. Factors that influence crawl demand include:
The concept of “crawl budget” has been described as an allotment of time the bot spends on a website. If your pages are too slow, it necessarily spends more time fetching and rendering them, rather than skipping merrily along to the next page. If the landing page it finds when it gets there is old, repetitive, or low quality, it will keep going, but eventually it will decide to spend less time on the site the next time it visits.
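One way to see how much attention Googlebot actually gives your site is to count its requests in your server access logs. The sketch below parses hypothetical common-log-format lines and matches on the user-agent string alone; a production version should also verify the bot by reverse DNS, since anyone can fake a user agent.

```python
import re
from collections import Counter

# Pull the day portion out of a common-log-format timestamp, e.g. [01/Mar/2021:...]
LOG_DATE = re.compile(r'\[(\d{2}/\w{3}/\d{4})')

def googlebot_hits_per_day(log_lines):
    """Count requests per day whose user agent identifies as Googlebot."""
    counts = Counter()
    for line in log_lines:
        if "Googlebot" in line:
            match = LOG_DATE.search(line)
            if match:
                counts[match.group(1)] += 1
    return counts

# Hypothetical access-log lines for illustration.
sample = [
    '66.249.66.1 - - [01/Mar/2021:10:00:00 +0000] "GET / HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '66.249.66.1 - - [01/Mar/2021:10:00:05 +0000] "GET /category HTTP/1.1" 200 4096 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '203.0.113.7 - - [01/Mar/2021:10:01:00 +0000] "GET / HTTP/1.1" 200 5120 "-" "Mozilla/5.0"',
]
print(googlebot_hits_per_day(sample))  # Counter({'01/Mar/2021': 2})
```

Trending this count over weeks, alongside which URLs the bot requests, tells you whether crawl budget changes are actually landing.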
There are a few top contenders for problem areas when it comes to crawl budget:
Faceted navigation is usually the worst culprit for crawl budget issues, as it is easy to create without noticing. When search filters, parameters, and on-site searches create new URLs that are accessible to the Googlebot, potentially hundreds of thousands of URLs are being created and indexed. Some of these pages may be of value to your potential search users, but certainly not all of them are. The key with faceted navigation is to be deliberate and strategic about it: decide which pages you will link to, optimize, and differentiate from everything else on your site. The remainder can be marked noindex, or better yet, you can suppress the generation of new unique URLs with those parameters entirely.
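If you decide to block crawling of parameterized URLs outright, robots.txt is one place to do it. A minimal sketch, assuming hypothetical `color` and `sort` filter parameters and an on-site search path — your own parameter names will differ:

```text
# robots.txt — keep bots out of hypothetical filter, sort,
# and on-site-search URL spaces
User-agent: *
Disallow: /*?color=
Disallow: /*?sort=
Disallow: /search?
```

One caveat: Disallow stops crawling, not indexing. A page you want dropped from the index needs a noindex directive instead, and Google must be able to crawl the page to see that directive — so don’t block and noindex the same URLs.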
Most of the time, the Googlebot crawling and indexing your pages first is the smartphone (mobile) bot. This matters when the internal link structure and navigation on your mobile site differ from the desktop site: pages the mobile bot can’t discover and crawl are likely to wait a long while to be indexed.
As above, the quality and value of the pages for search users impact the amount of time Google will spend on a site. Strategic pruning, redirecting, and noindexing of low-quality landing pages takes time, but can pay off in terms of crawl budget, not to mention overall site quality.
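Redirecting a pruned page is typically a one-line server rule. A sketch for nginx, with hypothetical paths — point each retired URL at its closest surviving equivalent rather than dumping everything on the home page:

```text
# nginx — permanently redirect a pruned low-quality page
# (both paths are hypothetical examples)
location = /old-thin-page {
    return 301 /category/consolidated-guide;
}
```

The 301 status passes the old page’s equity to the target and tells crawlers to stop requesting the retired URL, so over time the bot spends its visits on pages you actually want crawled.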