Crawl, render, index – it’s the foundation of Botify’s methodology. If you’re not optimizing for these steps, you run the risk of all your other work being ineffective.
This is especially true for large enterprise websites with millions of URLs. Google doesn’t have the resources to crawl the entire web constantly, so enterprises face the challenge of making sure Google finds and indexes all their important content. Unfortunately, what we often find is that Google misses half of the content on these websites.
Whether you’re unsure how much of your content is being seen by Google or you already know you have issues in this area, read on to learn about crawl and render rate, and how improving them can boost your organic traffic and revenue.
Crawl rate is the percentage of URLs on your site that have been crawled by Google. In Botify, we consider crawl rate within a 30-day period. If Google hasn’t crawled a page within the last 30 days, it’s considered not crawled.
We’ve been promoting the idea of crawl rate for years; it’s one of the foundations upon which we built our platform. Knowing your crawl rate is critical because, if Google isn’t crawling your content, that content has no chance of showing up when people search for it, and content that doesn’t show up to interested searchers can’t produce organic traffic or revenue. With roughly half of enterprise websites’ content un-crawled by Google, enterprises have a huge opportunity to maximize revenue from the organic channel by optimizing their crawl rate.
To calculate website crawl rate, you’ll need to know how many URLs are on your website as well as how many of them have been crawled by Googlebot. If you’ve connected your log files to LogAnalyzer, you can see this automatically in our Venn diagram visualization.
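Under the hood, the calculation is simple set arithmetic. Here is a minimal Python sketch, assuming you already have the list of URLs in your site structure and the list of URLs Googlebot requested (the function name and sample URLs are illustrative, not part of any Botify API):

```python
# Illustrative sketch: compute crawl rate from a site's known URLs and
# the set of URLs Googlebot requested (e.g. parsed from server logs).

def crawl_stats(site_urls, googlebot_urls):
    """Return crawl rate plus the three Venn diagram segments."""
    site = set(site_urls)
    crawled = set(googlebot_urls)
    both = site & crawled          # in structure AND crawled by Google
    site_only = site - crawled     # in structure but not crawled
    google_only = crawled - site   # crawled by Google but not in structure
    rate = len(both) / len(site) if site else 0.0
    return rate, both, site_only, google_only

# Sample data for illustration only.
site_urls = ["/", "/products", "/products/a", "/products/b"]
googlebot_urls = ["/", "/products", "/old-page"]

rate, both, site_only, google_only = crawl_stats(site_urls, googlebot_urls)
print(f"Crawl rate: {rate:.0%}")  # 2 of 4 site URLs crawled -> 50%
```

The two “only” segments are exactly the two sides of the Venn diagram discussed below: pages Google is missing, and pages Google is crawling that aren’t in your structure.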
This makes it easy to see which parts of your website Google isn’t seeing. Knowing how many of your pages fall into the “crawled by Botify only” segment (pages found in your site structure but untouched by Google) is the first step toward optimizing for crawling: you won’t know what to fix until you’ve diagnosed a crawl rate problem.
Once you’ve identified an issue with how Google is crawling your website, the next logical questions are “why?” and “how do I fix it?”
To answer those questions, we’ll need to dive into both sides of the Venn diagram to see which pages Google is and isn’t spending time on. The main goal of this exercise is to do whatever we can to get Google’s attention on the right pages. For example, there may be money-making pages (e.g., product pages) that Google is ignoring, and unimportant pages (e.g., duplicate content) that Google is spending unnecessary time on.
One of the big things we notice when diagnosing crawl issues is that crawl rate tends to decrease by page depth. In other words, the deeper a page is on your website, the less likely Google will crawl it. This is something that you can easily check for in Botify.
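The depth check reduces to grouping pages by click depth and computing the crawl rate within each group. A hedged Python sketch, where the `pages` records are illustrative stand-ins for data you would export from a site crawl and your log files:

```python
from collections import defaultdict

# Illustrative sketch: crawl rate by page depth. Each record pairs a URL's
# click depth (from a site crawl) with whether Googlebot requested it in
# the last 30 days (from log files). Sample data only.
pages = [
    {"depth": 1, "crawled": True},
    {"depth": 1, "crawled": True},
    {"depth": 2, "crawled": True},
    {"depth": 2, "crawled": False},
    {"depth": 3, "crawled": False},
    {"depth": 3, "crawled": False},
]

def crawl_rate_by_depth(pages):
    """Map each depth to the fraction of its pages Googlebot crawled."""
    totals, crawled = defaultdict(int), defaultdict(int)
    for p in pages:
        totals[p["depth"]] += 1
        crawled[p["depth"]] += p["crawled"]
    return {d: crawled[d] / totals[d] for d in sorted(totals)}

for depth, rate in crawl_rate_by_depth(pages).items():
    print(f"depth {depth}: {rate:.0%} crawled")
```

With the sample data the rate falls from 100% at depth 1 to 0% at depth 3, mirroring the pattern described above.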
There are lots of other reasons why Google might not be crawling your pages: maybe they lack freshness, or maybe they’re mistakenly blocked by your website’s robots.txt file. Whatever the issue, you can diagnose it by drilling down into the Botify crawl Venn diagram.
An added benefit of crawl optimization is that it gets not only Google’s attention on the right pages, but ours as well. For example, you might find that certain pages on your website are driving traffic but aren’t in the site structure; to maximize their value, you could add them to it.
If you own or manage some aspect of an enterprise website and you aren’t sure how Google is seeing (or not seeing) your pages, evaluating your crawl rate is key to ensuring that you’re getting all the value out of your website that you possibly can.
Render rate is the percentage of total content that Google has rendered. If you’re not familiar with the concept of rendering, we encourage you to check out our article From Crawl Budget to Render Budget. In it, we explain that Google has a budget for rendering just like they have a budget for crawling. Because of their finite resources, they use a “second wave of indexing.”
So what percentage of your content is being rendered? Calculating your render rate is currently somewhat of a manual process.
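As one illustrative approach (not necessarily Botify’s exact process), you could take a sample of JavaScript-dependent pages, search Google for a text snippet that exists only in the rendered DOM of each one, and record whether that snippet was indexed. The estimate then reduces to a simple ratio:

```python
# Illustrative sketch: estimate render rate from a manually checked sample.
# "rendered" records whether a JS-only text snippet from the page was
# found in Google's index. URLs and results are sample data only.
sample = [
    {"url": "/products/a", "rendered": True},
    {"url": "/products/b", "rendered": False},
    {"url": "/products/c", "rendered": True},
    {"url": "/products/d", "rendered": False},
]

render_rate = sum(p["rendered"] for p in sample) / len(sample)
print(f"Estimated render rate: {render_rate:.0%}")  # 50% in this sample
```

The larger and more representative the sample of JavaScript-dependent pages, the more reliable the estimate.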
Just like with crawl rate, knowing that your site has a rendering problem will prompt the questions “why?” and “how can I fix it?”
If Google spends a large share of its requests fetching page resources such as JavaScript and CSS, it’s a good idea to try to reduce that percentage, making it simpler for Google to crawl and render your site.
You’ll want to ask, “Is there a way to serve Google the same amount of important content while serving fewer resources that it has to render?” Some of the enterprise organizations we’ve worked with have answered this with prerender or dynamic rendering solutions.
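A dynamic rendering setup hinges on one routing decision: known crawlers get a prerendered, static HTML snapshot, while regular users get the normal JavaScript app. A hedged sketch of that decision in Python (the bot list is illustrative, not exhaustive):

```python
import re

# Illustrative sketch of the routing decision behind dynamic rendering:
# serve prerendered HTML to known rendering bots, and the normal
# JavaScript application to everyone else.
BOT_PATTERN = re.compile(
    r"googlebot|bingbot|yandex|baiduspider|duckduckbot", re.IGNORECASE
)

def should_serve_prerendered(user_agent: str) -> bool:
    """Return True if this request should get the prerendered snapshot."""
    return bool(BOT_PATTERN.search(user_agent or ""))

ua = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
print(should_serve_prerendered(ua))  # True
print(should_serve_prerendered("Mozilla/5.0 (Windows NT 10.0) Chrome/120"))  # False
```

In a real deployment this check would sit in your web server or CDN layer, and the snapshot served to bots must contain the same content users see.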
Google will likely render your content eventually, but with many enterprise organizations such as publishers, time to indexation is critical. When your content is timely and needs to be discoverable in search engines as soon as possible after publishing, you need to care about render rate.
If Google isn’t crawling and rendering your important, revenue-driving content, all the on-page SEO strategies in the world won’t help you correct course. The bad news is that large enterprise organizations are often operating at half capacity, driving traffic and revenue from just a portion of their pages. The good news is that this means there’s immense traffic and revenue potential in optimizing your website for discoverability.
When you make it easier for Google to find and understand your content, they’ll reward you. An enterprise-grade SEO tool like Botify can help. Book your demo with us today and learn how we can help you optimize for crawl and render rate!