Traditional SEO decisions are a paradox: SEO managers keep trying to rank their content without knowing exactly which pages have been crawled by search engines, or each page's real ranking potential.
Surely it would be more effective to know how your website is really built, so you could focus only on the pages with real ranking potential?
At Botify, we can give you the exact number of pages on a website, where they are located in the structure, their categories, and many other SEO KPIs.
Let’s use an example and see how a structural analysis can benefit your SEO traffic and revenue! (Data has been modified and categorization anonymized.)
The first step is the discovery of your website by our crawler.
With Botify, it is really easy and as fast as you want. Just tell us the day, the time and the speed you want us to crawl your website, and we will send you a complete report with all the relevant data.
Our crawler first collects data and SEO statistics from each page it discovers. The first feature is simply to count the number of pages in the structure (do you know yours?).
Our crawler found more than 800,000 pages in the entire structure of this website (this was the first surprise for the SEO manager, who thought he had no more than 600,000 pages).
Second step: URLs are categorized into eleven dimensions. Categorization is mainly based on URL slugs, and it is the first thing we do after the crawl.
We aim to comprehensively understand the category a page belongs to (= its objective) and analyze statistics from a selected part of your website.
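As a rough illustration of slug-based categorization, the sketch below matches URL paths against a list of patterns and returns the first dimension that fits. The pattern names and paths are hypothetical examples; the actual Botify categorization rules are not shown here.

```python
import re

# Hypothetical slug patterns mapping URL paths to dimensions.
# These are illustrative only, not Botify's real rules.
CATEGORY_PATTERNS = [
    ("Top Product", re.compile(r"^/top-products?/")),
    ("Product", re.compile(r"^/products?/")),
    ("Navigation", re.compile(r"^/category/")),
    ("Search", re.compile(r"^/search")),
]

def categorize(path):
    """Return the first dimension whose pattern matches the URL path."""
    for name, pattern in CATEGORY_PATTERNS:
        if pattern.match(path):
            return name
    return "Other"

urls = ["/products/blue-widget", "/top-product/best-seller", "/search?q=widget"]
print([categorize(u) for u in urls])
# → ['Product', 'Top Product', 'Search']
```

Because the first matching pattern wins, more specific patterns (like "Top Product") must be listed before more general ones (like "Product").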
You can see in the graph below how the website is organized into dimensions:
The structure looks pretty clean: 51% of the website is composed of “Product” and “Top Product” pages. Navigation is no more than 15% of the whole content.
At this point, we can already highlight that 10% of the structure is composed of warning pages!
Warning pages are pages defined as unnecessary or dangerous for search engine crawls.
It means that 10% of the overall crawl is likely to be wasted on pages with no ranking potential (we have discovered websites with millions of warning pages… Trust us, the consequences were bad for their SEO).
These warning pages will have to be blocked from search engine bots.
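One common way to keep bots away from such pages is a set of disallow rules in robots.txt. The paths below are hypothetical examples, not taken from this site, and note that wildcard support in paths varies by search engine:

```
User-agent: *
# Hypothetical warning-page patterns to block from crawling:
Disallow: /search
Disallow: /*?sessionid=
```

Keep in mind that robots.txt only blocks crawling; pages that must also be removed from the index typically need a different treatment, such as a noindex directive served before the URL is disallowed.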
Discovering page distribution is only the first step. We can also detail how your website is organized by level of depth.
Where are my pages located in the structure?
A level of depth is the minimum number of clicks from the homepage to reach a page. If your content is reachable just a click away from the homepage, it means that it is at a level of depth 1.
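Since depth is the *minimum* number of clicks from the homepage, it can be computed with a breadth-first search over the site's link graph. Here is a minimal sketch on a toy graph (the URLs are made up for illustration):

```python
from collections import deque

def page_depths(links, home="/"):
    """Breadth-first search from the homepage: the depth of a page is
    the minimum number of clicks needed to reach it."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:  # first visit = shortest path
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Toy link graph: the homepage links to two categories,
# which in turn link to product pages.
links = {
    "/": ["/category/a", "/category/b"],
    "/category/a": ["/product/1"],
    "/category/b": ["/product/1", "/product/2"],
}
print(page_depths(links))
# → {'/': 0, '/category/a': 1, '/category/b': 1, '/product/1': 2, '/product/2': 2}
```

Because BFS visits pages in order of increasing distance, the first time a page is reached is guaranteed to be along a shortest click path, even when several paths lead to it (as with `/product/1` above).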
In this example, we discovered that more than 75% of pages were at or beyond the 5th level of depth.
Also of interest is where pages sit by depth and by category.
The “Top Products” category is largely positioned at the seventh level of depth. This may not be a good option for your SEO rankings… Would you say that the level of depth has a major influence on search engine crawls and your SEO traffic?
To conclude, this first post was just a short introduction to a small part of the data displayed by the Botify crawler.
We now have a better understanding of the site structure, but we still haven’t identified pages crawled by the bot “in reality” or pages that generate an SEO visit.
We’ll see in the next post how Botify can help.
Please leave your comments.