
Technical SEO

Website Depth: What's At The Bottom Of Your Iceberg?

October 23, 2014
Annabelle Bouard
Director of Education & Training Services

Could there be key content at the bottom of your website? Page depth, the minimum number of clicks needed to reach a page from the home page, is an important indicator for SEO. But what also needs to be considered is which types of pages are deep. Your website can be quite deep overall and still manage to present content pages, such as products or articles, at reasonable depths. Or, on the contrary, the deep pages could include content pages, which, as a result, are not crawled and generate no organic traffic.

The typical scenario is as follows:

  • There are long paginated lists for top categories or high level tag pages
  • There are shorter paginated lists for subcategories
  • Some content pages are also linked from other content pages (similar products, related articles, and other types of suggestions).
  • Content pages can also be linked from top-level pages such as the home page or section homes, for new content or promotions - but as these links are temporary by nature, they are not enough for search engines.

So the longest lists may include very deep pages, but with well-built navigation and internal linking, content pages linked from these long lists should also be accessible through other, shorter paths. What matters in the end is how deep your content pages are - all the more so if long-tail traffic is important for you.

With Botify Analytics, it is very easy to:

  • Check how deep pagination goes
  • Check how deep non-paginated pages go and where they are linked from - lists or other content pages.
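
Before diving in, it helps to be precise about what "depth" means. Here is a minimal sketch of the computation, assuming a simplified site modeled as a hypothetical link graph ({url: [urls it links to]}); it illustrates the concept, not how Botify computes depth internally:

```python
# Minimal sketch: page depth as the minimum number of clicks from the
# home page, via breadth-first search over a hypothetical link graph.
from collections import deque

def page_depths(link_graph, home_url):
    """Return {url: depth}, depth being the shortest click path from home."""
    depths = {home_url: 0}
    queue = deque([home_url])
    while queue:
        url = queue.popleft()
        for linked in link_graph.get(url, []):
            if linked not in depths:  # first time seen = shortest path (BFS)
                depths[linked] = depths[url] + 1
                queue.append(linked)
    return depths

# A product linked only from page 2 of a category list sits one click
# deeper than a product linked from the category page itself.
link_graph = {
    "/": ["/category"],
    "/category": ["/category?page=2", "/product-a"],
    "/category?page=2": ["/product-b"],
}
print(page_depths(link_graph, "/"))
# {'/': 0, '/category': 1, '/category?page=2': 2, '/product-a': 2, '/product-b': 3}
```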

Check Pagination Depth

First, on the Botify Analytics report home page (Dashboard tab), verify that the crawler explored all pages. It could have stopped before reaching the deepest pages, if it hit the maximum number of crawled pages set in the crawl setup before getting to the bottom of the website. To find out, check whether there were still URLs in the crawler's queue when it stopped.

A full crawl is of course preferable, both for analysis accuracy and for the operational value of exported data. If there are still URLs in the queue, but far fewer than the number of URLs crawled, depth findings should nevertheless remain valid; just keep in mind that the analysis results are an understatement.

In the Distribution tab of the report, click on the "URLs by depth" block.

You will get a list of URLs sorted with the deepest first, which may immediately give you a sense of what the deepest pages are. Here is an extreme example, with a depth that almost reaches 1,000 clicks.

In less extreme cases, however, the deepest pages are perhaps not all pagination. To focus on pagination, click on "Explore all URLs". This will bring you to the URL Explorer, where you can select and display additional information. Change the filter at the top of the page (which is set to display all depths and can be removed) to match URLs with your pagination parameter.

Let's take the example of a travel website. In this example, the parameter is called "page".

This will display only URLs which have a query string (the part of a URL that includes parameters, found after a "?") where the query string includes a "page" field name. For instance: http://www.mywebsite.com/hotels/rome-C5611?page=43

Then click on Apply (in the example, we removed the URL's main image and title information from the results table to have less information to anonymize).
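
As a side note, the same filter logic is easy to express in code. Here is a minimal sketch using Python's standard library, with made-up example URLs; it mirrors the "query string contains a 'page' field" filter, not Botify's implementation:

```python
# Minimal sketch of the filter: keep only URLs whose query string
# contains a "page" parameter (example URLs are made up).
from urllib.parse import urlparse, parse_qs

urls = [
    "http://www.mywebsite.com/hotels/rome-C5611?page=43",
    "http://www.mywebsite.com/hotels/rome-C5611",
    "http://www.mywebsite.com/search?q=rome&page=2",
]

paginated = [u for u in urls if "page" in parse_qs(urlparse(u).query)]
print(paginated)
# ['http://www.mywebsite.com/hotels/rome-C5611?page=43',
#  'http://www.mywebsite.com/search?q=rome&page=2']
```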

We get information about pagination's depth distribution and HTTP status codes, directly through the results tabs.
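
For reference, the depth distribution itself is simple to derive once you have a URL-to-depth mapping like the one from the earlier sketch:

```python
# Sketch: depth distribution (depth -> number of URLs), reusing the
# kind of mapping returned by the page_depths example above.
from collections import Counter

depths = {"/": 0, "/category": 1, "/category?page=2": 2, "/product-b": 3}
distribution = Counter(depths.values())
for depth in sorted(distribution):
    print(f"depth {depth}: {distribution[depth]} URL(s)")
```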

We can also analyze pagination further:

  • Add a filter to look only at a type of page, for instance URL contains "/search"
  • If other parameters may be present and if the information could help your investigation (search query, navigation facet...), add "query string keys" to the displayed fields to see all parameter names. In the URL Explorer's results table, the query string keys, or parameter names, are displayed in a single column, separated by commas; they are exported as separate columns to make data analysis easier (see the sketch below).
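
Here is a minimal sketch of those two representations, again assuming plain URLs rather than a Botify export:

```python
# Sketch: query string keys shown comma-separated (display) and as
# separate CSV columns (export). Example URLs are made up.
import csv
import sys
from urllib.parse import urlparse, parse_qs

urls = [
    "http://www.mywebsite.com/search?q=rome&page=2&facet=pool",
    "http://www.mywebsite.com/hotels/rome-C5611?page=43",
]

for u in urls:
    keys = list(parse_qs(urlparse(u).query))
    print(u, "->", ",".join(keys))  # display: e.g. "q,page,facet"

writer = csv.writer(sys.stdout)
for u in urls:
    writer.writerow([u] + list(parse_qs(urlparse(u).query)))  # export: one key per column
```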

For more about pagination, check out these tips to reduce pagination and the top 5 pagination mistakes, and read about how to monitor paginated pages with Botify Log Analyzer.

Now that we have a clear idea of the situation regarding deep pagination, let's see if the deep pages also include content pages.

Check Content Depth

In the URL Explorer, let's change the filter to exclude URLs with a pagination parameter, this time. As there are bound to be quite a few, let's also display only pages with a minimum depth (depth is found in the "Main" section of the filters list). For our example, let's look at URLs that are at depth 6 or deeper.
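
Conceptually, this step inverts the pagination filter and combines it with the depth data. A minimal sketch, reusing the hypothetical link_graph and page_depths from the first example, which also gathers the sample of incoming links discussed below:

```python
# Sketch: deep, non-paginated URLs plus a sample of the pages linking
# to them. Reuses the hypothetical `link_graph` / `page_depths` from
# the first example; this is not a Botify API call.
from urllib.parse import urlparse, parse_qs

def deep_content(link_graph, depths, min_depth=6, pagination_key="page"):
    incoming = {}  # invert the graph: url -> pages linking to it
    for source, targets in link_graph.items():
        for target in targets:
            incoming.setdefault(target, []).append(source)
    for url, depth in depths.items():
        if depth >= min_depth and pagination_key not in parse_qs(urlparse(url).query):
            yield url, depth, incoming.get(url, [])[:5]  # small sample of incoming links

# Each result pairs a deep content URL with where it is linked from,
# e.g. ("/hotel-xyz", 7, ["/hotels/rome?page=38"]).
```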

Depending on how your website's URLs are built, you may need to adjust the filters: exclude other URL parameters, for instance, or exclude them all, in which case change the filter to "query string" "equals" and leave the field value empty.

Let's now choose the information we want to display about these URLs. Let's add a sample of incoming links, to find out where these deep content URLs are linked from (as there are probably very few, we can expect the sample to cover them all). To do so, start typing the field name ("sample…") in the selected fields area and choose from the drop-down list. Click on Apply.

The results are hotel pages. We can export this list to improve internal linking for these pages and reduce their depth.

What's your experience with deep content? Care to share your comments?
