What’s buried in your website?
Believe it or not, your website could have innumerable pages hiding so deep that no search engine crawler will ever find them. It happens to the best of us, no matter how well we know our sites: strategic content ends up on pages too deep for Googlebot to index. But excessive page depth is a deadly threat to your SEO, and it’s essential to understand exactly what content falls where in your site structure.
What is Page Depth?
Let’s start with the basics. How do you define “too deep”? What even is “page depth” anyway?
Page Depth is, essentially, the number of clicks it takes to reach a page from the Home page. If your Home page is at Depth 0 – the ground floor of your website – then the first link you click on from that page (Shoes, for example) is at Depth 1. A page that can only be accessed from a link on the Shoes page (let’s say it’s called Children’s Shoes) is therefore at Depth 2, as it requires a minimum of 2 clicks from the Home page using the shortest path.
Here’s a visual:
So, with the Home page at Depth 0, you can see that the main categories of Clothing, Shoes, Accessories, and Sale are all at Depth 1 – just one click away from Home. These pages are likely in the main navigation, front and center in this website’s layout. For subcategories, customers must go deeper to Depth 2 to find children’s shoes or men’s clothing. If you were to go even deeper to Depth 3, you might find pages for children’s sneakers or men’s pants.
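Because depth is defined by the *shortest* path, it can be computed with a simple breadth-first traversal of your internal links. Here’s a minimal sketch in Python, assuming you already have a map of each page’s outgoing links (the URLs below are hypothetical, mirroring the example above):

```python
from collections import deque

def page_depths(links, home="/"):
    """Breadth-first traversal: depth = minimum clicks from the Home page.

    `links` maps each URL to the list of URLs it links to.
    """
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:  # first visit = shortest path
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical site structure matching the example above
site = {
    "/": ["/shoes", "/clothing"],
    "/shoes": ["/shoes/children"],
    "/shoes/children": ["/shoes/children/sneakers"],
}
print(page_depths(site))
# {'/': 0, '/shoes': 1, '/clothing': 1, '/shoes/children': 2, '/shoes/children/sneakers': 3}
```

Any page that never shows up in the result is unreachable from Home – invisible to a crawler that follows links alone.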
Now, “too deep” can be a subjective judgment. If your website has 20 pages, you might have a different perspective than someone whose site has 20 million pages. What counts as excessive depth can also depend on the type of content found on the page in question. Is it crucial, strategic content? Is it extra fluff that’s not important to customers or search engines?
When it comes to SEO, however, the Page Depth of important content becomes a critical metric to watch. Strategic content should generally sit no deeper than Depth 5 to keep it accessible to search engine crawlers and customers alike, both of whom might abandon their quest before they ever reach a page at Depth 8 or 10. Impatient customers might bounce off your site if they can’t easily find what they’re seeking, and search engine bots might decide your deeper pages aren’t worth their limited crawl budget.
How to avoid ‘too deep’ strategic content
When your website has many deep pages, there may be one or several factors to blame. Content inaccessibility is a serious SEO problem, and it’s important to know just where the issue is coming from within your site structure.
How do you prevent excessive page depth?
- Stop pagination problems
Pagination can quickly create depth, thanks to too few items per page, very long lists, or navigation that only moves a few pages at a time. Try to create shorter lists or offer more items per page to cut down the total number of pages. You can also keep robots out of low-quality or SEO-unhelpful pages, like the tail end of very long lists, by disallowing them in your robots.txt file.
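A robots.txt rule is one way to keep crawlers out of deep paginated lists. A hypothetical example – the `?page=` pattern is an assumption, so adapt it to your own pagination URLs:

```
User-agent: *
# Keep crawlers out of deep paginated list pages (hypothetical URL pattern)
Disallow: /*?page=
```

Note that a disallow rule only stops crawling; pair it with fewer, richer list pages so the content you do want crawled stays shallow.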
- Limit navigation filters
When your navigation features too many filters, particularly those that create new pages, you can very quickly run into depth problems. Limit the number of filters crawlable by robots; best practice is to limit to a single filter, or two at most, so useless filter combinations aren’t creating deep pages that would never drive organic traffic.
- Move your URL tracking parameters
Tracking parameters (like ‘?source=thispage’) can create an indefinite number of URLs, with on-site parameters tracking every source page and creating duplicates of important content. Move the tracking information behind a # at the end of the URL: the fragment doesn’t change the destination, so it won’t duplicate important pages.
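The reason this works: everything after the # (the fragment) is never sent to the server, and crawlers treat fragment variants as the same URL. A quick illustration in Python using the standard library – the URLs are hypothetical:

```python
from urllib.parse import urldefrag

# A query-string tracking parameter creates a distinct, crawlable URL...
tracked = "https://example.com/shoes?source=homepage"

# ...but the same information behind a '#' leaves the destination unchanged.
fragment_style = "https://example.com/shoes#source=homepage"

url, frag = urldefrag(fragment_style)
print(url)   # https://example.com/shoes  -- the one canonical page
print(frag)  # source=homepage            -- visible to your analytics JS, ignored by crawlers
```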
- Correct malformed URLs
Malformed URLs are the silent enemy of your website. While they may sometimes return a 404 Not Found HTTP status code (bad enough for SEO and for your users), they may also return a 200 OK status code, which is even worse: the broken URL looks like a valid page, so it gets crawled and can duplicate real content. Replace all malformed links – those missing human-readable elements or containing repeated elements – with correct links across the website.
- Put an end to perpetual links
Certain templates feature links present on every page – like ‘next day’ or ‘next month’ on a calendar – which create an infinite number of pages. Not good. Put a stop to perpetual ‘next’ links by adding an ‘end’ value that prevents the creation of new pages ad infinitum.
How do you know if your content is too deep?
Now, here’s the trick. These issues all sound serious, but how do you know whether they’re happening on your site? Of course, you could just guess based on what you know about your website, but without conclusive data you can’t be sure.
You need to run a complete analysis of your website: top to bottom, inside and out. With an exhaustive scan of all parts of your site, you’ll have access to the complete picture of every page, every URL, and every dark corner of your website whether you can see it on the surface or not.
That’s why Botify developed the distribution section of our analysis report. What is the distribution of your pages? How many pages are at Depth 1, 2, 3, or past that cut-off of Depth 5? More important still, what important content is buried down at those depths, or at Depth 10+?
Using this feature, you are able to discover exactly how your pages are distributed throughout your website, and then dig further to understand what content is found at those depths. Click through to find which sections of your site are located far too deep for crawlers to find them and, based on the causes above, create your to-do list of fixes.
Once you have a concrete understanding of your website’s structure and where strategic content lies, your next steps are clear.
Can’t get enough? Learn more about the Top Causes of Excessive Page Depth!
Ready to uncover what strategic content is buried deep in your website?