
Top SEO Pitfalls By Website Type


15th April 2015 – Annabelle

E-commerce SEO? Publisher SEO? Classifieds SEO? Although on-site SEO essentials are universal, it makes sense to pay special attention to specific items and related indicators depending on the type of website you are managing, because some issues – such as page accessibility to robots and low-quality content – can reach particularly large proportions on some website structures and have a huge impact on organic traffic.

Let’s take a look at a few issues that are particularly impactful for these three types of websites:

LARGE E-COMMERCE WEBSITES

E-commerce websites often face several types of issues related to their size, their navigation structure, and their typical organic traffic pattern: they tend to get either mainly long tail traffic (large product catalog, low number of visits per page, on many product pages) or mainly middle tail traffic (high brand recognition, significant direct traffic, and most organic traffic on top category pages, through more generic, competitive search queries).

Typical issues include:

1) A significant portion of products are not explored by Google

This generally offers significant leverage: with a large product catalog, chances are only a portion of products are known to search engines. Either the website already generates most of its traffic on product pages, and having more products crawled will have an almost mechanical effect on this long tail traffic; or it doesn’t, and this is an important source of potential incremental traffic.

The goal is to make sure that Google explores all products.

Global indicator: ratio of crawled pages among product pages (tool: LogAnalyzer).
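As a rough illustration of this indicator, the crawl ratio could be estimated from server access logs. This is a minimal sketch: the log format, the `/product/` URL pattern, and the user-agent check are all hypothetical assumptions, and a production tool would also verify Googlebot via reverse DNS.

```python
import re

# Hypothetical conventions: product URLs start with /product/, and logs
# follow a common combined-log format. Adjust both for a real site.
PRODUCT_URL = re.compile(r"^/product/")

def product_crawl_ratio(log_lines, all_product_urls):
    """Share of known product URLs requested at least once by Googlebot
    according to the server logs."""
    crawled = set()
    for line in log_lines:
        if "Googlebot" not in line:
            continue
        match = re.search(r'"(?:GET|HEAD) (\S+)', line)
        if match and PRODUCT_URL.match(match.group(1)):
            crawled.add(match.group(1))
    return len(crawled & set(all_product_urls)) / len(all_product_urls)

logs = [
    '66.249.66.1 - - [15/Apr/2015] "GET /product/123 HTTP/1.1" 200 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [15/Apr/2015] "GET /category/shoes HTTP/1.1" 200 "-" "Googlebot/2.1"',
]
catalog = ["/product/123", "/product/456"]
print(product_crawl_ratio(logs, catalog))  # 0.5: one of two products crawled
```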

What can be done:

2) Near duplicates within products

There can be many products which are almost the same apart from a few details (color for clothing, minor technical characteristics for high-tech products) – details that internet users are unlikely to include in their search queries, so they don’t act as differentiators.
The goal is to make sure product pages present products that are differentiated enough to respond to different queries, while avoiding the negative impact undifferentiated content has on quality criteria.
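One common way to surface such near-duplicate pairs is to compare word shingles of product descriptions and flag pairs whose Jaccard similarity exceeds a threshold. The descriptions, the shingle size, and the 0.8 threshold below are illustrative assumptions, not a prescribed method.

```python
def shingles(text, k=3):
    """Set of k-word shingles from a product description."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(max(1, len(words) - k + 1))}

def jaccard(a, b):
    """Jaccard similarity between the shingle sets of two texts."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb)

# Hypothetical descriptions differing only by color:
desc_red = "classic running shoe with breathable mesh upper in red"
desc_blue = "classic running shoe with breathable mesh upper in blue"

score = jaccard(desc_red, desc_blue)
print(score)           # 0.75
print(score > 0.5)     # candidate near-duplicate pair at this (assumed) threshold
```

Pairs above the threshold would be candidates for merging into a single page, or for content differentiation.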

What can be done:

3) Multi-faceted navigation implementation prevents middle tail organic traffic

Navigation pages are targets for top to middle tail SEO traffic queries (for instance, “Nike children’s shoes”). It’s an issue if they are not accessible to robots, or if too many are accessible through crawlable filter combinations.
The right balance must be found so that search engines see all navigation pages with potential for organic traffic, but are not swamped by additional pages that will waste search engine crawl and degrade global website quality indicators.

Indicators: number of navigation pages on the website, pages with organic traffic, HTML tags (using a website crawler that allows filtering data by URL characteristics, such as URL parameters or parameter names).
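The balance described above can be enforced with a simple crawlability policy on filter parameters. The facet whitelist and the one-filter limit below are hypothetical examples of such a policy, not Botify’s recommendation:

```python
from urllib.parse import urlparse, parse_qsl

# Assumed policy: only whitelisted facets with organic-traffic potential
# may appear in crawlable URLs, and at most one filter per URL.
ALLOWED_FACETS = {"brand", "size"}

def should_be_crawlable(url, max_filters=1):
    """Decide whether a faceted-navigation URL should be open to robots."""
    params = parse_qsl(urlparse(url).query)
    return (len(params) <= max_filters
            and all(key in ALLOWED_FACETS for key, _ in params))

print(should_be_crawlable("/shoes?brand=nike"))                   # True
print(should_be_crawlable("/shoes?brand=nike&size=38&sort=asc"))  # False
```

URLs failing the policy could then be excluded via robots.txt rules, nofollow/obfuscated links, or canonical tags, depending on the situation.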

What can be done:


PUBLISHING WEBSITES

Let’s leave aside news-specific SEO (Google News) and focus on “regular” search (such as Google’s universal search). Publishing websites that regularly publish new articles have a continuously growing body of content, which poses specific challenges. Very often, SEO concentrates on new articles, while a great deal of untapped potential lies in the bulk of older content.

Typical issues:

1) Older articles get deeper and deeper in the website

This has to do with navigation and internal linking. Once they stop being linked from the home page, from top navigation, from a “most read” block, “hot topics” tags and the like, older articles get deeper and deeper in the website and become harder to reach for search engine robots. Typically, at this second stage in their life cycle – but by far the longest one – they are just linked from a long paginated list of articles, and related articles that may link to them are also deep. As a result, these older articles don’t reach their full potential, or don’t perform at all.
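Article depth can be measured with a breadth-first traversal of the internal-link graph, counting the minimum number of clicks from the home page. The link graph below is a hypothetical miniature of the paginated-list situation described above:

```python
from collections import deque

def page_depths(links, start="/"):
    """Minimum click depth of each URL from the home page, given an
    internal-link graph as a dict {url: [linked urls]}."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        url = queue.popleft()
        for target in links.get(url, []):
            if target not in depths:
                depths[target] = depths[url] + 1
                queue.append(target)
    return depths

# Assumed structure: older articles only reachable through pagination.
links = {
    "/": ["/articles?page=1"],
    "/articles?page=1": ["/article/new", "/articles?page=2"],
    "/articles?page=2": ["/article/old"],
}
print(page_depths(links)["/article/old"])  # 3: each pagination step adds a click
```

Every extra pagination page adds one click of depth, which is why older articles end up so hard to reach for robots.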

What can be done:

2) Tag pages that are not “hot” any more are not accessible via top navigation

For similar reasons, tag (topic) pages which don’t include a recent article also get deeper and deeper, if they are only linked from articles.

What can be done:

CLASSIFIEDS WEBSITES

Typical issues are related to user-generated content, which we have no control over, and the fact that content has a high rotation rate: many new pages are created on a daily basis, and they may expire quickly.

1) Search engine crawl does not focus on relevant ads

This implies, in particular, making sure new ads are crawled, and expired ads are not.
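One common convention for the expired-ads side is to return HTTP 410 (Gone) rather than a 404 or a soft-404, so search engines drop the page quickly and stop spending crawl on it. This is a sketch of that rule under an assumed ad structure; the `expires` field is hypothetical:

```python
from datetime import date

def ad_response_status(ad, today=None):
    """Return the HTTP status an ad page should serve.
    Assumes each ad dict carries an 'expires' date (hypothetical schema)."""
    today = today or date.today()
    if ad["expires"] < today:
        return 410  # Gone: signals permanent removal to crawlers
    return 200      # live ad, normal response

print(ad_response_status({"expires": date(2015, 1, 1)},
                         today=date(2015, 4, 15)))  # 410
print(ad_response_status({"expires": date(2015, 6, 1)},
                         today=date(2015, 4, 15)))  # 200
```

Making sure new ads are crawled is the mirror problem, typically addressed through fresh XML sitemaps and prominent internal links to recent listings.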

What can be done:

2) The way empty categories are managed sends confusing messages

A category page can at times be empty, as its ads list depends entirely on users. We should avoid creating categories that are likely to be empty often, or broaden their scope to minimize the chances of this happening. But it can still happen, because of seasonal effects or market trends for instance, so this should be carefully planned for.

If a category page returns HTTP 404 (Not Found) when there aren’t any ads, and HTTP 200 (OK) when there are some, its chances of ranking will be low: this “blinking” page which only exists part of the time won’t be considered reliable by search engines. The page should exist at all times, whether there are ads or not – in which case the page content can include links to similar ads.
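In code terms, the rule is simply that the category handler always answers 200 and substitutes fallback content when the listing is empty. This is a framework-agnostic sketch with hypothetical function and field names:

```python
def render_category(ads, similar_ad_urls):
    """Always return (status, content) with HTTP 200 for a category page.
    When the category is empty, fall back to links toward similar ads
    instead of serving a 404 (assumed fallback strategy)."""
    if ads:
        return 200, ads
    fallback = [("See similar ads", url) for url in similar_ad_urls]
    return 200, fallback

status, body = render_category([], ["/ads/nearby-1", "/ads/nearby-2"])
print(status)  # 200 even though the category is empty
print(body[0]) # ('See similar ads', '/ads/nearby-1')
```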

What can be done:

3) Ads are semantically poor

Some ads may fail to include important keywords, or may include many abbreviations. Or there won’t be much differentiation between some ads. Unfortunately, there is not much we can do at the ad level itself.

What can be done:

A shorter version of this article was published at Brighton SEO (April 2015 edition) in the conference’s print publication.

