Companeo: “Botify Helped Identify Fundamental Issues That Needed To Be Addressed To Allow Organic Traffic Growth”

Posted on 7th October 2014 by Annabelle

Arnaud Valéry

Companeo brings together service suppliers and their potential SMB clients through online quotes. Arnaud Valéry, Product Director, explains how Botify Log Analyzer was instrumental in the organic traffic growth the website has experienced over the past few months.

Botify: When did you start using Botify Log Analyzer?

Arnaud Valéry: A little over one year ago. I discovered Botify in June 2013, during Search Marketing Expo (SMX). I was about to take over management of the SEO department at Companeo, a website with a dozen years of history. Understanding where we stood from handover meetings alone was not easy, especially with such history. I needed a detailed assessment of the situation. That is exactly what Botify’s value proposition was about, with answers to questions such as: how big is my website? Which pages does Google know? I was fully convinced when I saw the reconciliation circles showing these two pieces of information *[see below]*, and the analytical logic behind Botify Log Analyzer’s dashboards and reports. The tool saves considerable time and provides visibility over what’s really going on – things we wouldn’t know about otherwise.
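The reconciliation idea described above – comparing the pages found in the site’s own structure with the pages Google actually knows – can be sketched as simple set operations. The URL lists below are invented for illustration; in practice one set would come from a site crawl and the other from Googlebot hits in the server logs.

```python
# Hypothetical illustration of "reconciliation": URLs found by crawling the
# site's structure vs. URLs Googlebot actually requested (from server logs).
# All URLs here are made up.
crawled = {"/", "/guides/", "/guides/phone-systems", "/quotes/old-form"}
googlebot = {"/", "/guides/", "/old-campaign?utm_source=x", "/quotes/old-form"}

in_both = crawled & googlebot          # known to both the structure and Google
structure_only = crawled - googlebot   # in the structure, never crawled by Google
orphans = googlebot - crawled          # crawled by Google, absent from the structure

print(len(in_both), len(structure_only), len(orphans))  # → 3 1 1
```

The two circles of the reconciliation diagram correspond to `crawled` and `googlebot`; the interesting diagnostics live in the non-overlapping parts.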

What was the context, at Companeo?

In July 2013, we were about to go through a website migration that had been prepared by my predecessor. This migration aimed at moving from a form-based quote website to a content-oriented website, with guides and news articles for instance. We performed the first website crawl right after the migration. As a first step, Botify Log Analyzer mainly helped identify what needed to be “cleaned” on the website. That was a requirement, considering how the number of pages had kept growing over time, reaching several hundred thousand pages.
The graph shows, in a nutshell, the results we achieved: the website’s perimeter was reduced, resulting in higher quality by today’s criteria, and Google now crawls the smaller site much more efficiently.

What did this “cleaning” consist of?

First, we removed duplicates from various origins, including tracking parameters from old marketing campaigns, as well as near-duplicates. We did this in several rounds, to make sure we did not hurt existing organic traffic, as a significant part of it was generated by these pages. We also optimized redirects, which had been piling up over the years. This “heavy-duty cleaning” took us four months to complete.
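One way to spot the duplicates created by tracking parameters is to normalize each URL by stripping those parameters, so that all variants collapse to a single canonical key. This is a minimal sketch; the parameter names are hypothetical, not Companeo’s actual ones.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical tracking parameter names (the real ones would come from
# old marketing campaigns, as described in the interview).
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "partner_id"}

def canonical(url: str) -> str:
    """Strip known tracking parameters so URL variants collapse to one key."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k not in TRACKING_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), ""))

print(canonical("https://example.com/guide?utm_source=nl&page=2"))
# → https://example.com/guide?page=2
```

Grouping a crawl’s URL list by this canonical form surfaces the duplicate clusters, which can then be handled in rounds (canonical tags, redirects, or removal) without touching pages that still drive traffic.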

How did that impact Google’s crawl?

Today, Google only explores the part of the website we decided it should see. Our site has a leaner structure, and Google clearly understands it better. Its crawl returns much higher quality signals, from a page content perspective as well as from an HTTP status code perspective *[see graph below]*.

Crawl volume distribution has also changed between the historical area and the area we focused our efforts on, which now hosts the new content. Previously, 90% to 95% of Google’s crawl targeted the historical area. Today, it represents at most 60% of the crawl, with the rest dedicated to new content. The historical area remains interesting because of its long-tail traffic potential. Overall, Google’s crawl is much more balanced.
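Measuring this kind of crawl split is a matter of segmenting Googlebot hits in the server logs by site section. Here is a minimal sketch; the log lines, log format, and section rules are invented for illustration.

```python
import re
from collections import Counter

# Matches the request path and user agent in a common-log-style line.
LINE = re.compile(r'"GET (?P<path>\S+) HTTP/[\d.]+" \d+ .* "(?P<ua>[^"]*)"$')

def section(path: str) -> str:
    # Hypothetical segmentation: guides/news are "new content", the rest historical.
    return "new content" if path.startswith(("/guides/", "/news/")) else "historical"

log_lines = [
    '1.2.3.4 - - [01/Oct/2014] "GET /guides/voip HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
    '1.2.3.4 - - [01/Oct/2014] "GET /quotes/form HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
    '5.6.7.8 - - [01/Oct/2014] "GET /guides/voip HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]

# Count only Googlebot hits, grouped by section.
counts = Counter(
    section(m.group("path"))
    for line in log_lines
    if (m := LINE.search(line)) and "Googlebot" in m.group("ua")
)
print(dict(counts))
```

Run over a day or a month of logs, the ratio between the two buckets gives exactly the 60%/40% style figures quoted above.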

What did you observe, in terms of visibility?

Our Searchmetrics visibility index increased fivefold over the past six months.

And in terms of organic traffic?

Significant traffic growth year over year, as the following graphs show.
The first graph, below, shows global organic traffic growth.

This other graph shows SEO traffic by page type. Visits are now generated by a wider variety of page types, which is much healthier.

This other graph, below, complements the information above: it shows active page volumes by page type, that is, the distribution of pages that generate organic visits. A smaller number of pages now generates more visits, as we removed pages that did not meet today’s requirements and created new pages with strict content quality specifications. This is very positive for usage statistics.

Here is one last graph, which summarizes year-over-year progress in Google’s view of the website and in organic visits *[on the right]*, with page type details:

The results of our work are clearly visible on pre-existing areas (numbered), as well as on new content.

What’s next?

Until May 2014, we mainly worked on technical projects. Now that we can rely on a healthier, leaner structure, with a higher crawl rate and a higher active pages rate, we are also working on other typical SEO topics, such as semantics – the data on titles, H1s, and other tags extracted from the Botify crawl comes in very handy – and internal linking.

A big thank you to Companeo and Arnaud for this testimonial.
And to our readers: let us know what you think, leave your comments!
