She touches on the following topics:
Botify: Could you please start with a quick overview of the websites you work on and what you do?
Charlotte Barre: Our team manages websites belonging to the L'Express group: the news magazine's website www.lexpress.fr, its thematic websites L'Entreprise, L'Expansion, Votre Argent, Styles, and also a home decoration website, Côté Maison. Since these are media websites, we are of course constantly working on news topics, but our work always includes a technical and structural approach to website analysis. Virtually all of our websites migrated to new versions over the past few months. Among other things, we worked on harmonizing the websites' back-end and front-end systems and on reorganizing content through new navigation structures. Additional important projects related to website structure are planned for the next few months. We need a wide variety of indicators to conduct the initial analysis when a project starts, to define which optimizations are needed, and to follow their implementation.
As with any project involving a large number of pages, we are exposed to potential side effects that Botify allows us to monitor.
How did the Logs Analyzer help, during these migrations?
For each migration, we use Botify to establish a pre-migration evaluation of the website: we identify the main categories of problem pages (insufficient crawl rate from Google, inefficient areas, a low rate of active pages, under-optimized internal linking, pages returning errors, excessive redirections…) and decide how to deal with them. The migration also provides a good opportunity to 'clean' the website structure, although this is not the main objective.
How, exactly, do you do that?
Using data from the Botify crawl and the Logs Analyzer. To remove links to error pages, we extract URLs with a 404 HTTP status code together with their referers, i.e. the pages that link to these error pages. Because URLs are categorized, we can quickly identify which types of pages these errors and referers correspond to. This allows us to decide how to treat these 404s based on the type of page (return HTTP 410 to indicate that they no longer exist, or redirect them); it also allows us to identify templates which include error links that need to be updated.
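The kind of extraction Charlotte describes can be sketched against a hypothetical crawl export. The column names (`url`, `http_status_code`, `referer`) are assumptions for illustration, not Botify's actual export schema:

```python
import csv
from collections import defaultdict

def broken_links_by_referer(crawl_export_path):
    """Group 404 URLs by the pages that link to them.

    Assumes a CSV export with 'url', 'http_status_code' and
    'referer' columns (hypothetical names, not Botify's schema).
    """
    referers = defaultdict(list)
    with open(crawl_export_path, newline="") as f:
        for row in csv.DictReader(f):
            if row["http_status_code"] == "404":
                referers[row["referer"]].append(row["url"])
    return referers
```

Grouping by referer is what makes the template-level fix visible: if one referer accounts for thousands of broken links, the link lives in a shared template rather than in individual articles.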
The same goes for redirections. For instance, we noticed that a few redirected URLs were being crawled when they shouldn't have been: some breadcrumbs present on a very large number of pages included redirected links. This problem was easily identified and solved, which reduced wasted crawl from Google.
Then, right after a migration, we use the Logs Analyzer to monitor the website, in particular new redirections or unexpected events or trends in terms of HTTP status codes. For instance, we detected a 404 surge, which was due to URLs that were not covered by redirections rules. Thanks to Botify, we were able to react immediately.
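Spotting a post-migration 404 surge of this kind boils down to counting status codes per day in the server access logs. A minimal sketch, assuming the common Apache/nginx "combined" log format (adjust the pattern to your server's actual format):

```python
import re
from collections import Counter

# Minimal pattern for Apache/nginx "combined" access logs
# (an assumption -- adapt it to your server's log format).
LOG_LINE = re.compile(r'\[(\d{2}/\w{3}/\d{4})[^\]]*\] "[^"]*" (\d{3}) ')

def status_counts_per_day(lines):
    """Count HTTP status codes per day so a 404 surge stands out."""
    counts = Counter()
    for line in lines:
        m = LOG_LINE.search(line)
        if m:
            day, status = m.groups()
            counts[(day, status)] += 1
    return counts
```

Comparing each day's counts to the previous day's is enough to trigger an alert when redirect rules miss a family of URLs.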
*Could you please show us a few examples of migration projects that were managed with the Logs Analyzer?*
Of course. For example, let’s look at lentreprise.lexpress.fr. Its new version went live at the end of April. The migration appears very clearly on a graph which shows pages returning an HTTP error or a redirection. The huge amount of HTTP 301 status codes was expected. These redirections correspond to changes in the website’s tree structure and URL syntax.
The temporary increase in 410 HTTP codes ('Gone') was also expected, as we chose to return this status code for specific pages, to 'clean' the site.
The 404 HTTP codes (Not Found), however, weren't intentional, and were corrected. We can see on the graph that the red area that appears with the migration quickly melts away.
It’s very clear. Do you have an example where impact on traffic is visible?
The migration of L'Express Styles, which was conducted around the beginning of April 2013. We did significant work on website navigation, adding new sections and sub-sections that allow easier access to content. We monitored how these changes impacted Google's crawl with Botify.
We saw a surge in Google's crawl (in green) after the new pages went live. We also saw the number of distinct pages crawled daily rise noticeably before stabilizing (in brown), which clearly showed that Google appreciated the new site structure.
We also noticed a peak in new URLs discovered by Google, in blue, and in lost URLs, in grey. The latter are URLs which used to return a 200 HTTP status (OK) and later returned an error or a redirection.
And, as a result, a clear, significant increase in active pages and in organic visits!
Here is another graph you chose to show us. Can you explain what it’s about?
I love this graph! Any imbalance immediately strikes the eye. It shows the number of distinct pages crawled by Google in a specific area of the website (on the left), versus the organic visits generated over the same period (on the right). This graph helped us identify areas to focus our efforts on, and it helped validate decisions on several occasions. For instance, the area that corresponds to the first line is going to be removed because it's not profitable. The area in red, third from the top, is also clearly not cost-effective: we plan to prevent robots from accessing 80% of these pages.
In summary, this graph provides an overview of imbalances that need to be investigated by type of page.
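The imbalance the graph surfaces can be expressed as a simple ratio of crawl volume to organic visits per site section. A sketch with illustrative numbers (not real L'Express data):

```python
def crawl_visit_imbalance(sections):
    """Rank site sections by Google crawl volume per organic visit.

    `sections` maps a section name to a (pages_crawled,
    organic_visits) tuple. A high ratio flags areas where crawl
    budget is spent without return -- candidates for removal or
    for blocking robots.
    """
    def ratio(item):
        crawled, visits = item[1]
        return crawled / max(visits, 1)  # avoid division by zero
    return sorted(sections.items(), key=ratio, reverse=True)
```

Sorting descending puts the least cost-effective sections first, mirroring how the graph draws the eye to the widest crawl bar facing the thinnest visits bar.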
Apart from preparing for migrations, did the Botify crawler help identify specific tasks, or work on specific issues?
On several occasions. Last year, for instance, we worked on semantics in title tags for www.lexpress.fr. Data extracted from the Botify crawl's databases helped considerably. We also worked on orphan pages, which are pages that still return content but are no longer accessible by navigating the website. Botify provides this information by combining server log data with data from its own crawl. We focused our attention on orphan pages that generated visits, as it made sense to reintegrate them into the website structure. We also identified very deep paginated pages containing article lists; we corrected the problem by displaying more articles per page.
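At its core, orphan-page detection is a set difference between what the crawler can reach and what the logs show still receives visits. A simplified sketch (Botify's own definition and implementation may differ):

```python
def orphan_pages(crawled_urls, visited_urls):
    """Pages that still receive visits but are no longer reachable
    by following the site's own links.

    `crawled_urls`: URLs found by crawling the site structure.
    `visited_urls`: URLs seen in server logs with organic visits.
    """
    return sorted(set(visited_urls) - set(crawled_urls))
```

Filtering the result to URLs with meaningful visit counts gives exactly the shortlist Charlotte mentions: orphans worth reintegrating into the navigation.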
Did you identify other issues?
The information Botify brings to our attention is so rich that we become aware of all the optimizations needed to boost our search performance. This allows us to build roadmaps and prioritize projects.
As we implement optimizations, Botify brings to light topics that were not yet identified.
Regular product updates and new features, which sometimes match our needs exactly, allow us to validate fixes and/or identify new issues.
Aside from all these projects, how do you use the Logs Analyzer on a daily basis?
Every morning, I start my day by checking Google Webmaster Tools indicators, comparing and complementing them with information from the Botify Logs Analyzer.
If I find any anomaly or malfunction, the logs analysis allows me to refine my diagnosis and take corrective actions.
A big thank you to L’Express and Charlotte for this testimonial.
And to our readers: let us know what you think, leave your comments!