Ridesharing website Blablacar, created in 2006, is now present in a dozen European countries - all monitored through Botify Log Analyzer. Jérôme Moussay, SEO manager, explains how the Botify tool is used in the company.
**Botify: How do you use Botify Log Analyzer?**
Jérôme Moussay: We started looking at web server logs before setting up Botify Log Analyzer, but in "one shot" mode, for very specific purposes, without monitoring changes over time. Constant visibility, in near-real time, has great advantages; I couldn't go back and do without it. When a change on the website impacts Google's crawl or organic visits, I can detect and measure the effects the very next day in Botify Log Analyzer, along with potential side effects. This goes much further than Google Webmaster Tools, where a change's impact only shows a few days later. In addition, in GWT, I can't be sure what really happened: Google's exploration graph shows the search engine's crawl on all pages over the past 90 days, whereas Botify Log Analyzer provides visibility into each type of page through customized URL categorization.
**Botify: The French website went through a migration a few months ago. Could you explain the migration's objectives and tell us how it went?**
Yes, we updated the French platform last March, which meant changing all our URLs. From an SEO perspective, the migration was an opportunity to optimize the site structure. As the changes impacted pages that were generating organic traffic, we had to be extremely careful not to hurt the positions we had acquired: we were already number one for a large number of queries based on start city / destination city. We had to preserve existing traffic while completely changing the way Google saw our website structure. This was done through redirections, and also by optimizing Google's crawl, disallowing certain areas via the robots.txt file. Previously, the crawl volume was extremely high, partly because of the large number of "trip" pages, which detail each trip posted by a driver. That's why Google Webmaster Tools issued a warning saying "the number of URLs detected by Googlebot is extremely high".
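The robots.txt measure mentioned above can be sketched as follows. The disallowed path is a hypothetical stand-in: the interview does not give Blablacar's actual URL patterns.

```
# Hypothetical robots.txt sketch — "/trip/" stands in for the real
# trip-page URL pattern, which the interview does not specify.
User-agent: *
Disallow: /trip/
```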
Botify Log Analyzer's added value lies, among other things, in the way it lets us analyze how Google reacts to redirections. The tool was extremely useful for closely monitoring Google's crawl. For instance, the graph below shows all crawled pages returning any HTTP status code other than 200 (OK), that is to say any redirection or error:
With Botify Log Analyzer, we were able not only to verify that 301 redirects were increasing as expected, but also to detect a surge in HTTP 404 status codes (page not found) and solve the problem immediately.
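A minimal sketch of the kind of daily non-200 count described above, assuming access logs in the common combined format (the field layout and sample lines are assumptions, not Botify's actual parser):

```python
import re
from collections import Counter

# Combined log format: the date sits inside [day/Mon/year:...], and the
# status code is the three-digit number after the quoted request line.
LINE = re.compile(r'\[(\d{2}/\w{3}/\d{4}):[^\]]*\] "[^"]*" (\d{3})')

def non_200_by_day(lines):
    """Count crawled URLs returning any status other than 200, per day."""
    counts = Counter()
    for line in lines:
        m = LINE.search(line)
        if m and m.group(2) != "200":
            counts[m.group(1)] += 1
    return counts

# Invented sample lines for illustration only.
sample = [
    '66.249.66.1 - - [12/May/2014:10:00:00 +0200] "GET /a HTTP/1.1" 200 512',
    '66.249.66.1 - - [12/May/2014:10:00:01 +0200] "GET /b HTTP/1.1" 301 0',
    '66.249.66.1 - - [12/May/2014:10:00:02 +0200] "GET /c HTTP/1.1" 404 0',
]
print(non_200_by_day(sample))  # the 301 and the 404 are counted, not the 200
```

Plotting these per-day counts over time gives a curve of the same shape as the graph described above.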
**Botify: Could you provide more details about these "trip" pages, and why they created an issue?**
A "trip" page details a driver's trip between two cities, on a specific date. As a result, there are an extremely large number of trip pages.
In the website's previous version, expired trip pages were still returning an HTTP 200 status code (OK), which caused two important problems: first, Google crawled an extremely large number of trip pages (hence the Google Webmaster Tools alert) and the number of indexed pages was very high; second, these pages were sometimes positioned in Google's search results, which created a bad user experience.
This shows in the graph below, which displays SEO efficiency by type of page before the migration: the orange line represents trip pages, and its left part shows Google's massive crawl on these pages. The active pages rate was also very low (0.3%).
We did not prevent Google from crawling these expired pages right away. We tagged some of them as "noindex" and redirected others to new pages that were much more appropriate for search traffic: "axis" pages, which list trips between two cities, offered by all drivers at different dates.
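The two measures just described (noindex on some expired trips, 301 redirects to "axis" pages for the others) might be sketched as a server configuration like the one below. This is a hypothetical nginx fragment: the URL patterns are invented for illustration, since the interview does not give the real ones.

```nginx
# Hypothetical nginx sketch — /trip/... and /axis/... are invented paths.

# Some expired trip pages first flagged noindex (still crawlable):
location ~ ^/trip/expired/ {
    add_header X-Robots-Tag "noindex" always;
}

# Others 301-redirected to the matching city-pair "axis" page:
location ~ ^/trip/(?<cities>[a-z-]+)/\d+$ {
    return 301 /axis/$cities/;
}
```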
Only at a later stage did we prevent Google from crawling trip pages, as well as other sections which did not generate any traffic (such as pages in light green).
The graph below shows redirections from "trip" pages to "axis" pages in Google's crawl.
Ultimately, the transfer between old trip pages and axis pages went well. The graph below shows active pages (pages which generate at least one organic visit over the 30-day period considered for the analysis): trip pages in orange, axis pages in light brown.
(the few days without data correspond to missing log files)
Organic visits on axis pages have doubled since the migration:
**Botify: Blablacar is among the first Botify Log Analyzer users to monitor the Russian search engine Yandex's activity. What did you observe?**
Our observations are in line with global statistics on the distribution of organic visits between Google and Yandex in Russia: we get more SEO traffic from Yandex. If we look at the search engines' crawl patterns, however, things are different: Google crawls our important pages more often than Yandex does. I also noticed the two search engines don't behave the same way with pages not found (HTTP 404) and 301 redirects: Google tends to crawl these again more often, and over a longer period, than Yandex. Here is, for instance, what we saw on the Russian website last May: these graphs show Google's and Yandex's crawl when some 404 pages appeared for a very specific type of page.
The daily crawl volume for HTTP 200 ("OK") pages before the 404s appeared was similar on the two search engines, and the peak daily volume for 404s was also roughly equivalent. However, Google explored the 404 pages much more often in the two weeks after they appeared, and kept crawling them over a longer period.
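The per-bot comparison above boils down to counting 404 fetches and the span of days each crawler kept retrying. A toy sketch, with invented records standing in for parsed log lines (the real numbers come from Blablacar's logs, which we don't have):

```python
from collections import defaultdict

# Toy (bot, day, status) records standing in for parsed access-log lines.
hits = [
    ("Googlebot", 1, 404), ("Googlebot", 2, 404), ("Googlebot", 9, 404),
    ("YandexBot", 1, 404), ("YandexBot", 2, 404),
    ("Googlebot", 1, 200), ("YandexBot", 1, 200),
]

def crawl_404_span(hits):
    """Per bot: total 404 fetches, and over how many days the bot retried."""
    days = defaultdict(set)
    total = defaultdict(int)
    for bot, day, status in hits:
        if status == 404:
            days[bot].add(day)
            total[bot] += 1
    return {bot: (total[bot], max(days[bot]) - min(days[bot]) + 1)
            for bot in days}

print(crawl_404_span(hits))
# In this toy data, Googlebot fetches the 404s more often and over a
# longer span than YandexBot — the pattern the interview describes.
```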
A big thank you to Blablacar and Jérôme for this testimonial.
And to our readers: let us know what you think, leave your comments!