
Botify vs GWT, Problem Detection Championship Finals

17th June 2014, by Annabelle

Let’s face it, this scoreboard is purely fictional. Actually, Google Webmaster Tools and Botify are on the same team. And they complement each other very well.

Google Webmaster Tools alerts and provides pointers. Botify empowers your investigation.

Botify’s Logs Analyzer helps locate issues and evaluate problem severity: it provides complete data where Google Webmaster Tools provides samples.

Let’s look at GWT’s most common automated errors (we’ll leave aside manual sanctions, alert messages about ‘unnatural’ external linking, malware detection on specific pages, etc.).

Google Webmaster Tools sends alert messages when crawl error rates rise. These error rates are available in the Crawl / Crawl errors section, which is divided into Site errors and URL errors.

Main URL errors alerts:

1) Increase in not found errors
2) Increase in soft 404 errors
3) Increase in ‘authorization permission’ errors
4) Increase in not followed pages

Main site errors alerts:

5) Google can’t access your site
6) Possible outages

Important alerts unrelated to crawl errors:

7) Googlebot found an extremely high number of URLs on your site
8) Big traffic change for top URL

1) Increase in ‘not found’ errors

What Google says:

‘Google detected a significant increase in the number of URLs that return a 404 (Page Not Found) error.’

Google Webmaster Tools provides:

How the Botify Logs Analyzer can help:
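Where GWT shows only samples, raw server logs hold the full picture. As a minimal sketch of the idea (assuming logs in the common Apache combined format, and a crude user-agent filter — adapt both to your setup), here is one way to list every URL Googlebot hit that returned a 404:

```python
import re

# Matches the request and status code in Apache common/combined log lines
# (an assumption -- adjust the regex to your server's log format).
LOG_RE = re.compile(r'"(?:GET|HEAD) (?P<url>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

def googlebot_404s(log_lines):
    """Return a dict mapping URL -> number of Googlebot 404 hits."""
    counts = {}
    for line in log_lines:
        # Crude bot filter; in production, verify Googlebot by reverse DNS.
        if "Googlebot" not in line:
            continue
        m = LOG_RE.search(line)
        if m and m.group("status") == "404":
            url = m.group("url")
            counts[url] = counts.get(url, 0) + 1
    return counts
```

Sorting the resulting counts makes it easy to see whether the errors are concentrated on a few URL patterns (a broken template or redirect rule) or scattered across the site.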

2) Increase in ‘soft 404’ errors

What Google says:

‘Google detected a significant increase in URLs we think should return a 404 (Page Not Found) error but do not.’

Google Webmaster Tools provides:

How the Botify Logs Analyzer can help:
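A soft 404 is a page that answers 200 OK but displays error content. As a rough, hypothetical heuristic (the phrase list below is an assumption — tune it to your own site’s templates), such pages can be flagged like this:

```python
# Phrases that suggest a page body is really an error page.
# This list is illustrative; adapt it to your site's wording.
ERROR_PHRASES = (
    "page not found",
    "no results found",
    "this product is no longer available",
)

def looks_like_soft_404(status_code, body):
    """True if a 200 OK response carries error-page wording."""
    if status_code != 200:
        return False
    text = body.lower()
    return any(phrase in text for phrase in ERROR_PHRASES)
```

Running such a check over the pages Google flags helps decide which should return a real 404 (or 410) and which should be fixed to show genuine content.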

3) Increase in ‘authorization permission’ errors

What Google says:

‘Google detected a significant increase in the number of URLs we were blocked from crawling due to authorization permission errors.’

In other words, Googlebot is getting a “Forbidden” status code (HTTP 403) when requesting some URLs.
Either these pages should be crawled by Googlebot (and return HTTP 200 – OK), or they should not, in which case Googlebot should not waste crawl resources trying to access them: they should be disallowed in the robots.txt file.
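As a sketch, if the 403 errors came from a members-only area under a hypothetical /account/ directory that search engines have no business crawling, the corresponding robots.txt rule would look like:

```
User-agent: *
Disallow: /account/
```

Keep in mind that disallowing stops the crawl waste, but URLs already linked from elsewhere may still appear in the index without content.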

Google Webmaster Tools provides:

How the Botify Logs Analyzer can help:

4) Increase in not followed pages

What Google says:

‘Google detected a significant increase in the number of URLs that we were unable to completely follow.’

Examples of redirects which won’t be completed:

What GWT provides:

How the Botify Logs Analyzer can help:

5) Google can’t access your site

What Google says:

A variety of messages warning about site-level issues that result in a peak of DNS problems, Server connectivity problems, or problems getting the site’s robots.txt file.

For example:
‘Over the last 24 hours, Googlebot encountered 89 errors while attempting to connect to your site. Your site’s overall connection failure rate is 3.5%.’

Beware of robots.txt fetch errors: Google will stop crawling!

For example:
‘Over the last 24 hours, Googlebot encountered 531 errors while attempting to access your robots.txt. To ensure that we didn’t crawl any pages listed in that file, we postponed our crawl. Your site’s overall robots.txt error rate is 100.0%’

Google will announce that its crawl is postponed even if the error rate is not 100%. In most examples we’ve seen, error rates were above 40%, but we’ve also seen the same message with a robots.txt fetch error rate below 10%.
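To see how often Googlebot’s robots.txt fetches fail on your side, the access logs can be scanned directly. A sketch, again assuming the common Apache combined log format (adapt the regex and bot filter to your setup):

```python
import re

# Matches robots.txt requests and their status code in combined log lines.
STATUS_RE = re.compile(r'"(?:GET|HEAD) /robots\.txt HTTP/[\d.]+" (\d{3})')

def robots_error_rate(log_lines):
    """Share of Googlebot robots.txt fetches that did not return 200."""
    fetches = errors = 0
    for line in log_lines:
        if "Googlebot" not in line:  # crude filter; verify by reverse DNS
            continue
        m = STATUS_RE.search(line)
        if m:
            fetches += 1
            if m.group(1) != "200":
                errors += 1
    return errors / fetches if fetches else 0.0
```

Plotting this rate over time shows whether the failures coincide with deployments, server load peaks, or CDN/firewall changes.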

What GWT provides:

Of course, when it comes to DNS and connectivity issues, your service provider is the one to turn to.

How the Botify Logs Analyzer can help:

6) Possible outages

What Google says:

‘While crawling your site, we have noticed an increase in the number of transient soft 404 errors’

This is quite similar to connection failures.

What GWT provides:

How the Botify Logs Analyzer can help:

7) Googlebot found an extremely high number of URLs on your site

What Google says:

‘Googlebot encountered problems while crawling your site [site name]. Googlebot encountered extremely large numbers of links on your site. This may indicate a problem with your site’s URL structure. […]’

The message goes on to explain, in essence, that this does not look right: these URLs most likely include duplicates, or pages that were never meant to be crawled by search engines.
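Such URL explosions often come from query parameters (session IDs, sort orders, tracking tags) multiplying one page into many crawlable variants. A small sketch that groups crawled URLs by path, ignoring query strings, to surface parameter-driven duplicates:

```python
from urllib.parse import urlsplit

def duplicate_paths(urls):
    """Return paths reached through more than one distinct URL variant."""
    by_path = {}
    for url in urls:
        path = urlsplit(url).path  # drop query string and fragment
        by_path.setdefault(path, set()).add(url)
    return {path: variants for path, variants in by_path.items()
            if len(variants) > 1}
```

Paths with many variants are prime candidates for canonical tags, parameter handling rules, or robots.txt exclusions.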

GWT provides:

How the Botify Logs Analyzer can help:

8) Big traffic change for top URL

What Google says:

‘search results clicks for [this url] have increased/decreased significantly.’

Google Webmaster Tools provides:

How the Botify Logs Analyzer can help:

If this was caused by algorithm changes on Google’s part, chances are there are other significant trends on other pages.

In the case of a significant traffic decrease, the page’s content should also be checked, as well as the website’s internal linking (which can be done with the crawler that comes with the Botify Logs Analyzer).
