Googlebot Updating User Agent String: What Does This Mean For SEO?

By Kyle Blanchette

17th October 2019

Back in May 2019, Google launched an evergreen version of Googlebot, which would continuously run on the latest version of Chrome. However, the user agent string remained the same. That’s about to change.

Let’s backtrack: What’s a user agent string?

A user agent string is a short “string” of text that identifies the browser to the web server — every browser has a unique one! When a browser connects to a website, the user agent is essentially introducing itself to the server: “Hey, I’m {browser type} on {viewing device}.”

But how does this relate to Googlebot?

Google uses a Chrome-based browser to crawl and render webpages so it can add them to its index. So, just like other browsers, Googlebot has its own unique user agent string.

Web servers can use user agent information to change how they serve the page. For example, a web server could be configured to send mobile pages to visitors on mobile browsers (a technique called "dynamic serving"). The user agent string is also what helps SEOs analyze their log files and understand which pages Google is visiting.
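As a rough illustration, dynamic serving boils down to branching on the User-Agent header. Here's a minimal sketch; the function name, template names, and mobile-matching regex are all illustrative, not a real API:

```javascript
// A minimal sketch of dynamic serving: picking a template based on the
// User-Agent header. The template names here are illustrative.
function chooseTemplate(userAgent) {
  // A crude mobile check; production sites typically use a full
  // device-detection library rather than a hand-rolled regex.
  const isMobile = /Mobile|Android|iPhone/i.test(userAgent || "");
  return isMobile ? "mobile.html" : "desktop.html";
}

console.log(chooseTemplate("Mozilla/5.0 (iPhone; CPU iPhone OS 13_0 like Mac OS X)")); // "mobile.html"
console.log(chooseTemplate("Mozilla/5.0 (Windows NT 10.0; Win64; x64)")); // "desktop.html"
```

Any site that branches on the user agent like this is exactly the kind of site that needs to pay attention when a user agent string changes.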

Googlebot’s user agent string will include the latest version of Chrome

So if Googlebot already had a unique user agent string, what’s changing?

Since Googlebot is now always using the latest version of Chrome, the user agent string should reflect that. Starting in December, it will.

What does this look like exactly? We’ll show you.

What Googlebot’s user agent string for desktop looks like today:

Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; Googlebot/2.1; +http://www.google.com/bot.html) Safari/537.36

What Googlebot’s user agent string for desktop will look like come December 2019:

Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; Googlebot/2.1; +http://www.google.com/bot.html) Chrome/W.X.Y.Z Safari/537.36

Starting in December, Googlebot's user agent string will reflect the latest version of Chrome and will continue to update in sync with Chrome. The W.X.Y.Z above is a placeholder for the current Chrome version number; in practice you might see "76.0.3809.100" in its place. In other words, Googlebot will not only run a near-current version of Chrome, it will also identify itself with that version number in its user agent string.
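If you parse log files yourself, pulling the Chrome version out of the new string is a one-line regex. A minimal sketch (the function name is ours; the sample string follows the format Google announced for December 2019):

```javascript
// Pull the Chrome version out of a user agent string. The sample below
// follows the format Google announced for December 2019.
function chromeVersion(userAgent) {
  const match = userAgent.match(/Chrome\/([\d.]+)/);
  return match ? match[1] : null;
}

const ua =
  "Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; " +
  "Googlebot/2.1; +http://www.google.com/bot.html) Chrome/76.0.3809.100 Safari/537.36";

console.log(chromeVersion(ua)); // "76.0.3809.100"
```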

How does evergreen Googlebot benefit SEO on JavaScript websites?

To understand the significance of Google's update to the user agent string, we first have to talk about the evergreen Googlebot announcement from May. An evergreen Googlebot is a big win for your render budget: where JavaScript's impact on SEO may have hurt your website previously, Googlebot can now handle modern JS syntax because it updates alongside Chrome. The evergreen Googlebot has opened the door to 1,000+ JavaScript features.

Additionally, you may no longer need as many polyfills, which backfill modern functionality in older browsers, for Googlebot's sake. When Googlebot ran an outdated version of Google Chrome, polyfills were important. Now that Googlebot uses the latest version of Chrome, you should evaluate whether your polyfills are still necessary.

So, how will Google’s update affect your enterprise website?

At Botify, we always use an updated version of Chrome to render pages. In fact, we may even be ahead of Google at times. That being said, changes to the naming of the user agents will not change the way you see data in the Botify Log Analyzer. Our rendering will always match or exceed Googlebot’s, giving you the most accurate picture of your SEO data.

It’s worth mentioning, though, that if your site looks for a specific user agent to change the way it serves the page, your site may be affected. Google recommends that you use feature detection and progressive enhancement instead of user agent sniffing, a tactic sometimes used by smaller, non-enterprise websites.

Feature detection identifies what a browser can do by testing for known capabilities, while progressive enhancement ensures that websites serve their preferred, full-feature experience to browsers that can handle it and a simpler page to those that can't. Feature detection and progressive enhancement are the more scalable long-term options for enterprise websites, and they make even more sense now that Googlebot's user agent string will continue to update. If there's a particular case where you do need to detect Googlebot via the user agent, simply look for "Googlebot" within the user agent string rather than matching the full string.
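Both ideas fit in a few lines. A minimal sketch (the helper names are ours, and IntersectionObserver is just one example of a capability to test for):

```javascript
// Feature detection (preferred): test for the capability, not the browser.
function supportsIntersectionObserver() {
  return typeof IntersectionObserver !== "undefined";
}

// If you do need to spot Googlebot, match the "Googlebot" substring rather
// than the full user agent string, which now changes with every Chrome release.
function isGooglebot(userAgent) {
  return userAgent.includes("Googlebot");
}

const googlebotUa =
  "Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; " +
  "Googlebot/2.1; +http://www.google.com/bot.html) Chrome/76.0.3809.100 Safari/537.36";

console.log(isGooglebot(googlebotUa)); // true
console.log(isGooglebot("Mozilla/5.0 (Windows NT 10.0; Win64; x64)")); // false
```

Because the substring check ignores everything around "Googlebot", it keeps working no matter which Chrome version number appears in the string.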

Here are a couple of other potential issues to look out for, as noted by Google:

  • Pages that present an error message: a page may assume Googlebot is a user with an ad-blocker and accidentally prevent it from accessing page contents
  • Pages that redirect to a robots.txt-disallowed ("roboted") or noindexed document

Keep calm and SEO on!

At Botify, we’re always thinking ahead and doing our best to anticipate Google’s updates. That’s why we’ve been using the latest version of Chrome for our crawls since the beginning. Therefore, Google’s change to the user agent string will have no impact on Botify’s reporting.

The only factors SEOs should consider in regard to the new string, and the previously announced evergreen Googlebot, are a) reevaluating their usage of polyfills, b) implementing feature detection and progressive enhancement (if they don't already), and c) keeping an eye on the two issues above, as suggested by Google. Otherwise, we're all set to keep calm and SEO on!

Meanwhile, Googlebot’s not the only one to go evergreen. Just last week, Bing announced that Bingbot has gone evergreen, too. This is a very exciting time for SEOs! The future of SEO, and more specifically render budget, is looking brighter than ever.
