
Is JavaScript Affecting Your SEO? Know for Sure with Our Even-Better JavaScript Crawler

10th October 2019 · The Botify Team

Depending on how your website is coded, JavaScript may or may not be impacting your performance in search — so how can you know for sure?

Conventional solutions often involve manually examining each page, comparing the source code to the fully-rendered page and noting the differences. But for an enterprise website with millions of pages, that doesn’t scale.

The only way to know whether JavaScript is changing critical content and markup on your pages is to compare the un-rendered version with the rendered version, which means SEOs need a tool that can render JavaScript across their entire website.

That’s exactly why we launched JavaScript Crawl in January 2017.

JavaScript Crawl enables SEOs to crawl their entire website, rendering JavaScript along the way. This allows SEOs to perform two tasks that are critical to their success on the modern web:

  • Find links on your site that are loaded with JavaScript
  • Find content on your site that’s loaded with JavaScript
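To make the first task concrete, here's a minimal sketch (not Botify's implementation) of how JavaScript-only links can be surfaced: extract the links from the raw HTML and from the rendered DOM (e.g. captured by a headless browser), then diff the two sets.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.add(value)

def js_only_links(raw_html, rendered_html):
    """Return links that appear only after JavaScript execution."""
    raw, rendered = LinkExtractor(), LinkExtractor()
    raw.feed(raw_html)
    rendered.feed(rendered_html)
    return rendered.links - raw.links

# Example: a link injected by a script exists only in the rendered DOM.
raw = '<html><body><a href="/home">Home</a></body></html>'
rendered = ('<html><body><a href="/home">Home</a>'
            '<a href="/products">Products</a></body></html>')
print(js_only_links(raw, rendered))  # {'/products'}
```

The same raw-vs-rendered diff works for content too: compare the visible text of the two versions instead of the link sets.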

Why are these such critical tasks? Because Google crawls and renders your JavaScript too! If Google can find it and index it, it’s important to include in your SEO analysis. It’s also important to understand whether Google is having trouble finding and indexing your JavaScript-loaded content, as rendering delays can sometimes prevent Google from seeing this content right away. We talk about this and other JavaScript-related SEO issues in JavaScript 101 for SEOs — check it out if you haven’t already!

So if we launched JavaScript Crawl in January 2017, why are we talking about it now? We’re so glad you asked!

New-and-improved JavaScript Crawl: Faster JS rendering, reduced server load, and expanded SEO insights

If you’re an SEO, it’s difficult to know where to start with JavaScript. How do you know what pages contain JavaScript-loaded content? Where can you look to determine whether Google is indexing that content? Questions like these are notoriously difficult — but simultaneously critical — to answer.

Our first iteration of JavaScript Crawl helped enterprise SEOs around the world answer these questions, but there were even more problems we wanted to solve for our customers. So what can you expect from our brand-new JavaScript rendering engine?

Render your JavaScript fast — like, really fast

There’s a reason Google doesn’t always crawl and render all the content on your site — it’s time-intensive! Crawling tools can have the same problem, which can leave SEOs waiting days or even weeks for the results of their JavaScript crawl.

You don’t have time for that, and we don’t think you should have to. That’s why our new JavaScript rendering engine allows you to render up to 100 URLs per second. To put that in other terms, our new rendering engine will allow you to conduct a 1-million-page analysis in just three hours, whereas a desktop-based crawler might take a few weeks to complete.
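The arithmetic behind that claim is straightforward:

```python
# At 100 URLs/second, a 1-million-page crawl finishes in under 3 hours.
pages = 1_000_000
urls_per_second = 100

seconds = pages / urls_per_second  # 10,000 seconds
hours = seconds / 3600             # ~2.8 hours

print(f"{hours:.1f} hours")  # prints 2.8 hours
```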

The faster you can render the JavaScript on your site to identify issues, the faster you can update your SEO roadmap to include critical JavaScript fixes that can improve your rankings and organic search traffic.

Reduce server load and increase bandwidth

Our new JavaScript rendering engine not only crawls faster, but it also does so in a way that reduces server load (on customer and third-party servers) and bandwidth costs. Rendering the JavaScript on an enterprise website can be very resource intensive — another reason why Google doesn’t crawl your entire website, let alone render all of the JavaScript.

Botify crawls and renders your site much more efficiently, which reduces the load on your servers and frees up bandwidth.

Perform device-specific JavaScript analysis

The modern web doesn’t just require SEOs to optimize JavaScript for search engines; it also requires them to understand how JavaScript might be affecting search performance across multiple devices.

To do that, SEOs need to crawl and analyze their pages as they appear on different screen sizes, but many multi-device auditing options can only change the user-agent. Many websites display a different number of elements (e.g. products) based on screen size, which can also affect lazy loading. That means auditing the mobile version of your website requires more than just swapping the user-agent: SEOs need pixel-perfect screen emulation, which we made possible in JavaScript Crawl.
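A hypothetical responsive page illustrates why user-agent spoofing alone isn't enough: the layout logic below keys off the viewport width and ignores the user-agent entirely (the function name and breakpoints are invented for illustration).

```python
def products_rendered(viewport_width_px, user_agent):
    """Number of products a hypothetical responsive grid renders initially.

    The user-agent is ignored entirely; only the emulated screen size
    matters, which is why UA spoofing alone can mislead an audit.
    """
    if viewport_width_px < 768:      # phone layout: 1 column
        return 4
    elif viewport_width_px < 1200:   # tablet layout: 2 columns
        return 8
    return 12                        # desktop layout: 3 columns

# A crawler that only swaps the user-agent still renders the desktop grid:
print(products_rendered(1440, "Smartphone UA"))  # 12
# Pixel-accurate emulation of a phone viewport surfaces the mobile grid:
print(products_rendered(390, "Smartphone UA"))   # 4
```

The remaining products on the phone layout would typically arrive via lazy loading, which is exactly the behavior a UA-only audit never triggers.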

Botify’s new JavaScript rendering engine allows SEOs to understand the true effect of JavaScript across different devices, which empowers you to improve not only organic search metrics, but user experiences as well.

Audit your changes even before they go live

In Botify, SEOs can now inject content/HTML changes dynamically into each page being rendered. This can help you test changes at scale — and this isn't limited to testing JavaScript changes!

Using JavaScript, you can test any content change, regardless of whether the final implementation will use JavaScript or static HTML. That makes this feature useful even for websites that don’t otherwise need JavaScript Crawl on a regular basis.

By injecting changes onto your page and then running an analysis, SEOs have the benefit of seeing how those changes could affect their performance in search engines before risking the potential traffic drops that could accompany immediate implementation.
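Conceptually, the injection step amounts to splicing your change into each page's HTML before the renderer executes it. A minimal sketch (the helper name is ours, not Botify's API):

```python
def inject_snippet(html, snippet):
    """Insert a snippet just before </head> so it runs during rendering.

    Falls back to prepending if no </head> tag is found.
    """
    marker = "</head>"
    if marker in html:
        return html.replace(marker, snippet + marker, 1)
    return snippet + html

# Stage a title rewrite without touching production code:
page = "<html><head><title>Old title</title></head><body></body></html>"
fix = "<script>document.title = 'New, keyword-rich title';</script>"
staged = inject_snippet(page, fix)
```

The crawler then renders `staged` instead of the live page, so the analysis reflects the proposed change rather than what's currently deployed.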

Know which resources take the longest to load

Up until now, SEOs have been able to use Botify to see how JavaScript might be affecting website performance, but we wanted to add more granularity. Very soon, SEOs will be able to pinpoint the root cause of loading issues. In other words, Botify will show, at the URL level, all JavaScript resources that were used to render the page, when they loaded, and how long they took to load.

It’s great to know about JavaScript performance on the whole, but it’s even more actionable to be able to pinpoint the exact resources that are the culprits of slow loading so that you can take specific actions to address them.
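Once per-resource timings are exposed, finding the culprits is a simple ranking. A sketch using hypothetical HAR-style timing data (the URLs and numbers are made up):

```python
# Hypothetical per-URL resource timings, in milliseconds.
resources = [
    {"url": "/static/app.bundle.js", "start_ms": 120,  "duration_ms": 2400},
    {"url": "/static/vendor.js",     "start_ms": 130,  "duration_ms": 900},
    {"url": "/static/analytics.js",  "start_ms": 2600, "duration_ms": 150},
]

def slowest_scripts(resources, top_n=2):
    """Rank JavaScript resources by load time, slowest first."""
    ranked = sorted(resources, key=lambda r: r["duration_ms"], reverse=True)
    return [(r["url"], r["duration_ms"]) for r in ranked[:top_n]]

print(slowest_scripts(resources))
# [('/static/app.bundle.js', 2400), ('/static/vendor.js', 900)]
```

With that ranking in hand, the fix is targeted — defer, split, or drop the heaviest bundles — instead of a guess.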

A JavaScript tool built specifically with SEOs in mind

We already know that, on average, only half of an enterprise website is being crawled by search engines due to numerous technical roadblocks. The larger the site, the larger the problem, and sites built with JavaScript have it even worse.

Many tools analyze JavaScript from the perspective of developers, like Google’s Lighthouse and Puppeteer. While helpful, they lack some key features and metrics that SEOs need, such as robots.txt support, JavaScript pushState and redirect handling, and WebSocket detection — all of which are possible with Botify’s new rendering engine.
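Robots.txt support, for instance, is something a plain rendering engine won't give you. Python's standard library shows what the check itself looks like (the crawler name and rules here are illustrative):

```python
from urllib.robotparser import RobotFileParser

# Parse a robots.txt body directly; no network fetch needed for this sketch.
rules = RobotFileParser()
rules.parse([
    "User-agent: *",
    "Disallow: /private/",
])

print(rules.can_fetch("MyCrawler", "https://example.com/products"))   # True
print(rules.can_fetch("MyCrawler", "https://example.com/private/x"))  # False
```

An SEO-aware crawler runs a check like this before every fetch, so its results reflect what a well-behaved bot like Googlebot is actually allowed to see.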

Botify’s new JavaScript rendering engine allows SEOs to crawl websites just like Google does (we use the same rendering engine as Googlebot), but pulls that information into an interface designed specifically to help SEOs identify instances where JavaScript could be affecting the site’s performance in search engines.

Just like our HTML crawler can tell you at the click of a button whether Google is crawling your HTML-based site, our JavaScript Crawler can quickly identify the specific JS issues impacting Google's ability to crawl and render your site's pages. That means you can:

  • Quickly pinpoint any drop in organic search traffic on your JavaScript website
  • Proactively understand the impact of your site’s JavaScript on search engines’ ability to crawl and index your pages

And with improved speed and efficiency, SEOs get this data faster and with less burden on servers.

We’re confident that faster rendering and these new features can help you tackle any JavaScript SEO issue you’re faced with. Have questions about it? We’re here to help! Get in touch with our team today.