
Your JavaScript SEO Checklist: How to Optimize for Bots & Humans

When it comes to your site’s SEO ROI, JavaScript can open up a world of opportunity. JS can enhance the user experience, provide helpful details to your visitors, and even accelerate potential customers through the buyer’s journey. But when JavaScript is implemented incorrectly, it can also wreak havoc.

If Google can’t find your JavaScript content, whatever the reason may be, that content won’t end up in the index, and you’ll lose out on a ton of traffic.

Botify’s Methodology calls for optimizing every step of the SEO funnel if you want organic traffic to be a top revenue generator. If your site uses JavaScript, it may be getting hit by hidden issues affecting parts of your site – and those issues may be impacting your traffic and revenue. 

The SEO Funnel: crawl, render, index, rank, convert

We want to help you make the most of JavaScript’s box of tricks. Here are five ways you can ensure JS is helping you – not hurting you! 

But first, how does JavaScript get indexed?

At TechSEO Boost 2019, Google’s Martin Splitt talked through the complexities of how Google renders JavaScript. You can find a recap of Martin’s talk and the live video recording on our blog, as well as a less technical breakdown of the process in our post, “JavaScript 101 for SEOs!”

What’s especially important to note is that Googlebot indexes sites with JavaScript in two waves. On the first pass, the crawler looks at the HTML content and uses only that content to begin indexing the site. At a later time – whether minutes or weeks later – the crawler returns to render the JavaScript. This is why the date and time of crawling versus indexing may differ.

How JavaScript is rendered within the SEO Funnel: crawl, index, render, index

There’s one exception to this. Sites that server-side render all of their main content should not see any delay in JavaScript content being indexed, since their most critical content will be in the HTML of their site. Therefore, if Googlebot comes back for a second pass, it will have already seen your most important content.

However, server-side rendering can be costly when it comes to bandwidth. That’s why some sites choose to server-side render some content and client-side render less meaningful content that’s not as important for Google’s initial pass.
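
To make that split concrete, here’s a minimal sketch of the hybrid approach, assuming a Node/Express app with a hypothetical getProduct helper and a hypothetical reviews endpoint: the critical product copy is rendered into the HTML Googlebot sees on its first pass, while the less important reviews widget is filled in client-side.

  const express = require('express');
  const app = express();

  // Hypothetical data helper; swap in your real data source.
  const getProduct = (id) => ({ id, name: 'Example product', description: 'Critical, SEO-relevant copy.' });

  // Hypothetical reviews endpoint used by the client-side script below.
  app.get('/api/reviews/:id', (req, res) => res.json([{ text: 'Great product!' }]));

  app.get('/products/:id', (req, res) => {
    const product = getProduct(req.params.id);
    // Critical content lives in the server-rendered HTML...
    res.send(`<html><head><title>${product.name}</title></head><body>
      <h1>${product.name}</h1>
      <p>${product.description}</p>
      <div id="reviews"></div>
      <script>
        // ...while the less critical reviews are fetched and rendered client-side.
        fetch('/api/reviews/${product.id}')
          .then((r) => r.json())
          .then((reviews) => {
            document.getElementById('reviews').innerHTML =
              reviews.map((rev) => '<p>' + rev.text + '</p>').join('');
          });
      </script>
    </body></html>`);
  });

  app.listen(3000);

Either way, the key point is that anything you need Google to see on its first pass should already be in that server response.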

Now, onto the checklist! What do you need to know to optimize your JavaScript for SEO? 

1. Avoid unintentionally blocking search engines from seeing your JS content

It’s critical for Google to be able to find your JavaScript content. Some sites, however, use a process called cloaking to serve their JS content to visitors but hide it from crawlers. This is a violation of Google’s Webmaster Guidelines and a big no-no! Instead, you should identify areas of your site where Google can’t find your JS content and fix them – fast! Here are a couple of instances where you might be blocking Google from seeing your content without even realizing it.

If your site hosts resources on a different domain (for example: hosting media files on Botify.net for Botify.com), you’ll need to make sure that your robots.txt directives on both domains aren’t blocking search engines from retrieving resources required for rendering. If one domain is blocking Google, it’s likely your pages will look incomplete to Googlebot. You can use the “Fetch as Google” function to identify which resources the robots.txt directives are blocking, or check the JavaScript crawl in Botify Analytics’ SiteCrawler.
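
As a hypothetical illustration, here is what that can look like in an asset domain’s robots.txt (say, static.example.com): the “before” rules block the scripts and styles the main domain needs for rendering, and the “after” rules allow them.

  Before (blocks render-critical resources):

  User-agent: *
  Disallow: /js/
  Disallow: /css/

  After (lets search engines fetch them):

  User-agent: *
  Allow: /js/
  Allow: /css/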


Third-party functionality that your site implements is also hosted on different domains and can potentially block Google – so it’s important to check those as well. Plus, if your site uses a third-party feature to feed in reviews, for example, you’ll want to investigate what’s actually happening when a bot or human visits your site. If you need to scroll down in order to trigger the content to load, Google won’t be able to see it unless it’s already present in the HTML. You can find more on this in step #3!

2. Pay attention to URL structure

When it comes to your pages’ URL structure, it’s best to have URLs that are SEO-friendly. That means your URLs shouldn’t change once the visitor arrives on the page. In some cases, websites implement a pushState change to the URL, which can confuse Google when it’s trying to identify which page is the canonical version.

While pushState changes may be helpful for referencing the visitor’s history later in their journey on your site, they can cause issues for Google when it crawls your site. The goal is for Google to crawl the page without seeing more than one URL. Otherwise, you’re essentially sending Google through a “JS redirect.” This eats away at your crawl budget, so only the clean URL should be surfaced.
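
Here’s a small, hypothetical sketch of the pattern: a filter widget rewrites the URL with history.pushState, creating variants like /shoes?color=red that crawlers can treat as separate pages. If you use this pattern, keep a <link rel="canonical"> pointing at the clean /shoes URL (or avoid rewriting the URL at all) so Google only surfaces one version.

  // Hypothetical helper that re-renders the product list client-side.
  function renderFilteredResults(color) {
    document.querySelector('#results').textContent = `Showing ${color} shoes`;
  }

  // The filter updates the URL via pushState, creating URL variants
  // (e.g. /shoes?color=red) that Google may crawl as separate pages.
  document.querySelector('#color-filter').addEventListener('change', (event) => {
    history.pushState({}, '', `/shoes?color=${event.target.value}`);
    renderFilteredResults(event.target.value);
  });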

3. Try not to play favorites with UX over crawl budget, and vice versa 

There are often instances when a developer may prioritize the user experience while neglecting to consider how Google discovers their content. But in reality, the JavaScript that helps a visitor have a better, more engaging experience on your site can also hurt your SEO – and vice versa. 

When it comes to client-side rendering, you have to remember that the visitor – whether bot or human – may have difficulty accessing content that’s JS-loaded. Google can’t trigger content to load; it only sees what’s available on the initial render or through a link. Bots won’t be able to see JS content that’s triggered by scrolling, since they’re unable to scroll. Instead, keep the content in the page and reveal it with a button rather than a scroll trigger. While lazy loading can be great for visitors, it can prevent bots from accessing potentially critical content.
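
As a rough sketch (the endpoint and element IDs here are hypothetical), compare a scroll-triggered load, which Googlebot will never fire, with content that’s already in the page and simply revealed on click:

  // Problematic: content only loads when a scroll event fires.
  // Googlebot doesn't scroll, so this may never be rendered or indexed.
  window.addEventListener('scroll', () => {
    const reviews = document.querySelector('#reviews');
    if (window.scrollY > 800 && !reviews.hasChildNodes()) {
      fetch('/api/reviews')
        .then((r) => r.json())
        .then((data) => {
          reviews.innerHTML = data.map((rev) => '<p>' + rev.text + '</p>').join('');
        });
    }
  });

  // Safer: the content is already in the DOM (or the initial HTML) and a
  // button simply reveals it, so both bots and humans can access it.
  document.querySelector('#show-reviews').addEventListener('click', () => {
    document.querySelector('#reviews').hidden = false;
  });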

On the other hand, server-side rendering is a great option for bots. Since the crawler doesn’t have to fetch and execute the JavaScript resources itself, it saves valuable time, which in turn improves your site’s crawl budget.

While SSR and CSR each have pros and cons, the most important thing is to make sure your pages are friendly to both bots and humans.

4. Take steps to avoid site latency

Botify crawls your site just like a search engine would – and the same goes for rendering! Botify’s JavaScript Crawler uses timers to track time to first paint, first contentful paint, and first meaningful paint (the same terminology Google uses). This ensures we’re observing the best-case scenario for how long it takes to load a given page, while excluding resources from third parties like Facebook, Twitter, and so on that aren’t necessary for SEO analysis.
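
If you want to eyeball the same metrics in your own browser, here’s a minimal sketch using the standard Performance API (this is how the browser exposes first paint and first contentful paint, not a description of Botify’s internal implementation):

  // Logs first-paint and first-contentful-paint timings for the current page.
  new PerformanceObserver((list) => {
    for (const entry of list.getEntries()) {
      console.log(`${entry.name}: ${Math.round(entry.startTime)} ms`);
    }
  }).observe({ type: 'paint', buffered: true });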

Here’s an example of a website before and after configuring Botify’s JavaScript crawl. It went from 354 resources down to just 13 that were critical for SEO analysis!

JS Crawl Configuration showing resources fetched

You’ll be able to see all of the resources Botify executed and their types, giving you complete visibility into how a search engine renders your content.

When we pick up an issue that may be affecting the way Google is rendering your page, you’ll see each call at the page level. We’ll show you all of the resources that were executed, the type of each request, and how long each request took. You’ll also be able to see whether especially large or slow resources, or requests like XMLHttpRequests, may be slowing down page rendering.

If you have segmentation set up, you can use Botify’s JS crawl to discover how JavaScript resources are impacting performance across different sections of your site – like your site’s top-converting pages. You can also use the free tool GTmetrix to learn which resources are being used and what’s slowing your website down.

5. Test that your JS appears in the DOM tree 

Testing is critical when it comes to JavaScript. You should test whether or not the content on your site’s pages appears in the DOM tree (which shows the structure of the page’s content and how each element relates to the others). You can view the DOM with your browser’s “inspect element” feature, or in the rendered source of any rendering tool (like Botify). The DOM will appear mostly empty at first, and then it will grow as JS is rendered.

It’s important to look at the rendered DOM to make sure your content has loaded properly. If you search for a snippet of on-page text and it’s not found in the rendered DOM, Google likely won’t be able to see it – even though a visitor might see the content on the page.
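
For a quick spot check, you can paste document.body.innerText.includes('your text snippet') into your browser’s DevTools console on a rendered page. To check a rendered page from a script, here’s a minimal sketch assuming Puppeteer (a headless Chrome library) is installed and using a placeholder URL:

  const puppeteer = require('puppeteer');

  (async () => {
    const browser = await puppeteer.launch();
    const page = await browser.newPage();
    // Wait for the network to settle so client-side JS has a chance to render.
    await page.goto('https://www.example.com/some-page', { waitUntil: 'networkidle0' });
    const found = await page.evaluate(
      (snippet) => document.body.innerText.includes(snippet),
      'your text snippet'
    );
    console.log(found ? 'Snippet found in rendered DOM' : 'Snippet missing from rendered DOM');
    await browser.close();
  })();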

To get an idea of what search engines are rendering, we recommend using Botify to render your site’s pages at scale. Drilling down on the major differences between your site’s millions (or billions) of pages and splitting them up into digestible sections will help you uncover any issues that might be lurking. 

Additionally, if you want to spot-check a couple of priority URLs, tools like GSC, the Mobile-Friendly Test, and the Structured Data Testing Tool use Google’s evergreen bot, which renders JS as Googlebot would. You can use these single-page testers to inspect a page and then review the rendered output to see if your content is loaded.

Uncovering hidden opportunities with JavaScript: case study

We can’t think of a better argument for testing your JavaScript than from our recent case study with Botify customer Carvana. After running a series of tests with Botify, the Carvana team found that search engines weren’t rendering their server-side code, which was causing client-side issues. Essentially, Google was missing two huge parts of Carvana’s website! 

With the help of the engineering team, Carvana implemented a series of code changes to fix the rendering failures. Then they moved some of their critical content so that it would be discoverable during Google’s first HTML pass. The result? More traffic to Carvana’s site, a 332% boost in ranking keywords, and a 749% boost in ranking URLs!

332% increase in ranking keywords and 749% increase in ranking URLs

Carvana’s story is a testament to the true power of optimizing your JavaScript for SEO.

Drive better results with JavaScript

JavaScript doesn’t need to be every SEO’s biggest nightmare (after all, there are 3,200 search algorithm updates each year that can fill that role!). When JS is implemented correctly, it can benefit your user experience and your rankings. It’s only when Google can’t find SEO-critical elements like links and content that your site – and ultimately your ROI – can suffer. 

We recommend that you dedicate time to fixing JS issues that may be affecting all or part of your site. If you’re a Botify customer, you can use our JavaScript Crawler to:  

  • Execute all of your pages’ JS code
  • Select which JS resources to execute, and which to ignore
  • Cache external resources
  • Follow your robots.txt directives (or specify alternative instructions), and flag warnings 
  • Check your pages’ load times 

If Google can’t find your content, then it’s likely you’re losing out on traffic and falling short of your site’s true revenue potential. Uncover hidden issues with your JavaScript, and unlock greater ROI! 


