JavaScript SEO

What Is Dynamic Rendering & How Does It Impact SEO?

By The Botify Team

At Google’s 2018 I/O conference, John Mueller introduced the concept of dynamic rendering (although in practice, many sites were already doing this via homegrown solutions or third-party software). 

“In a nutshell,” John said, “dynamic rendering is the principle of sending normal, client-side rendered content to users, and sending fully server-side rendered content to search engines.” 

And on June 30, Bing released updated Webmaster Guidelines with an interesting addition — “Bing recommends Dynamic Rendering to switch between client-side rendered and pre-rendered content for specific user agents such as Bingbot, especially for large websites.”

If it’s significant enough for both Google and Bing to recommend, it’s important for us as SEOs to understand it: when it’s appropriate, how it can help, and what implications we need to watch out for.


What is dynamic rendering?   

TL;DR: Dynamic Rendering sends fully-rendered content to search engines while serving human visitors with normal, client-side rendered content. It’s pre-rendering, but just for search engine bots.   

Dynamic rendering is pre-rendering for search engine bots. It creates and serves a static HTML (server-side rendered) version of your page to Googlebot, Bingbot, and others.  

It’s a technique where the page renders differently depending on which user agent requests it, switching between client-side rendered content and pre-rendered content for certain user agents. 

In other words, Dynamic Rendering sends normal client-side rendered content to users and transforms your dynamic content into flat HTML to send to search engines. This means that your content can be crawled and indexed without Google needing to execute JavaScript.

When search engine bots go to access a dynamically rendered page, they get a version of the page that was rendered on demand in headless Chromium. 

💡 “Headless” just means a browser with no visual representation/screen — it’s all the underlying technology of a browser, except the output is code rather than an interactive screen.
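In code, the user-agent switch at the heart of dynamic rendering can be sketched roughly like this. This is a minimal illustration in JavaScript; the bot list and function names are our own assumptions, and real-world solutions such as Rendertron or prerender.io maintain far more complete user-agent lists (and often verify bots via reverse DNS lookup as well):

```javascript
// Illustrative list of search engine bot tokens -- production middleware
// maintains a much longer list and may also verify bots by reverse DNS.
const BOT_PATTERN = /googlebot|bingbot|yandex|duckduckbot|baiduspider/i;

// Does this User-Agent string look like a search engine crawler?
function isSearchBot(userAgent) {
  return BOT_PATTERN.test(userAgent || '');
}

// Decide which version of the page to serve:
// bots get the pre-rendered (flat HTML) snapshot,
// humans get the normal client-side rendered app.
function chooseVersion(userAgent) {
  return isSearchBot(userAgent) ? 'prerendered' : 'client-side';
}
```

In an Express app, for example, a check like this would typically live in middleware that proxies bot requests to a pre-render service and passes human requests through to the normal client-side app.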

Who is dynamic rendering for?

TL;DR: Dynamic rendering is best for large, JavaScript-heavy sites that change rapidly. It can also benefit companies who are budget-conscious and low on engineering resources. 

According to Google, “dynamic rendering is good for indexable, public JavaScript-generated content that changes rapidly.” 

Because dynamic rendering can also help search engine bots crawl and index more of your important content (more on that later), it’s great for sites that struggle with crawl budget issues — typically large websites. 

We also know that it’s easier to deploy than server-side rendering, and less expensive than pre-rendering your content for both humans and bots. 

Still not sure if dynamic rendering is right for you? If you can answer “yes” to one or more of these questions, you may want to consider dynamic rendering as an option for your website:

  • Is the web property you’re considering implementing dynamic rendering on indexable? (i.e. you want people to be able to find it in search engines)
  • Does the web property in question rely on JavaScript to generate part or all of the content?
  • Does the content on your web property change rapidly? (e.g. an e-commerce website with constantly-changing inventory)
  • Are you struggling with crawl budget issues? (i.e. search engine bots aren’t finding all your important content)
  • Does your engineering team have too much on their plates to implement server-side rendering?
  • Are you facing budget constraints?  

Why do Google and Bing recommend dynamic rendering? 

TL;DR: While Google and Bing can process JavaScript, they face limitations trying to do that at scale. Dynamic rendering removes those limitations, since it means search engine bots get your content without needing to render it.    

Google is on the record saying: 

“Even though Googlebot can render JavaScript, we don’t want to rely on that.” — Martin Splitt, explaining why someone would implement dynamic rendering

“Currently, it’s difficult to process JavaScript and not all search engine crawlers are able to process it successfully or immediately. In the future, we hope that this problem can be fixed, but in the meantime, we recommend dynamic rendering as a workaround solution to this problem.” — Google’s documentation on dynamic rendering 

It’s important here to remember that Google has a rendering queue and two waves of indexing. 

Google’s HTML crawler can’t execute JavaScript, so when Googlebot encounters a page that relies on it, the page goes into a queue where it waits for rendering resources to become available; once they are, Googlebot can fully render the page. 

Bing faces similar limitations when it comes to JavaScript. To clarify their position on this issue, they recently added this section to their Webmaster Guidelines:

“Bing is generally able to process JavaScript, however, there are limitations to processing JavaScript at scale while minimizing the number of HTTP requests. Bing recommends Dynamic Rendering to switch between client-side rendered and pre-rendered content for specific user agents such as Bingbot especially for large web sites.”

If you’ve deployed a dynamic rendering solution, when Googlebot, Bingbot, and other search engine bots encounter your pages, they’ll be served a fully-rendered page, meaning they don’t have to render anything. That means you don’t have to worry about search engines missing any of your content. 

What problems does dynamic rendering solve?

⚡ TL;DR: JavaScript can both slow pages down and be difficult for search engines to process successfully or immediately. By eliminating the need for search engines to process JavaScript, you can improve both speed-related crawl budget issues and prevent search engines from missing your JavaScript-loaded content.

How does dynamic rendering fix crawl budget issues?

Both users and bots are affected by page speed. 

For users, a slow-loading page means frustration and a higher chance they’ll bounce without purchasing. For bots, slow page speeds mean they can’t crawl as many of your pages.

Because dynamic rendering is a solution exclusively for bots (i.e. human users continue to get normal, client-side rendered content), we’re going to focus on the SEO benefits of sending fully-rendered pages to bots. 

Large sites, especially JavaScript-heavy ones, can suffer from crawl budget issues. 

Because search engines like Google and Bing don’t have unlimited time, they set a cap on how many pages they can and will crawl on a single website at a time. That cap is your crawl budget, and it’s different for every website.

Google, at least, calculates crawl budget by combining crawl rate limit (influenced by factors like page load time) with crawl demand (influenced by how popular and/or fresh your pages are). 

It can be difficult for search engine bots to crawl through large sites because of these limits. According to our research, this issue has led to Google missing about 51% of all the pages on enterprise websites.

(Based on Botify’s June 2018 study, “How Does Google Crawl the Web?”, which analyzed 6.2B Googlebot requests across 413M web pages.)

Add in JavaScript, which can slow search engine bots down, and it can make your crawl budget issues even worse.  

To illustrate the impact page speed improvements can have on bots, let’s take a look at a website that improved their average delay from 1,053ms to 730ms, resulting in a huge increase in crawl frequency. 

💡 Why is crawl improvement important? Remembering the SEO funnel, we know that in order to make $ from organic search, you have to be getting traffic, which comes from rankings, which requires being indexed, which requires being crawled and rendered. Pages that Google isn’t crawling can’t make you money.

When sites serve faster pages to search engine bots, those bots can crawl more pages on your site. 

The more pages you have and the slower they are, the less likely it is that Googlebot will have time to get through them all, leaving a percentage of pages on your site not crawled at all (bad) or not crawled frequently enough (also bad if your content changes frequently). 

Here’s another example: pages that load in under 500ms are typically crawled ~50% more than pages in the 500–1,000ms range, and ~130% more than pages that take over 1,000ms. 

But how exactly can we improve page speed through dynamic rendering? 

Because JavaScript can add seconds of load time to webpages, and dynamic rendering sends a fully-rendered page to search engine bots, dynamic rendering gets pages to Googlebot (and other search engines) faster. That lets them get through more pages on your site, which means more pages indexed, ranking, and driving traffic and revenue. 

How does dynamic rendering help get your JavaScript-loaded content indexed?

In order for search engines to see content loaded by JavaScript in the browser, they have to render it. Because rendering pages at the scale of the web requires a lot of time and computational resources, search engine bots defer rendering JavaScript until they have the resources available. 

The JavaScript-loaded content you have and the JavaScript-loaded content search engines have rendered are not always the same, and that’s because of something we call render budget.

Like crawl budget, render budget means that some details about your page might be missed in the interim, especially if your website content changes frequently (think e-commerce sites with constantly-changing inventory or media sites that publish hundreds of new articles daily).
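One way to picture the freshness problem: dynamic rendering setups typically cache pre-rendered snapshots and re-render them once they go stale. This minimal sketch (our own illustration, not any particular product’s API) shows why fast-changing sites need a short time-to-live on those snapshots:

```javascript
// Minimal in-memory cache for pre-rendered snapshots.
// maxAgeMs controls how long a snapshot counts as fresh -- fast-changing
// sites (e.g. e-commerce inventory) need a short TTL so bots never
// receive stale content.
class SnapshotCache {
  constructor(maxAgeMs) {
    this.maxAgeMs = maxAgeMs;
    this.entries = new Map(); // url -> { html, renderedAt }
  }

  set(url, html, now = Date.now()) {
    this.entries.set(url, { html, renderedAt: now });
  }

  // Returns the cached HTML if still fresh, otherwise null
  // (signalling that the page should be re-rendered for the bot).
  get(url, now = Date.now()) {
    const entry = this.entries.get(url);
    if (!entry || now - entry.renderedAt > this.maxAgeMs) return null;
    return entry.html;
  }
}
```

The `now` parameters exist only to make the freshness logic easy to test; in real use you would rely on the `Date.now()` defaults.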

Here’s an example of what that looks like in practice: one website switched to a client-side JavaScript framework (not server-side rendered) and immediately saw a huge drop in organic search traffic. 

However, if you lift the burden of rendering your content off the search engine, this is no longer an issue. 

Through dynamic rendering, search engine bots get the fully-rendered version of your pages, so you don’t have to worry about them missing your JavaScript-loaded content and links.

Dynamic rendering is a less resource-intensive solution

⚡ TL;DR: Dev teams prioritize user-focused projects, so making improvements for bots might not make it onto their roadmap. Dynamic rendering is an easier, less resource-intensive option for giving the bots what they want. 

One of the biggest barriers to making improvements for bots is that most engineering and dev teams are focused on the user, and therefore prioritize UX issues and projects. 

Even if, for example, your page speed optimizations make it onto their radar, it can be very resource-intensive and complex to implement. 

How resource-intensive? If you remember our client from earlier who improved their average delay from 1,053ms to 730ms, that project took an entire year and a ton of resources — and they’re definitely not alone.  

Dynamic rendering is a great solution because it’s faster and less resource-intensive than options like server-side rendering, making it much easier to actually implement this type of optimization.  

Is dynamic rendering cloaking?

TL;DR: Dynamic rendering is only considered cloaking if you use it to serve a very different version of your page to search engine bots vs. human visitors. 

According to Google, dynamic rendering is not cloaking. 

Google describes cloaking as “the practice of presenting different content or URLs to human users and search engines.” 

If you think that sounds a lot like the definition of dynamic rendering, you’re not wrong! So if you’re confused about this subject, it’s definitely understandable. 

Here’s a good way to understand the difference, though: 

  • Cloaking is the process of sending different content to users vs. search engines for the purpose of deceiving one or both parties.
  • Dynamic rendering is the process of sending different content to users vs. search engines for the purpose of pre-rendering your content for bots.   

Google first addressed cloaking in its early days when cloaking was primarily used to trick search engines and artificially inflate a page’s position in search results. 

For example, people would use cloaking to stuff keywords onto a page only when the user-agent requesting the page was a search engine. When a human visitor requested the page, they’d receive the human-readable, non-keyword-stuffed version. 

In other words, you can use dynamic rendering to cloak (which is bad — definitely don’t do this), but not all dynamic rendering is cloaking.  

If you’re going to use dynamic rendering, make sure to minimize the differences between the version of your page you’re sending to search bots and the version you’re sending to users. 
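A simple sanity check along these lines: strip the markup from both versions and compare the visible text. This is a rough, illustrative sketch (our own helper names, and a naive regex-based tag stripper rather than a proper HTML parser), but it conveys the idea that the bot and user versions should carry the same content:

```javascript
// Extract roughly the visible text of a page by dropping scripts and tags.
// NOTE: a naive regex stripper for illustration only -- a real check
// should use a proper HTML parser.
function visibleText(html) {
  return html
    .replace(/<script[\s\S]*?<\/script>/gi, ' ') // drop inline scripts
    .replace(/<[^>]+>/g, ' ')                    // drop remaining tags
    .replace(/\s+/g, ' ')
    .trim();
}

// The pre-rendered (bot) version and client-side (user) version should
// expose the same visible content -- large differences risk crossing
// into cloaking territory.
function contentMatches(botHtml, userHtml) {
  return visibleText(botHtml) === visibleText(userHtml);
}
```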

Will dynamic rendering improve what my human visitors experience?

No. Dynamic rendering serves fully-rendered content to search engines and other bots, not to users. 

Why wouldn’t you just use server-side rendering and improve load speeds for both users and bots?

TL;DR: Server-side rendering often isn’t financially viable or something engineering teams have time for.

Server-side rendering is a preferred option for many because all the meaningful content gets rendered on the website’s server — both users and bots receive a fully-loaded page, without the need to request additional resources. This takes the burden of rendering a page off of both your human visitors and search engine bots and places it on your website’s server.

So why doesn’t everyone just server-side render their JavaScript pages? Because it can be incredibly time- and resource-intensive.

Unlike server-side rendering, dynamic rendering can be a fairly turnkey solution. This means you can give search engine bots what they want without taking up too much of your engineering team’s time.

Relative to server-side rendering, dynamic rendering is also cheaper. That’s because instead of paying to pre-render your content for both human visitors and bots, you’re only paying to pre-render it for bots. 

What questions do you have about dynamic rendering?

Do you have questions about dynamic rendering? Or feel there’s anything important SEOs should know about dynamic rendering that we didn’t address here? If so, respond in the comments or Tweet to us @Botify!

We’ll also be hosting a fireside chat on the topic with Google’s Martin Splitt where you’ll have the chance to ask him your dynamic rendering questions directly. You won’t want to miss it! Sign up here.
