However, server-side rendering can be costly when it comes to bandwidth. That’s why some sites choose a hybrid approach: server-side rendering their critical content and client-side rendering the less important content that doesn’t need to be available during Google’s initial pass.
1. Avoid unintentionally blocking search engines from seeing your JS content
Third-party functionality that your site implements is hosted on different domains, and those domains can block Googlebot from fetching their resources – so it’s important to check them as well. If your site uses a third-party feature to feed in reviews, for example, you’ll want to investigate what actually happens when a bot (rather than a human) visits your site. If visitors need to scroll down to trigger the content to load, Google won’t be able to see it unless that content is already present in the initial HTML. You can find more on this in step #3!
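As a starting point for that check, here’s a rough Node.js sketch (Node 18+ for the built-in fetch; the domain, path, and naive parsing are all assumptions – a production check should use a proper robots.txt parser) that tests whether a third-party domain’s robots.txt blocks Googlebot from a resource your pages rely on:

```js
// Spot-check whether a third-party domain's robots.txt blocks Googlebot
// from fetching a resource your page depends on. The domain and path
// below are hypothetical; requires Node 18+ for the built-in fetch API.
const THIRD_PARTY_ORIGIN = "https://reviews.example-widget.com"; // hypothetical
const RESOURCE_PATH = "/widget/loader.js"; // hypothetical

async function isBlockedForGooglebot(origin, path) {
  const res = await fetch(`${origin}/robots.txt`);
  if (!res.ok) return false; // no robots.txt => nothing is blocked
  const robots = await res.text();

  // Very rough parse: collect Disallow rules from groups that apply
  // to Googlebot or to all user agents. Real matching has more rules
  // (Allow, wildcards, longest-match), so treat this as a first pass.
  let applies = false;
  const disallows = [];
  for (const line of robots.split("\n")) {
    const [field, ...rest] = line.split(":");
    const value = rest.join(":").trim();
    const key = field.trim().toLowerCase();
    if (key === "user-agent") applies = value === "*" || /googlebot/i.test(value);
    else if (key === "disallow" && applies && value) disallows.push(value);
  }
  return disallows.some((rule) => path.startsWith(rule));
}

isBlockedForGooglebot(THIRD_PARTY_ORIGIN, RESOURCE_PATH).then((blocked) =>
  console.log(blocked ? "Blocked for Googlebot" : "Fetchable by Googlebot")
);
```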
2. Pay attention to URL structure
When it comes to your pages’ URL structure, it’s best to keep URLs SEO-friendly – that is, a URL shouldn’t change once the visitor arrives on the page. Some websites implement pushState changes to the URL, which can confuse Google when it’s trying to identify which page is the canonical version.
While pushState changes may be helpful for preserving the visitor’s history later in their journey on your site, they can cause issues when Google crawls it. The goal is for Google to crawl the page without ever seeing more than one URL; otherwise, you’re essentially sending Google through a “JS redirect,” which eats away at your crawl budget. Only the clean URL should be surfaced.
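As an illustration, here’s a minimal client-side sketch (the URL, element ID, and parameter are hypothetical) that records state changes with history.pushState while keeping rel="canonical" pinned to one clean URL, so every state of the page points Google at the same address:

```js
// Hypothetical sketch: update the visible URL for the visitor's history,
// but keep rel="canonical" pinned to one clean URL for search engines.
const CANONICAL_URL = "https://www.example.com/red-sneakers"; // hypothetical

// Ensure the canonical link always points at the clean URL.
let canonical = document.querySelector('link[rel="canonical"]');
if (!canonical) {
  canonical = document.createElement("link");
  canonical.rel = "canonical";
  document.head.appendChild(canonical);
}
canonical.href = CANONICAL_URL;

// When the visitor applies a filter, record it in their history without
// creating a new crawlable URL variant.
document.querySelector("#size-filter")?.addEventListener("change", (event) => {
  const params = new URLSearchParams(window.location.search);
  params.set("size", event.target.value);
  history.pushState({ size: event.target.value }, "", `?${params}`);
  // The canonical link above is untouched, so Google still sees one URL.
});
```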
3. Try not to play favorites with UX over crawl budget, and vice versa
When it comes to client-side rendering, remember that the visitor – whether bot or human – may have trouble accessing content that’s loaded by JS. Google can’t trigger content to load – it only sees what’s available on the surface or through a link. Bots won’t see JS content that’s triggered by scrolling, since they’re unable to scroll; instead, you’ll need another trigger they can use to reveal the content, such as a button. While lazy loading can be great for visitors, it can prevent bots from accessing potentially critical content.
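To make the pitfall concrete, here’s a minimal sketch of scroll-triggered loading with IntersectionObserver (the element IDs and endpoint are hypothetical). A human scrolling the page fires the observer; a bot that never scrolls never does, so the reviews never enter its rendered DOM:

```js
// Hypothetical sketch of scroll-triggered loading. A human scrolling the
// page fires the observer; a bot that never scrolls never loads the content.
const sentinel = document.querySelector("#reviews-sentinel"); // hypothetical

const observer = new IntersectionObserver(async (entries) => {
  if (!entries[0].isIntersecting) return;
  observer.disconnect();

  // Fetch and inject the reviews only once the sentinel scrolls into view.
  const res = await fetch("/api/reviews"); // hypothetical endpoint
  document.querySelector("#reviews").innerHTML = await res.text();
});

observer.observe(sentinel);
```

Keeping critical content in the initial HTML – or reachable through a plain link – sidesteps this problem entirely.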
On the other hand, server-side rendering is a great option for bots. Since the crawler doesn’t have to fetch and execute resources just to see your content, it saves valuable time, which in turn improves your site’s crawl budget.
While SSR and CSR each have pros and cons, the most important thing is to make sure your pages are friendly to both bots and humans.
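To see why SSR is so bot-friendly, here’s a minimal Node.js/Express sketch (the route and product data are hypothetical): the critical content arrives fully formed in the initial HTML, so a crawler sees it without fetching or executing any JS:

```js
// Minimal server-side rendering sketch with Express (hypothetical data).
// The product details arrive fully formed in the initial HTML response,
// so crawlers don't need to fetch or execute any JavaScript to see them.
const express = require("express");
const app = express();

// Hypothetical data source; a real app would query a database or API.
const products = { "red-sneakers": { name: "Red Sneakers", price: "$59" } };

app.get("/products/:slug", (req, res) => {
  const product = products[req.params.slug];
  if (!product) return res.status(404).send("Not found");

  // Render the critical content directly into the HTML.
  res.send(`<!doctype html>
<html>
  <head><title>${product.name}</title></head>
  <body>
    <h1>${product.name}</h1>
    <p>${product.price}</p>
    <!-- Less critical widgets can still hydrate client-side. -->
    <script src="/client.js" defer></script>
  </body>
</html>`);
});

app.listen(3000);
```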
4. Take steps to avoid site latency
You’ll be able to see all of the resources Botify executed, along with each resource’s type, giving you complete visibility into how a search engine is rendering your content.

When we pick up an issue that may be affecting the way Google renders your page, you’ll see each call at the page level: every resource that was executed, the type of request, and how long each request took. You’ll also be able to spot especially large or slow resources that may be dragging down render times, as well as requests – like XMLHttpRequests – that may be slowing down page rendering.
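If you want to spot-check a single page yourself, the browser’s Resource Timing API exposes similar per-request data. Here’s a minimal sketch you can paste into the DevTools console after a page loads (the 500 ms threshold is an arbitrary assumption):

```js
// List the slowest resources on the current page using the Resource
// Timing API. Run in the browser DevTools console after the page loads.
const SLOW_MS = 500; // arbitrary threshold, tune to taste

const slow = performance
  .getEntriesByType("resource")
  .filter((entry) => entry.duration > SLOW_MS)
  .sort((a, b) => b.duration - a.duration)
  .map((entry) => ({
    name: entry.name,                     // resource URL
    type: entry.initiatorType,            // e.g. "script", "xmlhttprequest"
    duration: Math.round(entry.duration), // total fetch time in ms
    size: entry.transferSize,             // bytes over the wire
  }));

console.table(slow);
```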
5. Test that your JS appears in the DOM tree
It’s important to look at the rendered DOM to make sure your content loads properly. If you search the rendered DOM for a snippet of on-page text and it isn’t there, Google likely won’t be able to see it – even though a human visitor might still see the content on the page.
To get an idea of what search engines are rendering, we recommend using Botify to render your site’s pages at scale. Drilling down on the major differences between your site’s millions (or billions) of pages and splitting them up into digestible sections will help you uncover any issues that might be lurking.
Additionally, if you want to spot-check a few priority URLs, tools like GSC’s URL Inspection, the Mobile-Friendly Test, and the Structured Data Testing Tool use Google’s evergreen Googlebot, which renders JS the way Googlebot does. You can use these single-page testers to inspect a page and then download the rendered HTML to check whether your content loaded.
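For a scriptable version of that spot check, here’s a minimal sketch using Puppeteer’s headless Chrome – not Google’s renderer, but close enough for a first pass – that loads a page and searches the rendered DOM for a snippet of critical text (the URL and snippet are hypothetical):

```js
// Render a page in headless Chrome and check whether a snippet of
// critical on-page text made it into the rendered DOM.
// Requires: npm install puppeteer
const puppeteer = require("puppeteer");

const URL = "https://www.example.com/red-sneakers"; // hypothetical
const SNIPPET = "Free returns within 30 days";      // hypothetical

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(URL, { waitUntil: "networkidle0" }); // let JS finish

  // Search the rendered DOM, not the raw HTML response.
  const rendered = await page.content();
  console.log(
    rendered.includes(SNIPPET)
      ? "Snippet found in the rendered DOM"
      : "Snippet missing – Google may not see this content"
  );

  await browser.close();
})();
```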
With the help of their engineering team, Carvana implemented a series of code fixes to address the rendering failures. They then moved some of their critical content so that it would be discoverable during Google’s first HTML pass. The result? More traffic to Carvana’s site, a 332% boost in ranking keywords, and a 749% boost in ranking URLs!
- Execute all of your pages’ JS code
- Select which JS resources to execute, and which to ignore
- Cache external resources
- Follow your robots.txt directives (or specify alternative instructions), and flag warnings
- Check your pages’ load times
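To make bullets like selective execution and resource caching concrete, here’s a hedged Puppeteer sketch – an illustration of the general technique, not Botify’s implementation – that skips hypothetical third-party hosts and serves repeated external resources from an in-memory cache while rendering:

```js
// Hedged sketch: selectively execute resources and cache external ones
// while rendering pages in headless Chrome. The blocklist is hypothetical.
// Requires: npm install puppeteer
const puppeteer = require("puppeteer");

const BLOCKED = ["analytics.example.com", "ads.example.net"]; // hypothetical
const cache = new Map(); // naive in-memory cache, keyed by URL

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.setRequestInterception(true);

  page.on("request", (request) => {
    const url = request.url();

    // Ignore resources we've chosen not to execute.
    if (BLOCKED.some((host) => url.includes(host))) return request.abort();

    // Serve repeated external resources from the cache.
    const hit = cache.get(url);
    if (hit) return request.respond(hit);

    request.continue();
  });

  // Populate the cache as responses come back.
  page.on("response", async (response) => {
    const url = response.url();
    if (cache.has(url) || !response.ok()) return;
    try {
      const headers = { ...response.headers() };
      // buffer() returns the decoded body, so drop encoding headers.
      delete headers["content-encoding"];
      delete headers["content-length"];
      cache.set(url, {
        status: response.status(),
        headers,
        body: await response.buffer(),
      });
    } catch {
      // Some responses (e.g. redirects) have no readable body; skip them.
    }
  });

  await page.goto("https://www.example.com/", { waitUntil: "networkidle0" });
  await browser.close();
})();
```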