About once a quarter, the Botify team hosts an event where we get together for good food, good drinks, and a lively discussion about hot topics in our industry — it’s called BotifyCONNECT, and if you’ve never been, we hope you can join us for one in the future!
Here are the top 10 takeaways!
We simply don’t know unless we look at our log files. Log file analysis for SEO lets us see how often Google comes back to execute our scripts. Not only will this help you understand when you can expect Google to index your full page content, but it can also reveal where Google might be wasting time on scripts that aren’t even critical to changing the page’s content.
If you notice during your log file analysis that Google is spending time on scripts that aren’t critical to the page content, try caching those resources so Google doesn’t try to crawl them again.
Remember, log files are the only way to understand what Google is spending time on. So if you’re dealing with an issue where Google isn’t indexing all of your important content, try tactics like caching to direct Google’s time away from unimportant scripts and toward the most important elements of your pages.
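To make the log-file tactic concrete, here’s a minimal sketch of how you might count Googlebot hits per JavaScript resource in a standard combined-format access log. The log lines, file paths, and IPs below are all hypothetical examples, not from a real site; in practice you’d also verify Googlebot via reverse DNS, since the user-agent string alone can be spoofed.

```python
import re
from collections import Counter

# Hypothetical combined-format access-log lines (paths and IPs are illustrative).
SAMPLE_LOG = """\
66.249.66.1 - - [10/May/2024:10:00:01 +0000] "GET /static/analytics-widget.js HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.66.1 - - [10/May/2024:10:02:14 +0000] "GET /static/product-detail.js HTTP/1.1" 200 20480 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.66.1 - - [10/May/2024:10:05:30 +0000] "GET /static/analytics-widget.js HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
203.0.113.7 - - [10/May/2024:10:06:00 +0000] "GET /static/analytics-widget.js HTTP/1.1" 200 5120 "-" "Mozilla/5.0"
"""

# Captures the requested path and the user-agent string from each log line.
REQUEST_RE = re.compile(r'"GET (\S+) HTTP/[\d.]+" \d+ \d+ "[^"]*" "([^"]*)"')

def googlebot_js_hits(log_text):
    """Count Googlebot requests per JavaScript resource."""
    hits = Counter()
    for line in log_text.splitlines():
        match = REQUEST_RE.search(line)
        if not match:
            continue
        path, user_agent = match.groups()
        if "Googlebot" in user_agent and path.endswith(".js"):
            hits[path] += 1
    return hits

# Scripts Googlebot fetches most often are caching candidates,
# especially if they aren't critical to the page's content.
for path, count in googlebot_js_hits(SAMPLE_LOG).most_common():
    print(f"{count:3d}  {path}")
```

A script that Googlebot fetches repeatedly but that doesn’t change the visible content (an analytics widget, for example) is exactly the kind of resource worth caching aggressively.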
Optimizing for render budget is all about helping Google out. Focus on making your sites faster and on keeping your content and links as minimally dependent on JavaScript as possible.
We’re still seeing many instances where, because something works for the user, people assume it works for search engines as well. This isn’t the case.
Developers have a lot of options to achieve a certain user experience, but not all of those options are equally accessible to search engines.
This is entirely dependent on your organization, but your product team probably isn’t thinking (at least primarily) about how their work affects SEO. This can cause issues unless you break out of your silo, embed yourself in the organization, and build good relationships with your developers and other product people.
Some tools block Googlebot from crawling their API because it consumes a lot of additional resources, but you need to make sure anything you want indexed is accessible to Googlebot. It’s very possible that a developer blocked something because they assumed Google would never need to see it, but as SEOs, we need to do our due diligence and make sure that’s actually true.
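One quick way to do that due diligence is to test your robots.txt rules programmatically. The sketch below uses Python’s standard-library robots.txt parser against a hypothetical robots.txt (the domain and API paths are made up). Note that Python’s parser applies rules in the order they appear in the file, while Google uses the most specific matching rule, so place `Allow` exceptions before the broad `Disallow` to get consistent results from both.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: the API directory is blocked, with one
# indexable endpoint carved out via an Allow rule listed first.
ROBOTS_TXT = """\
User-agent: *
Allow: /api/products/
Disallow: /api/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# An internal endpoint Google never needs: blocked.
print(parser.can_fetch("Googlebot", "https://example.com/api/internal/metrics"))
# A product endpoint whose content you want indexed: allowed.
print(parser.can_fetch("Googlebot", "https://example.com/api/products/123"))
```

Running a check like this for every resource your rendered pages depend on is a cheap way to catch a well-meaning developer’s blanket `Disallow` before it hides content you actually want indexed.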
When we polled the audience, some people were using a prerender solution, some weren’t. This underscores our stance on prerendering as well — it may make sense on some sites, but not others.
First of all, prerendering is expensive, so it likely won’t be a viable solution for many companies operating on a tight budget.
Second, prerender makes the most sense for websites with pages that don’t change often or are completely static. So if you’re an online marketplace like eBay or a publisher like NPR, prerendering wouldn’t really make sense. For example, we worked with an e-commerce business whose prerender solution was causing product prices to get out of sync — Google was seeing one price while users were seeing another.
If you’re using a prerender solution, make sure you audit it to see what a user is experiencing versus what Google is seeing.
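In practice, that audit means fetching the same URL twice, once as a regular user (letting the client-side JavaScript run) and once as Googlebot (receiving the prerendered snapshot), then diffing the critical fields. Here is a minimal sketch of the comparison step, using hypothetical HTML snippets standing in for the two fetched versions; the `data-price` attribute and the snippets themselves are illustrative assumptions, not a real site’s markup.

```python
import re

# Hypothetical snapshots of the same product page: one as a regular
# user sees it after client-side rendering, one as served to Googlebot
# by an out-of-sync prerender solution.
USER_HTML = '<div class="price" data-price="24.99">$24.99</div>'
PRERENDERED_HTML = '<div class="price" data-price="19.99">$19.99</div>'

PRICE_RE = re.compile(r'data-price="([\d.]+)"')

def extract_price(html):
    """Pull the machine-readable price out of a page snapshot, if present."""
    match = PRICE_RE.search(html)
    return match.group(1) if match else None

user_price = extract_price(USER_HTML)
bot_price = extract_price(PRERENDERED_HTML)

if user_price != bot_price:
    print(f"Mismatch: users see {user_price}, Googlebot sees {bot_price}")
```

Extend the same idea to titles, canonical tags, and structured data: any field where the prerendered snapshot drifts from the live page is a place where Google is indexing something your users never see.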
The issue is that Google is caching a version of their API response with a key that lasts only five minutes. Essentially, if Google doesn’t come back within five minutes, the key changes and the cached version can no longer be accessed. If you do cache a resource, Google will typically come back to it, and when Google encounters an expired resource, they’ll usually make an attempt to crawl and render it again.
We’ll keep this takeaway short and sweet: don’t force your visitors to load everything up front! They’ll get your content later than you want them to see it and waste a ton of RAM in the process.
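One low-effort way to put this into practice is the browser-native `loading="lazy"` attribute, which defers below-the-fold images until the user scrolls near them. The file paths below are hypothetical placeholders:

```html
<!-- Above-the-fold hero image: loads immediately (the default) -->
<img src="/images/hero.jpg" alt="Hero banner" width="1200" height="600">

<!-- Below-the-fold image: the browser defers fetching it until
     the user scrolls close enough for it to matter -->
<img src="/images/related-product.jpg" alt="Related product" loading="lazy" width="400" height="300">
```

Just keep the SEO takeaways from earlier in mind: audit how lazy-loaded content behaves when Googlebot renders the page, so deferring resources for users doesn’t end up hiding content from search engines.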
Want to join us for the next BotifyCONNECT? We’d love to have you! Sign up for updates so we can notify you when the next event is going to be and how you can register.