
The Future of Enterprise SEO

8th August 2019 | Botify News

This is part one in a multi-part series about enterprise SEO.

What does it take to perform well in search engines like Google? That question has been at the heart of search engine optimization (SEO) since its inception, and with some studies indicating that search engines contribute ~35% of all website traffic, it’s a question that deserves an answer.

For many years, people have viewed Google as a “black box” — completely mysterious and unknowable. As a result, SEO specialists have historically spent much of their time trying to reverse engineer the algorithm in an attempt to leverage the traffic and revenue-driving potential of the organic search channel.

But just as quickly as we think we understand Google, it changes and matures, causing many exasperated SEOs and marketers to ask, “Can this channel ever be mastered?”

Time has not only brought about many changes in the algorithm; it's also brought about many changes in the web. While Google was improving its algorithm, it was also sprinting to keep pace with new technology like JavaScript, leading search engines to miss an average of 50% of website content.

At the intersection of search engine maturity and web complexity lies the future of SEO. The question is no longer “What can I do to rank?” but “How can I make sure every piece of content I publish is immediately discovered and indexed by search engines so that my audience can find it?” That challenge is arguably greatest for enterprises, which have to account for potentially millions of pages.

To better prepare for the future of enterprise SEO, we need to explore the complexities of the web today, and how they change the way we think about our websites.

Complexities presented by the modern web

When we say the web is more complex than ever before, we mean on just about every level.

JavaScript and how Google renders your pages

Larry Page and Sergey Brin developed Google’s algorithm when web pages were primarily HTML (hypertext markup language). When new technology like JavaScript entered the scene, allowing pages to become more dynamic, Google knew it had to start treating pages more like a modern browser.

Essentially, JavaScript can change the way a page looks from the browser itself, rather than the server. In order for Google to see those modifications, it needs to render the pages just like a browser would. But rendering is time- and resource-intensive, leaving Google little choice but to index web pages in two phases. At Botify, we’re calling this “render budget,” and it can lead to Google missing some details about your pages on the first pass. When this happens, your organic performance can suffer.
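To make the two-phase problem concrete, here is a minimal sketch in Python (the HTML, URLs, and product names are hypothetical). The raw server response contains only an empty container; the product links exist only after a browser executes the script, so a crawler that extracts links from the raw HTML sees nothing:

```python
from html.parser import HTMLParser

# Hypothetical raw HTML for a page whose product links are injected
# client-side by JavaScript. The server sends only an empty container
# plus a script tag.
RAW_HTML = """
<html><body>
  <h1>Acme Widgets</h1>
  <div id="products"></div>
  <script>
    // This runs in the browser only. A crawler that does not render
    // JavaScript never executes it.
    document.getElementById('products').innerHTML =
      '<a href="/widget-a">Widget A</a><a href="/widget-b">Widget B</a>';
  </script>
</body></html>
"""

class LinkExtractor(HTMLParser):
    """Collects href values the way a non-rendering crawler would."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Script contents are treated as raw text, not markup, so the
        # anchors inside the JS string above are never seen here.
        if tag == "a":
            self.links.extend(v for k, v in attrs if k == "href")

parser = LinkExtractor()
parser.feed(RAW_HTML)
print(parser.links)  # [] -- the product links only exist after rendering
```

A rendering crawler, by contrast, would execute the script first and then extract two product links from the resulting DOM.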

Many organizations’ transition from HTML to JavaScript wasn’t a smooth one. Due to incorrect deployment — things like bad development or canned, non-customized code — organic website traffic tanked, prompting many to revert to the HTML versions of their websites. Since then, developers have advanced in this area, leading to a renewed emphasis on JavaScript. If an enterprise organization doesn’t already have JavaScript deployed on its website, it’s likely in the planning stages to make it happen.

Enterprise-level websites today often rely on JavaScript for some SEO-critical elements like text and links, which is why it’s critical for enterprises to test and ensure Google isn’t missing their important content.

Botify has been hard at work building tools to address this new reality, and developed JavaScript Crawl to do just that. Traditional crawling tools don’t render JavaScript, and therefore miss this important content. Botify crawls and renders your content just like Google, allowing you to know with certainty whether search engines are using your content for indexing and ranking.

Mobile-first indexing and understanding mobile/desktop parity

Google recently began moving websites over to mobile-first indexing. While Google used to crawl, index, and rank web pages based on their desktop version, Google now does this based on the website’s mobile version.

Based on Botify’s analysis, smaller websites are more likely to be among the first to enter the mobile-first index compared to large websites. Google is either giving the largest sites longer to prepare, moving them over at a slower pace to ensure successful transitions, or both.

From a risk assessment point of view, this makes sense, because smaller websites should be easier to move for both Google and the site stakeholders than enterprise-grade websites.


Botify’s research also indicates that responsive websites are much more likely to enter the mobile-first index first. Google appears to feel that it’s less risky to transition responsive websites, since the same page is served regardless of device/user agent. By contrast, Mobile URL and Dynamic Serving websites serve different pages depending on the user agent, meaning content lacks parity when accessed on mobile versus desktop.


Mobile-first indexing makes it critical for enterprises to ensure that there’s parity between the mobile and desktop versions of their websites. If the mobile version of a web page didn’t contain all the elements the desktop version had, its organic performance could suffer upon being moved over to the mobile-first index.

For many enterprises, the move over to mobile-first indexing has been challenging. Due to the size of their websites, achieving parity has become a long project tackled in phases. Some have opted to fix parity issues on small sections of their site first, monitor the results, then take their learnings and apply them toward fixing larger sections of their site.
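As a rough illustration of what a parity check involves (this is a sketch of the general idea, not Botify’s implementation — the page snippets and the choice of elements are hypothetical), the script below compares a few SEO-critical elements between the desktop and mobile HTML of the same page:

```python
from html.parser import HTMLParser

class PageSummary(HTMLParser):
    """Extracts elements worth comparing for parity: title, h1, links."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.h1 = ""
        self.links = set()
        self._in = None

    def handle_starttag(self, tag, attrs):
        if tag in ("title", "h1"):
            self._in = tag
        elif tag == "a":
            self.links.update(v for k, v in attrs if k == "href")

    def handle_endtag(self, tag):
        if tag == self._in:
            self._in = None

    def handle_data(self, data):
        if self._in == "title":
            self.title += data.strip()
        elif self._in == "h1":
            self.h1 += data.strip()

def parity_report(desktop_html, mobile_html):
    """Flags SEO-critical elements present on desktop but not mobile."""
    d, m = PageSummary(), PageSummary()
    d.feed(desktop_html)
    m.feed(mobile_html)
    return {
        "title_match": d.title == m.title,
        "h1_match": d.h1 == m.h1,
        "links_missing_on_mobile": sorted(d.links - m.links),
    }

# Hypothetical desktop and mobile versions of one page.
desktop = '<title>Shoes</title><h1>Shoes</h1><a href="/sale">Sale</a><a href="/new">New</a>'
mobile = '<title>Shoes</title><h1>Shoes</h1><a href="/sale">Sale</a>'
print(parity_report(desktop, mobile))
# {'title_match': True, 'h1_match': True, 'links_missing_on_mobile': ['/new']}
```

In practice you would fetch each URL twice — once with a desktop user agent and once with a smartphone user agent — render both, and run a comparison like this across every template on the site.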

Here we find another critical activity that’s nearly impossible to scale manually: how can enterprises hope to find the time to compare the mobile and desktop versions of all their pages? Botify has solved that as well.

With reports that help you ensure mobile compliance, enforce content parity, and optimize load time, enterprise websites can move into the mobile-first index with confidence.

Google indexes trillions of pages, and more are being published every day

Both the web and Google’s index of the web are growing daily. According to Google, “Search starts with the web. It’s made up of over 130 trillion individual pages and it’s constantly growing.”

This places a heavy demand on Google, which has to crawl and render all of this content. As we discussed in our blog post on render budget, Google doesn’t have the time or the resources to crawl and render everything. Its answer is a crawl budget — a limit on how many URLs Google’s crawler will fetch before leaving your website.

Because of these budgets, Google may be missing some of a website’s important content. In fact, our data reveals that Google is ignoring about half of the content on enterprise websites. Google may also be spending time on “non-compliant” pages — pages that don’t create any meaningful experience for searchers, such as duplicate content or pages responding with errors. When Google has limited time to explore your content, you can’t afford to have it waste that time on these types of pages.
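Server logs are where this crawl-budget spend becomes visible. As a simplified sketch (the log lines, IPs, and URLs below are invented, and the format is a stripped-down version of a common access log), you can filter for Googlebot hits and flag budget wasted on error responses or parameterized duplicates — keeping in mind that a real pipeline should also verify the visitor is genuinely Googlebot, since user agents can be spoofed:

```python
import re
from collections import Counter

# Hypothetical, simplified access-log lines.
LOG_LINES = [
    '66.249.66.1 "GET /products/widget-a HTTP/1.1" 200 "Googlebot/2.1"',
    '66.249.66.1 "GET /products/widget-a?sessionid=42 HTTP/1.1" 200 "Googlebot/2.1"',
    '66.249.66.1 "GET /old-page HTTP/1.1" 404 "Googlebot/2.1"',
    '10.0.0.5 "GET /products/widget-b HTTP/1.1" 200 "Mozilla/5.0"',
]

LINE_RE = re.compile(r'"GET (\S+) HTTP/[\d.]+" (\d{3}) "([^"]+)"')

def googlebot_crawl_stats(lines):
    """Counts URLs Googlebot spent crawl budget on, and flags budget
    wasted on non-200 responses or parameterized duplicate URLs."""
    crawled = Counter()
    wasted = []
    for line in lines:
        match = LINE_RE.search(line)
        if not match:
            continue
        url, status, agent = match.groups()
        if "Googlebot" not in agent:
            continue  # only count search engine hits
        crawled[url] += 1
        if status != "200" or "?" in url:
            wasted.append(url)
    return crawled, wasted

crawled, wasted = googlebot_crawl_stats(LOG_LINES)
print(wasted)  # ['/products/widget-a?sessionid=42', '/old-page']
```

Note that /products/widget-b never appears in the Googlebot counts: from the log’s point of view, that page is invisible to Google, which is exactly the kind of gap log analysis surfaces.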

If Google isn’t crawling your important content, it won’t be indexing that content either, and content that doesn’t exist in the index has no chance of earning any traffic from Google organic search. In order to correct course, enterprises need to be able to see their websites the way Google sees them. This is made possible by the Botify Log Analyzer.


This adds transparency to the search process for enterprises who traditionally have little insight into how Google is treating their pages. Once you know which of your important pages are invisible to Google, Botify makes it easy to diagnose the cause and correct course.

Search engines and enterprise websites: creating a mutually beneficial relationship

With statistics like Google ignoring half of enterprise website content, it might seem like search engines simply don’t like large or technologically advanced websites. The reality is, search engines and enterprise websites don’t have to be at odds.

Google’s stated goal is to “organize the world’s information and make it universally accessible and useful.” That includes the information on enterprise websites. If enterprise websites want to benefit from the traffic that search engines can send them, though, they need to adopt a new methodology.

Large websites simply cannot start with rankings and keywords or they will miss an enormous opportunity. They need to go deeper, focusing on the entire search process from crawling to conversion.

We’re going to explore that unique methodology in the next post in this series, following it up with a deep dive into each facet of that methodology.

When adopted, this methodology creates a symbiotic relationship between search engines and enterprise websites: help Google find and understand your pages, and you’ll be rewarded.