Google can often seem like a black box, and understandably so. Not even Google’s own engineers fully understand all the components of its complex algorithm. But what isn’t a black box is Google’s stance on what they want their algorithm to reward. That’s where Google’s Search Quality team comes in.
We had the honor of speaking with Kaspar Szymanski, a former senior member of the Google Search Quality team, to get his behind-the-curtain insights on what Google wants, and why so many brands get this wrong in their approach to SEO.
While he was at Google, Kaspar’s responsibilities included combing through huge amounts of data to uncover violations of Google Webmaster Guidelines — guidelines that not only cover how to help Google find, index, and rank your site, but also go over specific actions that Google considers manipulative and can warrant penalties.
“My daily tasks included investigating spam signals… for instance, looking into linking networks and at times applying penalties. I also worked on reconsideration requests, evaluating sites that wished to repent.”
In other words, Kaspar’s job was to be extremely familiar with the ways in which people attempted to manipulate Google’s algorithm for their own benefit.
Going back a decade or so, it was much more common for people to attempt tricks and tactics like hidden text, keyword stuffing, article comment links, and the like to rank in Google. Why? Because in many cases, it worked!
According to Kaspar, “Both spam (black hat methods) and Google Algorithms were less sophisticated back then.”
We asked Kaspar whether, because Google is now more sophisticated, algorithm manipulation tactics had also become more sophisticated. His answer was that the search quality conversation has largely shifted to topics like EAT (expertise, authoritativeness, and trustworthiness) for this very reason.
“Looking at the quality discussions around EAT and other recent SEO industry trends, it is obvious that a website can only be successful in a competitive niche if it demonstrates a unique selling proposition that is compelling for users.”
While Google’s algorithm has grown more sophisticated over the years, and the search quality conversation has shifted with it, Google’s goal has never really changed.
“Fundamentally, Google has not changed their stance that much. They still do not care which site ranks well for any given query, as long as it seems to live up to their users’ expectations. Managing user expectations and meeting, better yet, exceeding them is the only way to drive positive user signals, which in turn boost organic rankings. In that sense today, as much as in the past, it is still true that ‘Google loves websites that are popular with users.’”
We think it’s fair to say then, that in Google’s eyes, search quality is what happens when you’re focused on providing an exceptional experience to your search audience — answering their questions thoroughly, accurately, and clearly. This is what Google wants to reward with visibility in the SERPs, so chase your audience rather than the algorithm, and the algorithm will reward you.
In response to most algorithm updates nowadays, Google tends to describe them broadly as “quality updates” — this can frustrate companies that want more specifics so they can better understand what they need to fix if their traffic dropped as a result of the update. We asked Kaspar what advice he would give to companies in that situation.
“Attempting to follow trends, a method frequently seen, indeed is a strategy doomed to fail from the start. It is true that Google isn’t very specific about many of the updates launched. That’s however understandable given that there’s likely been an update during the time the reader has been reading this interview. There are just too many updates to comment on a substantial number of them in detail.”
But does this lack of information hurt our ability to succeed as SEOs?
“There’s no handicap to site operators as long as they focus on the more important factors, which are users. Their happiness is by far more important than any algorithmic update. Again, that is the reason why managing and meeting or exceeding user expectations throughout their entire site experience, including snippet representation but also landing page optimization, is the key to success in organic Google Search.”
Botify’s customers are overwhelmingly SEOs who work on large (hundreds of thousands or millions of URLs) websites, so naturally we were curious whether big brands struggled with different quality issues than SMBs and other brands with small websites.
In Kaspar’s experience, “The only difference is that large brands tend to have the resources required to introduce changes to their websites more swiftly, which may convey the wrong impression that they have some sort of an advantage.”
Big brands may or may not have more money and resources to throw around on staying ahead of the Google curve, but what is certain, according to Kaspar, is that Google holds all websites to the same quality standards.
“The only factor that counts towards rankings are SEO signals.”
Statements like these can frustrate some SEOs and brands who see the obvious differences in performing SEO for small sites versus large sites. How can we possibly say the ranking signals are the same when the effort required seems so different? The answer is not that Google has different signals for large sites and small sites, but rather that large sites face different challenges than small sites.
Case in point: Botify’s “How Google Crawls the Web” study, in which we found that Google misses over half the pages on enterprise websites.
When we asked Kaspar about this, he said that “the problem isn’t that Google doesn’t have the capacity or desire to crawl, index, and rank content. It is rather that content frequently is not crawled due to poor or no crawl budget management, poor hosting services, conflicting technical SEO signals, or content quality issues. Or because of all these issues at once.”
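The coverage gap Kaspar describes can be made concrete by comparing the URLs a site wants indexed (for example, from its sitemap) against the URLs search-engine bots actually requested (for example, from server logs). The function and data below are a minimal illustrative sketch, not part of any Botify or Google tooling:

```python
def crawl_coverage(sitemap_urls, crawled_urls):
    """Return the fraction of sitemap URLs that bots actually fetched,
    plus the set of pages that were never crawled."""
    wanted = set(sitemap_urls)
    missed = wanted - set(crawled_urls)
    if not wanted:
        return 0.0, missed
    ratio = (len(wanted) - len(missed)) / len(wanted)
    return ratio, missed
```

On an enterprise site, a low ratio here is often the first quantitative signal of the crawl budget, hosting, or conflicting-signal problems Kaspar lists above.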
“Large websites tend to struggle with managing their crawl budget more often than smaller sites, for all the obvious reasons. After all, there isn’t much danger of mismanaging crawl budget with a boutique website, compared to a service with tens of millions of indexable pages.”
He added that “what also does help tremendously is saving, preserving, and utilizing server logs for SEO. If done correctly, it allows for a much more in-depth analysis and very precise actions to course correct if needed.”
“In my experience, nine out of ten websites do not properly and completely store their server log files long term. This is both unfortunate and unnecessary. While legal or cost reasons are occasionally brought forward against saving logs, both are easily refuted. Compressed data can be saved cost-effectively, and doing so with data that pertains only to verified bots, not users, does not increase any legal liability. On the upside, it really is of the greatest importance for large websites to collect that data, and of course to use it for SEO auditing purposes at least once per year. No amount of data is going to be useful without being applied for improvement.”
We also asked Kaspar whether he noticed any search quality issues that were specific to certain types of industries. For example, do e-commerce websites struggle with different quality issues than publisher websites?
“Every website is different and individual. As a consultant, I firmly believe that there are but a few ready-made solutions and that it is best to approach every website with a fresh mindset.”
If, as Kaspar says, enterprise websites struggle with issues like crawl budget management, then what’s the cause? Why do so many large brands struggle with these same types of issues?
“With website size, both opportunities as much as challenges multiply.”
While each website and organization is unique, it’s often the sheer size (of both the organization and the website) that can add complexity and difficulty to the SEO process.
To combat these challenges, Kaspar recommends the following:
If you’re still struggling to navigate organizational red tape, improve content quality, and overall achieve SEO success, this may stem from low organizational SEO maturity (which Forrester Consulting, in a Botify-commissioned study, found was common). According to Kaspar, in these situations, the problem may be bigger than SEO.
“It is the company’s culture and/or structure that prevents collaboration, knowledge exchange, and growth. These things impair all kinds of progress, not just SEO.”
Kaspar has since switched to SEO consulting, where he helps guide site owners to make choices that will be in the best interests of their audience, and therefore, their standing in Google. So how different is this from his days as a Google search quality strategist?
“The work done on behalf of Google wasn’t too different from SEO consulting. When the idea was conceived to build our own brand SearchBrothers together with Fili, a trusted colleague at Google Search and friend, we had many years of successful collaboration to look back at. It has been a tremendously rewarding experience ever since and I’m profoundly grateful to be able to say that my profession is at the same time my passion.”
When speaking with Kaspar, it became obvious that to be an SEO does not require animosity toward Google. In fact, he showed us that Google is ready and willing to reward those who focus on creating exceptional experiences for their users.
This may seem overly optimistic to some, but we think a symbiotic relationship between websites and search engines is possible by pursuing a common goal (searcher satisfaction) and making it as easy as possible for Google to understand your website. We created Botify to help you do both, so don’t hesitate to reach out for a demo to learn how.
Kaspar Szymanski is a renowned SEO expert, former senior member of the famed Google Search Quality team, and among the select few former Googlers with extensive policy driving, webspam hunting, and webmaster outreach expertise. Nowadays, Kaspar applies his skillset to recover websites from Google penalties and help clients to max out the potential of their websites in search engines.