
Mitigating SEO Disasters – An Interview with Kaspar Szymanski

Kaspar Szymanski is the founder of SearchBrothers.com, and an ex-Googler. He joined Botify for our recent SEO Horror Stories webinar, and was kind enough to answer a few follow-up questions.

If your websites were attacked by hackers, could such an attack have an SEO impact? Any recommendations to prevent it?

Yes, a website compromised by hackers can suffer greatly in organic Google Search results. In rare cases, such an attack can trigger an SEO disaster. Some changes an unauthorised third party can introduce are more consequential than others. For instance, if user agent cloaking is introduced and Googlebot is served content that differs substantially from what users see, that is likely to trigger a temporary removal from Google Search. This is a rare, but truly catastrophic, scenario. More often, hackers introduce new, spammy landing pages and/or links pointing to undesirable websites, such as pharma, thin affiliate or adult content pages. If Google isn't able to isolate and mitigate the impact of such a hack, a decline in SERPs tends to be the consequence unless the issue is fixed. Any instance of a website being compromised must be taken very seriously. The best type of prevention, next to secure procedures, strong passwords and up-to-date systems, is monitoring. That includes saving and preserving raw server logs in perpetuity.
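One simple way to monitor for the user agent cloaking scenario described above is to periodically compare what a page serves to a Googlebot-style user agent with what it serves to a regular browser. The sketch below is purely illustrative: the URL is a placeholder, the similarity threshold is arbitrary, and a hacked site may also cloak on IP address rather than the user agent string.

    # Minimal cloaking check: fetch the same page with a Googlebot-style user
    # agent and a browser user agent, then compare the two responses.
    import difflib
    import requests

    URL = "https://www.example.com/"  # placeholder: page to monitor

    GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
    BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

    bot_html = requests.get(URL, headers={"User-Agent": GOOGLEBOT_UA}, timeout=10).text
    user_html = requests.get(URL, headers={"User-Agent": BROWSER_UA}, timeout=10).text

    similarity = difflib.SequenceMatcher(None, bot_html, user_html).ratio()
    print(f"Content similarity: {similarity:.2%}")
    if similarity < 0.90:  # arbitrary threshold; tune for your own templates
        print("Warning: responses differ substantially - possible cloaking, investigate.")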

A negative SEO attack is frequently only possible when compounding existing, legacy backlink issues that were left neglected for far too long. It is possible for a negative SEO attack to trigger an SEO disaster, including a catastrophic drop in Google Search visibility. It is, however, a rare scenario, and it can be prevented by utilizing Google's Disavow Tool, which is hugely important when dealing with undesirable backlinks. There are best practices that need to be observed when disavowing backlinks, described in detail in the Ultimate Disavow Guide.
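For reference, the file uploaded to Google's Disavow Tool is a plain UTF-8 text file with one entry per line; the domains and URL below are placeholders, not a recommendation of what to disavow.

    # Lines starting with "#" are comments and are ignored.
    # Disavow every backlink from an entire domain:
    domain:spammy-directory.example
    # Disavow a single linking URL:
    https://another-site.example/paid-links/page.html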

Are there any cool or simple automated activities that you could share with us? Any that have prevented common SEO mistakes or just made your lives easier?

Three main ones:

  1. Keep an eye on Google Search Console messages.
  2. Save, preserve and analyze web server logs (see the sketch after this list).
  3. Do not rely on Google organic traffic exclusively for converting traffic. That's not an automated activity, but following this advice does make a website operator's life less stressful.
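As a starting point for the second item, even a small script over raw access logs can show how Googlebot spends its crawl on a site. The sketch below assumes a combined-format access log at a placeholder path and simply counts Googlebot requests per status code; a production setup would also verify the bot via reverse DNS and track trends over time.

    # Minimal server log sketch: count Googlebot requests per HTTP status code.
    import re
    from collections import Counter

    # Matches the request, status and user agent fields of a combined log line.
    LINE = re.compile(r'"\S+ (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+ "[^"]*" "(?P<ua>[^"]*)"')

    status_counts = Counter()
    with open("access.log", encoding="utf-8", errors="replace") as log:  # placeholder path
        for line in log:
            match = LINE.search(line)
            if match and "Googlebot" in match.group("ua"):
                status_counts[match.group("status")] += 1

    for status, count in status_counts.most_common():
        print(f"{status}: {count} Googlebot requests")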

What about a migration of multiple websites (let's say… 5) to a single one that groups all of their content? Any special flags?

Any migration is a complex operation; migrating a multitude of websites to a single new destination even more so. All of the sites need to be audited and checked for on- and off-page legacy issues in order to prevent passing them on to the new website. Depending on the volume of landing pages and backlinks, the process is likely to take several weeks to complete, which is why a migration must be planned well in advance, with adequate time for preparation, transition, monitoring and aftercare factored in. On the upside, a migration is an excellent opportunity to drop ballast and increase visibility with a new, unburdened website.
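During the transition and monitoring phases of such a consolidation, one recurring chore is verifying that legacy URLs reach their new homes in a single 301 hop. The sketch below is a simplified illustration; the URL mapping is entirely hypothetical and would normally be generated from the migration plan.

    # Minimal migration sketch: check that old URLs 301-redirect, in one hop,
    # to the expected pages on the consolidated website.
    import requests

    REDIRECT_MAP = {  # old URL -> expected new URL (placeholders)
        "https://old-shop.example/product-1": "https://www.new-site.example/products/product-1",
        "https://old-blog.example/some-post": "https://www.new-site.example/blog/some-post",
    }

    for old_url, expected in REDIRECT_MAP.items():
        response = requests.get(old_url, allow_redirects=True, timeout=10)
        hops = [r.status_code for r in response.history]
        ok = hops == [301] and response.url == expected
        print(f"{'OK  ' if ok else 'FAIL'} {old_url} -> {response.url} (hops: {hops or 'none'})")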

We usually think of Horror Stories as reactions to big events – site migrations, algorithm updates, one large technical issue, etc. But what are some common issues that are just subtle enough to keep putting off, but become a horror story after a lengthy period of time?

Ongoing, PageRank-passing link building is an activity that over time has the potential to trigger a chain of events that ultimately leads to a website's disappearance from Google Search results. Often link building starts slowly, and once the presumed effects seem to be felt, it is continued and even expedited. Over time the process is sometimes streamlined and simplified, making the backlinks look even more uniform. All of which contributes to detection and identification by Google.

Another initially slow process that can truly doom a website's rankings is deteriorating on-page user signals. These can be adversely affected by declining site performance, when landing pages become progressively slower for users, reducing user satisfaction in the process. Another reason user signals decline progressively is poorly managed snippet representation and/or poor user expectation management on landing pages, when little or no compelling content is served. Soft 404 pages are a particularly poignant example and can jeopardize rankings when served to bots and users at scale.
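Soft 404s are also one of the easier slow-burn issues to check for in an automated way. The sketch below is only a rough heuristic: the URLs and error phrases are placeholders, and anything flagged this way still needs a human look before a fix is made.

    # Minimal soft-404 sketch: flag URLs that return HTTP 200 but look like
    # error or near-empty pages.
    import requests

    CANDIDATE_URLS = [
        "https://www.example.com/discontinued-product",  # placeholder URLs
        "https://www.example.com/empty-category",
    ]
    ERROR_PHRASES = ("page not found", "no longer available", "0 results")

    for url in CANDIDATE_URLS:
        response = requests.get(url, timeout=10)
        body = response.text.lower()
        looks_thin = len(body) < 2048  # crude thinness heuristic
        has_error_copy = any(phrase in body for phrase in ERROR_PHRASES)
        if response.status_code == 200 and (looks_thin or has_error_copy):
            print(f"Possible soft 404: {url}")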

One of the more interesting aspects of all this is navigating the politics after an SEO disaster. How do you help your team or your stakeholders understand the reasons, and how can this be a catalyst for better understanding the importance of SEO?

SEO disasters or near misses can function as a wake-up call and help to unlock the resources required to grow in-house SEO expertise. In fact, experience shows that marketing teams which have been through a critical situation are more amenable to not merely considering but embracing SEO best practices. In that process, previously applied processes are adapted and priorities refocused. Teams that have been shaken up thoroughly are easier to convince that, for example, continuing to build PageRank-passing links is folly today, even if it may have worked in the past. They tend to comprehend that more indexable landing pages do not necessarily mean more relevant, converting landing pages. An SEO disaster can, in the long run, be a positive event, to the extent that it has a catalyzing effect and helps to strengthen the online operation over time.

Any specific horror stories to watch out for when it comes to e-commerce sites?

There are a few truly dramatic, real-life SEO horrors described in detail in a recent article titled SEO horror stories: Here's what not to do. These are extreme cases, demonstrating how a cascade of poorly informed decisions can spell doom for even the most prosperous website. Still, colossal failures such as these are infrequent. E-commerce sites, especially large ones with a substantial product stock, must be on a constant lookout for their crawl management. It isn't the only SEO priority, yet it is one of the crucial elements contributing to a website's health. For that purpose, server logs remain a key element. All commercial sites, large and small, are well advised to also keep an eye on their backlink signals. While backlinks are important and link building for traffic is recommended, PageRank-passing link building can be a cause of rapidly declining rankings. For all the reasons above, regular, ideally annual, on- and off-page website audits are the safest way to avoid SEO disasters.
