By Frank Vitovitch, VP, Solutions, Botify
Have you ever fallen victim to a website hack?
They happen more often than we’d like to think. Whether due to a plugin vulnerability, social engineering, or a brute-force attack, hacks happen, and when they do, they can hurt your website’s SEO performance.
Website hacks often cause ranking and organic traffic loss because site owners typically aren’t notified until the damage has been done.
For example, the only sign most website owners get that their website has been hacked is the security warning message Google displays next to their pages in search results, or a hacked-site notice in their Google Search Console report.
But by the time a website owner receives this message from Google, organic performance may have already taken a hit. In some cases, website owners have to go through a manual review process to get their website’s pages restored by Google. This can take days or even weeks, and every day your website is compromised is money left on the table.
Thankfully, there’s an alternative.
Instead of being reactive about a website hack, be proactive.
Botify has a feature called Log Analyzer that gives website owners a window into how Google sees their site, which is often quite different from how we see it ourselves.
Your server keeps track of every single request for a page on your website, and this history of requests is called a log file. Whenever Google attempts to visit a page on your website, your logs will show a request from Googlebot (along with a whole bunch of other information).
On their own, server log files can be difficult to wade through, let alone pull any meaningful insights from, but the Botify Log Analyzer automates this process, organizing your log data into actionable reports with tons of use cases.
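To make that concrete, here is a minimal sketch of the kind of parsing such a tool automates. It assumes logs in the common Apache "combined" format; the regex, function name, and user-agent check are illustrative, not Botify's actual implementation (note that serious tooling also verifies Googlebot by reverse DNS, since the user-agent string can be spoofed):

```python
import re
from collections import Counter

# Matches one Apache combined-log-format line (an assumption about
# your server's log format -- adjust for nginx or custom formats).
LOG_LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_status_counts(lines):
    """Count HTTP status codes for requests whose user-agent claims Googlebot."""
    counts = Counter()
    for line in lines:
        m = LOG_LINE.match(line)
        if m and "Googlebot" in m.group("agent"):
            counts[m.group("status")] += 1
    return counts
```

A sudden growth in the `404` bucket relative to `200` is exactly the kind of signal discussed below.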
So how can seeing Google’s activity on your website help you detect if someone is hacking your website?
Let’s look at a real-life example.
This website’s log reports showed a slight uptick in Googlebot hits to error pages that only continued to grow. Eventually, it reached the point that Googlebot was crawling 50,000 URLs on the website each day, even though this website only contained 14,000 URLs.
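The red flag here is purely arithmetic: Googlebot requesting far more URLs per day than the site contains. A hedged sketch of that check, using the article's figures (the daily totals below are illustrative, not the real site's data):

```python
# The site in the example contained roughly 14,000 URLs.
SITE_URL_COUNT = 14_000

def crawl_anomalies(daily_googlebot_hits, site_url_count=SITE_URL_COUNT):
    """Return the days on which Googlebot hit more URLs than the site contains."""
    return {day: hits for day, hits in daily_googlebot_hits.items()
            if hits > site_url_count}
```

Any day this returns is a day Googlebot was crawling URLs that shouldn't exist, such as the spammer-generated error pages described next.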
This was a key sign that something was wrong.
Using Botify to explore the issue further, the website owner was able to get a closer look at the error pages. Sure enough, a spammer was attempting to hijack the site search results page in an apparent attempt to create unlinked citations, creating broken pages in the process.
Identifying the root cause turned out to be quite simple and serves as a reminder of a general SEO best practice that often gets overlooked: blocking your search results pages with robots.txt. Doing so stopped the crawling of thousands of 404 pages overnight.
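For illustration, a robots.txt rule of that kind might look like the following. The `/search` path is a hypothetical example; adjust it to wherever your own site's search results live:

```
# Block all crawlers from internal search results pages
# (/search is an assumed path -- match your own URL structure).
User-agent: *
Disallow: /search
```

One caveat worth knowing: robots.txt stops future crawling, but it does not by itself remove URLs that are already indexed, which is why cleanup can still involve a deindexing step.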
Had the website owner not spotted this so quickly, the outcome could have been worse: scenarios like this have been known to generate spam warnings in Google Search Console, which would have led to a long, slow process of getting all of those URLs deindexed and then eventually blocked via robots.txt.
Because Botify Log Analyzer allowed the website owner to see the bad URLs as soon as Googlebot started visiting them, the hack was caught and corrected before it could negatively impact the website’s SEO performance.
Because many website owners don’t have a way to easily view their log files, they aren’t alerted to a website hack until they receive a “hacked site” notice from Google, but by this point, their website is likely already suffering in search results.
The website owner would have already begun to see thousands of spammy pages in the index for their site – pages they did not create. According to Google, “Low-quality content on some parts of a website can impact the whole site’s rankings.” So these pages could have begun to harm the performance of even the good pages on the website.
Because Google has finite resources, it can only crawl so many pages on a website per day. Allowing Google to visit these spammy pages means that Google could start ignoring their important pages.
The “hacked site” message next to this website’s pages in search results is also a huge deterrent to searchers who might have otherwise clicked to visit the website. This would have resulted in a sharp decline in organic traffic.
It’s a good idea to take preventative measures to avoid being hacked, such as keeping your CMS and plugins up to date, using strong passwords and two-factor authentication, and maintaining regular backups.
But sometimes, even when we take website security measures, hacks still happen. When they do, those with robust website monitoring software will be able to detect the hack early and take corrective action before it harms the website’s performance in search results.
When most people hear “SEO software,” they think of tools that help with things like keyword research, backlink analysis, and rank tracking. While those are certainly important components of SEO, they don’t give you the full picture.
Botify acts as an interface between search engines and your website, giving you a window into what’s going on “under the hood.” This transparency makes it easy to catch errors before they become issues, like the website owner in this example was able to do.
If you’d like to learn more about how Botify can help with the early detection of hacked sites, or the hundreds of other things Botify can do for your website, book your demo today.