Like all SEOs, Botify team members have always been huge fans of Google Search Console (GSC).
It’s no secret that Botify and Google Search Console share common DNA: both platforms are built around crawlability, indexability, and actual user queries. That’s why we were so excited to see the significant updates GSC rolled out in January of this year.
This blog post presents use cases showing how GSC and Botify can work together as complementary partners and serve as the perfect tool kit for enterprise SEOs. Skip ahead to specific sections if you want a targeted read:
Understand Exactly How People Search for, and Find, Your Site by Using GSC and Botify
The most significant change in the new Google Search Console is found in its keywords section. GSC now pulls keyword data over a period of 16 months, as opposed to the 90-day window of the older version.
This gives you a fuller picture of your site’s keyword trends over time, and allows you to make year-over-year comparisons.
Google Search Console’s approach to keywords is unmatched. The platform relies on actual user queries, culled from Google’s own proprietary search data, and not from scraped keywords that can be biased by geotargeting or personalization.
The latest version of GSC can pull up to 1,000 user queries per site. However, the Search Console API, used to feed Botify Keywords, enables developers to extract many more user queries.
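To illustrate how the API makes larger extractions possible, here is a minimal sketch of paginating through Search Analytics rows. The pagination helper is our own illustration; the commented-out wiring to the real Search Console API (site URL, dates, and credentials are placeholders) shows roughly where it would plug in, assuming the `google-api-python-client` library.

```python
def fetch_all_queries(fetch_page, page_size=25_000):
    """Page through Search Analytics rows until an empty page comes back.

    `fetch_page(start_row, row_limit)` is any callable returning a list of
    row dicts -- in practice a thin wrapper around the Search Console API's
    searchanalytics.query method (hypothetical wiring sketched below).
    """
    rows, start = [], 0
    while True:
        page = fetch_page(start, page_size)
        if not page:
            break
        rows.extend(page)
        start += len(page)
    return rows

# Hypothetical wrapper around the real API (not executed here):
# service = googleapiclient.discovery.build("webmasters", "v3", credentials=creds)
# def gsc_page(start_row, row_limit):
#     body = {"startDate": "2018-01-01", "endDate": "2018-04-30",
#             "dimensions": ["query"],
#             "startRow": start_row, "rowLimit": row_limit}
#     resp = service.searchanalytics().query(siteUrl="https://example.com/",
#                                            body=body).execute()
#     return resp.get("rows", [])

# Stubbed demonstration: 2,300 fake query rows fetched in pages of 1,000.
fake_rows = [{"keys": ["query %d" % i], "clicks": 1} for i in range(2_300)]
demo = fetch_all_queries(lambda s, n: fake_rows[s:s + n], page_size=1_000)
```

The loop stops on the first empty page, so it works regardless of how many rows the property actually has.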
Botify Keywords is the only solution that bridges real keywords with technical SEO by:
- Pulling up to millions of actual user queries per account and tying them back to 500+ content, ranking, and technical SEO KPIs, at both the URL and segment level
- Mapping real search intent, and detecting changes in search behavior, on an enterprise scale
- Pairing rank tracking with structured data, content quality, AMP, and device-specific insights, across all keywords
- Capturing and charting mobile-first metrics, including keyword contextualization by device
For the first time, Botify Keywords offers enterprise SEOs access to real rankings and control over what impacts them.
Search Console Coupled with Botify is the Best Combination to Optimize Your Crawl Budget
Combining insights from Botify and GSC enables SEOs to optimize every step a page takes before it enters the index and becomes capable of generating organic traffic.
The new version of GSC has a “Discovered - currently not crawled” report that lists the URLs in your crawl queue. Your crawl queue represents the URLs that Google is aware of, but hasn’t yet crawled.
If you have a large number of URLs in your crawl queue, your site probably has crawl budget issues. That is, Google knows about other pages on your site, but has exhausted its resources before crawling them.
With Botify, you can precisely optimize your crawl budget and compare the volume of pages Google actually crawls to the uncrawled pages sitting in the crawl queue.
Botify also allows you to identify the URLs that are not in the crawl queue at all: pages that Google has not discovered. You can isolate these pages in Botify and easily identify the segments with the biggest crawl ratio issues and the highest number of unknown pages.
Botify can show you the precise crawling factors, such as page depth, low-quality segments, thin content, and slow load times, that contribute to Google not knowing about or not crawling these pages.
Furthermore, Botify and GSC can help you derive value from the pages that are crawled, but remain unindexed. “Crawled - currently not indexed” is a new report in the latest version of Google Search Console.
GSC allows you to download a sample size of up to 1,000 URLs from this report, which is a great start. Larger websites, with tens of thousands or millions of pages, can use Botify Log Analyzer to uncover the full volume of pages on the site that are crawled but likely aren’t indexed (because of technical factors or poor content).
Compare GSC data alongside Botify’s in-depth analysis to understand why pages are getting crawled but might have indexation issues:
- Are the pages not compliant?
- Do the pages have poor content quality?
- Do they suffer from too little content or content duplication?
- Are there certain segments of the site that are often crawled but not indexed?
In Botify, you can isolate the pages that are crawled but not indexable, and break the data down into segments that represent the sections of your site. You can also export the full data for use outside of the Botify platform.
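As a rough sketch of that cross-referencing step, the snippet below intersects a GSC “Crawled - currently not indexed” sample with a list of URLs a crawl flagged as non-indexable, then groups the overlap by segment. The segmentation rule (first path directory) and all URLs are illustrative; real Botify segments are defined by your own rules.

```python
from collections import Counter
from urllib.parse import urlparse

def segment_of(url):
    """Naive segmenter: the first path directory stands in for a site
    section. Purely illustrative -- not Botify's segmentation engine."""
    first = urlparse(url).path.strip("/").split("/")[0]
    return first if first else "(root)"

def crawled_not_indexable_by_segment(gsc_crawled_not_indexed, non_indexable):
    """Count URLs that appear both in GSC's 'Crawled - currently not
    indexed' sample and in a crawl's non-indexable list, per segment."""
    flagged = set(gsc_crawled_not_indexed) & set(non_indexable)
    return Counter(segment_of(u) for u in flagged)

# Hypothetical exports:
gsc_sample = ["https://example.com/blog/a",
              "https://example.com/shop/x",
              "https://example.com/shop/y"]
crawl_non_indexable = ["https://example.com/shop/x",
                       "https://example.com/shop/y"]
counts = crawled_not_indexable_by_segment(gsc_sample, crawl_non_indexable)
```

A segment that dominates the resulting counts is a natural first candidate for the content-quality and duplication checks listed above.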
Optimize the URLs Almost Over the Finish Line: Botify & GSC Turn Indexable Pages into Indexed Pages
But what about pages that are indexable, but still aren’t being indexed? It’s a tricky group of pages to isolate, but with GSC and Botify, you can do just that. With Botify, you can easily calculate the number of pages that are likely to enter Google’s index (i.e. compliant pages). GSC has a handy new chart that counts the number of pages that have been indexed (i.e. “valid URLs”).
By comparing the number of compliant URLs to the number of valid URLs, you can gain high level insights into how your pages are entering, or not entering, Google’s index.
If your compliant URLs and valid URLs are approximately the same, you’re in a good spot. That means most of your indexable URLs are getting indexed. However, if you have more valid URLs than compliant URLs, that may mean you have a significant number of orphan URLs.
If, conversely, your site has more compliant pages than valid URLs, it means not all of your indexable pages are getting indexed. This probably indicates that you have crawl budget issues. We’ve created several pieces of content to assist in crawl budget optimization so that you can get valuable content in the index and ready to serve to searchers:
- Google Crawl Budget Optimization Webinar
- Quick Wins: Increasing Google Crawl by over 300%
- Expand on Google Search Console Data with Botify
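The comparison described above can be boiled down to a few lines. This is an illustrative heuristic only, with a made-up 5% tolerance threshold, not an official Botify or Google rule:

```python
def diagnose_index_coverage(compliant, valid, tolerance=0.05):
    """Compare Botify's compliant (indexable) URL count with GSC's valid
    (indexed) URL count and name the likely situation.

    The tolerance band is an illustrative assumption: counts within 5% of
    each other are treated as effectively equal.
    """
    if abs(compliant - valid) <= tolerance * max(compliant, valid, 1):
        return "healthy: most indexable URLs appear to be indexed"
    if valid > compliant:
        return "possible orphan URLs: Google indexes pages your crawl did not find"
    return "possible crawl budget issue: not all indexable pages are indexed"

# Example: far more compliant pages than valid URLs.
status = diagnose_index_coverage(compliant=120_000, valid=85_000)
```

Either imbalance points you back to the relevant workflow: orphan-page analysis when valid exceeds compliant, crawl budget optimization when compliant exceeds valid.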
Optimize Your AMP Strategy with the Combined Insights of GSC and Botify
GSC’s newest AMP report helps you fix errors that prevent your AMP pages from appearing in Google Search features that require AMP, such as the news carousel. The new report includes a sample list of pages affected by those issues (up to 1,000 URLs). GSC also explains how to fix the errors and provides a process to notify Google about your fixes for faster validation.
GSC offers an impressive level of AMP error detection. For large sites, Botify can provide the additional visibility needed: discover and crawl AMP pages to root out errors across an entire site, analyzing up to 25 million pages per crawl.
Effortlessly compare visits, impressions, real rankings, and other SEO indicators between AMP and canonical pages to maximize AMP performance. Use AMP Parity in Botify to measure content similarity between your AMP pages and their corresponding canonical pages, side-by-side.
Also follow Googlebot’s crawl specifically on your AMP pages to discover what might be keeping your pages out of search.
In the mobile-first world, it’s essential that all sites have the tools they need to optimize their mobile strategies. Botify and GSC both offer key tools to optimize AMP pages. Cross-reference your findings in Botify with your data from GSC for the clearest picture of AMP performance.
Botify & GSC: An Unbeatable Combination for Enterprise Sites
We’ve always loved Google Search Console, and we’re impressed with the new features they recently released. Although our platforms offer two separate solutions, Botify and GSC tend to overlap when it comes to core SEO principles. Some of our customers have even called Botify the “enterprise Google Search Console”.
While we’re flattered by the comparison, we know our respective platforms can’t serve as substitutes for each other. Our platforms do share common DNA, but each solution offers analyses that boost, rather than replace, the insights of the other tool.
That’s why Botify and GSC complement each other so well. In fact, as we began highlighting in this blog, Botify and GSC are perhaps the only tools an enterprise SEO needs to gain full visibility into the entire organic search process.