Written By: Adam Dince
A few days ago, one of the sites I’m responsible for experienced a significant drop in revenue from organic search. Of course, my first thought was, “Oh no, a Google penalty!” Turns out the culprit wasn’t Panda or Penguin — no, wait, em-dashes are banned. Turns out the culprit wasn’t Panda or Penguin; it was an SEO bot.
Here’s a high-level overview of the process used to find the issue:
- Reviewed Keyword Rankings and All KPIs: Once we verified that our keyword rankings and organic traffic were stable, we could rule out a penalty
- Checked Googlebot Status: We peeked into Google Webmaster Tools and didn’t find any crawl or indexation errors
- Checked for Recent Site Updates: Our site management team confirmed there had been no recent updates or releases
We then reached out to IT and took a deep dive into our Web server logs to look for any unusual activity. We found that one specific SEO robot, from an SEO SaaS platform we use, was hammering our pages.
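If you want to do this kind of log review yourself, a quick script that tallies requests per user-agent is usually enough to make an aggressive crawler stand out. Here’s a minimal sketch, assuming your server writes the common “combined” access-log format; the bot name `ExampleSEOBot` in the sample lines is purely hypothetical:

```python
import re
from collections import Counter

# Matches the combined access-log format:
# IP - - [timestamp] "request" status bytes "referer" "user-agent"
LOG_LINE = re.compile(
    r'^(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) \S+ '
    r'"(?P<referer>[^"]*)" "(?P<agent>[^"]*)"$'
)

def count_user_agents(lines):
    """Return a Counter of request counts keyed by user-agent string."""
    counts = Counter()
    for line in lines:
        m = LOG_LINE.match(line.strip())
        if m:  # silently skip lines that don't match the expected format
            counts[m.group("agent")] += 1
    return counts

if __name__ == "__main__":
    # In practice you'd read your real access log, e.g. open("access.log").
    sample = [
        '1.2.3.4 - - [10/Oct/2013:13:55:36 -0700] "GET /facet?color=red HTTP/1.1" 200 512 "-" "ExampleSEOBot/2.1"',
        '1.2.3.4 - - [10/Oct/2013:13:55:37 -0700] "GET /facet?size=xl HTTP/1.1" 200 512 "-" "ExampleSEOBot/2.1"',
        '5.6.7.8 - - [10/Oct/2013:13:55:38 -0700] "GET / HTTP/1.1" 200 1024 "-" "Mozilla/5.0"',
    ]
    for agent, n in count_user_agents(sample).most_common():
        print(f"{n:6d}  {agent}")
```

Sorting by count puts the heaviest crawlers at the top; a single user-agent dominating requests to slow, query-heavy URLs (like our faceted pages) is exactly the pattern we saw.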
Some of our websites use a fairly antiquated faceted-navigation platform that runs complex queries whenever a facet is selected. Because the SEO bot was crawling the site so aggressively, the sheer volume of faceted queries caused many of our important pages to load slowly and, in some cases, not load at all. I quickly disabled the problematic SEO robot, and shortly thereafter revenue returned to normal.
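In our case the fix was disabling the crawl from the SEO platform’s own settings, but if the tool doesn’t offer that, one common fallback is robots.txt, assuming the crawler is well-behaved and announces itself with a distinctive user-agent token. A sketch, using the hypothetical token `ExampleSEOBot`:

```
# robots.txt — block one specific crawler from the expensive faceted URLs
User-agent: ExampleSEOBot
Disallow: /facet

# Everyone else remains unrestricted
User-agent: *
Disallow:
```

Note that robots.txt is advisory: a misbehaving bot can ignore it, in which case blocking by user-agent or IP at the Web server or firewall is the more forceful option.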
If you’re experiencing abnormal results from the organic search channel, make sure you’re looking at all possible causes, especially your Web server logs. They can expose issues that don’t show up anywhere else.