Impact of Site Speed on Number of Indexed Pages on Google

It’s rare that we get clear evidence of how Google works.

Here is a theory:
We know that high page load time is bad for any website, especially large eCommerce sites. Higher page load time typically:
  • Increases bounce rate
  • Reduces conversion rate
  • Impacts revenue
  • Lowers SERP rankings
  • All of the above
  • and _______ ?!
In addition to all of the above, this blog post is about the impact of site speed on the total number of pages indexed on Google. In other words, I’m making an argument that if your page load time spikes significantly above your normal range and GoogleBot happens to be crawling your website at the same time, then the number of pages indexed in Google may begin to decrease. So site speed matters for the number of indexed pages in Google, on top of all the eCommerce KPIs.

Recently, on one of our websites, we observed a significant drop (over 15,000 pages) in the number of indexed pages. Within a week, the number of indexed pages dropped by roughly 10.7% (15,026 pages), from 140,487 to 125,461.

Date Total indexed
8/24/2014 140,487
8/31/2014 125,461 (why?)
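As a quick sanity check on the table above, the drop works out to roughly 10.7%:

```python
# Verify the drop in indexed pages reported in the table above.
before, after = 140_487, 125_461
drop = before - after
pct = drop / before * 100
print(f"Dropped {drop:,} pages ({pct:.1f}%)")  # Dropped 15,026 pages (10.7%)
```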

Why such a huge drop in the number of indexed pages on Google?
Obviously, a drop of 15K indexed pages in one week prompted an SEO investigation, and an interesting correlation soon surfaced.
On Saturday, Aug 31st, our page load time was 60% above the site average (13.40 sec vs. 8.34 sec). Server logs confirmed traces of GoogleBot on the site during the same period. Weekend site maintenance seemed like a natural cause for the trend that became apparent.
See the image below, which shows page-load-time spikes well above the average.
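Checking server logs for GoogleBot can be scripted. A minimal sketch, assuming a standard combined-log format — the sample lines below are made up for illustration, not taken from our actual logs:

```python
# Sketch: scan access-log lines for hits whose user-agent mentions Googlebot.
# The log lines and IPs below are illustrative, not real log data.
import re

GOOGLEBOT = re.compile(r"Googlebot", re.IGNORECASE)

def googlebot_hits(log_lines):
    """Return the log lines whose user-agent field mentions Googlebot."""
    return [line for line in log_lines if GOOGLEBOT.search(line)]

sample = [
    '66.249.66.1 - - [31/Aug/2014:14:02:11 +0000] "GET /product/123 HTTP/1.1" 200 5120 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '10.0.0.5 - - [31/Aug/2014:14:02:15 +0000] "GET /cart HTTP/1.1" 200 2048 "-" '
    '"Mozilla/5.0 (Windows NT 6.1)"',
]
print(len(googlebot_hits(sample)))  # 1
```

Note that anyone can spoof the Googlebot user agent; confirming a hit really came from Google requires a reverse-DNS check against googlebot.com, which this sketch skips.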
This brings up the following actionable question on our part:
Q. What happens on Saturdays that pushes page load time so far above average? Can we improve it?
Aug 27 Sat – 14.42 sec
Aug 31 Sat – 13.40 sec
Sep 6 Sat – 12.78 sec
Avg – 8.34 sec (last 30 days average)
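Spotting such spikes can be automated. A minimal sketch using the numbers above; the 50%-above-average threshold is an arbitrary choice for illustration:

```python
# Flag days whose page-load time is well above the trailing average.
# AVERAGE and the sample times come from the post; THRESHOLD is arbitrary.
AVERAGE = 8.34    # 30-day average page load time, seconds
THRESHOLD = 1.5   # flag anything 50%+ above average

samples = {"Aug 27": 14.42, "Aug 31": 13.40, "Sep 6": 12.78}

spikes = {day: t for day, t in samples.items() if t > AVERAGE * THRESHOLD}
for day, t in spikes.items():
    print(f"{day}: {t:.2f}s is {(t / AVERAGE - 1) * 100:.0f}% above average")
```

All three Saturdays clear the threshold, which matches the 60%-above-average figure cited for Aug 31.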

Previously, by analyzing Page Load Time against Conversion Rate and Revenue, we had clearly seen the impact of slow loads: once page load time passes a certain threshold, there is a clear inverse relationship between page load time and both conversion rate and revenue. This new data suggested a theory: if GoogleBot is crawling your website while pages are taking far too long to load, then your site's normal crawl budget is used up quickly — and worse, Google may punish you by de-indexing some of those pages if such spikes are too frequent.
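To make the crawl-budget intuition concrete: suppose GoogleBot allots a roughly fixed amount of time to a site per visit. The one-hour budget below is purely hypothetical, not a documented Google number — the point is only that slower pages translate directly into fewer pages crawled per visit:

```python
# Illustrative crawl-budget arithmetic: a fixed time budget divided by
# per-page load time. BUDGET_SECONDS is a made-up figure for illustration.
BUDGET_SECONDS = 3600  # hypothetical time Googlebot spends per visit

for label, load_time in [("normal (8.34s)", 8.34), ("spike (13.40s)", 13.40)]:
    pages = int(BUDGET_SECONDS / load_time)
    print(f"{label}: ~{pages} pages crawled per visit")
```

Under these assumptions the spike cuts the pages crawled per visit from roughly 431 down to roughly 268 — a large enough difference to plausibly show up in index counts over repeated visits.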

Summary: Improve your website’s time to first byte (TTFB) so you don’t suffer from slow-load ranking penalties.

Has anyone else made a similar observation, or was this just a fluke?


About Ujjwal Bhattarai

Ujjwal is an engineer by education, a programmer by hobby, and an internet marketer by choice. Other than 1-minute chess and biking, his passions include SEO, SEM/PPC, CRO, and Web Development. As a lifelong student of Internet development, he is hopelessly addicted to the Internet, and sincerely believes that after fire, the wheel, and the decimal point, the internet is the fourth most important invention in human history. Catch up with him on Twitter at @uj2wal or on Google+.
