Google uses sophisticated algorithms to determine how often to crawl your site. Its goal is to crawl as many pages as possible on each visit without overwhelming your server's bandwidth.
If you are experiencing bandwidth-related server load issues (i.e. too many requests too quickly), you may want to reduce how fast Google and other search engines crawl your site. Too many requests in a very short period can cause your site to load slowly and can even cause load issues on the server itself. This is especially true of very busy websites, or websites that are poorly or inefficiently coded.
If you are not experiencing any bandwidth-related load issues on your server, it is recommended that you allow Google to determine the optimal crawl rate for your website.
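Before lowering the crawl rate, it helps to confirm that crawler traffic is actually the source of the load. As a minimal sketch (assuming a standard combined-format access log; adjust the regex and the log source for your server), you can count Googlebot requests per minute and look for spikes:

```python
import re
from collections import Counter

# Matches the timestamp in a combined-format access log entry,
# capturing it down to the minute, e.g. "10/Oct/2023:13:55".
TIMESTAMP = re.compile(r'\[(\d{2}/\w{3}/\d{4}:\d{2}:\d{2})')

def googlebot_hits_per_minute(log_lines):
    """Return a Counter mapping each minute to the number of Googlebot requests."""
    hits = Counter()
    for line in log_lines:
        # Googlebot identifies itself in the User-Agent string.
        if 'Googlebot' not in line:
            continue
        match = TIMESTAMP.search(line)
        if match:
            hits[match.group(1)] += 1
    return hits
```

Feeding this the lines of your access log (for example, `googlebot_hits_per_minute(open('/var/log/apache2/access.log'))`) and inspecting the busiest minutes gives you a rough picture of whether Googlebot's request rate correlates with your load problems.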
Changing Google's Crawl Rate
Google allows you to adjust the crawl rate (how quickly Googlebot makes requests to your website during a crawl) for an entire domain or subdomain. You cannot specify different crawl rates for sections of your site (e.g. specific folders or subdirectories).
For example, you can specify a custom crawl rate for www.yoursitesdomain.com and subdomain.yoursitesdomain.com, but you cannot specify a custom crawl rate for www.yoursitesdomain.com/subfolder.
Changing the crawl rate only changes the speed of Googlebot's requests during the crawl process. It has no effect on how often Google crawls your site, nor on how deeply it crawls your URL structure.
To change Google's crawl rate:
- Log in to Google Webmaster Tools.
- Add your site to Google Webmaster Tools, if you have not done so already.
- On the Webmaster Tools Home page, click on the site you want.
- Under Site Configuration, click Settings.
- In the Crawl Rate section, select the option you want.
The new crawl rate remains valid for only 90 days, after which Google reverts to determining the crawl rate automatically.
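For search engines other than Google, a common way to throttle crawling is the `Crawl-delay` directive in your site's robots.txt file. Note that Googlebot ignores `Crawl-delay` (Google's rate is adjusted only through the setting described above), but some other crawlers, such as Bingbot, honor it. A sketch, with an example 10-second delay:

```
# robots.txt — Crawl-delay is ignored by Googlebot but honored by
# some other crawlers; the 10-second value here is only an example.
User-agent: bingbot
Crawl-delay: 10
```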