Googlebot is Google’s web crawler, often referred to as a spider. It’s a network of powerful computers that work together like one gigantic, super-speedy web browser: it visits web servers, requests thousands of pages at a time, downloads them, and delivers them to the Google indexers.
There are two Googlebots. Deepbot visits every page it can find on the web by harvesting each link it discovers and following it. This deep crawl currently takes about a month to complete.
Freshbot keeps the index fresh by revisiting frequently changing sites at much shorter intervals. How often Freshbot visits a website depends on how often the site is updated: ideally, the website of a daily newspaper should be crawled once a day, and a weekly e-zine every seven days.