Are there any useful ways to make Googlebot crawl more pages on your site?

Status
Not open for further replies.

Mvajdava

Member
Are there any useful ways to make Googlebot crawl more pages on your site? The difficulty I am facing is that my site has few indexed pages, and Googlebot crawls fewer than 50 pages every day. Any useful tips? Thanks in advance.
 
8 comments
Create a sitemap.xml of your entire website and submit it to Google Webmaster Tools.

Make sure your robots.txt is well written and that there are no restrictions on Googlebot.

If you have the option of tagging your posts/pages, do so, and reuse tags already used on other posts/pages.

Make sure you don't have noindex or nofollow in the meta tags of the posts/pages you want Googlebot to crawl.

I hope this helps.
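To illustrate the sitemap advice above, here is a minimal sketch in Python's standard library that generates a basic sitemap.xml. The page URLs are hypothetical placeholders; the real file just needs one <loc> entry per page you want crawled:

```python
import xml.etree.ElementTree as ET

# Hypothetical page URLs -- replace with your site's real pages.
pages = [
    "https://example.com/",
    "https://example.com/about",
    "https://example.com/blog/first-post",
]

# Build the <urlset> root with the namespace the sitemap protocol expects.
urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for url in pages:
    entry = ET.SubElement(urlset, "url")
    ET.SubElement(entry, "loc").text = url

# Serialize to sitemap.xml; this is the file you submit in Webmaster Tools.
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```

Once the file is on your server (usually at the site root), submit its URL in Google Webmaster Tools.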
 
If your server uptime is close to 100% and it responds quickly, you can set a custom crawl rate in Webmaster Tools.
And if you think you can use some trick that catches and lures spiders in... LOL!

Thanks,
Kc
 
An XML sitemap always helps. Great content with your choice of keywords always helps. The keywords should be in all the right places, and having a good number of high-PR links pointing to your site will also help your cause.
 
Content is king: add 500+ word blog posts optimized for your keywords. Add these regularly, three or more times a week. Make sure your pages, pictures and videos are all optimized for your keywords as well.
 
I like this advice.
It's boring work, but as Bizzoyce said, it works.
 
Googlebot is Google's web crawling bot (sometimes also called a "spider"). Crawling is the process by which Googlebot discovers new and updated pages to be added to the Google index.
We use a huge set of computers to fetch (or "crawl") billions of pages on the web. Googlebot uses an algorithmic process: computer programs determine which sites to crawl, how often, and how many pages to fetch from each site.
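Since Googlebot obeys robots.txt when deciding what to fetch, you can sanity-check that it is not blocked using Python's standard urllib.robotparser. The rules and URLs below are hypothetical; substitute your own robots.txt contents:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules -- substitute your own file's contents.
robots_lines = [
    "User-agent: *",
    "Disallow: /private/",
]

rp = RobotFileParser()
rp.parse(robots_lines)

# Pages you want indexed should come back as allowed for Googlebot.
for url in ["https://example.com/blog/post", "https://example.com/private/draft"]:
    verdict = "allowed" if rp.can_fetch("Googlebot", url) else "blocked"
    print(url, "->", verdict)
```

If a page you want indexed comes back as blocked, fix the matching Disallow rule before worrying about crawl rate.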
 