Status
Not open for further replies.

vgkumar
Robots.txt is a text file used to give instructions to search engine crawlers about the crawling and indexing of a webpage, domain, directory, or file on a website.
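For instance, a minimal robots.txt might look like this (the paths and sitemap URL are just illustrative):

```
# Applies to all crawlers
User-agent: *
# Keep this example directory out of search results
Disallow: /private/
# Optional: point crawlers at the sitemap
Sitemap: https://www.example.com/sitemap.xml
```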
 
robots.txt tells bots how to crawl your site. With it you can block directories you don't want crawled.
Helpful for SEO.
That's all I know :|
 
The robots.txt is a simple text file on your web site that tells search engine bots how to crawl and index the site or its pages. By default, search engine bots crawl everything they can unless they are forbidden from doing so. They always check the robots.txt file before crawling the site.
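As a sketch of that check, Python's standard library ships a robots.txt parser; a polite crawler can use it to test a URL against the rules before fetching. The rules and URLs below are made-up examples:

```python
# Sketch: how a polite crawler consults robots.txt before fetching a page,
# using Python's standard urllib.robotparser. Rules and URLs are examples.
import urllib.robotparser

rules = [
    "User-agent: *",
    "Disallow: /admin/",
]

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules)  # a real crawler would call rp.set_url(...) and rp.read()

print(rp.can_fetch("*", "https://example.com/page.html"))          # True
print(rp.can_fetch("*", "https://example.com/admin/secret.html"))  # False
```

A real bot would download the live file with `rp.set_url("https://example.com/robots.txt")` followed by `rp.read()` instead of parsing a hard-coded list.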
 
Robots.txt is a text file you put on your site to tell search robots which pages you would like them not to visit.
 
Hello,
Robots.txt is a plain text file, not HTML. You put it in the root of your site to tell search engine robots which pages you would like them not to visit.

User-agent: *
Disallow: /

The "User-agent: *" line means this section applies to all robots.
The "Disallow: /" line tells the robots that they should not visit any page on the site.
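You can confirm that this rule blocks everything with Python's standard `urllib.robotparser` (the bot names and URLs here are arbitrary examples):

```python
# Sketch: "Disallow: /" under "User-agent: *" blocks every robot
# from every page, checked with the standard-library robots.txt parser.
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /"])

print(rp.can_fetch("Googlebot", "https://example.com/"))            # False
print(rp.can_fetch("AnyBot", "https://example.com/any/page.html"))  # False
```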
 
Make sure you have one on your site; otherwise you will find "file not found" errors in your server log. Create a regular .txt file and put this in it:

User-agent: *
Crawl-delay: 10

This says: dear bots, please read my site at a rate of one page per 10 seconds. That helps you avoid server overload. Keep in mind that Crawl-delay is a nonstandard directive: some crawlers (e.g. Bing) honor it, but Google ignores it.
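For crawlers that do honor it, the delay can be read back with `urllib.robotparser` (Python 3.6+):

```python
# Sketch: reading the Crawl-delay value for a user agent with
# the standard-library robots.txt parser (Python 3.6+).
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.parse(["User-agent: *", "Crawl-delay: 10"])

print(rp.crawl_delay("*"))  # 10
```

A well-behaved bot would then sleep for that many seconds between requests.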
 