
HostQWIK

Active Member
Hello,
When I checked my website on WooRank SEO, it reported that my website is missing a robots.txt file.
Please advise, how do I create this file?
 
You need to create the robots.txt file and upload it to the root of your server. While creating it, decide which files you want Google to be able to see and which ones you don't; anything you don't want Google to view should be listed under Disallow.
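For example, a robots.txt along these lines would let Google crawl everything except a couple of private directories (the directory names here are only placeholders, not anything specific to your site):

Code:
# Applies to every crawler, including Googlebot
User-Agent: *
# Placeholder directories you don't want indexed
Disallow: /admin/
Disallow: /private/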
 
Pretty straightforward. From an SEO standpoint, what should you allow Google to see and not see?

Most CMSes people use nowadays will automatically create a robots.txt file. If you're using hand-coded HTML, there may not be a reason to use one unless you have a certain directory that contains information you don't want indexed.
 
The robots.txt file is a crucial part of your website: it is where you set which files Google is allowed to visit.
 
Make a file called "robots.txt" (lowercase) and save it in your root directory.

Code:
User-Agent: *
Disallow: 

Sitemap: http://example.com/sitemap.xml

I don't think you want to block any crawler or disallow any folder or file.

Create a new line for each sitemap, starting with "Sitemap: ".
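For instance, if you had separate sitemaps for pages and posts (the file names below are made up for illustration), the extra lines would look like this:

Code:
Sitemap: http://example.com/sitemap-pages.xml
Sitemap: http://example.com/sitemap-posts.xml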
 

The robots exclusion protocol (REP), or robots.txt, is a text file webmasters create to instruct robots (typically search engine robots) how to crawl and index pages on their website.
Here is a link that will help you create this file:

Robots Text Generator Tool
 
Robots.txt is a crucial tool for SEO because it tells search engine bots not to crawl certain web pages. HCFGrizzly has already posted a useful tool for creating the file; highly recommended.
 