Google Stopped Crawling (robots.txt Issue)

Status
Not open for further replies.

8cores

Active Member
Hello WJ members, a few days ago my website micromkv.com went down for about 8 hours. The data center had blocked it because I forgot to remove an abuse, but it came back online later in the day.

That day I got a message in Google Webmaster Tools: 146 failed attempts to crawl your site!

They also said that when the site comes back online, I should fetch my robots.txt with Fetch as Googlebot to let them know the site is working fine again.

Well, I fetched the robots.txt twice, and for about a week now it keeps showing:
"Google couldn't crawl your site because we were unable to access the robots.txt file."

Here is my robots.txt file: http://micromkv.com/robots.txt

All help will be appreciated! My website is nowhere to be found in Google, and before that block from the data center it was ranking fine in Google.

Screenshot by Lightshot
 
2 comments
Is this resolved now? It takes some time for Google to re-crawl a site after downtime.

FYI - your site is listed in Google as of now.
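For anyone hitting the same error: what matters is the HTTP status Google gets when it requests robots.txt. Per Google's documented behavior, a 200 means the file is parsed normally, a 404 is treated as "no robots.txt, crawl everything", but a 5xx or an unreachable server makes Googlebot assume the whole site is disallowed and stop crawling, which is exactly the error in the screenshot. A minimal sketch of that decision logic (the function name and return strings are my own, not from any Google API):

```python
def googlebot_robots_outcome(status: int) -> str:
    """Map the HTTP status of a robots.txt fetch to Googlebot's reaction.

    Based on Google's documented robots.txt handling:
      - 2xx: the file is fetched and its rules are applied
      - 4xx: treated as if no robots.txt exists, so crawling is allowed
      - 5xx / timeout: Google assumes a full disallow and stops crawling
        (the situation described in this thread)
    """
    if 200 <= status < 300:
        return "parsed: rules in the file are applied"
    if 400 <= status < 500:
        return "crawl allowed: treated as if no robots.txt exists"
    return "crawl blocked: site treated as fully disallowed"
```

So even after the site is back up, confirm that http://micromkv.com/robots.txt actually returns 200 (or at worst 404) to Googlebot's user agent, not a 5xx from the host's block page; then re-run Fetch as Google and wait for the next crawl.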
 