How to Fix Crawl Errors in Google Webmaster Tools?

sonth321

Hello,

Does anyone here know how to fix crawl errors in Google Webmaster Tools?

I have 15 Soft 404 errors (the URL does not exist, but the server is not returning a 404 status) and 263 Not found errors (the URL points to a non-existent page).

I really want to resolve these errors in Google Webmaster Tools for my site.

Please share some suggestions.
 
4 comments

These are all broken links. You can remove them through Google Search Console. Here is the easiest way:
1. Open the Remove URLs page.
2. Click Temporarily hide.
3. Enter the relative path of the desired image, page, or directory and click Continue. The path is relative to your Search Console property root, and should include a leading / mark.
4. Choose one of the following actions on the form:
- Temporarily hide page from search results and remove from cache: Hides the page from Google search results for about 90 days, and also clears the cached copy of the page and snippet. The page can reappear in search results after the blackout period. Google will recrawl the page during the blackout period and refresh the page cache and snippet, but will not show them until the blackout period expires.
- Remove page from cache only: Clears the cached page and snippet, but does not remove the page from search results. Google will refresh the page cache and snippet.
- Temporarily hide directory: Hides an entire directory from search results for about 90 days and also clears cached pages and snippets for all pages in the specified directory. The directory can reappear in search results after the blackout period. Google will recrawl the pages during the blackout period and refresh the page caches and snippets.
5. Click Submit Request. The request can take up to a day to process, but is not guaranteed to be accepted. Check back to see the status of the request. If your request has been denied, click Learn more to see the explanation. For example, your request might have failed because the URL you submitted didn’t meet the requirements for the type of blocking you requested, or you may need to make a different type of request in order to successfully block a specific URL.
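Keep in mind that URL removal only hides pages temporarily. For the Soft 404 errors, the lasting fix is to make the server return a real 404 or 410 status for pages that no longer exist. Here is a minimal .htaccess sketch for Apache (mod_rewrite), using hypothetical paths; replace them with the URLs from your own report:

RewriteEngine On
# Return "410 Gone" for a single retired page (hypothetical path)
RewriteRule ^old-promo-page/?$ - [G,L]
# Return a plain 404 for everything under a retired directory (hypothetical path)
RewriteRule ^retired-section/ - [R=404,L]

Once the server answers with a genuine 404/410, Google drops those URLs from the Soft 404 report after recrawling.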
 
To see which pages the broken links come from, log in to the Webmaster Tools interface:
DASHBOARD => CRAWL => CRAWL ERRORS => NOT FOUND tab => click the URL in the error dialog => LINKED FROM tab.


How to deal with the faulty links:
+ If the broken link comes from your own site, edit or remove the link.
+ If the broken link comes from an external site, you can "take advantage" of it by sending the visitor to a relevant page on your site with a 301 redirect (a minimal sketch follows below).
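A minimal .htaccess sketch for such a 301 redirect on Apache (mod_alias); the paths here are hypothetical, so swap in the broken URL from your report and the closest matching live page:

# Redirect a single broken URL to its replacement (hypothetical paths)
Redirect 301 /old-article.html /new-article.html
# Redirect a whole moved directory in one rule (hypothetical paths)
RedirectMatch 301 ^/old-blog/(.*)$ /blog/$1

This way the link value from the external site is passed to the target page instead of being wasted on a 404.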
 
To fix Access denied errors, you'll need to remove whatever is blocking Googlebot's access:

  • Remove the login from pages that you want Google to crawl, whether it's an in-page or pop-up login prompt.
  • Check your robots.txt file to ensure the pages listed there are actually meant to be blocked from crawling and indexing (see the sketch below).
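For reference, a minimal robots.txt sketch; the directory names are hypothetical, so compare it against what you actually want to keep out of the crawl:

User-agent: *
# Keep private areas out of the crawl (hypothetical paths)
Disallow: /admin/
Disallow: /cart/
# Leave everything else crawlable
Allow: /

Anything you want Google to crawl and index must not be listed under Disallow here.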
 
First check what type of crawl errors they are, then try to redirect them with the help of your .htaccess file. You can also block some URLs in your robots.txt file.
 