Did your blog suddenly drop out of the search engines? Are you losing search engine rankings for your pages? Every webmaster wants to see positive results for his hard work, but there is always a chance your site has errors you are not aware of, and these errors could be harming your search engine ranking and your site in general.
Google WebTools provides important and detailed reports about your site’s visibility on Google, most especially your site’s health and performance. It is an invaluable resource for site owners, offering all sorts of useful data and reports to help you improve your site.
Once you have verified ownership of your site with Google WebTools, often referred to as Google Webmaster Tools, you will learn a great deal about your site from the data it provides. There is so much information you can access through Google WebTools.
Google WebTools provides data such as Crawl Errors, Crawl Stats, Index Status, Search Queries, Links to Your Site, Internal Links, Sitemaps, Structured Data, Content Keywords, HTML Improvements and more. For the purpose of this article, let’s concentrate on crawl errors and how to fix them.
Many factors can lead to crawl errors on your website. Just a few days ago there was an overload on one of my blog’s hosting servers, and the next day I got an alert that Google couldn’t crawl my site. When I checked my Google WebTools account, alas! There were six (6) server errors.
Note: Different categories of errors can show up in Google WebTools, and once you know which type of error you are dealing with, you will know how to fix it. The categories include:
Site errors: errors that prevent Googlebot from accessing your site at all.
URL errors: errors Googlebot encountered while trying to crawl specific URLs.
Issues that may cause crawl errors include the following:
Your DNS server is down or misconfigured.
Your web server itself is firewalled off.
Your web server is refusing connections from Googlebot.
Your web server is overloaded or down.
Your site’s robots.txt is inaccessible.
You delete a page on your site and do not 301 redirect it.
You change the name of a page on your site and don’t 301 redirect it.
You have a typo in an internal link on your site, which links to a page that doesn’t exist.
Someone else from another site links to you but has a typo in their link.
You migrate a site to a new domain and the subfolders do not match up exactly
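Several of the URL-error causes above boil down to a page returning the wrong HTTP status code. As a rough sketch, you can reproduce the kind of check Googlebot performs by requesting a URL and reading the status code. The snippet below spins up a throwaway local server (a stand-in for your real site, so it runs anywhere) to show a healthy 200 next to a 404 that would show up as a URL error:

```python
import http.server
import threading
import urllib.request
import urllib.error

# Throwaway local server: /good returns 200, everything else 404s,
# so the status-code check below works without touching a live site.
class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/good":
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"ok")
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        pass  # silence per-request logging

server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_port}"

def check(url):
    """Return the HTTP status code a crawler would see for this URL."""
    try:
        return urllib.request.urlopen(url).status
    except urllib.error.HTTPError as e:
        return e.code

good_status = check(f"{base}/good")       # a crawlable page
missing_status = check(f"{base}/deleted") # a deleted page with no redirect
print(good_status, missing_status)  # 200 404
```

Running a check like this against the URLs listed in your crawl errors report quickly tells you which ones are genuinely broken and which have already been fixed.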
So from the list above you can see that crawl errors are not always your fault: you can be doing everything right and your site can still have crawl errors caused by your host. The issue, then, is not just how to prevent them but how to fix them when they happen.
How to Fix Crawl Errors in Google WebTools:
Site errors don’t occur often. From my experience, most crawl errors are caused by redirect errors, so I recommend you set up 301 redirects, using a WordPress redirect plugin, to point your 404 error pages to a valid page on your site.
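If you prefer not to use a plugin, a 301 redirect can also be added at the server level. A minimal sketch for an Apache .htaccess file, where /old-page/ and the example.com URL are placeholders for your own addresses:

```
# Permanently redirect a deleted or renamed page to its replacement
Redirect 301 /old-page/ https://www.example.com/new-page/
```

The 301 status tells Googlebot the move is permanent, so the old URL eventually drops out of the crawl errors report and its ranking signals pass to the new page.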
If you are getting crawl errors as a result of restrictions in your robots.txt file, check the file and make sure you really do want to block the URLs listed in the crawl errors report. Consult the article “How to Block or Remove Pages Using a robots.txt File”.
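For reference, a robots.txt file simply lists which paths crawlers may or may not fetch. A minimal sketch, where /private/ is a placeholder for a directory you actually intend to block:

```
User-agent: *
Disallow: /private/
```

If a URL in your crawl errors report matches a Disallow rule you never intended, removing or narrowing that rule resolves the error.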
If you are having unreachable errors or DNS errors, it is best to contact your host about the problem, as they may be better placed to handle the error for you or at least give you professional advice on what to do about it.
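Before contacting your host, you can run a quick sanity check that your domain still resolves. A minimal sketch using Python’s standard library; “localhost” is only a placeholder that resolves everywhere, so substitute your own domain when you run it:

```python
import socket

def resolves(hostname):
    """Return the IPv4 address for hostname, or None if DNS lookup fails."""
    try:
        return socket.gethostbyname(hostname)
    except socket.gaierror:
        return None

# Substitute your own domain (e.g. "yourblog.com") for the placeholder.
print(resolves("localhost"))             # an address such as 127.0.0.1
print(resolves("no-such-host.invalid"))  # None: the kind of DNS failure Googlebot hits
```

If the lookup fails for your real domain, the problem is at the DNS level rather than on your web server, which is useful information to pass along to your host.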
Most webmasters may be too busy to routinely check Google WebTools for crawl errors, so I recommend you set up alerts for your account by turning on message forwarding, so that you receive an email whenever there is an issue. That way you can stay on top of every situation and fix your site errors before they harm your site.