SEO success is an ongoing process - one that depends on giving a site regular check-ups. Thankfully, Google makes this relatively easy to do through its Webmaster Tools.
One of the reasons Webmaster Tools is such a great resource is that it identifies the problems Googlebot encounters when trying to crawl a site. Google Webmaster Tools breaks these errors down into three primary categories:
- Server Errors
- Access Denied Errors
- Not Found Errors
Server and Access Denied Errors
Server errors appear most often in the form of a 500 Internal Server Error HTTP response code. Basically, this means there's been an unspecified problem with the site's server.
Access denied errors often take the form of a 403 Forbidden response code. When Googlebot receives this error, it generally means that the website's server or host is blocking its access.
These errors are bad for SEO because search engines can't crawl the affected pages. If an error isn't fixed after a certain period of time, Google may de-index the page entirely.
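To make the three categories concrete, here is a minimal sketch of how the HTTP status codes mentioned above map onto them. The function name and category strings are illustrative, not part of any official Webmaster Tools API:

```python
def crawl_error_category(status_code: int):
    """Map an HTTP status code to the crawl-error category
    Webmaster Tools would report it under (hypothetical labels)."""
    if status_code >= 500:
        return "Server Error"       # e.g. 500 Internal Server Error
    if status_code in (401, 403):
        return "Access Denied"      # server or host blocking Googlebot
    if status_code in (404, 410):
        return "Not Found"          # no page exists at that URL
    return None                     # 2xx/3xx responses crawl fine


print(crawl_error_category(500))  # Server Error
print(crawl_error_category(403))  # Access Denied
print(crawl_error_category(404))  # Not Found
```

The first two categories are the urgent ones: pages returning them can't be crawled at all.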
Not Found Errors
Not found errors take the form of a 404 response code. Unlike server and access denied errors, 404s don't negatively impact SEO directly, but they make for a horrible user experience, which in turn negatively affects SEO.
Visitors get a 404 error when they follow a link to a site and the page that URL points to can't be found. Instead of getting the information they were looking for they get a useless error page which is little more than an invitation to bounce from the site.
Google's recent tweaks to its algorithm mean more weight and consideration is being placed on how real people respond to a site. So ranking well depends on being proactive about errors, as well as on appropriately loading the site with keywords and links.
If Google sees that a lot of people are bouncing, that's an indication that the site doesn't provide much utility, and Google may downgrade it as a result. On top of ruining the user experience, 404 errors deprive a site of all the valuable link juice those pages have accumulated, which means the overall site will rank lower in Google.
Both the user experience and link juice problems can be fixed by implementing redirects that take visitors who follow that URL to a different but still relevant page, which should prevent them from bouncing. It's essential that the redirect be a permanent 301 redirect. 301s not only send users to a relevant page but pass the old page's link juice on to the new page, ensuring the site's SEO doesn't suffer.
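As a sketch of what serving a permanent redirect looks like in practice, the snippet below maps retired URLs to their closest live equivalents and answers with a 301 plus a Location header. The URL map, function, and handler names are hypothetical examples, not taken from any particular site:

```python
from http.server import BaseHTTPRequestHandler

# Hypothetical map of retired paths to the most relevant live pages.
PERMANENT_REDIRECTS = {
    "/old-pricing": "/pricing",
    "/2012/holiday-sale": "/promotions",
}


def redirect_target(path: str):
    """Return the new location for a retired path, or None."""
    return PERMANENT_REDIRECTS.get(path)


class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        target = redirect_target(self.path)
        if target:
            self.send_response(301)               # permanent move:
            self.send_header("Location", target)  # users land here, and
            self.end_headers()                    # link juice follows
        else:
            self.send_response(404)               # genuinely gone
            self.end_headers()


print(redirect_target("/old-pricing"))  # /pricing
```

Most sites would configure this in the web server (e.g. an Apache or nginx rewrite rule) rather than in application code, but the mechanics - a 301 status plus a Location header - are the same.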
Making an effort to track and correct errors will ensure a site's SEO remains as good as it can be, while also ensuring the site has a sound overall architecture that keeps both users and Googlebot happy.