Top 10 Things You May Not Have Checked That Negatively Impact SEO

SEO is a pretty complicated subject, with enough moving parts to stump the most seasoned veterans. There are plenty of obscure problems that can wreak SEO havoc, even on sites that have mastered the fundamentals.

Making your site’s SEO as good as possible involves paying attention to the details. On that note, here’s a list of ten problems you may have overlooked that could negatively impact your SEO.

1. An Accidentally Indexed Development Site

A development site that accidentally gets indexed by the search engines can damage your SEO, because the development site is usually nearly identical to the actual site. This means the actual site could get downgraded for duplicate content.

Plus, the development site could show up in the search engine results pages (SERPs) and divert visitors from the real site. If the information on the development site is incorrect or out of date, this could result in a lot of unhappy customers and damage to the brand.

Generally, it’s best to disallow search engines from crawling the development site altogether with a robots.txt file. If it’s too late, you can go into Google Webmaster Tools and request that the site be removed from its index under the “Remove URL” tab.
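
For example, here is a minimal robots.txt that blocks all well-behaved crawlers from an entire development site. This sketch assumes the development site lives on its own subdomain (dev.example.com is a placeholder) and that the file sits at the root of that subdomain, since robots.txt rules apply per host:

    # robots.txt served at http://dev.example.com/robots.txt (placeholder subdomain)
    # Blocks every compliant crawler from every path on the development site
    User-agent: *
    Disallow: /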

2. Extensive Error Pages

Error pages come in several varieties and all of them can damage SEO.

404 not found errors do not negatively impact SEO per se – Google will not downgrade a site based on how many 404 errors its crawlers encounter. But they can impact SEO indirectly by making for a horrible user experience. 404s usually inspire visitors to bounce back to the SERPs immediately. This tells Google that the site does not include valuable content and it will be downgraded as a result.

Other errors, such as 500 internal server errors and 403 forbidden errors, do impact SEO directly. Thankfully, these errors are conveniently listed in Webmaster Tools and can usually be fixed with a simple redirect or a request to remove the URL.
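
As one hedged sketch, assuming an Apache server where .htaccess overrides and mod_alias are enabled (both assumptions, and the file names are placeholders), a permanent redirect away from a dead URL can be a single line:

    # .htaccess - send a retired URL to its replacement with a 301 (permanent) redirect
    Redirect 301 /old-page.html /new-page.html

A 301 tells search engines the move is permanent, so any ranking signals pointing at the old URL can be consolidated onto the new one.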

3. Duplicate Footers

A footer with the exact same content at the bottom of every page can be bad for SEO because it constitutes duplicate content. Google loves unique content and will downgrade a site if it has the same content on every page.

If possible, try rewriting that content so that it’s unique on every page. Or move the information in the footer to its own dedicated page. Or, if it absolutely must appear on every page, try putting that information in an image.

4. Site Speed

How quickly a website responds to web requests affects how well it ranks in Google. If your site is slow, make an effort to speed it up. Aside from the outright SEO benefits, increasing site speed makes users happy (which has its own SEO benefits). Plus visitors tend to spend more time on sites that are fast.
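
There are many ways to speed a site up; as one example, on an Apache server with mod_deflate and mod_expires available (an assumption about your stack), compressing text responses and letting browsers cache images can be switched on in .htaccess:

    # .htaccess - assumes Apache with mod_deflate and mod_expires enabled
    <IfModule mod_deflate.c>
        # Compress text-based responses before sending them to the browser
        AddOutputFilterByType DEFLATE text/html text/css application/javascript
    </IfModule>
    <IfModule mod_expires.c>
        # Let browsers cache images for a month instead of re-downloading them
        ExpiresActive On
        ExpiresByType image/png "access plus 1 month"
        ExpiresByType image/jpeg "access plus 1 month"
    </IfModule>

This is only a starting point; image optimization, fewer requests, and better hosting usually matter just as much.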

5. Keyword Stuffing

Thanks to algorithm updates like Panda and Penguin, Google has been cracking down on low-quality pages and on pages that are stuffed with keywords. Make sure whoever writes content for your website isn’t over-optimizing. It’s also worth reviewing old content written before the rules changed. Content that was effective in 2005 can be deadly now. If you find any offending pages, rewrite and re-optimize them for the 2013 search world.

6. Spammy Links

The same thing goes for a site’s link profile. Google is now cracking down on links from spammy sites and over-optimized anchor text. Even if a site has been strictly practicing white hat SEO for the past few years, there could still be old links that are hurting the site now. Engaging in some very targeted link un-building could boost the site’s performance.
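
One concrete way to do that un-building is Google’s Disavow Links tool in Webmaster Tools, which accepts a plain text file listing links you want Google to ignore. A hedged sketch (the domains below are placeholders, not real offenders):

    # disavow.txt - uploaded through the Disavow Links tool in Webmaster Tools
    # Lines beginning with # are comments
    # Ignore every link from an entire spammy domain (placeholder name)
    domain:spammy-directory.example
    # Ignore a single specific page that links to the site
    http://low-quality-blog.example/paid-links.html

Disavowing is a last resort; Google recommends trying to get bad links removed at the source before filing a disavow.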

7. Broken Links

Speaking of links, it’s not just the spammy ones that can negatively impact SEO. Search engines also punish sites with dead links, not to mention that they make for a terrible user experience. Broken links are links, either internal or external, that lead to a page that no longer exists or no longer contains the relevant content. Those links should either be changed to point to the correct content or deleted. Thankfully, tools that identify broken links make the task easier.
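
If you’d rather roll your own check, here is a bare-bones sketch in Python. It assumes the third-party requests library is installed (pip install requests) and that you supply the list of URLs yourself; the ones below are placeholders, and in practice you would pull them from a crawl or your sitemap:

    # check_links.py - minimal broken-link checker (assumes: pip install requests)
    import requests

    # Placeholder URLs; replace with links collected from a crawl or sitemap
    URLS = [
        "http://www.example.com/",
        "http://www.example.com/old-page.html",
    ]

    for url in URLS:
        try:
            # HEAD keeps the request lightweight; some servers only answer GET
            response = requests.head(url, allow_redirects=True, timeout=10)
            if response.status_code >= 400:
                print("BROKEN (%d): %s" % (response.status_code, url))
        except requests.RequestException as exc:
            print("ERROR (%s): %s" % (exc, url))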

8. Robots.txt File

A robots.txt file is useful in disallowing crawlers from accessing parts of your site you don’t want search engines to see. However, if the robots.txt file has been configured so that it’s accidentally disallowing access to parts of the site that you do want to be indexed, it can be disastrous for your SEO. Manually check your robots.txt file and make sure there aren’t any messages in Webmaster Tools telling you Googlebot is having trouble accessing your site.
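
As an illustration (the /admin/ path and sitemap URL are placeholders), the difference between a catastrophic robots.txt and a sensible one can be a single character:

    # DANGEROUS - this rule hides the entire site from every crawler:
    # User-agent: *
    # Disallow: /

    # SENSIBLE - block only a private area and leave everything else crawlable:
    User-agent: *
    Disallow: /admin/

    Sitemap: http://www.example.com/sitemap.xml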

9. Incorrect or Out-of-Date Sitemap

Sitemaps help crawlers discover the content on your site. However, there are a variety of problems that could prevent a site from receiving the full benefit of a sitemap. A sitemap needs to be a well-formed XML document that follows the sitemap protocol; if it isn’t, search engines might have trouble processing it. Furthermore, you should make sure it’s been submitted to Webmaster Tools. It also needs to be up to date: all of the pages in the sitemap need to show up in the site crawl, and vice versa.
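
For reference, a minimal sitemap that follows the sitemaps.org protocol looks like this (the URLs and date are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2013-06-01</lastmod>
      </url>
      <url>
        <loc>http://www.example.com/about/</loc>
      </url>
    </urlset>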

10. URL-Based Duplicate Content

Sometimes a site will have different URLs (not redirected) pointing to either the same page or pages that are virtually identical, such as the same product page sorted by different criteria. Google will register these as separate pages and judge the site guilty of duplicate content. As everyone knows, duplicate content comes with major SEO costs. Consider specifying a canonical page so that Google only indexes one version.
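
The standard way to do that is a rel="canonical" tag in the head of every duplicate version, pointing at the one URL you want indexed (the product URL below is a placeholder):

    <!-- Placed in the <head> of every URL variant of the page -->
    <link rel="canonical" href="http://www.example.com/products/blue-widget/" />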

There you have it, ten little-known factors that can impact SEO. Though these problems often fly under the radar, fixing them could well result in better search engine rankings.
