Website Usability: How You Can Prevent Duplicate Content on Your Website and Improve Usability

This post was written by Internet Marketing John on October 10, 2010
Posted Under: Website Usability

Learn how to improve your website's usability and prevent duplicate content simply by paying attention to the internal pages on your site that can cause problems.

Certain types of web pages, found on almost every website, can inadvertently cause duplicate content to be indexed by the search engines.

These innocent-looking pages are usually generated to attract more visitors, but they can cause problems once indexed if you don't take preventive measures beforehand.

There is an excellent chance that pages of the following types on your website are already being indexed.

  • Print pages
  • Tell-a-friend pages
  • Internal search pages
  • Filtered navigation pages
  • Admin pages
  • Some squeeze pages

Filtered navigation is a convenient feature that helps visitors use navigation menus to locate your products instead of relying on search alone, but filtered navigation pages can be indexed and cause duplicate content issues.
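For example (these URLs are purely illustrative, not taken from any real site), a single category page filtered by two facets can be reached at several different addresses that all show essentially the same products:

/shoes?color=red&size=9
/shoes?size=9&color=red
/shoes?color=red&size=9&sort=price

If the search engines crawl and index each of these variations, they see several copies of what is effectively one page.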

The same is true for “tell a friend” or any of the above pages.

A “tell a friend” page is convenient and useful for human visitors, but because it contains only an email address input field and a send button, it is of no value to the search engines.
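In practice such a page usually contains little more than a small form. A minimal sketch (the action URL and field name here are made up for illustration) might look like this:

<form action="/tell-a-friend.php" method="post">
  <input type="email" name="friend_email">
  <button type="submit">Send</button>
</form>

There is nothing here for a search engine to rank, yet every copy of the page can still end up in the index.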

You can check whether any of these pages are already being indexed by using the following query:

site:yourdomain.com inurl:tell-a-friend

This query will return every indexed URL on your domain that contains “tell-a-friend”.

If every product you sell has the “tell-a-friend” option on its own page, a new URL is generated for every product on your website. Although this is a convenience for your visitors, it simply duplicates every product page on your site for no reason.

There are several methods you can use to improve website usability and prevent duplicate content by excluding pages like these from being indexed by the search engines.

The best way to keep these pages away from the search engines is not to link to them with plain HTML links at all; instead, generate the links with JavaScript.

Because the search engines don't execute much JavaScript, this is probably the easiest way to keep faceted navigation pages from being indexed.
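As a rough sketch (the URL and markup are hypothetical; adapt them to your own templates), a link opened by JavaScript rather than by a plain crawlable href might look like this:

<a href="#" onclick="window.open('/tell-a-friend.php?product=123'); return false;">Tell a friend</a>

Because the destination never appears as an ordinary link, most crawlers will not discover or follow it. Keep in mind this also hides the link from visitors who have JavaScript turned off.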

You can also use meta instructions, specifically the “robots” meta tag with values such as “noindex”, “follow”, or “nofollow”, on pages you don't want indexed.

If you don't want a page indexed, add a “robots” meta tag to that page with the “noindex” instruction.

Add “follow” alongside “noindex” on pages that have already been indexed: “noindex” tells the search engines to drop the page from their index, while “follow” tells them to keep following the links on it so the rest of your site is still crawled.

<meta name="robots" content="noindex, follow">

Another effective method is to set exclusion rules in your robots.txt file.

Search engines are supposed to adhere strictly to these rules; when you add a rule disallowing all “tell-a-friend.php” URLs, well-behaved crawlers will normally avoid touching the excluded pages.
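A minimal sketch of such a rule (assuming the pages are all served from a single tell-a-friend.php script at the site root; adjust the path to match your own URLs):

User-agent: *
Disallow: /tell-a-friend.php

Note that robots.txt blocks crawling rather than guaranteeing removal from the index, so combine it with the meta tag approach above where that matters.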

By excluding these innocent-looking pages from being indexed with the methods above, you can avoid duplicate content issues on your website, improve its usability, and still give your visitors the convenience they expect.
