What to Do with Duplicate Content

Mar 2


Maria Wixman


When you decide to make a website, one aspect that should always be considered is that, in the all-important world of the search engines, duplicate content is penalized. The penalty is negative filtering, which can hugely affect your page ranking and how friendly the search engines are to your site.


The filtering process acts like a sieve, and on many occasions accidental duplicate content gets filtered out along with all the intentionally multi-listed web pages chasing higher rankings.

What is Duplicate Content?

In order to understand what to do with duplicate content, we first need to establish exactly what is considered duplicate content. There are primarily four types of duplicate content that get filtered out:

Scraped Content – This is where content is taken from a web page and repackaged to look different, but really it is just duplicate content. With the increasing popularity of blogging, scraping is becoming more and more of a problem.

Article Distribution – If an article is republished all across the internet, this is not necessarily a good thing. While Yahoo and MSN will determine the source of the article and rank it highest, Google, the largest of the search engines, may not.

Websites with Identical Pages – A website that is identical to another website is considered spam and filtered accordingly. Affiliate sites suffer greatly from this, as do websites with doorway pages.

Ecommerce Websites with Product Descriptions – It is still considered spam when duplicate product descriptions are posted across the many ecommerce websites.

There are some other non-malicious forms of duplicate content which can be:

Pages created for mobile devices

Web pages that are printer-only versions

Store items shown or linked via multiple distinct URLs

All of the above can be considered duplicate content by the search engines, and whether malicious or not, there is a very good chance they will be seen as spam and penalized in the rankings. If this is the case, there are some things you can do when you make a website to help overcome it:

Redirect – It is possible to use a 301 redirect, which informs search engine spiders that the address being accessed has been "permanently moved" to a different location – your main URL. A 301 redirect will send both your visitors and the search engines automatically to the new URL.
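As a minimal sketch, assuming an Apache server with mod_rewrite enabled, a 301 redirect from the non-www version of a domain to the www version could look like this in an .htaccess file (example.com is a placeholder for your own domain):

```apache
# .htaccess sketch — redirect example.com to www.example.com with a 301
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

With this in place, both visitors and search engine spiders requesting any page on example.com are sent permanently to the same page on www.example.com, so only one version of each URL gets indexed.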

Be Careful Where You Syndicate – If you have content that you syndicate, or that gets syndicated, on other websites, make sure a link back to the original article is included wherever the content appears. Google will always show the version it thinks is most appropriate in the search rankings, so it is good practice to ask those who syndicate your content to use a noindex meta tag.
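A syndicating site can keep its copy out of the index with a robots meta tag in the page's head. A minimal sketch:

```html
<!-- Placed in the <head> of the syndicated copy, not the original article -->
<meta name="robots" content="noindex, follow">
```

The noindex value keeps the duplicate copy out of the search index, while follow still lets the spiders pass through the link back to your original article.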

Understand the CMS – The CMS is the content management system, and it is a good idea to understand how it displays content on the website, especially if the site is a blog or forum, where the same post can often be reached at several different URLs.

Canonicalization – This is the process of converting data that has more than one possible representation into a standard canonical form. For example, http://www.example.com, http://example.com and http://www.example.com/index.html may all serve the same home page; canonicalization means picking one of these as the standard version.
This is a great way to avoid duplicate content and stay on friendly terms with the search engines.
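One common way to signal the standard version to the search engines is the canonical link element in the page's head. A minimal sketch, again with example.com as a placeholder:

```html
<!-- On every variant of the page, point to the one preferred URL -->
<link rel="canonical" href="https://www.example.com/">
```

Whichever variant of the page a spider arrives at, the canonical link tells it which single URL should receive the ranking credit.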

Keep the Content Original – Easy to say, but very similar content is often viewed as duplicate content and treated as spam. Try to make each page unique, and if two pages are similar, look at ways to merge them into one page.

Avoid Publishing Stubs – Avoid placeholder pages where possible, as users do not like to see empty pages. For example, pages that do not yet have actual content should not be published. If placeholder pages must be created, ensure noindex meta tags are used to prevent these pages from being indexed and affecting your standing with the search engines.

Duplicate content can happen very easily and is not always intentional, but if it does happen there is a very good chance it will affect your rankings and search engine friendliness. Once you have decided to make a website, by taking the actions above and remaining vigilant you can ensure that any duplicate content problems are resolved without having to rewrite or delete.