How To Avoid Having Duplicate Content On Your Website Caused By Overuse Of Tags

Nothing says search engine kryptonite like duplicate website content. Since Google began pushing the boundaries of intelligent search results, duplicate content has been relegated to its rightful place: the bottom of the rankings.

If you have made sure your web pages carry original content, you’re safe.

Well, almost.

A lot of webmasters are surprised to learn that post tags are another way to hurt their site's search ranking potential. Tags are more specific than categories, so it only makes sense to streamline your content into super-specific tags, right? Wrong.

There is one glitch: URL duplication. Every tag you use adds another URL to your site, which means the same content appears on two (or more) different pages. Something like this:
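For illustration (these URLs are hypothetical), a single WordPress post tagged "seo-tips" ends up reachable at both its own permalink and the tag archive page:

```text
https://example.com/how-to-rank-higher/    <- the original post
https://example.com/tag/seo-tips/          <- tag archive showing the same content
```

To a search engine, both URLs serve substantially the same content, and that is where the trouble starts.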


This can cause search engines to penalize your site over time, potentially taking it out of the search results altogether. Ooh, scary!

Here’s how to effectively avoid the problem caused by overuse of tags and have a website that makes SEO sense:

Use tags sparingly

This is pretty simple. Tags can certainly help you organize content, but they can damage your SEO potential as well. And don't forget: unlike categories, tags do NOT provide direct value to users. They do not work like hashtags on social media. Your content itself has everything the search engine needs for indexing.

Avoid using more than two tags on a post. A huge trail of tags makes a page look tacky. Even Matt Cutts, who once led Google's webspam team, didn't use tags on his own blog. That's more incentive to use them sparingly.

Instead, have a well-thought-out category structure for your site. Anywhere between 5 and 8 categories is a good rule of thumb and helps you streamline your content before you even start posting. If you're after the best search ranking possible, do away with excess tags.

Use 301 Redirects

“But what about all the posts I’ve already loaded with tags?” you say. One option is a 301 redirect. A 301 sends visitors (and search engines) from an old URL straight to the content’s new home. If you ever need to change file names or delete pages, a 301 is the safest option for avoiding dead links.

For URLs that open a list of posts under a specific tag, a 301 will redirect to the page you want that URL to point to. Search engines treat a 301 as “moved permanently.” Overused, however, it can cost you some precious SEO points.
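As a minimal sketch, assuming an Apache server with mod_alias enabled (the paths below are hypothetical), a tag archive can be permanently redirected to a category page from the site's .htaccess file:

```apache
# .htaccess — hypothetical example:
# permanently redirect an old tag archive to the matching category page
Redirect 301 /tag/seo-tips/ https://example.com/category/seo/
```

On WordPress specifically, a redirection plugin achieves the same result without editing server configuration by hand.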

Go Canonical

With duplication, separate URLs lead to identical pages instead of just the original one. If there are too many pages to redirect, a 301 may mean a conversation with your developer. This is where rel=canonical comes in. All you need to do is go to each duplicate page and add a canonical tag pointing to the original page in the head section. Like this:
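A minimal sketch (the URL is hypothetical — point it at your own original page):

```html
<head>
  <!-- Tells search engines the original version of this content lives here -->
  <link rel="canonical" href="https://example.com/how-to-rank-higher/" />
</head>
```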

The tag simply tells search engines to treat the page as a copy of the URL it points to. This takes care of the problem without wasting development time. Plus, it preserves precious SEO juice that would otherwise be lost.

Meta Robots

Another way to fight the page duplication menace caused by tags is meta robots. Here’s the code sample:
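A sketch of the tag, placed in the head section of the duplicate page:

```html
<head>
  <!-- Don't index this page, but do follow the links on it -->
  <meta name="robots" content="noindex, follow" />
</head>
```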

Be sure to use the noindex, follow values in the content attribute. Meta robots is simply an instruction to search engines: it tells them not to index a certain page (in this case, the duplicate) but to still follow the links on it.

This is much more beneficial SEO-wise than blocking duplicate pages through robots.txt (another common way to eliminate duplicate content). Search engines know which page to show, but they still count the juice from all the links pointing to the duplicate page. That is what the “follow” value indicates.

There isn’t a lot tags can do for you these days. From the end user’s point of view, they don’t make much sense compared to categories. And if tags are causing serious SEO problems like URL duplication, it’s time to rethink how you use them on your website.
