Let's say you mailed a letter, but there are 50 other people with the same name in the same city, and there's no exact house number. Where would the letter get delivered? How would the postman know which of the 50 people is the correct addressee? Not only will it confuse the poor postman, it will also cause him a lot of distress.
Now, if the postman is a search engine and there is more than one web page with the same content, imagine its plight as it tries to figure out which one to rank in search results. This is what causes a duplicate content issue in SEO. In fact, one analysis of 200 million crawled pages found that 29% contained duplicate content.
Content duplication is simply the presence of the same content at multiple addresses (or URLs). It isn't always intentional. No website owner or developer wants to lose search engine rankings because their content is similar to content on another website.
Some of the common causes of duplication are:
Suppose you find a great blog post on a website and share it on yours. You are copying that content to your website, which may not be ethically or morally wrong. But for a search engine, it means the same content now lives at multiple locations.
This is a problem that ecommerce websites face in particular. Products from a given manufacturer may be sold on many online stores, all using the same product description. As a result, only a few prominent stores will rank and get the business.
Siteliner and Copyscape are two tools commonly used to detect duplication. While Siteliner checks a website for internal duplication, Copyscape checks a website's content for duplication across other websites.
At times, a slight change in the order of a URL's parameters can create duplicate content. These parameters do not change a page's content, but to a search engine, the two variants are two different URLs.
Similarly, when you shop online, the website may give you a session ID, which is essentially a log of your activity on the site. Many systems end up appending these session IDs to URLs, creating yet more addresses for the same page.
Image Source: Moz
Search engine rankings of a website are adversely impacted when its URLs carry multiple parameters. These parameters end up creating many URLs with similar content, which can confuse a crawler and hurt the website's indexing. Google tends to rank parameter-free URLs better.
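One defensive measure on the application side is to always emit query parameters in a consistent order, so parameter shuffling cannot mint new URLs. A minimal sketch in Python (the URLs and parameter names here are hypothetical):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def normalize_url(url):
    """Sort query parameters alphabetically so that URLs differing
    only in parameter order collapse to a single canonical form."""
    scheme, netloc, path, query, fragment = urlsplit(url)
    params = sorted(parse_qsl(query))  # [('color','red'), ('size','9')]
    return urlunsplit((scheme, netloc, path, urlencode(params), fragment))

# Two URLs that serve identical content but look different to a crawler:
a = normalize_url("https://example.com/shoes?color=red&size=9")
b = normalize_url("https://example.com/shoes?size=9&color=red")
print(a == b)  # True: both normalize to the same URL
```

The same idea applies in any templating or link-building code: generate links through one helper so every internal link to a page uses the identical URL string.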
Many sites are reachable at both the www and non-www versions of their URL. Certain websites also have two versions, one prefixed with http:// and one with https://. If both versions of a website are live, search engines index them separately.
If you want a successful SEO campaign, duplication needs to be addressed. How can you avoid it?
Redirecting is one of the best ways to deal with duplicate content. A 301 redirect tells search engines that a page has permanently moved from the "duplicate" to the "original" location.
This solves the problem of competition among multiple pages with slight URL variations. The "https://" and "http://" versions, and the www and non-www versions, will automatically be consolidated at the same location.
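As an illustration, on an Apache server with mod_rewrite enabled, rules like the following in an .htaccess file would consolidate traffic onto a single https://www host. This is only a sketch; example.com is a placeholder and the exact rules depend on your hosting setup:

```apache
RewriteEngine On

# Send http:// traffic to the https:// version with a permanent redirect
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.example.com/$1 [L,R=301]

# Send non-www requests to the www version
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [L,R=301]
```

Because the redirect is a 301 (permanent), search engines transfer the old URL's ranking signals to the destination rather than indexing both.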
Image source: Moz
The "rel canonical" tag is simply a way to tell a search engine that there is no difference between two URLs. It indicates that Page X is a duplicate of the original Page Y, and that Page Y should be the one considered for all future ranking and content purposes.
The "rel canonical" tag is placed in a web page's HTML head, and it should be added to every copied version of the page.
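Concretely, the duplicate page's head carries a tag like this (the URL is a placeholder for your own original page):

```html
<head>
  <!-- On the duplicate page (Page X): declare Page Y as the original -->
  <link rel="canonical" href="https://www.example.com/original-page/" />
</head>
```

Unlike a 301 redirect, the duplicate page stays accessible to visitors; only the search engine is told which version to rank.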
Image Source: HiTechWork
Create unique, high-quality content for your web pages to avoid the duplication problem. If all you have is a product description from the manufacturer, write a new one and use original pictures so that you don't overlap with other ecommerce sites.
This is time-consuming, but in the long run it is important that your website stands out. Unique content will be picked up by search engines and boost the website's ranking.
To avoid duplication through session IDs, simply disable them in your system settings. For URL variations, make sure your scripts build URLs with parameters in the same order.
Google Search Console has a URL Parameters tool that helps create search-engine-friendly URLs and improve how the site appears in search results. Using proper words and punctuation, and using cookies instead of session IDs in URLs, are other simple ways to resolve this issue.
Image Source: Hallam
If you are copying content from another website and wish to avoid the duplication issue, simply add a link to the original at the beginning or the end of the web page. This can be done when you are syndicating content. Here, ensure that the syndicating website (say, an RSS partner) links back to the original site.
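Where the syndication partner allows it, a stronger option is a cross-domain canonical tag on the republished copy, pointing back to the original article (both URLs below are placeholders):

```html
<!-- In the <head> of the syndicated copy, on the partner's domain -->
<link rel="canonical" href="https://original-site.example.com/blog/great-post/" />
```

This tells search engines to credit the original publisher, so the syndicated copy does not compete with it in the rankings.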
Conclusion
You don't need to get spooked by duplicate content. Many times it happens accidentally. But there are also times when website owners take content from another website and publish it on theirs after slight changes. This too leads to duplication, even if you might think the language is different.
While ranking a website, a search engine considers how much of the content is copied, which version it saw first, and which website has more authority. When a search engine finds a page with copied content, not only does that page's ranking suffer, the site may also be flagged as an unreliable source and lose quality points.
Duplication is a big pitfall when it comes to improving a website's search ranking, but it is highly fixable. Once the issue is resolved, you can keep track of how your website's search ranking improves over time.
But keeping a constant watch on a website's performance is difficult. For that you can use Pro Rank Tracker. This tool gives you regular updates about your website's rankings before and after you fix its duplication issues, so you can efficiently track your growth.
Search engines make no exceptions when gauging the value of content; they purely follow set algorithms. To get noticed by search engines, fix the duplication problem immediately. A good search engine ranking means good traffic, which will drive up your ROI.