As you get busy handling the different elements of the medical SEO of your practice, you could be overlooking more basic but crucial aspects such as duplicate content on your site. By that we mean content on two or more pages of your site targeting the same keyword theme.
Why You Shouldn’t Have Duplicated Content
The problem with duplicate content is that you end up competing with your own content. In other words, if you've got two pages targeting the same keyword theme, neither gets the search engine's undivided attention. Both pages lose relevance in the eyes of search engines, because the algorithms can't easily determine which of the two to rank for a given keyword or search term.
When multiple pages are internally linked around the same keyword theme, links that should all be strengthening a single page instead have to support several pages, each of them weakly, so none of the pages gains clear authority. This muddies the signals your site sends to the search engine's algorithms, hurts your ranking ability, and pushes your site further down the results for that keyword – not what you want after all the effort you've put in.
So, as we've seen, duplicate content – content on multiple pages targeting the same keyword theme – causes those pages to lose relevance, since they all deal with the same keyword or search term, and it dilutes your link authority, since your links have to support multiple pages.
Tackling Duplicated Content
There are many ways to deal with duplicate content, but simply removing the pages is not always the best option. Those pages may contain important information your readers need to see, and despite the duplication, each indexed URL carries at least some link authority. Killing the pages wastes that link authority, which hurts the SEO performance of your site.
Then you've got to ask yourself whether you really need to remove the pages from your site because of any legal issues. If not, would you rather hide the page from the search engines while keeping it visible to your visitors?
The 301 redirect is one of the foremost options to consider. It can help remove duplicate content because it redirects both the user and the search engine bot, passes the link authority of the old page to the new URL, and de-indexes the old URL. It's also important to note that search engines can't ignore a 301 redirect, since it is a command rather than a request.
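As a rough sketch, on an Apache server a 301 redirect can be set up with a single line in the site's .htaccess file. The paths below are purely illustrative – replace them with the duplicate page you're retiring and the page you want to keep:

```apache
# Hypothetical rule (assumes Apache with mod_alias enabled):
# permanently send visitors and bots from the old duplicate page
# to the page you want to rank, passing its link authority along.
Redirect 301 /services/teeth-whitening-old /services/teeth-whitening
```

How you implement the redirect depends on your hosting setup; on Nginx or a CMS like WordPress the mechanism differs, but the 301 status code and its effect are the same.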
302 redirects are similar to 301 redirects. They also pass link authority, though 301 redirects do so more effectively. The difference is that 302 redirects are meant for temporary moves, when you want the page to come back: with a 302 redirect, Google won't de-index the page, so you can make it visible again later.
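On Apache, the temporary version looks almost identical – only the status code changes. Again, the paths here are made up for illustration:

```apache
# Hypothetical rule: temporarily redirect a page you plan to restore,
# e.g. a seasonal service page. Google keeps the original URL indexed.
Redirect 302 /services/flu-shot-clinic /services
```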
There are also canonical tags, which let users continue to see the page. You'll need developer support to implement them, but they consume fewer server resources and require much less testing – though Google can choose to ignore them, too.
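A canonical tag is a single line placed in the `<head>` of the duplicate page, pointing at the version you want search engines to treat as the original. The domain and path below are placeholders:

```html
<!-- Added to the <head> of the duplicate page. The href points to the
     preferred (canonical) version of the content; URL is illustrative. -->
<link rel="canonical" href="https://www.example-practice.com/services/teeth-whitening" />
```

Unlike a 301 redirect, visitors who land on the duplicate page stay on it – the tag is a hint only search engines act on.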
Google Index > Remove URLs
For temporary exclusion of a page from Google's search results, head to the Remove URLs tool in Google Search Console. Your readers can still see the page on your site, but Google will de-index it right away. Be warned, though, that using this tool incorrectly can result in your entire site being de-indexed.
To remove a page completely from the view of both your readers and the search engine bots, use the 301 redirect mentioned earlier to send visitors to a newer page, or take the page off your server, in which case users will see a 404 "File not found" error.
Robots.txt Disallow Commands and No-index Tags
Some of the less preferable options are robots.txt disallow commands and no-index tags, since they cause the link authority of your pages to be wasted. They tell search engines not to index the pages, but search engines nowadays often treat these commands more like suggestions, particularly for content that's already indexed. So robots.txt disallow commands and no-index tags are more hit and miss. Even when search engines do obey them, they can take a good deal of time – even months – to take effect. These commands serve best as safety measures after content has been de-indexed, and to prevent future indexation of content that isn't yet indexed.
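For reference, the two mechanisms look like this. A robots.txt file sits at the root of your site and asks crawlers to stay out of certain paths (the path below is illustrative):

```text
# robots.txt at the site root – asks all crawlers to skip this section.
User-agent: *
Disallow: /old-duplicate-section/
```

A no-index tag, by contrast, goes into the `<head>` of an individual page:

```html
<!-- Tells search engines not to include this page in their index. -->
<meta name="robots" content="noindex">
```

One caveat worth knowing: a robots.txt disallow prevents crawlers from fetching the page at all, which means they can never see a noindex tag on it – so the two shouldn't be combined on the same page.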
So, if you must kill a page, make sure you have valid reasons for it. As we mentioned before, you need to determine whether you're doing it for SEO reasons or legal ones.
Remember that link authority isn't earned easily, so don't waste it once you've gained even a little. That's why some SEO experts believe killing pages isn't such a great idea. With relevant, informative, quality content you can optimize the internal linking structure of your site so that authority flows to where you need it. Increasing link authority from external sites is a harder proposition, though. Professional medical SEO companies can handle this for you.