Kentico CMS 7: SEO Improvements

Search Engine Optimization is one of the most crucial aspects of running a successful website. Kentico CMS 7 contains several improvements that make it easier to perform operations that have an impact on SEO. This post provides a basic overview of the new features and settings.

Kentico CMS 7 adds many new settings and general enhancements that help create an SEO-friendly website without complicated configuration. Many of the following features are already supported in the current version, but they don't provide as many options, aren't as straightforward, and in some cases require customization.

Review of HTTP response codes

If a page isn't successfully serving content, the returned HTTP response code shouldn't be 200. It should always correspond to the current state (Page not found – 404, Site offline – 503, Access denied – 403, etc.), so that applications and web crawlers can correctly interpret the situation. All status pages have been reviewed and now return the appropriate response code.
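The mapping from page state to response code is simple. The following Python sketch is purely illustrative (the state names are hypothetical and Kentico resolves the state internally), but it shows the rule the status pages now follow:

```python
# Illustrative mapping of page states to HTTP status codes.
# The state names are hypothetical; Kentico determines the state internally.
STATUS_CODES = {
    "ok": 200,             # content served successfully
    "not_found": 404,      # Page not found
    "access_denied": 403,  # Access denied
    "site_offline": 503,   # Site offline
}

def status_for(page_state: str) -> int:
    """Return the HTTP status code a crawler should see for a page state."""
    # Fall back to 500 for unexpected states rather than a misleading 200.
    return STATUS_CODES.get(page_state, 500)
```

The key point is the fallback: an error condition should never silently answer with 200.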

Document-based 404 pages

Even though Kentico CMS 6 has partial support for 404 error pages specified in the content tree, the recommended approach is to create a physical page in the file system. This scenario, however, has limitations when it comes to instances with multiple sites and multilingual websites. In version 7, 404 pages can be defined in the content tree just like any other document, with all the advantages that Kentico CMS offers.

Accessible URLs for deleted pages

There is a new option of specifying a replacement page when deleting documents. This means that one of your website’s other pages will be displayed whenever the URL of the deleted document is accessed. Of course, this can be combined with a permanent 301 redirection of the URL.

Consider a situation where a visitor opens a deleted page and sees a Page not found error instead of the expected content. In this case, the user will probably lose interest and leave the website to look somewhere else. But if you display alternative content that is similar to the original document, or at least offers an explanation and further options, the visitor will often stay.
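Conceptually, the feature behaves like a redirect table consulted before the 404 handler. A minimal Python sketch, using a hypothetical URL mapping (not Kentico's implementation):

```python
# Hypothetical table mapping deleted document URLs to replacement pages.
REPLACEMENTS = {"/old-product": "/products"}

def handle_request(path: str):
    """Return (status, headers) for an incoming request path."""
    if path in REPLACEMENTS:
        # A permanent 301 redirect tells crawlers to transfer the
        # original URL's ranking signals to the replacement page.
        return 301, {"Location": REPLACEMENTS[path]}
    return 200, {}
```

The 301 variant is what preserves the deleted page's accumulated link equity; serving the replacement content under the old URL without a redirect would instead create a duplicate.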

Avoiding duplicate content

One of the key rules of SEO is "Don't create duplicate content". Duplicate content occurs whenever multiple URLs serve the same page, and it can have a significant negative effect on the search ranking of that page. Consider the different URLs that can reach a website's Home page: the root URL, the default document name, variants with and without a trailing slash, and so on.

All of these URLs lead to the same page. In Kentico CMS 7, you can set up the system to perform permanent redirects to the preferred Home page URL using a simple setting.

It is also possible to enforce consistency of the www domain prefix in URLs. You can choose to leave the domain as it was entered, or have it rewritten to either always or never include the prefix.
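The effect of the www setting can be sketched as a URL normalization step (plain Python, not the Kentico rewriting engine):

```python
from urllib.parse import urlsplit, urlunsplit

def normalize_www(url: str, use_www: bool = True) -> str:
    """Rewrite the host so it always (or never) carries the www prefix."""
    parts = urlsplit(url)
    host = parts.netloc
    if use_www and not host.startswith("www."):
        host = "www." + host
    elif not use_www and host.startswith("www."):
        host = host[len("www."):]
    # Path, query string and fragment are preserved unchanged.
    return urlunsplit((parts.scheme, host, parts.path, parts.query, parts.fragment))
```

In practice the non-preferred form would also be answered with a 301 redirect to the normalized URL, so crawlers consolidate the two hosts into one.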

Culture specific domains

Kentico CMS 6 already has support for language prefixes, which insert a custom subdirectory into the URL according to the current language. This offers a way to split multilingual content without having a negative effect on SEO. Version 7 extends the multilingual SEO support by providing a way to enforce separate domain names for individual culture versions of the website.

If this feature is enabled, the lang query string parameter and language prefix options are ignored. Visitors are always redirected to the appropriate domain when the language is changed. The domain-language relationships are specified by assigning the primary culture to the website's main domain and additional cultures to specific domain aliases.
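The redirect logic amounts to a culture-to-domain lookup. A minimal sketch, with hypothetical domain assignments standing in for the main domain and its aliases:

```python
# Hypothetical culture-to-domain assignments (main domain plus aliases).
CULTURE_DOMAINS = {"en-US": "example.com", "de-DE": "example.de"}

def redirect_for_culture(host: str, path: str, culture: str):
    """Return a 301 redirect target when the request hits the wrong domain."""
    target = CULTURE_DOMAINS.get(culture)
    if target and target != host:
        return 301, f"http://{target}{path}"
    return None  # already on the correct domain for this culture
```

Because each culture lives on exactly one domain, crawlers never see the same translated content under two different hosts.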

XML Sitemap and robots.txt management

The XML Sitemap (Google sitemap) was optimized to generate unified URLs in accordance with all settings that affect SEO (domain, culture, format, etc.). Documents also have new settings for specifying the values of the optional tags in the XML Sitemap.

A new XML Sitemap web part allows you to create a custom sitemap page as a document in the content tree. This offers a way to specify exactly which pages should be included, and even modify the format of the output XML.
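The output follows the standard sitemaps.org protocol. A minimal Python generator for the optional per-document tags mentioned above (illustrative only, not the web part's code):

```python
from xml.sax.saxutils import escape

def build_sitemap(entries):
    """Build sitemap XML from (loc, lastmod, changefreq, priority) tuples."""
    urls = []
    for loc, lastmod, changefreq, priority in entries:
        urls.append(
            "<url>"
            f"<loc>{escape(loc)}</loc>"
            # The three tags below are the optional ones whose values
            # documents can now set individually.
            f"<lastmod>{lastmod}</lastmod>"
            f"<changefreq>{changefreq}</changefreq>"
            f"<priority>{priority}</priority>"
            "</url>"
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
        + "".join(urls)
        + "</urlset>"
    )
```

Escaping the `<loc>` value matters because document URLs can contain characters (such as `&`) that would otherwise break the XML.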

Robots.txt instructions can now be generated directly by the site’s content thanks to the new Custom response web part. The advantage over a regular text file is that you can use macros to get values dynamically, set a custom robots.txt response for every website on a multi-site instance, and edit the content directly without having to access the file system.
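The macro-driven approach can be sketched as simple template expansion. The template text and placeholder below are hypothetical, standing in for content edited in the Custom response web part and a macro resolved per site:

```python
# Template standing in for robots.txt content edited in the web part.
# {site_url} plays the role of a Kentico macro resolved per site.
ROBOTS_TEMPLATE = (
    "User-agent: *\n"
    "Disallow: /admin/\n"
    "Sitemap: {site_url}/sitemap.xml\n"
)

def robots_txt(site_url: str) -> str:
    """Render a per-site robots.txt by expanding the macro placeholders."""
    return ROBOTS_TEMPLATE.format(site_url=site_url)
```

One shared template can then serve every site on a multi-site instance, each receiving its own Sitemap URL.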

Nofollow for user-contributed links

While user-generated content can be a good way to improve SEO, it also carries risks in some cases. In version 7, you can choose to instruct search engine crawlers not to follow links posted by users on forums, message boards or in blog comments. The system then automatically includes the rel="nofollow" attribute in the output code of all such link tags.

Using this precaution prevents spammers from damaging your website's search ranking by posting large numbers of links leading to unrelated content, and it also stops them from passing page rank to other sites.
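The transformation itself is a small rewrite of the rendered markup. A simplified Python sketch (a regex pass over anchor tags; a real implementation works on the output pipeline and handles edge cases this one ignores):

```python
import re

def add_nofollow(html: str) -> str:
    """Insert rel="nofollow" into <a> tags that do not already have a rel."""
    def patch(match: re.Match) -> str:
        tag = match.group(0)
        if "rel=" in tag:
            return tag  # leave explicitly set rel attributes alone
        # Drop the closing '>' and re-append it after the new attribute.
        return tag[:-1] + ' rel="nofollow">'
    return re.sub(r"<a\b[^>]*>", patch, html)
```

Applied to user-submitted forum or comment markup, every outbound link ends up carrying the attribute crawlers use to ignore the link for ranking purposes.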

Search crawler reports

Kentico CMS 7 brings a new web analytics statistic for tracking search crawlers. You can review your search engine optimization by monitoring what types of crawlers visit your website and how many pages are crawled.

This statistic is also available for individual documents, so you can see in detail how often a given page or its specific version is actually crawled.
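Under the hood, crawler tracking comes down to classifying requests by their User-Agent header. A minimal sketch with a few well-known signatures (not Kentico's detection list):

```python
# A few well-known crawler User-Agent signatures (not an exhaustive list).
CRAWLER_SIGNATURES = {"Googlebot": "Google", "bingbot": "Bing", "Slurp": "Yahoo"}

def identify_crawler(user_agent: str):
    """Return the crawler name for a User-Agent string, or None for humans."""
    for signature, name in CRAWLER_SIGNATURES.items():
        if signature in user_agent:
            return name
    return None
```

Requests classified this way can then be counted per crawler and per document to produce the statistics described above.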

More details and examples will be available in the documentation for version 7. In future posts, I would like to expand on this topic and provide more specific information on how to create SEO-friendly websites in Kentico CMS.

Jakub Oczko

Jakub is a Chief Technology Officer at Kentico.


Jakub Oczko commented on

>> Is there a way to avoid duplicate content in version 6.0?

The first thing you have to do is identify why you have duplicate content, so you can address the cause.

Version 6.0 has the following options:

For changed documents, you can use document aliases:

For URL parameters you can use canonical links:

Hope it helps,


Johny Lee commented on

Is there a way to avoid duplicate content in version 6.0?

seoonlinejaipur commented on

Hi! Just a quick note to let you know that your blog has proven to be of great value to what I am working on right now. Thanks!

Jakub Oczko commented on


The problem is that we can't check the slashes. The URL is available to the .NET application as (the slashes are removed during IIS processing). You can use the IIS rewriter to remove the slashes or specify a canonical link on the pages. The question is where these links are generated. I can imagine a situation where some websites link to your website with this type of URL, but is there any other scenario in which these links are served to the search crawler?

The XML sitemap web part allows you to filter the displayed content in a similar way to document viewers. You can define a specific sitemap for different situations (e.g. multiple websites or cultures) without any changes to the aspx code.


Rhino Car Hire commented on


What about a setting for redirecting double-slash issues in URLs?

For example: shows the same page and could count as duplicate content in search engines.

Also, will the new XML sitemap settings allow document types other than menuitem to be listed? Currently you have to manually add the class names to the aspx file, which is a bit backwards ;-)

Jakub Oczko commented on

Hi Georgio,

I agree, these features are crucial for successful SEO support. That is why we improved the way they are set up in version 7. What are you missing in Kentico CMS regarding SEO support when you compare it with other CMS systems?

The URL with GUID and specific prefix (getmedia/getattachment/getdocument etc.) is generated if you have enabled permanent URLs. You can disable it in Site manager -> Settings -> URLs & SEO -> Use permanent URLs.

If you mean meta keywords, there is no plan to extend the current functionality, because the keywords meta tag isn't important to major search engines. But if you need this meta tag, you can use macros to generate keyword values dynamically. If you mean keyword analysis for pages, we are planning SEO Tools with support for keyword analysis.

A designer for robots.txt isn't planned. You can edit it as plain text and use macros to add some automation. But if we get feedback that you need it, we will consider it. You can use to suggest this feature.

SEO Tools (incl. an SEO checker) won't be in version 7, but we are planning this feature for one of the next versions, or may release it separately.



Georgino commented on


these "new" features are MUST features for any CMS...Look at open source CMS, they have built-in such features.

What about better names for images/documents referenced via /getattachment and other functions? Links like this one are "very SEO friendly";;?pf=1

How do you help me with keywords? Any AI will be very beneficial.

Are you planning a designer for robots.txt or just simple plain text editor?

Any SEO checker tool that checks my page/website against recommended SEO rules?

Mohamed Abd El Gawad commented on

very good features

Roel de Bruijn commented on

Can't wait! This is great news!!

Jeroen Fürst commented on

Loving these new settings! Will be a great summer :)