Hi, I am very new to Kentico and have recently been employed by a start-up company to look after the digital side of things and the website.
The website was launched last week, and although the development agency created the sitemap and the robots.txt file, I am having some problems in Google Search Console.
When I test the robots.txt file, it seems to allow traffic:

User-agent: *
Disallow: /Admin/
Allow: /
Is the text within our robots.txt file good practice? When I attempt to test the sitemap I get the following error message:

Sitemap contains URLs which are blocked by robots.txt.
Showing 339 warnings, which I imagine is the entire site.
Any ideas on where we are going wrong?
Kind Regards
Anthony
My two cents:

According to Google's specification, robots.txt is case sensitive. Check whether that is causing the issue.

Secondly, the way it is set up, it is probably better to remove the file completely. You allow everything except the admin folder, but that folder is protected anyway, so your robots.txt is fairly useless in this configuration.

Thirdly, you can check the sitemap.xml file on your server for any URLs with an '/admin' path; see the sketch after the example below.

But honestly speaking, there is nothing wrong with a robots.txt that looks like this:
user-agent: *
disallow: /admin/
allow: /
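For the '/admin' check, here is a minimal Python sketch. The sitemap URL is a placeholder you would replace with your own, and it assumes a standard single-file sitemap rather than a sitemap index:

# Minimal sketch: list sitemap entries that an '/admin' disallow rule could block.
# SITEMAP_URL is a placeholder -- replace it with your site's actual sitemap.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://example.com/sitemap.xml"  # hypothetical URL
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP_URL) as resp:
    tree = ET.parse(resp)

# Walk every <url><loc> entry; compare case-insensitively, because the
# disallow rule itself is case sensitive and may miss some variants.
for loc in tree.getroot().findall("sm:url/sm:loc", NS):
    url = (loc.text or "").strip()
    if "/admin" in url.lower():
        print("possibly blocked:", url)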
Not a 100% expert, but just removing the allow line should do the trick.
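For reference, that would leave the file looking something like this (keeping the /Admin/ casing from your original):

User-agent: *
Disallow: /Admin/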
Hi David
Thanks for getting back to me so quickly.
I removed the Allow line as suggested, but I am still having the same problem.
Could anything else be causing the sitemap URLs to be blocked?
Hi Peter
Thanks for this. I have re-run the sitemap test and it is no longer showing any errors.
I appreciate both you and David getting back to me so quickly and helping out.