We keep getting an issue where Google Webmaster Tools detects an error fetching our robots.txt file. We can hit the file fine in a browser, but even when we do a Fetch as Google, it comes back with an error. It is not just one site; we seem to get this on a lot of them. Has anyone else had this issue? We do host many sites on one server.
Version 8.2
As long as you're setting the location of the robots.txt file in the settings for each site, you should be fine. Are you using the content tree in each site to serve up the robots.txt file, or do you have a physical robots.txt file in the root of each website's file system?
I set the robots.txt path in settings. Here is the link to the file: http://www.griffin-realtors.com/robots.txt
Humans / browsers can view it.
Hi,
I think it might 'crash' on the wildcard (not all bots even support wildcards), but I believe each Disallow path has to start with a / (slash),
as in Disallow: /*.axd
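For reference, a minimal complete file along those lines might look like this (the .axd rule is the one under discussion; the User-agent line is the standard wrapper every rule group needs):

User-agent: *
Disallow: /*.axd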
If that doesn't help, you could perhaps try the Google robots.txt Tester, which can help you identify whether that's the problem: https://support.google.com/webmasters/answer/6062598?hl=en&ref_topic=6061961&vid=1-635767305003500591-2056396079
It can only be used by the owner of the domain :)
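If you can't use the tester, a rough local sanity check is possible with Python's standard-library urllib.robotparser. This is just a sketch using the URL posted above; note that Python's parser does plain prefix matching, so it may not honor wildcard rules like /*.axd, which echoes the point about inconsistent wildcard support among consumers:

from urllib.robotparser import RobotFileParser

# Fetch and parse the live robots.txt (URL from the post above).
parser = RobotFileParser("http://www.griffin-realtors.com/robots.txt")
parser.read()

# Ask whether Googlebot may fetch a couple of representative paths.
# Caveat: this parser uses simple prefix matching, so the /*.axd
# wildcard rule may not be applied the way Google applies it.
print(parser.can_fetch("Googlebot", "/"))          # expected: True
print(parser.can_fetch("Googlebot", "/foo.axd"))   # may still print True here

If read() itself raises an error (e.g. a connection or HTTP failure), that would at least confirm the fetch problem isn't specific to Google's crawler.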