Google Webmaster Tools - Fetch Error

Eric Garrison asked on September 1, 2015 04:17

We keep getting an issue where Google Webmaster Tools detects an error fetching our robots.txt file. We can hit the file fine, but even when we do a Fetch as Google, it comes back with an error. It is not just one site; we seem to get this from a lot of them. Has anyone else had this issue? We do have many sites on one server.

Version 8.2

Recent Answers


Brenden Kehren answered on September 1, 2015 04:28

As long as you're setting the location of the robots.txt file in the settings for each site, you should be fine. Are you using the content tree in each site to serve up the robots.txt file, or do you have a robots.txt file in the root of the website's file system?


Eric Garrison answered on September 1, 2015 04:33

I set the robots.txt path in settings. Here is the link to the file: http://www.griffin-realtors.com/robots.txt

Humans / browsers can view it.


David te Kloese answered on September 1, 2015 21:00

Hi,

I think it might 'crash' on the wildcard (not all bots even support it), and I believe the rule has to start with a / (slash),

as in Disallow: /*.axd

If that doesn't help, the Google robots.txt Tester can help you identify whether that's the problem: https://support.google.com/webmasters/answer/6062598?hl=en&ref_topic=6061961&vid=1-635767305003500591-2056396079

It can only be used by the owner of the domain :)
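If you want a quick offline sanity check before re-fetching in Webmaster Tools, the point above can be sketched as a tiny linter that flags Disallow/Allow paths that don't start with a slash. This is just an illustration of the rule, not Google's actual parser, and the sample rules are hypothetical:

```python
def lint_robots(text):
    """Flag Disallow/Allow rules whose path does not start with '/'.
    Some parsers reject or ignore such rules."""
    problems = []
    for lineno, line in enumerate(text.splitlines(), 1):
        line = line.split("#")[0].strip()  # drop comments and whitespace
        if not line or ":" not in line:
            continue
        field, _, value = line.partition(":")
        if field.strip().lower() in ("disallow", "allow"):
            value = value.strip()
            # An empty Disallow value is legal (it means "allow all"),
            # so only non-empty paths are checked for the leading slash.
            if value and not value.startswith("/"):
                problems.append((lineno, line))
    return problems

sample = """User-agent: *
Disallow: *.axd
Disallow: /CMSPages/
"""
print(lint_robots(sample))  # [(2, 'Disallow: *.axd')]
```

Running it against your live robots.txt contents would point you straight at any rule missing the leading slash.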

