Why might Googlebot get errors when trying to access my robots.txt file?

I'm getting errors from Google Webmaster Tools: the Googlebot crawler is unable to fetch my robots.txt file about 50% of the time, yet I can fetch it with a 100% success rate from various other hosts. This happens on both a plain nginx server and an mit.edu host. Yang, Palo Alto, CA
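One pattern worth checking when fetches fail roughly half the time for Googlebot but never from elsewhere is a hostname that resolves to more than one server, with one backend misconfigured or blocking Googlebot's user-agent. The following Python sketch is one way to test that hypothesis (example.com is a hypothetical stand-in for the affected domain, and it assumes plain HTTP on port 80): it resolves every A record and requests robots.txt from each backend directly, using Googlebot's user-agent string.

    import socket
    import urllib.request

    # Hypothetical values; point HOST at the affected domain.
    HOST = "example.com"
    GOOGLEBOT_UA = (
        "Mozilla/5.0 (compatible; Googlebot/2.1; "
        "+http://www.google.com/bot.html)"
    )

    # Collect every IPv4 address the hostname resolves to. With round-robin
    # DNS or a load balancer, a ~50% failure rate can mean one backend
    # serves robots.txt correctly while another does not.
    ips = sorted({info[4][0] for info in
                  socket.getaddrinfo(HOST, 80, socket.AF_INET)})

    for ip in ips:
        # Hit each backend directly, sending Googlebot's user-agent in
        # case the server or a firewall treats that user-agent specially.
        req = urllib.request.Request(
            f"http://{ip}/robots.txt",
            headers={"User-Agent": GOOGLEBOT_UA, "Host": HOST},
        )
        try:
            with urllib.request.urlopen(req, timeout=10) as resp:
                body = resp.read()
                print(f"{ip}: HTTP {resp.status}, {len(body)} bytes")
        except Exception as exc:
            print(f"{ip}: FAILED ({exc})")

If one IP consistently fails while the others succeed, the problem is likely that backend (or a firewall rule in front of it) rather than the robots.txt file itself.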