Can I use robots.txt to optimize Googlebot’s crawl? For example, can I disallow all but one section of a site (for a week) to ensure it is crawled, and then revert to a ‘normal’ robots.txt? Blind Five Year Old, SF, CA
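For reference, the kind of temporary robots.txt the question describes (blocking everything except one section) would look roughly like the sketch below. The /news/ path is a hypothetical stand-in for whatever section the asker wants crawled; the Allow directive is a Googlebot-supported extension to the basic robots.txt standard. This only illustrates what the question proposes, not a recommendation to do it.

    User-agent: Googlebot
    # Hypothetical: leave only the /news/ section open to crawling
    Allow: /news/
    # Block everything else on the site
    Disallow: /

Reverting to a "normal" robots.txt a week later would then mean replacing this file with the site's usual rules.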