Today, I'll be answering Anthony's question regarding robots.txt: "Should I disallow *.css, disallow php.ini, or even disallow .htaccess? Thanks."

No. I can't disallow you from disallowing those files, but that sounds like a bad idea. You mentioned a few special cases, so let's take a look.

Disallowing *.css would block all CSS files. We need to be able to access CSS files so that we can properly render your pages. This is critical so that we can recognize when a page is mobile-friendly, for example. CSS files generally won't get indexed on their own, but we need to be able to crawl them.

You also mentioned php.ini. This is a configuration file for PHP. In general, this file should be locked down or in a special location so that nobody can access it. And if nobody can access it, then that includes Googlebot too. So again, there's no need to disallow crawling of that.

Finally, you mentioned .htaccess. This is a special control file that can't be accessed externally by default. Like other locked-down files, you don't need to explicitly disallow it from crawling, since it can't be accessed at all.

My recommendation is not to just reuse someone else's robots.txt file and assume it'll work. Instead, think about which parts of your site you really don't want to have crawled, and disallow crawling of just those.

I hope that answers your question. Stay tuned for the next episode of Ask Google Webmasters.
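As a quick illustration of the point above, here's a minimal sketch using Python's standard `urllib.robotparser` to check which URLs a Disallow rule blocks. Note that the stdlib parser only does prefix matching per the original robots.txt convention (it doesn't implement Google's `*` and `$` wildcard extensions), so the example uses a plain path prefix; the site `example.com` and the rules themselves are hypothetical.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks crawling of a CSS directory.
rules = """
User-agent: *
Disallow: /css/
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A renderer-based crawler like Googlebot would be unable to fetch the
# stylesheets it needs to render the page:
print(parser.can_fetch("Googlebot", "https://example.com/css/main.css"))  # False

# Regular pages remain crawlable:
print(parser.can_fetch("Googlebot", "https://example.com/index.html"))    # True
```

This is exactly why blocking CSS is a bad idea: the page itself stays crawlable, but the resources needed to render it correctly do not.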