I'm getting 404 errors for sitemap.xml and robots.txt when I have Browser Cache turned on. I'm using the latest versions of W3TC and WordPress, and version 4.0b11 of Arne Brachhold's Google XML Sitemaps plugin (http://www.arnebrachhold.de/projects/wordpress-plugins/google-xml-sitemaps-generator/). My site runs on Nginx 1.5.2, using the configuration generated by W3TC for Nginx.
The default 404 exception rules for W3TC are in use:
robots\.txt
sitemap(_index)?\.xml(\.gz)?
[a-z0-9_\-]+-sitemap([0-9]+)?\.xml(\.gz)?
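For context, this is roughly how I'd expect that exception list to be honored in the W3TC-generated nginx rules — requests matching the patterns falling through to WordPress rather than being answered by the static 404 handling. This is just my sketch, not the literal output of W3TC; the location block and try_files target are my assumptions:
# Sketch (my assumption of the intended behavior, not the actual generated rules):
# URLs matching the 404 exception patterns should be handed back to WordPress.
location ~ (robots\.txt|sitemap(_index)?\.xml(\.gz)?|[a-z0-9_\-]+-sitemap([0-9]+)?\.xml(\.gz)?)$ {
    try_files $uri $uri/ /index.php?$args;
}
Whatever the generated rules actually contain, the behavior I'm seeing suggests the exception list isn't being applied when Browser Cache is on.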
Here is the result of a curl header check for sitemap.xml with Browser Cache enabled:
$ curl -I http://christiaanconover.com/sitemap.xml
HTTP/1.1 404 Not Found
Server: nginx/1.5.2
Date: Thu, 04 Jul 2013 00:47:38 GMT
Content-Type: application/xml; charset=utf-8
Connection: keep-alive
X-Pingback: http://christiaanconover.com/xmlrpc.php
X-Powered-By: W3 Total Cache/0.9.2.11
X-W3TC-Minify: On
X-Robots-Tag: noindex
Turning Browser Cache off yields the following headers instead (note that the X-Powered-By: PHP header now appears, so the request is reaching WordPress):
$ curl -I http://christiaanconover.com/sitemap.xml
HTTP/1.1 200 OK
Server: nginx/1.5.2
Date: Thu, 04 Jul 2013 00:48:05 GMT
Content-Type: application/xml; charset=utf-8
Connection: keep-alive
X-Powered-By: PHP/5.4.9-4ubuntu2.1
X-Pingback: http://christiaanconover.com/xmlrpc.php
X-W3TC-Minify: On
X-Robots-Tag: noindex
Any suggestions?