CloudFlare Not Sending If-Modified-Since Headers
From my understanding and experience, it seems perhaps you've misunderstood how CDN caching works:
From the example you've given, the CDN would not ask your web server when the file was last modified, because you've already told it that the file has expired and will therefore need to be re-fetched anyway.
Web browsers will only send an If-Modified-Since header if the page/resource has previously been cached by the web browser and came with a Last-Modified header when first requested. If you send an ETag header as in your example, the handling of this is not consistent across browsers, and you may end up with an If-None-Match header in the subsequent request from the browser instead of an If-Modified-Since.
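For reference, here is a minimal sketch of how a script on the origin could answer those conditional headers with a 304 Not Modified. The ETag and last-modified values are illustrative stand-ins for your own, not your actual code:

```php
<?php
// Illustrative values only: in practice these would come from your content.
$dtLastModified = filemtime( __FILE__ );
$sETag          = '"' . md5( $dtLastModified ) . '"';

// Always advertise the validators so the browser can revalidate next time.
header( 'Last-Modified: ' . gmdate( 'D, d M Y H:i:s', $dtLastModified ) . ' GMT' );
header( 'ETag: ' . $sETag );

$sIfNoneMatch = isset( $_SERVER['HTTP_IF_NONE_MATCH'] )
    ? $_SERVER['HTTP_IF_NONE_MATCH'] : null;
$dtIfModSince = isset( $_SERVER['HTTP_IF_MODIFIED_SINCE'] )
    ? strtotime( $_SERVER['HTTP_IF_MODIFIED_SINCE'] ) : false;

// If-None-Match takes precedence over If-Modified-Since when both arrive.
if (( $sIfNoneMatch === $sETag ) ||
    ( $sIfNoneMatch === null && $dtIfModSince !== false
        && $dtIfModSince >= $dtLastModified )) {
    http_response_code( 304 ); // a 304 carries no body
    exit;
}

echo 'Full response body here...';
```

Requests without either conditional header (or with stale validators) fall through to the full response.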
CDNs, however, do not typically behave the same as web browsers; they are in some ways more similar to proxy servers or web accelerators:
If a CDN-cached page has expired, when next requested by a web browser the CDN will simply fetch a fresh copy from the web server, update the CDN-cached copy, and forward the resource back to the web browser in the HTTP response. There would normally be no requirement for If-Modified-Since headers reaching your web server at all, though my testing has revealed that CloudFlare do still issue this header.
To reduce your server load, enable the CDN to provide a level of caching beyond that already done by the web browsers, by using the s-maxage directive. It is similar to the max-age directive on your Cache-Control header, but s-maxage is observed by CDNs (and similar shared caches) in precedence over any other headers, while max-age is observed by the web-browser clients:
header( 'Cache-Control: public, max-age=20, s-maxage=60' );
If you use this header, the first request from a web browser will be a CDN MISS, but the response will then be cached at both the web browser and the CDN. After the first 20 seconds, the web-browser cached copy will expire. If the page is then reloaded, the CDN will HIT for a further 40 seconds, returning a copy from the CDN cache to the web browser in the HTTP response. Sixty seconds after the first request, the CDN cache has expired, so a subsequent request will be a CDN EXPIRED but otherwise be treated the same as a CDN MISS, taking a fresh copy from your web server; the sequence then repeats.
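To make that timeline concrete, here is a toy sketch (assumed values, not real cache code) of which tier would serve a repeat request after a given number of seconds:

```php
<?php
// Toy illustration of the max-age=20, s-maxage=60 timeline above:
// which tier serves a repeat request, given seconds elapsed since
// the first request.
function servedFrom( $iElapsedSecs, $iMaxAge, $iSMaxAge ) {
    if ( $iElapsedSecs < $iMaxAge )  return 'web-browser cache';
    if ( $iElapsedSecs < $iSMaxAge ) return 'CDN cache (HIT)';
    return 'web-server (CDN EXPIRED, re-cached)';
}

foreach ( array( 10, 30, 59, 61 ) as $iSecs )
    echo $iSecs . 's: ' . servedFrom( $iSecs, 20, 60 ) . PHP_EOL;
// 10s: web-browser cache
// 30s: CDN cache (HIT)
// 59s: CDN cache (HIT)
// 61s: web-server (CDN EXPIRED, re-cached)
```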
A more production-ready header, for web-browser caching of 1 hour, and CDN caching of 6 hours might look like:
header( 'Cache-Control: public, max-age=3600, s-maxage=21600' );
If you ever need to publish an update faster than the 6-hour CDN expiry, you can always tell the CDN to purge its cache from their web-based control panel. With this header, each resource would only be fetched from your web server 4 times a day, while the CDN handles the bulk of the web traffic.
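The purge can also be scripted. The sketch below assumes CloudFlare's v4 API purge_cache endpoint; the zone ID, API token, and URL are placeholders, and you should confirm the endpoint and authentication headers against CloudFlare's current API documentation before relying on it:

```php
<?php
// Hedged sketch: purge a single URL from the CloudFlare cache via the
// v4 API instead of the control panel. All credentials are placeholders.
$sZoneId   = 'YOUR_ZONE_ID';
$sApiToken = 'YOUR_API_TOKEN';

$rCurl = curl_init(
    'https://api.cloudflare.com/client/v4/zones/' . $sZoneId . '/purge_cache' );
curl_setopt_array( $rCurl, array(
    CURLOPT_POST           => true,
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_HTTPHEADER     => array(
        'Authorization: Bearer ' . $sApiToken,
        'Content-Type: application/json' ),
    CURLOPT_POSTFIELDS     => json_encode( array(
        'files' => array( 'https://example.com/updated-page.html' ))),
));
echo curl_exec( $rCurl ); // JSON response indicating success or failure
curl_close( $rCurl );
```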
Edited on 11-Nov-2014 @ 12:45pm UTC-0:
On a side note that might also have an impact: there are issues with your PHP code that may be affecting its correct operation. When I test them, your ETag and Expires header lines of code produce the following headers:
ETag: W/""
Expires: Thu, 01 Jan 1970 00:00:20 GMT
Perhaps instead try these lines in your test so that you can see your sent headers too:
<?php
$iClientCacheSecs = 20;
$iProxyCacheSecs  = 60;

$dtNow     = time();
$dtExpires = strtotime( sprintf( '+%s seconds', $iClientCacheSecs ));

// Build the caching headers in an array so they can be echoed below too.
$aHeaders   = array();
$aHeaders[] = 'ETag: ' . $dtNow;
$aHeaders[] = 'Expires: ' . date( 'r', $dtExpires );
$aHeaders[] = 'Last-Modified: ' . date( 'r', $dtNow );
$aHeaders[] = sprintf( 'Cache-Control: public, max-age=%s, s-maxage=%s',
    $iClientCacheSecs, $iProxyCacheSecs );

foreach( $aHeaders as $sHeader ) header( $sHeader );

// Echo the sent headers into the page body for easy inspection.
echo( 'Now: ' . date( 'r', $dtNow ) . '<br />' );
foreach( $aHeaders as $sHeader ) echo( $sHeader . '<br />' );
echo( '<hr />' );

// Show any CloudFlare (HTTP_CF_*) and conditional (HTTP_IF_*) request headers.
foreach( $_SERVER as $sParam => $sValue ) {
    if(( strpos( $sParam, 'HTTP_CF' )) !== false )
        echo( $sParam . ': ' . $sValue . '<br />' );
    if(( strpos( $sParam, 'HTTP_IF' )) !== false )
        echo( $sParam . ': ' . $sValue . '<br />' );
}