Last modified: 2009-01-02 11:07:53 UTC
includes/RawPage.php (line 16):
$smaxage = $this->mRequest->getIntOrNull( 'smaxage', $wgSquidMaxage );
This doesn't match the getIntOrNull() definition in includes/WebRequest.php, which takes only one parameter.
So if there is no 'smaxage' parameter in the web request, the $smaxage variable is null.
Later in the code, if the raw page is not generated (the 'gen' request parameter is not set), $smaxage is set to 0, which eventually causes a "Cache-Control: private" header to be sent.
Is this the desired behavior? Wouldn't getInt( 'smaxage', $wgSquidMaxage ) be better here?
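To illustrate the difference, here is a minimal sketch (FakeWebRequest is a hypothetical stand-in, not MediaWiki's actual WebRequest class; the method bodies only paraphrase the documented behavior). PHP silently ignores extra arguments to user-defined methods, so the $wgSquidMaxage passed to getIntOrNull() is simply discarded:

```php
<?php
// Hypothetical stand-in for MediaWiki's WebRequest, for illustration only.
class FakeWebRequest {
    private $data;

    function __construct( array $data ) {
        $this->data = $data;
    }

    // getIntOrNull() takes only the parameter name; there is no default,
    // so a missing parameter always yields null.
    function getIntOrNull( $name ) {
        return isset( $this->data[$name] ) ? intval( $this->data[$name] ) : null;
    }

    // getInt() accepts a default value, which is what the patch switches to.
    function getInt( $name, $default = 0 ) {
        return isset( $this->data[$name] ) ? intval( $this->data[$name] ) : $default;
    }
}

$wgSquidMaxage = 18000;
$req = new FakeWebRequest( array() ); // no 'smaxage' in the request

// The second argument is accepted by PHP but silently discarded:
var_dump( $req->getIntOrNull( 'smaxage', $wgSquidMaxage ) ); // NULL
var_dump( $req->getInt( 'smaxage', $wgSquidMaxage ) );       // int(18000)
```

With getIntOrNull(), a request without 'smaxage' falls through to the null branch; with getInt() and a default, it picks up $wgSquidMaxage instead.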
Attached is a patch and a result of it on one wiki where I tested it.
Created attachment 4108
Created attachment 4109
a cache hit ratio graph for an example wiki
After applying this patch, the cache hit ratio jumped from 65% to 80%.
It looks like the intention is to avoid a surprising s-maxage for things that aren't CSS or JS. Extra backend caching could surprise various bots and other tools trying to load pages via the raw interface.
r27456 adds some extra s-maxage forcing for CSS and JS pages that aren't made via the 'gen' option, which may partially obviate this patch.
Personally I think this whole thing is a big mess and probably needs a serious overhaul. :( :)
Fixed in r45160
Reverted in r45251 -- this rev changes the behavior, forcing $smaxage to $wgSquidMaxage in cases where we would have previously ended up with $wgForcedRawSMaxage or 0.
Misread this. The "default" param has no effect here.
Removed in r45312.