Bug 11330 - Improper use of WebRequest::getIntOrNull()
Status: RESOLVED FIXED
Product: MediaWiki
Classification: Unclassified
Component: Parser
Version: 1.12.x
Hardware: All
OS: All
Importance: Low trivial
Target Milestone: ---
Assigned To: Nobody - You can work on this!
URL:
Depends on:
Blocks:

Reported: 2007-09-14 08:31 UTC by Emil Podlaszewski
Modified: 2009-01-02 11:07 UTC
CC: 2 users

See Also:
Web browser: ---
Mobile Platform: ---
Assignee Huggle Beta Tester: ---


Attachments:
a patch (1.16 KB, patch) - 2007-09-14 08:32 UTC, Emil Podlaszewski
a cache hit ratio graph for an example wiki (29.13 KB, image/png) - 2007-09-14 08:34 UTC, Emil Podlaszewski

Description Emil Podlaszewski 2007-09-14 08:31:41 UTC
includes/RawPage.php (line 16):
$smaxage = $this->mRequest->getIntOrNull( 'smaxage', $wgSquidMaxage );

This doesn't match the definition of getIntOrNull() in includes/WebRequest.php, which takes only one parameter; PHP silently discards the extra argument.
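
For reference, a paraphrase of the two methods in includes/WebRequest.php as of the 1.12 era (a sketch, not the exact source; getVal() is simplified, the real method does more):

class WebRequest {
    function getVal( $name, $default = null ) {
        return isset( $_GET[$name] ) ? $_GET[$name] : $default;
    }

    // Takes a default; returns it (as an int) when the param is absent.
    function getInt( $name, $default = 0 ) {
        return intval( $this->getVal( $name, $default ) );
    }

    // Takes only ONE parameter; any second argument from a caller
    // is silently discarded, and a missing param yields null.
    function getIntOrNull( $name ) {
        $val = $this->getVal( $name );
        return is_null( $val ) ? null : intval( $val );
    }
}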

So if there is no 'smaxage' parameter in the web request, the $smaxage variable ends up null instead of defaulting to $wgSquidMaxage.

Later in the code, if the raw page is not generated (the 'gen' request parameter is not set), $smaxage is set to 0, which eventually causes a "Cache-Control: private" header to be sent.


Is this the desired behavior? Wouldn't using getInt( 'smaxage', $wgSquidMaxage ) be better here?
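
The change would be a single line in includes/RawPage.php (a sketch of the idea; see attachment 4108 for the actual patch):

// before: the second argument is silently ignored, so $smaxage can be null
$smaxage = $this->mRequest->getIntOrNull( 'smaxage', $wgSquidMaxage );

// after: the default genuinely applies when 'smaxage' is absent
$smaxage = $this->mRequest->getInt( 'smaxage', $wgSquidMaxage );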


Attached are a patch and the effect it had on one wiki where I tested it.
Comment 1 Emil Podlaszewski 2007-09-14 08:32:24 UTC
Created attachment 4108
a patch
Comment 2 Emil Podlaszewski 2007-09-14 08:34:03 UTC
Created attachment 4109
a cache hit ratio graph for an example wiki

After applying this patch, the cache hit ratio jumped from 65% to 80%.
Comment 3 Brion Vibber 2008-02-13 01:27:44 UTC
It looks like the intention is to avoid a surprising s-maxage for things that aren't CSS or JS. Extra backend caching could surprise various bots and other tools trying to load pages via the raw interface.

r27456 adds some extra s-maxage forcing for CSS and JS pages that aren't made via the 'gen' option, which may partially obviate the need for this patch.
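
The shape of that check, paraphrased (not the committed diff; $ctype stands in for the raw page's content type):

// When no explicit smaxage was requested, force a Squid-cacheable
// s-maxage only for CSS/JS output; plain wikitext fetched via
// action=raw stays uncached so bots see fresh content.
if ( $smaxage === null ) {
    if ( $ctype == 'text/css' || $ctype == 'text/javascript' ) {
        $smaxage = intval( $wgForcedRawSMaxage );
    } else {
        $smaxage = 0;
    }
}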

Personally I think this whole thing is a big mess and probably needs a serious overhaul. :( :)
Comment 4 Aaron Schulz 2008-12-29 18:58:39 UTC
Fixed in r45160
Comment 5 Brion Vibber 2008-12-31 21:06:14 UTC
Reverted in r45251 -- this rev changes the behavior, forcing $smaxage to $wgSquidMaxage in cases where we would have previously ended up with $wgForcedRawSMaxage or 0. 
Comment 6 Aaron Schulz 2009-01-02 11:07:53 UTC
Misread this. The "default" param has no effect here.

Removed in r45312.
