Last modified: 2014-05-05 11:33:13 UTC
Hello, I've recently upgraded from MediaWiki 1.19 to MediaWiki 1.20.3. I'm getting PHP errors "Allowed memory size of ... bytes exhausted" on some pages with many templates (in includes/parser/Preprocessor_DOM.php on line 1029). I understand that this is normal in principle; however, with MediaWiki 1.19 my memory_limit was set to 64M and it worked perfectly. I've increased the value to 128M, but some pages still fail to render (the same pages that rendered fine before the upgrade with only 64M). Perhaps there were changes to the parser or preprocessor that could account for this? A memory leak somewhere? The pages in question are about 200 KB of wikitext and produce 1-2 MB of HTML, but they contain some heavy templates with more than 10 parameters each (multiple invocations of en.wikipedia.org/wiki/Template:Chess_diagram). I have a feeling that every template invocation is being kept in memory even when this is not really needed. If that is the issue, it needs to be optimized.
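For reference, this is how I raised the limit; a minimal sketch of the relevant LocalSettings.php line (the value shown is just an example of the workaround, not a fix for the underlying growth):

```php
# LocalSettings.php -- raise PHP's memory limit for MediaWiki requests only,
# instead of changing memory_limit globally in php.ini (workaround sketch)
ini_set( 'memory_limit', '128M' );
```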
Please provide more info about the exact pages and the setup (database version etc.). Also see http://www.mediawiki.org/wiki/Manual:How_to_debug
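For example, a minimal debug setup in LocalSettings.php might look like the following (the log path is a placeholder; adjust it for your server):

```php
# LocalSettings.php -- debugging sketch, per Manual:How_to_debug
$wgDebugLogFile         = '/tmp/mediawiki-debug.log'; // per-request debug log, includes memory usage lines
$wgShowExceptionDetails = true;                       // show full exception backtraces
```

The resulting debug log can then be attached to this report.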
Created attachment 11929 [details] An article that can't be rendered with 128M memory_limit (XML, for Special:Import)
Created attachment 11930 [details] The MediaWiki debug log
Created attachment 11931 [details] Nginx error log (with PHP message "Allowed memory size of ... bytes exhausted")
The database is MySQL 5.5. $wgMainCacheType = CACHE_ACCEL; $wgParserCacheType = CACHE_DBA; # db4
Note the unusual number of image operations in mediawiki.log. The template includes only two images, and the article includes this template, say, 200 times. It looks like MediaWiki does something with the images 400 times (instead of 2 times).
Created attachment 11932 [details] The MediaWiki debug log for another page (which parsed successfully, but took almost 64M) Note the amount of memory used in Parser::braceSubstitution and PPFrame_DOM::expand.
(a strategic suggestion) Even if we assume that the memory usage is what it should be: why should one have to increase the PHP memory_limit just because it is not enough to parse 10-20 pages on the whole wiki (all other pages require much less memory)? MediaWiki should anticipate cases of high memory usage and handle them (by spilling to a temporary file, for example) instead of letting PHP crash.