Last modified: 2011-03-13 18:04:31 UTC
MediaWiki was running just fine on my site until I edited http://www.autistichat.net/wiki/MediaWiki:Copyright, at which point it stopped sending any data back to the browser. I got the following message in the Apache error log:

Allowed memory size of 20971520 bytes exhausted (tried to allocate 12228096 bytes)

Turning off $wgUseDatabaseMessages brought the wiki back to life, but left it in a rather nasty state. I tried increasing the memory setting in LocalSettings.php from 20 to 30 (MB), but that just changed the error in the Apache error log to:

Allowed memory size of 31457280 bytes exhausted (tried to allocate 24456192 bytes)

Does anyone have any idea what could cause this or how to fix it? I'm willing to take steps to track down the bug, but first I need to know what those steps are. It looks to me like it's getting into some kind of infinite loop, allocating memory until it runs out. The system is Debian Woody, btw.
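For reference, the memory-limit change mentioned above would look something like the following in LocalSettings.php. This is a sketch, not the exact line from the reporter's install; 1.4-era MediaWiki typically raised the limit via ini_set(), and the call only takes effect if PHP permits memory_limit to be changed at runtime:

```php
<?php
# LocalSettings.php fragment (sketch): raise PHP's per-request memory
# cap from 20M to 30M. 30M = 31457280 bytes, which matches the figure
# in the second error message above.
ini_set( 'memory_limit', '30M' );
```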
It seems to have something to do with the messages loading from the database, and it's apparently not plain volume of text:

[18:42] <plugwash> mysql> select sum(length(cur_text)) from cur;
[18:42] <plugwash> +-----------------------+
[18:42] <plugwash> | sum(length(cur_text)) |
[18:42] <plugwash> +-----------------------+
[18:42] <plugwash> |                 54795 |
[18:42] <plugwash> +-----------------------+
[18:42] <plugwash> 1 row in set (0.00 sec)
[18:42] <plugwash> mysql> select sum(length(cur_text)) from cur where cur_namespace=8;
[18:42] <plugwash> +-----------------------+
[18:42] <plugwash> | sum(length(cur_text)) |
[18:42] <plugwash> +-----------------------+
[18:42] <plugwash> |                 44296 |
[18:42] <plugwash> +-----------------------+
[18:42] <plugwash> 1 row in set (0.00 sec)
<--some other chat on other topics snipped-->
[18:43] <plugwash> does that look normal to you?
[18:43] <dammit> yes, absolutely normal.
Commenting out the following lines in the get() function of includes/MessageCache.php seems to make it work:

if( $this->mDeferred ) {
    $this->load();
}
(In reply to comment #2)
> commenting out the following lines in the get function of
> includes/MessageCache.php seems to make it work

Will this problem be fixed in MediaWiki 1.4.1? My server is also running Debian Woody, and the problem is exactly the same as described above.
Another person ran into this on IRC. I was able to replicate the problem on a local Debian Woody installation with a partial copy of their database, and isolated the giant memory allocation to the gzinflate() call in SqlBagOStuff::_unserialize() when loading the objectcache record for the messages.

This seems to be a bug in either the old version of PHP or the old version of zlib in Woody: if I dump the compressed data and try to read it back with a minimal PHP script, it fails the same way on Woody (it claims to want to allocate a huge amount of memory, hitting the 8 MB memory limit) but works fine on a modern Ubuntu install with that same 8 MB limit.

As a workaround, you can disable compression in the object cache: comment out the bits that use gzdeflate() and gzinflate() and let it deal in raw text.
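The workaround would look roughly like this in the SqlBagOStuff class (in includes/ObjectCache.php or wherever it lives in your 1.4 tree). This is a sketch under the assumption that _serialize()/_unserialize() wrap gzdeflate()/gzinflate() as described above; check the method bodies in your own copy before editing:

```php
# SqlBagOStuff fragment (sketch): bypass zlib entirely so the buggy
# Woody gzinflate() is never called. Cached objects are stored as
# plain serialized text instead of deflated data.
function _serialize( $data ) {
    # was something like: return gzdeflate( serialize( $data ) );
    return serialize( $data );
}

function _unserialize( $serial ) {
    # was something like: return unserialize( gzinflate( $serial ) );
    return unserialize( $serial );
}
```

Note that entries already stored in compressed form won't read back after this change, so you would likely want to clear out the objectcache table afterwards.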
Created attachment 582 [details]
Test case

This tarball, created by brion, demonstrates the bug. It is known to fail on Debian Woody and to succeed on at least Ubuntu Breezy.
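The test case amounts to a minimal script along the following lines. This is a sketch of the approach described in comment above (dump compressed data, read it back under a small memory limit), not the literal contents of the attachment:

```php
<?php
# Minimal repro (sketch): round-trip a blob through gzdeflate()/gzinflate()
# under a small memory limit. On Woody's PHP/zlib, gzinflate() reportedly
# fails by trying to allocate a huge buffer; on newer systems it succeeds.
ini_set( 'memory_limit', '8M' );

$data       = str_repeat( 'MediaWiki message cache test. ', 1000 );
$compressed = gzdeflate( $data );
$restored   = gzinflate( $compressed );

var_dump( $restored === $data );  # bool(true) on a non-buggy zlib
```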
Upstream fix. Debian 3.1 has been out since June 6th, 2005; upgrading to it should solve the issue.
Changing resolution to WONTFIX, since this bug hasn't actually been fixed by us; it's just that Woody is no longer worth caring about.