
Wikimedia Bugzilla is closed!

Wikimedia migrated from Bugzilla to Phabricator. Bug reports are now handled in Wikimedia Phabricator.
This static website is read-only and kept for historical purposes. It is not possible to log in, and beyond displaying bug reports and their history, links may be broken. See T3730, the corresponding Phabricator task, for complete and up-to-date bug report information.
Bug 1730 - MediaWiki suddenly started causing a memory limit error even if I increase the limit
Status: RESOLVED WONTFIX
Product: MediaWiki
Classification: Unclassified
Component: General/Unknown (Other open bugs)
Version: 1.4.x
Hardware: PC Linux
Importance: Lowest major (vote)
Target Milestone: ---
Assigned To: Nobody - You can work on this!
Depends on:
Blocks:

Reported: 2005-03-21 17:50 UTC by peter green
Modified: 2011-03-13 18:04 UTC (History)
CC List: 0 users

See Also:
Web browser: ---
Mobile Platform: ---

Attachments
Test case (24.03 KB, application/x-compressed-tar)
2005-05-31 10:09 UTC, Michael (Micksa) Slade

Description peter green 2005-03-21 17:50:29 UTC
MediaWiki was running just fine on my site until I edited

http://www.autistichat.net/wiki/MediaWiki:Copyright

then it stopped sending any data back to the browser.

I got the following message in the Apache error log:

Allowed memory size of 20971520 bytes exhausted (tried to allocate 12228096 bytes)

I turned off $wgUseDatabaseMessages and that brought the wiki back to
life, but left it in a rather nasty state.
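
For the record, that workaround is a one-line change in LocalSettings.php ($wgUseDatabaseMessages is the real setting name; the trailing comment is just an annotation):

$wgUseDatabaseMessages = false; # use the built-in messages instead of reading the MediaWiki: namespace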

I tried increasing the memory setting in LocalSettings.php from 20 to
30, but that just changed the error in the Apache error log to

Allowed memory size of 31457280 bytes exhausted (tried to allocate 24456192 bytes)
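
(In a 1.4-era LocalSettings.php the limit is set with ini_set(); the exact line may vary by installation, but the change was roughly this:)

ini_set( 'memory_limit', '30M' ); # was '20M'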

Does anyone have any idea what could cause this or how to fix it? I'm
willing to take steps to track down the bug, but first I need to know
what those steps are.

It looks to me like it's getting into some kind of infinite loop,
allocating memory until it runs out.

The system is Debian Woody, btw.
Comment 1 peter green 2005-03-21 18:48:35 UTC
It seems to have something to do with the messages loading from the
database, and it's apparently not the sheer volume of text:

[18:42] <plugwash> mysql> select sum(length(cur_text)) from cur;
[18:42] <plugwash> +-----------------------+
[18:42] <plugwash> | sum(length(cur_text)) |
[18:42] <plugwash> +-----------------------+
[18:42] <plugwash> |                 54795 |
[18:42] <plugwash> +-----------------------+
[18:42] <plugwash> 1 row in set (0.00 sec)
[18:42] <plugwash> mysql> select sum(length(cur_text)) from cur where cur_namespace=8;
[18:42] <plugwash> +-----------------------+
[18:42] <plugwash> | sum(length(cur_text)) |
[18:42] <plugwash> +-----------------------+
[18:42] <plugwash> |                 44296 |
[18:42] <plugwash> +-----------------------+
[18:42] <plugwash> 1 row in set (0.00 sec)
<--some other chat on other topics snipped-->
[18:43] <plugwash> does that look normal to you?
[18:43] <dammit> yes, absolutely normal.
Comment 2 peter green 2005-03-21 18:59:42 UTC
Commenting out the following lines in the get function of
includes/MessageCache.php seems to make it work:

if( $this->mDeferred ) {
        $this->load();
}
Comment 3 Daniel Beyer 2005-04-05 20:27:59 UTC
(In reply to comment #2)
> commenting out the following lines in the get function of
> includes/MessageCache.php seems to make it work

Will this problem be solved in MediaWiki 1.4.1? My server is also
running Debian Woody, and the problem is exactly the same as described
above.
Comment 4 Brion Vibber 2005-05-30 11:45:40 UTC
Another person ran into this on IRC. I was able to replicate the 
problem on a local Debian Woody installation with a partial copy of 
their database, and isolated the giant memory allocation to the 
gzinflate() call in SqlBagOStuff::_unserialize() when loading the 
objectcache record for the messages.

This seems to be a bug in either the old version of PHP or the old 
version of zlib in Woody; if I dump the compressed data and try to 
read it back with a minimal PHP script, it fails this way on Woody 
(claims to want to allocate a huge amount of memory, hitting the 8 MB
memory limit) but works fine on a modern Ubuntu install with that
same 8 MB memory limit.
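
The minimal test script was along these lines (a sketch, not the exact
script; the dump file name is only an example):

<?php
# Inflate a dump of the compressed objectcache value under the same
# 8 MB limit; on Woody this dies requesting a huge allocation.
ini_set( 'memory_limit', '8M' );
$compressed = file_get_contents( 'objectcache-dump.bin' );
$data = gzinflate( $compressed );
echo $data === false ? "gzinflate failed\n" : strlen( $data ) . " bytes inflated\n";
?>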

As a workaround, you can disable compression in the object cache:
comment out the bits that use gzdeflate and gzinflate and let it deal
in raw text.
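
Concretely, that means neutering the compress/decompress steps in
SqlBagOStuff (includes/BagOStuff.php); the exact method bodies differ
between releases, so treat this as a sketch:

function _serialize( $data ) {
        $serial = serialize( $data );
        # Compression disabled to work around the Woody zlib bug:
        # if( function_exists( 'gzdeflate' ) ) { $serial = gzdeflate( $serial ); }
        return $serial;
}

function _unserialize( $serial ) {
        # Decompression disabled; this gzinflate() call triggered the
        # runaway allocation:
        # if( function_exists( 'gzinflate' ) ) { $serial = gzinflate( $serial ); }
        return unserialize( $serial );
}

(Existing compressed rows in the objectcache table would need to be
cleared after this change, since they can no longer be read back.)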
Comment 5 Michael (Micksa) Slade 2005-05-31 10:09:49 UTC
Created attachment 582
Test case

This tarball, created by Brion, demonstrates the bug. It is known to
fail on Debian Woody and to succeed on at least Ubuntu Breezy.
Comment 6 Antoine "hashar" Musso (WMF) 2005-07-12 20:00:31 UTC
Upstream fix.

Debian 3.1 has been out since June 6th, 2005; upgrading should solve the issue.
Comment 7 peter green 2005-07-12 21:04:43 UTC
Changing resolution to WONTFIX since this bug hasn't actually been
fixed by us; it's just that Woody is no longer worth caring about.
