Last modified: 2014-02-10 09:13:02 UTC

Wikimedia Bugzilla is closed!

Wikimedia has migrated from Bugzilla to Phabricator. Bug reports should be created and updated in Wikimedia Phabricator instead. Please create an account in Phabricator and add your Bugzilla email address to it.
Wikimedia Bugzilla is read-only. If you try to edit or create any bug report in Bugzilla you will be shown an intentional error message.
In order to access the Phabricator task corresponding to a Bugzilla report, just remove "static-" from its URL.
You can still run searches in Bugzilla or access your list of votes, but bug reports in Bugzilla will obviously not be up to date.
Bug 24647 - Job queue memory usage
Status: NEW
Product: MediaWiki
Classification: Unclassified
Component: JobQueue (Other open bugs)
Hardware: All  OS: All
Importance: Low normal (vote)
Target Milestone: ---
Assigned To: Nobody - You can work on this!
Depends on:
Reported: 2010-08-03 18:57 UTC by Siebrand Mazeland
Modified: 2014-02-10 09:13 UTC (History)
CC: 5 users

See Also:
Web browser: ---
Mobile Platform: ---
Assignee Huggle Beta Tester: ---


Description Siebrand Mazeland 2010-08-03 18:57:52 UTC
I haven't been able to pinpoint this exactly, but something is wrong with the job queue. I've experienced this, for example, while it was processing edit jobs for ReplaceText, but also when changing a template that's used a few hundred times, as well as when using the Translate extension script 'fuzzy.php'.

In the cases of Translate and ReplaceText, actual edits are made; for a template change, other things happen that I'm not really up to speed on (link table updates?).

Generally, at some point the script will exit with "Allowed memory size exhausted" (at about 150M), even though, as far as I know, command-line scripts shouldn't have a default memory limit.

I once looked into this and found that the job queue's memory usage keeps growing with each job it handles. I forget the actual number, but it was many megabytes per job at the time (12M comes to mind; it probably depends on the exact wiki properties). All attempts at "garbage collection" by setting references to null failed to reduce memory usage.
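The "setting references to null" attempt described above can be sketched roughly as follows. This is a hypothetical illustration, not MediaWiki's actual runJobs.php: `FakeJob` and `runJobsSketch` are made-up names, and `FakeJob` simulates per-job state containing a reference cycle. Dropping the reference only helps if the object isn't part of a cycle; `gc_collect_cycles()`, available since PHP 5.3, reclaims cyclic garbage as well, which is why the PHP version matters here (see comment 2).

```php
<?php
// Hypothetical stand-in for a job object; the reference cycle below
// simulates the kind of cyclic structure a plain refcount can't free.
class FakeJob {
    public $payload;
    public $self;

    public function __construct() {
        $this->payload = str_repeat('x', 1024); // simulated per-job state
        $this->self = $this;                    // deliberate reference cycle
    }

    public function run(): bool {
        return true;
    }
}

// Sketch of the pattern from the report: run each job, then drop the
// reference and force cycle collection so memory can't accumulate.
function runJobsSketch(array $jobs): int {
    $done = 0;
    foreach ($jobs as $i => $job) {
        if ($job->run()) {
            $done++;
        }
        unset($jobs[$i], $job); // release our references to the job
        gc_collect_cycles();    // PHP >= 5.3: also reclaim reference cycles
    }
    return $done;
}
```

If memory still grows with this pattern, the leak is usually in state held elsewhere (caches, static properties, link-table update buffers) rather than in the job objects themselves.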
Comment 1 Siebrand Mazeland 2010-08-03 19:03:30 UTC
The 150M memory limit is hard-coded in runJobs.php, but that's not really the issue, obviously. The issue is the growing memory usage.
Comment 2 Max Semenik 2012-12-21 21:44:41 UTC
Is this still an issue? As far as I remember, TWN was running PHP 5.2 back then, while PHP 5.3 should have much better memory management.
Comment 3 Nemo 2014-02-10 09:13:02 UTC
(In reply to comment #0)
> Generally at some point in time the script will exit with "Allowed memory
> size exhausted" (at about 150M)

Heh, good old times. Nowadays even 550M is not enough: see bug 58969 and bug 60844. No idea what to do with this bug; it seems superseded now that MediaWiki being a memory hog is standard?
