Last modified: 2014-02-10 09:13:02 UTC
I haven't been able to pinpoint this exactly, but something is wrong with the job queue. I've experienced this, for example, while it was processing edit jobs for ReplaceText, but also after changing a template that's used on a few hundred pages, as well as when using the Translate extension script 'fuzzy.php'.
In the cases of Translate and ReplaceText, actual edits are made; for a template change, other things happen that I'm not really up to speed with (link table updates?).
Generally, at some point the script will exit with "Allowed memory size exhausted" (at translatewiki.net at about 150M), even though, as far as I know, command-line scripts shouldn't have a default memory limit(?).
I looked into this once and found that the job queue's memory usage keeps growing with each job it handles. I forget the actual number, but it was many megabytes per job at the time (12M comes to mind; it probably depends on the exact wiki's properties). All attempts at "garbage collection", i.e. setting references to null to reduce memory usage, failed.
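To make the per-job growth measurable, a check along these lines could confirm the leak. This is a hypothetical sketch, not the actual runJobs.php code; `measureJobMemory` and the job objects are stand-ins for MediaWiki's Job classes:

```php
<?php
// Hypothetical helper (not from runJobs.php): record how much memory each
// job leaves behind after we drop our own reference and force collection.
function measureJobMemory( iterable $jobs ): array {
	$deltas = [];
	foreach ( $jobs as $job ) {
		$before = memory_get_usage();
		$job->run();
		// Drop our reference and collect cycles, so any remaining growth
		// is memory the job (or something it touched) failed to release.
		unset( $job );
		gc_collect_cycles();
		$deltas[] = memory_get_usage() - $before;
	}
	return $deltas;
}
```

If the deltas stay in the megabyte range per job, as described above, the leak is in the jobs or shared state they touch, not in the runner loop itself.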
The 150M memory limit is hard coded in runJobs.php, but that's not really the issue, obviously. It's the growing memory usage.
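For reference, the limit is presumably applied with something like the snippet below. This is a guess at the relevant line (the exact code varies between MediaWiki versions), and raising it only postpones the crash rather than fixing the leak:

```php
<?php
// Hypothetical reconstruction of the hard-coded limit in runJobs.php;
// the real line may look different in your MediaWiki version.
ini_set( 'memory_limit', '150M' );

// If your version's maintenance framework supports it, the limit can be
// overridden per run instead of editing the script:
//   php maintenance/runJobs.php --memory-limit max
```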
Is this still an issue? As far as I remember, TWN was on PHP 5.2 back then, while PHP 5.3 should have much better memory management.
(In reply to comment #0)
> Generally at some point in time the script will exit with "Allowed memory
> size exhausted" (at translatewiki.net about 150M)
Heh, good old times. Nowadays even 550M is not enough: see bug 58969 and bug 60844. No idea what to do with this bug; it seems it's been superseded by MediaWiki's memory hogs becoming the norm?