Last modified: 2010-05-06 16:58:57 UTC

Wikimedia Bugzilla is closed!

Wikimedia migrated from Bugzilla to Phabricator. Bug reports are handled in Wikimedia Phabricator.
This static website is read-only and exists for historical purposes only. Logging in is not possible, and apart from displaying bug reports and their history, links may be broken. See T24807, the corresponding Phabricator task, for complete and up-to-date bug report information.
Bug 22807 - Increase $wgExpensiveParserFunctionLimit to 1000 in pt.wiktionary
Status: RESOLVED INVALID
Product: Wikimedia
Classification: Unclassified
Component: Site requests
Version: unspecified
Hardware: All
OS: All
Importance: Normal trivial
Target Milestone: ---
Assigned To: Nobody - You can work on this!
Keywords: shell
Depends on:
Blocks:
Reported: 2010-03-11 18:24 UTC by Malafaya
Modified: 2010-05-06 16:58 UTC
CC: 3 users

See Also:
Web browser: ---
Mobile Platform: ---
Assignee Huggle Beta Tester: ---


Attachments

Description Malafaya 2010-03-11 18:24:37 UTC
Currently the limit for $wgExpensiveParserFunctionLimit is 500 at pt.wiktionary.
We need this raised to around 1000, which is still acceptable, because we currently have some six pages that issue between 500 and 800 calls.
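For illustration, the behaviour behind this request can be modelled as a simple counter. This is a hypothetical Python sketch, not MediaWiki's actual PHP implementation; the real change would be a one-line edit to $wgExpensiveParserFunctionLimit in the site configuration.

```python
# Hypothetical model of MediaWiki's expensive-parser-function counter.
# The real logic lives in the PHP parser; this only illustrates why a
# page issuing 500-800 expensive calls fails at limit 500 but would
# render fully at limit 1000.

class ExpensiveFunctionCounter:
    def __init__(self, limit):
        self.limit = limit  # corresponds to $wgExpensiveParserFunctionLimit
        self.count = 0

    def increment(self):
        """Return True while under the limit, False once it is exceeded."""
        self.count += 1
        return self.count <= self.limit


def page_renders_fully(num_expensive_calls, limit):
    """True if every expensive call on the page stays within the limit."""
    counter = ExpensiveFunctionCounter(limit)
    return all(counter.increment() for _ in range(num_expensive_calls))
```

For example, a page with 800 expensive calls fails under the current limit of 500 but would pass under the requested limit of 1000.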
Comment 1 Mike.lifeguard 2010-04-26 07:19:35 UTC
On what basis are you asserting that 1000 is "still acceptable"? Surely that's a call for a sysadmin to make...
Comment 2 Max Semenik 2010-04-26 17:13:16 UTC
If you have just several pages that exceed this limit, fixing them would make much more sense.
Comment 3 Malafaya 2010-04-27 17:46:28 UTC
Mike, as we have only half a dozen pages in this situation, I believe that it is acceptable. We're just doubling the limit for these exceptions. Bear in mind I'm making this assertion solely for this wiki, not Wikimedia-wide.

Max, if they were fixable, I would fix them. :)

Ideally, bug 22808 is the best solution: caching #ifexist results and/or not counting duplicates toward the 500 threshold.
Comment 4 Mike.lifeguard 2010-05-02 20:56:39 UTC
(In reply to comment #3)
> Mike, as we do have only half a dozen pages in this situation

If you have only a half-dozen pages, it isn't so much work to fix them.
Comment 5 Platonides 2010-05-02 21:34:47 UTC
Maybe you could provide the links to such pages and explain why they can't be made less expensive?
Comment 6 Malafaya 2010-05-04 15:07:26 UTC
Example in the following category:
http://pt.wiktionary.org/wiki/Categoria:P%C3%A1ginas_com_demasiadas_chamadas_a_fun%C3%A7%C3%B5es_exigentes

As these pages have a lot of translations (in excess of 500), and each translation requires an #ifexist, they are considered expensive, even though most of the #ifexist calls are duplicates (in fact, there are only some 200-300 unique #ifexist calls).
This is because the text of the language name does not necessarily correspond to the link. This is accomplished with a call to #ifexist with a template parameter based on the language code. If that template exists, it is used instead of the direct language name.
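The deduplication idea mentioned above (caching #ifexist results so that repeated checks of the same title count only once against the limit) can be sketched as follows. This is a hypothetical Python illustration with an invented `lookup` stub; MediaWiki's actual counter is implemented in PHP.

```python
# Hypothetical sketch of caching #ifexist results so that duplicate
# checks of the same title are not counted as separate expensive calls.
# `lookup` stands in for the real database existence query.

def count_expensive_ifexists(titles, lookup):
    """Return (results, expensive_call_count) with per-title caching."""
    cache = {}
    expensive_calls = 0
    results = []
    for title in titles:
        if title not in cache:
            cache[title] = lookup(title)  # only unique titles hit the database
            expensive_calls += 1
        results.append(cache[title])
    return results, expensive_calls
```

Under such a scheme, a page making 800 #ifexist calls over only 300 unique titles would count as 300 expensive calls and stay under the existing limit of 500.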
Comment 7 Platonides 2010-05-04 15:13:09 UTC
#ifexist should perform its queries as a linkbatch, but it would require a bit of work to move things around.
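The linkbatch suggestion amounts to collecting all the titles first and resolving their existence in a single query, rather than issuing one query per #ifexist. A hypothetical Python sketch of that idea (MediaWiki's actual LinkBatch mechanism is PHP, and `query_existing_subset` is an invented stand-in for the batched database lookup):

```python
# Hypothetical sketch of the batching idea: gather every title up front,
# issue one existence query for the whole set, then answer each #ifexist
# from the result, so the page costs a single database round trip.

def batch_exists(titles, query_existing_subset):
    """Map each title to True/False using one batched existence query.

    query_existing_subset(titles_set) -> the subset that exists, in ONE query.
    """
    unique = set(titles)
    existing = query_existing_subset(unique)  # single batched lookup
    return {t: (t in existing) for t in unique}
```

The design trade-off Platonides alludes to is that the parser would need to know all the titles before expanding the #ifexist calls, which is why "it would require a bit of work to move things around."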
Comment 8 Malafaya 2010-05-05 10:22:51 UTC
This is not needed anymore.
I reformed the template logic behind the translations.
Thanks nevertheless.
(Hmmm, it seems I can only mark this bug as Assigned or Resolved. Isn't there a Delete/Drop/WontFix choice?)
Comment 9 p858snake 2010-05-05 10:49:29 UTC
Apparently it's fixed... WONTFIXED. Reopen if it's not.

