Last modified: 2011-03-13 18:04:26 UTC

Wikimedia Bugzilla is closed!

Wikimedia has migrated from Bugzilla to Phabricator. Bug reports should be created and updated in Wikimedia Phabricator instead. Please create an account in Phabricator and add your Bugzilla email address to it.
Wikimedia Bugzilla is read-only. If you try to edit or create any bug report in Bugzilla you will be shown an intentional error message.
In order to access the Phabricator task corresponding to a Bugzilla report, just remove "static-" from its URL.
You can still run searches in Bugzilla and access your list of votes, but bug reports in Bugzilla will no longer be kept up to date.
Bug 14425 - Allow multiple categories for categorymembers list
Status: RESOLVED WONTFIX
Product: MediaWiki
Classification: Unclassified
Component: API (Other open bugs)
Version: unspecified
Hardware: All
OS: All
Importance: Lowest enhancement (vote)
Target Milestone: ---
Assigned To: Roan Kattouw
URL: http://th.wikipedia.org/w/api.php
Depends on:
Blocks:
Reported: 2008-06-06 02:28 UTC by Jutiphan
Modified: 2011-03-13 18:04 UTC
CC: 3 users

See Also:
Web browser: ---
Mobile Platform: ---
Assignee Huggle Beta Tester: ---


Attachments

Description Jutiphan 2008-06-06 02:28:34 UTC
In the list=categorymembers module, it would be great if cmtitle supported more than one category. This would be useful when requesting members from multiple categories at once. One scenario where I would use this is querying a list of categories to be checked for tagging WikiProject banners.
Comment 1 Roan Kattouw 2008-07-07 12:33:34 UTC
This can't be implemented efficiently, because we sort results by category name first, then by sortkey, then by page ID. That means cmtitles=Category:Foo|Category:Bar will first list all members of Category:Foo, then all members of Category:Bar, meaning it's no better than just doing two queries.
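A minimal sketch of the ordering described above (the tuples and values are made up for illustration): because rows sort by category name first, the combined result is just each category's members listed back to back, which is exactly what two separate queries would return.

```python
# Hypothetical result rows as (category, sortkey, page_id) tuples.
rows = [
    ("Category:Foo", "A", 5),
    ("Category:Bar", "B", 2),
    ("Category:Foo", "B", 9),
    ("Category:Bar", "A", 7),
]

# The server sorts by category name, then sortkey, then page ID --
# Python's lexicographic tuple sort models the same ordering.
rows.sort()

# All Category:Bar members now precede all Category:Foo members;
# within each category the order is (sortkey, page_id).
```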
Comment 2 aaron brick 2008-11-17 23:54:16 UTC
roan, an inefficient implementation would still help me a lot, so i hope you might reconsider.

i am writing an application which, for a given article, acquires lists of peer articles in some of its categories. without paging, this means making one asynchronous HTTP connection per category, which breaks badly on "Amtrak" (which has over 60 categories, many of them with almost no members). the order in which the contents are returned matters less to me than a big cut in the number of connections i need to make.

by executing action=query&generator=categories&prop=categoryinfo in advance, i can pick out which categories have suitable numbers of pages in them to proceed to the member listing step. this means that a category with hundreds of members could come last in the list, giving me access to a good subset of the category members in a single HTTP request (several if i page on through the results, but still not 60).
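The prefiltering step described in the comment above could be sketched roughly like this. The endpoint is the one cited in this report, and the query parameters follow the MediaWiki API; the `pick_small_categories` helper and its threshold are assumptions for illustration, not part of the API.

```python
import json
import urllib.parse
import urllib.request

API = "http://th.wikipedia.org/w/api.php"  # endpoint cited in this report

def category_sizes(article):
    """Fetch {category title: member page count} for one article in a
    single HTTP request, via generator=categories with prop=categoryinfo."""
    params = urllib.parse.urlencode({
        "action": "query",
        "format": "json",
        "titles": article,
        "generator": "categories",
        "gcllimit": "max",
        "prop": "categoryinfo",
    })
    with urllib.request.urlopen(API + "?" + params) as resp:
        pages = json.load(resp)["query"]["pages"]
    return {p["title"]: p.get("categoryinfo", {}).get("pages", 0)
            for p in pages.values()}

def pick_small_categories(sizes, max_members=500):
    # Hypothetical helper: keep only categories small enough to list
    # cheaply, smallest first, so a huge category never comes first.
    return sorted((c for c, n in sizes.items() if 0 < n <= max_members),
                  key=sizes.get)
```

With this prefilter, the member-listing step can skip or defer the few huge categories instead of opening one connection per category.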

Comment 3 Roan Kattouw 2008-11-18 13:36:17 UTC
(In reply to comment #2)
> roan, an inefficient implementation would still help me a lot, so i hope you
> might reconsider.
I meant inefficient on the server side. So I'm sure it would help you a lot, but it would also overload the database servers.

 
> i am writing an application which, for a given article, acquires lists of peer
> articles in some of its categories. without paging, this means making one
> asynchronous HTTP connection per category, which breaks badly on "Amtrak" (with
> over 60 categories, many of which have almost no members). the order the
> contents were returned is less important than a big cut in the number of
> connections i need to make.
I understand that ordering isn't important to everyone, but it's needed to make query-continue possible.
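The deterministic ordering is what lets each response carry a continue token that resumes the listing exactly where it left off. A rough sketch of the client-side loop, with `fetch_page` as an assumed stand-in for one HTTP request to list=categorymembers:

```python
def iter_members(fetch_page):
    """Page through a category member listing.

    fetch_page(token) is a hypothetical stand-in for one HTTP request;
    it returns (members, next_token), where next_token is None once the
    deterministically ordered listing is exhausted.
    """
    token = None
    while True:
        members, token = fetch_page(token)
        yield from members
        if token is None:
            return
```

Without a stable sort order on the server, the token could not identify a resume point, and pages could overlap or skip entries between requests.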

> by executing action=query&generator=categories&prop=categoryinfo in advance, i
> can pick out which categories have suitable numbers of pages in them to proceed
> to the member listing step. this means that a category with hundreds of members
> could come last in the list, giving me access to a good subset of the category
> members in a single HTTP request (several if i page on through the results, but
> still not 60).
> 

Sounds like a good solution.
