Last modified: 2014-11-09 20:58:40 UTC

Wikimedia Bugzilla is closed!

Wikimedia migrated from Bugzilla to Phabricator. Bug reports are handled in Wikimedia Phabricator.
This static website is read-only and kept for historical purposes. It is not possible to log in, and apart from displaying bug reports and their history, links might be broken. See T66489, the corresponding Phabricator task, for complete and up-to-date bug report information.
Bug 64489 - Clean up limit handling (data/api.py)
Status: NEW
Product: Pywikibot
Classification: Unclassified
Component: General (Other open bugs)
Version: core-(2.0)
Hardware: All
OS: All
Importance: Unprioritized major
Target Milestone: ---
Assigned To: Pywikipedia bugs
Depends on: 64997 65013
Blocks: 56190
Reported: 2014-04-26 20:03 UTC by Merlijn van Deen (test)
Modified: 2014-11-09 20:58 UTC
CC: 2 users

See Also:
Web browser: ---
Mobile Platform: ---
Assignee Huggle Beta Tester: ---


Attachments

Description Merlijn van Deen (test) 2014-04-26 20:03:09 UTC
QueryGenerator has a very complicated structure for handling request limits; some of it currently works only by accident, e.g.:

new_limit = None
(...)
new_limit = min(new_limit, self.api_limit // 10, 250) 

So what is new_limit now? None!
(At least on Python 2, where None compares smaller than any integer; on Python 3 the same expression raises a TypeError.)
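The accident can be reproduced outside pywikibot. The snippet below is a minimal sketch of the same expression, using only the standard library:

```python
# Minimal reproduction of the pitfall (not pywikibot code).
# In Python 2, None compares smaller than any integer, so min()
# silently returns None; in Python 3 the comparison raises TypeError.
api_limit = 5000
new_limit = None

try:
    new_limit = min(new_limit, api_limit // 10, 250)
except TypeError:
    # Python 3: "'<' not supported between instances of 'int' and 'NoneType'"
    new_limit = "TypeError"

print(new_limit)  # Python 2: None; Python 3: TypeError
```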

That None result is actually required, because some requests don't allow limits at all, such as

https://en.wikipedia.org/w/api.php?maxlag=5&pageids=1969218|225758|38275275|34684|34550&uiprop=hasmsg|blockinfo&rvprop=ids|flags|timestamp|comment|content&prop=revisions|info|categoryinfo&meta=userinfo&action=query&format=json

Adding &rvlimit=50 to that request returns

"error": {
    "code": "rvmultpages",
    "info": "titles, pageids or a generator was used to supply multiple pages, but the limit, startid, endid, dirNewer, user, excludeuser, start and end parameters may only be used on a single page."
}
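One way a cleanup could make the "no limit" case explicit, instead of relying on min(None, ...), is sketched below. The function name, signature, and the limits_allowed flag are hypothetical illustrations, not pywikibot's actual API:

```python
def effective_limit(requested, api_limit, limits_allowed=True, hard_cap=250):
    """Hypothetical helper: return the limit to send with a request,
    or None when the limit parameter must be omitted entirely
    (e.g. for multi-page revision queries like the one above)."""
    if not limits_allowed or requested is None:
        return None  # omit rvlimit etc. rather than sending a bogus value
    return min(requested, api_limit // 10, hard_cap)

print(effective_limit(1000, 5000))                        # capped: 250
print(effective_limit(100, 5000))                         # unchanged: 100
print(effective_limit(1000, 5000, limits_allowed=False))  # None
```

Callers would then skip the limit parameter whenever the helper returns None, which makes the "limits not allowed" path visible rather than an accident of comparison semantics.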

