Last modified: 2009-08-28 16:38:42 UTC
Maximum limits can be, and are, changed on the server side (as happened recently for list=search); bots can be logged out, dropping their limit (e.g. 5000 => 500); and rate limiting is planned. The API should therefore fall back to the maximum allowed limit instead of returning nothing at all when the given limit is too high.

Example (bot limits are 10× the user limits):
* http://als.wikipedia.org/w/api.php?action=query&list=search&srwhat=text&srsearch=kategorie&srlimit=500 formerly worked.

Imagine a ("simple") script using this URL: it now has to be changed to "50", or it will fail fatally. Of course it could look for <error code="srlimit" info="srlimit may not be over 50 (set to 500) for users" xml:space="preserve"> and adapt its URL, but if it isn't too hard to do, just fall back to the maximum limit on the server side automatically, if possible. Alternatively, a parameter such as &limitfallback=1 could opt scripts in to this behavior, since some scripts may rely on a specific result count when *from is given.

*Otherwise*, mention somewhere in the API documentation that allowed limits can (and will) change (e.g. during server attacks/problems, or as with list=search), and that a helpful error code will then appear on top, so that people know that quick-and-dirty scripts will simply fail in those cases ;-)
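Until a server-side fallback exists, a client can recover by parsing the error's info string and retrying with the allowed limit. A minimal sketch (the function name and the regex are illustrative assumptions based on the error message quoted above, not part of the API):

```python
import re

def fallback_limit(error_info):
    """Extract the server's maximum allowed limit from an API error
    info string such as:
        "srlimit may not be over 50 (set to 500) for users"
    Returns the allowed limit as an int, or None if the message
    does not match the assumed format."""
    m = re.search(r"may not be over (\d+)", error_info)
    return int(m.group(1)) if m else None

# A script would then re-issue its request with srlimit set to this value.
```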
Done in r55652