Currently, if [[MediaWiki:Robots.txt]] exists, it is used as this wiki's robots.txt. However, this approach leaves developers no way to introduce centralised changes to robots.txt, such as blocking newly discovered spiders. I therefore propose building it from two parts: a centralised one, containing user-agent rules and a prohibition on indexing /w/, and the content of MediaWiki:Robots.txt, if present. The centralised part could be editable by shell users, or mirrored from Meta, as the www portals currently are.
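In effect, the served robots.txt would just be the centralised part followed by whatever MediaWiki:Robots.txt contains. A rough sketch of that assembly (the local file path is hypothetical, and the per-wiki part is fetched here via index.php?action=raw, which returns the raw page text or a 404 if the page does not exist):

    import urllib.error
    import urllib.request

    CENTRAL_PART = "/srv/mediawiki/robots-common.txt"  # hypothetical path, maintained by sysadmins
    WIKI_INDEX = "https://en.wikipedia.org/w/index.php"

    def build_robots_txt() -> str:
        # Start with the centralised rules set by sysadmins.
        with open(CENTRAL_PART, encoding="utf-8") as f:
            parts = [f.read().rstrip("\n")]

        # Append the per-wiki part from MediaWiki:Robots.txt, if that page exists.
        url = WIKI_INDEX + "?title=MediaWiki:Robots.txt&action=raw"
        try:
            with urllib.request.urlopen(url) as resp:
                local = resp.read().decode("utf-8").strip()
                if local:
                    parts.append(local)
        except urllib.error.HTTPError:
            # Page does not exist on this wiki; serve only the centralised part.
            pass

        return "\n\n".join(parts) + "\n"

    if __name__ == "__main__":
        print(build_robots_txt())

This is only a sketch of the idea, not the actual production implementation; how the centralised part is stored and cached would be up to the sysadmins.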
http://meta.wikimedia.org/wiki/Template:Robots.txt would (imo) be a logical location.
No, it would more likely live at a default location on the servers, not on-wiki. The whole point is that the common part will be set by sysadmins, not by end users; they can already edit the per-wiki part at MediaWiki:Robots.txt.
Done.
For reference, this is documented at [[wikitech:robots.txt]].