Last modified: 2013-02-25 09:45:49 UTC

Bug 13249 - Create MediaWiki:Robots.txt
Create MediaWiki:Robots.txt
Product: MediaWiki
Classification: Unclassified
General/Unknown (Other open bugs)
Hardware: All
OS: All
Priority: Lowest
Severity: enhancement (vote)
Target Milestone: ---
Assigned To: Nobody - You can work on this!
Depends on:
  Show dependency treegraph
Reported: 2008-03-04 18:56 UTC by Danny B.
Modified: 2013-02-25 09:45 UTC (History)
2 users (show)

See Also:
Web browser: ---
Mobile Platform: ---
Assignee Huggle Beta Tester: ---


Description Danny B. 2008-03-04 18:56:01 UTC
robots.txt should be a regularly editable message, so that changing it does not depend on developers and admins can edit it whenever needed.

.htaccess could then have a line like:
RewriteRule ^robots\.txt$ /w/index.php?title=MediaWiki:Robots.txt&action=raw&ctype=text/plain [L]
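The effect of serving a wiki page as robots.txt can be sketched offline with Python's standard urllib.robotparser; the page content below is a hypothetical example of what an admin might save, not anything from the actual site configuration:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical content an admin might save to [[MediaWiki:Robots.txt]]
robots_txt = """\
User-agent: *
Disallow: /w/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Crawlers honouring this file stay out of /w/ but may fetch article paths
print(parser.can_fetch("*", "/wiki/Main_Page"))  # True
print(parser.can_fetch("*", "/w/index.php"))     # False
```

This only checks that the directives parse as intended; whether the rewrite serves them with the right content type is a separate question (see comment 1).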
Comment 1 Daniel Friesen 2008-03-04 20:32:27 UTC
Firstly, if you're already doing something as ugly as rewriting to the message in the way you describe, there is no point adding it to the interface: all you need to do is paste the text of the current robots.txt onto that page. Nothing inside MediaWiki is needed for that.

Secondly, text/plain is not a valid &ctype= value: &ctype= only accepts text/x-wiki, text/css, and text/javascript. The page will therefore actually be served as text/x-wiki, not text/plain, and that may cause issues with some browsers.
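The fallback behaviour described here can be sketched as a small whitelist check; the names below are illustrative, not MediaWiki's actual identifiers:

```python
# Content types that action=raw will honour, per the comment above
ALLOWED_CTYPES = {"text/x-wiki", "text/css", "text/javascript"}

def resolve_ctype(requested: str) -> str:
    """Fall back to text/x-wiki when the requested ctype is not whitelisted
    (hypothetical helper mirroring the behaviour described above)."""
    return requested if requested in ALLOWED_CTYPES else "text/x-wiki"

print(resolve_ctype("text/plain"))  # text/x-wiki
print(resolve_ctype("text/css"))    # text/css
```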
Comment 2 Brion Vibber 2008-03-05 00:38:09 UTC
robots.txt is pretty low-level and honestly I'd just not be comfortable with leaving things open to this degree.

There has been some discussion now and again of adding a relatively easy way to set the meta robots tags for individual pages; perhaps as a special form of protection. Offhand I can't find relevant bugzilla entries to point this one to, but probably they're in there somewhere. :)

We'd be much more likely to go ahead and implement that sort of thing than a raw robots.txt editor.
Comment 3 bdk 2008-03-06 02:29:25 UTC
(In reply to comment #2)

see bug 8068 and bug 9415, and, as an example request: bug 10648
Comment 4 Danny B. 2008-03-06 16:56:57 UTC
(In reply to comment #2)
> We'd be much more likely to go ahead and implement that sort of thing than a
> raw robots.txt editor.

OK, so I tried to change the summary to something expressing the issue more generally.

Maybe __NOINDEX__ magic word could work then?
Comment 5 Daniel Friesen 2008-03-06 17:21:00 UTC
Actually, Rob Church says there should be no reason for a user to make their userpage non-indexed on a publicly viewable wiki.

IMHO, though, a __NOINDEX__ magic word, which could perhaps even add the page to a generated [[Special:Robots.txt]] (which a quick rewrite, in either mod_rewrite or even an Alias, could serve), would be a nicer way of keeping Wikimedia's RfA and AfD pages out of search engines than filing a bunch of Bugzilla requests to add them to the core robots.txt.
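The generated-page idea above can be sketched minimally; the function name and URL layout are assumptions for illustration, not an existing MediaWiki feature:

```python
def build_robots_txt(noindexed_titles):
    """Emit a robots.txt body with one Disallow line per page tagged
    __NOINDEX__ (hypothetical sketch of a generated [[Special:Robots.txt]])."""
    lines = ["User-agent: *"]
    for title in noindexed_titles:
        # Assumes the wiki serves pages under /wiki/ with underscores for spaces
        lines.append("Disallow: /wiki/" + title.replace(" ", "_"))
    return "\n".join(lines) + "\n"

print(build_robots_txt(["Wikipedia:Requests for adminship/Example"]))
```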
Comment 6 bdk 2008-03-07 05:16:28 UTC
(In reply to comment #4)
> OK, so I tried to change the summary for something expressing the issue more
> generally.
Please note the links in comment #3 and see bug 9415 for such a request. 
(please do not duplicate bugs; I have therefore changed the summary back)

> Maybe __NOINDEX__ magic word could work then?
Again, please see comment #3 and bug 8068 for that suggestion.
Comment 7 Brion Vibber 2008-03-07 20:11:23 UTC
Reopened bug 8068 for further consideration. Resolving this one WONTFIX for the original issue.
Comment 8 MZMcBride 2013-02-25 09:45:49 UTC
This was resolved on Wikimedia wikis by bug 15601 (cf. [[wikitech:robots.txt]]).
