Last modified: 2011-03-13 17:46:07 UTC

Wikimedia Bugzilla is closed!

Wikimedia migrated from Bugzilla to Phabricator. Bug reports are handled in Wikimedia Phabricator.
This static website is read-only and kept for historical purposes. It is not possible to log in, and apart from displaying bug reports and their history, links may be broken. See T16419, the corresponding Phabricator task, for complete and up-to-date bug report information.
Bug 14419 - Expansion of robots.txt
Status: RESOLVED WONTFIX
Product: Wikimedia
Classification: Unclassified
Component: Site requests (Other open bugs)
Version: unspecified
Hardware: All
OS: All
Importance: Lowest enhancement with 2 votes
Target Milestone: ---
Assigned To: Nobody - You can work on this!
Keywords: shell
Depends on:
Blocks: robots.txt
Reported: 2008-06-05 19:59 UTC by Bence Damokos
Modified: 2011-03-13 17:46 UTC
CC: 4 users

See Also:
Web browser: ---
Mobile Platform: ---
Assignee Huggle Beta Tester: ---


Attachments

Description Bence Damokos 2008-06-05 19:59:16 UTC
Please include the following lines in robots.txt, so that the Hungarian Wikipedia's deletion discussions are not indexed:
Disallow: /wiki/Wikipédia:Törlésre javasolt lapok
Disallow: /wiki/Wikipédia:Törlésre javasolt lapok/

Thank you,
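(Editorial note: robots.txt rules are matched against the URL path as it appears in requests, i.e. with percent-encoded UTF-8 and, in MediaWiki page titles, underscores in place of spaces. A deployed version of the requested rules would therefore likely need to look something like the following sketch; the exact form Wikimedia would use is an assumption:)

```
User-agent: *
Disallow: /wiki/Wikip%C3%A9dia:T%C3%B6rl%C3%A9sre_javasolt_lapok
```

A trailing-slash variant for subpages would be encoded the same way.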
Comment 1 Daniel Tar 2008-08-18 22:35:13 UTC
What about these two lines? It would be a very big help, as there have been many complaints about these deletion requests appearing at the top of Google results.
Comment 2 darklama 2008-08-18 22:38:45 UTC
This is probably a WONTFIX. Use the new __NOINDEX__ magic word inside a template used on all those pages to achieve the desired behavior.
Comment 3 Daniel Tar 2008-08-18 23:43:29 UTC
Why is a wiki workaround with templates and magic words better than inserting two, t-w-o lines of code into robots.txt? I think the second version 1) uses fewer resources (not as many as turning off, for example, the thumbnail generator, but fewer), 2) does not require continuously inserting templates.
Comment 4 Aryeh Gregor (not reading bugmail, please e-mail directly) 2008-08-18 23:51:22 UTC
I don't think there are plans to freeze robots.txt and require communities to use __NOINDEX__.  However, this kind of request gets done very slowly.  You would just have to add __NOINDEX__ to [[hu:Sablon:Törlés teteje]] to make almost all of them noindexed right away (assuming that's used consistently).
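(Editorial note: as a sketch of the template approach suggested in comments 2 and 4, the magic word can be wrapped in `<includeonly>` so it applies only to pages that transclude the template, not to the template page itself; whether the relevant namespace honours user noindexing is a deployment setting and is assumed here:)

```
<includeonly>__NOINDEX__</includeonly>
```

Placed in a deletion-discussion header template such as [[hu:Sablon:Törlés teteje]], this marks every transcluding page as noindexed.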
Comment 5 JeLuF 2008-09-12 23:56:04 UTC
You can create a custom robots.txt by editing the page MediaWiki:Robots.txt on your wiki.
Comment 6 Mark A. Hershberger 2011-03-13 17:46:07 UTC
Changing all WONTFIX high-priority bugs to lowest priority. (No mail should be generated, since I turned it off for this.)


