Last modified: 2010-05-15 15:41:09 UTC
In DefaultSettings.php you should document this better:
* Robot policies for namespaces
* e.g. $wgNamespaceRobotPolicies = array( NS_TALK => 'noindex' );
$wgNamespaceRobotPolicies = array();
else one is hard pressed to figure out what one can use other than
noindex. The array syntax we can figure out from a nearby hint about
Language.php, but not the available choices beyond noindex, nor their
effects. Document them here, or say which other .php file to see.
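For example, a short note like the following (in LocalSettings.php; the values shown are the standard robots meta-tag directives, and which combinations MediaWiki honors is exactly what the documentation should confirm) would answer the question on the spot:

```php
// LocalSettings.php -- per-namespace robot policies.
// Values are standard robots meta-tag directives, comma-combined:
// 'index', 'noindex', 'follow', 'nofollow'.
$wgNamespaceRobotPolicies = array(
    NS_TALK     => 'noindex',           // keep talk pages out of search indexes
    NS_USER     => 'noindex,nofollow',  // hide user pages and don't follow their links
    NS_CATEGORY => 'index,follow',      // explicitly allow category pages
);
```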
Don't just hope the user will use a search engine or check around
meta.wikimedia.org, as that makes getting this simple answer dependent
on having a working external network connection, even before
production time (i.e., while merely examining and testing the software).
Wait, mention parts of http://meta.wikimedia.org/wiki/Robots.txt, like:
The only way to keep a URL out of Google's index is to let Google
crawl the page and see a meta tag specifying robots="noindex".
Although this meta tag is already present on the edit page HTML
template, Google does not spider the edit pages (because they are
forbidden by robots.txt) and therefore does not see the meta tag.
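That interaction can be made concrete with a minimal sketch (the Disallow path is illustrative, not MediaWiki's actual robots.txt):

```
# robots.txt -- controls CRAWLING only, not indexing:
User-agent: *
Disallow: /index.php        # edit URLs go through index.php

# Because the URL above is never crawled, Google never fetches the
# edit page's HTML and hence never sees its meta tag:
#   <meta name="robots" content="noindex,nofollow" />
# So a robots.txt block can actually PREVENT de-indexing.
```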
Wait, SpecialPage.php already has
$wgOut->setRobotPolicy( "noindex,nofollow" );
so users will wonder why tinkering with NS_SPECIAL is futile.
(Should be noted in bug 8338.)
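A sketch of why tinkering with NS_SPECIAL is futile: the setRobotPolicy() call quoted above runs for every special page, so it wins over the per-namespace setting (the LocalSettings.php entry below is illustrative):

```php
// LocalSettings.php -- this entry has no effect:
$wgNamespaceRobotPolicies = array( NS_SPECIAL => 'index,follow' );

// SpecialPage.php -- executed for every special page,
// overriding whatever the namespace policy said:
$wgOut->setRobotPolicy( "noindex,nofollow" );
```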
OK, enough for one day.
I would expect that a user installing a piece of web server software and wishing
to enable custom robot policies for specific namespaces ("meta tags") would
either know what options are available, or else be able to research this.
Please be concise and to the point in bug reports, not chatty. If something
occurs to you midway through writing the report, go back and amend it, don't add
it on the end.
I added some comments in r18810.