Last modified: 2010-07-18 21:55:36 UTC
Assumption: $wgNamespaceRobotPolicies = array( NS_USER_TALK => 'noindex,follow' ), as on dewiki.

Test case:
1. Create the page [[de:Benutzer Diskussion:Raymond/test für robots]]. From the HTML source: <meta name="robots" content="noindex,follow" /> This is OK.
2. Move the page to NS0: [[de:Test für robots]]. From the HTML source: <meta name="robots" content="noindex,nofollow" /> Completely wrong.
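For reference, the assumption above corresponds to a LocalSettings.php fragment along these lines (a sketch of dewiki's presumed configuration, not its actual settings file):

```php
<?php
// Restrict indexing of user talk pages only; articles (NS0) should keep
// the default 'index,follow' policy.
$wgNamespaceRobotPolicies = array(
	NS_USER_TALK => 'noindex,follow',
);
```

With only this set, a page moved from NS_USER_TALK into NS0 should pick up the default policy again, so the observed 'noindex,nofollow' on the moved article is not explained by this configuration.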
The problem is even bigger: as far as I can see, every article page on dewp has '<meta name="robots" content="noindex,nofollow" />' in it, whether it was moved from the user namespace or not.
According to another user's comment, every article gets noindex,nofollow when read as a logged-in user. That could make sense, but it seems like overkill to me. Do we expect spiders to crawl as logged-in users?
Does it matter? It doesn't seem bad to be a bit paranoid.