Last modified: 2010-03-24 18:44:14 UTC
From a mediawiki-l discussion about UTF-8 characters in robots.txt not displaying correctly (http://thread.gmane.org/gmane.org.wikimedia.mediawiki/33601): robots.txt served from Wikimedia wikis should have the Content-Type "text/plain; charset=utf-8", not just "text/plain". The robots.txt spec does not take content types into account, so no bot should rely on a specific Content-Type. http://www.robotstxt.org/orig.html
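For reference, a quick way to check what Content-Type a wiki's robots.txt is actually served with is a small script like the following minimal sketch, using only Python's standard library; the URL is just an illustrative example, and the expected header value is the one requested above:

    import urllib.request

    # Illustrative example URL; substitute any Wikimedia wiki's robots.txt.
    URL = "https://en.wikipedia.org/robots.txt"

    with urllib.request.urlopen(URL) as response:
        content_type = response.headers.get("Content-Type")
        print("Content-Type:", content_type)
        # Per this request, the expected value is "text/plain; charset=utf-8".
        if content_type != "text/plain; charset=utf-8":
            print("Unexpected Content-Type")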
Done.
Now it is "Content-Type: text/html; charset=utf-8". The MIME type should have remained text/plain, not have been changed to text/html. The requested Content-Type is "text/plain; charset=utf-8".
Grmpf. Copy-and-paste error, my apologies. Should be fixed now.