Last modified: 2006-03-30 03:24:10 UTC
Some people complained on :fr that their names appear in Google results via our requests for deletion. The robots.txt seems to be set correctly (see Bug 4776), but it doesn't seem to work:

# fr: Disallow: /wiki/Wikip%C3%A9dia:Pages_%C3%A0_supprimer/

For example, search for "metametris" on google.com: the third or fourth link starts with the disallowed URL. Could there be a problem related to encoding?
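As a sanity check on the encoding question, here is a minimal sketch of an encoding-aware prefix match against that Disallow rule. The helper `is_disallowed` is hypothetical (it is not Google's actual matching logic); it percent-decodes both the rule and the request path to raw bytes before comparing, so the "%C3%A9" spelling and the literal UTF-8 "é" spelling of the same page path match the same rule:

```python
from urllib.parse import unquote_to_bytes

# Hypothetical helper, not Google's real matcher: compare a robots.txt
# Disallow path against a request path byte-for-byte after percent-decoding,
# so "%C3%A9" and the literal UTF-8 "é" are treated as the same path.
def is_disallowed(disallow_path: str, request_path: str) -> bool:
    rule = unquote_to_bytes(disallow_path)   # decode %XX escapes to raw bytes
    path = unquote_to_bytes(request_path)    # str input is UTF-8-encoded first
    return path.startswith(rule)

# The rule from the fr.wikipedia robots.txt quoted above.
rule = "/wiki/Wikip%C3%A9dia:Pages_%C3%A0_supprimer/"

# Two spellings of the same (hypothetical) deletion-request page path.
encoded = "/wiki/Wikip%C3%A9dia:Pages_%C3%A0_supprimer/Metametris"
literal = "/wiki/Wikipédia:Pages_à_supprimer/Metametris"

print(is_disallowed(rule, encoded))  # True
print(is_disallowed(rule, literal))  # True
```

If a crawler instead compared the strings without normalizing the percent-encoding, the two spellings of the same path could match differently, which is the kind of mismatch the question is asking about.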
No summary appears for the page in the Google results; this is consistent with Google's treatment of robots.txt-disallowed pages. The URL still appears in their index, but the page is not spidered, so its content is not available for search or cache. Unfortunately you'll have to complain to Google if you don't like that; that's just how they do things.